WorldWideScience

Sample records for testbed automated in-situ

  1. SCDU Testbed Automated In-Situ Alignment, Data Acquisition and Analysis

    Science.gov (United States)

    Werne, Thomas A.; Wehmeier, Udo J.; Wu, Janet P.; An, Xin; Goullioud, Renaud; Nemati, Bijan; Shao, Michael; Shen, Tsae-Pyng J.; Wang, Xu; Weilert, Mark A.

    2010-01-01

    In the course of fulfilling its mandate, the Spectral Calibration Development Unit (SCDU) testbed for SIM-Lite produces copious amounts of raw data. To spend its time understanding the science driving the data rather than operating the hardware, the team devised computerized automations that limit the time spent bringing the testbed to a healthy state and commanding it, freeing the team to focus on analyzing the processed results. We developed a multi-layered scripting language that emphasized the scientific experiments we conducted, which drastically shortened our experiment scripts, improved their readability, and all but eliminated testbed operator errors. In addition to scientific experiment functions, we also developed a set of automated alignments that bring the testbed up to a well-aligned state with little more than the push of a button. These scripts were written in the scripting language, and in Matlab via an interface library, allowing all members of the team to augment the existing scripting language with complex analysis scripts. To keep track of these results, we created an easily parseable state log in which we recorded both the state of the testbed and relevant metadata. Finally, we designed a distributed processing system that allowed us to farm lengthy analyses out to a collection of client computers, which reported their results to a central log. Since these logs were parseable, we wrote query scripts that gave us an effortless way to compare results collected under different conditions. This paper serves as a case study, detailing the motivating requirements for the decisions we made and explaining the implementation process.
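
    The abstract above does not give implementation details, but the pairing of an easily parseable state log with query scripts can be illustrated with a minimal sketch. The JSON-lines format, field names, and query helper below are assumptions made for illustration only, not the SCDU implementation.

      # Illustrative sketch only: a line-oriented, easily parseable state log and a
      # query helper in the spirit described above. The field names and JSON-lines
      # format are assumptions, not the SCDU implementation.
      import json
      from datetime import datetime, timezone

      LOG_PATH = "testbed_state.log"  # hypothetical log file

      def log_state(experiment, testbed_state, results):
          """Append one self-describing JSON record per experiment run."""
          record = {
              "timestamp": datetime.now(timezone.utc).isoformat(),
              "experiment": experiment,
              "state": testbed_state,   # e.g. alignment offsets, temperatures
              "results": results,       # processed analysis outputs
          }
          with open(LOG_PATH, "a") as fh:
              fh.write(json.dumps(record) + "\n")

      def query(predicate):
          """Return every logged record matching a caller-supplied condition."""
          matches = []
          with open(LOG_PATH) as fh:
              for line in fh:
                  record = json.loads(line)
                  if predicate(record):
                      matches.append(record)
          return matches

      # Example: compare results collected under different conditions.
      # warm_runs = query(lambda r: r["state"].get("bench_temp_C", 0) > 25)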

  2. Implementation strategies for load center automation on the space station module/power management and distribution testbed

    Science.gov (United States)

    Watson, Karen

    1990-01-01

    The Space Station Module/Power Management and Distribution (SSM/PMAD) testbed was developed to study tertiary power management on modules in large spacecraft. The main goal was to study automation techniques, not necessarily to develop flight-ready systems. Because of the confidence gained in many of the automation strategies investigated, it is appropriate to study implementation strategies in more detail in order to find better trade-offs for nearer-to-flight-ready systems. These trade-offs particularly concern the weight, volume, power consumption, and performance of the automation system. These systems, in their present implementation, are described.

  3. 21 CFR 866.4700 - Automated fluorescence in situ hybridization (FISH) enumeration systems.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Automated fluorescence in situ hybridization (FISH... Laboratory Equipment and Reagents § 866.4700 Automated fluorescence in situ hybridization (FISH) enumeration... Hybridization (FISH) Enumeration Systems.” See § 866.1(e) for the availability of this guidance document. [70 FR...

  4. Automated tools and techniques for distributed Grid Software Development of the testbed infrastructure

    CERN Document Server

    Aguado Sanchez, C

    2007-01-01

    Grid technology is becoming more and more important as the new paradigm for sharing computational resources across different organizations in a secure way. The great power of this solution requires the definition of a generic stack of services and protocols, and this is the scope of the different Grid initiatives. As a result of international collaborations for its development, the Open Grid Forum created the Open Grid Services Architecture (OGSA), which aims to define the common set of services that will enable interoperability across the different implementations. This master thesis has been developed in this framework, as part of the two European-funded projects ETICS and OMII-Europe. The main objective is to contribute to the design and maintenance of large distributed development projects with automated tools that enable the application of Software Engineering techniques oriented to achieve an acceptable level of quality in the release process. Specifically, this thesis develops the testbed concept a...

  5. Progress in automated extraction and purification of in situ ¹⁴C from quartz: Results from the Purdue in situ ¹⁴C laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Lifton, Nathaniel, E-mail: nlifton@purdue.edu [Department of Earth, Atmospheric, and Planetary Sciences, Purdue University, 550 Stadium Mall Drive, West Lafayette, IN 47907 (United States); Department of Physics and Astronomy and Purdue Rare Isotope Measurement Laboratory (PRIME Lab), Purdue University, 525 Northwestern Avenue, West Lafayette, IN 47907 (United States); Goehring, Brent, E-mail: bgoehrin@tulane.edu [Department of Earth, Atmospheric, and Planetary Sciences, Purdue University, 550 Stadium Mall Drive, West Lafayette, IN 47907 (United States); Wilson, Jim, E-mail: jim.wilson@aeonlaboratories.com [Aeon Laboratories, LLC, 5835 North Genematas Drive, Tucson, AZ 85704 (United States); Kubley, Thomas [Department of Physics and Astronomy and Purdue Rare Isotope Measurement Laboratory (PRIME Lab), Purdue University, 525 Northwestern Avenue, West Lafayette, IN 47907 (United States); Caffee, Marc [Department of Earth, Atmospheric, and Planetary Sciences, Purdue University, 550 Stadium Mall Drive, West Lafayette, IN 47907 (United States); Department of Physics and Astronomy and Purdue Rare Isotope Measurement Laboratory (PRIME Lab), Purdue University, 525 Northwestern Avenue, West Lafayette, IN 47907 (United States)

    2015-10-15

    Current extraction methods for in situ ¹⁴C from quartz [e.g., Lifton et al. (2001), Pigati et al. (2010), Hippe et al. (2013)] are time-consuming and repetitive, making them an attractive target for automation. We report on the status of in situ ¹⁴C extraction and purification systems originally automated at the University of Arizona that have now been reconstructed and upgraded at the Purdue Rare Isotope Measurement Laboratory (PRIME Lab). The Purdue in situ ¹⁴C laboratory builds on the flow-through extraction system design of Pigati et al. (2010), automating most of the procedure by retrofitting existing valves with external servo-controlled actuators, regulating the pressure of research-purity O₂ inside the furnace tube via a PID-based pressure controller in concert with an inlet mass flow controller, and installing an automated liquid N₂ distribution system, all driven by LabView® software. A separate system for cryogenic CO₂ purification, dilution, and splitting is also fully automated, ensuring a highly repeatable process regardless of the operator. We present results from procedural blanks and an intercomparison material (CRONUS-A), as well as results of experiments to increase the amount of material used in extraction, from the standard 5 g to 10 g or above. Results thus far are quite promising, with procedural blanks comparable to previous work and significant improvements in reproducibility for CRONUS-A measurements. The latter analyses also demonstrate the feasibility of quantitative extraction of in situ ¹⁴C from sample masses up to 10 g. Our lab is now analyzing unknowns routinely, but lowering overall blank levels is the focus of ongoing research.
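
    The furnace-pressure regulation described above (a PID loop trimming an inlet mass flow controller to hold a target O₂ pressure) can be sketched as follows. The gains, setpoint, and the read_furnace_pressure/set_mfc_flow interfaces are hypothetical placeholders; the actual system is LabView-driven.

      # Minimal sketch of the PID-based furnace pressure regulation described
      # above. read_furnace_pressure() and set_mfc_flow() stand in for the real
      # (LabView-driven) hardware interfaces and are hypothetical.
      import time

      class PID:
          def __init__(self, kp, ki, kd, setpoint):
              self.kp, self.ki, self.kd = kp, ki, kd
              self.setpoint = setpoint
              self.integral = 0.0
              self.prev_error = None

          def update(self, measurement, dt):
              error = self.setpoint - measurement
              self.integral += error * dt
              derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
              self.prev_error = error
              return self.kp * error + self.ki * self.integral + self.kd * derivative

      def regulate_pressure(read_furnace_pressure, set_mfc_flow,
                            target_torr=50.0, dt=1.0, max_flow_sccm=20.0):
          """Hold the furnace O2 pressure near target_torr by trimming the inlet MFC."""
          pid = PID(kp=0.5, ki=0.05, kd=0.0, setpoint=target_torr)
          while True:
              pressure = read_furnace_pressure()                # e.g. from a pressure gauge
              flow = pid.update(pressure, dt)
              set_mfc_flow(min(max(flow, 0.0), max_flow_sccm))  # clamp to the MFC's range
              time.sleep(dt)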

  6. Future Autonomous and Automated Systems Testbed

    Data.gov (United States)

    National Aeronautics and Space Administration — Trust is the greatest obstacle to implementing greater autonomy and automation (A&A) in the human spaceflight program. The Future Autonomous and Automated...

  7. Trace explosives sensor testbed (TESTbed)

    Science.gov (United States)

    Collins, Greg E.; Malito, Michael P.; Tamanaha, Cy R.; Hammond, Mark H.; Giordano, Braden C.; Lubrano, Adam L.; Field, Christopher R.; Rogers, Duane A.; Jeffries, Russell A.; Colton, Richard J.; Rose-Pehrsson, Susan L.

    2017-03-01

    A novel vapor delivery testbed, referred to as the Trace Explosives Sensor Testbed, or TESTbed, is demonstrated that is amenable to both high- and low-volatility explosives vapors including nitromethane, nitroglycerine, ethylene glycol dinitrate, triacetone triperoxide, 2,4,6-trinitrotoluene, pentaerythritol tetranitrate, and hexahydro-1,3,5-trinitro-1,3,5-triazine. The TESTbed incorporates a six-port dual-line manifold system allowing for rapid actuation between a dedicated clean air source and a trace explosives vapor source. Explosives and explosives-related vapors can be sourced through a number of means including gas cylinders, permeation tube ovens, dynamic headspace chambers, and a Pneumatically Modulated Liquid Delivery System coupled to a perfluoroalkoxy total-consumption microflow nebulizer. Key features of the TESTbed include continuous and pulseless control of trace vapor concentrations with wide dynamic range of concentration generation, six sampling ports with reproducible vapor profile outputs, limited low-volatility explosives adsorption to the manifold surface, temperature and humidity control of the vapor stream, and a graphical user interface for system operation and testing protocol implementation.

  8. A geometrical approach for semi-automated crystal centering and in situ X-ray diffraction data collection

    International Nuclear Information System (INIS)

    Mohammad Yaser Heidari Khajepour; Ferrer, Jean-Luc; Lebrette, Hugo; Vernede, Xavier; Rogues, Pierrick

    2013-01-01

    High-throughput protein crystallography projects pushed forward the development of automated crystallization platforms that are now commonly used. This created an urgent need for adapted and automated equipment for crystal analysis. However, first these crystals have to be harvested, cryo-protected and flash-cooled, operations that can fail or negatively impact the crystal. In situ X-ray diffraction analysis has become a valid alternative to these operations, and a growing number of users apply it for crystal screening and to solve structures. Nevertheless, even this shortcut may require a significant amount of beam time. In this in situ high-throughput approach, the centering of crystals relative to the beam represents the bottleneck in the analysis process. In this article, a new method to accelerate this process, by accurately recording the local geometry coordinates of each crystal in the crystallization plate, is presented. Subsequently, the crystallization plate can be presented to the X-ray beam by an automated plate-handling device, such as a six-axis robot arm, for automated crystal centering in the beam, in situ screening or data collection. Here the preliminary results of such a semi-automated pipeline are reported for two distinct test proteins. (authors)

  9. Implementation of a virtual link between power system testbeds at Marshall Spaceflight Center and Lewis Research Center

    Science.gov (United States)

    Doreswamy, Rajiv

    1990-01-01

    The Marshall Space Flight Center (MSFC) owns and operates a space station module power management and distribution (SSM-PMAD) testbed. This system, managed by expert systems, is used to analyze and develop power system automation techniques for Space Station Freedom. The Lewis Research Center (LeRC), Cleveland, Ohio, has developed and implemented a space station electrical power system (EPS) testbed. This system and its power management controller are representative of the overall Space Station Freedom power system. A virtual link is being implemented between the testbeds at MSFC and LeRC. This link would enable configuration of SSM-PMAD as a load center for the EPS testbed at LeRC. This connection will add to the versatility of both systems, and provide an environment of enhanced realism for operation of both testbeds.

  10. An overview of the U.S. Army Research Laboratory's Sensor Information Testbed for Collaborative Research Environment (SITCORE) and Automated Online Data Repository (AODR) capabilities

    Science.gov (United States)

    Ward, Dennis W.; Bennett, Kelly W.

    2017-05-01

    The Sensor Information Testbed COllaborative Research Environment (SITCORE) and the Automated Online Data Repository (AODR) are significant enablers of the U.S. Army Research Laboratory (ARL) Open Campus Initiative and together create a highly collaborative research laboratory and testbed environment focused on sensor data and information fusion. SITCORE creates a virtual research and development environment allowing collaboration from other locations, including DoD, industry, academia, and coalition facilities. SITCORE combined with AODR provides end-to-end algorithm development, experimentation, demonstration, and validation. The AODR enterprise allows ARL, as well as other government organizations, industry, and academia, to store and disseminate multiple-intelligence (Multi-INT) datasets collected at field exercises and demonstrations, and to facilitate research and development (R&D) and the advancement of analytical tools and algorithms supporting the Intelligence, Surveillance, and Reconnaissance (ISR) community. The AODR provides a potential central repository for standards-compliant datasets to serve as the "go-to" location for lessons learned and reference products. Many of the AODR datasets have associated ground truth and other metadata, which provides a rich and robust data suite for researchers to develop, test, and refine their algorithms. Researchers download the test data to their own environments using a sophisticated web interface. The AODR allows researchers to request copies of stored datasets and the government to process the requests and approvals in an automated fashion. Access to the AODR requires two-factor authentication in the form of a Common Access Card (CAC) or External Certificate Authority (ECA).

  11. Automated quantitative analysis of in-situ NaI measured spectra in the marine environment using a wavelet-based smoothing technique

    International Nuclear Information System (INIS)

    Tsabaris, Christos; Prospathopoulos, Aristides

    2011-01-01

    An algorithm for automated analysis of in-situ NaI γ-ray spectra in the marine environment is presented. A standard wavelet denoising technique is implemented for obtaining a smoothed spectrum, while the stability of the energy spectrum is achieved by taking advantage of the permanent presence of two energy lines in the marine environment. The automated analysis provides peak detection, net area calculation, energy autocalibration, radionuclide identification and activity calculation. The results of the algorithm performance, presented for two different cases, show that analysis of short-term spectra with poor statistical information is considerably improved and that incorporation of further advancements could allow the use of the algorithm in early-warning marine radioactivity systems. - Highlights: → Algorithm for automated analysis of in-situ NaI γ-ray marine spectra. → Wavelet denoising technique provides smoothed spectra even at parts of the energy spectrum that exhibit strong statistical fluctuations. → Automated analysis provides peak detection, net area calculation, energy autocalibration, radionuclide identification and activity calculation. → Analysis of short-term spectra with poor statistical information is considerably improved.
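
    A wavelet-smoothing step of the kind described above can be sketched with PyWavelets. The wavelet family, threshold rule, and peak-detection settings below are illustrative choices, not the parameters of the published algorithm.

      # Illustrative wavelet smoothing of a gamma-ray spectrum followed by simple
      # peak detection. The wavelet, threshold rule and peak settings are assumed,
      # not the parameters of the published algorithm.
      import numpy as np
      import pywt
      from scipy.signal import find_peaks

      def smooth_spectrum(counts, wavelet="sym8", level=4):
          coeffs = pywt.wavedec(counts, wavelet, level=level)
          # Universal threshold estimated from the finest detail coefficients.
          sigma = np.median(np.abs(coeffs[-1])) / 0.6745
          thresh = sigma * np.sqrt(2.0 * np.log(len(counts)))
          denoised = [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
          return pywt.waverec(denoised, wavelet)[: len(counts)]

      def find_photopeaks(counts, prominence=20.0):
          smoothed = smooth_spectrum(np.asarray(counts, dtype=float))
          peaks, _ = find_peaks(smoothed, prominence=prominence)
          return peaks, smoothed   # channel indices of candidate photopeaks

    With a smoothed spectrum in hand, the permanent presence of two known lines in the marine environment (the basis of the autocalibration mentioned above) lets the channel-to-energy relation be refit automatically from spectrum to spectrum.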

  12. Building a ROS-Based Testbed for Realistic Multi-Robot Simulation: Taking the Exploration as an Example

    Directory of Open Access Journals (Sweden)

    Zhi Yan

    2017-09-01

    While the robotics community agrees that benchmarking is of high importance to objectively compare different solutions, there are only a few, limited tools to support it. To address this issue in the context of multi-robot systems, we have defined a benchmarking process based on experimental designs, aimed at improving the reproducibility of experiments by making explicit all elements of a benchmark such as parameters, measurements and metrics. We have also developed a ROS (Robot Operating System)-based testbed with the goal of making it easy for users to validate, benchmark, and compare different algorithms, including coordination strategies. Our testbed uses the MORSE (Modular OpenRobots Simulation Engine) simulator for realistic simulation and a computer cluster for decentralized computation. In this paper, we present our testbed in detail, covering the architecture and infrastructure, the issues encountered in implementing the infrastructure, and the automation of the deployment. We also report a series of experiments on multi-robot exploration in order to demonstrate the capabilities of our testbed.

  13. Automated Image Analysis for Quantitative Fluorescence In Situ Hybridization with Environmental Samples

    OpenAIRE

    Zhou, Zhi; Pons, Marie Noëlle; Raskin, Lutgarde; Zilles, Julie L.

    2007-01-01

    When fluorescence in situ hybridization (FISH) analyses are performed with complex environmental samples, difficulties related to the presence of microbial cell aggregates and nonuniform background fluorescence are often encountered. The objective of this study was to develop a robust and automated quantitative FISH method for complex environmental samples, such as manure and soil. The method and duration of sample dispersion were optimized to reduce the interference of cell aggregates. An au...

  14. Automated gas bubble imaging at sea floor – a new method of in situ gas flux quantification

    Directory of Open Access Journals (Sweden)

    G. Bohrmann

    2010-06-01

    Photo-optical systems are common in marine sciences and have been extensively used in coastal and deep-sea research. However, due to technical limitations in the past, photo images had to be processed manually or semi-automatically. Recent advances in technology have rapidly improved image recording, storage and processing capabilities, which are used in a new concept of automated in situ gas quantification by photo-optical detection. The design for an in situ high-speed image acquisition and automated data processing system ("Bubblemeter") is reported. New strategies have been followed with regard to back-light illumination, bubble extraction, automated image processing and data management. This paper presents the design of the novel method, its validation procedures and calibration experiments. The system will be positioned on and recovered from the sea floor using a remotely operated vehicle (ROV). It is able to measure bubble flux rates up to 10 L/min with a maximum error of 33% for worst-case conditions. The Bubblemeter has been successfully deployed at a water depth of 1023 m at the Makran accretionary prism offshore Pakistan during a research expedition with R/V Meteor in November 2007.
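
    A rough sketch of the bubble-extraction idea for a single back-lit frame is given below; the segmentation recipe and the spherical-bubble volume estimate are assumptions for illustration, not the Bubblemeter's published processing chain.

      # Rough sketch of bubble extraction from one back-lit frame and a per-bubble
      # volume estimate. The segmentation recipe and the spherical-bubble assumption
      # are illustrative; they are not the Bubblemeter's actual algorithm.
      import numpy as np
      from skimage.filters import threshold_otsu
      from skimage.measure import label, regionprops

      def bubble_volumes_mm3(frame, mm_per_px):
          """Segment dark bubbles against the bright back-light and return one
          volume estimate per bubble, treating each as a sphere whose radius is
          the area-equivalent radius of its silhouette."""
          dark = frame < threshold_otsu(frame)     # bubbles appear dark
          volumes = []
          for region in regionprops(label(dark)):
              r_mm = np.sqrt(region.area / np.pi) * mm_per_px
              volumes.append(4.0 / 3.0 * np.pi * r_mm ** 3)
          return volumes

      # A flux estimate then follows from summing the volume of each bubble as it
      # first crosses a reference line in the image, divided by the elapsed time;
      # that per-bubble tracking step is omitted here.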

  15. Fluorescence In Situ Hybridization (FISH) Signal Analysis Using Automated Generated Projection Images

    Directory of Open Access Journals (Sweden)

    Xingwei Wang

    2012-01-01

    Fluorescence in situ hybridization (FISH) tests provide promising molecular imaging biomarkers to more accurately and reliably detect and diagnose cancers and genetic disorders. Since current manual FISH signal analysis is inefficient and inconsistent, which limits its clinical utility, developing automated FISH image scanning systems and computer-aided detection (CAD) schemes has been attracting research interest. To acquire high-resolution FISH images in a multi-spectral scanning mode, a huge amount of image data, consisting of stacks of multiple three-dimensional (3-D) image slices, is generated from a single specimen. Automated preprocessing of these scanned images to eliminate non-useful and redundant data is important to make automated FISH tests acceptable in clinical applications. In this study, a dual-detector fluorescence image scanning system was applied to scan four specimen slides with FISH-probed chromosome X. A CAD scheme was developed to detect analyzable interphase cells and to map the FISH-probed signals recorded in the multiple imaging slices into 2-D projection images. The CAD scheme was then applied to each projection image to detect analyzable interphase cells using an adaptive multiple-threshold algorithm, identify FISH-probed signals using a top-hat transform, and compute the ratios between the normal and abnormal cells. To assess CAD performance, the FISH-probed signals were also independently visually detected by an observer. The Kappa coefficients for agreement between CAD and observer ranged from 0.69 to 1.0 in detecting/counting FISH signal spots in four testing samples. The study demonstrated the feasibility of automated FISH signal analysis by applying a CAD scheme to the automatically generated 2-D projection images.
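
    The two preprocessing steps named above, collapsing the 3-D slice stack into a 2-D projection image and enhancing bright FISH spots with a top-hat transform, can be sketched as follows. The maximum-intensity projection, structuring-element size, and threshold are illustrative assumptions rather than the authors' exact scheme.

      # Sketch of the two preprocessing steps described above: collapsing a 3-D
      # image stack into a 2-D maximum-intensity projection, then enhancing bright
      # FISH spots with a top-hat transform. The structuring-element size and the
      # final threshold are illustrative assumptions.
      import numpy as np
      from skimage.morphology import white_tophat, disk

      def max_intensity_projection(stack):
          """stack: array of shape (n_slices, height, width) for one FISH channel."""
          return np.max(np.asarray(stack), axis=0)

      def detect_fish_spots(stack, spot_radius_px=3, rel_threshold=0.5):
          projection = max_intensity_projection(stack)
          # The top-hat keeps bright features smaller than the structuring element,
          # suppressing slowly varying background fluorescence.
          spots = white_tophat(projection, disk(spot_radius_px))
          return spots > rel_threshold * spots.max()   # boolean spot mask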

  16. Automatic Integration Testbeds validation on Open Science Grid

    International Nuclear Information System (INIS)

    Caballero, J; Potekhin, M; Thapa, S; Gardner, R

    2011-01-01

    A recurring challenge in deploying high-quality production middleware is the extent to which realistic testing occurs before release of the software into the production environment. We describe here an automated system for validating releases of the Open Science Grid software stack that leverages the (pilot-based) PanDA job management system developed and used by the ATLAS experiment. The system was motivated by a desire to subject the OSG Integration Testbed to more realistic validation tests, in particular tests which resemble, to every extent possible, actual job workflows used by the experiments, thus exercising job scheduling at the compute element (CE), use of the worker node execution environment, transfer of data to/from the local storage element (SE), etc. The context is that candidate releases of OSG compute and storage elements can be tested by injecting large numbers of synthetic jobs varying in complexity and coverage of services tested. The native capabilities of the PanDA system can thus be used to define jobs, monitor their execution, and archive the resulting run statistics, including success and failure modes. A repository of generic workflows and job types to measure various metrics of interest has been created. A command-line toolset has been developed so that testbed managers can quickly submit 'VO-like' jobs into the system when newly deployed services are ready for testing. A system for automatic submission has been crafted to send jobs to integration testbed sites, collecting the results in a central service and generating regular reports for performance and reliability.

  17. Automatic Integration Testbeds validation on Open Science Grid

    Science.gov (United States)

    Caballero, J.; Thapa, S.; Gardner, R.; Potekhin, M.

    2011-12-01

    A recurring challenge in deploying high-quality production middleware is the extent to which realistic testing occurs before release of the software into the production environment. We describe here an automated system for validating releases of the Open Science Grid software stack that leverages the (pilot-based) PanDA job management system developed and used by the ATLAS experiment. The system was motivated by a desire to subject the OSG Integration Testbed to more realistic validation tests, in particular tests which resemble, to every extent possible, actual job workflows used by the experiments, thus exercising job scheduling at the compute element (CE), use of the worker node execution environment, transfer of data to/from the local storage element (SE), etc. The context is that candidate releases of OSG compute and storage elements can be tested by injecting large numbers of synthetic jobs varying in complexity and coverage of services tested. The native capabilities of the PanDA system can thus be used to define jobs, monitor their execution, and archive the resulting run statistics, including success and failure modes. A repository of generic workflows and job types to measure various metrics of interest has been created. A command-line toolset has been developed so that testbed managers can quickly submit "VO-like" jobs into the system when newly deployed services are ready for testing. A system for automatic submission has been crafted to send jobs to integration testbed sites, collecting the results in a central service and generating regular reports for performance and reliability.

  18. The DataTAG transatlantic testbed

    CERN Document Server

    Martin, O; Martin-Flatin, J P; Moroni, P; Nae, D; Newman, H; Ravot, S

    2005-01-01

    Wide area network testbeds allow researchers and engineers to test out new equipment, protocols and services in real-life situations, without jeopardizing the stability and reliability of production networks. The Data TransAtlantic Grid (DataTAG) testbed, deployed in 2002 between CERN, Geneva, Switzerland and StarLight, Chicago, IL, USA, is probably the largest testbed built to date. Jointly managed by CERN and Caltech, it is funded by the European Commission, the U.S. Department of Energy and the U.S. National Science Foundation. The main objectives of this testbed are to improve the Grid community's understanding of the networking issues posed by data-intensive Grid applications over transoceanic gigabit networks, design and develop new Grid middleware services, and improve the interoperability of European and U.S. Grid applications in High-Energy and Nuclear Physics. In this paper, we give an overview of this testbed, describe its various topologies over time, and summarize the main lessons learned after...

  19. A demonstration of remote survey and characterization of a buried waste site using the SRIP [Soldier Robot Interface Project] testbed

    International Nuclear Information System (INIS)

    Burks, B.L.; Richardson, B.S.; Armstrong, G.A.; Hamel, W.R.; Jansen, J.F.; Killough, S.M.; Thompson, D.H.; Emery, M.S.

    1990-01-01

    During FY 1990, the Oak Ridge National Laboratory (ORNL) supported the Department of Energy (DOE) Environmental Restoration and Waste Management (ER&WM) Office of Technology Development through several projects including the development of a semiautonomous survey of a buried waste site using a remotely operated all-terrain robotic testbed borrowed from the US Army. The testbed was developed for the US Army's Human Engineering Laboratory (HEL) for the US Army's Soldier Robot Interface Project (SRIP). Initial development of the SRIP testbed was performed by a team including ORNL, HEL, Tooele Army Depot, and Odetics, Inc., as an experimental testbed for a variety of human factors issues related to military applications of robotics. The SRIP testbed was made available to the DOE and ORNL for the further development required for a remote landfill survey. The robot was modified extensively, equipped with environmental sensors, and used to demonstrate an automated remote survey of Solid Waste Storage Area No. 3 (SWSA 3) at ORNL on Tuesday, September 18, 1990. Burial trenches in this area containing contaminated materials were covered with soil nearly twenty years ago. This paper describes the SRIP testbed and work performed in FY 1990 to demonstrate a semiautonomous landfill survey at ORNL. 5 refs

  20. Space Station technology testbed: 2010 deep space transport

    Science.gov (United States)

    Holt, Alan C.

    1993-01-01

    A space station in a crew-tended or permanently crewed configuration will provide major R&D opportunities for innovative technology and materials development and advanced space systems testing. A space station should be designed with the basic infrastructure elements required to grow into a major systems technology testbed. This space-based technology testbed can and should be used to support the development of technologies required to expand our utilization of near-Earth space, the Moon and the Earth-to-Jupiter region of the Solar System. Space station support of advanced technology and materials development will result in new techniques for high-priority scientific research and the knowledge and R&D base needed for the development of major new commercial product thrusts. To illustrate the technology testbed potential of a space station and to point the way to a bold, innovative approach to advanced space systems development, a hypothetical deep space transport development and test plan is described. Key deep space transport R&D activities are described that would lead to the readiness certification of an advanced, reusable interplanetary transport capable of supporting eight crewmembers or more. With the support of a focused and highly motivated, multi-agency ground R&D program, a deep space transport of this type could be assembled and tested by 2010. Key R&D activities on a space station would include: (1) experimental research investigating the microgravity-assisted restructuring of micro-engineered materials (to develop and verify the in-space and in-situ 'tuning' of materials for use in debris and radiation shielding and other protective systems), (2) exposure of micro-engineered materials to the space environment for passive and operational performance tests (to develop in-situ maintenance and repair techniques and to support the development, enhancement, and implementation of protective systems, data and bio-processing systems, and virtual reality and

  1. The Impact of Automation Reliability and Operator Fatigue on Performance and Reliance

    Science.gov (United States)

    2016-09-23

    Cummings et al., 2007). Automation designed to assist operators in overload situations may promote operator disengagement during periods of low...Calhoun et al., 2011). This testbed offers several tasks designed to emulate the cognitive demands that an operator managing multiple UAVs is likely...reliable (Cronbach’s α = 0.94) measure of affective and cognitive components of trust in automation. Items gauge confidence in an automation and

  2. Benchmarking Diagnostic Algorithms on an Electrical Power System Testbed

    Science.gov (United States)

    Kurtoglu, Tolga; Narasimhan, Sriram; Poll, Scott; Garcia, David; Wright, Stephanie

    2009-01-01

    Diagnostic algorithms (DAs) are key to enabling automated health management. These algorithms are designed to detect and isolate anomalies of either a component or the whole system based on observations received from sensors. In recent years a wide range of algorithms, both model-based and data-driven, have been developed to increase autonomy and improve system reliability and affordability. However, the lack of support to perform systematic benchmarking of these algorithms continues to create barriers for effective development and deployment of diagnostic technologies. In this paper, we present our efforts to benchmark a set of DAs on a common platform using a framework that was developed to evaluate and compare various performance metrics for diagnostic technologies. The diagnosed system is an electrical power system, namely the Advanced Diagnostics and Prognostics Testbed (ADAPT) developed and located at the NASA Ames Research Center. The paper presents the fundamentals of the benchmarking framework, the ADAPT system, description of faults and data sets, the metrics used for evaluation, and an in-depth analysis of benchmarking results obtained from testing ten diagnostic algorithms on the ADAPT electrical power system testbed.
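
    Benchmarking of this kind ultimately reduces to scoring each diagnostic algorithm's output against the ground truth of an injected fault. The record format and the simple metrics below are illustrative assumptions, not the definitions used by the ADAPT benchmarking framework.

      # Simplified illustration of scoring a diagnostic algorithm (DA) against the
      # ground truth of a fault-injection run. The record format and metric
      # definitions are assumptions, not those of the ADAPT benchmarking framework.
      def score_run(ground_truth, detections):
          """ground_truth: {"fault_time": float or None, "fault_component": str or None}
          detections: list of (time, component) pairs reported by the DA."""
          metrics = {"false_alarm": False, "detected": False,
                     "detection_latency": None, "isolation_correct": False}
          if ground_truth["fault_time"] is None:         # nominal (no-fault) scenario
              metrics["false_alarm"] = bool(detections)
              return metrics
          for t, component in sorted(detections):
              if t < ground_truth["fault_time"]:
                  metrics["false_alarm"] = True          # reported before injection
                  continue
              metrics["detected"] = True
              metrics["detection_latency"] = t - ground_truth["fault_time"]
              metrics["isolation_correct"] = component == ground_truth["fault_component"]
              break
          return metrics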

  3. A Testbed Environment for Buildings-to-Grid Cyber Resilience Research and Development

    Energy Technology Data Exchange (ETDEWEB)

    Sridhar, Siddharth; Ashok, Aditya; Mylrea, Michael E.; Pal, Seemita; Rice, Mark J.; Gourisetti, Sri Nikhil Gup

    2017-09-19

    The Smart Grid is characterized by the proliferation of advanced digital controllers at all levels of its operational hierarchy from generation to end consumption. Such controllers within modern residential and commercial buildings enable grid operators to exercise fine-grained control over energy consumption through several emerging Buildings-to-Grid (B2G) applications. Though this capability promises significant benefits in terms of operational economics and improved reliability, cybersecurity weaknesses in the supporting infrastructure could be exploited to cause a detrimental effect and this necessitates focused research efforts on two fronts. First, the understanding of how cyber attacks in the B2G space could impact grid reliability and to what extent. Second, the development and validation of cyber-physical application-specific countermeasures that are complementary to traditional infrastructure cybersecurity mechanisms for enhanced cyber attack detection and mitigation. The PNNL B2G testbed is currently being developed to address these core research needs. Specifically, the B2G testbed combines high-fidelity buildings+grid simulators, industry-grade building automation and Supervisory Control and Data Acquisition (SCADA) systems in an integrated, realistic, and reconfigurable environment capable of supporting attack-impact-detection-mitigation experimentation. In this paper, we articulate the need for research testbeds to model various B2G applications broadly by looking at the end-to-end operational hierarchy of the Smart Grid. Finally, the paper not only describes the architecture of the B2G testbed in detail, but also addresses the broad spectrum of B2G resilience research it is capable of supporting based on the smart grid operational hierarchy identified earlier.

  4. Fast Physics Testbed for the FASTER Project

    Energy Technology Data Exchange (ETDEWEB)

    Lin, W.; Liu, Y.; Hogan, R.; Neggers, R.; Jensen, M.; Fridlind, A.; Lin, Y.; Wolf, A.

    2010-03-15

    This poster describes the Fast Physics Testbed for the new FAst-physics System Testbed and Research (FASTER) project. The overall objective is to provide a convenient and comprehensive platform for fast turn-around model evaluation against ARM observations and to facilitate development of parameterizations for cloud-related fast processes represented in global climate models. The testbed features three major components: a single column model (SCM) testbed, an NWP-Testbed, and high-resolution modeling (HRM). The web-based SCM-Testbed features multiple SCMs from major climate modeling centers and aims to maximize the potential of the SCM approach to enhance and accelerate the evaluation and improvement of fast physics parameterizations through continuous evaluation of existing and evolving models against historical as well as new/improved ARM and other complementary measurements. The NWP-Testbed aims to capitalize on the large pool of operational numerical weather prediction products. Continuous evaluations of NWP forecasts against observations at ARM sites are carried out to systematically identify the biases and skills of physical parameterizations under all weather conditions. The high-resolution modeling (HRM) activities aim to simulate the fast processes at high resolution to aid in the understanding of the fast processes and their parameterizations. A four-tier HRM framework is established to augment the SCM- and NWP-Testbeds towards eventual improvement of the parameterizations.

  5. A remote integrated testbed for cooperating objects

    CERN Document Server

    Dios, Jose Ramiro Martinez-de; Bernabe, Alberto de San; Ollero, Anibal

    2013-01-01

    Testbeds are gaining increasing relevance in research domains and also in industrial applications. However, very few books devoted to testbeds have been published, and to the best of my knowledge no book on this particular topic has appeared before. This book is particularly interesting for the growing community of testbed developers. I believe the book is also very interesting for researchers in robot-WSN cooperation. This book provides a detailed description of a system that can be considered the first testbed that allows full peer-to-peer interoperability between heterogeneous robots and ubiquitous systems su

  6. A Business-to-Business Interoperability Testbed: An Overview

    Energy Technology Data Exchange (ETDEWEB)

    Kulvatunyou, Boonserm [ORNL]; Ivezic, Nenad [ORNL]; Monica, Martin [Sun Microsystems, Inc.]; Jones, Albert [National Institute of Standards and Technology (NIST)]

    2003-10-01

    In this paper, we describe a business-to-business (B2B) testbed co-sponsored by the Open Applications Group, Inc. (OAGI) and the National Institute of Standards and Technology (NIST) to advance enterprise e-commerce standards. We describe the business and technical objectives and initial activities within the B2B Testbed. We summarize our initial lessons learned to form the requirements that drive the next generation testbed development. We also give an overview of a promising testing framework architecture in which to drive the testbed developments. We outline the future plans for the testbed development.

  7. An agent-oriented approach to automated mission operations

    Science.gov (United States)

    Truszkowski, Walt; Odubiyi, Jide

    1994-01-01

    As we plan for the next generation of Mission Operations Control Center (MOCC) systems, there are many opportunities for the increased utilization of innovative knowledge-based technologies. The innovative technology discussed is an advanced use of agent-oriented approaches to the automation of mission operations. The paper presents an overview of this technology and discusses applied operational scenarios currently being investigated and prototyped. A major focus of the current work is the development of a simple user mechanism that would empower operations staff members to create, in real time, software agents to assist them in common, labor intensive operations tasks. These operational tasks would include: handling routine data and information management functions; amplifying the capabilities of a spacecraft analyst/operator to rapidly identify, analyze, and correct spacecraft anomalies by correlating complex data/information sets and filtering error messages; improving routine monitoring and trend analysis by detecting common failure signatures; and serving as a sentinel for spacecraft changes during critical maneuvers enhancing the system's capabilities to support nonroutine operational conditions with minimum additional staff. An agent-based testbed is under development. This testbed will allow us to: (1) more clearly understand the intricacies of applying agent-based technology in support of the advanced automation of mission operations and (2) access the full set of benefits that can be realized by the proper application of agent-oriented technology in a mission operations environment. The testbed under development addresses some of the data management and report generation functions for the Explorer Platform (EP)/Extreme UltraViolet Explorer (EUVE) Flight Operations Team (FOT). We present an overview of agent-oriented technology and a detailed report on the operations concept for the testbed.

  8. An Optimized Autonomous Space In-situ Sensorweb (OASIS) for Volcano Monitoring

    Science.gov (United States)

    Song, W.; Shirazi, B.; Lahusen, R.; Chien, S.; Kedar, S.; Webb, F.

    2006-12-01

    In response to NASA's announced requirement for Earth hazard monitoring sensor-web technology, we are developing a prototype real-time Optimized Autonomous Space In-situ Sensorweb. The prototype will be focused on volcano hazard monitoring at Mount St. Helens, which has been in continuous eruption since October 2004. The system is designed to be flexible and easily configurable for many other applications as well. The primary goals of the project are: 1) integrating complementary space (i.e., Earth Observing One (EO-1) satellite) and in-situ (ground-based) elements into an interactive, autonomous sensor-web; 2) advancing sensor-web power and communication resource management technology; and 3) enabling scalability for seamless infusion of future space and in-situ assets into the sensor-web. To meet these goals, we are developing: 1) a test-bed in-situ array with smart sensor nodes capable of making autonomous data acquisition decisions; 2) an efficient self-organization algorithm for the sensor-web topology to support efficient data communication and command control; 3) smart bandwidth allocation algorithms in which sensor nodes autonomously determine packet priorities based on mission needs and local bandwidth information in real-time; and 4) remote network management and reprogramming tools. The space and in-situ control components of the system will be integrated such that each element is capable of triggering the other. Sensor-web data acquisition and dissemination will be accomplished through the use of SensorML language standards for geospatial information. The three-year project will demonstrate end-to-end system performance with the in-situ test-bed at Mount St. Helens and NASA's EO-1 platform.
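
    The bandwidth-allocation idea above (each node locally ranking queued packets by mission-driven priority and transmitting only what the currently available bandwidth allows) can be illustrated with a minimal sketch; the priority model and packet fields are assumptions made for illustration.

      # Minimal sketch of priority-based bandwidth allocation at a single sensor
      # node: rank queued packets by a mission-driven priority and transmit greedily
      # within the bandwidth currently available. The priority model and packet
      # fields are illustrative assumptions.
      from dataclasses import dataclass

      @dataclass
      class Packet:
          size_bytes: int
          mission_weight: float   # e.g. higher for eruption-related event data
          age_s: float            # time spent in the queue

      def priority(pkt: Packet) -> float:
          # Older, mission-critical packets first; the weighting is assumed.
          return pkt.mission_weight * (1.0 + 0.1 * pkt.age_s)

      def select_for_transmission(queue, available_bytes):
          """Greedy selection of queued packets within the local bandwidth budget."""
          chosen, used = [], 0
          for pkt in sorted(queue, key=priority, reverse=True):
              if used + pkt.size_bytes <= available_bytes:
                  chosen.append(pkt)
                  used += pkt.size_bytes
          return chosen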

  9. Improving Flight Software Module Validation Efforts : a Modular, Extendable Testbed Software Framework

    Science.gov (United States)

    Lange, R. Connor

    2012-01-01

    Ever since Explorer-1, the United States' first Earth satellite, was developed and launched in 1958, JPL has developed many more spacecraft, including landers and orbiters. While these spacecraft vary greatly in their missions, capabilities, and destinations, they all have something in common: all of their components had to be comprehensively tested. While thorough testing is important to mitigate risk, it is also a very expensive and time-consuming process. Thankfully, since virtually all of the software testing procedures for SMAP are computer controlled, these procedures can be automated. Most people testing SMAP flight software (FSW) would only need to write tests that exercise specific requirements and then check the filtered results to verify everything occurred as planned. This gives developers the ability to automatically launch tests on the testbed, distill the resulting logs into only the important information, generate validation documentation, and then deliver the documentation to management. With many of the steps in FSW testing automated, developers can use their limited time more effectively, validate SMAP FSW modules more quickly, and test them more rigorously. As a result of the various benefits of automating much of the testing process, management is considering the use of these automated tools in future FSW validation efforts.

  10. Application of automation for low cost aircraft cabin simulator

    NARCIS (Netherlands)

    Tan, C.F.; Chen, W.; Boomen, van den G.J.A.; Rauterberg, G.W.M.

    2010-01-01

    This paper presents an application of automation for a low cost aircraft cabin simulator. The aircraft cabin simulator is a testbed that was designed for research on aircraft passenger comfort improvement products. The simulator consists of an economy class section, a business class section, a lavatory

  11. Automated image analysis for quantitative fluorescence in situ hybridization with environmental samples.

    Science.gov (United States)

    Zhou, Zhi; Pons, Marie Noëlle; Raskin, Lutgarde; Zilles, Julie L

    2007-05-01

    When fluorescence in situ hybridization (FISH) analyses are performed with complex environmental samples, difficulties related to the presence of microbial cell aggregates and nonuniform background fluorescence are often encountered. The objective of this study was to develop a robust and automated quantitative FISH method for complex environmental samples, such as manure and soil. The method and duration of sample dispersion were optimized to reduce the interference of cell aggregates. An automated image analysis program that detects cells from 4',6'-diamidino-2-phenylindole (DAPI) micrographs and extracts the maximum and mean fluorescence intensities for each cell from corresponding FISH images was developed with the software Visilog. Intensity thresholds were not consistent even for duplicate analyses, so alternative ways of classifying signals were investigated. In the resulting method, the intensity data were divided into clusters using fuzzy c-means clustering, and the resulting clusters were classified as target (positive) or nontarget (negative). A manual quality control confirmed this classification. With this method, 50.4, 72.1, and 64.9% of the cells in two swine manure samples and one soil sample, respectively, were positive as determined with a 16S rRNA-targeted bacterial probe (S-D-Bact-0338-a-A-18). Manual counting resulted in corresponding values of 52.3, 70.6, and 61.5%, respectively. In two swine manure samples and one soil sample 21.6, 12.3, and 2.5% of the cells were positive with an archaeal probe (S-D-Arch-0915-a-A-20), respectively. Manual counting resulted in corresponding values of 22.4, 14.0, and 2.9%, respectively. This automated method should facilitate quantitative analysis of FISH images for a variety of complex environmental samples.
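
    The clustering step described above can be illustrated with a minimal fuzzy c-means sketch over per-cell intensities; the implementation below is a generic textbook version with assumed parameters, not the Visilog-based pipeline of the study.

      # Minimal fuzzy c-means sketch for splitting per-cell FISH intensities into
      # target (positive) and non-target (negative) clusters, in the spirit of the
      # method above. Parameter values are illustrative only.
      import numpy as np

      def fuzzy_cmeans_1d(x, c=2, m=2.0, n_iter=100, tol=1e-5, seed=0):
          """x: 1-D array of per-cell fluorescence intensities."""
          rng = np.random.default_rng(seed)
          u = rng.random((c, len(x)))
          u /= u.sum(axis=0)                        # random initial fuzzy memberships
          for _ in range(n_iter):
              um = u ** m
              centers = um @ x / um.sum(axis=1)     # membership-weighted cluster centers
              d = np.abs(x[None, :] - centers[:, None]) + 1e-12
              u_new = 1.0 / (d ** (2.0 / (m - 1.0)))
              u_new /= u_new.sum(axis=0)            # standard FCM membership update
              converged = np.abs(u_new - u).max() < tol
              u = u_new
              if converged:
                  break
          return centers, u

      def classify_cells(intensities):
          centers, u = fuzzy_cmeans_1d(np.asarray(intensities, dtype=float))
          positive_cluster = int(np.argmax(centers))   # brighter cluster = probe-positive
          return u[positive_cluster] > 0.5             # boolean per-cell classification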

  12. 77 FR 18793 - Spectrum Sharing Innovation Test-Bed Pilot Program

    Science.gov (United States)

    2012-03-28

    .... 120322212-2212-01] Spectrum Sharing Innovation Test-Bed Pilot Program AGENCY: National Telecommunications... Innovation Test-Bed pilot program to assess whether devices employing Dynamic Spectrum Access techniques can... Spectrum Sharing Innovation Test-Bed (Test-Bed) pilot program to examine the feasibility of increased...

  13. Development of a space-systems network testbed

    Science.gov (United States)

    Lala, Jaynarayan; Alger, Linda; Adams, Stuart; Burkhardt, Laura; Nagle, Gail; Murray, Nicholas

    1988-01-01

    This paper describes a communications network testbed which has been designed to allow the development of architectures and algorithms that meet the functional requirements of future NASA communication systems. The central hardware components of the Network Testbed are programmable circuit-switching communication nodes which can be adapted by software or firmware changes to customize the testbed to particular architectures and algorithms. Fault detection, isolation, and reconfiguration have been implemented in the Network with a hybrid approach which utilizes features of both centralized and distributed techniques to provide efficient handling of faults within the Network.

  14. User interface design principles for the SSM/PMAD automated power system

    Science.gov (United States)

    Jakstas, Laura M.; Myers, Chris J.

    1991-01-01

    Martin Marietta has developed a user interface for the space station module power management and distribution (SSM/PMAD) automated power system testbed which provides human access to the functionality of the power system, as well as exemplifying current techniques in user interface design. The testbed user interface was designed to enable an engineer to operate the system easily without having significant knowledge of computer systems, as well as provide an environment in which the engineer can monitor and interact with the SSM/PMAD system hardware. The design of the interface supports a global view of the most important data from the various hardware and software components, as well as enabling the user to obtain additional or more detailed data when needed. The components and representations of the SSM/PMAD testbed user interface are examined. An engineer's interactions with the system are also described.

  15. The design and implementation of the LLNL gigabit testbed

    Energy Technology Data Exchange (ETDEWEB)

    Garcia, D. [Lawrence Livermore National Labs., CA (United States)

    1994-12-01

    This paper will look at the design and implementation of the LLNL Gigabit testbed (LGTB), where various high-speed networking products can be tested in one environment. The paper will discuss the philosophy behind the design of, and the need for, the testbed, the tests that are performed in the testbed, and the tools used to implement those tests.

  16. Virtual Factory Testbed

    Data.gov (United States)

    Federal Laboratory Consortium — The Virtual Factory Testbed (VFT) is comprised of three physical facilities linked by a standalone network (VFNet). The three facilities are the Smart and Wireless...

  17. Optimized Autonomous Space In-situ Sensor-Web for volcano monitoring

    Science.gov (United States)

    Song, W.-Z.; Shirazi, B.; Kedar, S.; Chien, S.; Webb, F.; Tran, D.; Davis, A.; Pieri, D.; LaHusen, R.; Pallister, J.; Dzurisin, D.; Moran, S.; Lisowski, M.

    2008-01-01

    In response to NASA's announced requirement for Earth hazard monitoring sensor-web technology, a multidisciplinary team involving sensor-network experts (Washington State University), space scientists (JPL), and Earth scientists (USGS Cascade Volcano Observatory (CVO)) is developing a prototype dynamic and scalable hazard monitoring sensor-web and applying it to volcano monitoring. The combined Optimized Autonomous Space In-situ Sensor-web (OASIS) will have two-way communication capability between ground and space assets, use both space and ground data for optimal allocation of limited power and bandwidth resources on the ground, and use smart management of competing demands for limited space assets. It will also enable scalability and seamless infusion of future space and in-situ assets into the sensor-web. The prototype will be focused on volcano hazard monitoring at Mount St. Helens, which has been active since October 2004. The system is designed to be flexible and easily configurable for many other applications as well. The primary goals of the project are: 1) integrating complementary space (i.e., Earth Observing One (EO-1) satellite) and in-situ (ground-based) elements into an interactive, autonomous sensor-web; 2) advancing sensor-web power and communication resource management technology; and 3) enabling scalability for seamless infusion of future space and in-situ assets into the sensor-web. To meet these goals, we are developing: 1) a test-bed in-situ array with smart sensor nodes capable of making autonomous data acquisition decisions; 2) an efficient self-organization algorithm for the sensor-web topology to support efficient data communication and command control; 3) smart bandwidth allocation algorithms in which sensor nodes autonomously determine packet priorities based on mission needs and local bandwidth information in real-time; and 4) remote network management and reprogramming tools. The space and in-situ control components of the system will be

  18. Advanced Artificial Intelligence Technology Testbed

    Science.gov (United States)

    Anken, Craig S.

    1993-01-01

    The Advanced Artificial Intelligence Technology Testbed (AAITT) is a laboratory testbed for the design, analysis, integration, evaluation, and exercising of large-scale, complex, software systems, composed of both knowledge-based and conventional components. The AAITT assists its users in the following ways: configuring various problem-solving application suites; observing and measuring the behavior of these applications and the interactions between their constituent modules; gathering and analyzing statistics about the occurrence of key events; and flexibly and quickly altering the interaction of modules within the applications for further study.

  19. High Precision Testbed to Evaluate Ethernet Performance for In-Car Networks

    DEFF Research Database (Denmark)

    Revsbech, Kasper; Madsen, Tatiana Kozlova; Schiøler, Henrik

    2012-01-01

    Validating safety-critical real-time systems such as in-car networks often involves a model-based performance analysis of the network. An important issue in performing such analysis is to provide precise model parameters matching the actual equipment. One way to obtain such parameters is to derive them by measurements of the equipment. In this work we describe the design of a testbed enabling active measurements on up to 1 Gb/s copper-based Ethernet switches. By use of the testbed itself, we conduct a series of tests where the precision of the testbed is estimated. We find a maximum error...

  20. The Science of Home Automation

    Science.gov (United States)

    Thomas, Brian Louis

    Smart home technologies and the concept of home automation have become more popular in recent years. This popularity has been accompanied by social acceptance of passive sensors installed throughout the home. The subsequent increase in smart homes facilitates the creation of home automation strategies. We believe that home automation strategies can be generated intelligently by utilizing smart home sensors and activity learning. In this dissertation, we hypothesize that home automation can benefit from activity awareness. To test this, we develop our activity-aware smart automation system, CARL (CASAS Activity-aware Resource Learning). CARL learns the associations between activities and device usage from historical data and utilizes the activity-aware capabilities to control the devices. To help validate CARL we deploy and test three different versions of the automation system in a real-world smart environment. To provide a foundation of activity learning, we integrate existing activity recognition and activity forecasting into CARL home automation. We also explore two alternatives to using human-labeled data to train the activity learning models. The first unsupervised method is Activity Detection, and the second is a modified DBSCAN algorithm that utilizes Dynamic Time Warping (DTW) as a distance metric. We compare the performance of activity learning with human-defined labels and with automatically-discovered activity categories. To provide evidence in support of our hypothesis, we evaluate CARL automation in a smart home testbed. Our results indicate that home automation can be boosted through activity awareness. We also find that the resulting automation has a high degree of usability and comfort for the smart home resident.
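
    The alternative clustering approach mentioned above, DBSCAN with a Dynamic Time Warping distance, can be sketched as follows; the DTW cost, eps, and min_samples values are illustrative assumptions rather than the dissertation's tuned settings.

      # Sketch of the alternative clustering idea mentioned above: DBSCAN over a
      # precomputed Dynamic Time Warping (DTW) distance matrix between sensor-event
      # sequences. Parameter values are illustrative assumptions.
      import numpy as np
      from sklearn.cluster import DBSCAN

      def dtw_distance(a, b):
          """Classic O(len(a)*len(b)) DTW cost between two 1-D numeric sequences."""
          n, m = len(a), len(b)
          cost = np.full((n + 1, m + 1), np.inf)
          cost[0, 0] = 0.0
          for i in range(1, n + 1):
              for j in range(1, m + 1):
                  d = abs(a[i - 1] - b[j - 1])
                  cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
          return cost[n, m]

      def cluster_sequences(sequences, eps=5.0, min_samples=3):
          n = len(sequences)
          dist = np.zeros((n, n))
          for i in range(n):
              for j in range(i + 1, n):
                  dist[i, j] = dist[j, i] = dtw_distance(sequences[i], sequences[j])
          # DBSCAN accepts a precomputed distance matrix directly.
          return DBSCAN(eps=eps, min_samples=min_samples, metric="precomputed").fit_predict(dist)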

  1. Environment Emulation For Wsn Testbed

    Directory of Open Access Journals (Sweden)

    Radosław Kapłoniak

    2012-01-01

    The development of applications for wireless sensor networks is a challenging task. For this reason, several testbed platforms have been created. They simplify the manageability of nodes by offering easy ways of programming and debugging sensor nodes. These platforms, sometimes composed of dozens of sensors, provide a convenient way for carrying out research on medium access control and data exchange between nodes. In this article, we propose an extension of the WSN testbed which could be used for evaluating and testing the functionality of sensor network applications by emulating a real-world environment.

  2. Nuclear Instrumentation and Control Cyber Testbed Considerations – Lessons Learned

    Energy Technology Data Exchange (ETDEWEB)

    Jonathan Gray; Robert Anderson; Julio G. Rodriguez; Cheol-Kwon Lee

    2014-08-01

    Abstract: Identifying and understanding digital instrumentation and control (I&C) cyber vulnerabilities within nuclear power plants and other nuclear facilities is critical if nation states desire to operate nuclear facilities safely, reliably, and securely. In order to demonstrate objective evidence that cyber vulnerabilities have been adequately identified and mitigated, a testbed representing a facility's critical nuclear equipment must be replicated. Idaho National Laboratory (INL) has built and operated similar testbeds for common critical infrastructure I&C for over ten years. This experience developing, operating, and maintaining an I&C testbed in support of research identifying cyber vulnerabilities has led the Korea Atomic Energy Research Institute (KAERI) of the Republic of Korea to solicit the experiences of INL to help mitigate problems early in the design, development, operation, and maintenance of a similar testbed. The following information will discuss I&C testbed lessons learned and the impact of these experiences on KAERI.

  3. SSERVI Analog Regolith Simulant Testbed Facility

    Science.gov (United States)

    Minafra, J.; Schmidt, G. K.

    2016-12-01

    SSERVI's goals include supporting planetary researchers within NASA and other government agencies; private-sector and hardware developers; competitors in focused prize design competitions; and academic-sector researchers. The SSERVI Analog Regolith Simulant Testbed provides opportunities for research scientists and engineers to carry out regolith analog testbed research in the planetary exploration field. This capability is essential to help understand the basic effects of continued long-term exposure to a simulated analog test environment. The current facility houses approximately eight tons of JSC-1A lunar regolith simulant in a test bin consisting of a 4 meter by 4 meter area. SSERVI provides a bridge between several groups, joining together researchers from: 1) scientific and exploration communities, 2) multiple disciplines across a wide range of planetary sciences, and 3) domestic and international communities and partnerships. This testbed provides a means of consolidating the tasks of acquisition, storage and safety mitigation in handling large quantities of regolith simulant. Facility hardware and environment testing scenarios include, but are not limited to, the following: lunar surface mobility, dust exposure and mitigation, regolith handling and excavation, solar-like illumination, lunar surface compaction profile, lofted dust, mechanical properties of lunar regolith, and surface features (i.e., grades and rocks). Benefits range from easy access to a controlled analog regolith simulant testbed and planetary exploration activities at NASA Research Park, to academia and expanded commercial opportunities in California's Silicon Valley, as well as public outreach and education opportunities.

  4. Exploration Systems Health Management Facilities and Testbed Workshop

    Science.gov (United States)

    Wilson, Scott; Waterman, Robert; McCleskey, Carey

    2004-01-01

    Presentation Agenda: (1) Technology Maturation Pipeline (The Plan); (2) Cryogenic Testbed (and other KSC Labs); (2a) Component / Subsystem technologies; (3) Advanced Technology Development Center (ATDC); (3a) System / Vehicle technologies; (4) ELV Flight Experiments (Flight Testbeds).

  5. Building a framework to manage trust in automation

    Science.gov (United States)

    Metcalfe, J. S.; Marathe, A. R.; Haynes, B.; Paul, V. J.; Gremillion, G. M.; Drnec, K.; Atwater, C.; Estepp, J. R.; Lukos, J. R.; Carter, E. C.; Nothwang, W. D.

    2017-05-01

    All automations must, at some point in their lifecycle, interface with one or more humans. Whether operators, end-users, or bystanders, human responses can determine the perceived utility and acceptance of an automation. It has long been believed that human trust is a primary determinant of human-automation interactions, and it is further presumed that calibrating trust can lead to appropriate choices regarding automation use. However, attempts to improve joint system performance by calibrating trust have not yet provided a generalizable solution. To address this, we identified several factors limiting the direct integration of trust, or metrics thereof, into an active mitigation strategy. The present paper outlines our approach to addressing this important issue, its conceptual underpinnings, and the practical challenges encountered in execution. Among the most critical outcomes has been a shift in focus from trust to basic interaction behaviors and their antecedent decisions. This change in focus inspired the development of a testbed and paradigm that were deployed in two experiments on human interactions with driving automation, executed in an immersive, full-motion simulation environment. Moreover, by integrating a behavior- and physiology-based predictor within a novel consequence-based control system, we demonstrated that it is possible to anticipate particular interaction behaviors and influence humans towards more optimal choices about automation use in real time. Importantly, this research provides a fertile foundation for the development and integration of advanced, wearable technologies for sensing and inferring critical state variables for better integration of human elements into otherwise fully autonomous systems.

  6. Integrating Simulated Physics and Device Virtualization in Control System Testbeds

    OpenAIRE

    Redwood , Owen; Reynolds , Jason; Burmester , Mike

    2016-01-01

    Part 3: INFRASTRUCTURE MODELING AND SIMULATION; International audience; Malware and forensic analyses of embedded cyber-physical systems are tedious, manual processes that testbeds are commonly not designed to support. Additionally, attesting the physics impact of embedded cyber-physical system malware has no formal methodologies and is currently an art. This chapter describes a novel testbed design methodology that integrates virtualized embedded industrial control systems and physics simula...

  7. Implementation of standard testbeds for numerical relativity

    Energy Technology Data Exchange (ETDEWEB)

    Babiuc, M C [Department of Physics and Physical Science, Marshall University, Huntington, WV 25755 (United States); Husa, S [Friedrich Schiller University Jena, Max-Wien-Platz 1, 07743 Jena (Germany); Alic, D [Department of Physics, University of the Balearic Islands, Cra Valldemossa km 7.5, 07122 Palma de Mallorca (Spain); Hinder, I [Center for Gravitational Wave Physics, Pennsylvania State University, University Park, PA 16802 (United States); Lechner, C [Weierstrass Institute for Applied Analysis and Stochastics (WIAS), Mohrenstrasse 39, 10117 Berlin (Germany); Schnetter, E [Center for Computation and Technology, 216 Johnston Hall, Louisiana State University, Baton Rouge, LA 70803 (United States); Szilagyi, B; Dorband, N; Pollney, D; Winicour, J [Max-Planck-Institut fuer Gravitationsphysik (Albert-Einstein-Institut), Am Muehlenberg 1, 14076 Golm (Germany); Zlochower, Y [Center for Computational Relativity and Gravitation, School of Mathematical Sciences, Rochester Institute of Technology, 78 Lomb Memorial Drive, Rochester, New York 14623 (United States)

    2008-06-21

    We discuss results that have been obtained from the implementation of the initial round of testbeds for numerical relativity which was proposed in the first paper of the Apples with Apples Alliance. We present benchmark results for various codes, which provide templates for analyzing the testbeds and for drawing conclusions about various features of the codes. This allows us to sharpen the initial test specifications, design a new test, and add theoretical insight.

  8. CanOpen on RASTA: The Integration of the CanOpen IP Core in the Avionics Testbed

    Science.gov (United States)

    Furano, Gianluca; Guettache, Farid; Magistrati, Giorgio; Tiotto, Gabriele; Ortega, Carlos Urbina; Valverde, Alberto

    2013-08-01

    This paper presents the work done within the ESA ESTEC Data Systems Division, targeting the integration of the CANopen IP core with the existing Reference Architecture Test-bed for Avionics (RASTA). RASTA is the reference testbed system of the ESA Avionics Lab, designed to integrate the main elements of a typical data handling system. It simulates a scenario where a Mission Control Center communicates with on-board computers and systems through a TM/TC link, providing data management through qualified processors and interfaces such as LEON2 core processors, CAN bus controllers, MIL-STD-1553 and SpaceWire. This activity extends RASTA with two boards equipped with the HurriCANe controller, acting as CANopen slaves. CANopen software modules have been ported to the RASTA system I/O boards equipped with the Gaisler GR-CAN controller, which act as the master communicating with the CCIPC boards. CANopen serves as the upper application layer for systems based on CAN, is defined within the CAN-in-Automation (CiA) standards, and can be regarded as the definitive standard for the implementation of CAN-based system solutions. The development and integration of CCIPC, performed by SITAEL S.p.A., is the first application that aims to bring the CANopen standard to space applications. The definition of CANopen within the European Cooperation for Space Standardization (ECSS) is under development.
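
    For readers unfamiliar with CANopen framing, the sketch below illustrates how two basic CANopen services map onto 11-bit CAN identifiers following the usual CiA 301 conventions (heartbeat on COB-ID 0x700 + node-ID, SDO requests on 0x600 + node-ID). It is a generic illustration, not the CCIPC or RASTA software, and the object index used is a made-up manufacturer-specific entry.

```python
# Illustrative sketch (not the RASTA/CCIPC code): how CANopen maps services
# onto 11-bit CAN identifiers, per the usual CiA 301 conventions.

def heartbeat_frame(node_id: int, nmt_state: int = 0x05) -> tuple[int, bytes]:
    """Heartbeat producer: COB-ID 0x700 + node-ID, one data byte (0x05 = operational)."""
    return 0x700 + node_id, bytes([nmt_state])

def sdo_expedited_download(node_id: int, index: int, subindex: int, value: int) -> tuple[int, bytes]:
    """Expedited SDO download (write) of a 32-bit object: COB-ID 0x600 + node-ID."""
    command = 0x23  # initiate download, expedited, 4 data bytes indicated
    payload = bytes([command, index & 0xFF, (index >> 8) & 0xFF, subindex])
    payload += value.to_bytes(4, "little")
    return 0x600 + node_id, payload

if __name__ == "__main__":
    # Hypothetical manufacturer-specific object at index 0x2000, sub-index 0
    cob_id, data = sdo_expedited_download(node_id=0x10, index=0x2000, subindex=0x00, value=1234)
    print(hex(cob_id), data.hex())
    print(*map(hex, heartbeat_frame(node_id=0x10)[0:1]), heartbeat_frame(node_id=0x10)[1].hex())
```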

  9. Dr. Tulga Ersal at NSF Workshop Accessible Remote Testbeds ART'15

    Science.gov (United States)

    On November 12th, Dr. Tulga Ersal took part in the NSF Workshop on Accessible Remote Testbeds (ART'15) at Georgia Tech. From the event website: the rationale behind the ART'15 workshop is that remote-access testbeds could, if done right, significantly change how

  10. A Novel UAV Electric Propulsion Testbed for Diagnostics and Prognostics

    Science.gov (United States)

    Gorospe, George E., Jr.; Kulkarni, Chetan S.

    2017-01-01

    This paper presents a novel hardware-in-the-loop (HIL) testbed for system-level diagnostics and prognostics of an electric propulsion system used in UAVs (unmanned aerial vehicles). Referencing the all-electric Edge 540T aircraft used in science and research by NASA Langley Flight Research Center, the HIL testbed includes an identical propulsion system, consisting of motors, speed controllers and batteries. Isolated under a controlled laboratory environment, the propulsion system has been instrumented for advanced diagnostics and prognostics. To produce flight-like loading on the system, a slave motor is coupled to the motor under test (MUT) and provides variable mechanical resistance and the capability of introducing nondestructive, mechanical wear-like frictional loads on the system. This testbed enables the verification of mathematical models of each component of the propulsion system, the repeatable generation of flight-like loads on the system for fault analysis, test-to-failure scenarios, and the development of advanced system-level diagnostics and prognostics methods. The capabilities of the testbed are extended through the integration of a LabVIEW-based client for the Live Virtual Constructive Distributed Environment (LVCDC) Gateway, which enables both the publishing of generated data for remotely located observers and prognosers and the synchronization of the testbed propulsion system with vehicles in the air. The developed HIL testbed gives researchers easy access to a scientifically relevant portion of the aircraft without the overhead and dangers encountered during actual flight.
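
    As a rough illustration of the "flight-like loading" idea, the sketch below streams a time-varying torque command to a slave (dynamometer) motor controller. The profile numbers and the set_load_torque callback are hypothetical placeholders; the abstract does not describe the actual testbed interface.

```python
# Hypothetical sketch of replaying a flight-like load profile against the
# motor under test via a slave motor; the callback is a placeholder, not the
# actual testbed API, and the torque values are illustrative only.
import math
import time

def flight_load_profile(t: float) -> float:
    """Shaft load (Nm) vs. time: taxi, climb, then cruise with gust-like ripple."""
    if t < 10.0:            # taxi
        return 0.2
    if t < 40.0:            # climb
        return 0.8 + 0.05 * math.sin(2 * math.pi * 0.5 * t)
    return 0.5 + 0.03 * math.sin(2 * math.pi * 1.5 * t)   # cruise + ripple

def run_profile(set_load_torque, duration_s: float = 60.0, dt: float = 0.05) -> None:
    """Stream torque commands to the slave motor controller at a fixed rate."""
    t0 = time.time()
    while (t := time.time() - t0) < duration_s:
        set_load_torque(flight_load_profile(t))   # placeholder callback
        time.sleep(dt)

if __name__ == "__main__":
    run_profile(lambda tau: print(f"load torque command: {tau:.2f} Nm"), duration_s=2.0)
```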

  11. Cooperative Search with Autonomous Vehicles in a 3D Aquatic Testbed

    Science.gov (United States)

    2012-01-01

    Cooperative Search with Autonomous Vehicles in a 3D Aquatic Testbed. Authors: Matthew Keeter, Daniel Moore, Ryan Muller, Eric Nieters, Jennifer... Many applications for autonomous vehicles involve three-dimensional domains, notably aerial and aquatic environments. Such applications include mon...

  12. A Reconfigurable Testbed Environment for Spacecraft Autonomy

    Science.gov (United States)

    Biesiadecki, Jeffrey; Jain, Abhinandan

    1996-01-01

    A key goal of NASA's New Millennium Program is the development of technology for increased spacecraft on-board autonomy. Achievement of this objective requires the development of a new class of ground-based autonomy testbeds that can enable the low-cost and rapid design, test, and integration of spacecraft autonomy software. This paper describes the development of an Autonomy Testbed Environment (ATBE) for the NMP Deep Space 1 comet/asteroid rendezvous mission.

  13. Context-aware local Intrusion Detection in SCADA systems : a testbed and two showcases

    NARCIS (Netherlands)

    Chromik, Justyna Joanna; Haverkort, Boudewijn R.H.M.; Remke, Anne Katharina Ingrid; Pilch, Carina; Brackmann, Pascal; Duhme, Christof; Everinghoff, Franziska; Giberlein, Artur; Teodorowicz, Thomas; Wieland, Julian

    2017-01-01

    This paper illustrates the use of a testbed that we have developed for context-aware local intrusion detection. This testbed is based on the co-simulation framework Mosaik and allows for the validation of local intrusion detection mechanisms at field stations in power distribution networks. For two

  14. Automated brightfield dual-color in situ hybridization for detection of mouse double minute 2 gene amplification in sarcomas.

    Science.gov (United States)

    Zhang, Wenjun; McElhinny, Abigail; Nielsen, Alma; Wang, Maria; Miller, Melanie; Singh, Shalini; Rueger, Ruediger; Rubin, Brian P; Wang, Zhen; Tubbs, Raymond R; Nagle, Raymond B; Roche, Pat; Wu, Ping; Pestic-Dragovich, Lidija

    2011-01-01

    The human homolog of the mouse double minute 2 (MDM2) oncogene is amplified in about 20% of sarcomas. The measurement of the MDM2 amplification can aid in classification and may provide a predictive value for recently formulated therapies targeting MDM2. We have developed and validated an automated bright field dual-color in situ hybridization application to detect MDM2 gene amplification. A repeat-depleted MDM2 probe was constructed to target the MDM2 gene region at 12q15. A chromosome 12-specific probe (CHR12) was generated from a pα12H8 plasmid. The in situ hybridization assay was developed by using a dinitrophenyl-labeled MDM2 probe and a digoxigenin-labeled CHR12 probe on the Ventana Medical Systems' automated slide-staining platforms. The specificity of the MDM2 and CHR12 probes was shown on metaphase spreads and further validated against controls, including normal human tonsil and known MDM2-amplified samples. The assay performance was evaluated on a cohort of 100 formalin-fixed, paraffin-embedded specimens by using a conventional bright field microscope. Simultaneous hybridization and signal detection for MDM2 and CHR12 showed that both DNA targets were present in the same cells. One hundred soft tissue specimens were stained for MDM2 and CHR12. Although 26 of 29 lipomas were nonamplified and eusomic, MDM2 amplification was noted in 78% of atypical lipomatous tumors or well-differentiated liposarcomas. Five of 6 dedifferentiated liposarcoma cases were amplified for MDM2. MDM2 amplification was observed in 1 of 8 osteosarcomas; 3 showed CHR12 aneusomy. MDM2 amplification was present in 1 of 4 chondrosarcomas. Nine of 10 synovial sarcomas displayed no evidence of MDM2 amplification in most tumor cells. In pleomorphic sarcoma, not otherwise specified (pleomorphic malignant fibrous histiocytoma), MDM2 was amplified in 38% of cases, whereas 92% were aneusomic for CHR12. One alveolar rhabdomyosarcoma and 2 embryonal rhabdomyosarcomas displayed low-level aneusomy

  15. Smart Antenna UKM Testbed for Digital Beamforming System

    Directory of Open Access Journals (Sweden)

    2009-03-01

    Full Text Available A new design of smart antenna testbed developed at UKM for digital beamforming purposes is proposed. The smart antenna UKM testbed is based on a modular design employing two novel components: an L-probe fed inverted hybrid E-H (LIEH) array antenna and a software-reconfigurable digital beamforming system (DBS). The antenna uses the novel LIEH microstrip patch element arranged into a 4×1 uniform linear array. The modular concept of the system provides the capability to test the antenna hardware, beamforming unit, and beamforming algorithm independently, allowing the parts of the smart antenna system to be developed and tested in parallel and hence reducing the design time. The DBS was developed using a high-performance TMS320C6711 floating-point DSP board and a 4-channel RF front-end receiver developed in-house; an interface board connects the ADC board to the RF front-end receiver. A four-element receiving array testbed operating at 1.88–2.22 GHz is constructed, and digital beamforming on this testbed is successfully demonstrated.
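
    A minimal sketch of the conventional (delay-and-sum) digital beamforming operation that a DBS of this kind performs on a four-element uniform linear array is shown below, assuming half-wavelength element spacing and complex baseband snapshots; it is illustrative only and not the UKM implementation.

```python
# Minimal sketch of narrowband delay-and-sum beamforming for a 4-element
# uniform linear array (half-wavelength spacing assumed); illustrative only.
import numpy as np

def steering_vector(n_elements: int, theta_deg: float, d_over_lambda: float = 0.5) -> np.ndarray:
    theta = np.deg2rad(theta_deg)
    k = np.arange(n_elements)
    return np.exp(-2j * np.pi * d_over_lambda * k * np.sin(theta))

def beamform(snapshots: np.ndarray, theta_deg: float) -> np.ndarray:
    """Apply conventional weights to an (n_elements, n_samples) baseband array."""
    w = steering_vector(snapshots.shape[0], theta_deg) / snapshots.shape[0]
    return w.conj() @ snapshots

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, samples = 4, 1000
    sig = np.exp(1j * 2 * np.pi * 0.01 * np.arange(samples))        # desired baseband signal
    x = np.outer(steering_vector(n, 20.0), sig)                     # arrives from 20 degrees
    x += 0.1 * (rng.standard_normal((n, samples)) + 1j * rng.standard_normal((n, samples)))
    y = beamform(x, theta_deg=20.0)
    print("output SNR ~", round(10 * np.log10(np.var(y) / np.var(y - sig)), 1), "dB")
```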

  16. Wavefront control performance modeling with WFIRST shaped pupil coronagraph testbed

    Science.gov (United States)

    Zhou, Hanying; Nemati, Bijian; Krist, John; Cady, Eric; Kern, Brian; Poberezhskiy, Ilya

    2017-09-01

    NASA's WFIRST mission includes a coronagraph instrument (CGI) for direct imaging of exoplanets. Significant improvement in CGI model fidelity has been made recently, alongside a testbed high-contrast demonstration in a simulated dynamic environment at JPL. We present our modeling method and the results of comparisons against the testbed's high-order wavefront correction performance for the shaped pupil coronagraph. Agreement between model prediction and testbed results to better than a factor of 2 has been consistently achieved in raw contrast (contrast floor, chromaticity, and convergence), and with that comes good agreement in contrast sensitivity to wavefront perturbations and mask lateral shear.
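
    The wavefront correction referred to here is driven by a control Jacobian that maps deformable-mirror actuator commands to changes in the focal-plane electric field. Below is a hedged sketch of a single regularized least-squares (electric-field-conjugation-style) correction step built from such a Jacobian; the array shapes and regularization value are illustrative assumptions, not the JPL implementation.

```python
# Conceptual sketch of one EFC-style control step: given a control Jacobian J
# (focal-plane E-field response per actuator) and an estimated E-field, solve
# a Tikhonov-regularized least-squares problem for the DM command update.
import numpy as np

def control_step(J: np.ndarray, e_field: np.ndarray, reg: float = 1e-3) -> np.ndarray:
    """J: (n_pixels, n_actuators) complex; e_field: (n_pixels,) complex estimate."""
    # Stack real and imaginary parts so the solve stays real-valued.
    A = np.vstack([J.real, J.imag])
    b = np.concatenate([e_field.real, e_field.imag])
    # Minimize ||A u + b||^2 + reg * ||u||^2 over the actuator update u.
    return np.linalg.solve(A.T @ A + reg * np.eye(A.shape[1]), -A.T @ b)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    J = rng.standard_normal((200, 48)) + 1j * rng.standard_normal((200, 48))
    e = J @ (0.01 * rng.standard_normal(48))          # synthetic aberrated field
    u = control_step(J, e)
    print("residual field ratio:", np.linalg.norm(e + J @ u) / np.linalg.norm(e))
```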

  17. SSERVI Analog Regolith Simulant Testbed Facility

    Science.gov (United States)

    Minafra, Joseph; Schmidt, Gregory; Bailey, Brad; Gibbs, Kristina

    2016-10-01

    The Solar System Exploration Research Virtual Institute (SSERVI) at NASA's Ames Research Center in California's Silicon Valley was founded in 2013 to act as a virtual institute that provides interdisciplinary research centered on the goals of its supporting directorates: the NASA Science Mission Directorate (SMD) and the Human Exploration & Operations Mission Directorate (HEOMD). Primary research goals of the Institute revolve around the integration of science and exploration to gain the knowledge required for the future of human space exploration beyond low Earth orbit. SSERVI intends to leverage existing JSC-1A regolith simulant resources into the creation of a regolith simulant testbed facility. The purpose of this testbed concept is to provide the planetary exploration community with a readily available capability to test hardware and conduct research in a large simulant environment. SSERVI's goals include supporting planetary researchers within NASA and other government agencies; private-sector and hardware developers; competitors in focused prize design competitions; and academic-sector researchers. SSERVI provides opportunities for research scientists and engineers to study the effects of regolith analog testbed research in the planetary exploration field. This capability is essential to help understand the basic effects of continued long-term exposure to a simulated analog test environment. The current facility houses approximately eight tons of JSC-1A lunar regolith simulant in a test bin covering a 4 meter by 4 meter area, including dust mitigation and safety oversight. Facility hardware and environment testing scenarios could include lunar surface mobility, dust exposure and mitigation, regolith handling and excavation, solar-like illumination, lunar surface compaction profile, lofted dust, mechanical properties of lunar regolith, and surface features (i.e., grades and rocks). Numerous benefits vary from easy access to a controlled analog regolith simulant testbed, and

  18. Use of Tabu Search in a Solver to Map Complex Networks onto Emulab Testbeds

    National Research Council Canada - National Science Library

    MacDonald, Jason E

    2007-01-01

    The University of Utah's solver for the testbed mapping problem uses a simulated annealing metaheuristic algorithm to map a researcher's experimental network topology onto available testbed resources...
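
    For context, the sketch below shows the simulated-annealing idea in miniature: virtual nodes are assigned to physical hosts, and random reassignments are accepted or rejected against a cooling temperature while a cost function penalizes over-subscribed hosts and inter-host virtual links. The cost terms and weights are toy assumptions, not those of the Utah solver.

```python
# Toy simulated-annealing sketch for a testbed-mapping-style problem (not the
# Utah "assign" solver): place virtual nodes on hosts, minimizing a cost that
# penalizes over-subscribed hosts and virtual links split across hosts.
import math
import random

def cost(mapping, links, capacity):
    over = sum(max(0, list(mapping.values()).count(h) - capacity[h]) for h in capacity)
    cross = sum(1 for a, b in links if mapping[a] != mapping[b])
    return 10 * over + cross

def anneal(vnodes, hosts, links, capacity, steps=20000, t0=5.0):
    mapping = {v: random.choice(hosts) for v in vnodes}
    cur = cost(mapping, links, capacity)
    best, best_cost = dict(mapping), cur
    for i in range(steps):
        t = t0 * (1 - i / steps) + 1e-6          # linear cooling schedule
        v = random.choice(vnodes)
        old = mapping[v]
        mapping[v] = random.choice(hosts)
        new = cost(mapping, links, capacity)
        if new <= cur or random.random() < math.exp((cur - new) / t):
            cur = new
            if new < best_cost:
                best, best_cost = dict(mapping), new
        else:
            mapping[v] = old                      # reject the move
    return best, best_cost

if __name__ == "__main__":
    vnodes = [f"v{i}" for i in range(8)]
    hosts = ["pc1", "pc2", "pc3"]
    links = [("v0", "v1"), ("v1", "v2"), ("v3", "v4"), ("v5", "v6"), ("v6", "v7")]
    capacity = {"pc1": 3, "pc2": 3, "pc3": 3}
    print(anneal(vnodes, hosts, links, capacity))
```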

  19. Data dissemination in the wild: A testbed for high-mobility MANETs

    DEFF Research Database (Denmark)

    Vingelmann, Peter; Pedersen, Morten Videbæk; Heide, Janus

    2012-01-01

    This paper investigates the problem of efficient data dissemination in Mobile Ad hoc NETworks (MANETs) with high mobility. A testbed is presented which provides a high degree of mobility in experiments. The testbed consists of 10 autonomous robots with mobile phones mounted on them. The mobile ... information, and the goal is to convey that information to all devices. A strategy is proposed that uses UDP broadcast transmissions and random linear network coding to facilitate the efficient exchange of information in the network. An application is introduced that implements this strategy on Nokia phones...
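
    The random linear network coding strategy mentioned here can be illustrated with a small GF(2) sketch: each transmitted packet is a random XOR combination of the source packets together with its coefficient vector, and a receiver can decode once the collected coefficient vectors reach full rank. This is a generic illustration under those assumptions, not the Nokia-phone implementation.

```python
# Sketch of random linear network coding over GF(2): encoding draws a random
# XOR combination of source packets; decodability is checked via GF(2) rank.
import numpy as np

def encode(packets: np.ndarray, rng) -> tuple[np.ndarray, np.ndarray]:
    """packets: (k, n_bytes) uint8. Returns (coefficient vector, coded payload)."""
    k = packets.shape[0]
    coeffs = rng.integers(0, 2, size=k, dtype=np.uint8)
    if not coeffs.any():
        coeffs[rng.integers(k)] = 1                      # avoid the all-zero combination
    coded = np.bitwise_xor.reduce(packets[coeffs == 1], axis=0)
    return coeffs, coded

def gf2_rank(rows: np.ndarray) -> int:
    """Gaussian elimination over GF(2) on a 0/1 matrix of coefficient vectors."""
    m = rows.copy() % 2
    rank = 0
    for col in range(m.shape[1]):
        pivot = next((r for r in range(rank, m.shape[0]) if m[r, col]), None)
        if pivot is None:
            continue
        m[[rank, pivot]] = m[[pivot, rank]]
        for r in range(m.shape[0]):
            if r != rank and m[r, col]:
                m[r] ^= m[rank]
        rank += 1
    return rank

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    source = rng.integers(0, 256, size=(4, 16), dtype=np.uint8)     # k = 4 source packets
    received = [encode(source, rng)[0] for _ in range(6)]           # coefficient vectors heard
    print("decodable:", gf2_rank(np.array(received)) == source.shape[0])
```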

  20. Towards standard testbeds for numerical relativity

    International Nuclear Information System (INIS)

    Alcubierre, Miguel; Allen, Gabrielle; Bona, Carles; Fiske, David; Goodale, Tom; Guzman, F Siddhartha; Hawke, Ian; Hawley, Scott H; Husa, Sascha; Koppitz, Michael; Lechner, Christiane; Pollney, Denis; Rideout, David; Salgado, Marcelo; Schnetter, Erik; Seidel, Edward; Shinkai, Hisa-aki; Shoemaker, Deirdre; Szilagyi, Bela; Takahashi, Ryoji; Winicour, Jeff

    2004-01-01

    In recent years, many different numerical evolution schemes for Einstein's equations have been proposed to address stability and accuracy problems that have plagued the numerical relativity community for decades. Some of these approaches have been tested on different spacetimes, and conclusions have been drawn based on these tests. However, differences in results originate from many sources, including not only formulations of the equations, but also gauges, boundary conditions, numerical methods and so on. We propose to build up a suite of standardized testbeds for comparing approaches to the numerical evolution of Einstein's equations that are designed to both probe their strengths and weaknesses and to separate out different effects, and their causes, seen in the results. We discuss general design principles of suitable testbeds, and we present an initial round of simple tests with periodic boundary conditions. This is a pivotal first step towards building a suite of testbeds to serve the numerical relativists and researchers from related fields who wish to assess the capabilities of numerical relativity codes. We present some examples of how these tests can be quite effective in revealing various limitations of different approaches, and illustrating their differences. The tests are presently limited to vacuum spacetimes, can be run on modest computational resources and can be used with many different approaches used in the relativity community

  1. Towards standard testbeds for numerical relativity

    Energy Technology Data Exchange (ETDEWEB)

    Alcubierre, Miguel [Inst. de Ciencias Nucleares, Univ. Nacional Autonoma de Mexico, Apartado Postal 70-543, Mexico Distrito Federal 04510 (Mexico); Allen, Gabrielle; Goodale, Tom; Guzman, F Siddhartha; Hawke, Ian; Husa, Sascha; Koppitz, Michael; Lechner, Christiane; Pollney, Denis; Rideout, David [Max-Planck-Inst. fuer Gravitationsphysik, Albert-Einstein-Institut, 14476 Golm (Germany); Bona, Carles [Departament de Fisica, Universitat de les Illes Balears, Ctra de Valldemossa km 7.5, 07122 Palma de Mallorca (Spain); Fiske, David [Dept. of Physics, Univ. of Maryland, College Park, MD 20742-4111 (United States); Hawley, Scott H [Center for Relativity, Univ. of Texas at Austin, Austin, Texas 78712 (United States); Salgado, Marcelo [Inst. de Ciencias Nucleares, Univ. Nacional Autonoma de Mexico, Apartado Postal 70-543, Mexico Distrito Federal 04510 (Mexico); Schnetter, Erik [Inst. fuer Astronomie und Astrophysik, Universitaet Tuebingen, 72076 Tuebingen (Germany); Seidel, Edward [Max-Planck-Inst. fuer Gravitationsphysik, Albert-Einstein-Inst., 14476 Golm (Germany); Shinkai, Hisa-aki [Computational Science Div., Inst. of Physical and Chemical Research (RIKEN), Hirosawa 2-1, Wako, Saitama 351-0198 (Japan); Shoemaker, Deirdre [Center for Radiophysics and Space Research, Cornell Univ., Ithaca, NY 14853 (United States); Szilagyi, Bela [Dept. of Physics and Astronomy, Univ. of Pittsburgh, Pittsburgh, PA 15260 (United States); Takahashi, Ryoji [Theoretical Astrophysics Center, Juliane Maries Vej 30, 2100 Copenhagen, (Denmark); Winicour, Jeff [Max-Planck-Inst. fuer Gravitationsphysik, Albert-Einstein-Institut, 14476 Golm (Germany)

    2004-01-21

    In recent years, many different numerical evolution schemes for Einstein's equations have been proposed to address stability and accuracy problems that have plagued the numerical relativity community for decades. Some of these approaches have been tested on different spacetimes, and conclusions have been drawn based on these tests. However, differences in results originate from many sources, including not only formulations of the equations, but also gauges, boundary conditions, numerical methods and so on. We propose to build up a suite of standardized testbeds for comparing approaches to the numerical evolution of Einstein's equations that are designed to both probe their strengths and weaknesses and to separate out different effects, and their causes, seen in the results. We discuss general design principles of suitable testbeds, and we present an initial round of simple tests with periodic boundary conditions. This is a pivotal first step towards building a suite of testbeds to serve the numerical relativists and researchers from related fields who wish to assess the capabilities of numerical relativity codes. We present some examples of how these tests can be quite effective in revealing various limitations of different approaches, and illustrating their differences. The tests are presently limited to vacuum spacetimes, can be run on modest computational resources and can be used with many different approaches used in the relativity community.

  2. Development of a Tethered Formation Flight Testbed for ISS, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The development of a testbed for the development and demonstration of technologies needed by tethered formation flying satellites is proposed. Such a testbed would...

  3. AMS San Diego Testbed - Calibration Data

    Data.gov (United States)

    Department of Transportation — The data in this repository were collected from the San Diego, California testbed, namely, I-15 from the interchange with SR-78 in the north to the interchange with...

  4. Mounted Smartphones as Measurement and Control Platforms for Motor-Based Laboratory Test-Beds

    Directory of Open Access Journals (Sweden)

    Jared A. Frank

    2016-08-01

    Full Text Available Laboratory education in science and engineering often entails the use of test-beds equipped with costly peripherals for sensing, acquisition, storage, processing, and control of physical behavior. However, costly peripherals are no longer necessary to obtain precise measurements and achieve stable feedback control of test-beds. With smartphones performing diverse sensing and processing tasks, this study examines the feasibility of mounting smartphones directly to test-beds to exploit their embedded hardware and software in the measurement and control of the test-beds. This approach is a first step towards replacing laboratory-grade peripherals with more compact and affordable smartphone-based platforms, whose interactive user interfaces can engender wider participation and engagement from learners. Demonstrative cases are presented in which the sensing, computation, control, and user interaction with three motor-based test-beds are handled by a mounted smartphone. Results of experiments and simulations are used to validate the feasibility of mounted smartphones as measurement and feedback control platforms for motor-based laboratory test-beds, report the measurement precision and closed-loop performance achieved with such platforms, and address challenges in the development of platforms to maintain system stability.
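
    As an illustration of the closed-loop role a mounted smartphone can play, the sketch below runs a discrete PID controller around placeholder read_angle and write_pwm callbacks; the gains, rates, and the toy plant in the demo are assumptions, not the authors' platform.

```python
# Illustrative discrete PID loop of the kind a mounted smartphone could run
# to close the loop around a motor test-bed; sensor/actuator calls are stubs.
import time

class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def update(self, setpoint: float, measurement: float) -> float:
        err = setpoint - measurement
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

def control_loop(read_angle, write_pwm, setpoint_deg=90.0, dt=0.02, steps=200):
    pid = PID(kp=0.8, ki=0.1, kd=0.05, dt=dt)
    for _ in range(steps):
        u = pid.update(setpoint_deg, read_angle())    # e.g., angle from the phone camera or IMU
        write_pwm(max(-1.0, min(1.0, u / 90.0)))      # saturate the duty-cycle command
        time.sleep(dt)

if __name__ == "__main__":
    angle = [0.0]
    def read_angle(): return angle[0]
    def write_pwm(u): angle[0] += 180.0 * u * 0.02    # crude stand-in plant for the demo
    control_loop(read_angle, write_pwm)
    print("final angle:", round(angle[0], 1), "deg")
```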

  5. Mounted Smartphones as Measurement and Control Platforms for Motor-Based Laboratory Test-Beds.

    Science.gov (United States)

    Frank, Jared A; Brill, Anthony; Kapila, Vikram

    2016-08-20

    Laboratory education in science and engineering often entails the use of test-beds equipped with costly peripherals for sensing, acquisition, storage, processing, and control of physical behavior. However, costly peripherals are no longer necessary to obtain precise measurements and achieve stable feedback control of test-beds. With smartphones performing diverse sensing and processing tasks, this study examines the feasibility of mounting smartphones directly to test-beds to exploit their embedded hardware and software in the measurement and control of the test-beds. This approach is a first step towards replacing laboratory-grade peripherals with more compact and affordable smartphone-based platforms, whose interactive user interfaces can engender wider participation and engagement from learners. Demonstrative cases are presented in which the sensing, computation, control, and user interaction with three motor-based test-beds are handled by a mounted smartphone. Results of experiments and simulations are used to validate the feasibility of mounted smartphones as measurement and feedback control platforms for motor-based laboratory test-beds, report the measurement precision and closed-loop performance achieved with such platforms, and address challenges in the development of platforms to maintain system stability.

  6. NASA Robotic Neurosurgery Testbed

    Science.gov (United States)

    Mah, Robert

    1997-01-01

    The detection of tissue interfaces (e.g., normal tissue, cancer, tumor) has been limited clinically to tactile feedback, temperature monitoring, and the use of a miniature ultrasound probe for tissue differentiation during surgical operations. In neurosurgery, the needle used in the standard stereotactic CT or MRI guided brain biopsy provides no information about the tissue being sampled. The tissue sampled depends entirely upon the accuracy with which the localization provided by the preoperative CT or MRI scan is translated to the intracranial biopsy site. In addition, no information about the tissue being traversed by the needle (e.g., a blood vessel) is provided. Hemorrhage due to the biopsy needle tearing a blood vessel within the brain is the most devastating complication of stereotactic CT/MRI guided brain biopsy. A robotic neurosurgery testbed has been developed at NASA Ames Research Center as a spin-off of technologies from space, aeronautics and medical programs. The invention entitled "Robotic Neurosurgery Leading to Multimodality Devices for Tissue Identification" is nearing a state ready for commercialization. The devices will: 1) improve diagnostic accuracy and precision of general surgery, with near-term emphasis on stereotactic brain biopsy, 2) automate tissue identification, with near-term emphasis on stereotactic brain biopsy, to permit remote control of the procedure, and 3) reduce morbidity for stereotactic brain biopsy. The commercial impact from this work is the potential development of a whole new generation of smart surgical tools to increase the safety, accuracy and efficiency of surgical procedures. Other potential markets include smart surgical tools for tumor ablation in neurosurgery, general exploratory surgery, prostate cancer surgery, and breast cancer surgery.

  7. Development of Liquid Propulsion Systems Testbed at MSFC

    Science.gov (United States)

    Alexander, Reginald; Nelson, Graham

    2016-01-01

    As NASA, the Department of Defense and the aerospace industry in general strive to develop capabilities to explore near-Earth, Cis-lunar and deep space, the need to create more cost effective techniques of propulsion system design, manufacturing and test is imperative in the current budget constrained environment. The physics of space exploration have not changed, but the manner in which systems are developed and certified needs to change if there is going to be any hope of designing and building the high performance liquid propulsion systems necessary to deliver crew and cargo to the further reaches of space. To further the objective of developing these systems, the Marshall Space Flight Center is currently in the process of formulating a Liquid Propulsion Systems testbed, which will enable rapid integration of components to be tested and assessed for performance in integrated systems. The manifestation of this testbed is a breadboard engine configuration (BBE) with facility support for consumables and/or other components as needed. The goal of the facility is to test NASA developed elements, but can be used to test articles developed by other government agencies, industry or academia. Joint government/private partnership is likely the approach that will be required to enable efficient propulsion system development. MSFC has recently tested its own additively manufactured liquid hydrogen pump, injector, and valves in a BBE hot firing. It is rapidly building toward testing the pump and a new CH4 injector in the BBE configuration to demonstrate a 22,000 lbf, pump-fed LO2/LCH4 engine for the Mars lander or in-space transportation. The value of having this BBE testbed is that as components are developed they may be easily integrated in the testbed and tested. MSFC is striving to enhance its liquid propulsion system development capability. Rapid design, analysis, build and test will be critical to fielding the next high thrust rocket engine. With the maturity of the

  8. A Testbed For Validating the LHC Controls System Core Before Deployment

    CERN Document Server

    Nguyen Xuan, J

    2011-01-01

    Since the start-up of the LHC, it is crucial to carefully test core controls components before deploying them operationally. The Testbed of the CERN accelerator controls group was developed for this purpose. It contains different hardware (PPC, i386) running various operating systems (Linux and LynxOS) and core software components running on front-ends, communication middleware and client libraries. The Testbed first executes integration tests to verify that the components delivered by individual teams interoperate, and then system tests, which verify high-level, end-user functionality. It also verifies that different versions of components are compatible, which is vital, because not all parts of the operational LHC control system can be upgraded simultaneously. In addition, the Testbed can be used for performance and stress tests. Internally, the Testbed is driven by Atlassian Bamboo, a Continuous Integration server, which builds and deploys automatically new software versions into the Test...

  9. A methodology for automation and robotics evaluation applied to the space station telerobotic servicer

    Science.gov (United States)

    Smith, Jeffrey H.; Gyanfi, Max; Volkmer, Kent; Zimmerman, Wayne

    1988-01-01

    The efforts of a recent study aimed at identifying key issues and trade-offs associated with using a Flight Telerobotic Servicer (FTS) to aid in Space Station assembly-phase tasks are described. The use of automation and robotic (A and R) technologies for large space systems would involve a substitution of automation capabilities for human extravehicular or intravehicular activities (EVA, IVA). A methodology is presented that incorporates assessment of candidate assembly-phase tasks, telerobotic performance capabilities, development costs, and the effect of operational constraints (space transportation system (STS), attached payload, and proximity operations). Changes in the region of cost-effectiveness are examined under a variety of system design assumptions. A discussion of issues is presented with focus on three roles the FTS might serve: (1) as a research-oriented testbed to learn more about space usage of telerobotics; (2) as a research-based testbed having an experimental demonstration orientation with limited assembly and servicing applications; or (3) as an operational system to augment EVA, to aid the construction of the Space Station, and to reduce the programmatic (schedule) risk by increasing the flexibility of mission operations.

  10. Wireless Sensor Networks TestBed: ASNTbed

    CSIR Research Space (South Africa)

    Dludla, AG

    2013-05-01

    Full Text Available Wireless sensor networks (WSNs) have been used in different types of applications and deployed within various environments. Simulation tools are essential for studying WSNs, especially for exploring large-scale networks. However, WSN testbeds...

  11. Mounted Smartphones as Measurement and Control Platforms for Motor-Based Laboratory Test-Beds

    OpenAIRE

    Jared A. Frank; Anthony Brill; Vikram Kapila

    2016-01-01

    Laboratory education in science and engineering often entails the use of test-beds equipped with costly peripherals for sensing, acquisition, storage, processing, and control of physical behavior. However, costly peripherals are no longer necessary to obtain precise measurements and achieve stable feedback control of test-beds. With smartphones performing diverse sensing and processing tasks, this study examines the feasibility of mounting smartphones directly to test-beds to exploit their em...

  12. Growth plan for an inspirational test-bed of smart textile services

    NARCIS (Netherlands)

    Wensveen, S.A.G.; Tomico, O.; Bhomer, ten M.; Kuusk, K.

    2015-01-01

    In this pictorial we visualize the growth plan for an inspirational test-bed of smart textile product service systems. The goal of the test-bed is to inspire and inform the Dutch creative industries of textile, interaction and service design to combine their strengths and share opportunities. The

  13. Mounted Smartphones as Measurement and Control Platforms for Motor-Based Laboratory Test-Beds

    Science.gov (United States)

    Frank, Jared A.; Brill, Anthony; Kapila, Vikram

    2016-01-01

    Laboratory education in science and engineering often entails the use of test-beds equipped with costly peripherals for sensing, acquisition, storage, processing, and control of physical behavior. However, costly peripherals are no longer necessary to obtain precise measurements and achieve stable feedback control of test-beds. With smartphones performing diverse sensing and processing tasks, this study examines the feasibility of mounting smartphones directly to test-beds to exploit their embedded hardware and software in the measurement and control of the test-beds. This approach is a first step towards replacing laboratory-grade peripherals with more compact and affordable smartphone-based platforms, whose interactive user interfaces can engender wider participation and engagement from learners. Demonstrative cases are presented in which the sensing, computation, control, and user interaction with three motor-based test-beds are handled by a mounted smartphone. Results of experiments and simulations are used to validate the feasibility of mounted smartphones as measurement and feedback control platforms for motor-based laboratory test-beds, report the measurement precision and closed-loop performance achieved with such platforms, and address challenges in the development of platforms to maintain system stability. PMID:27556464

  14. Design of aircraft cabin testbed for stress free air travel experiment

    NARCIS (Netherlands)

    Tan, C.F.; Chen, W.; Rauterberg, G.W.M.

    2009-01-01

    The paper presents an aircraft cabin testbed that is designed and built for the stress free air travel experiment. The project is funded by the European Union with the aim of improving air travel comfort during long-haul flights. The testbed is used to test and validate the adaptive system that is capable

  15. Data Distribution Service-Based Interoperability Framework for Smart Grid Testbed Infrastructure

    Directory of Open Access Journals (Sweden)

    Tarek A. Youssef

    2016-03-01

    Full Text Available This paper presents the design and implementation of a communication and control infrastructure for smart grid operation. The proposed infrastructure enhances the reliability of the measurements and control network. The advantages of utilizing the data-centric over the message-centric communication approach are discussed in the context of smart grid applications. The data distribution service (DDS) is used to implement a data-centric common data bus for the smart grid. This common data bus improves communication reliability, enabling distributed control and smart load management. These enhancements are achieved by avoiding a single point of failure while enabling peer-to-peer communication and an automatic discovery feature for dynamically participating nodes. The infrastructure and ideas presented in this paper were implemented and tested on the smart grid testbed. A toolbox and application programming interface (API) for the testbed infrastructure were developed in order to facilitate interoperability and remote access to the testbed. This interface allows control, monitoring, and execution of experiments remotely. Furthermore, it could be used to integrate multidisciplinary testbeds to study complex cyber-physical systems (CPS).
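
    The data-centric idea behind the DDS bus can be illustrated with a plain-Python stand-in: participants publish keyed samples to named topics and any number of subscribers receive them without point-to-point coupling. The sketch below is a conceptual illustration only; it does not use or mimic the actual DDS API or the testbed's topic definitions.

```python
# Conceptual stand-in for a data-centric common data bus: topic-keyed samples
# delivered to all subscribers, with a last-value cache. Not the DDS API.
from collections import defaultdict
from dataclasses import dataclass
from typing import Callable

@dataclass
class Sample:
    topic: str
    key: str        # e.g., the measurement point ("feeder3/pmu1")
    value: dict

class DataBus:
    def __init__(self):
        self._subs = defaultdict(list)
        self._last = {}                       # last-value cache per (topic, key)

    def subscribe(self, topic: str, callback: Callable[[Sample], None]) -> None:
        self._subs[topic].append(callback)

    def publish(self, sample: Sample) -> None:
        self._last[(sample.topic, sample.key)] = sample
        for cb in self._subs[sample.topic]:
            cb(sample)

if __name__ == "__main__":
    bus = DataBus()
    bus.subscribe("grid/measurements", lambda s: print("controller saw", s.key, s.value))
    bus.subscribe("grid/measurements", lambda s: print("logger saw", s.key, s.value))
    bus.publish(Sample("grid/measurements", "feeder3/pmu1", {"v_mag": 0.98, "v_ang": -12.1}))
```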

  16. High Throughput In Situ XAFS Screening of Catalysts

    International Nuclear Information System (INIS)

    Tsapatsaris, Nikolaos; Beesley, Angela M.; Weiher, Norbert; Tatton, Helen; Schroeder, Sven L. M.; Dent, Andy J.; Mosselmans, Frederick J. W.; Tromp, Moniek; Russu, Sergio; Evans, John; Harvey, Ian; Hayama, Shu

    2007-01-01

    We outline and demonstrate the feasibility of high-throughput (HT) in situ XAFS for synchrotron radiation studies. An XAS data acquisition and control system for the analysis of dynamic materials libraries under control of temperature and gaseous environments has been developed. The system is compatible with the 96-well industry standard and coupled to multi-stream quadrupole mass spectrometry (QMS) analysis of reactor effluents. An automated analytical workflow generates data quickly compared to traditional individual spectrum acquisition and analyses it in quasi-real time using an HT data analysis tool based on IFEFFIT. The system was used for the automated characterization of a library of 91 catalyst precursors containing ternary combinations of Cu, Pt, and Au on γ-Al2O3, and for the in situ characterization of Au catalysts supported on Al2O3 and TiO2
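
    A typical building block of such an automated workflow is per-spectrum edge-step normalization applied well by well; a hedged sketch is given below. The file naming, column layout, and fitting ranges are assumptions for illustration and do not reflect the authors' IFEFFIT-based tool.

```python
# Hedged sketch of a batch normalization step for a 96-well XAFS library:
# fit and subtract a pre-edge line, then divide by the edge step at E0.
import glob
import numpy as np

def normalize_xafs(energy: np.ndarray, mu: np.ndarray, e0: float) -> np.ndarray:
    """Edge-step normalization with linear pre-edge and post-edge fits."""
    pre = energy < e0 - 30.0
    post = energy > e0 + 50.0
    pre_line = np.polyval(np.polyfit(energy[pre], mu[pre], 1), energy)
    step = np.polyval(np.polyfit(energy[post], (mu - pre_line)[post], 1), e0)
    return (mu - pre_line) / step

if __name__ == "__main__":
    for path in sorted(glob.glob("well_*.dat")):          # hypothetical two-column files
        energy, mu = np.loadtxt(path, unpack=True)
        norm = normalize_xafs(energy, mu, e0=8979.0)      # e.g., Cu K-edge energy in eV
        np.savetxt(path.replace(".dat", "_norm.dat"), np.column_stack([energy, norm]))
```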

  17. Holodeck Testbed Project

    Science.gov (United States)

    Arias, Adriel (Inventor)

    2016-01-01

    The main objective of the Holodeck Testbed is to create a cost-effective, realistic, and highly immersive environment that can be used to train astronauts, carry out engineering analysis, develop procedures, and support various operations tasks. Currently, the Holodeck testbed allows users to step into a simulated ISS (International Space Station) and interact with objects, as well as perform Extra Vehicular Activities (EVA) on the surface of the Moon or Mars. The Holodeck Testbed is using the products being developed in the Hybrid Reality Lab (HRL). The HRL is combining technologies related to merging physical models with photo-realistic visuals to create a realistic and highly immersive environment. The lab also investigates technologies and concepts that are needed to allow it to be integrated with other testbeds, such as the gravity offload capability provided by the Active Response Gravity Offload System (ARGOS). My two main duties were to develop and animate models for use in the HRL environments and to work on a new way to interface with computers using Brain Computer Interface (BCI) technology. On my first task, I was able to create precise virtual tool models (accurate down to the thousandths or hundredths of an inch). To make these tools even more realistic, I produced animations for them so they would have the same mechanical features as the tools in real life. The computer models were also used to create 3D printed replicas that will be outfitted with tracking sensors. The sensors will allow the 3D printed models to align precisely with the computer models in the physical world and provide people with haptic/tactile feedback while wearing a VR (Virtual Reality) headset and interacting with the tools. Toward the end of my internship, the lab bought a professional-grade 3D scanner. With this, I was able to replicate more intricate tools at a much more time-effective rate. The second task was to investigate the use of BCI to control

  18. Expert systems and advanced automation for space missions operations

    Science.gov (United States)

    Durrani, Sajjad H.; Perkins, Dorothy C.; Carlton, P. Douglas

    1990-01-01

    Increased complexity of space missions during the 1980s led to the introduction of expert systems and advanced automation techniques in mission operations. This paper describes several technologies in operational use or under development at the National Aeronautics and Space Administration's Goddard Space Flight Center. Several expert systems are described that diagnose faults, analyze spacecraft operations and onboard subsystem performance (in conjunction with neural networks), and perform data quality and data accounting functions. The design of customized user interfaces is discussed, with examples of their application to space missions. Displays, which allow mission operators to see the spacecraft position, orientation, and configuration under a variety of operating conditions, are described. Automated systems for scheduling are discussed, and a testbed that allows tests and demonstrations of the associated architectures, interface protocols, and operations concepts is described. Lessons learned are summarized.

  19. LOS Throughput Measurements in Real-Time with a 128-Antenna Massive MIMO Testbed

    OpenAIRE

    Harris, Paul; Zhang, Siming; Beach, Mark; Mellios, Evangelos; Nix, Andrew; Armour, Simon; Doufexi, Angela; Nieman, Karl; Kundargi, Nikhil

    2017-01-01

    This paper presents initial results for a novel 128-antenna massive Multiple-Input, Multiple-Output (MIMO) testbed developed through Bristol Is Open in collaboration with National Instruments and Lund University. We believe that the results presented here validate the adoption of massive MIMO as a key enabling technology for 5G and pave the way for further pragmatic research by the massive MIMO community. The testbed operates in real-time with a Long-Term Evolution (LTE)-like PHY in Time Div...

  20. Comparison of two matrix data structures for advanced CSM testbed applications

    Science.gov (United States)

    Regelbrugge, M. E.; Brogan, F. A.; Nour-Omid, B.; Rankin, C. C.; Wright, M. A.

    1989-01-01

    The first section describes data storage schemes presently used by the Computational Structural Mechanics (CSM) testbed sparse matrix facilities and similar skyline (profile) matrix facilities. The second section contains a discussion of certain features required for the implementation of particular advanced CSM algorithms, and how these features might be incorporated into the data storage schemes described previously. The third section presents recommendations, based on the discussions of the prior sections, for directing future CSM testbed development to provide necessary matrix facilities for advanced algorithm implementation and use. The objective is to lend insight into the matrix structures discussed and to help explain the process of evaluating alternative matrix data structures and utilities for subsequent use in the CSM testbed.

  1. Application of a Novel and Automated Branched DNA in Situ Hybridization Method for the Rapid and Sensitive Localization of mRNA Molecules in Plant Tissues

    Directory of Open Access Journals (Sweden)

    Andrew J. Bowling

    2014-04-01

    Full Text Available Premise of the study: A novel branched DNA detection technology, RNAscope in situ hybridization (ISH, originally developed for use on human clinical and animal tissues, was adapted for use in plant tissue in an attempt to overcome some of the limitations associated with traditional ISH assays. Methods and Results: Zea mays leaf tissue was formaldehyde fixed and paraffin embedded (FFPE and then probed with the RNAscope ISH assay for two endogenous genes, phosphoenolpyruvate carboxylase (PEPC and phosphoenolpyruvate carboxykinase (PEPCK. Results from both manual and automated methods showed tissue- and cell-specific mRNA localization patterns expected from these well-studied genes. Conclusions: RNAscope ISH is a sensitive method that generates high-quality, easily interpretable results from FFPE plant tissues. Automation of the RNAscope method on the Ventana Discovery Ultra platform allows significant advantages for repeatability, reduction in variability, and flexibility of workflow processes.

  2. A technique for recording polycrystalline structure and orientation during in situ deformation cycles of rock analogues using an automated fabric analyser.

    Science.gov (United States)

    Peternell, M; Russell-Head, D S; Wilson, C J L

    2011-05-01

    Two in situ plane-strain deformation experiments on norcamphor and natural ice using synchronous recording of crystal c-axis orientations have been performed with an automated fabric analyser and a newly developed sample press and deformation stage. Without interrupting the deformation experiment, c-axis orientations are determined for each pixel in a 5 × 5 mm sample area at a spatial resolution of 5 μm/pixel. In the case of norcamphor, changes in microstructures and associated crystallographic information, at a strain rate of ∼2 × 10⁻⁵ s⁻¹, were recorded for the first time during a complete in situ deformation-cycle experiment that consisted of an annealing, deformation and post-deformation annealing path. In the case of natural ice, slower external strain rates (∼1 × 10⁻⁶ s⁻¹) enabled the investigation of small changes in the polycrystal aggregate's crystallography and microstructure for small amounts of strain. The technical setup and first results from the experiments are presented. © 2010 The Authors Journal of Microscopy © 2010 Royal Microscopical Society.

  3. High-contrast imager for Complex Aperture Telescopes (HiCAT): testbed design and coronagraph developments

    Science.gov (United States)

    N'Diaye, Mamadou; Choquet, E.; Pueyo, L.; Elliot, E.; Perrin, M. D.; Wallace, J.; Anderson, R. E.; Carlotti, A.; Groff, T. D.; Hartig, G. F.; Kasdin, J.; Lajoie, C.; Levecq, O.; Long, C.; Macintosh, B.; Mawet, D.; Norman, C. A.; Shaklan, S.; Sheckells, M.; Sivaramakrishnan, A.; Soummer, R.

    2014-01-01

    We present a new high-contrast imaging testbed designed to provide complete solutions for wavefront sensing and control and starlight suppression with complex aperture telescopes (NASA APRA; Soummer PI). This includes geometries with central obstruction, support structures, and/or primary mirror segmentation. Complex aperture telescopes are often associated with large telescope designs, which are considered for future space missions. However, these designs make high-contrast imaging challenging because of additional diffraction features in the point spread function. We present a novel optimization approach for the testbed optical and opto-mechanical design that minimizes the impact of both phase and amplitude errors arising from the wave propagation of testbed optics surface errors. This design approach allows us to define the specifications for the bench optics, which we then compare to the manufactured parts. We discuss the testbed alignment and first results. We also present our coronagraph designs for different testbed pupil shapes (AFTA or ATLAST), which involve a new method for the optimization of Apodized Pupil Lyot Coronagraphs (APLC).

  4. Closing the contrast gap between testbed and model prediction with WFIRST-CGI shaped pupil coronagraph

    Science.gov (United States)

    Zhou, Hanying; Nemati, Bijan; Krist, John; Cady, Eric; Prada, Camilo M.; Kern, Brian; Poberezhskiy, Ilya

    2016-07-01

    JPL has recently passed an important milestone in its technology development for a proposed NASA WFIRST mission coronagraph: demonstration of better than 1×10⁻⁸ contrast over a broad bandwidth (10%) on both shaped pupil coronagraph (SPC) and hybrid Lyot coronagraph (HLC) testbeds with the WFIRST obscuration pattern. Challenges remain, however, in the technology readiness for the proposed mission. One is the discrepancy between the achieved contrasts on the testbeds and their corresponding model predictions. A series of testbed diagnoses and modeling activities were planned and carried out on the SPC testbed in order to close the gap. A very useful tool we developed was a derived "measured" testbed wavefront control Jacobian matrix that could be compared with the model-predicted "control" version that was used to generate the high-contrast dark hole region in the image plane. The difference between these two is an estimate of the error in the control Jacobian. When the control matrix, which includes both amplitude and phase, was modified to reproduce the error, the simulated performance closely matched the SPC testbed behavior in both contrast floor and contrast convergence speed. This is a step closer toward model validation for high-contrast coronagraphs. Further Jacobian analysis and modeling provided clues to the possible sources of the mismatch: deformable mirror (DM) misregistration, testbed optical wavefront error (WFE), and the DM setting used for correcting this WFE. These analyses suggested that a high-contrast coronagraph has a tight tolerance on the accuracy of its control Jacobian. Modifications to both the testbed control model and the prediction model are being implemented, and future work is discussed.
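
    As a companion to the "measured" versus "control" Jacobian comparison described above, the sketch below illustrates, under stated assumptions, how a measured Jacobian can be assembled by poking actuators one at a time and differencing the estimated field, and how its discrepancy from a model Jacobian might be summarized. The estimate_e_field callback and the noise level are hypothetical placeholders, not the testbed's probe-based estimator.

```python
# Hedged sketch: build a "measured" control Jacobian by finite differences
# (one actuator poke per column) and compare it to a model Jacobian.
import numpy as np

def measure_jacobian(estimate_e_field, n_act: int, poke: float = 1.0) -> np.ndarray:
    """Columns are the per-unit-poke E-field responses of each actuator."""
    e0 = estimate_e_field(np.zeros(n_act))
    cols = []
    for a in range(n_act):
        dm = np.zeros(n_act)
        dm[a] = poke
        cols.append((estimate_e_field(dm) - e0) / poke)
    return np.column_stack(cols)

def jacobian_error(J_meas: np.ndarray, J_model: np.ndarray) -> float:
    """Relative Frobenius-norm discrepancy between measured and model Jacobians."""
    return np.linalg.norm(J_meas - J_model) / np.linalg.norm(J_model)

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    J_true = rng.standard_normal((100, 24)) + 1j * rng.standard_normal((100, 24))
    fake_estimator = lambda dm: J_true @ dm + 1e-3 * rng.standard_normal(100)  # stand-in estimator
    J_meas = measure_jacobian(fake_estimator, n_act=24)
    print("relative Jacobian error:", round(jacobian_error(J_meas, J_true), 4))
```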

  5. Wireless Testbed Bonsai

    Science.gov (United States)

    2006-02-01

    wireless sensor device network, and a higher-tier multi-hop peer-to-peer 802.11b wireless network of about 200 Stargate nodes. Leading up to the full ExScal...deployment, we conducted spatial scaling tests of our higher-tier protocols on a 7 × 7 grid of Stargate nodes with 45 m and 90 m separations, respectively...on W and its scaled version W̃. III. EXPERIMENTAL SETUP: Description of the Kansei testbed. A Stargate is a single-board Linux-based computer [7]. It uses a

  6. Current Developments in DETER Cybersecurity Testbed Technology

    Science.gov (United States)

    2015-12-08

    Experimental cybersecurity research is often inherently risky. An experiment may involve releasing live malware code, operating a real botnet... imagine a worm that can only propagate by first contacting a "propagation service" (T1 constraint), composed with a testbed firewall (T2... experiment. Finally, T1 constraints might be enforced by (1) explicit modification of malware to constrain its behavior, (2) implicit constraints

  7. INFN Tier-1 Testbed Facility

    International Nuclear Information System (INIS)

    Gregori, Daniele; Cavalli, Alessandro; Dell'Agnello, Luca; Dal Pra, Stefano; Prosperini, Andrea; Ricci, Pierpaolo; Ronchieri, Elisabetta; Sapunenko, Vladimir

    2012-01-01

    INFN-CNAF, located in Bologna, is the Information Technology Center of the National Institute of Nuclear Physics (INFN). In the framework of the Worldwide LHC Computing Grid, INFN-CNAF is one of the eleven worldwide Tier-1 centers that store and reprocess Large Hadron Collider (LHC) data. The Italian Tier-1 provides the storage resources (i.e., disk space for short-term needs and tapes for long-term needs) and computing power that are needed for data processing and analysis by the LHC scientific community. Furthermore, the INFN Tier-1 houses computing resources for other particle physics experiments, like CDF at Fermilab and SuperB at Frascati, as well as for astroparticle and space physics experiments. The computing center is a very complex infrastructure: the hardware layer includes the network, storage and farming areas, while the software layer includes open source and proprietary software. Software updates and new hardware additions can unexpectedly degrade the production activity of the center; therefore, a testbed facility has been set up in order to reproduce and certify the various layers of the Tier-1. In this article we describe the testbed and the checks performed.

  8. Versatile Electric Propulsion Aircraft Testbed, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — An all-electric aircraft testbed is proposed to provide a dedicated development environment for the rigorous study and advancement of electrically powered aircraft....

  9. Mini-mast CSI testbed user's guide

    Science.gov (United States)

    Tanner, Sharon E.; Pappa, Richard S.; Sulla, Jeffrey L.; Elliott, Kenny B.; Miserentino, Robert; Bailey, James P.; Cooper, Paul A.; Williams, Boyd L., Jr.; Bruner, Anne M.

    1992-01-01

    The Mini-Mast testbed is a 20 m generic truss highly representative of future deployable trusses for space applications. It is fully instrumented for system identification and active vibrations control experiments and is used as a ground testbed at NASA-Langley. The facility has actuators and feedback sensors linked via fiber optic cables to the Advanced Real Time Simulation (ARTS) system, where user defined control laws are incorporated into generic controls software. The object of the facility is to conduct comprehensive active vibration control experiments on a dynamically realistic large space structure. A primary goal is to understand the practical effects of simplifying theoretical assumptions. This User's Guide describes the hardware and its primary components, the dynamic characteristics of the test article, the control law implementation process, and the necessary safeguards employed to protect the test article. Suggestions for a strawman controls experiment are also included.

  10. Real-Time Simulation and Hardware-in-the-Loop Testbed for Distribution Synchrophasor Applications

    Directory of Open Access Journals (Sweden)

    Matthias Stifter

    2018-04-01

    Full Text Available With the advent of Distribution Phasor Measurement Units (D-PMUs) and Micro-Synchrophasors (Micro-PMUs), situational awareness in power distribution systems is going to the next level using time synchronization. However, designing, analyzing, and testing such accurate measurement devices is still challenging. Due to the lack of available knowledge and sufficient history for synchrophasor applications at the power distribution level, realistic simulation and validation environments are essential for D-PMU development and deployment. This paper presents a vendor-agnostic PMU real-time simulation and hardware-in-the-loop (PMU-RTS-HIL) testbed, which helps in validating and studying multiple PMUs. The network of real and virtual PMUs was built in a fully time-synchronized environment for PMU applications' validation. The proposed testbed also includes an emulated communication network (CNS) layer to replicate the bandwidth, packet loss, and collision conditions inherent to PMU data streams. Experimental results demonstrate the flexibility and scalability of the developed PMU-RTS-HIL testbed by producing large amounts of measurements under typical normal and abnormal distribution grid operation conditions.
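
    For reference, the quantity a (D-)PMU streams can be computed from one nominal-frequency cycle of waveform samples with a single DFT bin, as in the minimal sketch below; this is the textbook estimate, not the testbed's synchrophasor algorithm, and the sampling rate is an illustrative assumption.

```python
# Minimal DFT-based synchrophasor estimate over one nominal cycle:
# returns the RMS magnitude and phase angle referenced to the window start.
import numpy as np

def phasor(samples: np.ndarray, fs: float, f_nom: float = 60.0) -> complex:
    """Estimate the phasor from exactly one nominal cycle of samples."""
    n = int(round(fs / f_nom))
    x = samples[:n]
    k = np.arange(n)
    return np.sqrt(2) / n * np.sum(x * np.exp(-2j * np.pi * f_nom * k / fs))

if __name__ == "__main__":
    fs = 7680.0                       # assumed rate: 128 samples per 60 Hz cycle
    t = np.arange(0, 0.2, 1 / fs)
    v = 230.0 * np.sqrt(2) * np.cos(2 * np.pi * 60.0 * t + np.deg2rad(-12.0))
    p = phasor(v, fs)
    print(f"|V| = {abs(p):.1f} V rms, angle = {np.degrees(np.angle(p)):.1f} deg")
```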

  11. NBodyLab: A Testbed for Undergraduates Utilizing a Web Interface to NEMO and MD-GRAPE2 Hardware

    Science.gov (United States)

    Johnson, V. L.; Teuben, P. J.; Penprase, B. E.

    An N-body simulation testbed called NBodyLab was developed at Pomona College as a teaching tool for undergraduates. The testbed runs under Linux and provides a web interface to selected back-end NEMO modeling and analysis tools, and several integration methods which can optionally use an MD-GRAPE2 supercomputer card in the server to accelerate calculation of particle-particle forces. The testbed provides a framework for using and experimenting with the main components of N-body simulations: data models and transformations, numerical integration of the equations of motion, analysis and visualization products, and acceleration techniques (in this case, special purpose hardware). The testbed can be used by students with no knowledge of programming or Unix, freeing such students and their instructor to spend more time on scientific experimentation. The advanced student can extend the testbed software and/or more quickly transition to the use of more advanced Unix-based toolsets such as NEMO, Starlab and model builders such as GalactICS. Cosmology students at Pomona College used the testbed to study collisions of galaxies with different speeds, masses, densities, collision angles, angular momentum, etc., attempting to simulate, for example, the Tadpole Galaxy and the Antenna Galaxies. The testbed framework is available as open-source to assist other researchers and educators. Recommendations are made for testbed enhancements.
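
    As a rough illustration of the numerical integration of the equations of motion that such a testbed wraps behind its web interface, the sketch below implements a kick-drift-kick leapfrog integrator for a small N-body system. The function, softening length and two-body initial conditions are invented for the example and are not taken from NEMO or NBodyLab.

```python
import numpy as np

def leapfrog(pos, vel, mass, dt, n_steps, G=1.0, eps=1e-3):
    """Kick-drift-kick leapfrog integration; pos/vel are (N, 3), mass is (N,)."""
    def accel(p):
        d = p[np.newaxis, :, :] - p[:, np.newaxis, :]          # d[i, j] = p_j - p_i
        inv_r3 = (np.sum(d ** 2, axis=-1) + eps ** 2) ** -1.5  # softened 1/r^3
        np.fill_diagonal(inv_r3, 0.0)                          # no self-force
        return G * np.sum(d * (mass[np.newaxis, :, np.newaxis] * inv_r3[:, :, np.newaxis]), axis=1)

    a = accel(pos)
    for _ in range(n_steps):
        vel += 0.5 * dt * a        # half kick
        pos += dt * vel            # drift
        a = accel(pos)
        vel += 0.5 * dt * a        # half kick
    return pos, vel

# Two equal point masses on a circular orbit (G = 1 units)
m = np.array([1.0, 1.0])
x = np.array([[-0.5, 0.0, 0.0], [0.5, 0.0, 0.0]])
vc = np.sqrt(0.5)                  # circular-orbit speed for this mass and separation
v = np.array([[0.0, -vc, 0.0], [0.0, vc, 0.0]])
x, v = leapfrog(x, v, m, dt=0.01, n_steps=1000)
```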

  12. Definition study for variable cycle engine testbed engine and associated test program

    Science.gov (United States)

    Vdoviak, J. W.

    1978-01-01

    The product/study double bypass variable cycle engine (VCE) was updated to incorporate recent improvements. The effect of these improvements on mission range and noise levels was determined. This engine design was then compared with current existing high-technology core engines in order to define a subscale testbed configuration that simulated many of the critical technology features of the product/study VCE. Detailed preliminary program plans were then developed for the design, fabrication, and static test of the selected testbed engine configuration. These plans included estimated costs and schedules for the detail design, fabrication and test of the testbed engine and the definition of a test program, test plan, schedule, instrumentation, and test stand requirements.

  13. Optical Network Testbeds Workshop

    Energy Technology Data Exchange (ETDEWEB)

    Joe Mambretti

    2007-06-01

    This is the summary report of the third annual Optical Networking Testbed Workshop (ONT3), which brought together leading members of the international advanced research community to address major challenges in creating next generation communication services and technologies. Networking research and development (R&D) communities throughout the world continue to discover new methods and technologies that are enabling breakthroughs in advanced communications. These discoveries are keystones for building the foundation of the future economy, which requires the sophisticated management of extremely large quantities of digital information through high performance communications. This innovation is made possible by basic research and experiments within laboratories and on specialized testbeds. Initial network research and development initiatives are driven by diverse motives, including attempts to solve existing complex problems, the desire to create powerful new technologies that do not exist using traditional methods, and the need to create tools to address specific challenges, including those mandated by large scale science or government agency mission agendas. Many new discoveries related to communications technologies transition to widespread deployment through standards organizations and commercialization. These transition paths allow for new communications capabilities that drive many sectors of the digital economy. In the last few years, networking R&D has increasingly focused on advancing multiple new capabilities enabled by next generation optical networking. Both US Federal networking R&D and other national R&D initiatives, such as those organized by the National Institute of Information and Communications Technology (NICT) of Japan, are creating optical networking technologies that allow for new, powerful communication services. Among the most promising services are those based on new types of multi-service or hybrid networks, which use new optical networking

  14. Easy as Pi: A Network Coding Raspberry Pi Testbed

    Directory of Open Access Journals (Sweden)

    Chres W. Sørensen

    2016-10-01

    In the near future, upcoming communications and storage networks are expected to tolerate major difficulties produced by the huge amounts of data being generated by the Internet of Things (IoT). For these types of networks, strategies and mechanisms based on network coding have appeared as an alternative to overcome these difficulties in a holistic manner, e.g., without sacrificing the benefit of a given network metric when improving another. There have been recurrent issues in (i) making large-scale deployments akin to the Internet of Things, (ii) assessing, and (iii) replicating the results obtained in preliminary studies. Therefore, testbeds that can handle large-scale deployments without losing historic data are greatly needed and desirable from a research perspective for evaluating these mechanisms. However, this can be hard to manage, not only because of the inherent costs of the hardware, but also because of maintenance challenges. In this paper, we present the key steps required to design, set up and maintain an inexpensive testbed using Raspberry Pi devices for communications and storage networks with network-coding capabilities. This testbed can be utilized for any application requiring replicable results.
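
    The sketch below illustrates, in its simplest form, the network-coding idea such a testbed is built to evaluate: a relay broadcasts the XOR of two packets, and each receiver recovers the packet it is missing by combining the coded packet with the one it already holds. The packet contents and function name are hypothetical, and practical deployments typically use random linear network coding rather than a single XOR.

```python
def xor_packets(a: bytes, b: bytes) -> bytes:
    """Bitwise XOR of two equal-length packets."""
    return bytes(x ^ y for x, y in zip(a, b))

pkt_a = b"temperature=21.5"
pkt_b = b"humidity=040.2%%"            # padded to the same length as pkt_a

coded = xor_packets(pkt_a, pkt_b)      # what the relay broadcasts

assert xor_packets(coded, pkt_a) == pkt_b   # receiver holding pkt_a recovers pkt_b
assert xor_packets(coded, pkt_b) == pkt_a   # receiver holding pkt_b recovers pkt_a
```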

  15. Automated Image Analysis of HER2 Fluorescence In Situ Hybridization to Refine Definitions of Genetic Heterogeneity in Breast Cancer Tissue.

    Science.gov (United States)

    Radziuviene, Gedmante; Rasmusson, Allan; Augulis, Renaldas; Lesciute-Krilaviciene, Daiva; Laurinaviciene, Aida; Clim, Eduard; Laurinavicius, Arvydas

    2017-01-01

    Human epidermal growth factor receptor 2 gene- (HER2-) targeted therapy for breast cancer relies primarily on HER2 overexpression established by immunohistochemistry (IHC) with borderline cases being further tested for amplification by fluorescence in situ hybridization (FISH). Manual interpretation of HER2 FISH is based on a limited number of cells and rather complex definitions of equivocal, polysomic, and genetically heterogeneous (GH) cases. Image analysis (IA) can extract high-capacity data and potentially improve HER2 testing in borderline cases. We investigated statistically derived indicators of HER2 heterogeneity in HER2 FISH data obtained by automated IA of 50 IHC borderline (2+) cases of invasive ductal breast carcinoma. Overall, IA significantly underestimated the conventional HER2, CEP17 counts, and HER2/CEP17 ratio; however, it collected more amplified cells in some cases below the lower limit of GH definition by manual procedure. Indicators for amplification, polysomy, and bimodality were extracted by factor analysis and allowed clustering of the tumors into amplified, nonamplified, and equivocal/polysomy categories. The bimodality indicator provided independent cell diversity characteristics for all clusters. Tumors classified as bimodal only partially coincided with the conventional GH heterogeneity category. We conclude that automated high-capacity nonselective tumor cell assay can generate evidence-based HER2 intratumor heterogeneity indicators to refine GH definitions.
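
    As a hedged illustration of how high-capacity per-cell FISH data can be turned into aggregate and heterogeneity indicators, the sketch below computes a HER2/CEP17 ratio, the fraction of amplified cells and a crude bimodality flag from hypothetical per-cell counts. The thresholds and the two-group split are placeholders and do not reproduce the factor-analysis indicators derived in the study.

```python
import numpy as np

# Hypothetical per-cell HER2 and CEP17 signal counts from image analysis
her2  = np.array([2, 3, 2, 8, 9, 2, 3, 10, 2, 2, 7, 3, 2, 2, 9])
cep17 = np.array([2, 2, 2, 2, 2, 2, 2,  2, 2, 2, 2, 2, 2, 2, 2])

ratio = her2.sum() / cep17.sum()              # aggregate HER2/CEP17 ratio
mean_her2 = her2.mean()                       # mean HER2 copies per cell
cell_ratios = her2 / cep17
pct_amplified = np.mean(cell_ratios >= 2.0)   # fraction of cells with ratio >= 2

# Crude bimodality flag: both sub-populations present and well separated
high = cell_ratios >= 2.0
bimodal = high.any() and (~high).any() and \
          (cell_ratios[high].mean() - cell_ratios[~high].mean()) > 2.0

print(f"ratio={ratio:.2f}, mean HER2={mean_her2:.1f}, "
      f"amplified cells={pct_amplified:.0%}, bimodal={bimodal}")
```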

  16. An adaptable, low cost test-bed for unmanned vehicle systems research

    Science.gov (United States)

    Goppert, James M.

    2011-12-01

    An unmanned vehicle systems test-bed has been developed. The test-bed has been designed to accommodate hardware changes and various vehicle types and algorithms. The creation of this test-bed allows research teams to focus on algorithm development and employ a common, well-tested experimental framework. The ArduPilotOne autopilot was developed to provide the necessary level of abstraction for multiple vehicle types. The autopilot was also designed to be highly integrated with the Mavlink protocol for Micro Air Vehicle (MAV) communication. Mavlink is the native protocol for QGroundControl, a MAV ground control program. Features were added to QGroundControl to accommodate outdoor usage. Next, the Mavsim toolbox was developed for Scicoslab to allow hardware-in-the-loop testing, control design and analysis, and estimation algorithm testing and verification. In order to obtain linear models of aircraft dynamics, the JSBSim flight dynamics engine was extended to use a probabilistic Nelder-Mead simplex method. The JSBSim aircraft dynamics were compared with the collected wind-tunnel data. Finally, a structured methodology for successive loop closure control design is proposed. This methodology is demonstrated, along with the rest of the test-bed tools, on a quadrotor, a fixed-wing RC plane, and a ground vehicle. Test results for the ground vehicle are presented.
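
    As a simplified stand-in for the probabilistic Nelder-Mead simplex fitting used with JSBSim to obtain linear models, the sketch below fits a first-order response model to synthetic data with SciPy's ordinary Nelder-Mead optimizer. The model form, data and starting point are assumptions chosen only for illustration.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical roll-rate step response to be fitted with p(t) = K * (1 - exp(-t / tau))
t = np.linspace(0.0, 2.0, 50)
p_meas = 1.8 * (1.0 - np.exp(-t / 0.35)) + 0.02 * np.random.randn(t.size)

def cost(theta):
    K, tau = theta
    p_model = K * (1.0 - np.exp(-t / max(tau, 1e-6)))   # guard against tau <= 0
    return np.sum((p_model - p_meas) ** 2)

result = minimize(cost, x0=[1.0, 0.1], method="Nelder-Mead")
print(result.x)   # should recover roughly K = 1.8, tau = 0.35
```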

  17. Development of a smart-antenna test-bed, demonstrating software defined digital beamforming

    NARCIS (Netherlands)

    Kluwer, T.; Slump, Cornelis H.; Schiphorst, Roelof; Hoeksema, F.W.

    2001-01-01

    This paper describes a smart-antenna test-bed consisting of commercial off-the-shelf (COTS) hardware and software defined radio components. The use of software radio components enables a flexible platform to implement and test mobile communication systems as a real-world system. The test-bed is

  18. PEER Testbed Study on a Laboratory Building: Exercising Seismic Performance Assessment

    OpenAIRE

    Comerio, Mary C.; Stallmeyer, John C.; Smith, Ryan; Makris, Nicos; Konstantinidis, Dimitrios; Mosalam, Khalid; Lee, Tae-Hyung; Beck, James L.; Porter, Keith A.; Shaikhutdinov, Rustem; Hutchinson, Tara; Chaudhuri, Samit Ray; Chang, Stephanie E.; Falit-Baiamonte, Anthony; Holmes, William T.

    2005-01-01

    From 2002 to 2004 (years five and six of a ten-year funding cycle), the PEER Center organized the majority of its research around six testbeds. Two buildings and two bridges, a campus, and a transportation network were selected as case studies to “exercise” the PEER performance-based earthquake engineering methodology. All projects involved interdisciplinary teams of researchers, each producing data to be used by other colleagues in their research. The testbeds demonstrat...

  19. Automated colorimetric in situ hybridization (CISH) detection of immunoglobulin (Ig) light chain mRNA expression in plasma cell (PC) dyscrasias and non-Hodgkin lymphoma.

    Science.gov (United States)

    Beck, Rose C; Tubbs, Raymond R; Hussein, Mohamad; Pettay, James; Hsi, Eric D

    2003-03-01

    Immunohistochemistry (IHC) is frequently used to detect plasma cell (PC) or B cell monoclonality in histologic sections, but its interpretation is often confounded by background staining. We evaluated a new automated method for colorimetric in situ hybridization (CISH) detection of clonality in PC dyscrasias and small B cell lymphomas. Cases of PC dyscrasia included multiple myeloma (MM; 31 cases), plasmacytoma (seven cases), or amyloidosis (one case), while cases of lymphoma included small lymphocytic (three cases), marginal zone (four cases), lymphoplasmacytic (three cases), and mantle cell lymphomas (three cases). Tissue sections were stained for kappa and lambda light chains by IHC and for light chain mRNA by automated CISH using haptenated probes. Twenty-eight of 31 MM cases had detectable light chain restriction by IHC. Thirty of 31 MM cases demonstrated light chain restriction by CISH, including 2 cases with uninterpretable IHC and one case of nonsecretory myeloma, which was negative for light chains by IHC. Seven of 7 plasmacytoma cases had detectable light chain restriction by CISH, including one case of nonsecretory plasmacytoma in which IHC was noninformative. Automated CISH demonstrated monoclonality in 9 of 13 cases of B cell non-Hodgkin lymphoma and had a slightly higher sensitivity than IHC (6 of 13 cases), especially in cases of lymphoplasmacytic and marginal zone lymphoma. Overall, there were no discrepancies in light chain restriction results between IHC, CISH, or serum paraprotein analysis. Automated CISH is useful in detecting light chain expression in paraffin sections and appeared superior to IHC for light chain detection in PC dyscrasias and B cell non-Hodgkin lymphomas, predominantly due to lack of background staining.

  20. Diffraction-based analysis of tunnel size for a scaled external occulter testbed

    Science.gov (United States)

    Sirbu, Dan; Kasdin, N. Jeremy; Vanderbei, Robert J.

    2016-07-01

    For performance verification of an external occulter mask (also called a starshade), scaled testbeds have been developed to measure the suppression of the occulter shadow in the pupil plane and contrast in the image plane. For occulter experiments the scaling is typically performed by maintaining an equivalent Fresnel number. The original Princeton occulter testbed was oversized with respect to both input beam and shadow propagation to limit any diffraction effects due to finite testbed enclosure edges; however, to operate at realistic space-mission equivalent Fresnel numbers an extended testbed is currently under construction. With the longer propagation distances involved, diffraction effects due to the edge of the tunnel must now be considered in the experiment design. Here, we present a diffraction-based model of two separate tunnel effects. First, we consider the effect of tunnel-edge induced diffraction ringing upstream from the occulter mask. Second, we consider the diffraction effect due to clipping of the output shadow by the tunnel downstream from the occulter mask. These calculations are performed for a representative point design relevant to the new Princeton occulter experiment, but we also present an analytical relation that can be used for other propagation distances.
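
    The equivalent-Fresnel-number scaling mentioned above can be made concrete with a short calculation: holding F = R^2 / (lambda * z) fixed sets the propagation distance required in the lab once the occulter radius is scaled down. The numerical values below are illustrative only and are not the dimensions of the Princeton testbed.

```python
def fresnel_number(radius_m, wavelength_m, distance_m):
    """Fresnel number F = R**2 / (lambda * z) for an occulter of radius R at distance z."""
    return radius_m ** 2 / (wavelength_m * distance_m)

# Flight-like geometry (illustrative): 25 m occulter radius, 50,000 km separation, 600 nm light
F_flight = fresnel_number(25.0, 600e-9, 5.0e7)

# Lab geometry: shrink the mask radius and solve for the tunnel length that keeps F fixed
r_lab = 0.012                                   # 12 mm mask radius (assumed)
z_lab = r_lab ** 2 / (600e-9 * F_flight)        # required propagation distance in metres
print(F_flight, z_lab)
```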

  1. Graphical interface between the CIRSSE testbed and CimStation software with MCS/CTOS

    Science.gov (United States)

    Hron, Anna B.

    1992-01-01

    This research is concerned with developing a graphical simulation of the testbed at the Center for Intelligent Robotic Systems for Space Exploration (CIRSSE) and the interface which allows for communication between the two. Such an interface is useful in telerobotic operations and as a functional interaction tool for testbed users. Creating a simulated model of a real-world system generates inevitable calibration discrepancies between the two. This thesis gives a brief overview of the work done to date in the area of workcell representation and communication, describes the development of the CIRSSE interface, and gives a direction for future work in the area of system calibration. The CimStation software used for development of this interface is a highly versatile robotic workcell simulation package which has been programmed for this application with a scale graphical model of the testbed and supporting interface menu code. A need for this tool has been identified for path previewing, as a window on teleoperation, and for calibration of simulated vs. real-world models. The interface allows information (i.e., joint angles) generated by CimStation to be sent as motion goal positions to the testbed robots. An option of the interface has been established such that joint angle information generated by supporting testbed algorithms (i.e., TG, collision avoidance) can be piped through CimStation as a visual preview of the path.

  2. A Method to Analyze Threats and Vulnerabilities by Using a Cyber Security Test-bed of an Operating NPP

    International Nuclear Information System (INIS)

    Kim, Yong Sik; Son, Choul Woong; Lee, Soo Ill

    2016-01-01

    In order to implement cyber security controls for an operating NPP, a security assessment should be conducted in advance, and it is essential to analyze threats and vulnerabilities for the cyber security risk assessment phase. It might be impossible to perform a penetration test or vulnerability scanning, because such a test may adversely affect the inherent functions of the systems. This is the reason why we develop and construct a cyber security test-bed instead of using the real I and C systems in the operating NPP. In this paper, we propose a method to analyze threats and vulnerabilities of a specific target system by using a cyber security test-bed. The test-bed is being developed considering the essential functions of the selected safety and non-safety systems. In order to develop the cyber security test-bed with both safety and non-safety functions, a test-bed function analysis and a preliminary identification of threats and vulnerabilities have been conducted. We will determine the attack scenarios and conduct the test-bed based vulnerability analysis.

  3. A Method to Analyze Threats and Vulnerabilities by Using a Cyber Security Test-bed of an Operating NPP

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yong Sik; Son, Choul Woong; Lee, Soo Ill [KHNP CRI, Daejeon (Korea, Republic of)

    2016-10-15

    In order to implement cyber security controls for an operating NPP, a security assessment should be conducted in advance, and it is essential to analyze threats and vulnerabilities for the cyber security risk assessment phase. It might be impossible to perform a penetration test or vulnerability scanning, because such a test may adversely affect the inherent functions of the systems. This is the reason why we develop and construct a cyber security test-bed instead of using the real I and C systems in the operating NPP. In this paper, we propose a method to analyze threats and vulnerabilities of a specific target system by using a cyber security test-bed. The test-bed is being developed considering the essential functions of the selected safety and non-safety systems. In order to develop the cyber security test-bed with both safety and non-safety functions, a test-bed function analysis and a preliminary identification of threats and vulnerabilities have been conducted. We will determine the attack scenarios and conduct the test-bed based vulnerability analysis.

  4. The Living With a Star Space Environment Testbed Program

    Science.gov (United States)

    Barth, Janet; LaBel, Kenneth; Day, John H. (Technical Monitor)

    2001-01-01

    NASA has initiated the Living With a Star (LWS) Program to develop the scientific understanding needed to address the aspects of the connected Sun-Earth system that affect life and society. The Program architecture includes science missions, theory and modeling, and Space Environment Testbeds (SET). The current paper discusses the Space Environment Testbeds. The goal of the SET program is to improve the engineering approach to accommodate and/or mitigate the effects of solar variability on spacecraft design and operations. The SET Program will infuse new technologies into the space programs through the collection of data in space and the subsequent design and validation of technologies. Examples of these technologies are cited and discussed.

  5. User interface design principles for the SSM/PMAD automated power system

    International Nuclear Information System (INIS)

    Jakstas, L.M.; Myers, C.J.

    1991-01-01

    Computer-human interfaces are an integral part of developing software for spacecraft power systems. A well-designed and efficient user interface enables an engineer to operate the system effectively, while concurrently preventing the user from entering data that is beyond boundary conditions or performing operations that are out of context. A user interface should also be designed to ensure that the engineer easily obtains all useful and critical data for operating the system and is aware of all faults and states in the system. Martin Marietta, under contract to NASA George C. Marshall Space Flight Center, has developed a user interface for the Space Station Module Power Management and Distribution (SSM/PMAD) automated power system testbed which provides human access to the functionality of the power system, as well as exemplifying current techniques in user interface design. The testbed user interface was designed to enable an engineer to operate the system easily without having significant knowledge of computer systems, as well as to provide an environment in which the engineer can monitor and interact with the SSM/PMAD system hardware. The design of the interface supports a global view of the most important data from the various hardware and software components, as well as enabling the user to obtain additional or more detailed data when needed. The components and representations of the SSM/PMAD testbed user interface are examined in this paper. An engineer's interactions with the system are also described.

  6. University of Florida Advanced Technologies Campus Testbed

    Science.gov (United States)

    2017-09-21

    The University of Florida (UF) and its Transportation Institute (UFTI), the Florida Department of Transportation (FDOT) and the City of Gainesville (CoG) are cooperating to develop a smart transportation testbed on the University of Florida (UF) main...

  7. Development and verification testing of automation and robotics for assembly of space structures

    Science.gov (United States)

    Rhodes, Marvin D.; Will, Ralph W.; Quach, Cuong C.

    1993-01-01

    A program was initiated within the past several years to develop operational procedures for automated assembly of truss structures suitable for large-aperture antennas. The assembly operations require the use of a robotic manipulator and are based on the principle of supervised autonomy to minimize crew resources. A hardware testbed was established to support development and evaluation testing. A brute-force automation approach was used to develop the baseline assembly hardware and software techniques. As the system matured and an operation was proven, upgrades were incorporated and assessed against the baseline test results. This paper summarizes the developmental phases of the program, the results of several assembly tests, the current status, and a series of proposed developments for additional hardware and software control capability. No problems that would preclude automated in-space assembly of truss structures have been encountered. The current system was developed at a breadboard level and continued development at an enhanced level is warranted.

  8. Automated processing of fluorescence in-situ hybridization slides for HER2 testing in breast and gastro-esophageal carcinomas.

    Science.gov (United States)

    Tafe, Laura J; Allen, Samantha F; Steinmetz, Heather B; Dokus, Betty A; Cook, Leanne J; Marotti, Jonathan D; Tsongalis, Gregory J

    2014-08-01

    HER2 fluorescence in-situ hybridization (FISH) is used in breast and gastro-esophageal carcinoma for determining HER2 gene amplification and patients' eligibility for HER2-targeted therapeutics. Traditional manual processing of the FISH slides is labor intensive because of multiple steps that require hands-on manipulation of the slides and specifically timed intervals between steps. This highly manual processing also introduces inter-run and inter-operator variability that may affect the quality of the FISH result. Therefore, we sought to incorporate an automated processing instrument into our FISH workflow. Twenty-six cases including breast (20) and gastro-esophageal (6) cancer, comprising 23 biopsies and three excision specimens, were tested for HER2 FISH (Pathvysion, Abbott) using the Thermobrite Elite (TBE) system (Leica). Up to 12 slides can be run simultaneously. All cases were previously tested by the Pathvysion HER2 FISH assay with manual preparation. Twenty cells were counted by two observers for each case; five cases were tested on three separate runs by different operators to evaluate the precision and inter-operator variability. There was 100% concordance in the scoring between the manual and TBE methods as well as among the five cases that were tested on three runs. Only one case failed due to poor probe hybridization. In total, seven cases were positive for HER2 amplification (HER2:CEP17 ratio >2.2) and the remaining 19 were negative (HER2:CEP17 ratio <1.8) utilizing the 2007 ASCO/CAP scoring criteria. Due to the automated denaturation and hybridization, for each run there was a reduction in labor of 3.5 h which could then be dedicated to other lab functions. The TBE is a walk-away pre- and post-hybridization system that automates FISH slide processing, improves workflow and consistency, and saves approximately 3.5 h of technologist time. The instrument has a small footprint, thus occupying minimal counter space. TBE processed slides performed

  9. Torpedo and countermeasures modelling in the Torpedo Defence System Testbed

    NARCIS (Netherlands)

    Benders, F.P.A.; Witberg, R.R.; H.J. Grootendorst, H.J.

    2002-01-01

    Several years ago, TNO-FEL started the development of the Torpedo Defence System Testbed (TDSTB) based on the TORpedo SIMulation (TORSIM) model and the Maritime Operations Simulation and Evaluation System (MOSES). MOSES provides the simulation and modelling environment for the evaluation and

  10. Analysis, Modeling, and Simulation (AMS) Testbed Development and Evaluation to Support Dynamic Mobility Applications (DMA) and Active Transportation and Demand Management (ATDM) Programs - calibration Report for Phoenix Testbed : Final Report. [supporting datasets - Phoenix Testbed

    Science.gov (United States)

    2017-07-26

    The datasets in this zip file are in support of FHWA-JPO-16-379, Analysis, Modeling, and Simulation (AMS) Testbed Development and Evaluation to Support Dynamic Mobility Applications (DMA) and Active Transportation and Demand Management (ATDM) Program...

  11. COLUMBUS as Engineering Testbed for Communications and Multimedia Equipment

    Science.gov (United States)

    Bank, C.; Anspach von Broecker, G. O.; Kolloge, H.-G.; Richters, M.; Rauer, D.; Urban, G.; Canovai, G.; Oesterle, E.

    2002-01-01

    The paper presents ongoing activities to prepare COLUMBUS for communications and multimedia technology experiments. For this purpose, Astrium SI, Bremen, has studied several options how to best combine the given system architecture with flexible and state-of-the-art interface avionics and software. These activities have been conducted in coordination with, and partially under contract of, DLR and ESA/ESTEC. Moreover, Astrium SI has realized three testbeds for multimedia software and hardware testing under own funding. The experimental core avionics unit - about a half double rack - establishes the core of a new multi-user experiment facility for this type of investigation onboard COLUMBUS, which shall be available to all users of COLUMBUS. It allows for the connection of 2nd generation payload, that is payload requiring broadband data transfer and near-real-time access by the Principal Investigator on ground, to test highly interactive and near-realtime payload operation. The facility is also foreseen to test new equipment to provide the astronauts onboard the ISS/COLUMBUS with bi-directional hi-fi voice and video connectivity to ground, private voice coms and e-mail, and a multimedia workstation for ops training and recreation. Connection to an appropriate Wide Area Network (WAN) on Earth is possible. The facility will include a broadband data transmission front-end terminal, which is mounted externally on the COLUMBUS module. This Equipment provides high flexibility due to the complete transparent transmit and receive chains, the steerable multi-frequency antenna system and its own thermal and power control and distribution. The Equipment is monitored and controlled via the COLUMBUS internal facility. It combines several new hardware items, which are newly developed for the next generation of broadband communication satellites and operates in Ka-Band with the experimental ESA data relay satellite ARTEMIS. The equipment is also TDRSS compatible; the open loop

  12. Modular Algorithm Testbed Suite (MATS): A Software Framework for Automatic Target Recognition

    Science.gov (United States)

    2017-01-01

    Technical report NSWC PCD TR-2017-004, Naval Surface Warfare Center Panama City Division, Panama City, FL 32407-7001, 31-01-2017. A flexible platform is needed to facilitate the development and testing of ATR algorithms; to that end, NSWC PCD has created the Modular Algorithm Testbed Suite (MATS), a software framework for automatic target recognition.

  13. Prognostics-Enabled Power Supply for ADAPT Testbed, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Ridgetop's role is to develop electronic prognostics for sensing power systems in support of the NASA/Ames ADAPT testbed. The prognostic-enabled power systems from...

  14. Cognitive Medical Wireless Testbed System (COMWITS)

    Science.gov (United States)

    2016-11-01

    Scientific Progress: This testbed merges two ARO grants. The testbed hardware includes an Intel Xeon E5-1650v3 CPU (6 cores, 3.5 GHz, Turbo, HT, 15M cache, 140 W), an Intel Core i7-3770 (3.4 GHz quad core, 77 W), and dual Intel Xeon processors.

  15. Automated Operations Development for Advanced Exploration Systems

    Science.gov (United States)

    Haddock, Angie T.; Stetson, Howard

    2012-01-01

    Automated space operations command and control software development and its implementation must be an integral part of the vehicle design effort. The software design must encompass autonomous fault detection, isolation, and recovery capabilities and also provide "single button" intelligent functions for the crew. Development, operations, and safety approval experience with the Timeliner system onboard the International Space Station (ISS), which provided autonomous monitoring with response and single-command functionality of payload systems, can be built upon for future automated operations, as the ISS Payload effort was the first and only autonomous command and control system in continuous execution (6 years), 24 hours a day, 7 days a week, within a crewed spacecraft environment. Utilizing proven capabilities from the ISS Higher Active Logic (HAL) System, along with the execution component design from within the HAL 9000 Space Operating System, this design paper will detail the initial HAL System software architecture and interfaces as applied to NASA's Habitat Demonstration Unit (HDU) in support of the Advanced Exploration Systems, Autonomous Mission Operations project. The development and implementation of integrated simulators within this development effort will also be detailed and is the first step in verifying the effectiveness of the HAL 9000 Integrated Test-Bed Component [2] designs. This design paper will conclude with a summary of the current development status and future development goals as it pertains to automated command and control for the HDU.

  16. Visible nulling coronagraphy testbed development for exoplanet detection

    Science.gov (United States)

    Lyon, Richard G.; Clampin, Mark; Woodruff, Robert A.; Vasudevan, Gopal; Thompson, Patrick; Chen, Andrew; Petrone, Peter; Booth, Andrew; Madison, Timothy; Bolcar, Matthew; Noecker, M. Charley; Kendrick, Stephen; Melnick, Gary; Tolls, Volker

    2010-07-01

    Three of the recently completed NASA Astrophysics Strategic Mission Concept (ASMC) studies addressed the feasibility of using a Visible Nulling Coronagraph (VNC) as the prime instrument for exoplanet science. The VNC approach is one of the few approaches that works with filled, segmented and sparse or diluted aperture telescope systems and thus spans the space of potential ASMC exoplanet missions. NASA/Goddard Space Flight Center (GSFC) has a well-established effort to develop VNC technologies and has developed an incremental sequence of VNC testbeds to advance this approach and the technologies associated with it. Herein we report on the continued development of the vacuum Visible Nulling Coronagraph testbed (VNT). The VNT is an ultra-stable vibration isolated testbed that operates under high bandwidth closed-loop control within a vacuum chamber. It will be used to achieve an incremental sequence of three visible light nulling milestones of sequentially higher contrasts of 10^8, 10^9 and 10^10 at an inner working angle of 2*λ/D and ultimately culminate in spectrally broadband (>20%) high contrast imaging. Each of the milestones, one per year, is traceable to one or more of the ASMC studies. The VNT uses a Mach-Zehnder nulling interferometer, modified with a "W" configuration to accommodate a hex-packed MEMS based deformable mirror, a coherent fiber bundle and achromatic phase shifters. Discussed will be the optical configuration, laboratory results, critical technologies and the null sensing and control approach.

  17. Oligonucleotide PIK3CA/Chromosome 3 Dual in Situ Hybridization Automated Assay with Improved Signals, One-Hour Hybridization, and No Use of Blocking DNA.

    Science.gov (United States)

    Zhang, Wenjun; Hubbard, Antony; Baca-Parkinson, Leslie; Stanislaw, Stacey; Vladich, Frank; Robida, Mark D; Grille, James G; Maxwell, Daniel; Tsao, Tsu-Shuen; Carroll, William; Gardner, Tracie; Clements, June; Singh, Shalini; Tang, Lei

    2015-09-01

    The PIK3CA gene at chromosome 3q26.32 was found to be amplified in up to 45% of patients with squamous cell carcinoma of the lung. The strong correlation between PIK3CA amplification and increased phosphatidylinositol 3-kinase (PI3K) pathway activities suggested that PIK3CA gene copy number is a potential predictive biomarker for PI3K inhibitors. Currently, all microscopic assessments of PIK3CA and chromosome 3 (CHR3) copy numbers use fluorescence in situ hybridization. PIK3CA probes are derived from bacterial artificial chromosomes whereas CHR3 probes are derived mainly from the plasmid pHS05. These manual fluorescence in situ hybridization assays mandate 12- to 18-hour hybridization and use of blocking DNA from human sources. Moreover, fluorescence in situ hybridization studies provide limited morphologic assessment and suffer from signal decay. We developed an oligonucleotide-based bright-field in situ hybridization assay that overcomes these shortcomings. This assay requires only a 1-hour hybridization with no need for blocking DNA followed by indirect chromogenic detection. Oligonucleotide probes produced discrete and uniform CHR3 stains superior to those from the pHS05 plasmid. This assay achieved successful staining in 100% of the 195 lung squamous cell carcinoma resections and in 94% of the 33 fine-needle aspirates. This robust automated bright-field dual in situ hybridization assay for the simultaneous detection of PIK3CA and CHR3 centromere provides a potential clinical diagnostic method to assess PIK3CA gene abnormality in lung tumors.

  18. Vacuum Nuller Testbed (VNT) Performance, Characterization and Null Control: Progress Report

    Science.gov (United States)

    Lyon, Richard G.; Clampin, Mark; Petrone, Peter; Mallik, Udayan; Madison, Timothy; Bolcar, Matthew R.; Noecker, M. Charley; Kendrick, Stephen; Helmbrecht, Michael

    2011-01-01

    Herein we report on the development, sensing and control and our first results with the Vacuum Nuller Testbed to realize a Visible Nulling Coronagraph (VNC) for exoplanet coronagraphy. The VNC is one of the few approaches that works with filled, segmented and sparse or diluted-aperture telescope systems. It thus spans a range of potential future NASA telescopes and could be flown as a separate instrument on such a future mission. NASA/Goddard Space Flight Center (GSFC) has a well-established effort to develop VNC technologies, and has developed an incremental sequence of VNC testbeds to advance this approach and the enabling technologies associated with it. We discuss the continued development of the vacuum Visible Nulling Coronagraph testbed (VNT). The VNT is an ultra-stable vibration isolated testbed that operates under closed-loop control within a vacuum chamber. It will be used to achieve an incremental sequence of three visible-light nulling milestones with sequentially higher contrasts of 10^8, 10^9 and ideally 10^10 at an inner working angle of 2*λ/D. The VNT is based on a modified Mach-Zehnder nulling interferometer, with a "W" configuration to accommodate a hex-packed MEMS based deformable mirror, a coherent fiber bundle and achromatic phase shifters. We discuss the initial laboratory results, the optical configuration, critical technologies and the null sensing and control approach.

  19. Vacuum nuller testbed (VNT) performance, characterization and null control: progress report

    Science.gov (United States)

    Lyon, Richard G.; Clampin, Mark; Petrone, Peter; Mallik, Udayan; Madison, Timothy; Bolcar, Matthew R.; Noecker, M. Charley; Kendrick, Stephen; Helmbrecht, Michael

    2011-10-01

    Herein we report on the development, sensing and control and our first results with the Vacuum Nuller Testbed to realize a Visible Nulling Coronagraph (VNC) for exoplanet coronagraphy. The VNC is one of the few approaches that works with filled, segmented and sparse or diluted-aperture telescope systems. It thus spans a range of potential future NASA telescopes and could be flown as a separate instrument on such a future mission. NASA/Goddard Space Flight Center (GSFC) has a well-established effort to develop VNC technologies, and has developed an incremental sequence of VNC testbeds to advance this approach and the enabling technologies associated with it. We discuss the continued development of the vacuum Visible Nulling Coronagraph testbed (VNT). The VNT is an ultra-stable vibration isolated testbed that operates under closed-loop control within a vacuum chamber. It will be used to achieve an incremental sequence of three visible-light nulling milestones with sequentially higher contrasts of 10^8, 10^9, and ideally 10^10 at an inner working angle of 2*λ/D. The VNT is based on a modified Mach-Zehnder nulling interferometer, with a "W" configuration to accommodate a hex-packed MEMS based deformable mirror, a coherent fiber bundle and achromatic phase shifters. We discuss the initial laboratory results, the optical configuration, critical technologies and the null sensing and control approach.

  20. Towards an automated checked baggage inspection system augmented with robots

    Science.gov (United States)

    DeDonato, Matthew P.; Dimitrov, Velin; Padır, Taskin

    2014-05-01

    We present a novel system for enhancing the efficiency and accuracy of the checked baggage screening process at airports. The system requirements address the identification and retrieval of objects of interest that are prohibited in checked luggage. The automated testbed comprises a Baxter research robot, designed by Rethink Robotics, for luggage and object manipulation, and a down-looking overhead RGB-D sensor for inspection and detection. We discuss an overview of current system implementations, areas of opportunity for improvements, robot system integration challenges, details of the proposed software architecture, and experimental results from a case study for identifying various kinds of lighters in checked bags.

  1. Variable Coding and Modulation Experiment Using NASA's Space Communication and Navigation Testbed

    Science.gov (United States)

    Downey, Joseph A.; Mortensen, Dale J.; Evans, Michael A.; Tollis, Nicholas S.

    2016-01-01

    National Aeronautics and Space Administration (NASA)'s Space Communication and Navigation Testbed on the International Space Station provides a unique opportunity to evaluate advanced communication techniques in an operational system. The experimental nature of the Testbed allows for rapid demonstrations while using flight hardware in a deployed system within NASA's networks. One example is variable coding and modulation, which is a method to increase data-throughput in a communication link. This paper describes recent flight testing with variable coding and modulation over S-band using a direct-to-earth link between the SCaN Testbed and the Glenn Research Center. The testing leverages the established Digital Video Broadcasting Second Generation (DVB-S2) standard to provide various modulation and coding options. The experiment was conducted in a challenging environment due to the multipath and shadowing caused by the International Space Station structure. Performance of the variable coding and modulation system is evaluated and compared to the capacity of the link, as well as standard NASA waveforms.
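
    A minimal sketch of the variable coding and modulation idea follows: given a measured Es/N0, select the highest-throughput modulation and code-rate pair whose threshold the link still satisfies. The threshold table and function are placeholders and are not the DVB-S2 MODCOD set or the thresholds used in the SCaN Testbed experiment.

```python
# Hypothetical table of (Es/N0 threshold in dB, modulation, code rate, spectral efficiency)
MODCOD_TABLE = [
    (1.0,  "QPSK",   "1/2", 1.0),
    (5.2,  "QPSK",   "3/4", 1.5),
    (8.0,  "8PSK",   "2/3", 2.0),
    (11.0, "16APSK", "3/4", 3.0),
]

def select_modcod(esn0_db: float):
    """Return the highest-throughput MODCOD whose threshold the measured Es/N0 meets."""
    best = None
    for threshold, mod, rate, eff in MODCOD_TABLE:   # table sorted by increasing threshold
        if esn0_db >= threshold:
            best = (mod, rate, eff)
    return best    # None means even the most robust MODCOD cannot close the link

for snr in (0.0, 6.5, 12.3):
    print(snr, "dB ->", select_modcod(snr))
```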

  2. Extensible automated dispersive liquid–liquid microextraction

    Energy Technology Data Exchange (ETDEWEB)

    Li, Songqing; Hu, Lu; Chen, Ketao; Gao, Haixiang, E-mail: hxgao@cau.edu.cn

    2015-05-04

    Highlights: • An extensible automated dispersive liquid–liquid microextraction was developed. • A fully automatic SPE workstation with a modified operation program was used. • Ionic liquid-based in situ DLLME was used as the model method. • SPE columns packed with nonwoven polypropylene fiber were used for phase separation. • The approach was applied to the determination of benzoylurea insecticides in water. - Abstract: In this study, a convenient and extensible automated ionic liquid-based in situ dispersive liquid–liquid microextraction (automated IL-based in situ DLLME) was developed. 1-Octyl-3-methylimidazolium bis[(trifluoromethane)sulfonyl]imide ([C₈MIM]NTf₂) is formed through the reaction between [C₈MIM]Cl and lithium bis[(trifluoromethane)sulfonyl]imide (LiNTf₂) to extract the analytes. Using a fully automatic SPE workstation, special SPE columns packed with nonwoven polypropylene (NWPP) fiber, and a modified operation program, the procedures of the IL-based in situ DLLME, including the collection of a water sample, injection of an ion exchange solvent, phase separation of the emulsified solution, elution of the retained extraction phase, and collection of the eluent into vials, can be performed automatically. The developed approach, coupled with high-performance liquid chromatography–diode array detection (HPLC–DAD), was successfully applied to the detection and concentration determination of benzoylurea (BU) insecticides in water samples. Parameters affecting the extraction performance were investigated and optimized. Under the optimized conditions, the proposed method achieved extraction recoveries of 80% to 89% for water samples. The limits of detection (LODs) of the method were in the range of 0.16–0.45 ng mL⁻¹. The intra-column and inter-column relative standard deviations (RSDs) were <8.6%. Good linearity (r > 0.9986) was obtained over the calibration range from 2 to 500 ng mL⁻¹. The proposed

  3. Extensible automated dispersive liquid–liquid microextraction

    International Nuclear Information System (INIS)

    Li, Songqing; Hu, Lu; Chen, Ketao; Gao, Haixiang

    2015-01-01

    Highlights: • An extensible automated dispersive liquid–liquid microextraction was developed. • A fully automatic SPE workstation with a modified operation program was used. • Ionic liquid-based in situ DLLME was used as the model method. • SPE columns packed with nonwoven polypropylene fiber were used for phase separation. • The approach was applied to the determination of benzoylurea insecticides in water. - Abstract: In this study, a convenient and extensible automated ionic liquid-based in situ dispersive liquid–liquid microextraction (automated IL-based in situ DLLME) was developed. 1-Octyl-3-methylimidazolium bis[(trifluoromethane)sulfonyl]imide ([C₈MIM]NTf₂) is formed through the reaction between [C₈MIM]Cl and lithium bis[(trifluoromethane)sulfonyl]imide (LiNTf₂) to extract the analytes. Using a fully automatic SPE workstation, special SPE columns packed with nonwoven polypropylene (NWPP) fiber, and a modified operation program, the procedures of the IL-based in situ DLLME, including the collection of a water sample, injection of an ion exchange solvent, phase separation of the emulsified solution, elution of the retained extraction phase, and collection of the eluent into vials, can be performed automatically. The developed approach, coupled with high-performance liquid chromatography–diode array detection (HPLC–DAD), was successfully applied to the detection and concentration determination of benzoylurea (BU) insecticides in water samples. Parameters affecting the extraction performance were investigated and optimized. Under the optimized conditions, the proposed method achieved extraction recoveries of 80% to 89% for water samples. The limits of detection (LODs) of the method were in the range of 0.16–0.45 ng mL⁻¹. The intra-column and inter-column relative standard deviations (RSDs) were <8.6%. Good linearity (r > 0.9986) was obtained over the calibration range from 2 to 500 ng mL⁻¹. The proposed method opens a new avenue
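
    As an illustration of how the reported calibration, LOD and recovery figures relate, the sketch below fits a linear calibration curve and back-calculates a spiked sample. The data are invented, and the LOD convention (3.3·sigma/slope) is one common choice, not necessarily the one used by the authors.

```python
import numpy as np

# Invented calibration data: concentration in ng/mL vs. peak area (arbitrary units)
conc = np.array([2.0, 10.0, 50.0, 100.0, 250.0, 500.0])
area = np.array([5.1, 24.8, 124.0, 251.3, 622.9, 1248.0])

slope, intercept = np.polyfit(conc, area, 1)          # linear calibration fit
residual_sd = (area - (slope * conc + intercept)).std(ddof=2)
lod = 3.3 * residual_sd / slope                       # one common LOD convention

spiked, measured_area = 100.0, 251.3
found = (measured_area - intercept) / slope           # back-calculated concentration
recovery = 100.0 * found / spiked
print(f"slope={slope:.3f}, LOD~{lod:.2f} ng/mL, recovery~{recovery:.0f}%")
```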

  4. Implementation of a RPS Cyber Security Test-bed with Two PLCs

    International Nuclear Information System (INIS)

    Shin, Jinsoo; Heo, Gyunyoung; Son, Hanseong; An, Yongkyu; Rizwan, Uddin

    2015-01-01

    Our research team proposed a methodology to evaluate cyber security with a Bayesian network (BN) as a cyber security evaluation model and to help the operator, licensee, licensor or regulator in setting evaluation priorities. The methodology allowed for an overall evaluation of cyber security by considering the architectural aspect of the facility and the management aspect of cyber security at the same time. In order to strengthen the realism of this model by inserting real data, it is necessary to conduct a penetration test that emulates an actual cyber-attack. Through collaboration with the University of Illinois at Urbana-Champaign, which possesses the Tricon, a safety programmable logic controller (PLC) used at nuclear power plants, and which is developing a test-bed for nuclear power plants, a test-bed for the reactor protection system (RPS) is being developed with these PLCs. Two PLCs are used to construct a simple test-bed for the RPS: the bi-stable processor (BP) and the coincidence processor (CP). By using two PLCs, it is possible to examine cyber-attacks against devices such as a PLC, cyber-attacks against the communication between devices, and the effects of one PLC on the other. Two PLCs were used to construct a test-bed for the penetration test in this study. The advantages of using two or more PLCs instead of a single PLC are as follows. 1) Results of cyber-attacks that reflect the characteristics of interactions among PLCs can be obtained. 2) Cyber-attacks can be attempted using a method of attacking the communication between PLCs. The real data obtained can be applied to the existing cyber security evaluation model to strengthen its realism.

  5. Implementation of a RPS Cyber Security Test-bed with Two PLCs

    Energy Technology Data Exchange (ETDEWEB)

    Shin, Jinsoo; Heo, Gyunyoung [Kyung Hee Univ., Yongin (Korea, Republic of); Son, Hanseong [Joongbu Univ., Geumsan (Korea, Republic of); An, Yongkyu; Rizwan, Uddin [University of Illinois at Urbana-Champaign, Urbana (United States)

    2015-10-15

    Our research team proposed a methodology to evaluate cyber security with a Bayesian network (BN) as a cyber security evaluation model and to help the operator, licensee, licensor or regulator in setting evaluation priorities. The methodology allowed for an overall evaluation of cyber security by considering the architectural aspect of the facility and the management aspect of cyber security at the same time. In order to strengthen the realism of this model by inserting real data, it is necessary to conduct a penetration test that emulates an actual cyber-attack. Through collaboration with the University of Illinois at Urbana-Champaign, which possesses the Tricon, a safety programmable logic controller (PLC) used at nuclear power plants, and which is developing a test-bed for nuclear power plants, a test-bed for the reactor protection system (RPS) is being developed with these PLCs. Two PLCs are used to construct a simple test-bed for the RPS: the bi-stable processor (BP) and the coincidence processor (CP). By using two PLCs, it is possible to examine cyber-attacks against devices such as a PLC, cyber-attacks against the communication between devices, and the effects of one PLC on the other. Two PLCs were used to construct a test-bed for the penetration test in this study. The advantages of using two or more PLCs instead of a single PLC are as follows. 1) Results of cyber-attacks that reflect the characteristics of interactions among PLCs can be obtained. 2) Cyber-attacks can be attempted using a method of attacking the communication between PLCs. The real data obtained can be applied to the existing cyber security evaluation model to strengthen its realism.
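
    To illustrate the kind of Bayesian-network evaluation the test-bed data are meant to feed, the sketch below enumerates a tiny two-parent network in which a PLC can be compromised either through a direct device attack or through an attack on the PLC-to-PLC communication link. All probabilities are placeholders; in the proposed methodology they would be informed by the penetration-test results.

```python
# Placeholder prior and conditional probabilities (not from the study)
P_device = 0.10     # P(device attack succeeds)
P_link   = 0.05     # P(communication-link attack succeeds)
P_comp = {(True, True): 0.95, (True, False): 0.90,
          (False, True): 0.70, (False, False): 0.0}   # P(compromise | device, link)

# Exact inference by enumerating the joint distribution of the two parent nodes
p_total = 0.0
for device in (True, False):
    for link in (True, False):
        p_state = (P_device if device else 1 - P_device) * (P_link if link else 1 - P_link)
        p_total += p_state * P_comp[(device, link)]

print(f"P(PLC compromised) = {p_total:.3f}")
```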

  6. High Contrast Vacuum Nuller Testbed (VNT) Contrast, Performance and Null Control

    Science.gov (United States)

    Lyon, Richard G.; Clampin, Mark; Petrone, Peter; Mallik, Udayan; Madison, Timothy; Bolcar, Matthew R.

    2012-01-01

    Herein we report on our Visible Nulling Coronagraph high-contrast result of 10^9 contrast averaged over a focal plane region extending from 1-4 λ/D with the Vacuum Nuller Testbed (VNT) in a vibration isolated vacuum chamber. The VNC is a hybrid interferometric/coronagraphic approach for exoplanet science. It operates with high Lyot stop efficiency for filled, segmented and sparse or diluted-aperture telescopes, thereby spanning the range of potential future NASA flight telescopes. NASA/Goddard Space Flight Center (GSFC) has a well-established effort to develop the VNC and its technologies, and has developed an incremental sequence of VNC testbeds to advance this approach and its enabling technologies. These testbeds have enabled advancement of high-contrast, visible light, nulling interferometry to unprecedented levels. The VNC is based on a modified Mach-Zehnder nulling interferometer, with a "W" configuration to accommodate a hex-packed MEMS based deformable mirror, a coherent fiber bundle and achromatic phase shifters. We give an overview of the VNT and discuss the high-contrast laboratory results, the optical configuration, critical technologies and null sensing and control.

  7. High contrast vacuum nuller testbed (VNT) contrast, performance, and null control

    Science.gov (United States)

    Lyon, Richard G.; Clampin, Mark; Petrone, Peter; Mallik, Udayan; Madison, Timothy; Bolcar, Matthew R.

    2012-09-01

    Herein we report on our Visible Nulling Coronagraph high-contrast result of 10^9 contrast averaged over a focal plane region extending from 1 - 4 λ/D with the Vacuum Nuller Testbed (VNT) in a vibration isolated vacuum chamber. The VNC is a hybrid interferometric/coronagraphic approach for exoplanet science. It operates with high Lyot stop efficiency for filled, segmented and sparse or diluted-aperture telescopes, thereby spanning the range of potential future NASA flight telescopes. NASA/Goddard Space Flight Center (GSFC) has a well-established effort to develop the VNC and its technologies, and has developed an incremental sequence of VNC testbeds to advance this approach and its enabling technologies. These testbeds have enabled advancement of high-contrast, visible light, nulling interferometry to unprecedented levels. The VNC is based on a modified Mach-Zehnder nulling interferometer, with a “W” configuration to accommodate a hex-packed MEMS based deformable mirror, a coherent fiber bundle and achromatic phase shifters. We give an overview of the VNT and discuss the high-contrast laboratory results, the optical configuration, critical technologies and null sensing and control.
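
    A commonly quoted small-error approximation for a two-beam nuller, N ≈ (Δφ² + (ΔA/A)²)/4, gives a feel for the stability such contrasts demand; it is used here as an assumption and is not taken from the paper. The sketch below converts a 10⁻⁹ null target into an approximate phase and path-difference budget at an assumed 633 nm wavelength.

```python
import numpy as np

def null_depth(phase_err_rad, frac_amp_mismatch):
    """Small-error null-depth approximation N ~ (dphi^2 + (dA/A)^2) / 4 (assumed, not from the paper)."""
    return 0.25 * (phase_err_rad ** 2 + frac_amp_mismatch ** 2)

target = 1e-9
phase_budget = np.sqrt(4.0 * target)          # phase error alone that reaches the target (~6.3e-5 rad)
wavelength_nm = 633.0                         # assumed laser wavelength
path_budget_nm = phase_budget / (2 * np.pi) * wavelength_nm
print(phase_budget, "rad  ~", path_budget_nm, "nm of optical path difference")
```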

  8. Process defects and in situ monitoring methods in metal powder bed fusion: a review

    International Nuclear Information System (INIS)

    Grasso, Marco; Colosimo, Bianca Maria

    2017-01-01

    Despite continuous technological enhancements of metal Additive Manufacturing (AM) systems, the lack of process repeatability and stability still represents a barrier for the industrial breakthrough. The most relevant metal AM applications currently involve industrial sectors (e.g. aerospace and bio-medical) where defects avoidance is fundamental. Because of this, there is the need to develop novel in situ monitoring tools able to keep under control the stability of the process on a layer-by-layer basis, and to detect the onset of defects as soon as possible. On the one hand, AM systems must be equipped with in situ sensing devices able to measure relevant quantities during the process, a.k.a. process signatures. On the other hand, in-process data analytics and statistical monitoring techniques are required to detect and localize the defects in an automated way. This paper reviews the literature and the commercial tools for in situ monitoring of powder bed fusion (PBF) processes. It explores the different categories of defects and their main causes, the most relevant process signatures and the in situ sensing approaches proposed so far. Particular attention is devoted to the development of automated defect detection rules and the study of process control strategies, which represent two critical fields for the development of future smart PBF systems. (paper)

  9. Process defects and in situ monitoring methods in metal powder bed fusion: a review

    Science.gov (United States)

    Grasso, Marco; Colosimo, Bianca Maria

    2017-04-01

    Despite continuous technological enhancements of metal Additive Manufacturing (AM) systems, the lack of process repeatability and stability still represents a barrier for the industrial breakthrough. The most relevant metal AM applications currently involve industrial sectors (e.g. aerospace and bio-medical) where defects avoidance is fundamental. Because of this, there is the need to develop novel in situ monitoring tools able to keep under control the stability of the process on a layer-by-layer basis, and to detect the onset of defects as soon as possible. On the one hand, AM systems must be equipped with in situ sensing devices able to measure relevant quantities during the process, a.k.a. process signatures. On the other hand, in-process data analytics and statistical monitoring techniques are required to detect and localize the defects in an automated way. This paper reviews the literature and the commercial tools for in situ monitoring of powder bed fusion (PBF) processes. It explores the different categories of defects and their main causes, the most relevant process signatures and the in situ sensing approaches proposed so far. Particular attention is devoted to the development of automated defect detection rules and the study of process control strategies, which represent two critical fields for the development of future smart PBF systems.
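
    As a minimal example of the layer-wise statistical monitoring discussed in the review, the sketch below applies a Shewhart individuals chart to a hypothetical per-layer process signature and reports the first out-of-control layer. The signature, limits and simulated shift are illustrative; the review surveys far richer signatures and detection rules.

```python
import numpy as np

rng = np.random.default_rng(0)
signature = rng.normal(loc=100.0, scale=2.0, size=200)   # e.g., mean melt-pool intensity per layer
signature[150:] += 8.0                                   # simulated process shift (defect onset)

baseline = signature[:50]                                # in-control reference layers
center, sigma = baseline.mean(), baseline.std(ddof=1)
ucl, lcl = center + 3 * sigma, center - 3 * sigma        # 3-sigma control limits

flagged = np.where((signature > ucl) | (signature < lcl))[0]
print("first flagged layer:", int(flagged.min()) if flagged.size else None)
```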

  10. Automated scanning electron microscopy and x-ray microanalysis for in situ quantification of gadolinium deposits in skin

    International Nuclear Information System (INIS)

    Thakral, Charu; Abraham, Jerrold L.

    2007-01-01

    Gadolinium (Gd) has been identified as a possible causative agent of an emerging cutaneous and systemic fibrosing disorder, nephrogenic systemic fibrosis (NSF), which can cause serious disability and even death. To date, there are only two known associations with this disorder - renal insufficiency and Gd-enhanced magnetic resonance imaging (MRI). We developed an automated quantitative scanning electron microscopy (SEM) and energy-dispersive x-ray spectroscopy (EDS) method for Gd in the tissue of NSF patients. Freshly cut paraffin block surfaces, examined using the variable-pressure mode under standardized conditions with a random search of the tissue area, allow in situ detection and semiquantitative morphometric (volumetric) analysis of insoluble higher atomic number features using backscattered electron imaging. We detected Gd ranging from 1 to 2270 cps/mm² in 57 cutaneous biopsies of NSF. Gd was associated with P, Ca, and usually Na in tissue deposits. Our method reproducibly determines the elemental composition, relative concentration, and spatial distribution of detected features within the tissue. However, we cannot detect features below our spatial resolution, nor concentrations below the detection limit of our SEM/EDS system. The findings confirm transmetallation and release of toxic Gd ions in NSF and allow dose-response analysis at the histologic level. (author)

  11. Multi-level infrastructure of interconnected testbeds of large-scale wireless sensor networks (MI2T-WSN)

    CSIR Research Space (South Africa)

    Abu-Mahfouz, Adnan M

    2012-06-01

    Full Text Available are still required for further testing before the real implementation. In this paper we propose a multi-level infrastructure of interconnected testbeds of large-scale WSNs. This testbed consists of 1000 sensor motes that will be distributed into four...

  12. Embedded Sensors and Controls to Improve Component Performance and Reliability -- Bench-scale Testbed Design Report

    Energy Technology Data Exchange (ETDEWEB)

    Melin, Alexander M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Kisner, Roger A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Drira, Anis [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Reed, Frederick K. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-09-01

    Embedded instrumentation and control systems that can operate in extreme environments are challenging due to restrictions on sensors and materials. As a part of the Department of Energy's Nuclear Energy Enabling Technology cross-cutting technology development program's Advanced Sensors and Instrumentation topic, this report details the design of a bench-scale embedded instrumentation and control testbed. The design goal of the bench-scale testbed is to build a re-configurable system that can rapidly deploy and test advanced control algorithms in a hardware-in-the-loop setup. The bench-scale testbed will be designed as a fluid pump analog that uses active magnetic bearings to support the shaft. The testbed represents an application that would improve the efficiency and performance of high temperature (700 °C) pumps for liquid salt reactors that operate in an extreme environment and provide many engineering challenges that can be overcome with embedded instrumentation and control. This report will give details of the mechanical design, electromagnetic design, geometry optimization, power electronics design, and initial control system design.

  13. Testbed for a LiFi system integrated in streetlights

    OpenAIRE

    Monzón Baeza, Victor; Sánchez Fernández, Matilde Pilar; García-Armada, Ana; Royo, A.

    2015-01-01

    Proceeding at: 2015 European Conference on Networks and Communications (EuCNC), which took place June 29 - July 2 in Paris, France. In this paper, a functional LiFi real-time testbed implemented on FPGAs is presented. The setup evaluates the performance of our design in a downlink scenario where the transmitter is embedded in the streetlights and a mobile phone’s camera is used as receiver, therefore achieving the goal of lighting and communicating simultaneously. To validate the ...

  14. The Airborne Optical Systems Testbed (AOSTB)

    Science.gov (United States)

    2017-05-31

    The Atlantic Ocean and coastal waterways reflect back very little light at the testbed's SWIR operating wavelength of 1064 nm. To demonstrate typical FOPEN (foliage penetration) capabilities, the report's Figure 5 shows two images, including a 3D point cloud from a ladar target scan, taken over a forested area near Burlington in northern Vermont.

  15. Towards a Perpetual Sensor Network Testbed without Backchannel

    DEFF Research Database (Denmark)

    Johansen, Aslak; Bonnet, Philippe; Sørensen, Thomas

    2012-01-01

    The sensor network testbeds available today rely on a communication channel different from the mote radio - a backchannel - to facilitate mote reprogramming, health monitoring and performance analysis. Such backchannels are either supported as wired communication channels (USB or Ethernet), or vi...

  16. A Multi-Vehicles, Wireless Testbed for Networked Control, Communications and Computing

    Science.gov (United States)

    Murray, Richard; Doyle, John; Effros, Michelle; Hickey, Jason; Low, Steven

    2002-03-01

    We have constructed a testbed consisting of 4 mobile vehicles (with 4 additional vehicles being completed), each with embedded computing and communications capability, for use in testing new approaches for command and control across dynamic networks. The system is being used or is planned to be used for testing of a variety of communications-related technologies, including distributed command and control algorithms, dynamically reconfigurable network topologies, source coding for real-time transmission of data in lossy environments, and multi-network communications. A unique feature of the testbed is the use of vehicles that have second-order dynamics, requiring real-time feedback algorithms to stabilize the system while performing cooperative tasks. The testbed was constructed in the Caltech Vehicles Laboratory and consists of individual vehicles with PC-based computation and controls, and multiple communications devices (802.11 wireless Ethernet, Bluetooth, and infrared). The vehicles are freely moving, wheeled platforms propelled by high-performance ducted fans. The room contains an access point for an 802.11 network, overhead visual sensing (to allow emulation of GPS signal processing), a centralized computer for emulating certain distributed computations, and network gateways to control and manipulate communications traffic.
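
    The remark that the vehicles have second-order dynamics and therefore need real-time feedback can be illustrated with a minimal double-integrator simulation in which a proportional-derivative law drives the vehicle state to the origin. The gains, time step, and dynamics below are illustrative assumptions, not the testbed's actual controller.

```python
# Minimal sketch of why second-order (double-integrator) vehicle dynamics need
# real-time feedback: a simple PD law regulates position and velocity to zero.
# Gains and dynamics are illustrative assumptions.
def simulate_pd(kp=4.0, kd=3.0, x0=1.0, v0=0.0, dt=0.01, t_end=5.0):
    """Double integrator x'' = u with u = -kp*x - kd*v (regulate to origin)."""
    x, v = x0, v0
    for _ in range(int(t_end / dt)):
        u = -kp * x - kd * v      # PD feedback computed every control step
        v += u * dt               # Euler integration of acceleration
        x += v * dt               # Euler integration of velocity
    return x, v

x_final, v_final = simulate_pd()
print(f"state after 5 s: x={x_final:.4f}, v={v_final:.4f}")  # both near zero
```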

  17. A Battery Certification Testbed for Small Satellite Missions

    Science.gov (United States)

    Cameron, Zachary; Kulkarni, Chetan S.; Luna, Ali Guarneros; Goebel, Kai; Poll, Scott

    2015-01-01

    A battery pack consisting of standard cylindrical 18650 lithium-ion cells has been chosen for small satellite missions based on previous flight heritage and compliance with NASA battery safety requirements. However, for batteries that transit through the International Space Station (ISS), additional certification tests are required for individual cells as well as the battery packs. In this manuscript, we discuss the development of generalized testbeds for testing and certifying different types of batteries critical to small satellite missions. Test procedures developed and executed for this certification effort include: a detailed physical inspection before and after experiments; electrical cycling characterization at the cell and pack levels; battery-pack overcharge, over-discharge, external short testing; battery-pack vacuum leak and vibration testing. The overall goals of these certification procedures are to conform to requirements set forth by the agency and identify unique safety hazards. The testbeds, procedures, and experimental results are discussed for batteries chosen for small satellite missions to be launched from the ISS.

  18. Development of an Experimental Testbed for Research in Lithium-Ion Battery Management Systems

    Directory of Open Access Journals (Sweden)

    Mehdi Ferdowsi

    2013-10-01

    Full Text Available Advanced electrochemical batteries are becoming an integral part of a wide range of applications from household and commercial to smart grid, transportation, and aerospace applications. Among different battery technologies, lithium-ion (Li-ion) batteries are growing more and more popular due to their high energy density, high galvanic potential, low self-discharge, low weight, and the fact that they have almost no memory effect. However, one of the main obstacles facing the widespread commercialization of Li-ion batteries is the design of reliable battery management systems (BMSs). An efficient BMS ensures electrical safety during operation, while increasing battery lifetime, capacity and thermal stability. Despite the need for extensive research in this field, the majority of research conducted on Li-ion battery packs and BMSs is proprietary work conducted by manufacturers. The available literature, however, provides either general descriptions or detailed analysis of individual components of the battery system, and does not address the details of the overall system development. This paper addresses the development of an experimental research testbed for studying Li-ion batteries and their BMS design. The testbed can be configured in a variety of cell and pack architectures, allowing for a wide range of BMS monitoring, diagnostics, and control technologies to be tested and analyzed. General considerations that should be taken into account while designing Li-ion battery systems are reviewed and different technologies and challenges commonly encountered in Li-ion battery systems are investigated. This testbed facilitates future development of more practical and improved BMS technologies with the aim of increasing the safety, reliability, and efficiency of existing Li-ion battery systems. Experimental results of initial tests performed on the system are used to demonstrate some of the capabilities of the developed research testbed. To the authors

  19. Simultaneous monitoring of faecal indicators and harmful algae using an in-situ autonomous sensor.

    Science.gov (United States)

    Yamahara, K M; Demir-Hilton, E; Preston, C M; Marin, R; Pargett, D; Roman, B; Jensen, S; Birch, J M; Boehm, A B; Scholin, C A

    2015-08-01

    Faecal indicator bacteria (FIB) and harmful algal blooms (HABs) threaten the health and the economy of coastal communities worldwide. Emerging automated sampling technologies combined with molecular analytical techniques could enable rapid detection of micro-organisms in-situ, thereby improving resource management and public health decision-making. We evaluated this concept using a robotic device, the Environmental Sample Processor (ESP). The ESP automates in-situ sample collection, nucleic acid extraction and molecular analyses. Here, the ESP measured and reported concentrations of FIB (Enterococcus spp.), a microbial source-tracking marker (human-specific Bacteroidales) and a HAB species (Pseudo-nitzschia spp.) over a 45-day deployment on the Santa Cruz Municipal Wharf (Santa Cruz, CA, USA). Both FIB and HABs were enumerated from single in-situ collected water samples. The in-situ qPCR efficiencies ranged from 86% to 105%, while the limit of quantification during the deployment was 10 copies per reaction. No differences were observed in the concentrations of enterococci, the human-specific marker in Bacteroidales spp., and P. australis between in-situ collected samples and traditional hand sampling methods (P > 0.05). Analytical results were Internet-accessible within hours of sample collection, demonstrating the feasibility of same-day public notification of current water quality conditions. This study presents the first report of in-situ qPCR enumeration of both faecal indicators and harmful algal species in coastal marine waters. We utilize a robotic device for in-situ quantification of enterococci, the human-specific marker in Bacteroidales and Pseudo-nitzschia spp. from the same water samples collected and processed in-situ. The results demonstrate that rapid, in-situ monitoring can be utilized to identify and quantify multiple health-relevant micro-organisms important in water quality monitoring and that this monitoring can be used to inform same
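
    The reported qPCR efficiencies (86% to 105%) are conventionally derived from the slope of a standard curve of quantification cycle (Cq) versus log10 template copies, using E = 10^(-1/slope) - 1. The sketch below applies that standard formula to made-up standard-curve values; the numbers are assumptions for illustration, not data from the deployment.

```python
# Hedged illustration of deriving qPCR amplification efficiency from a
# standard curve (Cq vs. log10 copies). The Cq values are invented; only
# the formula E = 10**(-1/slope) - 1 is standard practice.
import numpy as np

log10_copies = np.array([1, 2, 3, 4, 5], dtype=float)   # 10 .. 1e5 copies
cq = np.array([33.1, 29.7, 26.4, 23.0, 19.6])           # assumed Cq values

slope, intercept = np.polyfit(log10_copies, cq, 1)      # linear standard curve
efficiency = (10 ** (-1.0 / slope) - 1.0) * 100.0
print(f"slope={slope:.2f}, amplification efficiency={efficiency:.0f}%")
```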

  20. A Holistic Approach to ZigBee Performance Enhancement for Home Automation Networks

    Directory of Open Access Journals (Sweden)

    August Betzler

    2014-08-01

    Full Text Available Wireless home automation networks are gaining importance for smart homes. In this ambit, ZigBee networks play an important role. The ZigBee specification defines a default set of protocol stack parameters and mechanisms that is further refined by the ZigBee Home Automation application profile. In a holistic approach, we analyze how the network performance is affected with the tuning of parameters and mechanisms across multiple layers of the ZigBee protocol stack and investigate possible performance gains by implementing and testing alternative settings. The evaluations are carried out in a testbed of 57 TelosB motes. The results show that considerable performance improvements can be achieved by using alternative protocol stack configurations. From these results, we derive two improved protocol stack configurations for ZigBee wireless home automation networks that are validated in various network scenarios. In our experiments, these improved configurations yield a relative packet delivery ratio increase of up to 33.6%, a delay decrease of up to 66.6% and an improvement of the energy efficiency for battery powered devices of up to 48.7%, obtainable without incurring any overhead to the network.
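
    As a small aside on how the quoted relative improvements are computed, the sketch below derives a packet delivery ratio (PDR) for a default and a tuned stack configuration and the relative increase between them. The packet counts are hypothetical, chosen only so the example reproduces a relative increase of roughly 33.6% like the one reported.

```python
# Packet delivery ratio (PDR) and relative improvement between two stack
# configurations. Packet counts are hypothetical, for illustration only.
def pdr(received, sent):
    return received / sent

default_pdr = pdr(received=700, sent=1000)   # assumed default-config result
tuned_pdr = pdr(received=935, sent=1000)     # assumed tuned-config result
rel_increase = (tuned_pdr - default_pdr) / default_pdr * 100
print(f"default PDR={default_pdr:.2f}, tuned PDR={tuned_pdr:.2f}, "
      f"relative increase={rel_increase:.1f}%")
```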

  1. A Holistic Approach to ZigBee Performance Enhancement for Home Automation Networks

    Science.gov (United States)

    Betzler, August; Gomez, Carles; Demirkol, Ilker; Paradells, Josep

    2014-01-01

    Wireless home automation networks are gaining importance for smart homes. In this ambit, ZigBee networks play an important role. The ZigBee specification defines a default set of protocol stack parameters and mechanisms that is further refined by the ZigBee Home Automation application profile. In a holistic approach, we analyze how the network performance is affected with the tuning of parameters and mechanisms across multiple layers of the ZigBee protocol stack and investigate possible performance gains by implementing and testing alternative settings. The evaluations are carried out in a testbed of 57 TelosB motes. The results show that considerable performance improvements can be achieved by using alternative protocol stack configurations. From these results, we derive two improved protocol stack configurations for ZigBee wireless home automation networks that are validated in various network scenarios. In our experiments, these improved configurations yield a relative packet delivery ratio increase of up to 33.6%, a delay decrease of up to 66.6% and an improvement of the energy efficiency for battery powered devices of up to 48.7%, obtainable without incurring any overhead to the network. PMID:25196004

  2. A holistic approach to ZigBee performance enhancement for home automation networks.

    Science.gov (United States)

    Betzler, August; Gomez, Carles; Demirkol, Ilker; Paradells, Josep

    2014-08-14

    Wireless home automation networks are gaining importance for smart homes. In this ambit, ZigBee networks play an important role. The ZigBee specification defines a default set of protocol stack parameters and mechanisms that is further refined by the ZigBee Home Automation application profile. In a holistic approach, we analyze how the network performance is affected with the tuning of parameters and mechanisms across multiple layers of the ZigBee protocol stack and investigate possible performance gains by implementing and testing alternative settings. The evaluations are carried out in a testbed of 57 TelosB motes. The results show that considerable performance improvements can be achieved by using alternative protocol stack configurations. From these results, we derive two improved protocol stack configurations for ZigBee wireless home automation networks that are validated in various network scenarios. In our experiments, these improved configurations yield a relative packet delivery ratio increase of up to 33.6%, a delay decrease of up to 66.6% and an improvement of the energy efficiency for battery powered devices of up to 48.7%, obtainable without incurring any overhead to the network.

  3. Automation of ALK gene rearrangement testing with fluorescence in situ hybridization (FISH): a feasibility study.

    Science.gov (United States)

    Zwaenepoel, Karen; Merkle, Dennis; Cabillic, Florian; Berg, Erica; Belaud-Rotureau, Marc-Antoine; Grazioli, Vittorio; Herelle, Olga; Hummel, Michael; Le Calve, Michele; Lenze, Dido; Mende, Stefanie; Pauwels, Patrick; Quilichini, Benoit; Repetti, Elena

    2015-02-01

    In the past several years we have observed a significant increase in our understanding of molecular mechanisms that drive lung cancer. Specifically in the non-small cell lung cancer sub-types, ALK gene rearrangements represent a sub-group of tumors that are targetable by the tyrosine kinase inhibitor Crizotinib, resulting in significant reductions in tumor burden. Phase II and III clinical trials were performed using an ALK break-apart FISH probe kit, making FISH the gold standard for identifying ALK rearrangements in patients. FISH is often considered a labor and cost intensive molecular technique, and in this study we aimed to demonstrate feasibility for automation of ALK FISH testing, to improve laboratory workflow and ease of testing. This involved automation of the pre-treatment steps of the ALK assay using various protocols on the VP 2000 instrument, and facilitating automated scanning of the fluorescent FISH specimens for simplified enumeration on various backend scanning and analysis systems. The results indicated that ALK FISH can be automated. Significantly, both the Ikoniscope and BioView system of automated FISH scanning and analysis systems provided a robust analysis algorithm to define ALK rearrangements. In addition, the BioView system facilitated consultation of difficult cases via the internet. Copyright © 2015 Elsevier Inc. All rights reserved.

  4. SCaN Testbed Software Development and Lessons Learned

    Science.gov (United States)

    Kacpura, Thomas J.; Varga, Denise M.

    2012-01-01

    National Aeronautics and Space Administration (NASA) has developed an on-orbit, adaptable, Software Defined Radio (SDR)/Space Telecommunications Radio System (STRS)-based testbed facility to conduct a suite of experiments to advance technologies, reduce risk, and enable future mission capabilities on the International Space Station (ISS). The SCaN Testbed Project will provide NASA, industry, other Government agencies, and academic partners the opportunity to develop and field communications, navigation, and networking technologies in the laboratory and space environment based on reconfigurable SDR platforms and the STRS Architecture. The SDRs are a new technology for NASA, and the support infrastructure they require is different from legacy, fixed-function radios. SDRs offer the ability to reconfigure on-orbit communications by changing software for new waveforms and operating systems to enable new capabilities or fix any anomalies, which was not a previous option. They are not stand-alone devices, but require a new approach to effectively control them and flow data. This requires extensive software to be developed to utilize the full potential of these reconfigurable platforms. The paper focuses on development, integration and testing as related to the avionics processor system, and the software required to command, control, monitor, and interact with the SDRs, as well as the other communication payload elements. An extensive effort was required to develop the flight software and meet the NASA requirements for software quality and safety. The flight avionics must be radiation tolerant, and these processors have limited capability in comparison to terrestrial counterparts. A big challenge was that there are three SDRs onboard, and interfacing with multiple SDRs simultaneously complicated the effort. The effort also includes ground software, which is a key element for both the command of the payload, and displaying data created by the payload. The verification of

  5. Construction of test-bed system of voltage management system to ...

    African Journals Online (AJOL)

    Construction of test-bed system of voltage management system to apply physical power system. ... Journal of Fundamental and Applied Sciences ... system of voltage management system (VMS) in order to apply physical power system.

  6. Phased Array Antenna Testbed Development at the NASA Glenn Research Center

    Science.gov (United States)

    Lambert, Kevin M.; Kubat, Gregory; Johnson, Sandra K.; Anzic, Godfrey

    2003-01-01

    Ideal phased array antennas offer advantages for communication systems, such as wide-angle scanning and multibeam operation, which can be utilized in certain NASA applications. However, physically realizable, electronically steered, phased array antennas introduce additional system performance parameters, which must be included in the evaluation of the system. The NASA Glenn Research Center (GRC) is currently conducting research to identify these parameters and to develop the tools necessary to measure them. One of these tools is a testbed where phased array antennas may be operated in an environment that simulates their use. This paper describes the development of the testbed and its use in characterizing a particular K-Band, phased array antenna.

  7. Operation Duties on the F-15B Research Testbed

    Science.gov (United States)

    Truong, Samson S.

    2010-01-01

    This presentation entails what I have done this past summer for my Co-op tour in the Operations Engineering Branch. Activities included supporting the F-15B Research Testbed, supporting the incoming F-15D models, design work, and other operations engineering duties.

  8. Design, Development, and Testing of a UAV Hardware-in-the-Loop Testbed for Aviation and Airspace Prognostics Research

    Science.gov (United States)

    Kulkarni, Chetan; Teubert, Chris; Gorospe, George; Burgett, Drew; Quach, Cuong C.; Hogge, Edward

    2016-01-01

    The airspace is becoming more and more complicated, and will continue to do so in the future with the integration of Unmanned Aerial Vehicles (UAVs), autonomy, spacecraft, and other forms of aviation technology into the airspace. The new technology and complexity increase the importance and difficulty of safety assurance. Additionally, testing new technologies on complex aviation systems and systems of systems can be very difficult, expensive, and sometimes unsafe in real-life scenarios. Prognostic methodology provides an estimate of the health and risks of a component, vehicle, or airspace and knowledge of how that will change over time. That measure is especially useful in safety determination, mission planning, and maintenance scheduling. The developed testbed will be used to validate prediction algorithms for the real-time safety monitoring of the National Airspace System (NAS) and the prediction of unsafe events. The framework injects flight-related anomalies related to ground systems, routing, airport congestion, etc. to test and verify algorithms for NAS safety. In our research work, we develop a live, distributed, hardware-in-the-loop testbed for aviation and airspace prognostics, along with exploring further research possibilities to verify and validate future algorithms for NAS safety. The testbed integrates virtual aircraft using the X-Plane simulator and X-PlaneConnect toolbox, UAVs using onboard sensors and cellular communications, and hardware-in-the-loop components. In addition, the testbed includes an additional research framework to support and simplify future research activities. It enables safe, accurate, and inexpensive experimentation and research into airspace and vehicle prognosis that would not have been possible otherwise. This paper describes the design, development, and testing of this system. Software reliability, safety and latency are some of the critical design considerations in development of the testbed. Integration of HITL elements in

  9. Implementation of a Wireless Time Distribution Testbed Protected with Quantum Key Distribution

    Energy Technology Data Exchange (ETDEWEB)

    Bonior, Jason D [ORNL; Evans, Philip G [ORNL; Sheets, Gregory S [ORNL; Jones, John P [ORNL; Flynn, Toby H [ORNL; O' Neil, Lori Ross [Pacific Northwest National Laboratory (PNNL); Hutton, William [Pacific Northwest National Laboratory (PNNL); Pratt, Richard [Pacific Northwest National Laboratory (PNNL); Carroll, Thomas E. [Pacific Northwest National Laboratory (PNNL)

    2017-01-01

    Secure time transfer is critical for many time-sensitive applications. The Global Positioning System (GPS), which is often used for this purpose, has been shown to be susceptible to spoofing attacks. Quantum Key Distribution offers a way to securely generate encryption keys at two locations. Through careful use of this information it is possible to create a system that is more resistant to spoofing attacks. In this paper we describe our work to create a testbed which utilizes QKD and traditional RF links. This testbed will be used for the development of more secure and spoofing-resistant time distribution protocols.

  10. Development of a Testbed for Wireless Underground Sensor Networks

    Directory of Open Access Journals (Sweden)

    Mehmet C. Vuran

    2010-01-01

    Full Text Available Wireless Underground Sensor Networks (WUSNs) constitute one of the promising application areas of the recently developed wireless sensor networking techniques. A WUSN is a specialized kind of Wireless Sensor Network (WSN) that mainly focuses on the use of sensors that communicate through soil. Recent models of the wireless underground communication channel have been proposed, but few field experiments have been realized to verify the accuracy of these models. The realization of field WUSN experiments proved to be extremely complex and time-consuming in comparison with the traditional wireless environment. To the best of our knowledge, this is the first work that proposes guidelines for the development of an outdoor WUSN testbed with the goals of improving the accuracy and reducing the time of WUSN experiments. Although the work mainly targets WUSNs, many of the presented practices can also be applied to generic WSN testbeds.

  11. A Matlab-Based Testbed for Integration, Evaluation and Comparison of Heterogeneous Stereo Vision Matching Algorithms

    Directory of Open Access Journals (Sweden)

    Raul Correal

    2016-11-01

    Full Text Available Stereo matching is a heavily researched area with a prolific published literature and a broad spectrum of heterogeneous algorithms available in diverse programming languages. This paper presents a Matlab-based testbed that aims to centralize and standardize this variety of both current and prospective stereo matching approaches. The proposed testbed aims to facilitate the application of stereo-based methods to real situations. It allows for configuring and executing algorithms, as well as comparing results, in a fast, easy and friendly setting. Algorithms can be combined so that a series of processes can be chained and executed consecutively, using the output of a process as input for the next; some additional filtering and image processing techniques have been included within the testbed for this purpose. A use case is included to illustrate how these processes are sequenced and its effect on the results for real applications. The testbed has been conceived as a collaborative and incremental open-source project, where its code is accessible and modifiable, with the objective of receiving contributions and releasing future versions to include new algorithms and features. It is currently available online for the research community.

  12. The Living With a Star Space Environment Testbed Payload

    Science.gov (United States)

    Xapsos, Mike

    2015-01-01

    This presentation outlines a brief description of the Living With a Star (LWS) Program missions and detailed information about the Space Environment Testbed (SET) payload consisting of a space weather monitor and carrier containing 4 board experiments.

  13. Accuracy of determining preoperative cancer extent measured by automated breast ultrasonography.

    Science.gov (United States)

    Tozaki, Mitsuhiro; Fukuma, Eisuke

    2010-12-01

    The aim of this study was to determine the accuracy of measuring preoperative cancer extent using automated breast ultrasonography (US). This retrospective study consisted of 40 patients with histopathologically confirmed breast cancer. All of the patients underwent automated breast US (ABVS; Siemens Medical Solutions, Mountain View, CA, USA) on the day before the surgery. The sizes of the lesions on US were measured on coronal multiplanar reconstruction images using the ABVS workstation. Histopathological measurement of tumor size included not only the invasive foci but also any in situ component and was used as the gold standard. The discrepancy of the tumor extent between automated breast US and the histological examination was calculated. Automated breast US enabled visualization of the breast carcinomas in all patients. The mean size of the lesions on US was 12 mm (range 4-62 mm). The histopathological diagnosis was ductal carcinoma in situ (DCIS) in seven patients and invasive ductal carcinoma in 33 patients (18 without an intraductal component, 15 with an intraductal component). Lesions ranged in diameter from 4 to 65 mm (mean 16 mm). The accuracy of determination of the tumor extent with a deviation in length of <2 cm was 98% (39/40). Automated breast US is thought to be useful for evaluating tumor extent preoperatively.

  14. In vivo robotics: the automation of neuroscience and other intact-system biological fields.

    Science.gov (United States)

    Kodandaramaiah, Suhasa B; Boyden, Edward S; Forest, Craig R

    2013-12-01

    Robotic and automation technologies have played a huge role in in vitro biological science, having proved critical for scientific endeavors such as genome sequencing and high-throughput screening. Robotic and automation strategies are beginning to play a greater role in in vivo and in situ sciences, especially when it comes to the difficult in vivo experiments required for understanding the neural mechanisms of behavior and disease. In this perspective, we discuss the prospects for robotics and automation to influence neuroscientific and intact-system biology fields. We discuss how robotic innovations might be created to open up new frontiers in basic and applied neuroscience and present a concrete example with our recent automation of in vivo whole-cell patch clamp electrophysiology of neurons in the living mouse brain. © 2013 New York Academy of Sciences.

  15. Recent Successes and Future Plans for NASA's Space Communications and Navigation Testbed on the International Space Station

    Science.gov (United States)

    Reinhart, Richard C.; Sankovic, John M.; Johnson, Sandra K.; Lux, James P.; Chelmins, David T.

    2014-01-01

    new waveforms requires a waveform build environment for the particular SDR, helps assess the usefulness of the platform provider documentation, and exercises the objectives of the STRS Standard and the SCaN Testbed. There is considerable interest in conducting experiments using the SCaN Testbed from NASA, academia, commercial companies, and other space agencies. There are approximately 25 experiments or activities supported by the project underway or in development, with more proposals ready, as time and funding allow, and new experiment solicitations available. NASA continues development of new waveforms and applications in communications, networking, and navigation; the first university experimenters are beginning waveform development, which will support the next generation of communications engineers; and international interest is beginning with space agency partners from the European Space Agency (ESA) and the Centre National d'Etudes Spatiales (CNES). This paper will provide an overview of the SCaN Testbed and discuss its recent accomplishments and experiment activities. Its recent successes in Ka-band operations, reception of the newest GPS signals, SDR reconfigurations, and STRS demonstration in space, when combined with the future experiment portfolio, have positioned the SCaN Testbed to enable future space communications and navigation capabilities for exploration and science.

  16. A Testbed to Evaluate the FIWARE-Based IoT Platform in the Domain of Precision Agriculture

    Science.gov (United States)

    Martínez, Ramón; Pastor, Juan Ángel; Álvarez, Bárbara; Iborra, Andrés

    2016-01-01

    Wireless sensor networks (WSNs) represent one of the most promising technologies for precision farming. Over the next few years, a significant increase in the use of such systems on commercial farms is expected. WSNs present a number of problems regarding scalability, interoperability, communications, connectivity with databases and data processing. Different Internet of Things middleware platforms are appearing to overcome these challenges. This paper checks whether one of these platforms, FIWARE, is suitable for the development of agricultural applications. To the authors’ knowledge, there are no works that show how to use FIWARE in precision agriculture and study its appropriateness, its scalability and its efficiency for this kind of application. To do this, a testbed has been designed and implemented to simulate different deployments and load conditions. The testbed is a typical FIWARE application, complete, yet simple and comprehensible enough to show the main features and components of FIWARE, as well as the complexity of using this technology. Although the testbed has been deployed in a laboratory environment, its design is based on the analysis of an Internet of Things use case scenario in the domain of precision agriculture. PMID:27886091

  17. ASE-BAN, a Wireless Body Area Network Testbed

    DEFF Research Database (Denmark)

    Madsen, Jens Kargaard; Karstoft, Henrik; Toftegaard, Thomas Skjødeberg

    2010-01-01

    /actuators attached to the body and a host server application. The gateway uses the BlackFin BF533 processor from Analog Devices, and uses Bluetooth for wireless communication. Two types of sensors are attached to the network: an electrocardiogram sensor and an oximeter sensor. The testbed has been successfully...

  18. Accelerating Innovation that Enhances Resource Recovery in the Wastewater Sector: Advancing a National Testbed Network.

    Science.gov (United States)

    Mihelcic, James R; Ren, Zhiyong Jason; Cornejo, Pablo K; Fisher, Aaron; Simon, A J; Snyder, Seth W; Zhang, Qiong; Rosso, Diego; Huggins, Tyler M; Cooper, William; Moeller, Jeff; Rose, Bob; Schottel, Brandi L; Turgeon, Jason

    2017-07-18

    This Feature examines significant challenges and opportunities to spur innovation and accelerate adoption of reliable technologies that enhance integrated resource recovery in the wastewater sector through the creation of a national testbed network. The network is a virtual entity that connects appropriate physical testing facilities, and other components needed for a testbed network, with researchers, investors, technology providers, utilities, regulators, and other stakeholders to accelerate the adoption of innovative technologies and processes that are needed for the water resource recovery facility of the future. Here we summarize and extract key issues and developments, to provide a strategy for the wastewater sector to accelerate a path forward that leads to new sustainable water infrastructures.

  19. Embedded Sensors and Controls to Improve Component Performance and Reliability -- Loop-scale Testbed Design Report

    Energy Technology Data Exchange (ETDEWEB)

    Melin, Alexander M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Kisner, Roger A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-09-01

    Embedded instrumentation and control systems that can operate in extreme environments are challenging to design and operate. Extreme environments limit the options for sensors and actuators and degrade their performance. Because sensors and actuators are necessary for feedback control, these limitations mean that designing embedded instrumentation and control systems for the challenging environments of nuclear reactors requires advanced technical solutions that are not available commercially. This report details the development of a testbed that will be used for cross-cutting embedded instrumentation and control research for nuclear power applications. This research is funded by the Department of Energy's Nuclear Energy Enabling Technology program's Advanced Sensors and Instrumentation topic. The design goal of the loop-scale testbed is to build a low-temperature pump that utilizes magnetic bearings and that will be incorporated into a water loop to test control system performance and self-sensing techniques. Specifically, this testbed will be used to analyze control system performance in response to nonlinear and cross-coupling fluid effects between the shaft axes of motion, rotordynamics and gyroscopic effects, and impeller disturbances. This testbed will also be used to characterize the performance losses when using self-sensing position measurement techniques. Active magnetic bearings are a technology that can reduce failures and maintenance costs in nuclear power plants. They are particularly relevant to liquid salt reactors that operate at high temperatures (700 °C). Pumps used in the extreme environment of liquid salt reactors provide many engineering challenges that can be overcome with magnetic bearings and their associated embedded instrumentation and control. This report will give details of the mechanical design and electromagnetic design of the loop-scale embedded instrumentation and control testbed.

  20. On-wire lithography-generated molecule-based transport junctions: a new testbed for molecular electronics.

    Science.gov (United States)

    Chen, Xiaodong; Jeon, You-Moon; Jang, Jae-Won; Qin, Lidong; Huo, Fengwei; Wei, Wei; Mirkin, Chad A

    2008-07-02

    On-wire lithography (OWL) fabricated nanogaps are used as a new testbed to construct molecular transport junctions (MTJs) through the assembly of thiolated molecular wires across a nanogap formed between two Au electrodes. In addition, we show that one can use OWL to rapidly characterize an MTJ and optimize gap size for two molecular wires of different dimensions. Finally, we have used this new testbed to identify unusual temperature-dependent transport mechanisms for α,ω-dithiol-terminated oligo(phenylene ethynylene).

  1. Bright-field in situ hybridization for HER2 gene amplification in breast cancer using tissue microarrays: correlation between chromogenic (CISH) and automated silver-enhanced (SISH) methods with patient outcome.

    Science.gov (United States)

    Francis, Glenn D; Jones, Mark A; Beadle, Geoffrey F; Stein, Sandra R

    2009-06-01

    HER2 gene amplification or overexpression occurs in 15% to 25% of breast cancers and has implications for treatment and prognosis. The most commonly used methods for HER2 testing are fluorescence in situ hybridization (FISH) and immunohistochemistry. FISH is considered to be the reference standard and more accurately predicts response to trastuzumab, but is technically demanding, expensive, and requires specialized equipment. In situ hybridization is required to be eligible for adjuvant treatment with trastuzumab in Australia. Bright-field in situ hybridization is an alternative to FISH and uses a combination of in situ methodology and a peroxidase-mediated chromogenic substrate such as diaminobenzidine [chromogenic in situ hybridization (CISH)] or multimer technology coupled with enzyme metallography [silver-enhanced in situ hybridization (SISH)] to create a marker visible under bright-field microscopy. CISH was introduced into diagnostic testing in Australia in October 2006. SISH methodology is a more recent introduction into the testing repertoire. An evaluation of CISH and SISH performance to assess patient outcome was performed using tissue microarrays. Tissue microarrays were constructed in duplicate using material from 593 patients with invasive breast carcinoma and assessed using CISH and SISH. Gene amplification was assessed using the American Society of Clinical Oncology/College of American Pathologists guideline and Australian HER2 Advisory Board criteria (single probe: diploid, 1 to 2.5 copies/nucleus; polysomy, >2.5 to 4 copies/nucleus; equivocal, >4 to 6 copies/nucleus; low-level amplification, >6 to 10 copies/nucleus; and high-level amplification, >10 copies/nucleus; dual probe HER2/CHR17 ratio: nonamplified <1.8, equivocal 1.8 to 2.2, amplified >2.2). Results were informative for 337 tissue cores comprising 230 patient samples. Concordance rates were 96% for HER2 single-probe CISH and SISH and 95.5% for single-probe CISH and dual-probe HER2/CHR17 SISH. Both bright-field methods correlated
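
    The single-probe scoring bands quoted in the criteria above can be expressed as a simple classifier; the sketch below follows the copies-per-nucleus band edges given in the abstract, with tie-handling at the boundaries as an assumption.

```python
# Sketch of the single-probe HER2 scoring bands quoted in the abstract
# (copies per nucleus). Boundary handling is an assumption.
def her2_single_probe_category(copies_per_nucleus):
    c = copies_per_nucleus
    if c <= 2.5:
        return "diploid"                    # 1 to 2.5 copies/nucleus
    if c <= 4:
        return "polysomy"                   # >2.5 to 4
    if c <= 6:
        return "equivocal"                  # >4 to 6
    if c <= 10:
        return "low-level amplification"    # >6 to 10
    return "high-level amplification"       # >10

for c in (1.8, 3.2, 5.0, 8.5, 14.0):
    print(c, "->", her2_single_probe_category(c))
```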

  2. The feasibility of automated online flow cytometry for in-situ monitoring of microbial dynamics in aquatic ecosystems

    Science.gov (United States)

    Besmer, Michael D.; Weissbrodt, David G.; Kratochvil, Bradley E.; Sigrist, Jürg A.; Weyland, Mathias S.; Hammes, Frederik

    2014-01-01

    Fluorescent staining coupled with flow cytometry (FCM) is often used for the monitoring, quantification and characterization of bacteria in engineered and environmental aquatic ecosystems including seawater, freshwater, drinking water, wastewater, and industrial bioreactors. However, infrequent grab sampling hampers accurate characterization and subsequent understanding of microbial dynamics in all of these ecosystems. A logical technological progression is high-throughput and full automation of the sampling, staining, measurement, and data analysis steps. Here we assess the feasibility and applicability of automated FCM by means of actual data sets produced with prototype instrumentation. As proof-of-concept we demonstrate examples of microbial dynamics in (i) flowing tap water from a municipal drinking water supply network and (ii) river water from a small creek subject to two rainfall events. In both cases, automated measurements were done at 15-min intervals during 12–14 consecutive days, yielding more than 1000 individual data points for each ecosystem. The extensive data sets derived from the automated measurements allowed for the establishment of baseline data for each ecosystem, as well as for the recognition of daily variations and specific events that would most likely be missed (or mis-characterized) by infrequent sampling. In addition, the online FCM data from the river water was combined and correlated with online measurements of abiotic parameters, showing considerable potential for a better understanding of cause-and-effect relationships in aquatic ecosystems. Although several challenges remain, the successful operation of an automated online FCM system and the basic interpretation of the resulting data sets represent a breakthrough toward the eventual establishment of fully automated online microbiological monitoring technologies. PMID:24917858
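
    The "more than 1000 individual data points for each ecosystem" figure follows directly from the sampling schedule; the short sketch below spells out the arithmetic for a 15-minute interval sustained over 12 and 14 days.

```python
# Arithmetic behind the ">1000 data points per ecosystem" figure:
# one automated flow cytometry measurement every 15 minutes for 12-14 days.
samples_per_hour = 60 // 15
for days in (12, 14):
    print(days, "days ->", days * 24 * samples_per_hour, "measurements")
# 12 days -> 1152 measurements, 14 days -> 1344 measurements
```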

  3. James Webb Space Telescope Optical Simulation Testbed: Segmented Mirror Phase Retrieval Testing

    Science.gov (United States)

    Laginja, Iva; Egron, Sylvain; Brady, Greg; Soummer, Remi; Lajoie, Charles-Philippe; Bonnefois, Aurélie; Long, Joseph; Michau, Vincent; Choquet, Elodie; Ferrari, Marc; Leboulleux, Lucie; Mazoyer, Johan; N’Diaye, Mamadou; Perrin, Marshall; Petrone, Peter; Pueyo, Laurent; Sivaramakrishnan, Anand

    2018-01-01

    The James Webb Space Telescope (JWST) Optical Simulation Testbed (JOST) is a hardware simulator designed to produce JWST-like images. A model of the JWST three-mirror anastigmat is realized with three lenses in the form of a Cooke triplet, which provides JWST-like optical quality over a field equivalent to a NIRCam module, and an Iris AO segmented mirror with hexagonal elements stands in for the JWST segmented primary. This setup successfully produces images extremely similar to NIRCam images from cryotesting in terms of the PSF morphology and sampling relative to the diffraction limit. The testbed is used for staff training of the wavefront sensing and control (WFS&C) team and for independent analysis of WFS&C scenarios of the JWST. Algorithms like geometric phase retrieval (GPR) that may be used in flight and potential upgrades to JWST WFS&C will be explored. We report on the current status of the testbed after alignment, implementation of the segmented mirror, and testing of phase retrieval techniques. This optical bench complements other work at the Makidon laboratory at the Space Telescope Science Institute, including the investigation of coronagraphy for segmented aperture telescopes. Beyond JWST we intend to use JOST for WFS&C studies for future large segmented space telescopes such as LUVOIR.

  4. LTE-Advanced/WLAN testbed

    OpenAIRE

    Plaisner, Denis

    2017-01-01

    This thesis deals with the examination and evaluation of communication using the LTE-Advanced and WiFi (IEEE 802.11n/ac) standards. For each standard, the error parameter EVM is examined. A universal workstation (testbed) is designed for working with the individual standards. This universal workstation is used to configure the transmitting and receiving devices and to process and evaluate the transmitted signals. The Matlab environment was chosen for this work, through which the instruments used are controlled, such as...

  5. Large Scale Data Mining to Improve Usability of Data: An Intelligent Archive Testbed

    Science.gov (United States)

    Ramapriyan, Hampapuram; Isaac, David; Yang, Wenli; Morse, Steve

    2005-01-01

    Research in certain scientific disciplines - including Earth science, particle physics, and astrophysics - continually faces the challenge that the volume of data needed to perform valid scientific research can at times overwhelm even a sizable research community. The desire to improve utilization of this data gave rise to the Intelligent Archives project, which seeks to make data archives active participants in a knowledge building system capable of discovering events or patterns that represent new information or knowledge. Data mining can automatically discover patterns and events, but it is generally viewed as unsuited for large-scale use in disciplines like Earth science that routinely involve very high data volumes. Dozens of research projects have shown promising uses of data mining in Earth science, but all of these are based on experiments with data subsets of a few gigabytes or less, rather than the terabytes or petabytes typically encountered in operational systems. To bridge this gap, the Intelligent Archives project is establishing a testbed with the goal of demonstrating the use of data mining techniques in an operationally-relevant environment. This paper discusses the goals of the testbed and the design choices surrounding critical issues that arose during testbed implementation.

  6. Morphological spot counting from stacked images for automated analysis of gene copy numbers by fluorescence in situ hybridization.

    Science.gov (United States)

    Grigoryan, Artyom M; Dougherty, Edward R; Kononen, Juha; Bubendorf, Lukas; Hostetter, Galen; Kallioniemi, Olli

    2002-01-01

    Fluorescence in situ hybridization (FISH) is a molecular diagnostic technique in which a fluorescent labeled probe hybridizes to a target nucleotide sequence of deoxyribose nucleic acid. Upon excitation, each chromosome containing the target sequence produces a fluorescent signal (spot). Because fluorescent spot counting is tedious and often subjective, automated digital algorithms to count spots are desirable. New technology provides a stack of images on multiple focal planes throughout a tissue sample. Multiple-focal-plane imaging helps overcome the biases and imprecision inherent in single-focal-plane methods. This paper proposes an algorithm for global spot counting in stacked three-dimensional slice FISH images without the necessity of nuclei segmentation. It is designed to work in complex backgrounds, when there are agglomerated nuclei, and in the presence of illumination gradients. It is based on the morphological top-hat transform, which locates intensity spikes on irregular backgrounds. After finding signals in the slice images, the algorithm groups these together to form three-dimensional spots. Filters are employed to separate legitimate spots from fluorescent noise. The algorithm is set in a comprehensive toolbox that provides visualization and analytic facilities. It includes simulation software that allows examination of algorithm performance for various image and algorithm parameter settings, including signal size, signal density, and the number of slices.
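
    The core operation named above, the morphological (white) top-hat transform, can be sketched in a few lines: it subtracts a grey-scale opening from the image, removing the slowly varying background so that small bright FISH signals stand out on irregular backgrounds. The structuring-element size, threshold, and synthetic test image below are illustrative assumptions, not the published algorithm's parameters.

```python
# Minimal sketch of spot detection with a morphological white top-hat:
# the top-hat suppresses the irregular background so small bright spots remain.
# Structuring-element size, threshold and the synthetic image are assumptions.
import numpy as np
from scipy import ndimage

def count_spots(image, spot_diameter_px=5, threshold=20):
    footprint = np.ones((spot_diameter_px * 2 + 1,) * 2)   # larger than a spot
    tophat = ndimage.white_tophat(image, footprint=footprint)
    _, n_spots = ndimage.label(tophat > threshold)
    return n_spots

# Synthetic slice: an illumination gradient plus three bright spots.
yy, xx = np.mgrid[0:256, 0:256]
img = 0.3 * xx + 10.0                                       # irregular background
for cy, cx in [(50, 60), (120, 200), (200, 90)]:
    img[cy - 2:cy + 3, cx - 2:cx + 3] += 80.0               # spots
print("spots found:", count_spots(img))                     # expect 3
```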

  7. A MIMO-OFDM Testbed for Wireless Local Area Networks

    Directory of Open Access Journals (Sweden)

    Conrat Jean-Marc

    2006-01-01

    Full Text Available We describe the design steps and final implementation of a MIMO OFDM prototype platform developed to enhance the performance of wireless LAN standards such as HiperLAN/2 and 802.11, using multiple transmit and multiple receive antennas. We first describe the channel measurement campaign used to characterize the indoor operational propagation environment, and analyze the influence of the channel on code design through a ray-tracing channel simulator. We also comment on some antenna and RF issues which are of importance for the final realization of the testbed. Multiple coding, decoding, and channel estimation strategies are discussed and their respective performance-complexity trade-offs are evaluated over the realistic channel obtained from the propagation studies. Finally, we present the design methodology, including cross-validation of the Matlab, C++, and VHDL components, and the final demonstrator architecture. We highlight the increased measured performance of the MIMO testbed over the single-antenna system.

  8. In situ Hearing Tests for the Purpose of a Self-Fit Hearing Aid

    NARCIS (Netherlands)

    Boymans, Monique; Dreschler, Wouter A.

    2017-01-01

    This study investigated the potential and limitations of a self-fit hearing aid. This can be used in the "developing" world or in countries with large distances between the hearing-impaired subjects and the professional. It contains an on-board tone generator for in situ user-controlled, automated

  9. The Segmented Aperture Interferometric Nulling Testbed (SAINT) I: overview and air-side system description

    Science.gov (United States)

    Hicks, Brian A.; Lyon, Richard G.; Petrone, Peter; Ballard, Marlin; Bolcar, Matthew R.; Bolognese, Jeff; Clampin, Mark; Dogoda, Peter; Dworzanski, Daniel; Helmbrecht, Michael A.; Koca, Corina; Shiri, Ron

    2016-07-01

    This work presents an overview of the Segmented Aperture Interferometric Nulling Testbed (SAINT), a project that will pair an actively-controlled macro-scale segmented mirror with the Visible Nulling Coronagraph (VNC). SAINT will incorporate the VNC's demonstrated wavefront sensing and control system to refine and quantify end-to-end high-contrast starlight suppression performance. This pathfinder testbed will be used as a tool to study and refine approaches to mitigating instabilities and complex diffraction expected from future large segmented aperture telescopes.

  10. Design and Development of a Rapid Research, Design, and Development Platform for In-Situ Testing of Tools and Concepts for Trajectory-Based Operations

    Science.gov (United States)

    Underwood, Matthew C.

    2017-01-01

    To provide justification for equipping a fleet of aircraft with avionics capable of supporting trajectory-based operations, significant flight testing must be accomplished. However, equipping aircraft with these avionics and enabling technologies to communicate the clearances required for trajectory-based operations is cost-challenging using conventional avionics approaches. This paper describes an approach to minimize the costs and risks of flight testing these technologies in-situ, discusses the test-bed platform developed, and highlights results from a proof-of-concept flight test campaign that demonstrates the feasibility and efficiency of this approach.

  11. Distributed computing testbed for a remote experimental environment

    International Nuclear Information System (INIS)

    Butner, D.N.; Casper, T.A.; Howard, B.C.; Henline, P.A.; Davis, S.L.; Barnes, D.

    1995-01-01

    Collaboration is increasing as physics research becomes concentrated on a few large, expensive facilities, particularly in magnetic fusion energy research, with national and international participation. These facilities are designed for steady state operation and interactive, real-time experimentation. We are developing tools to provide for the establishment of geographically distant centers for interactive operations; such centers would allow scientists to participate in experiments from their home institutions. A testbed is being developed for a Remote Experimental Environment (REE), a ''Collaboratory.'' The testbed will be used to evaluate the ability of a remotely located group of scientists to conduct research on the DIII-D Tokamak at General Atomics. The REE will serve as a testing environment for advanced control and collaboration concepts applicable to future experiments. Process-to-process communications over high speed wide area networks provide real-time synchronization and exchange of data among multiple computer networks, while the ability to conduct research is enhanced by adding audio/video communication capabilities. The Open Software Foundation's Distributed Computing Environment is being used to test concepts in distributed control, security, naming, remote procedure calls and distributed file access using the Distributed File Services. We are exploring the technology and sociology of remotely participating in the operation of a large scale experimental facility

  12. Software Testbed for Developing and Evaluating Integrated Autonomous Subsystems

    Science.gov (United States)

    Ong, James; Remolina, Emilio; Prompt, Axel; Robinson, Peter; Sweet, Adam; Nishikawa, David

    2015-01-01

    To implement fault tolerant autonomy in future space systems, it will be necessary to integrate planning, adaptive control, and state estimation subsystems. However, integrating these subsystems is difficult, time-consuming, and error-prone. This paper describes Intelliface/ADAPT, a software testbed that helps researchers develop and test alternative strategies for integrating planning, execution, and diagnosis subsystems more quickly and easily. The testbed's architecture, graphical data displays, and implementations of the integrated subsystems support easy plug and play of alternate components to support research and development in fault-tolerant control of autonomous vehicles and operations support systems. Intelliface/ADAPT controls NASA's Advanced Diagnostics and Prognostics Testbed (ADAPT), which comprises batteries, electrical loads (fans, pumps, and lights), relays, circuit breakers, invertors, and sensors. During plan execution, an experimentor can inject faults into the ADAPT testbed by tripping circuit breakers, changing fan speed settings, and closing valves to restrict fluid flow. The diagnostic subsystem, based on NASA's Hybrid Diagnosis Engine (HyDE), detects and isolates these faults to determine the new state of the plant, ADAPT. Intelliface/ADAPT then updates its model of the ADAPT system's resources and determines whether the current plan can be executed using the reduced resources. If not, the planning subsystem generates a new plan that reschedules tasks, reconfigures ADAPT, and reassigns the use of ADAPT resources as needed to work around the fault. The resource model, planning domain model, and planning goals are expressed using NASA's Action Notation Modeling Language (ANML). Parts of the ANML model are generated automatically, and other parts are constructed by hand using the Planning Model Integrated Development Environment, a visual Eclipse-based IDE that accelerates ANML model development. Because native ANML planners are currently

  13. Evaluation of Unmanned Aircraft Systems (UAS) for Weather and Climate using the Multi-testbed approach

    Science.gov (United States)

    Baker, B.; Lee, T.; Buban, M.; Dumas, E. J.

    2017-12-01

    The development of small Unmanned Aerial System (sUAS) testbeds that can be used to validate, integrate, calibrate and evaluate new technology and sensors for routine boundary layer research, validation of operational weather models, improvement of model parameterizations, and recording observations within high-impact storms is important for understanding the importance and impact of using sUASs routinely as a new observing platform. The goal of the multi-testbed approach is to build a robust set of protocols to assess the cost and operational feasibility of unmanned observations for routine applications using various combinations of sUAS aircraft and sensors in different locations and field experiments. All of these observational testbeds serve different community needs, but they also use a diverse suite of methodologies for calibration and evaluation of different sensors and platforms for severe weather and boundary layer research. The primary focus will be to evaluate meteorological sensor payloads to measure thermodynamic parameters and define surface characteristics with visible, IR, and multi-spectral cameras. This evaluation will lead to recommendations for sensor payloads for VTOL and fixed-wing sUAS.

  14. In-situ thermography of automated fiber placement parts

    Science.gov (United States)

    Gregory, Elizabeth D.; Juarez, Peter D.

    2018-04-01

    Automated fiber placement (AFP) provides precision and repeatable manufacturing of both simple and complex geometry composite parts. However, AFP also introduces the possibility for unique flaws such as overlapping tows, gaps between tows, tow twists, lack of layer adhesion, and foreign object debris. These types of flaws can all result in a significant loss of performance in the final part. The current inspection method for these flaws is a costly and time-intensive visual inspection of each ply layer. This work describes some initial efforts to incorporate thermal inspection on the AFP head and analysis of the data to identify the previously mentioned flaws. Previous bench-top laboratory experiments demonstrated that laps, gaps, and twists were identified from a thermal image. The AFP head uses an on-board lamp to preheat the surface of the part during layup to increase ply consolidation. The preheated surface is used as a thermal source to observe the state of the new material after compaction. We will present data collected with the Integrated Structural Assembly of Advanced Composites (ISAAC) AFP machine at Langley Research Center showing that changes to the temperature profile are sufficient for identifying all types of flaws.
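    As a rough illustration of the kind of analysis implied above, the sketch below flags deviations of a post-compaction temperature profile from a local moving-average baseline; the window size, threshold, and synthetic data are assumptions, not values from the study.

        import numpy as np

        def flag_thermal_anomalies(profile, window=25, threshold=3.0):
            """Flag samples that deviate from a moving-average baseline; a crude
            stand-in for lap/gap/twist signatures in a thermal line scan."""
            baseline = np.convolve(profile, np.ones(window) / window, mode="same")
            residual = profile - baseline
            sigma = residual.std() or 1.0
            return np.abs(residual) > threshold * sigma

        # Synthetic profile with a cold streak, e.g. a gap between tows.
        temps = 60.0 + np.random.normal(0.0, 0.3, 500)
        temps[240:250] -= 5.0
        print(np.where(flag_thermal_anomalies(temps))[0])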

  15. Conceptual Design and Cost Estimate of a Subsonic NASA Testbed Vehicle (NTV) for Aeronautics Research

    Science.gov (United States)

    Nickol, Craig L.; Frederic, Peter

    2013-01-01

    A conceptual design and cost estimate for a subsonic flight research vehicle designed to support NASA's Environmentally Responsible Aviation (ERA) project goals is presented. To investigate the technical and economic feasibility of modifying an existing aircraft, a highly modified Boeing 717 was developed for maturation of technologies supporting the three ERA project goals of reduced fuel burn, noise, and emissions. This modified 717 utilizes midfuselage mounted modern high bypass ratio engines in conjunction with engine exhaust shielding structures to provide a low noise testbed. The testbed also integrates a natural laminar flow wing section and active flow control for the vertical tail. An eight year program plan was created to incrementally modify and test the vehicle, enabling the suite of technology benefits to be isolated and quantified. Based on the conceptual design and programmatic plan for this testbed vehicle, a full cost estimate of $526M was developed, representing then-year dollars at a 50% confidence level.

  16. Solar Resource Assessment with Sky Imagery and a Virtual Testbed for Sky Imager Solar Forecasting

    Science.gov (United States)

    Kurtz, Benjamin Bernard

    In recent years, ground-based sky imagers have emerged as a promising tool for forecasting solar energy on short time scales (0 to 30 minutes ahead). Following the development of sky imager hardware and algorithms at UC San Diego, we present three new or improved algorithms for sky imager forecasting and forecast evaluation. First, we present an algorithm for measuring irradiance with a sky imager. Sky imager forecasts are often used in conjunction with other instruments for measuring irradiance, so this has the potential to decrease instrumentation costs and logistical complexity. In particular, the forecast algorithm itself often relies on knowledge of the current irradiance which can now be provided directly from the sky images. Irradiance measurements are accurate to within about 10%. Second, we demonstrate a virtual sky imager testbed that can be used for validating and enhancing the forecast algorithm. The testbed uses high-quality (but slow) simulations to produce virtual clouds and sky images. Because virtual cloud locations are known, much more advanced validation procedures are possible with the virtual testbed than with measured data. In this way, we are able to determine that camera geometry and non-uniform evolution of the cloud field are the two largest sources of forecast error. Finally, with the assistance of the virtual sky imager testbed, we develop improvements to the cloud advection model used for forecasting. The new advection schemes are 10-20% better at short time horizons.
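    The frozen-cloud advection idea underlying such forecasts can be sketched as follows; this is a generic illustration with invented grid and wind values, not the UC San Diego forecast code.

        import numpy as np

        def advect_cloud_map(cloud_map, u, v, dt, dx):
            """Shift a cloud map by a uniform wind (u, v) over time dt, assuming
            'frozen' clouds; wrap-around at the grid edges is ignored for brevity."""
            shift_x = int(round(u * dt / dx))
            shift_y = int(round(v * dt / dx))
            return np.roll(cloud_map, shift=(shift_y, shift_x), axis=(0, 1))

        # 10-minute-ahead forecast on a 100 m grid with a 5 m/s easterly drift.
        clouds = np.zeros((200, 200), dtype=bool)
        clouds[80:120, 20:60] = True
        forecast = advect_cloud_map(clouds, u=5.0, v=0.0, dt=600.0, dx=100.0)
        print(forecast.sum(), "cloudy pixels after advection")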

  17. Real-time remote diagnostic monitoring test-bed in JET

    International Nuclear Information System (INIS)

    Castro, R.; Kneupner, K.; Vega, J.; De Arcas, G.; Lopez, J.M.; Purahoo, K.; Murari, A.; Fonseca, A.; Pereira, A.; Portas, A.

    2010-01-01

    Based on the remote experimentation concept oriented to long pulse shots, a test-bed system has been implemented in JET. Its main functionality is the remote real-time monitoring of a reflectometer diagnostic, visualizing different data outputs and status information. The architecture of the system comprises the data generator components, the data distribution system, an access control service, and the client applications. In the test-bed there is one data generator, which is the acquisition equipment associated with the reflectometer diagnostic that generates data and status information. The data distribution system has been implemented using a publishing-subscribing technology that receives data from data generators and redistributes them to client applications. Finally, for monitoring, a client application based on Java Web Start technology has been used. There are three interesting results from this project. The first is the analysis of different aspects (data formats, data frame rate, data resolution, etc.) related to remote real-time diagnostic monitoring oriented to long pulse experiments. The second is the definition and implementation of an architecture flexible enough to be applied to different types of data generated by other diagnostics, and that fits remote access requirements. Finally, the third result is a secure system, taking into account the internal network and firewall aspects of JET, and securing access for remote users. For this last issue, PAPI technology has been used, enabling access control based on user attributes, enabling mobile users to monitor diagnostics in real time, and enabling the integration of this service into the EFDA Federation (Castro et al., 2008).
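    The publish-subscribe data distribution described above can be sketched generically as follows; the class and topic names are invented for illustration and do not correspond to the JET implementation.

        from collections import defaultdict

        class DataDistributor:
            """Generic publish-subscribe relay: diagnostics publish frames on a
            topic, and remote monitoring clients subscribe with callbacks."""
            def __init__(self):
                self.subscribers = defaultdict(list)

            def subscribe(self, topic, callback):
                self.subscribers[topic].append(callback)

            def publish(self, topic, frame):
                for callback in self.subscribers[topic]:
                    callback(frame)

        bus = DataDistributor()
        bus.subscribe("reflectometer/status", lambda f: print("client got:", f))
        bus.publish("reflectometer/status", {"t": 12.5, "state": "ACQUIRING"})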

  18. Real-time remote diagnostic monitoring test-bed in JET

    Energy Technology Data Exchange (ETDEWEB)

    Castro, R., E-mail: rodrigo.castro@ciemat.e [Asociacion EURATOM/CIEMAT para Fusion, Madrid (Spain); Kneupner, K. [EURATOM/UKAEA Fusion Association, Culham Science Centre, Abingdon, OX14 3DB (United Kingdom); Vega, J. [Asociacion EURATOM/CIEMAT para Fusion, Madrid (Spain); De Arcas, G.; Lopez, J.M. [Universidad Politecnica de Madrid, Grupo I2A2, Madrid (Spain); Purahoo, K. [EURATOM/UKAEA Fusion Association, Culham Science Centre, Abingdon, OX14 3DB (United Kingdom); Murari, A. [Associazione EURATOM-ENEA per la Fusione, Consorzio RFX, 4-35127 Padova (Italy); Fonseca, A. [Associacao EURATOM/IST, Lisbon (Portugal); Pereira, A.; Portas, A. [Asociacion EURATOM/CIEMAT para Fusion, Madrid (Spain)

    2010-07-15

    Based on the remote experimentation concept oriented to long pulse shots, a test-bed system has been implemented in JET. Its main functionality is the remote real-time monitoring of a reflectometer diagnostic, visualizing different data outputs and status information. The architecture of the system comprises the data generator components, the data distribution system, an access control service, and the client applications. In the test-bed there is one data generator, which is the acquisition equipment associated with the reflectometer diagnostic that generates data and status information. The data distribution system has been implemented using a publishing-subscribing technology that receives data from data generators and redistributes them to client applications. Finally, for monitoring, a client application based on Java Web Start technology has been used. There are three interesting results from this project. The first is the analysis of different aspects (data formats, data frame rate, data resolution, etc.) related to remote real-time diagnostic monitoring oriented to long pulse experiments. The second is the definition and implementation of an architecture flexible enough to be applied to different types of data generated by other diagnostics, and that fits remote access requirements. Finally, the third result is a secure system, taking into account the internal network and firewall aspects of JET, and securing access for remote users. For this last issue, PAPI technology has been used, enabling access control based on user attributes, enabling mobile users to monitor diagnostics in real time, and enabling the integration of this service into the EFDA Federation (Castro et al., 2008).

  19. Social media analytics and research testbed (SMART): Exploring spatiotemporal patterns of human dynamics with geo-targeted social media messages

    Directory of Open Access Journals (Sweden)

    Jiue-An Yang

    2016-06-01

    The multilevel model of meme diffusion conceptualizes how mediated messages diffuse over time and space. As a pilot application of implementing the meme diffusion, we developed the social media analytics and research testbed to monitor Twitter messages and track the diffusion of information in and across different cities and geographic regions. Social media analytics and research testbed is an online geo-targeted search and analytics tool, including an automatic data processing procedure at the backend and an interactive frontend user interface. Social media analytics and research testbed is initially designed to facilitate (1) searching and geo-locating tweet topics and terms in different cities and geographic regions; (2) filtering noise from raw data (such as removing redundant retweets and using machine learning methods to improve precision); (3) analyzing social media data from a spatiotemporal perspective; and (4) visualizing social media data in diagnostic ways (such as weekly and monthly trends, trend maps, top media, top retweets, top mentions, or top hashtags). Social media analytics and research testbed provides researchers and domain experts with a tool that can efficiently facilitate the refinement, formalization, and testing of research hypotheses or questions. Three case studies (flu outbreaks, Ebola epidemic, and marijuana legalization) are introduced to illustrate how the predictions of meme diffusion can be examined and to demonstrate the potentials and key functions of social media analytics and research testbed.
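    Two of the processing steps listed above, retweet filtering and spatiotemporal aggregation, can be sketched as follows; the record fields and sample tweets are hypothetical and are not drawn from the SMART system.

        from collections import Counter
        from datetime import datetime

        # Hypothetical tweet records; field names are illustrative only.
        tweets = [
            {"id": 1, "text": "flu season again", "city": "San Diego",
             "created": "2016-01-04", "retweet_of": None},
            {"id": 2, "text": "RT flu season again", "city": "San Diego",
             "created": "2016-01-05", "retweet_of": 1},
            {"id": 3, "text": "got my flu shot", "city": "Denver",
             "created": "2016-01-12", "retweet_of": None},
        ]

        # Filter noise: drop redundant retweets.
        originals = [t for t in tweets if t["retweet_of"] is None]

        # Aggregate: weekly counts per city for a simple trend view.
        weekly = Counter(
            (t["city"], datetime.strptime(t["created"], "%Y-%m-%d").isocalendar()[1])
            for t in originals
        )
        print(weekly)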

  20. Drilling Automation Demonstrations in Subsurface Exploration for Astrobiology

    Science.gov (United States)

    Glass, Brian; Cannon, H.; Lee, P.; Hanagud, S.; Davis, K.

    2006-01-01

    This project proposes to study subsurface permafrost microbial habitats at a relevant Arctic Mars-analog site (Haughton Crater, Devon Island, Canada) while developing and maturing the subsurface drilling and drilling automation technologies that will be required by post-2010 missions. It builds on earlier drilling technology projects to add permafrost and ice-drilling capabilities to 5 m with a lightweight drill that will be automatically monitored and controlled in-situ. Frozen cores obtained with this drill under sterilized protocols will be used in testing three hypotheses pertaining to near-surface physical geology and ground H2O ice distribution, viewed as a habitat for microbial life in subsurface ice and ice-consolidated sediments. Automation technologies employed will demonstrate hands-off diagnostics and drill control, using novel vibrational dynamical analysis methods and model-based reasoning to monitor and identify drilling fault states before and during faults. Three field deployments, to a Mars-analog site with frozen impact crater fallback breccia, will support science goals, provide a rigorous test of drilling automation and lightweight permafrost drilling, and leverage past experience with the field site's particular logistics.

  1. Analysis, modeling, and simulation (AMS) testbed development and evaluation to support dynamic mobility applications (DMA) and active transportation and demand management (ATDM) programs : Dallas testbed analysis plan.

    Science.gov (United States)

    2016-06-16

    The primary objective of this project is to develop multiple simulation Testbeds/transportation models to evaluate the impacts of DMA connected vehicle applications and the active and dynamic transportation management (ATDM) strategies. The outputs (mo...

  2. Design and construction of a 76m long-travel laser enclosure for a space occulter testbed

    Science.gov (United States)

    Galvin, Michael; Kim, Yunjong; Kasdin, N. Jeremy; Sirbu, Dan; Vanderbei, Robert; Echeverri, Dan; Sagolla, Giuseppe; Rousing, Andreas; Balasubramanian, Kunjithapatham; Ryan, Daniel; Shaklan, Stuart; Lisman, Doug

    2016-07-01

    Princeton University is upgrading our space occulter testbed. In particular, we are lengthening it to 76m to achieve flightlike Fresnel numbers. This much longer testbed required an all-new enclosure design. In this design, we prioritized modularity and the use of commercial off-the-shelf (COTS) and semi-COTS components. Several of the technical challenges encountered included an unexpected slow beam drift and black paint selection. Herein we describe the design and construction of this long-travel laser enclosure.

  3. The end-to-end testbed of the optical metrology system on-board LISA Pathfinder

    Energy Technology Data Exchange (ETDEWEB)

    Steier, F; Cervantes, F Guzman; Marin, A F GarcIa; Heinzel, G; Danzmann, K [Max-Planck-Institut fuer Gravitationsphysik (Albert-Einstein-Institut) and Universitaet Hannover (Germany); Gerardi, D, E-mail: frank.steier@aei.mpg.d [EADS Astrium Satellites GmbH, Friedrichshafen (Germany)

    2009-05-07

    LISA Pathfinder is a technology demonstration mission for the Laser Interferometer Space Antenna (LISA). The main experiment on-board LISA Pathfinder is the so-called LISA Technology Package (LTP) which has the aim to measure the differential acceleration between two free-falling test masses with an accuracy of 3 × 10^-14 m s^-2 Hz^-1/2 between 1 mHz and 30 mHz. This measurement is performed interferometrically by the optical metrology system (OMS) on-board LISA Pathfinder. In this paper, we present the development of an experimental end-to-end testbed of the entire OMS. It includes the interferometer and its sub-units, the interferometer backend which is a phasemeter and the processing of the phasemeter output data. Furthermore, three-axes piezo-actuated mirrors are used instead of the free-falling test masses for the characterization of the dynamic behaviour of the system and some parts of the drag-free and attitude control system (DFACS) which controls the test masses and the satellite. The end-to-end testbed includes all parts of the LTP that can reasonably be tested on earth without free-falling test masses. At its present status it consists mainly of breadboard components. Some of those have already been replaced by engineering models of the LTP experiment. In the next steps, further engineering and flight models will also be inserted in this testbed and tested against well-characterized breadboard components. The presented testbed is an important reference for the unit tests and can also be used for validation of the on-board experiment during the mission.

  4. The Orlando TDWR testbed and airborne wind shear data comparison results

    Science.gov (United States)

    Campbell, Steven; Berke, Anthony; Matthews, Michael

    1992-01-01

    The focus of this talk is on comparing terminal Doppler Weather Radar (TDWR) and airborne wind shear data in computing a microburst hazard index called the F factor. The TDWR is a ground-based system for detecting wind shear hazards to aviation in the terminal area. The Federal Aviation Administration will begin deploying TDWR units near 45 airports in late 1992. As part of this development effort, M.I.T. Lincoln Laboratory operates under F.A.A. support a TDWR testbed radar in Orlando, FL. During the past two years, a series of flight tests has been conducted with instrumented aircraft penetrating microburst events while under testbed radar surveillance. These tests were carried out with a Cessna Citation 2 aircraft operated by the University of North Dakota (UND) Center for Aerospace Sciences in 1990, and a Boeing 737 operated by NASA Langley Research Center in 1991. A large data base of approximately 60 instrumented microburst penetrations has been obtained from these flights.
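    For context, the wind shear hazard index mentioned above is commonly written in the form below; this definition is supplied here for orientation and is not quoted from the abstract.

        % Common form of the F factor (supplied for context, not from the abstract):
        %   \dot{W}_x : rate of change of the horizontal wind along the flight path
        %   g         : gravitational acceleration
        %   w         : vertical wind component
        %   V         : aircraft airspeed
        F = \frac{\dot{W}_x}{g} - \frac{w}{V}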

  5. Development of optical packet and circuit integrated ring network testbed.

    Science.gov (United States)

    Furukawa, Hideaki; Harai, Hiroaki; Miyazawa, Takaya; Shinada, Satoshi; Kawasaki, Wataru; Wada, Naoya

    2011-12-12

    We developed novel integrated optical packet and circuit switch-node equipment. Compared with our previous equipment, a polarization-independent 4 × 4 semiconductor optical amplifier switch subsystem, gain-controlled optical amplifiers, and one 100 Gbps optical packet transponder and seven 10 Gbps optical path transponders with 10 Gigabit Ethernet (10GbE) client-interfaces were newly installed in the present system. The switch and amplifiers can provide more stable operation without equipment adjustments for the frequent polarization-rotations and dynamic packet-rate changes of optical packets. We constructed an optical packet and circuit integrated ring network testbed consisting of two switch nodes for accelerating network development, and we demonstrated 66 km fiber transmission and switching operation of multiplexed 14-wavelength 10 Gbps optical paths and 100 Gbps optical packets encapsulating 10GbE frames. Error-free operation (in terms of frame error rate) was confirmed for optical packets of various packet lengths and packet rates, and stable operation of the network testbed was confirmed. In addition, 4K uncompressed video streaming over OPS links was successfully demonstrated. © 2011 Optical Society of America

  6. Plasticity mechanisms in ultrafine grained freestanding aluminum thin films revealed by in-situ transmission electron microscopy nanomechanical testing

    International Nuclear Information System (INIS)

    Idrissi, Hosni; Kobler, Aaron; Amin-Ahmadi, Behnam; Schryvers, Dominique; Coulombier, Michael; Pardoen, Thomas; Galceran, Montserrat; Godet, Stéphane; Raskin, Jean-Pierre; Kübel, Christian

    2014-01-01

    In-situ bright field transmission electron microscopy (TEM) nanomechanical tensile testing and in-situ automated crystallographic orientation mapping in TEM were combined to unravel the elementary mechanisms controlling the plasticity of ultrafine grained Aluminum freestanding thin films. The characterizations demonstrate that deformation proceeds with a transition from grain rotation to intragranular dislocation glide and starvation plasticity mechanism at about 1% deformation. The grain rotation is not affected by the character of the grain boundaries. No grain growth or twinning is detected

  7. Plasticity mechanisms in ultrafine grained freestanding aluminum thin films revealed by in-situ transmission electron microscopy nanomechanical testing

    Energy Technology Data Exchange (ETDEWEB)

    Idrissi, Hosni, E-mail: hosni.idrissi@ua.ac.be [EMAT, University of Antwerp, Groenenborgerlaan 171, B-2020 Antwerp (Belgium); Institute of Mechanics, Materials and Civil Engineering, Université catholique de Louvain, Place Sainte Barbe 2, B-1348 Louvain-La-Neuve (Belgium); Kobler, Aaron [Institute of Nanotechnology (INT) and Karlsruhe Nano Micro Facility (KNMF), Karlsruhe Institute of Technology - KIT, Hermann-von-Helmholtz-Platz 1, 76344 Eggenstein-Leopoldshafen (Germany); Joint Research Laboratory Nanomaterials (KIT and TUD) at Technische Universität Darmstadt (TUD), Petersenstr. 32, 64287 Darmstadt (Germany); Amin-Ahmadi, Behnam; Schryvers, Dominique [EMAT, University of Antwerp, Groenenborgerlaan 171, B-2020 Antwerp (Belgium); Coulombier, Michael; Pardoen, Thomas [Institute of Mechanics, Materials and Civil Engineering, Université catholique de Louvain, Place Sainte Barbe 2, B-1348 Louvain-La-Neuve (Belgium); Galceran, Montserrat; Godet, Stéphane [Matters and Materials Department, Université Libre de Bruxelles, 50 Av. FD Roosevelt CP194/03, 1050 Brussels (Belgium); Raskin, Jean-Pierre [Information and Communications Technologies, Electronics and Applied Mathematics (ICTEAM), Microwave Laboratory, Université catholique de Louvain, B-1348 Louvain-la-Neuve (Belgium); Kübel, Christian [Institute of Nanotechnology (INT) and Karlsruhe Nano Micro Facility (KNMF), Karlsruhe Institute of Technology - KIT, Hermann-von-Helmholtz-Platz 1, 76344 Eggenstein-Leopoldshafen (Germany)

    2014-03-10

    In-situ bright field transmission electron microscopy (TEM) nanomechanical tensile testing and in-situ automated crystallographic orientation mapping in TEM were combined to unravel the elementary mechanisms controlling the plasticity of ultrafine grained Aluminum freestanding thin films. The characterizations demonstrate that deformation proceeds with a transition from grain rotation to intragranular dislocation glide and starvation plasticity mechanism at about 1% deformation. The grain rotation is not affected by the character of the grain boundaries. No grain growth or twinning is detected.

  8. Optimizing Electric Vehicle Coordination Over a Heterogeneous Mesh Network in a Scaled-Down Smart Grid Testbed

    DEFF Research Database (Denmark)

    Bhattarai, Bishnu Prasad; Lévesque, Martin; Maier, Martin

    2015-01-01

    High penetration of renewable energy sources and electric vehicles (EVs) creates power imbalance and congestion in the existing power network, and hence causes significant problems in control and operation. Despite huge efforts invested by electric utilities, governments, and researchers..., smart grid (SG) technology is still at the developmental stage to address those issues. In this regard, a smart grid testbed (SGT) is desirable to develop, analyze, and demonstrate various novel SG solutions, namely demand response, real-time pricing, and congestion management. In this paper, a novel SGT... is developed in a laboratory by scaling a 250 kVA, 0.4 kV real low-voltage distribution feeder down to 1 kVA, 0.22 kV. Information and communication technology is integrated in the scaled-down network to establish real-time monitoring and control. The novelty of the developed testbed is demonstrated

  9. Passive Thermal Design Approach for the Space Communications and Navigation (SCaN) Testbed Experiment on the International Space Station (ISS)

    Science.gov (United States)

    Siamidis, John; Yuko, Jim

    2014-01-01

    The Space Communications and Navigation (SCaN) Program Office at NASA Headquarters oversees all of NASA's space communications activities. SCaN manages and directs the ground-based facilities and services provided by the Deep Space Network (DSN), Near Earth Network (NEN), and the Space Network (SN). Through the SCaN Program Office, NASA GRC developed a Software Defined Radio (SDR) testbed experiment (SCaN testbed experiment) for use on the International Space Station (ISS). It comprises three different SDR radios: the Jet Propulsion Laboratory (JPL) radio, the Harris Corporation radio, and the General Dynamics Corporation radio. The SCaN testbed experiment provides an on-orbit, adaptable, SDR Space Telecommunications Radio System (STRS)-based facility to conduct a suite of experiments to advance the Software Defined Radio and Space Telecommunications Radio Systems (STRS) standards, reduce risk (Technology Readiness Level (TRL) advancement) for candidate Constellation future space flight hardware and software, and demonstrate space communication links critical to future NASA exploration missions. The SCaN testbed project provides NASA, industry, other Government agencies, and academic partners the opportunity to develop and field communications, navigation, and networking technologies in the laboratory and space environment based on reconfigurable, software defined radio platforms and the STRS Architecture. The SCaN testbed is resident on the P3 Express Logistics Carrier (ELC) on the exterior truss of the International Space Station (ISS). The SCaN testbed payload launched on the Japanese Aerospace Exploration Agency (JAXA) H-II Transfer Vehicle (HTV) and was installed on the ISS P3 ELC located on the inboard RAM P3 site. The daily operations and testing are managed out of NASA GRC in the Telescience Support Center (TSC).

  10. Automated Predictive Big Data Analytics Using Ontology Based Semantics.

    Science.gov (United States)

    Nural, Mustafa V; Cotterell, Michael E; Peng, Hao; Xie, Rui; Ma, Ping; Miller, John A

    2015-10-01

    Predictive analytics in the big data era is taking on an increasingly important role. Issues related to the choice of modeling technique, estimation procedure (or algorithm), and efficient execution can present significant challenges. For example, selection of appropriate and optimal models for big data analytics often requires careful investigation and considerable expertise which might not always be readily available. In this paper, we propose to use semantic technology to assist data analysts and data scientists in selecting appropriate modeling techniques and building specific models as well as the rationale for the techniques and models selected. To formally describe the modeling techniques, models and results, we developed the Analytics Ontology that supports inferencing for semi-automated model selection. The SCALATION framework, which currently supports over thirty modeling techniques for predictive big data analytics, is used as a testbed for evaluating the use of semantic technology.

  11. Human activity and rest in situ.

    Science.gov (United States)

    Roenneberg, Till; Keller, Lena K; Fischer, Dorothee; Matera, Joana L; Vetter, Céline; Winnebeck, Eva C

    2015-01-01

    Our lives are structured by the daily alternation of activity and rest, of wake and sleep. Despite significant advances in circadian and sleep research, we still lack answers to many of the most fundamental questions about this conspicuous behavioral pattern. We strongly believe that investigating this pattern in entrained conditions, real-life and daily contexts-in situ-will help the field to elucidate some of these central questions. Here, we present two common approaches for in situ investigation of human activity and rest: the Munich ChronoType Questionnaire (MCTQ) and actimetry. In the first half of this chapter, we provide detailed instructions on how to use and interpret the MCTQ. In addition, we give an overview of the main insights gained with this instrument over the past 10 years, including some new findings on the interaction of light and age on sleep timing. In the second half of this chapter, we introduce the reader to the method of actimetry and share our experience in basic analysis techniques, including visualization, smoothing, and cosine model fitting of in situ recorded data. Additionally, we describe our new approach to automatically detect sleep from activity recordings. Our vision is that the broad use of such easy techniques in real-life settings combined with automated analyses will lead to the creation of large databases. The resulting power of big numbers will promote our understanding of such fundamental biological phenomena as sleep. © 2015 Elsevier Inc. All rights reserved.
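    One of the basic analysis techniques mentioned above, cosine model fitting of an activity series, can be sketched as follows; the synthetic data and parameter values are invented for illustration, and the code is not the authors' analysis pipeline.

        import numpy as np
        from scipy.optimize import curve_fit

        def cosinor(t_hours, mesor, amplitude, acrophase):
            """24-h cosine model commonly used for rest-activity rhythms."""
            return mesor + amplitude * np.cos(2 * np.pi * (t_hours - acrophase) / 24.0)

        # Synthetic 3-day activity series sampled every 10 minutes.
        t = np.arange(0.0, 72.0, 1.0 / 6.0)
        activity = 50 + 40 * np.cos(2 * np.pi * (t - 15) / 24) + np.random.normal(0, 5, t.size)

        params, _ = curve_fit(cosinor, t, activity, p0=[40.0, 30.0, 12.0])
        mesor, amplitude, acrophase = params
        print(f"mesor={mesor:.1f}  amplitude={amplitude:.1f}  acrophase={acrophase % 24:.1f} h")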

  12. In situ spectrophotometric measurement of dissolved inorganic carbon in seawater

    Science.gov (United States)

    Liu, Xuewu; Byrne, Robert H.; Adornato, Lori; Yates, Kimberly K.; Kaltenbacher, Eric; Ding, Xiaoling; Yang, Bo

    2013-01-01

    Autonomous in situ sensors are needed to document the effects of today’s rapid ocean uptake of atmospheric carbon dioxide (e.g., ocean acidification). General environmental conditions (e.g., biofouling, turbidity) and carbon-specific conditions (e.g., wide diel variations) present significant challenges to acquiring long-term measurements of dissolved inorganic carbon (DIC) with satisfactory accuracy and resolution. SEAS-DIC is a new in situ instrument designed to provide calibrated, high-frequency, long-term measurements of DIC in marine and fresh waters. Sample water is first acidified to convert all DIC to carbon dioxide (CO2). The sample and a known reagent solution are then equilibrated across a gas-permeable membrane. Spectrophotometric measurement of reagent pH can thereby determine the sample DIC over a wide dynamic range, with inherent calibration provided by the pH indicator’s molecular characteristics. Field trials indicate that SEAS-DIC performs well in biofouling and turbid waters, with a DIC accuracy and precision of ∼2 μmol kg–1 and a measurement rate of approximately once per minute. The acidic reagent protects the sensor cell from biofouling, and the gas-permeable membrane excludes particulates from the optical path. This instrument, the first spectrophotometric system capable of automated in situ DIC measurements, positions DIC to become a key parameter for in situ CO2-system characterizations.

  13. M1 mirror print-through investigation and performance on the thermo-opto-mechanical testbed for the Space Interferometry Mission

    Science.gov (United States)

    Feria, V. Alfonso; Lam, Jonathan; Van Buren, Dave

    2006-06-01

    SIM PlanetQuest (SIM) is a large (9-meter baseline) space-borne optical interferometer that will determine the position and distance of stars to high accuracy. With microarcsecond measurements SIM will probe nearby stars for Earth-sized planets. To achieve this precision, SIM requires very tight manufacturing tolerances and high stability of optical components. To reduce technical risks, the SIM project developed an integrated thermal, mechanical and optical testbed (TOM3) to allow predictions of the system performance at the required high precision. The TOM3 testbed used full-scale brassboard optical components and picometer-class metrology to reach the SIM target performance levels. During the testbed integration and after one of the testbed mirrors, M1, was bonded into its mount, some surface distortion dimples that exceeded the optical specification were discovered. A detailed finite element model was used to analyze different load cases to try to determine the source of the M1 surface deformations. The same model was also used to compare with actual deformations due to varied thermal conditions on the TOM3 testbed. This paper presents the studies carried out to determine the source of the surface distortions on the M1 mirror as well as comparison and model validation during testing. This research was carried out at the Jet Propulsion Laboratory, California Institute of Technology, under a contract with the National Aeronautics and Space Administration.

  14. Digital microfluidics for automated hanging drop cell spheroid culture.

    Science.gov (United States)

    Aijian, Andrew P; Garrell, Robin L

    2015-06-01

    Cell spheroids are multicellular aggregates, grown in vitro, that mimic the three-dimensional morphology of physiological tissues. Although there are numerous benefits to using spheroids in cell-based assays, the adoption of spheroids in routine biomedical research has been limited, in part, by the tedious workflow associated with spheroid formation and analysis. Here we describe a digital microfluidic platform that has been developed to automate liquid-handling protocols for the formation, maintenance, and analysis of multicellular spheroids in hanging drop culture. We show that droplets of liquid can be added to and extracted from through-holes, or "wells," and fabricated in the bottom plate of a digital microfluidic device, enabling the formation and assaying of hanging drops. Using this digital microfluidic platform, spheroids of mouse mesenchymal stem cells were formed and maintained in situ for 72 h, exhibiting good viability (>90%) and size uniformity (% coefficient of variation <10% intraexperiment, <20% interexperiment). A proof-of-principle drug screen was performed on human colorectal adenocarcinoma spheroids to demonstrate the ability to recapitulate physiologically relevant phenomena such as insulin-induced drug resistance. With automatable and flexible liquid handling, and a wide range of in situ sample preparation and analysis capabilities, the digital microfluidic platform provides a viable tool for automating cell spheroid culture and analysis. © 2014 Society for Laboratory Automation and Screening.

  15. An Overview of NASA's Subsonic Research Aircraft Testbed (SCRAT)

    Science.gov (United States)

    Baumann, Ethan; Hernandez, Joe; Ruhf, John C.

    2013-01-01

    National Aeronautics and Space Administration Dryden Flight Research Center acquired a Gulfstream III (GIII) aircraft to serve as a testbed for aeronautics flight research experiments. The aircraft is referred to as SCRAT, which stands for SubsoniC Research Aircraft Testbed. The aircraft's mission is to perform aeronautics research; more specifically raising the Technology Readiness Level (TRL) of advanced technologies through flight demonstrations and gathering high-quality research data suitable for verifying the technologies, and validating design and analysis tools. The SCRAT has the ability to conduct a range of flight research experiments throughout a transport class aircraft's flight envelope. Experiments ranging from flight-testing of a new aircraft system or sensor to those requiring structural and aerodynamic modifications to the aircraft can be accomplished. The aircraft has been modified to include an instrumentation system and sensors necessary to conduct flight research experiments along with a telemetry capability. An instrumentation power distribution system was installed to accommodate the instrumentation system and future experiments. An engineering simulation of the SCRAT has been developed to aid in integrating research experiments. A series of baseline aircraft characterization flights has been flown that gathered flight data to aid in developing and integrating future research experiments. This paper describes the SCRAT's research systems and capabilities.

  16. Design of a low-power testbed for Wireless Sensor Networks and verification

    NARCIS (Netherlands)

    van Hoesel, L.F.W.; Dulman, S.O.; Havinga, Paul J.M.; Kip, Harry J.

    In this document the design considerations and component choices of a testbed prototype device for wireless sensor networks will be discussed. These devices must be able to monitor their physical environment, process data and assist other nodes in forwarding sensor readings. For these tasks, five

  17. Setup for in situ deep level transient spectroscopy of semiconductors during swift heavy ion irradiation.

    Science.gov (United States)

    Kumar, Sandeep; Kumar, Sugam; Katharria, Y S; Safvan, C P; Kanjilal, D

    2008-05-01

    A computerized system for in situ deep level characterization during irradiation in semiconductors has been set up and tested in the beam line for materials science studies of the 15 MV Pelletron accelerator at the Inter-University Accelerator Centre, New Delhi. This is a new facility for in situ irradiation-induced deep level studies, available in the beam line of an accelerator laboratory. It is based on the well-known deep level transient spectroscopy (DLTS) technique. High versatility for data manipulation is achieved through a multifunction data acquisition card and LabVIEW. In situ DLTS studies of deep levels produced by impact of 100 MeV Si ions on an Au/n-Si(100) Schottky barrier diode are presented to illustrate the performance of the automated DLTS facility in the beam line.

  18. Development of a hardware-in-the-loop testbed to demonstrate multiple spacecraft operations in proximity

    Science.gov (United States)

    Eun, Youngho; Park, Sang-Young; Kim, Geuk-Nam

    2018-06-01

    This paper presents a new state-of-the-art ground-based hardware-in-the-loop test facility, which was developed to verify and demonstrate autonomous guidance, navigation, and control algorithms for space proximity operations and formation flying maneuvers. The test facility consists of two complete spaceflight simulators, an aluminum-based operational arena, and a set of infrared motion tracking cameras; thus, the testbed is capable of representing space activities under circumstances prevailing on the ground. The spaceflight simulators have a maximum of five-degree-of-freedom in a quasi-momentum-free environment, which is produced by a set of linear/hemispherical air-bearings and a horizontally leveled operational arena. The tracking system measures the real-time three-dimensional position and attitude to provide state variables to the agents. The design of the testbed is illustrated in detail for every element throughout the paper. The practical hardware characteristics of the active/passive measurement units and internal actuators are identified in detail from various perspectives. These experimental results support the successful development of the entire facility and enable us to implement and verify the spacecraft proximity operation strategy in the near future.

  19. Autonomous power expert fault diagnostic system for Space Station Freedom electrical power system testbed

    Science.gov (United States)

    Truong, Long V.; Walters, Jerry L.; Roth, Mary Ellen; Quinn, Todd M.; Krawczonek, Walter M.

    1990-01-01

    The goal of the Autonomous Power System (APS) program is to develop and apply intelligent problem solving and control to the Space Station Freedom Electrical Power System (SSF/EPS) testbed being developed and demonstrated at NASA Lewis Research Center. The objectives of the program are to establish artificial intelligence technology paths, to craft knowledge-based tools with advanced human-operator interfaces for power systems, and to interface and integrate knowledge-based systems with conventional controllers. The Autonomous Power EXpert (APEX) portion of the APS program will integrate a knowledge-based fault diagnostic system and a power resource planner-scheduler. Then APEX will interface on-line with the SSF/EPS testbed and its Power Management Controller (PMC). The key tasks include establishing knowledge bases for system diagnostics, fault detection and isolation analysis, on-line information accessing through PMC, enhanced data management, and multiple-level, object-oriented operator displays. The first prototype of the diagnostic expert system for fault detection and isolation has been developed. The knowledge bases and the rule-based model that were developed for the Power Distribution Control Unit subsystem of the SSF/EPS testbed are described. A corresponding troubleshooting technique is also described.

  20. Design and Prototyping of a Satellite Antenna Slew Testbed

    Science.gov (United States)

    2013-12-01

    The position and velocity results from the computed trajectory were then implemented on the testbed motors for comparison of actual versus commanded values.

  1. Automation & robotics for future Mars exploration

    Science.gov (United States)

    Schulte, W.; von Richter, A.; Bertrand, R.

    2003-04-01

    Automation and Robotics (A&R) are currently considered a key technology for Mars exploration. Initiatives in this field aim at developing new A&R systems and technologies for planetary surface exploration. Kayser-Threde led the study AROMA (Automation & Robotics for Human Mars Exploration) under ESA contract in order to define a reference architecture of A&R elements in support of a human Mars exploration program. One of the goals was to define new developments and to maintain the competitiveness of European industry within this field. We present a summary of the A&R study with respect to a particular system: the Autonomous Research Island (ARI). In the Mars exploration scenario, initially a robotic outpost system lands at pre-selected sites in order to search for life forms and water and to analyze the surface, geology and atmosphere. A&R systems, i.e. rovers and autonomous instrument packages, perform a number of missions with scientific and technology development objectives on the surface of Mars as part of preparations for a human exploration mission. In the Robotic Outpost Phase, ARI is conceived as an automated lander which can perform in-situ analysis. It consists of a service module and a micro-rover system for local investigations. Such a system is already under investigation and development in other TRP activities. The micro-rover system provides local mobility for in-situ scientific investigations at a given landing or deployment site. In the long run, ARI also supports human Mars missions. An astronaut crew would travel larger distances in a pressurized rover on Mars. Whenever interesting features on the surface are identified, the crew would interrupt the travel and perform local investigations. In order to save crew time, ARI could be deployed by the astronauts to perform time-consuming investigations such as in-situ geochemistry analysis of rocks/soil. Later, the crew could recover the research island for refurbishment and deployment at another

  2. The OGC Innovation Program Testbeds - Advancing Architectures for Earth and Space Systems

    Science.gov (United States)

    Bermudez, L. E.; Percivall, G.; Simonis, I.; Serich, S.

    2017-12-01

    The OGC Innovation Program provides a collaborative agile process for solving challenging science problems and advancing new technologies. Since 1999, 100 initiatives have taken place, from multi-million dollar testbeds to small interoperability experiments. During these initiatives, sponsors and technology implementers (including academia and the private sector) come together to solve problems, produce prototypes, develop demonstrations, provide best practices, and advance the future of standards. This presentation will provide the latest system architectures that can be used for Earth and space systems as a result of OGC Testbed 13, including the following components: an elastic cloud autoscaler for Earth Observations (EO) using a WPS in an ESGF hybrid climate data research platform; accessibility of climate data for scientist and non-scientist users via on-demand models wrapped in a WPS; standards descriptions for containerized applications to discover processes on the cloud, including using linked data, a WPS extension for hybrid clouds, and linking to hybrid big data stores; OpenID and OAuth to secure OGC services with built-in Attribute Based Access Control (ABAC) infrastructures leveraging GeoDRM patterns; publishing and access of vector tiles, including use of compression and attribute options reusing patterns from WMS, WMTS, and WFS; servers providing 3D Tiles and streaming of data, including Indexed 3D Scene Layer (I3S), CityGML, and Common DataBase (CDB); and asynchronous services with advanced push notification strategies, using a filter language instead of simple topic subscriptions, that can be used across OGC services. Testbed 14 will continue advancing topics like Big Data, security, and streaming, as well as making OGC services easier to use (e.g., RESTful APIs). The Call for Participation will be issued in December and responses are due in mid-January 2018.

  3. Testbed model and data assimilation for ARM

    International Nuclear Information System (INIS)

    Louis, J.F.

    1992-01-01

    The objectives of this contract are to further develop and test the ALFA (AER Local Forecast and Assimilation) model originally designed at AER for local weather prediction and apply it to three distinct but related purposes in connection with the Atmospheric Radiation Measurement (ARM) program: (a) to provide a testbed that simulates a global climate model in order to facilitate the development and testing of new cloud parametrizations and radiation models; (b) to assimilate the ARM data continuously at the scale of a climate model, using the adjoint method, thus providing the initial conditions and verification data for testing parametrizations; (c) to study the sensitivity of a radiation scheme to cloud parameters, again using the adjoint method, thus demonstrating the usefulness of the testbed model. The data assimilation will use a variational technique that minimizes the difference between the model results and the observations during the analysis period. The adjoint model is used to compute the gradient of a measure of the model errors with respect to nudging terms that are added to the equations to force the model output closer to the data. The radiation scheme that will be included in the basic ALFA model makes use of a general two-stream approximation, and is designed for vertically inhomogeneous, multiple-scattering atmospheres. The sensitivity of this model to the definition of cloud parameters will be studied. The adjoint technique will also be used to compute the sensitivities. This project is designed to provide the Science Team members with the appropriate tools and modeling environment for proper testing and tuning of new radiation models and cloud parametrization schemes.
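    In generic terms (not taken verbatim from the report), the variational nudging approach sketched above can be written as the minimization of a cost function subject to the nudged model equations, with the gradient with respect to the nudging terms obtained from a backward integration of the adjoint model:

        % Generic variational nudging formulation (illustrative only):
        \min_{u}\; J(u) = \int_{t_0}^{t_1} \Big[ \big(x(t)-x_{\mathrm{obs}}(t)\big)^{\mathsf{T}} R^{-1} \big(x(t)-x_{\mathrm{obs}}(t)\big)
                          + \alpha\, u(t)^{\mathsf{T}} u(t) \Big] \, dt,
        \qquad \text{subject to } \dot{x} = f(x) + u(t),
        % where R is an observation-error weighting, \alpha penalizes the nudging
        % terms u(t), and \nabla_u J is evaluated via the adjoint model.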

  4. A commercial space technology testbed on ISS

    Science.gov (United States)

    Boyle, David R.

    2000-01-01

    There is a significant and growing commercial market for new, more capable communications and remote sensing satellites. Competition in this market strongly motivates satellite manufacturers and spacecraft component developers to test and demonstrate new space hardware in a realistic environment. External attach points on the International Space Station allow it to function uniquely as a space technology testbed to satisfy this market need. However, space industry officials have identified three critical barriers to their commercial use of the ISS: unpredictable access, cost risk, and schedule uncertainty. Appropriate NASA policy initiatives and business/technical assistance for industry from the Commercial Space Center for Engineering can overcome these barriers.

  5. Automated Passive Capillary Lysimeters for Estimating Water Drainage in the Vadose Zone

    Science.gov (United States)

    Jabro, J.; Evans, R.

    2009-04-01

    In this study, we demonstrated and evaluated the performance and accuracy of automated PCAP lysimeters that we designed for continuous in-situ measurement and estimation of drainage water below the rootzone of a sugarbeet-potato-barley rotation under two irrigation frequencies. Twelve automated PCAPs with sampling surfaces 31 cm wide by 91 cm long and 87 cm in height were placed 90 cm below the soil surface in a Lihen sandy loam. Our state-of-the-art design incorporated Bluetooth wireless technology to enable an automated datalogger to transmit drainage water data every 15 minutes to a remote host, and had greater efficiency than other types of lysimeters. It also offered a significantly larger coverage area (2700 cm2) than similarly designed vadose zone lysimeters. The cumulative manually extracted drainage water was compared with the cumulative volume of drainage water recorded by the datalogger from the tipping bucket using several statistical methods. Our results indicated that our automated PCAPs are accurate and provide a convenient means for estimating water drainage in the vadose zone without the need for costly and time-consuming supporting systems.
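    The conversion from logged tipping-bucket counts to cumulative drainage, and its comparison against manual extraction, can be sketched as follows; the tip calibration volume and sample numbers are invented for illustration.

        ML_PER_TIP = 4.7   # hypothetical calibration volume per bucket tip (mL)

        def cumulative_drainage_ml(tip_counts):
            """Convert 15-minute tip counts into a cumulative drainage series (mL)."""
            total, series = 0.0, []
            for tips in tip_counts:
                total += tips * ML_PER_TIP
                series.append(total)
            return series

        logged = cumulative_drainage_ml([0, 3, 12, 8, 1, 0])
        manual_total_ml = 115.0   # hypothetical manually extracted volume
        print(f"logged={logged[-1]:.1f} mL  manual={manual_total_ml:.1f} mL  "
              f"difference={(logged[-1] - manual_total_ml) / manual_total_ml:+.1%}")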

  6. PlanetLab Europe as Geographically-Distributed Testbed for Software Development and Evaluation

    Directory of Open Access Journals (Sweden)

    Dan Komosny

    2015-01-01

    In this paper, we analyse the use of PlanetLab Europe for the development and evaluation of geographically-oriented Internet services. PlanetLab is a global research network whose main purpose is to support the development of new Internet services and protocols. PlanetLab is divided into several branches; one of them is PlanetLab Europe. PlanetLab Europe consists of about 350 nodes at 150 geographically different sites. The nodes are accessible by remote login, and the users can run their software on the nodes. In the paper, we study PlanetLab's properties that are significant for its use as a geographically distributed testbed. These include node position accuracy and service availability and stability. We find a considerable number of location inaccuracies and a number of services that cannot be considered reliable. Based on the results, we propose a simple approach to node selection in testbeds for the development and evaluation of geographically-oriented Internet services.
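    A node-selection filter of the kind proposed above might look like the sketch below; the node records, uptime figures, and thresholds are invented, since the paper's actual criteria are not given in the abstract.

        # Hypothetical node-selection filter; fields and thresholds are invented.
        nodes = [
            {"host": "node1.example.org", "uptime": 0.99, "loc_error_km": 2.0},
            {"host": "node2.example.org", "uptime": 0.70, "loc_error_km": 1.0},
            {"host": "node3.example.org", "uptime": 0.97, "loc_error_km": 250.0},
        ]

        def usable(node, min_uptime=0.95, max_loc_error_km=25.0):
            """Keep nodes that are both reliably available and plausibly geolocated."""
            return node["uptime"] >= min_uptime and node["loc_error_km"] <= max_loc_error_km

        testbed = [n["host"] for n in nodes if usable(n)]
        print(testbed)   # ['node1.example.org']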

  7. In-situ gamma spectroscopic measurement of natural waters in Bulgaria

    International Nuclear Information System (INIS)

    Manushev, B.; Mandzhukov, I.; Tsankov, L.; Boshkova, T.; Gurev, V.; Mandzhukova, B.; Kozhukharov, I.; Grozev, G.

    1983-01-01

    In-situ gamma spectrometric measurements are carried out to record differences higher than the errors of measurement in the gamma-field spectra in various basins in Bulgaria - two high mountain lakes, dam and the Black sea. A standard scintillation gamma spectrometer, consisting of a scintillation detector ND-424 type, a channel analyzer NP-424 and a 128 channel Al-128 type analyzer, has been used. The sensitivity of the procedure used is sufficient to detect the transfer of nuclides by dissolution from rocks, forming the bottom and the water-collecting region of the water basin. The advancement of the experimental techniques defines the future use of the procedure. In-situ gamma spectrometric determination may be used in cases of continuous and automated control of the radiation purity of the cooling water in atomic power plants or the water basins located close to such plants and of radioactive contamination of the sea and ocean water

  8. BEATBOX v1.0: Background Error Analysis Testbed with Box Models

    Science.gov (United States)

    Knote, Christoph; Barré, Jérôme; Eckl, Max

    2018-02-01

    The Background Error Analysis Testbed (BEATBOX) is a new data assimilation framework for box models. Based on the BOX Model eXtension (BOXMOX) to the Kinetic Pre-Processor (KPP), this framework allows users to conduct performance evaluations of data assimilation experiments, sensitivity analyses, and detailed chemical scheme diagnostics from an observation simulation system experiment (OSSE) point of view. The BEATBOX framework incorporates an observation simulator and a data assimilation system with the possibility of choosing ensemble, adjoint, or combined sensitivities. A user-friendly, Python-based interface allows for the tuning of many parameters for atmospheric chemistry and data assimilation research as well as for educational purposes, for example observation error, model covariances, ensemble size, perturbation distribution in the initial conditions, and so on. In this work, the testbed is described and two case studies are presented to illustrate the design of a typical OSSE experiment, data assimilation experiments, a sensitivity analysis, and a method for diagnosing model errors. BEATBOX is released as an open source tool for the atmospheric chemistry and data assimilation communities.
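    The OSSE idea behind such a framework (a truth run, simulated observations with prescribed error, and an ensemble analysis) can be illustrated with the generic, self-contained sketch below; it uses a toy decay model and does not reflect BEATBOX's actual Python interface.

        import numpy as np

        rng = np.random.default_rng(0)

        def model(x0, k=0.1, steps=24):
            """Toy 'chemistry' model: first-order decay of a single species."""
            return x0 * np.exp(-k * np.arange(steps))

        truth = model(100.0)                                   # nature run
        obs_err = 2.0
        obs = truth + rng.normal(0.0, obs_err, truth.size)     # simulated observations

        # Ensemble of perturbed initial conditions, then a scalar analysis at t = 0.
        ensemble = np.array([model(x0) for x0 in rng.normal(100.0, 10.0, 20)])
        prior = ensemble[:, 0]
        gain = prior.var() / (prior.var() + obs_err ** 2)      # scalar Kalman gain
        analysis = prior + gain * (obs[0] - prior)
        print(f"prior mean={prior.mean():.1f}  analysis mean={analysis.mean():.1f}  truth={truth[0]:.1f}")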

  9. BEATBOX v1.0: Background Error Analysis Testbed with Box Models

    Directory of Open Access Journals (Sweden)

    C. Knote

    2018-02-01

    The Background Error Analysis Testbed (BEATBOX) is a new data assimilation framework for box models. Based on the BOX Model eXtension (BOXMOX) to the Kinetic Pre-Processor (KPP), this framework allows users to conduct performance evaluations of data assimilation experiments, sensitivity analyses, and detailed chemical scheme diagnostics from an observation simulation system experiment (OSSE) point of view. The BEATBOX framework incorporates an observation simulator and a data assimilation system with the possibility of choosing ensemble, adjoint, or combined sensitivities. A user-friendly, Python-based interface allows for the tuning of many parameters for atmospheric chemistry and data assimilation research as well as for educational purposes, for example observation error, model covariances, ensemble size, perturbation distribution in the initial conditions, and so on. In this work, the testbed is described and two case studies are presented to illustrate the design of a typical OSSE experiment, data assimilation experiments, a sensitivity analysis, and a method for diagnosing model errors. BEATBOX is released as an open source tool for the atmospheric chemistry and data assimilation communities.

  10. A testbed to explore the optimal electrical stimulation parameters for suppressing inter-ictal spikes in human hippocampal slices.

    Science.gov (United States)

    Min-Chi Hsiao; Pen-Ning Yu; Dong Song; Liu, Charles Y; Heck, Christi N; Millett, David; Berger, Theodore W

    2014-01-01

    New interventions using neuromodulatory devices such as vagus nerve stimulation, deep brain stimulation and responsive neurostimulation are available or under study for the treatment of refractory epilepsy. Since the actual mechanisms of the onset and termination of the seizure are still unclear, most researchers or clinicians determine the optimal stimulation parameters through trial-and-error procedures. It is necessary to further explore what types of electrical stimulation parameters (these may include stimulation frequency, amplitude, duration, interval pattern, and location) constitute a set of optimal stimulation paradigms to suppress seizures. In a previous study, we developed an in vitro epilepsy model using hippocampal slices from patients suffering from mesial temporal lobe epilepsy. Using a planar multi-electrode array system, inter-ictal activity from human hippocampal slices was consistently recorded. In this study, we have further transferred this in vitro seizure model to a testbed for exploring the possible neurostimulation paradigms to inhibit inter-ictal spikes. The methodology used to collect the electrophysiological data, the approach to apply different electrical stimulation parameters to the slices are provided in this paper. The results show that this experimental testbed will provide a platform for testing the optimal stimulation parameters of seizure cessation. We expect this testbed will expedite the process for identifying the most effective parameters, and may ultimately be used to guide programming of new stimulating paradigms for neuromodulatory devices.

  11. Comparison of Chromogenic In Situ Hybridization and Fluorescence In Situ Hybridization for the Evaluation of MDM2 Amplification in Adipocytic Tumors.

    Science.gov (United States)

    Mardekian, Stacey K; Solomides, Charalambos C; Gong, Jerald Z; Peiper, Stephen C; Wang, Zi-Xuan; Bajaj, Renu

    2015-11-01

    Atypical lipomatous tumor/well-differentiated liposarcoma (ALT-WDLPS) and dedifferentiated liposarcoma (DDLPS) are characterized cytogenetically by a 12q13-15 amplification involving the mouse double minute 2 (MDM2) oncogene. Fluorescence in situ hybridization (FISH) is used frequently to detect this amplification and aid with the diagnosis of these entities, which is difficult by morphology alone. Recently, bright-field in situ hybridization techniques such as chromogenic in situ hybridization (CISH) have been introduced for the determination of MDM2 amplification status. The present study compared the results of FISH and CISH for detecting MDM2 amplification in 41 cases of adipocytic tumors. Amplification was defined in both techniques as a MDM2/CEN12 ratio of 2 or greater. Eleven cases showed amplification with both FISH and CISH, and 26 cases showed no amplification with both methods. Two cases had discordant results between CISH and FISH, and two cases were not interpretable by CISH. CISH is advantageous for allowing pathologists to evaluate the histologic and molecular alterations occurring simultaneously in a specimen. Moreover, CISH is found to be more cost- and time-efficient when used with automation, and the signals do not quench over time. CISH technique is a reliable alternative to FISH in the evaluation of adipocytic tumors for MDM2 amplification. © 2014 Wiley Periodicals, Inc.

  12. Drilling Automation Tests At A Lunar/Mars Analog Site

    Science.gov (United States)

    Glass, B.; Cannon, H.; Hanagud, S.; Lee, P.; Paulsen, G.

    2006-01-01

    Future in-situ lunar/martian resource utilization and characterization, as well as the scientific search for life on Mars, will require access to the subsurface and hence drilling. Drilling on Earth is hard - an art form more than an engineering discipline. The limited mass, energy and manpower in planetary drilling situations makes application of terrestrial drilling techniques problematic. The Drilling Automation for Mars Exploration (DAME) project is developing drilling automation and robotics for projected use in missions to the Moon and Mars in the 2011-15 period. This has been tested recently, drilling in permafrost at a lunar/martian analog site (Haughton Crater, Devon Island, Canada).

  13. Analysis, modeling, and simulation (AMS) testbed development and evaluation to support dynamic mobility applications (DMA) and active transportation and demand management (ATDM) programs - evaluation summary for the San Diego testbed

    Science.gov (United States)

    2017-08-01

    The primary objective of this project is to develop multiple simulation testbeds and transportation models to evaluate the impacts of Connected Vehicle Dynamic Mobility Applications (DMA) and Active Transportation and Demand Management (ATDM) strateg...

  14. Analysis, Modeling, and Simulation (AMS) Testbed Development and Evaluation to Support Dynamic Mobility Applications (DMA) and Active Transportation and Demand Management (ATDM) Programs - Evaluation Report for the San Diego Testbed

    Science.gov (United States)

    2017-07-01

    The primary objective of this project is to develop multiple simulation testbeds and transportation models to evaluate the impacts of Connected Vehicle Dynamic Mobility Applications (DMA) and Active Transportation and Demand Management (ATDM) strateg...

  15. Real-Time Remote Diagnostic Monitoring Test-bed in JET

    Energy Technology Data Exchange (ETDEWEB)

    Castro, R. [Asociation Euratom/CIEMAT para Fusion, Madrid (Spain); Kneupner, K.; Purahoo, K. [EURATOM/UKAEA Fusion Association, Abingdon (United Kingdom); Vega, J.; Pereira, A.; Portas, A. [Association EuratomCIEMAT para Fusion, Madrid (Spain); De Arcas, G.; Lopez, J.M. [Universidad Politecnica de Madrid (Spain); Murari, A. [Consorzio RFX, Padova (Italy); Fonseca, A. [Associacao URATOM/IST, Lisboa (Portugal); Contributors, J.E. [JET-EFDA, Abingdon (United Kingdom)

    2009-07-01

    Based on the remote experimentation concept oriented to long pulse shots, a test-bed system has been implemented in JET. It integrates two functionalities. The first is the remote real-time monitoring of a reflectometer diagnostic, to visualize different data outputs and status information. The second is the integration of dotJET (Diagnostic Overview Tool for JET), which provides at JET an overview of the current state of the diagnostic systems, in order to monitor JET diagnostics status remotely. The architecture of the system is formed by the data generator components, the data distribution system, an access control service, and the client applications. In the test-bed there are two data generators: the acquisition equipment associated with the reflectometer diagnostic, which generates data and status information, and the dotJET server, which centralizes access to the status information of JET diagnostics. The data distribution system has been implemented using a publish-subscribe technology that receives data from the data generators and redistributes them to client applications. Finally, for monitoring, a client application based on Java Web Start technology and a dotJET client application have been used. There are three interesting results from this project. The first is the analysis of different aspects (data formats, data frame rate, data resolution, etc.) related to remote real-time diagnostic monitoring oriented to long pulse experiments. The second is the definition and implementation of an architecture flexible enough to be applied to different types of data generated by other diagnostics and that fits remote access requirements; and the third is the achievement of a secure system, taking into account the internal networks and firewall arrangements at JET and securing the access of remote users. For this last issue, PAPI technology has been used, enabling access control based on user attributes, enabling mobile users to
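
    The data distribution layer described above follows a publish-subscribe pattern between data generators (the reflectometer acquisition system and the dotJET server) and remote clients. The minimal in-process sketch below illustrates that pattern only; it is not the JET implementation, and all names are hypothetical.

        # Minimal in-process publish/subscribe broker illustrating the data
        # distribution pattern described above (not the actual JET software).
        from collections import defaultdict

        class Broker:
            def __init__(self):
                self._subscribers = defaultdict(list)  # topic -> list of callbacks

            def subscribe(self, topic, callback):
                self._subscribers[topic].append(callback)

            def publish(self, topic, payload):
                for callback in self._subscribers[topic]:
                    callback(payload)

        broker = Broker()

        # A remote monitoring client subscribes to reflectometer status frames.
        broker.subscribe("reflectometer/status", lambda frame: print("client got:", frame))

        # A data generator (e.g. the acquisition system) publishes a status frame.
        broker.publish("reflectometer/status", {"pulse": 79530, "state": "ACQUIRING"})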

  16. Developing the science product algorithm testbed for Chinese next-generation geostationary meteorological satellites: Fengyun-4 series

    Science.gov (United States)

    Min, Min; Wu, Chunqiang; Li, Chuan; Liu, Hui; Xu, Na; Wu, Xiao; Chen, Lin; Wang, Fu; Sun, Fenglin; Qin, Danyu; Wang, Xi; Li, Bo; Zheng, Zhaojun; Cao, Guangzhen; Dong, Lixin

    2017-08-01

    Fengyun-4A (FY-4A), the first of the Chinese next-generation geostationary meteorological satellites, launched in 2016, offers several advances over the FY-2: more spectral bands, faster imaging, and infrared hyperspectral measurements. To support the major objective of developing the prototypes of FY-4 science algorithms, two science product algorithm testbeds for imagers and sounders have been developed by the scientists in the FY-4 Algorithm Working Group (AWG). Both testbeds, written in FORTRAN and C programming languages for Linux or UNIX systems, have been tested successfully by using Intel/g compilers. Some important FY-4 science products, including cloud mask, cloud properties, and temperature profiles, have been retrieved successfully through using a proxy imager, Himawari-8/Advanced Himawari Imager (AHI), and sounder data, obtained from the Atmospheric InfraRed Sounder, thus demonstrating their robustness. In addition, in early 2016 the FY-4 AWG developed, based on the imager testbed, a near real-time processing system for Himawari-8/AHI data for use by Chinese weather forecasters. Consequently, robust and flexible science product algorithm testbeds have provided essential and productive tools for popularizing FY-4 data and developing substantial improvements in FY-4 products.

  17. TESTING THE APODIZED PUPIL LYOT CORONAGRAPH ON THE LABORATORY FOR ADAPTIVE OPTICS EXTREME ADAPTIVE OPTICS TESTBED

    International Nuclear Information System (INIS)

    Thomas, Sandrine J.; Dillon, Daren; Gavel, Donald; Soummer, Remi; Macintosh, Bruce; Sivaramakrishnan, Anand

    2011-01-01

    We present testbed results of the Apodized Pupil Lyot Coronagraph (APLC) at the Laboratory for Adaptive Optics (LAO). These results are part of the validation and tests of the coronagraph and of the Extreme Adaptive Optics (ExAO) for the Gemini Planet Imager (GPI). The apodizer component is manufactured with a halftone technique using black chrome microdots on glass. Testing this APLC (like any other coronagraph) requires extremely good wavefront correction, which is obtained to the 1 nm rms level using microelectromechanical systems (MEMS) technology, on the ExAO visible testbed of the LAO at the University of California, Santa Cruz. We used an APLC coronagraph without central obstruction, both with a reference super-polished flat mirror and with the MEMS, to obtain one of the first images of a dark zone in a coronagraphic image with classical adaptive optics using a MEMS deformable mirror (without involving dark hole algorithms). This was done as a complementary test to the GPI coronagraph testbed at the American Museum of Natural History, which studied the coronagraph itself without wavefront correction. Because we needed a full aperture, the coronagraph design is very different from the GPI design. We also tested a coronagraph with central obstruction similar to that of GPI. We investigated the performance of the APLC coronagraph and more particularly the effect of the apodizer profile accuracy on the contrast. Finally, we compared the resulting contrast to predictions made with a wavefront propagation model of the testbed to understand the effects of phase and amplitude errors on the final contrast.

  18. Open-Source Based Testbed for Multioperator 4G/5G Infrastructure Sharing in Virtual Environments

    Directory of Open Access Journals (Sweden)

    Ricardo Marco Alaez

    2017-01-01

    Full Text Available Fourth-Generation (4G) mobile networks are based on Long-Term Evolution (LTE) technologies and are being deployed worldwide, while research on further evolution towards the Fifth Generation (5G) has been recently initiated. 5G will be featured with advanced network infrastructure sharing capabilities among different operators. Therefore, an open-source implementation of 4G/5G networks with this capability is crucial to enable early research in this area. The main contribution of this paper is the design and implementation of such a 4G/5G open-source testbed to investigate multioperator infrastructure sharing capabilities executed in virtual architectures. The proposed design and implementation enable the virtualization and sharing of some of the components of the LTE architecture. A testbed has been implemented, and intensive empirical experiments have been conducted to validate the suitability of virtualizing LTE components in virtual infrastructures (i.e., infrastructures with multitenancy sharing capabilities). The impact of the proposed technologies can lead to significant savings in both capital and operational costs for mobile telecommunication operators.

  19. An Approach for Smart Antenna Testbed

    Science.gov (United States)

    Kawitkar, R. S.; Wakde, D. G.

    2003-07-01

    The use of wireless, mobile, personal communications services is expanding rapidly. Adaptive or "smart" antenna arrays can increase channel capacity through spatial division. Adaptive antennas can also track mobile users, improving both signal range and quality. For these reasons, smart antenna systems have attracted widespread interest in the telecommunications industry for applications to third-generation wireless systems. This paper aims to design and develop an advanced antenna testbed to serve as a common reference for testing adaptive antenna arrays and signal combining algorithms, as well as complete systems. A flexible suite of offline processing software is written in MATLAB to perform system calibration, testbed initialization, data acquisition control, data storage/transfer, offline signal processing and analysis, and graph plotting. The goal of this paper is to develop low-complexity smart antenna structures for 3G systems. The emphasis is laid on ease of implementation in a multichannel/multi-user environment. A smart antenna testbed will be developed, and various state-of-the-art DSP structures and algorithms will be investigated. Facing the soaring demand for mobile communications, the use of smart antenna arrays in mobile communications systems to exploit spatial diversity and further improve spectral efficiency has recently received considerable attention. Basically, a smart antenna array comprises a number of antenna elements combined via a beamforming network (an amplitude and phase control network). Some of the benefits that can be achieved by using an SAS (Smart Antenna System) include lower mobile terminal power consumption, range extension, ISI reduction, higher data rate support, and ease of integration into the existing base station system. In terms of economic benefits, adaptive antenna systems employed at the base station, though they increase the per-base-station cost, can increase the coverage area of each cell site, thereby reducing
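
    The amplitude-and-phase combining performed by the beamforming network can be illustrated with a short numerical sketch. It assumes a narrowband signal and a uniform linear array with half-wavelength spacing; the element count and steering angle are arbitrary illustrative values.

        # Narrowband beamforming with a uniform linear array (ULA): the combining
        # network applies one complex weight (amplitude and phase) per element.
        # Half-wavelength spacing and all signal parameters here are assumptions.
        import numpy as np

        def steering_vector(n_elements, angle_deg, spacing_wavelengths=0.5):
            angle = np.deg2rad(angle_deg)
            k = np.arange(n_elements)
            return np.exp(2j * np.pi * spacing_wavelengths * k * np.sin(angle))

        n, target_angle = 8, 20.0                        # 8-element array, user at 20 degrees
        weights = steering_vector(n, target_angle) / n   # conventional beamformer

        # Array response over all directions: |w^H a(theta)|
        angles = np.linspace(-90, 90, 361)
        pattern = np.array([abs(np.vdot(weights, steering_vector(n, a))) for a in angles])

        idx_user = np.argmin(np.abs(angles - target_angle))
        idx_broadside = np.argmin(np.abs(angles - 0.0))
        print("gain toward user :", pattern[idx_user])
        print("gain at broadside:", pattern[idx_broadside])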

  20. Sensing across large-scale cognitive radio networks: Data processing, algorithms, and testbed for wireless tomography and moving target tracking

    Science.gov (United States)

    Bonior, Jason David

    As the use of wireless devices has become more widespread so has the potential for utilizing wireless networks for remote sensing applications. Regular wireless communication devices are not typically designed for remote sensing. Remote sensing techniques must be carefully tailored to the capabilities of these networks before they can be applied. Experimental verification of these techniques and algorithms requires robust yet flexible testbeds. In this dissertation, two experimental testbeds for the advancement of research into sensing across large-scale cognitive radio networks are presented. System architectures, implementations, capabilities, experimental verification, and performance are discussed. One testbed is designed for the collection of scattering data to be used in RF and wireless tomography research. This system is used to collect full complex scattering data using a vector network analyzer (VNA) and amplitude-only data using non-synchronous software-defined radios (SDRs). Collected data is used to experimentally validate a technique for phase reconstruction using semidefinite relaxation and demonstrate the feasibility of wireless tomography. The second testbed is a SDR network for the collection of experimental data. The development of tools for network maintenance and data collection is presented and discussed. A novel recursive weighted centroid algorithm for device-free target localization using the variance of received signal strength for wireless links is proposed. The signal variance resulting from a moving target is modeled as having contours related to Cassini ovals. This model is used to formulate recursive weights which reduce the influence of wireless links that are farther from the target location estimate. The algorithm and its implementation on this testbed are presented and experimental results discussed.
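
    A minimal sketch of a recursive weighted centroid estimate of the kind described is given below: each link contributes its midpoint weighted by its RSS variance, and the weights are recursively attenuated for links far from the current estimate. The exponential distance weighting is an illustrative assumption, not the dissertation's exact formula.

        # Recursive weighted centroid localization sketch. Each link is represented
        # by its two node positions and the variance of its received signal strength;
        # the target estimate is a variance-weighted centroid of link midpoints, and
        # the weights are recursively shrunk for links far from the current estimate.
        # The exponential distance weighting below is an illustrative choice only.
        import numpy as np

        def recursive_weighted_centroid(links, variances, n_iter=5, scale=3.0):
            links = np.asarray(links, dtype=float)        # shape (L, 2, 2): link endpoints
            variances = np.asarray(variances, dtype=float)
            midpoints = links.mean(axis=1)                # shape (L, 2)

            # Initial estimate: plain variance-weighted centroid.
            weights = variances / variances.sum()
            estimate = weights @ midpoints

            for _ in range(n_iter):
                dist = np.linalg.norm(midpoints - estimate, axis=1)
                weights = variances * np.exp(-dist / scale)   # damp far-away links
                weights /= weights.sum()
                estimate = weights @ midpoints
            return estimate

        # Toy deployment: four links around a target near (4, 5).
        links = [[(0, 0), (8, 10)], [(0, 10), (8, 0)], [(0, 5), (8, 5)], [(4, 0), (4, 10)]]
        variances = [2.0, 1.5, 3.0, 2.5]   # illustrative RSS variances per link
        print(recursive_weighted_centroid(links, variances))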

  1. Water Electrolysis for In-Situ Resource Utilization (ISRU)

    Science.gov (United States)

    Lee, Kristopher A.

    2016-01-01

    Sending humans to Mars for any significant amount of time will require capabilities and technologies that enable Earth independence. To move towards this independence, the resources found on Mars must be utilized to produce the items needed to sustain humans away from Earth. To accomplish this task, NASA is studying In Situ Resource Utilization (ISRU) systems and techniques to make use of the atmospheric carbon dioxide and the water found on Mars. Among other things, these substances can be harvested and processed to make oxygen and methane. Oxygen is essential, not only for sustaining the lives of the crew on Mars, but also as the oxidizer for an oxygen-methane propulsion system that could be utilized on a Mars ascent vehicle. Given the presence of water on Mars, the electrolysis of water is a common technique to produce the desired oxygen. Towards this goal, NASA designed and developed a Proton Exchange Membrane (PEM) water electrolysis system, which was originally slated to produce oxygen for propulsion and fuel cell use in the Mars Atmosphere and Regolith COllector/PrOcessor for Lander Operations (MARCO POLO) project. As part of the Human Exploration Spacecraft Testbed for Integration and Advancement (HESTIA) project, this same electrolysis system, originally targeted at enabling in situ propulsion and power, operated in a life-support scenario. During HESTIA testing at Johnson Space Center, the electrolysis system supplied oxygen to a chamber simulating a habitat housing four crewmembers. Inside the chamber, oxygen was removed from the atmosphere to simulate consumption by the crew, and the electrolysis system's oxygen was added to replenish it. The electrolysis system operated nominally throughout the duration of the HESTIA test campaign, and the oxygen levels in the life support chamber were maintained at the desired levels.

  2. Combining Space-Based and In-Situ Measurements to Track Flooding in Thailand

    Science.gov (United States)

    Chien, Steve; Doubleday, Joshua; Mclaren, David; Tran, Daniel; Tanpipat, Veerachai; Chitradon, Royal; Boonya-aaroonnet, Surajate; Thanapakpawin, Porranee; Khunboa, Chatchai; Leelapatra, Watis

    2011-01-01

    We describe efforts to integrate in-situ sensing, space-borne sensing, hydrological modeling, active control of sensing, and automatic data product generation to enhance monitoring and management of flooding. In our approach, broad coverage sensors and missions such as MODIS, TRMM, and weather satellite information and in-situ weather and river gauging information are all inputs to track flooding via river basin and sub-basin hydrological models. While these inputs can provide significant information as to the major flooding, targetable space measurements can provide better spatial resolution measurements of flooding extent. In order to leverage such assets we automatically task observations in response to automated analysis indications of major flooding. These new measurements are automatically processed and assimilated with the other flooding data. We describe our ongoing efforts to deploy this system to track major flooding events in Thailand.

  3. Demonstration of automated proximity and docking technologies

    Science.gov (United States)

    Anderson, Robert L.; Tsugawa, Roy K.; Bryan, Thomas C.

    An autodock was demonstrated using straightforward techniques and real sensor hardware. A simulation testbed was established and validated. The sensor design was refined with improved optical performance and image processing noise mitigation techniques, and the sensor is ready for production from off-the-shelf components. The autonomous spacecraft architecture is defined. The areas of sensors, docking hardware, propulsion, and avionics are included in the design. The Guidance Navigation and Control architecture and requirements are developed. Modular structures suitable for automated control are used. The spacecraft system manager functions including configuration, resource, and redundancy management are defined. The requirements for an autonomous spacecraft executive are defined. High level decisionmaking, mission planning, and mission contingency recovery are a part of this. The next step is to do flight demonstrations. After the presentation, the following question was asked: How do you define validation? There are two components to the definition of validation: software simulation with formal and rigorous validation, and hardware and facility performance validated with respect to software already validated against an analytical profile.

  4. AUTOMATING THE MEASUREMENT OF RED CORAL IN SITU USING UNDERWATER PHOTOGRAMMETRY AND CODED TARGETS

    Directory of Open Access Journals (Sweden)

    P. Drap

    2013-07-01

    Full Text Available A photogrammetry tool dedicated to the monitoring of red coral populations in situ has been developed by LSIS in Marseille (France). This tool is used to collect, in an efficient and precise manner, key data for the study of the population dynamics of red coral. In selected red coral populations, scuba-divers obtain a series of photographs from the permanent plots (about 2 m²) on an annual basis. To facilitate the photographic sampling and measurements, the scuba-divers use a 20 x 20 cm quadrat to cover the permanent plots. The analysis of the photographs provides reliable measurements of colony sizes (basal diameter and maximum height), the occurrence of breakage of colonies and the occurrence of necrosis. To minimize the divers' tasks during the acquisition phase, we opted for stereoscopic acquisition using a single device to easily adapt the measurement procedure to the scene configuration. The material is quite light, one camera and two electronic strobes, and the procedure is simple, with two photographs taken for each site. To facilitate the measurement phase of colony sizes, the exploitation of the photographs consists of four key steps: orientation, scaling, measurement of the characteristic points of coral colonies and result validation (checking measurement consistency to detect possible errors in measurement or interpretation). Since the context of the shooting can vary widely, dominant colors, contrast, etc. may often change. In order to have a stable and common reference in all photographs independently of the site, we decided to always include a quadrat in the scene, which is then used for the orientation and scaling phases. The automation of orientation and the lack of constraints to adapt the analytical technique to the features of each site offer the possibility to multiply field surveys and to measure hundreds of quadrats from several different populations in a very efficient manner. The measurement results are exported into a spreadsheet
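
    The scaling step lends itself to a small sketch: the 20 x 20 cm quadrat visible in every photograph supplies the factor that converts pixel distances into millimetres, which is then applied to the clicked colony points. All coordinates below are illustrative.

        # Scaling sketch: the 20 x 20 cm quadrat present in every photograph gives
        # the pixel-to-millimetre factor used to turn image measurements of coral
        # colonies into real sizes. All numeric values below are illustrative.
        import math

        def scale_mm_per_pixel(quadrat_corner_a, quadrat_corner_b, quadrat_side_mm=200.0):
            """Scale from one measured quadrat side (two corner pixels)."""
            dx = quadrat_corner_b[0] - quadrat_corner_a[0]
            dy = quadrat_corner_b[1] - quadrat_corner_a[1]
            return quadrat_side_mm / math.hypot(dx, dy)

        def colony_size_mm(point_a, point_b, mm_per_pixel):
            """Colony measurement (e.g. basal diameter or height) from two clicked points."""
            return math.hypot(point_b[0] - point_a[0], point_b[1] - point_a[1]) * mm_per_pixel

        scale = scale_mm_per_pixel((512, 300), (1312, 310))      # quadrat side ~800 px
        print(round(colony_size_mm((700, 420), (742, 455), scale), 1), "mm")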

  5. Complacency and Automation Bias in the Use of Imperfect Automation.

    Science.gov (United States)

    Wickens, Christopher D; Clegg, Benjamin A; Vieane, Alex Z; Sebok, Angelia L

    2015-08-01

    We examine the effects of two different kinds of decision-aiding automation errors on human-automation interaction (HAI), occurring at the first failure following repeated exposure to correctly functioning automation. The two errors are incorrect advice, triggering the automation bias, and missing advice, reflecting complacency. Contrasts between analogous automation errors in alerting systems, rather than decision aiding, have revealed that alerting false alarms are more problematic to HAI than alerting misses are. Prior research in decision aiding, although contrasting the two aiding errors (incorrect vs. missing), has confounded error expectancy. Participants performed an environmental process control simulation with and without decision aiding. For those with the aid, automation dependence was created through several trials of perfect aiding performance, and an unexpected automation error was then imposed in which automation was either gone (one group) or wrong (a second group). A control group received no automation support. The correct aid supported faster and more accurate diagnosis and lower workload. The aid failure degraded all three variables, but "automation wrong" had a much greater effect on accuracy, reflecting the automation bias, than did "automation gone," reflecting the impact of complacency. Some complacency was manifested for automation gone, by a longer latency and more modest reduction in accuracy. Automation wrong, creating the automation bias, appears to be a more problematic form of automation error than automation gone, reflecting complacency. Decision-aiding automation should indicate its lower degree of confidence in uncertain environments to avoid the automation bias. © 2015, Human Factors and Ergonomics Society.

  6. Automated microaxial tomography of cell nuclei after specific labelling by fluorescence in situ hybridisation

    Czech Academy of Sciences Publication Activity Database

    Kozubek, Michal; Skalníková, M.; Matula, Pe.; Bártová, Eva; Rauch, J.; Neuhaus, F.; Eipel, H.; Hasmann, M.

    2002-01-01

    Roč. 33, 7-8 (2002), s. 655-665 ISSN 0968-4328 Institutional research plan: CEZ:AV0Z5004920 Keywords : microaxial tomography * automated microscopy * high-resolution cytometry Subject RIV: BO - Biophysics Impact factor: 1.537, year: 2002

  7. Development and experimentation of an eye/brain/task testbed

    Science.gov (United States)

    Harrington, Nora; Villarreal, James

    1987-01-01

    The principal objective is to develop a laboratory testbed that will provide a unique capability to elicit, control, record, and analyze the relationship of operator task loading, operator eye movement, and operator brain wave data in a computer system environment. The ramifications of an integrated eye/brain monitor for the man-machine interface are staggering. The success of such a system would benefit users in space and defense, paraplegics, and the monitoring of boring screens (nuclear power plants, air defense, etc.).

  8. Camera calibration in a hazardous environment performed in situ with automated analysis and verification

    International Nuclear Information System (INIS)

    DePiero, F.W.; Kress, R.L.

    1993-01-01

    Camera calibration using the method of Two Planes is discussed. An implementation of the technique is described that may be performed in situ, e.g., in a hazardous or contaminated environment, thus eliminating the need for decontamination of camera systems before recalibration. Companion analysis techniques used for verifying the correctness of the calibration are presented.
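
    A minimal sketch of the Two Planes idea follows: an image-to-plane mapping is fitted independently on two calibration planes at known depths, and a pixel's viewing ray is the line joining its two back-projected points. The homography model fitted by direct linear transform is a stand-in assumption for whatever mapping the actual implementation uses.

        # Two Planes calibration sketch: fit an image->plane homography on each of
        # two calibration planes at known depths z1 and z2; the 3-D viewing ray of a
        # pixel is then the line through its back-projected points on the two planes.
        # The homography model and the helper names are illustrative simplifications.
        import numpy as np

        def fit_homography(img_pts, plane_pts):
            """Direct linear transform: image (u,v) -> plane (x,y); needs >= 4 points."""
            rows = []
            for (u, v), (x, y) in zip(img_pts, plane_pts):
                rows.append([u, v, 1, 0, 0, 0, -x * u, -x * v, -x])
                rows.append([0, 0, 0, u, v, 1, -y * u, -y * v, -y])
            _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
            return vt[-1].reshape(3, 3)

        def back_project(H, u, v):
            p = H @ np.array([u, v, 1.0])
            return p[:2] / p[2]

        def pixel_ray(H1, z1, H2, z2, u, v):
            """Return a point on, and the direction of, the viewing ray for pixel (u, v)."""
            p1 = np.array([*back_project(H1, u, v), z1])
            p2 = np.array([*back_project(H2, u, v), z2])
            return p1, p2 - p1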

  9. Genetic Algorithm Phase Retrieval for the Systematic Image-Based Optical Alignment Testbed

    Science.gov (United States)

    Taylor, Jaime; Rakoczy, John; Steincamp, James

    2003-01-01

    Phase retrieval requires calculation of the real-valued phase of the pupil function from the image intensity distribution and characteristics of an optical system. Genetic algorithms were used to solve two one-dimensional phase retrieval problems. A GA successfully estimated the coefficients of a polynomial expansion of the phase when the number of coefficients was correctly specified. A GA also successfully estimated the multiple phases of a segmented optical system analogous to the seven-mirror Systematic Image-Based Optical Alignment (SIBOA) testbed located at NASA's Marshall Space Flight Center. The SIBOA testbed was developed to investigate phase retrieval techniques. Tip/tilt and piston motions of the mirrors accomplish phase corrections. A constant phase over each mirror can be achieved by an independent tip/tilt correction: the phase correction term can then be factored out of the Discrete Fourier Transform (DFT), greatly reducing computations.
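
    In the spirit of the abstract, the sketch below runs a small genetic algorithm that estimates polynomial phase coefficients by matching the far-field intensity of a one-dimensional pupil to a measured image. The population size, operators and forward model are illustrative assumptions, not the paper's configuration.

        # Minimal GA sketch for one-dimensional phase retrieval: estimate polynomial
        # phase coefficients whose far-field intensity best matches a measured image.
        # Population size, operators and the forward model are illustrative choices.
        import numpy as np

        rng = np.random.default_rng(0)
        x = np.linspace(-1, 1, 128)
        pupil = np.ones_like(x)                        # uniform 1-D aperture

        def intensity(coeffs):
            phase = np.polyval(coeffs, x)
            field = pupil * np.exp(1j * phase)
            return np.abs(np.fft.fftshift(np.fft.fft(field, 512)))**2

        true_coeffs = np.array([1.5, -0.8, 0.4])       # "unknown" aberration (toy)
        measured = intensity(true_coeffs)

        def fitness(coeffs):
            return -np.sum((intensity(coeffs) - measured)**2)

        pop = rng.normal(0.0, 1.0, size=(60, 3))       # 60 candidates, 3 coefficients
        for generation in range(200):
            scores = np.array([fitness(c) for c in pop])
            parents = pop[np.argsort(scores)[-20:]]    # truncation selection: keep best 20
            children = []
            while len(children) < len(pop) - len(parents):
                a, b = parents[rng.integers(20, size=2)]
                mask = rng.random(3) < 0.5             # uniform crossover
                child = np.where(mask, a, b) + rng.normal(0, 0.05, 3)  # Gaussian mutation
                children.append(child)
            pop = np.vstack([parents, np.array(children)])

        best = pop[np.argmax([fitness(c) for c in pop])]
        print("estimated coefficients:", np.round(best, 2))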

  10. New Security Development and Trends to Secure the SCADA Sensors Automated Transmission during Critical Sessions

    Directory of Open Access Journals (Sweden)

    Aamir Shahzad

    2015-10-01

    Full Text Available Modern technology enhancements have been used worldwide to fulfill the requirements of the industrial sector, especially in supervisory control and data acquisition (SCADA) systems as a part of industrial control systems (ICS). SCADA systems have gained popularity in industrial automation due to technology enhancements and connectivity with modern computer networks and/or protocols. The procurement of new technologies has made SCADA systems important and helpful to processing in oil lines, water treatment plants, and electricity generation and control stations. On the other hand, these systems have vulnerabilities like other traditional computer networks (or systems), especially when interconnected with open platforms. Many international organizations and researchers have proposed and deployed solutions for SCADA security enhancement, but most of these have been based on node-to-node security, without emphasizing the critical sessions that are linked directly with industrial processing and automation. This study concerns SCADA security measures related to critical processing within specified sessions of automated polling, analyzing cryptographic mechanisms and deploying an appropriate, explicitly inclusive security solution in a Distributed Network Protocol version 3 (DNP3) stack, as part of a SCADA system. The bytes flow through the DNP3 stack with additional security bytes computed within the specified critical intervals defined for polling. We took critical processing knowledge into account when designing a SCADA/DNP3 testbed and deploying a cryptography solution that did not affect communications.
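
    The security bytes carried with the polled traffic can be illustrated generically by appending a keyed-hash message authentication code to each frame, as sketched below. This shows the idea only; it is neither the DNP3 Secure Authentication format nor the exact scheme evaluated in the study.

        # Sketch: appending an HMAC-SHA256 tag to each polled data frame so the
        # master can verify integrity and authenticity during critical polling
        # sessions. Frame layout and key handling here are illustrative only.
        import hmac, hashlib, os

        SHARED_KEY = os.urandom(32)   # pre-shared key between outstation and master

        def protect_frame(payload: bytes, key: bytes = SHARED_KEY) -> bytes:
            tag = hmac.new(key, payload, hashlib.sha256).digest()
            return payload + tag                      # frame = payload || 32-byte tag

        def verify_frame(frame: bytes, key: bytes = SHARED_KEY) -> bytes:
            payload, tag = frame[:-32], frame[-32:]
            expected = hmac.new(key, payload, hashlib.sha256).digest()
            if not hmac.compare_digest(tag, expected):
                raise ValueError("authentication failed: frame rejected")
            return payload

        # Outstation responds to a critical poll; master verifies before processing.
        frame = protect_frame(b"\x05\x64analog-input-readings")
        print(verify_frame(frame))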

  11. The University of Canberra quantum key distribution testbed

    International Nuclear Information System (INIS)

    Ganeshkumar, G.; Edwards, P.J.; Cheung, W.N.; Barbopoulos, L.O.; Pham, H.; Hazel, J.C.

    1999-01-01

    Full text: We describe the design, operation and preliminary results obtained from a quantum key distribution (QKD) testbed constructed at the University of Canberra. Quantum cryptographic systems use shared secret keys exchanged in the form of sequences of polarisation coded or phase encoded single photons transmitted over an optical communications channel. Secrecy of this quantum key rests upon fundamental laws of quantum physics: measurements of linear or circular photon polarisation states introduce noise into the conjugate variable and so reveal eavesdropping. In its initial realisation reported here, pulsed light from a 650 nm laser diode is attenuated by a factor of 10^6, plane-polarised and then transmitted through a birefringent liquid crystal modulator (LCM) to a polarisation sensitive single photon receiver. This transmitted key sequence consists of a 1 kHz train of weak coherent 100 ns wide light pulses, polarisation coded according to the BB84 protocol. Each pulse is randomly assigned one of four polarisation states (two orthogonal linear and two orthogonal circular) by a computer (PCA) operated by the sender ('Alice'). This quaternary polarisation shift keyed photon stream is detected by the receiver ('Bob'), whose computer (PCB) randomly chooses either a linear or a circular polarisation basis. Computer PCB is also used for final key selection, authentication, privacy amplification and eavesdropping detection. We briefly discuss the realisation of a mesoscopic single photon QKD source and the use of the testbed to simulate a global quantum key distribution system using earth satellites. Copyright (1999) Australian Optical Society
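
    The BB84 sifting step used by the testbed can be illustrated with a few lines of simulation: random bits and bases at the sender, random measurement bases at the receiver, and a key kept only where the bases agree. Channel loss, detector noise, authentication and privacy amplification are omitted.

        # Minimal BB84 sifting simulation in the spirit of the testbed: random bits
        # and bases at the sender, random measurement bases at the receiver, and the
        # sifted key kept only where the bases match. Loss, noise and post-processing
        # are omitted for clarity.
        import numpy as np

        rng = np.random.default_rng(1)
        n_pulses = 1000

        alice_bits = rng.integers(0, 2, n_pulses)
        alice_bases = rng.integers(0, 2, n_pulses)   # 0 = linear, 1 = circular
        bob_bases = rng.integers(0, 2, n_pulses)

        # Ideal channel: Bob reads Alice's bit when bases match, a random bit otherwise.
        match = alice_bases == bob_bases
        bob_bits = np.where(match, alice_bits, rng.integers(0, 2, n_pulses))

        sifted_alice = alice_bits[match]
        sifted_bob = bob_bits[match]
        print("sifted key length:", sifted_alice.size)                        # ~ n_pulses / 2
        print("errors in sifted key:", int(np.sum(sifted_alice != sifted_bob)))  # 0 here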

  12. Vacuum Nuller Testbed Performance, Characterization and Null Control

    Science.gov (United States)

    Lyon, R. G.; Clampin, M.; Petrone, P.; Mallik, U.; Madison, T.; Bolcar, M.; Noecker, C.; Kendrick, S.; Helmbrecht, M. A.

    2011-01-01

    The Visible Nulling Coronagraph (VNC) can detect and characterize exoplanets with filled, segmented and sparse aperture telescopes, thereby spanning the choice of future internal coronagraph exoplanet missions. NASA/Goddard Space Flight Center (GSFC) has developed a Vacuum Nuller Testbed (VNT) to advance this approach, and to assess and advance the technologies needed to realize a VNC as a flight instrument. The VNT is an ultra-stable testbed operating at 15 Hz in vacuum. It consists of a Mach-Zehnder nulling interferometer, modified with a "W" configuration to accommodate a hex-packed MEMS-based deformable mirror (DM), a coherent fiber bundle and achromatic phase shifters. The two output channels are imaged with a vacuum photon-counting camera and a conventional camera. Error sensing and feedback to the DM and delay line with control algorithms are implemented in a real-time architecture. The inherent advantage of the VNC is that it is its own interferometer and directly controls its errors by exploiting images from the bright and dark channels simultaneously. Conservation of energy requires the sum total of the photon counts be conserved independent of the VNC state. Thus the sensing and control bandwidth is limited by the target star's throughput, with the net effect that the higher bandwidth offloads stressing stability tolerances within the telescope. We report our recent progress with the VNT towards achieving an incremental sequence of contrast milestones of 10^8, 10^9 and 10^10, respectively, at inner working angles approaching 2λ/D. Discussed will be the optics, lab results, technologies, and null control. Shown will be evidence that the milestones have been achieved.

  13. Regulatory pathway analysis by high-throughput in situ hybridization.

    Directory of Open Access Journals (Sweden)

    Axel Visel

    2007-10-01

    Full Text Available Automated in situ hybridization enables the construction of comprehensive atlases of gene expression patterns in mammals. Such atlases can become Web-searchable digital expression maps of individual genes and thus offer an entryway to elucidate genetic interactions and signaling pathways. Towards this end, an atlas housing approximately 1,000 spatial gene expression patterns of the midgestation mouse embryo was generated. Patterns were textually annotated using a controlled vocabulary comprising >90 anatomical features. Hierarchical clustering of annotations was carried out using distance scores calculated from the similarity between pairs of patterns across all anatomical structures. This process ordered hundreds of complex expression patterns into a matrix that reflects the embryonic architecture and the relatedness of patterns of expression. Clustering yielded 12 distinct groups of expression patterns. Because of the similarity of expression patterns within a group, members of each group may be components of regulatory cascades. We focused on the group containing Pax6, an evolutionary conserved transcriptional master mediator of development. Seventeen of the 82 genes in this group showed a change of expression in the developing neocortex of Pax6-deficient embryos. Electromobility shift assays were used to test for the presence of Pax6-paired domain binding sites. This led to the identification of 12 genes not previously known as potential targets of Pax6 regulation. These findings suggest that cluster analysis of annotated gene expression patterns obtained by automated in situ hybridization is a novel approach for identifying components of signaling cascades.
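
    The clustering step can be sketched as follows: each gene's annotation is a binary vector over the anatomical vocabulary, pairwise distances between those vectors are computed, and hierarchical clustering cuts the resulting tree into groups. The Jaccard metric, the toy annotation matrix and the cut threshold below are assumptions; the paper's own distance score may differ.

        # Sketch of clustering annotated expression patterns: each gene is a binary
        # vector over >90 anatomical terms, genes are hierarchically clustered on a
        # pairwise distance, and the tree is cut into groups of similar patterns.
        # The Jaccard distance and the cut threshold are illustrative assumptions.
        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster
        from scipy.spatial.distance import pdist

        rng = np.random.default_rng(0)
        n_genes, n_terms = 40, 90
        annotations = rng.integers(0, 2, size=(n_genes, n_terms))   # toy annotation matrix

        distances = pdist(annotations.astype(bool), metric="jaccard")
        tree = linkage(distances, method="average")
        groups = fcluster(tree, t=0.7, criterion="distance")

        print("number of groups:", groups.max())
        for g in range(1, groups.max() + 1):
            members = np.flatnonzero(groups == g)
            print(f"group {g}: {members.size} genes")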

  14. The Objectives of NASA's Living with a Star Space Environment Testbed

    Science.gov (United States)

    Barth, Janet L.; LaBel, Kenneth A.; Brewer, Dana; Kauffman, Billy; Howard, Regan; Griffin, Geoff; Day, John H. (Technical Monitor)

    2001-01-01

    NASA is planning to fly a series of Space Environment Testbeds (SET) as part of the Living With A Star (LWS) Program. The goal of the testbeds is to improve and develop capabilities to mitigate and/or accommodate the effects of solar variability in spacecraft and avionics design and operation. This will be accomplished by performing technology validation in space to enable routine operations, characterize technology performance in space, and improve and develop models, guidelines and databases. The anticipated result of the LWS/SET program is improved spacecraft performance, design, and operation for survival of the radiation, spacecraft charging, meteoroid, orbital debris and thermosphere/ionosphere environments. The program calls for a series of NASA Research Announcements (NRAs) to be issued to solicit flight validation experiments, improvement in environment effects models and guidelines, and collateral environment measurements. The selected flight experiments may fly on the SET experiment carriers and flights of opportunity on other commercial and technology missions. This paper presents the status of the project so far, including a description of the types of experiments that are intended to fly on SET-1 and a description of the SET-1 carrier parameters.

  15. In Situ Correlated Molecular Imaging of Chemically Communicating Microbial Communities

    Energy Technology Data Exchange (ETDEWEB)

    Bohn, Paul W. [Univ. of Notre Dame, IN (United States); Shrout, J. D. [Univ. of Notre Dame, IN (United States); Sweedler, J. V. [Univ. of Illinois, Urbana-Champaign, IL (United States); Farrand, S. [Univ. of Illinois, Urbana-Champaign, IL (United States)

    2016-01-25

    This document constitutes the final technical report for DE-SC0006642, In Situ Correlated Molecular Imaging of Chemically Communicating Microbial Communities, a project carried out collaboratively by investigators at Notre Dame and UIUC. The work carried out under DOE support in this project produced advances in two areas: development of new highly sophisticated correlated imaging approaches and the application of these new tools to the growth and differentiation of microbial communities under a variety of environmental conditions. A significant effort involved the creation of technical enhancements and sampling approaches to allow us to advance heterocorrelated mass spectrometry imaging (MSI) and correlated Raman microscopy (CRM) from bacterial cultures and biofilms. We then exploited these measurement advances in heterocorrelated MS/CRM imaging to determine relationship of signaling molecules and excreted signaling molecules produced by P. aeruginosa to conditions relevant to the rhizosphere. In particular, we: (1) developed a laboratory testbed mimic for the rhizosphere to enable microbial growth on slides under controlled conditions; (2) integrated specific measurements of (a) rhamnolipids, (b) quinolone/quinolones, and (c) phenazines specific to P. aeruginosa; and (3) utilized the imaging tools to probe how messenger secretion, quorum sensing and swarming behavior are correlated with behavior.

  16. The CMS Integration Grid Testbed

    CERN Document Server

    Graham, G E; Aziz, Shafqat; Bauerdick, L.A.T.; Ernst, Michael; Kaiser, Joseph; Ratnikova, Natalia; Wenzel, Hans; Wu, Yu-jun; Aslakson, Erik; Bunn, Julian; Iqbal, Saima; Legrand, Iosif; Newman, Harvey; Singh, Suresh; Steenberg, Conrad; Branson, James; Fisk, Ian; Letts, James; Arbree, Adam; Avery, Paul; Bourilkov, Dimitri; Cavanaugh, Richard; Rodriguez, Jorge Luis; Kategari, Suchindra; Couvares, Peter; DeSmet, Alan; Livny, Miron; Roy, Alain; Tannenbaum, Todd; Graham, Gregory E.; Aziz, Shafqat; Ernst, Michael; Kaiser, Joseph; Ratnikova, Natalia; Wenzel, Hans; Wu, Yujun; Aslakson, Erik; Bunn, Julian; Iqbal, Saima; Legrand, Iosif; Newman, Harvey; Singh, Suresh; Steenberg, Conrad; Branson, James; Fisk, Ian; Letts, James; Arbree, Adam; Avery, Paul; Bourilkov, Dimitri; Cavanaugh, Richard; Rodriguez, Jorge; Kategari, Suchindra; Couvares, Peter; Smet, Alan De; Livny, Miron; Roy, Alain; Tannenbaum, Todd

    2003-01-01

    The CMS Integration Grid Testbed (IGT) comprises USCMS Tier-1 and Tier-2 hardware at the following sites: the California Institute of Technology, Fermi National Accelerator Laboratory, the University of California at San Diego, and the University of Florida at Gainesville. The IGT runs jobs using the Globus Toolkit with a DAGMan and Condor-G front end. The virtual organization (VO) is managed using VO management scripts from the European Data Grid (EDG). Gridwide monitoring is accomplished using local tools such as Ganglia interfaced into the Globus Metadata Directory Service (MDS) and the agent based Mona Lisa. Domain specific software is packaged and installed using the Distribution After Release (DAR) tool of CMS, while middleware under the auspices of the Virtual Data Toolkit (VDT) is distributed using Pacman. During a continuous two month span in Fall of 2002, over 1 million official CMS GEANT based Monte Carlo events were generated and returned to CERN for analysis while being demonstrated at SC2002. ...

  17. Cooperating expert systems for Space Station - Power/thermal subsystem testbeds

    Science.gov (United States)

    Wong, Carla M.; Weeks, David J.; Sundberg, Gale R.; Healey, Kathleen L.; Dominick, Jeffrey S.

    1988-01-01

    The Systems Autonomy Demonstration Project (SADP) is a NASA-sponsored series of increasingly complex demonstrations to show the benefits of integrating knowledge-based systems with conventional process control in real-time, real-world problem domains that can facilitate the operations and availability of major Space Station distributed systems. This paper describes the system design, objectives, approaches, and status of each of the testbed knowledge-based systems. Simplified schematics of the systems are shown.

  18. Development of Research Reactor Simulator and Its Application to Dynamic Test-bed

    International Nuclear Information System (INIS)

    Kwon, Kee Choon; Park, Jae Chang; Lee, Seung Wook; Bang, Dane; Bae, Sung Won

    2014-01-01

    We developed HANARO and the Jordan Research and Training Reactor (JRTR) real-time simulators for operating staff training. The main purpose of this simulator is operator training, but we modified this simulator as a dynamic test-bed to test the reactor regulating system in HANARO or JRTR before installation. The simulator configuration is divided into hardware and software. The simulator hardware consists of a host computer, 6 operator stations, a network switch, and a large display panel. The simulator software is divided into three major parts: a mathematical modeling module, which executes the plant dynamic modeling program in real-time, an instructor station module that manages user instructions, and a human machine interface (HMI) module. The developed research reactor simulators are installed in the Korea Atomic Energy Research Institute nuclear training center for reactor operator training. To use the simulator as a dynamic test-bed, the reactor regulating system modeling software of the simulator was replaced by a hardware controller, and the simulator and target controller were interfaced with a hard-wired and network-based interface.

  19. Development of Research Reactor Simulator and Its Application to Dynamic Test-bed

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, Kee Choon; Park, Jae Chang; Lee, Seung Wook; Bang, Dane; Bae, Sung Won [KAERI, Daejeon (Korea, Republic of)

    2014-08-15

    We developed HANARO and the Jordan Research and Training Reactor (JRTR) real-time simulators for operating staff training. The main purpose of this simulator is operator training, but we modified this simulator as a dynamic test-bed to test the reactor regulating system in HANARO or JRTR before installation. The simulator configuration is divided into hardware and software. The simulator hardware consists of a host computer, 6 operator stations, a network switch, and a large display panel. The simulator software is divided into three major parts: a mathematical modeling module, which executes the plant dynamic modeling program in real-time, an instructor station module that manages user instructions, and a human machine interface (HMI) module. The developed research reactor simulators are installed in the Korea Atomic Energy Research Institute nuclear training center for reactor operator training. To use the simulator as a dynamic test-bed, the reactor regulating system modeling software of the simulator was replaced by a hardware controller, and the simulator and target controller were interfaced with a hard-wired and network-based interface.

  20. Implementation of Motion Simulation Software and Visual-Auditory Electronics for Use in a Low Gravity Robotic Testbed

    Science.gov (United States)

    Martin, William Campbell

    2011-01-01

    The Jet Propulsion Laboratory (JPL) is developing the All-Terrain Hex-Limbed Extra-Terrestrial Explorer (ATHLETE) to assist in manned space missions. One of the proposed targets for this robotic vehicle is a near-Earth asteroid (NEA), which typically exhibits a surface gravity of only a few micro-g. In order to properly test ATHLETE in such an environment, the development team has constructed an inverted Stewart platform testbed that acts as a robotic motion simulator. This project focused on creating physical simulation software that is able to predict how ATHLETE will function on and around a NEA. The corresponding platform configurations are calculated and then passed to the testbed to control ATHLETE's motion. In addition, imitation attitude control thrusters were designed and fabricated for use on ATHLETE. These utilize a combination of high power LEDs and audio amplifiers to provide visual and auditory cues that correspond to the physics simulation.

  1. The Living With a Star Space Environment Testbed Experiments

    Science.gov (United States)

    Xapsos, Michael A.

    2014-01-01

    The focus of the Living With a Star (LWS) Space Environment Testbed (SET) program is to improve the performance of hardware in the space radiation environment. The program has developed a payload for the Air Force Research Laboratory (AFRL) Demonstration and Science Experiments (DSX) spacecraft that is scheduled for launch in August 2015 on the SpaceX Falcon Heavy rocket. The primary structure of DSX is an Evolved Expendable Launch Vehicle (EELV) Secondary Payload Adapter (ESPA) ring. DSX will be in a Medium Earth Orbit (MEO). This oral presentation will describe the SET payload.

  2. A technical description of the FlexHouse Project Testbed

    DEFF Research Database (Denmark)

    Sørensen, Jens Otto

    2000-01-01

    This paper describes the FlexHouse project testbed; a server dedicated to experiments within the FlexHouse project. The FlexHouse project is a project originating from The Business Computing Research Group at The Aarhus School of Business. The purpose of the project is to identify and develop...... methods that satisfy the following three requirements. Flexibility with respect to evolving data sources. Flexibility with respect to change of information needs. Efficiency with respect to view management....

  3. Analysis, Modeling, and Simulation (AMS) Testbed Development and Evaluation to Support Dynamic Mobility Applications (DMA) and Active Transportation and Demand Management (ATDM) Programs : Evaluation Report for the San Diego Testbed : Draft Report.

    Science.gov (United States)

    2017-07-01

    The primary objective of this project is to develop multiple simulation testbeds and transportation models to evaluate the impacts of Connected Vehicle Dynamic Mobility Applications (DMA) and Active Transportation and Demand Management (ATDM) strateg...

  4. EPIC: A Testbed for Scientifically Rigorous Cyber-Physical Security Experimentation

    OpenAIRE

    SIATERLIS CHRISTOS; GENGE BELA; HOHENADEL MARC

    2013-01-01

    Recent malware, like Stuxnet and Flame, constitute a major threat to Networked Critical Infrastructures (NCIs), e.g., power plants. They revealed several vulnerabilities in today's NCIs, but most importantly they highlighted the lack of an efficient scientific approach to conduct experiments that measure the impact of cyber threats on both the physical and the cyber parts of NCIs. In this paper we present EPIC, a novel cyber-physical testbed and a modern scientific instrument that can pr...

  5. Establishment of a sensor testbed at NIST for plant productivity monitoring

    Science.gov (United States)

    Allen, D. W.; Hutyra, L.; Reinmann, A.; Trlica, A.; Marrs, J.; Jones, T.; Whetstone, J. R.; Logan, B.; Reblin, J.

    2017-12-01

    Accurate assessments of biogenic carbon fluxes are challenging. Correlating optical signatures to plant activity allows for monitoring large regions. New methods, including solar-induced fluorescence (SIF), promise to provide more timely and accurate estimates of plant activity, but we are still developing a full understanding of the mechanistic linkage between plant assimilation of carbon and SIF. We have initiated a testbed to facilitate the evaluation of sensors and methods for remote monitoring of plant activity at the NIST headquarters. The testbed utilizes a forested area of mature trees in a mixed urban environment. A 1 hectare plot within the 26 hectare forest has been instrumented for ecophysiological measurements, with an edge (100 m long) that is persistently monitored with multimodal optical sensors (SIF spectrometers, hyperspectral imagers, thermal infrared imaging, and lidar). This biological testbed has the advantage of direct access to the national measurement scales maintained by NIST for both the physical and optical quantities of interest. We offer a description of the test site, the sensors, and preliminary results from the first season of observations for ecological, physiological, and remote-sensing-based estimates of ecosystem productivity.

  6. Testbed diversity as a fundamental principle for effective ICS security research

    OpenAIRE

    Green, Benjamin; Frey, Sylvain Andre Francis; Rashid, Awais; Hutchison, David

    2016-01-01

    The implementation of diversity in testbeds is essential to understanding and improving the security and resilience of Industrial Control Systems (ICS). Employing a wide spectrum of equipment, diverse networks, and business processes, as deployed in real-life infrastructures, is particularly difficult in experimental conditions. However, this level of diversity is key from a security perspective, as attackers can exploit system particularities and process intricacies to their advantage....

  7. Adaptive Coding and Modulation Experiment With NASA's Space Communication and Navigation Testbed

    Science.gov (United States)

    Downey, Joseph; Mortensen, Dale; Evans, Michael; Briones, Janette; Tollis, Nicholas

    2016-01-01

    National Aeronautics and Space Administration (NASA)'s Space Communication and Navigation Testbed is an advanced integrated communication payload on the International Space Station. This paper presents results from an adaptive coding and modulation (ACM) experiment over S-band using a direct-to-earth link between the SCaN Testbed and the Glenn Research Center. The testing leverages the established Digital Video Broadcasting Second Generation (DVB-S2) standard to provide various modulation and coding options, and uses the Space Data Link Protocol (Consultative Committee for Space Data Systems (CCSDS) standard) for the uplink and downlink data framing. The experiment was conducted in a challenging environment due to the multipath and shadowing caused by the International Space Station structure. Several approaches for improving the ACM system are presented, including predictive and learning techniques to accommodate signal fades. Performance of the system is evaluated as a function of end-to-end system latency (round-trip delay), and compared to the capacity of the link. Finally, improvements over standard NASA waveforms are presented.
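
    The core of an ACM loop is the mapping from estimated link SNR to the most efficient modulation-and-coding pair that still closes the link. The sketch below uses a small, illustrative subset of DVB-S2-style modcods, thresholds and margin; it does not reproduce the tables or margins used on the SCaN Testbed.

        # ACM sketch: pick the most efficient DVB-S2-style modcod whose required SNR
        # (plus a hysteresis margin) is met by the current estimate. The modcod list,
        # thresholds and margin below are illustrative, not the SCaN Testbed tables.

        MODCODS = [  # (name, required Es/N0 in dB, spectral efficiency in bit/s/Hz)
            ("QPSK 1/2",    1.0, 0.99),
            ("QPSK 3/4",    4.0, 1.49),
            ("8PSK 2/3",    6.6, 1.98),
            ("8PSK 5/6",    9.4, 2.48),
            ("16APSK 3/4", 10.2, 2.97),
        ]

        def select_modcod(snr_db, margin_db=1.0):
            """Return the highest-efficiency modcod supported at this SNR estimate."""
            feasible = [m for m in MODCODS if snr_db >= m[1] + margin_db]
            return max(feasible, key=lambda m: m[2]) if feasible else MODCODS[0]

        # As the link fades behind ISS structure, the loop steps the waveform down,
        # then back up as the link recovers.
        for snr in [11.5, 8.0, 3.5, 7.2, 12.0]:
            name, _, eff = select_modcod(snr)
            print(f"SNR {snr:5.1f} dB -> {name:11s} ({eff:.2f} bit/s/Hz)")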

  8. First light of an external occulter testbed at flight Fresnel numbers

    Science.gov (United States)

    Kim, Yunjong; Sirbu, Dan; Hu, Mia; Kasdin, Jeremy; Vanderbei, Robert J.; Harness, Anthony; Shaklan, Stuart

    2017-01-01

    Many approaches have been suggested over the last couple of decades for imaging Earth-like planets. One of the main candidates for creating high contrast for future Earth-like planet detection is an external occulter. The external occulter is a spacecraft flown along the line-of-sight of a space telescope to suppress starlight and enable high-contrast direct imaging of exoplanets. The occulter is typically tens of meters in diameter and the separation from the telescope is of the order of tens of thousands of kilometers. Optical testing of a full-scale external occulter on the ground is impossible because of the long separations. Therefore, laboratory verification of occulter designs is necessary to validate the optical models used to design and predict occulter performance. At Princeton, we have designed and built a testbed that allows verification of scaled occulter designs whose suppressed shadow is mathematically identical to that of space occulters. The goal of this experiment is to demonstrate a pupil plane suppression of better than 1e-9 with a corresponding image plane contrast of better than 1e-11. The occulter testbed uses a 77.2 m optical propagation distance to realize the flight Fresnel number of 14.5. The scaled mask is placed at 27.2 m from the artificial source and the camera is located 50.0 m from the scaled mask. We will use an etched silicon mask, manufactured by the Microdevices Lab (MDL) of the Jet Propulsion Laboratory (JPL), as the occulter. Based on conversations with MDL, we expect that a 0.5 μm feature size is an achievable resolution in the mask manufacturing process and is therefore likely the indicator of the best possible performance. The occulter is illuminated by a diverging laser beam to reduce the aberrations from the optics before the occulter. Here, we present the first-light results of a sample design operating at a flight Fresnel number and the experimental setup of the testbed. We compare the experimental results with simulations.
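
    The scaling that makes a 77.2 m laboratory path equivalent to a flight configuration follows from holding the Fresnel number constant, with the reduced distance z_eff = z1*z2/(z1+z2) accounting for the diverging illumination. The sketch below evaluates that relation; the 633 nm wavelength is an assumption, since the abstract does not state the laser line.

        # Fresnel-number scaling sketch: the lab occulter reproduces the flight
        # shadow when a^2 / (lambda * z_eff) matches the flight value, where z_eff
        # is the reduced source-occulter / occulter-camera distance. The 633 nm
        # wavelength is an assumption; the abstract does not state the laser line.
        z1 = 27.2            # source -> occulter distance [m]
        z2 = 50.0            # occulter -> camera distance [m]
        wavelength = 633e-9  # assumed HeNe-like laser wavelength [m]
        flight_fresnel_number = 14.5

        z_eff = z1 * z2 / (z1 + z2)                       # ~17.6 m reduced distance
        radius = (flight_fresnel_number * wavelength * z_eff) ** 0.5

        print(f"reduced distance z_eff = {z_eff:.2f} m")
        print(f"scaled occulter radius = {radius*1e3:.1f} mm "
              f"(diameter ~{2*radius*1e3:.0f} mm)")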

  9. Automated translating beam profiler for in situ laser beam spot-size and focal position measurements

    Science.gov (United States)

    Keaveney, James

    2018-03-01

    We present a simple and convenient, high-resolution solution for automated laser-beam profiling with axial translation. The device is based on a Raspberry Pi computer, Pi Noir CMOS camera, stepper motor, and commercial translation stage. We also provide software to run the device. The CMOS sensor is sensitive over a large wavelength range between 300 and 1100 nm and can be translated over 25 mm along the beam axis. The sensor head can be reversed without changing its axial position, allowing for a quantitative estimate of beam overlap with counter-propagating laser beams. Although not limited to this application, the intended use for this device is the automated measurement of the focal position and spot-size of a Gaussian laser beam. We present example data of one such measurement to illustrate device performance.
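
    A typical way to extract the focal position and spot size from such an axial scan is to fit the measured 1/e^2 radii to the Gaussian beam propagation law w(z) = w0*sqrt(1 + ((z - z0)/zR)^2). The sketch below does this with a least-squares fit; the wavelength and the scan data are fabricated for illustration.

        # Fitting an axial scan of measured 1/e^2 beam radii to the Gaussian beam
        # propagation law to recover the waist w0 and focal position z0. The data
        # below are fabricated for illustration; the device's own output would be used.
        import numpy as np
        from scipy.optimize import curve_fit

        WAVELENGTH = 780e-9  # assumed wavelength [m]

        def beam_radius(z, w0, z0):
            zr = np.pi * w0**2 / WAVELENGTH          # Rayleigh range
            return w0 * np.sqrt(1.0 + ((z - z0) / zr)**2)

        # z positions along the 25 mm travel [m] and measured radii [m] (toy data).
        z = np.linspace(0.0, 25e-3, 11)
        noise = np.random.default_rng(2).normal(0, 1e-6, z.size)
        w_measured = beam_radius(z, 50e-6, 12e-3) + noise

        (w0_fit, z0_fit), _ = curve_fit(beam_radius, z, w_measured, p0=(40e-6, 10e-3))
        print(f"waist  w0 = {w0_fit*1e6:.1f} um")
        print(f"focus  z0 = {z0_fit*1e3:.2f} mm along the translation axis")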

  10. Janus: Graphical Software for Analyzing In-Situ Measurements of Solar-Wind Ions

    Science.gov (United States)

    Maruca, B.; Stevens, M. L.; Kasper, J. C.; Korreck, K. E.

    2016-12-01

    In-situ observations of solar-wind ions provide tremendous insights into the physics of space plasmas. Instruments on spacecraft measure distributions of ion energies, which can be processed into scientifically useful data (e.g., values for ion densities and temperatures). This analysis requires a strong, technical understanding of the instrument, so it has traditionally been carried out by the instrument teams using automated software that they had developed for that purpose. The automated routines are optimized for typical solar-wind conditions, so they can fail to capture the complex (and scientifically interesting) microphysics of transient solar-wind structures, such as coronal mass ejections (CMEs) and co-rotating interaction regions (CIRs), which are often better analyzed manually. This presentation reports on the ongoing development of Janus, a new software package for processing in-situ measurements of solar-wind ions. Janus will provide users with an easy-to-use graphical user interface (GUI) for carrying out highly customized analyses. Transparent to the user, Janus will automatically handle the most technical tasks (e.g., the retrieval and calibration of measurements). For the first time, users with only limited knowledge about the instruments (e.g., non-instrumentalists and students) will be able to easily process measurements of solar-wind ions. Version 1 of Janus focuses specifically on such measurements from the Wind spacecraft's Faraday Cups and is slated for public release in time for this presentation.

  11. The CMS integration grid testbed

    Energy Technology Data Exchange (ETDEWEB)

    Graham, Gregory E.

    2004-08-26

    The CMS Integration Grid Testbed (IGT) comprises USCMS Tier-1 and Tier-2 hardware at the following sites: the California Institute of Technology, Fermi National Accelerator Laboratory, the University of California at San Diego, and the University of Florida at Gainesville. The IGT runs jobs using the Globus Toolkit with a DAGMan and Condor-G front end. The virtual organization (VO) is managed using VO management scripts from the European Data Grid (EDG). Gridwide monitoring is accomplished using local tools such as Ganglia interfaced into the Globus Metadata Directory Service (MDS) and the agent based Mona Lisa. Domain specific software is packaged and installed using the Distribution After Release (DAR) tool of CMS, while middleware under the auspices of the Virtual Data Toolkit (VDT) is distributed using Pacman. During a continuous two month span in Fall of 2002, over 1 million official CMS GEANT based Monte Carlo events were generated and returned to CERN for analysis while being demonstrated at SC2002. In this paper, we describe the process that led to one of the world's first continuously available, functioning grids.

  12. Real-Time Emulation of Heterogeneous Wireless Networks with End-to-Edge Quality of Service Guarantees: The AROMA Testbed

    Directory of Open Access Journals (Sweden)

    Anna Umbert

    2010-01-01

    Full Text Available This work presents and describes the real-time testbed for all-IP Beyond 3G (B3G) heterogeneous wireless networks that has been developed in the framework of the European IST AROMA project. The main objective of the AROMA testbed is to provide a highly accurate and realistic framework where the performance of algorithms, policies, protocols, services, and applications for a complete heterogeneous wireless network can be fully assessed and evaluated before bringing them to a real system. The complexity of the interaction between all-IP B3G systems and user applications, while dealing with the Quality of Service (QoS) concept, motivates the development of this kind of emulation platform, where different solutions can be tested in realistic conditions that could not be achieved by means of simple offline simulations. This work provides an in-depth description of the AROMA testbed, emphasizing many interesting implementation details and lessons learned during the development of the tool that may prove helpful to other researchers and system engineers in the development of similar emulation platforms. Several case studies are also presented in order to illustrate the full potential and capabilities of the presented emulation platform.

  13. In situ beamline analysis and correction of active optics.

    Science.gov (United States)

    Sutter, John; Alcock, Simon; Sawhney, Kawal

    2012-11-01

    At the Diamond Light Source, pencil-beam measurements have enabled long-wavelength slope errors on X-ray mirror surfaces to be examined under ultra-high vacuum and beamline mounting without the need to remove the mirror from the beamline. For an active mirror an automated procedure has been implemented to calculate the actuator settings that optimize its figure. More recently, this in situ pencil-beam method has been applied to additional uses for which ex situ measurements would be inconvenient or simply impossible. First, it has been used to check the stability of the slope errors of several bimorph mirrors at intervals of several weeks or months. Then, it also proved useful for the adjustment of bender and sag compensation actuators on mechanically bent mirrors. Fits to the bending of ideal beams have been performed on the slope errors of a mechanically bent mirror in order to distinguish curvatures introduced by the bending actuators from gravitational distortion. Application of the optimization procedure to another mechanically bent mirror led to an improvement of its sag compensation mechanism.
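    The automated figure-optimization step described above reduces to a linear least-squares problem once per-actuator influence functions have been measured. The following is a minimal sketch of that step, assuming measured slope errors and influence functions are available; all names are hypothetical and the paper's actual procedure (e.g., its handling of constraints and gravitational sag) may differ.

```python
import numpy as np

def optimize_actuators(slope_error, influence, max_volts=1000.0):
    """Least-squares actuator increments that best cancel measured slope errors.

    slope_error : (n_points,) residual slope error measured along the mirror
    influence   : (n_points, n_actuators) slope response per unit actuator drive
    Returns the drive increments to apply, clipped to the allowed range.
    """
    dv, *_ = np.linalg.lstsq(influence, -slope_error, rcond=None)
    return np.clip(dv, -max_volts, max_volts)

# Toy example: 3 actuators, 50 pencil-beam measurement points
rng = np.random.default_rng(0)
influence = rng.normal(size=(50, 3))
true_dv = np.array([120.0, -40.0, 75.0])
slope_error = influence @ true_dv + rng.normal(scale=0.01, size=50)
print(optimize_actuators(slope_error, influence))  # ~[-120, 40, -75]
```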

  14. Creative thinking of design and redesign on SEAT aircraft cabin testbed: a case study

    NARCIS (Netherlands)

    Tan, C.F.; Chen, W.; Rauterberg, G.W.M.

    2009-01-01

    In this paper, the intuition approach to the design and redesign of the environmentally friendly, innovative aircraft cabin simulator is presented. The aircraft cabin simulator is a testbed used for the European project SEAT (Smart tEchnologies for Stress free Air Travel). The SEAT project aims to

  15. Static and dynamic optimization of CAPE problems using a Model Testbed

    DEFF Research Database (Denmark)

    This paper presents a new computer-aided tool for setting up and solving CAPE-related static and dynamic optimisation problems. The Model Testbed (MOT) offers an integrated environment for setting up and solving a very large range of CAPE problems, including complex optimisation problems...... and dynamic optimisation, and how interfacing of solvers and seamless information flow can lead to more efficient solution of process design problems....

  16. Analysis, modeling, and simulation (AMS) testbed development and evaluation to support dynamic mobility applications (DMA) and active transportation and demand management (ATDM) programs — evaluation report for ATDM program. [supporting datasets - Pasadena Testbed

    Science.gov (United States)

    2017-07-26

    This zip file contains POSTDATA.ATT (.ATT); Print to File (.PRN); Portable Document Format (.PDF); and document (.DOCX) files of data to support FHWA-JPO-16-385, Analysis, modeling, and simulation (AMS) testbed development and evaluation to support d...

  17. Evaluasi Kinerja Layanan IPTV pada Jaringan Testbed WiMAX Berbasis Standar IEEE 802.16-2004

    Directory of Open Access Journals (Sweden)

    Prasetiyono Hari Mukti

    2015-09-01

    Full Text Available In this paper, a performance evaluation of IPTV services over a WiMAX testbed based on the IEEE 802.16-2004 standard is described. The performance of the proposed system is evaluated in terms of delay, jitter, throughput, and packet loss. Service performance evaluations are conducted on a point-to-point network topology under varying background traffic with different scheduling types. Background traffic is injected into the system to emulate varying traffic loads on the proposed system. The scheduling types used in this paper are Best Effort (BE), Non-Real-Time Polling Service (nrtPS), Real-Time Polling Service (rtPS), and Unsolicited Grant Service (UGS). The experimental results for IPTV service performance over the testbed network show that the maximum averages of delay, jitter, throughput, and packet loss are 16.581 ms, 58.515 ms, 0.67 Mbps, and 10.96%, respectively.
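    For orientation, the sketch below computes the four reported metrics from a simple packet trace; the trace values, function name, and jitter definition (mean inter-packet delay variation) are illustrative assumptions, since the abstract does not specify the measurement tooling used on the testbed.

```python
import numpy as np

def qos_metrics(sent_t, recv_t, pkt_bytes):
    """Average delay (s), mean inter-packet delay variation (s),
    throughput (Mbit/s) and packet-loss ratio from a packet trace.
    recv_t holds NaN for packets that never arrived."""
    ok = ~np.isnan(recv_t)
    delay = recv_t[ok] - sent_t[ok]
    jitter = np.abs(np.diff(delay)).mean() if delay.size > 1 else 0.0
    duration = recv_t[ok].max() - sent_t[ok].min()
    throughput = pkt_bytes[ok].sum() * 8 / duration / 1e6
    loss = 1.0 - ok.mean()
    return delay.mean(), jitter, throughput, loss

# Tiny made-up trace: four 1500-byte packets, one of them lost.
sent = np.array([0.00, 0.02, 0.04, 0.06])
recv = np.array([0.012, 0.035, np.nan, 0.081])
print(qos_metrics(sent, recv, np.full(4, 1500)))
```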

  18. Design and Development of a 200-kW Turbo-Electric Distributed Propulsion Testbed

    Science.gov (United States)

    Papathakis, Kurt V.; Kloesel, Kurt J.; Lin, Yohan; Clarke, Sean; Ediger, Jacob J.; Ginn, Starr

    2016-01-01

    The National Aeronautics and Space Administration (NASA) Armstrong Flight Research Center (AFRC) (Edwards, California) is developing a Hybrid-Electric Integrated Systems Testbed (HEIST) as part of the HEIST Project, to study power management and transition complexities, modular architectures, and flight control laws for turbo-electric distributed propulsion technologies using representative hardware and piloted simulations. Capabilities are being developed to assess the flight readiness of hybrid electric and distributed electric vehicle architectures. Additionally, NASA will leverage experience gained and assets developed from HEIST to assist in flight-test proposal development, flight-test vehicle design, and evaluation of hybrid electric and distributed electric concept vehicles for flight safety. The HEIST test equipment will include three trailers supporting a distributed electric propulsion wing, a battery system and turbogenerator, dynamometers, and supporting power and communication infrastructure, all connected to the AFRC Core simulation. Plans call for 18 high performance electric motors that will be powered by batteries and the turbogenerator, and commanded by a piloted simulation. Flight control algorithms will be developed on the turbo-electric distributed propulsion system.

  19. Development of a Remotely Operated Vehicle Test-bed

    Directory of Open Access Journals (Sweden)

    Biao WANG

    2013-06-01

    Full Text Available This paper presents the development of a remotely operated vehicle (ROV), designed to serve as a convenient, cost-effective platform for research and experimental validation of hardware, sensors and control algorithms. Both the mechanical and the control system design are introduced. The vehicle, 0.65 m long and 0.45 m wide, has been designed with a frame structure that allows modification of the mounted devices and thruster allocation. For the control system, STM32-based MCU boards specially designed for this project are used as the core processing boards, and open-source, modular, flexible software has been developed. Experimental results demonstrate the effectiveness of the test-bed.

  20. Automated assessment of cognitive health using smart home technologies.

    Science.gov (United States)

    Dawadi, Prafulla N; Cook, Diane J; Schmitter-Edgecombe, Maureen; Parsey, Carolyn

    2013-01-01

    The goal of this work is to develop intelligent systems to monitor the wellbeing of individuals in their home environments. This paper introduces a machine learning-based method to automatically predict activity quality in smart homes and to assess cognitive health based on that activity quality. It describes an automated framework that extracts, from smart home sensor data, a set of features reflecting how well an individual performs or completes an activity; these features serve as input to machine learning algorithms. Outputs from learning algorithms, including principal component analysis, support vector machines, and logistic regression, are used to quantify activity quality for a complex set of smart home activities and to predict the cognitive health of participants. Smart home activity data were gathered from volunteer participants (n=263) who performed a complex set of activities in our smart home testbed. We compare our automated activity quality and cognitive health predictions with direct observation scores and health assessments obtained from neuropsychologists. With all samples included, we obtained a statistically significant correlation (r=0.54) between direct observation scores and predicted activity quality. Similarly, using a support vector machine classifier, we obtained reasonable classification accuracy (area under the ROC curve=0.80, g-mean=0.73) in classifying participants into two cognitive classes, dementia and cognitively healthy. The results suggest that it is possible to automatically quantify the task quality of smart home activities and to perform a limited assessment of an individual's cognitive health if the smart home activities are properly chosen and the learning algorithms are appropriately trained.
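    As a rough illustration of the classification step reported above, a cross-validated SVM in scikit-learn might look like the sketch below. The feature matrix here is synthetic; the actual study used activity-quality features extracted from smart home sensor events, so this is only a sketch of the technique, not the authors' pipeline.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-in for activity-quality features (e.g., task time,
# sensor event counts, prompts needed) and a binary cognitive-status label.
rng = np.random.default_rng(42)
X = rng.normal(size=(263, 10))
y = ((X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=1.0, size=263)) > 0).astype(int)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
print("cross-validated AUC: %.2f +/- %.2f" % (auc.mean(), auc.std()))
```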

  1. Smart Grid: Network simulator for smart grid test-bed

    International Nuclear Information System (INIS)

    Lai, L C; Ong, H S; Che, Y X; Do, N Q; Ong, X J

    2013-01-01

    As smart grids become more popular, a smaller-scale smart grid test-bed has been set up at UNITEN to investigate the performance of the smart grid and to identify future enhancements for Malaysia. The fundamental requirement in this project is to design a network with low delay, no packet drops, and a high data rate. Each type of traffic has its own characteristics and is suited to different networks and requirements; however, the nature of traffic in a smart grid is not yet well understood. This paper presents a comparison between different types of traffic to find the most suitable one for optimal network performance.

  2. Telescience testbed: Operational support functions for biomedical experiments

    Science.gov (United States)

    Yamashita, Masamichi; Watanabe, Satoru; Shoji, Takatoshi; Clarke, Andrew H.; Suzuki, Hiroyuki; Yanagihara, Dai

    A telescience testbed was conducted to study the methodology of space biomedicine with simulated constraints imposed on space experiments. The experimental task selected for this testbedding was elaborate animal surgery and electrophysiological measurements conducted by an onboard operator. The standing potential in the ampulla of the pigeon's semicircular canal was measured during gravitational and caloric stimulation. A principal investigator, isolated from the operation site, participated in the experiment interactively by telecommunication links. Reliability analysis was applied to all layers of experimentation, including the design of experimental objectives and operational procedures. Engineering and technological aspects of telescience are discussed in terms of reliability to assure the quality of science. The feasibility of robotics was examined for supportive functions to reduce the workload of the onboard operator.

  3. The Living With a Star Program Space Environment Testbed

    Science.gov (United States)

    Barth, Janet; Day, John H. (Technical Monitor)

    2001-01-01

    This viewgraph presentation describes the objective, approach, and scope of the Living With a Star (LWS) program at the Marshall Space Flight Center. Scientists involved in the project seek to refine the understanding of space weather and the role of solar variability in terrestrial climate change. Research and the development of improved analytic methods have led to increased predictive capabilities and the improvement of environment specification models. Specifically, the Space Environment Testbed (SET) project of LWS is responsible for the implementation of improved engineering approaches to observing solar effects on climate change. This responsibility includes technology development, ground test protocol development, and the development of a technology application model/engineering tool.

  4. A Method to Derive Monitoring Variables for a Cyber Security Test-bed of I and C System

    Energy Technology Data Exchange (ETDEWEB)

    Han, Kyung Soo; Song, Jae Gu; Lee, Joung Woon; Lee, Cheol Kwon [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2013-10-15

    In the IT field, monitoring techniques have been developed to protect networked systems from cyber attacks and incidents. To develop monitoring systems for I and C cyber security, it is necessary to review the monitoring systems used in the IT field and to derive cyber security-related monitoring variables from the proprietary operating information of the I and C systems. Tests for the development and application of these monitoring systems may cause adverse effects on the I and C systems, so to analyze influences on the system and identify the intended variables safely, construction of an I and C system test-bed should come first. This article proposes a method of deriving the variables that should be monitored by a cyber security monitoring system as part of an I and C test-bed. The surveillance features and the monitored variables of NMS (Network Management System), a monitoring technique in the IT field, are reviewed in Section 2. In Section 3, the monitoring variables for I and C cyber security are derived from the review of NMS and from an investigation of the information used in hacking techniques that can be practiced against I and C systems. The monitoring variables of NMS in the IT field and the information about the malicious behaviors used for hacking were derived as the variables expected to be monitored for I and C cyber security research. The derived monitoring variables were classified into the five functions of NMS for efficient management. For the cyber security of I and C systems, the vulnerabilities should be understood through penetration tests and similar assessments, and the influences on the actual system should be evaluated. Thus, constructing a test-bed of I and C systems is necessary for safety systems in operation. In the future, it will be necessary to develop a logging and monitoring system for studies on the vulnerabilities of I and C systems with test-beds.

  5. A Method to Derive Monitoring Variables for a Cyber Security Test-bed of I and C System

    International Nuclear Information System (INIS)

    Han, Kyung Soo; Song, Jae Gu; Lee, Joung Woon; Lee, Cheol Kwon

    2013-01-01

    In the IT field, monitoring techniques have been developed to protect networked systems from cyber attacks and incidents. To develop monitoring systems for I and C cyber security, it is necessary to review the monitoring systems used in the IT field and to derive cyber security-related monitoring variables from the proprietary operating information of the I and C systems. Tests for the development and application of these monitoring systems may cause adverse effects on the I and C systems, so to analyze influences on the system and identify the intended variables safely, construction of an I and C system test-bed should come first. This article proposes a method of deriving the variables that should be monitored by a cyber security monitoring system as part of an I and C test-bed. The surveillance features and the monitored variables of NMS (Network Management System), a monitoring technique in the IT field, are reviewed in Section 2. In Section 3, the monitoring variables for I and C cyber security are derived from the review of NMS and from an investigation of the information used in hacking techniques that can be practiced against I and C systems. The monitoring variables of NMS in the IT field and the information about the malicious behaviors used for hacking were derived as the variables expected to be monitored for I and C cyber security research. The derived monitoring variables were classified into the five functions of NMS for efficient management. For the cyber security of I and C systems, the vulnerabilities should be understood through penetration tests and similar assessments, and the influences on the actual system should be evaluated. Thus, constructing a test-bed of I and C systems is necessary for safety systems in operation. In the future, it will be necessary to develop a logging and monitoring system for studies on the vulnerabilities of I and C systems with test-beds.

  6. A low-cost test-bed for real-time landmark tracking

    Science.gov (United States)

    Csaszar, Ambrus; Hanan, Jay C.; Moreels, Pierre; Assad, Christopher

    2007-04-01

    A low-cost vehicle test-bed system was developed to iteratively test, refine, and demonstrate navigation algorithms before attempting to transfer them to more advanced rover prototypes. The platform used here was a modified radio-controlled (RC) car. A microcontroller board and an onboard laptop computer allow for either autonomous or remote operation via a computer workstation. The sensors onboard the vehicle represent the types currently used on NASA-JPL rover prototypes. For dead-reckoning navigation, optical wheel encoders, a single-axis gyroscope, and a 2-axis accelerometer were used. An ultrasound ranger is available to calculate distance as a substitute for the stereo vision systems presently used on rovers. The prototype also carries a small laptop computer with a USB camera and a wireless transmitter to send real-time video to an off-board computer. A real-time user interface was implemented that combines an automatic image feature selector, tracking parameter controls, a streaming video viewer, and user-generated or autonomous driving commands. Using the test-bed, real-time landmark tracking was demonstrated by autonomously driving the vehicle through the JPL Mars yard. The algorithms tracked rocks as waypoints, generating coordinates for calculating relative motion and visually servoing to science targets. A limitation of the current system is serial computing: each additional landmark is tracked in sequence. However, since each landmark is tracked independently, adding targets would not significantly diminish system speed if the software were transferred to appropriate parallel hardware.
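    As a concrete illustration of the dead-reckoning sensors mentioned above, one possible pose update from wheel-encoder travel and a single-axis gyro is sketched below. The function name, wheel base, and the fixed blending weight are illustrative assumptions, not the testbed's actual implementation, which would more likely fuse the sensors with a filter.

```python
import math

def dead_reckon(pose, d_left, d_right, gyro_rate, dt, wheel_base=0.3):
    """One dead-reckoning update from wheel encoders and a single-axis gyro.

    pose            : (x, y, heading) in metres / radians
    d_left, d_right : wheel travel since the last update (m), from the encoders
    gyro_rate       : yaw rate from the gyroscope (rad/s) over the step dt (s)
    """
    x, y, heading = pose
    d = 0.5 * (d_left + d_right)                 # distance travelled
    dtheta_enc = (d_right - d_left) / wheel_base # heading change from encoders
    dtheta = 0.5 * dtheta_enc + 0.5 * gyro_rate * dt  # simple fixed-weight blend
    heading += dtheta
    return (x + d * math.cos(heading), y + d * math.sin(heading), heading)

print(dead_reckon((0.0, 0.0, 0.0), 0.10, 0.12, 0.05, dt=0.1))
```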

  7. NASA Langley's AirSTAR Testbed: A Subscale Flight Test Capability for Flight Dynamics and Control System Experiments

    Science.gov (United States)

    Jordan, Thomas L.; Bailey, Roger M.

    2008-01-01

    As part of the Airborne Subscale Transport Aircraft Research (AirSTAR) project, NASA Langley Research Center (LaRC) has developed a subscaled flying testbed in order to conduct research experiments in support of the goals of NASA's Aviation Safety Program. This research capability consists of three distinct components. The first of these is the research aircraft, of which there are several in the AirSTAR stable. These aircraft range from a dynamically-scaled, twin turbine vehicle to a propeller driven, off-the-shelf airframe. Each of these airframes carves out its own niche in the research test program. All of the airplanes have sophisticated on-board data acquisition and actuation systems, recording, telemetering, processing, and/or receiving data from research control systems. The second piece of the testbed is the ground facilities, which encompass the hardware and software infrastructure necessary to provide comprehensive support services for conducting flight research using the subscale aircraft, including: subsystem development, integrated testing, remote piloting of the subscale aircraft, telemetry processing, experimental flight control law implementation and evaluation, flight simulation, data recording/archiving, and communications. The ground facilities are comprised of two major components: (1) The Base Research Station (BRS), a LaRC laboratory facility for system development, testing and data analysis, and (2) The Mobile Operations Station (MOS), a self-contained, motorized vehicle serving as a mobile research command/operations center, functionally equivalent to the BRS, capable of deployment to remote sites for supporting flight tests. The third piece of the testbed is the test facility itself. Research flights carried out by the AirSTAR team are conducted at NASA Wallops Flight Facility (WFF) on the Eastern Shore of Virginia. The UAV Island runway is a 50 x 1500 paved runway that lies within restricted airspace at Wallops Flight Facility. The

  8. A Real-Time GPP Software-Defined Radio Testbed for the Physical Layer of Wireless Standards

    NARCIS (Netherlands)

    Schiphorst, Roelof; Hoeksema, F.W.; Slump, Cornelis H.

    2005-01-01

    We present our contribution to the general-purpose-processor-(GPP)-based radio. We describe a baseband software-defined radio testbed for the physical layer of wireless LAN standards. All physical layer functions have been successfully mapped on a Pentium 4 processor that performs these functions in

  9. A Monocular Vision Measurement System of Three-Degree-of-Freedom Air-Bearing Test-Bed Based on FCCSP

    Science.gov (United States)

    Gao, Zhanyu; Gu, Yingying; Lv, Yaoyu; Xu, Zhenbang; Wu, Qingwen

    2018-06-01

    A monocular vision-based pose measurement system is provided for real-time measurement of a three-degree-of-freedom (3-DOF) air-bearing test-bed. Firstly, a circular plane cooperative target is designed. An image of a target fixed on the test-bed is then acquired. Blob analysis-based image processing is used to detect the object circles on the target. A fast algorithm (FCCSP) based on pixel statistics is proposed to extract the centers of object circles. Finally, pose measurements can be obtained when combined with the centers and the coordinate transformation relation. Experiments show that the proposed method is fast, accurate, and robust enough to satisfy the requirement of the pose measurement.
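    The abstract does not give the FCCSP algorithm itself, but a minimal pixel-statistics centroid extraction in the same spirit (threshold, label connected blobs, average the pixel coordinates) could look like the following sketch; all names, thresholds, and the toy image are hypothetical.

```python
import numpy as np
from scipy import ndimage

def circle_centers(image, threshold, min_area=20):
    """Centers of bright blobs via thresholding and pixel statistics."""
    mask = image > threshold
    labels, n = ndimage.label(mask)
    centers = []
    for i in range(1, n + 1):
        ys, xs = np.nonzero(labels == i)
        if xs.size < min_area:          # reject noise specks
            continue
        centers.append((xs.mean(), ys.mean()))   # simple unweighted centroid
    return centers

# Example: two bright square blobs on a dark background
img = np.zeros((60, 60))
img[10:20, 10:20] = 255
img[35:50, 30:45] = 255
print(circle_centers(img, threshold=128))   # ~[(14.5, 14.5), (37.0, 42.0)]
```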

  10. Aerodynamic design of the National Rotor Testbed.

    Energy Technology Data Exchange (ETDEWEB)

    Kelley, Christopher Lee [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-10-01

    A new wind turbine blade has been designed for the National Rotor Testbed (NRT) project and for future experiments at the Scaled Wind Farm Technology (SWiFT) facility with a specific focus on scaled wakes. This report shows the aerodynamic design of new blades that can produce a wake that has similitude to utility-scale blades despite the difference in size and location in the atmospheric boundary layer. The dimensionless quantities circulation, induction, thrust coefficient, and tip-speed ratio were kept equal between rotor scales in region 2 of operation. The new NRT design matched the aerodynamic quantities of the most common wind turbine in the United States, the GE 1.5sle turbine with 37c model blades. The NRT blade design is presented along with its performance subject to the winds at SWiFT. The design requirements determined by the SWiFT experimental test campaign are shown to be met.

  11. EMERGE - ESnet/MREN Regional Science Grid Experimental NGI Testbed

    Energy Technology Data Exchange (ETDEWEB)

    Mambretti, Joe; DeFanti, Tom; Brown, Maxine

    2001-07-31

    This document is the final report on the EMERGE Science Grid testbed research project from the perspective of the International Center for Advanced Internet Research (iCAIR) at Northwestern University, which was a subcontractor to this UIC project. This report is a compilation of information gathered from a variety of materials related to this project produced by multiple EMERGE participants, especially those at Electronic Visualization Lab (EVL) at the University of Illinois at Chicago (UIC), Argonne National Lab and iCAIR. The EMERGE Science Grid project was managed by Tom DeFanti, PI from EVL at UIC.

  12. Development of an autonomous power system testbed

    International Nuclear Information System (INIS)

    Barton, J.R.; Adams, T.; Liffring, M.E.

    1985-01-01

    A power system testbed has been assembled to advance the development of large autonomous electrical power systems required for the space station, spacecraft, and aircraft. The power system for this effort was designed to simulate single- or dual-bus autonomous power systems, or autonomous systems that reconfigure from a single bus to a dual bus following a severe fault. The approach taken was to provide a flexible power system design with two computer systems for control and management. One computer operates as the control system and performs basic control functions, data and command processing, charge control, and provides status to the second computer. The second computer contains expert system software for mission planning, load management, fault identification and recovery, and sends load and configuration commands to the control system

  13. Test-bed for the remote health monitoring system for bridge structures using FBG sensors

    Science.gov (United States)

    Lee, Chin-Hyung; Park, Ki-Tae; Joo, Bong-Chul; Hwang, Yoon-Koog

    2009-05-01

    This paper reports on a test-bed for the long-term health monitoring system for bridge structures employing fiber Bragg grating (FBG) sensors, which is remotely accessible via the web, to provide real-time quantitative information on a bridge's response to live loading and environmental changes, and fast prediction of the structure's integrity. The sensors are attached at several locations on the structure and connected to a data acquisition system permanently installed onsite. The system can be accessed through remote communication using an optical cable network, through which the bridge behavior under live loading can be evaluated far away from the field. Live structural data are transmitted continuously to the server computer at the central office. The server computer is connected securely to the internet, where data can be retrieved, processed and stored for remote web-based health monitoring. The test-bed revealed that remote health monitoring technology will enable practical, cost-effective, and reliable condition assessment and maintenance of bridge structures.

  14. Screening of subfertile men for testicular carcinoma in situ by an automated image analysis-based cytological test of the ejaculate

    DEFF Research Database (Denmark)

    Almstrup, K; Lippert, Marianne; Mogensen, Hanne O

    2011-01-01

    ... and detected in ejaculates with specific CIS markers. We have built a high-throughput framework involving automated immunocytochemical staining, scanning microscopy and in silico image analysis, allowing automated detection and grading of CIS-like stained objects in semen samples. In this study, 1175 ejaculates from 765 subfertile men were tested using this framework. In 5/765 (0.65%) cases, CIS-like cells were identified in the ejaculate. Three of these had bilateral testicular biopsies performed and CIS was histologically confirmed in two. In total, 63 bilateral testicular biopsies were performed ... a slightly lower sensitivity (0.51), possibly because of obstruction. We conclude that this novel non-invasive test combining automated immunocytochemistry and advanced image analysis allows identification of TC at the CIS stage with a high specificity, but a negative test does not completely exclude CIS...

  15. Test-bed Assessment of Communication Technologies for a Power-Balancing Controller

    DEFF Research Database (Denmark)

    Findrik, Mislav; Pedersen, Rasmus; Hasenleithner, Eduard

    2016-01-01

    Due to the growing need for sustainable energy, an increasing number of different renewable energy resources are being connected to distribution grids. In order to efficiently manage decentralized power generation units, the smart grid will rely on communication networks for information exchange ... and control. In this paper, we present a Smart Grid test-bed that integrates various communication technologies and deploys a power balancing controller for LV grids. The control performance of the introduced power balancing controller is subsequently investigated, as is its robustness to communication network cross...

  16. Semi-automated scoring of triple-probe FISH in human sperm using confocal microscopy.

    Science.gov (United States)

    Branch, Francesca; Nguyen, GiaLinh; Porter, Nicholas; Young, Heather A; Martenies, Sheena E; McCray, Nathan; Deloid, Glen; Popratiloff, Anastas; Perry, Melissa J

    2017-09-01

    Structural and numerical sperm chromosomal aberrations result from abnormal meiosis and are directly linked to infertility. Any live births that arise from aneuploid conceptuses can result in syndromes such as Klinefelter, Turner, XYY and Edwards. Multi-probe fluorescence in situ hybridization (FISH) is commonly used to study sperm aneuploidy; however, manual FISH scoring in sperm samples is labor-intensive and introduces errors. Automated scoring methods are continuously evolving. One challenging aspect for optimizing automated sperm FISH scoring has been the overlap in excitation and emission of the fluorescent probes used to enumerate the chromosomes of interest. Our objective was to demonstrate the feasibility of combining confocal microscopy and spectral imaging with high-throughput methods for accurately measuring sperm aneuploidy. Our approach used confocal microscopy to analyze numerical chromosomal abnormalities in human sperm using enhanced slide preparation and rigorous semi-automated scoring methods. FISH for chromosomes X, Y, and 18 was conducted to determine sex chromosome disomy in sperm nuclei. Application of online spectral linear unmixing was used for effective separation of four fluorochromes while decreasing data acquisition time. Semi-automated image processing, segmentation, classification, and scoring were performed on 10 slides using custom image processing and analysis software and results were compared with manual methods. No significant differences in disomy frequencies were seen between the semi-automated and manual methods. Samples treated with pepsin were observed to have reduced background autofluorescence and more uniform distribution of cells. These results demonstrate that semi-automated methods using spectral imaging on a confocal platform are a feasible approach for analyzing numerical chromosomal aberrations in sperm, and are comparable to manual methods. © 2017 International Society for Advancement of Cytometry.

  17. SABA: A Testbed for a Real-Time MIMO System

    Directory of Open Access Journals (Sweden)

    Brühl Lars

    2006-01-01

    Full Text Available The growing demand for high data rates in wireless communication systems leads to the development of new technologies to increase the channel capacity and thus the data rate. MIMO (multiple-input multiple-output) systems are best qualified for these applications. In this paper, we present a MIMO test environment for high data rate transmissions in frequency-selective environments. An overview of the testbed is given, including the analyzed algorithms, the digital signal processing with a new highly parallel processor to perform the algorithms in real time, as well as the analog front-ends. A brief overview of the influence of polarization on the channel capacity is given as well.

  18. Automation in Immunohematology

    Directory of Open Access Journals (Sweden)

    Meenu Bajpai

    2012-01-01

    Full Text Available There have been rapid technological advances in blood banking in the South Asian region over the past decade, with an increasing emphasis on the quality and safety of blood products. The conventional test tube technique has given way to newer techniques such as the column agglutination technique, solid phase red cell adherence assay, and erythrocyte-magnetized technique. These new technologies are adaptable to automation, and major manufacturers in this field have come up with semi- and fully automated equipment for immunohematology tests in the blood bank. Automation improves the objectivity and reproducibility of tests. It reduces human errors in patient identification and transcription errors. Documentation and traceability of tests, reagents and processes and archiving of results is another major advantage of automation. Shifting from manual methods to automation is a major undertaking for any transfusion service to provide quality patient care with a shorter turnaround time for their ever-increasing workload. This article discusses the various issues involved in the process.

  19. Automation in Clinical Microbiology

    Science.gov (United States)

    Ledeboer, Nathan A.

    2013-01-01

    Historically, the trend toward automation in clinical pathology laboratories has largely bypassed the clinical microbiology laboratory. In this article, we review the historical impediments to automation in the microbiology laboratory and offer insight into the reasons why we believe that we are on the cusp of a dramatic change that will sweep a wave of automation into clinical microbiology laboratories. We review the currently available specimen-processing instruments as well as the total laboratory automation solutions. Lastly, we outline the types of studies that will need to be performed to fully assess the benefits of automation in microbiology laboratories. PMID:23515547

  20. Automated Quality Control of in Situ Soil Moisture from the North American Soil Moisture Database Using NLDAS-2 Products

    Science.gov (United States)

    Ek, M. B.; Xia, Y.; Ford, T.; Wu, Y.; Quiring, S. M.

    2015-12-01

    The North American Soil Moisture Database (NASMD) was initiated in 2011 to provide support for developing climate forecasting tools, calibrating land surface models and validating satellite-derived soil moisture algorithms. The NASMD has collected data from over 30 soil moisture observation networks providing millions of in situ soil moisture observations in all 50 states as well as Canada and Mexico. It is recognized that the quality of measured soil moisture in NASMD is highly variable due to the diversity of climatological conditions, land cover, soil texture, and topographies of the stations and differences in measurement devices (e.g., sensors) and installation. It is also recognized that error, inaccuracy and imprecision in the data set can have significant impacts on practical operations and scientific studies. Therefore, developing an appropriate quality control procedure is essential to ensure the data is of the best quality. In this study, an automated quality control approach is developed using the North American Land Data Assimilation System phase 2 (NLDAS-2) Noah soil porosity, soil temperature, and fraction of liquid and total soil moisture to flag erroneous and/or spurious measurements. Overall results show that this approach is able to flag unreasonable values when the soil is partially frozen. A validation example using NLDAS-2 multiple model soil moisture products at the 20 cm soil layer showed that the quality control procedure had a significant positive impact in Alabama, North Carolina, and West Texas. It had a greater impact in colder regions, particularly during spring and autumn. Over 433 NASMD stations have been quality controlled using the methodology proposed in this study, and the algorithm will be implemented to control data quality from the other ~1,200 NASMD stations in the near future.
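    A minimal sketch of the flagging logic described above is shown below, assuming the NLDAS-2 Noah porosity, soil temperature, and frozen-water fraction have already been matched to each station; the function name and the specific thresholds are illustrative, not the operational NASMD procedure.

```python
import numpy as np

def qc_flags(theta_obs, porosity, soil_temp_c, frozen_frac):
    """Flag in-situ volumetric soil moisture observations as suspect.

    theta_obs   : observed volumetric soil moisture (m3/m3)
    porosity    : model (e.g., NLDAS-2 Noah) porosity for the station's soil
    soil_temp_c : modelled soil temperature (deg C) at the sensor depth
    frozen_frac : modelled fraction of soil water that is frozen (0-1)
    Returns a boolean array, True where the observation should be flagged.
    """
    out_of_range = (theta_obs < 0.0) | (theta_obs > porosity)
    partly_frozen = (soil_temp_c < 0.0) | (frozen_frac > 0.0)
    return out_of_range | partly_frozen

theta = np.array([0.25, 0.55, 0.18])
print(qc_flags(theta, porosity=0.45, soil_temp_c=np.array([5.0, 8.0, -1.0]),
               frozen_frac=np.array([0.0, 0.0, 0.3])))  # [False  True  True]
```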

  1. Automated titration method for use on blended asphalts

    Science.gov (United States)

    Pauli, Adam T [Cheyenne, WY; Robertson, Raymond E [Laramie, WY; Branthaver, Jan F [Chatham, IL; Schabron, John F [Laramie, WY

    2012-08-07

    A system for determining parameters and compatibility of a substance such as an asphalt or other petroleum substance uses titration to highly accurately determine one or more flocculation occurrences and is especially applicable to the determination or use of Heithaus parameters and optimal mixing of various asphalt stocks. In a preferred embodiment, automated titration in an oxygen gas exclusive system and further using spectrophotometric analysis (2-8) of solution turbidity is presented. A reversible titration technique enabling in-situ titration measurement of various solution concentrations is also presented.

  2. Interactive aircraft cabin testbed for stress-free air travel system experiment: an innovative concurrent design approach

    NARCIS (Netherlands)

    Tan, C.F.; Chen, W.; Rauterberg, G.W.M.

    2009-01-01

    In this paper, a study of the concurrent engineering design for the environmentally friendly, low-cost aircraft cabin simulator is presented. The study describes the use of the concurrent design technique in the design activity. The simulator is a testbed that was designed and built for research on

  3. FloorNet: Deployment and Evaluation of a Multihop Wireless 802.11 Testbed

    Directory of Open Access Journals (Sweden)

    Zink Michael

    2010-01-01

    Full Text Available A lot of attention has been given to multihop wireless networks lately, but further research, in particular through experimentation, is needed. This attention has motivated an increase in the number of 802.11-based deployments, both indoor and outdoor. These testbeds, which require a significant amount of resources during both deployment and maintenance, are used to run measurements in order to analyze and understand the limitations and differences between analytical or simulation-based figures and the results from real-life experimentation. This paper makes two major contributions: (i) first, we describe a novel wireless multihop testbed, which we name FloorNet, that is deployed and operated under the false floor of a lab in our Computer Science building. This false floor provides strong physical protection that prevents disconnections or misplacements, as well as radio shielding (to some extent) thanks to the false floor panels; this latter feature is assessed through experimentation; (ii) second, by running exhaustive and controlled experiments we are able to analyze the performance limits of commercial off-the-shelf hardware, as well as to derive practical design criteria for the deployment and configuration of mesh networks. These results both provide valuable insights into wireless multihop performance and prove that FloorNet constitutes a valuable asset for research on wireless mesh networks.

  4. X-ray Pulsar Navigation Algorithms and Testbed for SEXTANT

    Science.gov (United States)

    Winternitz, Luke M. B.; Hasouneh, Monther A.; Mitchell, Jason W.; Valdez, Jennifer E.; Price, Samuel R.; Semper, Sean R.; Yu, Wayne H.; Ray, Paul S.; Wood, Kent S.; Arzoumanian, Zaven; hide

    2015-01-01

    The Station Explorer for X-ray Timing and Navigation Technology (SEXTANT) is a NASA-funded technology demonstration. SEXTANT will, for the first time, demonstrate real-time, on-board X-ray Pulsar-based Navigation (XNAV), a significant milestone in the quest to establish a GPS-like navigation capability available throughout our Solar System and beyond. This paper describes the basic design of the SEXTANT system with a focus on core models and algorithms, and the design and continued development of the GSFC X-ray Navigation Laboratory Testbed (GXLT) with its dynamic pulsar emulation capability. We also present early results from GXLT modeling of the combined NICER X-ray timing instrument hardware and SEXTANT flight software algorithms.

  5. Implementation of Real-Time Feedback Flow Control Algorithms on a Canonical Testbed

    Science.gov (United States)

    Tian, Ye; Song, Qi; Cattafesta, Louis

    2005-01-01

    This report summarizes the activities on "Implementation of Real-Time Feedback Flow Control Algorithms on a Canonical Testbed." The work summarized consists primarily of two parts. The first part summarizes our previous work and the extensions to adaptive ID and control algorithms. The second part concentrates on the validation of adaptive algorithms by applying them to a vibration beam test bed. Extensions to flow control problems are discussed.

  6. Ames life science telescience testbed evaluation

    Science.gov (United States)

    Haines, Richard F.; Johnson, Vicki; Vogelsong, Kristofer H.; Froloff, Walt

    1989-01-01

    Eight surrogate spaceflight mission specialists participated in a real-time evaluation of remote coaching using the Ames Life Science Telescience Testbed facility. This facility consisted of three remotely located nodes: (1) a prototype Space Station glovebox; (2) a ground control station; and (3) a principal investigator's (PI) work area. The major objective of this project was to evaluate the effectiveness of telescience techniques and hardware to support three realistic remote coaching science procedures: plant seed germinator charging, plant sample acquisition and preservation, and remote plant observation with ground coaching. Each scenario was performed by a subject acting as flight mission specialist, interacting with a payload operations manager and a principal investigator expert. All three groups were physically isolated from each other yet linked by duplex audio and color video communication channels and networked computer workstations. Workload ratings were made by the flight and ground crewpersons immediately after completing their assigned tasks. Time to complete each scientific procedural step was recorded automatically. Two expert observers also made performance ratings and various error assessments. The results are presented and discussed.

  7. Coral-based Proxy Records of Ocean Acidification: A Pilot Study at the Puerto Rico Test-bed Site

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Coral cores collected nearby the Atlantic Ocean Acidification Test-bed (AOAT) at La Parguera, Puerto Rico were used to characterize the relationship between...

  8. Report of the Interagency Optical Network Testbeds Workshop 2 September 12-14, 2006 NASA Ames Research Center

    Energy Technology Data Exchange (ETDEWEB)

    Joe Mambretti Richard desJardins

    2006-05-01

    A new generation of optical networking services and technologies is rapidly changing the world of communications. National and international networks are implementing optical services to supplement traditional packet routed services. On September 12-14, 2005, the Optical Network Testbeds Workshop 2 (ONT2), an invitation-only forum hosted by the NASA Research and Engineering Network (NREN) and co-sponsored by the Department of Energy (DOE), was held at NASA Ames Research Center in Mountain View, California. The aim of ONT2 was to help the Federal Large Scale Networking Coordination Group (LSN) and its Joint Engineering Team (JET) to coordinate testbed and network roadmaps describing agency and partner organization views and activities for moving toward next generation communication services based on leading edge optical networks in the 3-5 year time frame. ONT2 was conceived and organized as a sequel to the first Optical Network Testbeds Workshop (ONT1, August 2004, www.nren.nasa.gov/workshop7). ONT1 resulted in a series of recommendations to LSN. ONT2 was designed to move beyond recommendations to agree on a series of “actionable objectives” that would proactively help federal and partner optical network testbeds and advanced research and education (R&E) networks to begin incorporating technologies and services representing the next generation of advanced optical networks in the next 1-3 years. Participants in ONT2 included representatives from innovative prototype networks (Panel A), basic optical network research testbeds (Panel B), and production R&D networks (Panels C and D), including “JETnets,” selected regional optical networks (RONs), international R&D networks, commercial network technology and service providers (Panel F), and senior engineering and R&D managers from LSN agencies and partner organizations. The overall goal of ONT2 was to identify and coordinate short and medium term activities and milestones for researching, developing, identifying

  9. Optimization of automation: I. Estimation method of cognitive automation rates reflecting the effects of automation on human operators in nuclear power plants

    International Nuclear Information System (INIS)

    Lee, Seung Min; Kim, Jong Hyun; Seong, Poong Hyun

    2014-01-01

    Highlights: • We propose an estimation method for the automation rate that takes the advantages of automation as its estimation measures. • We conduct experiments to examine the validity of the suggested method. • The higher the cognitive automation rate, the greater the reduction in working time. • The usefulness of the suggested estimation method is proved by statistical analyses. - Abstract: Since automation was introduced in various industrial fields, the concept of the automation rate has been used to indicate the proportion of automation among all work processes or facilities. The inclusion proportion of automation is straightforward to express, as is the degree of enhancement of human performance. However, many researchers have found that a high automation rate does not guarantee high performance. Therefore, to reflect the effects of automation on human performance, this paper proposes a new estimation method for the automation rate that considers the effects of automation on human operators in nuclear power plants (NPPs). Automation in NPPs can be divided into two types: system automation and cognitive automation. Some general descriptions and characteristics of each type of automation are provided, and the advantages of automation are investigated. The advantages of each type of automation are used as measures in the estimation method. One advantage was found to be a reduction in the number of tasks, and another a reduction in human cognitive task loads. The system automation rate and the cognitive automation rate were proposed as quantitative measures based on these benefits. To quantify the required human cognitive task loads and thus derive the cognitive automation rate, Conant's information-theory-based model was applied. The validity of the suggested method, especially as regards the cognitive automation rate, was proven by conducting
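    As a purely illustrative toy, and not the formulation actually proposed in the paper, ratio-style rates for the two kinds of automation discussed above could be written as follows; the function names and numbers are invented.

```python
def system_automation_rate(n_tasks_automated, n_tasks_total):
    """Share of tasks removed from the operator by automation (illustrative)."""
    return n_tasks_automated / n_tasks_total

def cognitive_automation_rate(bits_without_automation, bits_with_automation):
    """Relative reduction in the operator's information-processing load (bits)."""
    return 1.0 - bits_with_automation / bits_without_automation

print(system_automation_rate(6, 20))            # 0.3
print(cognitive_automation_rate(120.0, 75.0))   # 0.375
```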

  10. Fixed target matrix for femtosecond time-resolved and in situ serial micro-crystallography

    Directory of Open Access Journals (Sweden)

    C. Mueller

    2015-09-01

    Full Text Available We present a crystallography chip enabling in situ room temperature crystallography at microfocus synchrotron beamlines and X-ray free-electron laser (X-FEL sources. Compared to other in situ approaches, we observe extremely low background and high diffraction data quality. The chip design is robust and allows fast and efficient loading of thousands of small crystals. The ability to load a large number of protein crystals, at room temperature and with high efficiency, into prescribed positions enables high throughput automated serial crystallography with microfocus synchrotron beamlines. In addition, we demonstrate the application of this chip for femtosecond time-resolved serial crystallography at the Linac Coherent Light Source (LCLS, Menlo Park, California, USA. The chip concept enables multiple images to be acquired from each crystal, allowing differential detection of changes in diffraction intensities in order to obtain high signal-to-noise and fully exploit the time resolution capabilities of XFELs.

  11. High-Resolution Adaptive Optics Test-Bed for Vision Science

    International Nuclear Information System (INIS)

    Wilks, S.C.; Thomspon, C.A.; Olivier, S.S.; Bauman, B.J.; Barnes, T.; Werner, J.S.

    2001-01-01

    We discuss the design and implementation of a low-cost, high-resolution adaptive optics test-bed for vision research. It is well known that high-order aberrations in the human eye reduce optical resolution and limit visual acuity. However, the effects of aberration-free eyesight on vision are only now beginning to be studied using adaptive optics to sense and correct the aberrations in the eye. We are developing a high-resolution adaptive optics system for this purpose using a Hamamatsu Parallel Aligned Nematic Liquid Crystal Spatial Light Modulator. Phase-wrapping is used to extend the effective stroke of the device, and the wavefront sensing and wavefront correction are done at different wavelengths. Issues associated with these techniques will be discussed

  12. Automated ocean color product validation for the Southern California Bight

    Science.gov (United States)

    Davis, Curtiss O.; Tufillaro, Nicholas; Jones, Burt; Arnone, Robert

    2012-06-01

    Automated match-ups allow us to maintain and improve the products of current satellite ocean color sensors (MODIS, MERIS) and new sensors (VIIRS). As part of the VIIRS mission preparation, we have created a web-based automated match-up tool that provides access to searchable fields for date, site, and products, and creates match-ups between satellite (MODIS, MERIS, VIIRS) and in-situ measurements (HyperPRO and SeaPRISM). The back end of the system is a 'mySQL' database, and the front end is a 'php' web portal with pull-down menus for the searchable fields. Based on the selections, graphics are generated showing match-ups and statistics, and ASCII files of the match-up data are created for download. Examples are shown for matching the satellite data with the data from Platform Eureka SeaPRISM off L.A. Harbor in the Southern California Bight.
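    A minimal sketch of the time-window match-up logic such a tool performs is shown below using pandas; the table contents, column names, and the 3-hour window are invented for illustration, whereas the real system described above works against a MySQL database behind a PHP portal.

```python
import pandas as pd

# Hypothetical tables standing in for the tool's database: one row per
# in-situ (SeaPRISM/HyperPRO) observation and per satellite overpass value.
insitu = pd.DataFrame({
    "time": pd.to_datetime(["2012-06-01 18:55", "2012-06-02 19:10"]),
    "site": "Eureka", "rrs_insitu": [0.0042, 0.0039]})
sat = pd.DataFrame({
    "time": pd.to_datetime(["2012-06-01 19:05", "2012-06-02 19:40"]),
    "sensor": "MODIS", "rrs_sat": [0.0044, 0.0041]})

# Pair each in-situ record with the nearest satellite record within +/-3 h,
# a typical style of time-window criterion for ocean-colour match-ups.
matchups = pd.merge_asof(insitu.sort_values("time"), sat.sort_values("time"),
                         on="time", direction="nearest",
                         tolerance=pd.Timedelta("3h"))
print(matchups)
```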

  13. OPNET/Simulink Based Testbed for Disturbance Detection in the Smart Grid

    Energy Technology Data Exchange (ETDEWEB)

    Sadi, Mohammad A. H. [University of Memphis; Dasgupta, Dipankar [ORNL; Ali, Mohammad Hassan [University of Memphis; Abercrombie, Robert K [ORNL

    2015-01-01

    The important backbone of the smart grid is the cyber/information infrastructure, which is primarily used to communicate with different grid components. A smart grid is a complex cyber-physical system containing a large number and variety of sources, devices, controllers, and loads. Therefore, the smart grid is vulnerable to grid-related disturbances, and for such a dynamic system, disturbance and intrusion detection is a paramount issue. This paper presents a Simulink- and OPNET-based co-simulated platform to carry out cyber intrusions into the cyber network of modern power systems and the smart grid. The IEEE 30-bus power system model is used to demonstrate the effectiveness of the simulated testbed. The experiments were performed by disturbing the circuit breakers' reclosing time through a cyber-attack. Different disturbance situations in the considered test system are examined, and the results indicate the effectiveness of the proposed co-simulated scheme.

  14. Development of research reactor simulator and its application to dynamic test-bed

    International Nuclear Information System (INIS)

    Kwon, Kee-Choon; Baang, Dane; Park, Jae-Chang; Lee, Seung-Wook; Bae, Sung Won

    2014-01-01

    We developed a real-time simulator for the High-flux Advanced Neutron Application ReactOr (HANARO) and the Jordan Research and Training Reactor (JRTR). The main purpose of this simulator is operator training, but we modified it into a dynamic test-bed (DTB) to test the functions and dynamic control performance of the reactor regulating system (RRS) in HANARO or JRTR before installation. The simulator hardware consists of a host computer, 6 operator stations, a network switch, and a large display panel. The software includes a mathematical model that implements the plant dynamics in real time, an instructor station module that manages user instructions, and a human-machine interface module. The developed research reactor simulators are installed in the Korea Atomic Energy Research Institute nuclear training center for reactor operator training. To use the simulator as a dynamic test-bed, the reactor regulating system modeling software of the simulator was replaced by the actual RRS cabinet, which was interfaced using hard-wired and network-based interfaces. The RRS cabinet generates control signals for reactor power control based on the various feedback signals from the DTB, and the DTB runs the plant dynamics based on the RRS control signals. Thus, hardware-in-the-loop simulation between the RRS and the emulated plant (DTB) has been implemented and tested in this configuration. The test results show that the developed DTB and the actual RRS cabinet work together simultaneously, resulting in quite good dynamic control performance. (author)

  15. Microgrid testbeds around the world: State of art

    International Nuclear Information System (INIS)

    Hossain, Eklas; Kabalci, Ersan; Bayindir, Ramazan; Perez, Ronald

    2014-01-01

    Highlights: • A detailed discussion of microgrid projects around the world, including North America, Europe, and Japan. • Key benefits of microgrids, issues with on-site generation, and features. • Why we need distributed generation systems, with a brief introduction. • Distributed generation technologies with cost analysis. • An overview of the existing distribution network. - Abstract: This paper deals with the recent evolution of microgrids being used around the world, both in real-life applications and in laboratory research. The study introduces the subject by reviewing the component level, structure, and types of microgrid applications installed as plants or modeled in simulation environments. The paper also presents a survey of published papers on why the microgrid is required, and on the components and control systems which constitute actual microgrid studies. It leads the researcher to see the microgrid within the actual bigger picture of today and creates a new outlook on potential developments. Additionally, comparison of microgrids in various regions based on several parameters allows researchers to define the required criteria and features of a particular microgrid chosen for a given scenario. The authors of this paper also tabulated all the necessary information about microgrids and proposed a standard microgrid for better power quality and optimized energy generation. Finally, inadequate knowledge and technology gaps in the power system field with regard to the future are highlighted for the reader. The existing microgrid testbeds around the world have been studied and analyzed, and several of them are explained as examples in this study. The investigated distribution systems are then classified by region (North America, Europe and Asia) and, as presented in the literature, a significant amount of deviation has been found

  16. Robotic and Human-Tended Collaborative Drilling Automation for Subsurface Exploration

    Science.gov (United States)

    Glass, Brian; Cannon, Howard; Stoker, Carol; Davis, Kiel

    2005-01-01

    Future in-situ lunar/martian resource utilization and characterization, as well as the scientific search for life on Mars, will require access to the subsurface and hence drilling. Drilling on Earth is hard - an art form more than an engineering discipline. Human operators listen and feel drill string vibrations coming from kilometers underground. Abundant mass and energy make it possible for terrestrial drilling to employ brute-force approaches to failure recovery and system performance issues. Space drilling will require intelligent and autonomous systems for robotic exploration and to support human exploration. Eventual in-situ resource utilization will require deep drilling with probable human-tended operation of large-bore drills, but initial lunar subsurface exploration and near-term ISRU will be accomplished with lightweight, rover-deployable or standalone drills capable of penetrating a few tens of meters in depth. These lightweight exploration drills have a direct counterpart in terrestrial prospecting and ore-body location, and will be designed to operate either human-tended or automated. NASA and industry now are acquiring experience in developing and building low-mass automated planetary prototype drills to design and build a pre-flight lunar prototype targeted for 2011-12 flight opportunities. A successful system will include development of drilling hardware, and automated control software to operate it safely and effectively. This includes control of the drilling hardware, state estimation of both the hardware and the lithography being drilled and state of the hole, and potentially planning and scheduling software suitable for uncertain situations such as drilling. Given that Humans on the Moon or Mars are unlikely to be able to spend protracted EVA periods at a drill site, both human-tended and robotic access to planetary subsurfaces will require some degree of standalone, autonomous drilling capability. Human-robotic coordination will be important

  17. The Fourier-Kelvin Stellar Interferometer (FKSI) Nulling Testbed II: Closed-loop Path Length Metrology And Control Subsystem

    Science.gov (United States)

    Frey, B. J.; Barry, R. K.; Danchi, W. C.; Hyde, T. T.; Lee, K. Y.; Martino, A. J.; Zuray, M. S.

    2006-01-01

    The Fourier-Kelvin Stellar Interferometer (FKSI) is a mission concept for an imaging and nulling interferometer in the near to mid-infrared spectral region (3-8 microns), and will be a scientific and technological pathfinder for upcoming missions including TPF-I/DARWIN, SPECS, and SPIRIT. At NASA's Goddard Space Flight Center, we have constructed a symmetric Mach-Zehnder nulling testbed to demonstrate techniques and algorithms that can be used to establish and maintain the 10^4 null depth that will be required for such a mission. Among the challenges inherent in such a system is the ability to acquire and track the null fringe to the desired depth for timescales on the order of hours in a laboratory environment. In addition, it is desirable to achieve this stability without using conventional dithering techniques. We describe recent testbed metrology and control system developments necessary to achieve these goals and present our preliminary results.

  18. mathFISH, a web tool that uses thermodynamics-based mathematical models for in silico evaluation of oligonucleotide probes for fluorescence in situ hybridization.

    Science.gov (United States)

    Yilmaz, L Safak; Parnerkar, Shreyas; Noguera, Daniel R

    2011-02-01

    Mathematical models of RNA-targeted fluorescence in situ hybridization (FISH) for perfectly matched and mismatched probe/target pairs are organized and automated in web-based mathFISH (http://mathfish.cee.wisc.edu). Offering the users up-to-date knowledge of hybridization thermodynamics within a theoretical framework, mathFISH is expected to maximize the probability of success during oligonucleotide probe design.
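
    As a hedged illustration of the general idea behind thermodynamics-based probe evaluation (not mathFISH's actual formulation), the sketch below converts an assumed duplex free-energy change into an equilibrium constant and a bound-target fraction for a simple two-state probe/target model; the ΔG values, probe concentration and hybridization temperature are placeholders.

```python
import math

R_KCAL = 1.987e-3  # gas constant in kcal/(mol*K)

def equilibrium_constant(delta_g_kcal, temp_c=46.0):
    """Equilibrium constant from a standard free-energy change (kcal/mol)."""
    return math.exp(-delta_g_kcal / (R_KCAL * (temp_c + 273.15)))

def bound_fraction(delta_g_kcal, probe_conc_molar=250e-9, temp_c=46.0):
    """Fraction of target sites hybridized for a two-state probe/target
    equilibrium with the probe in excess (illustrative model only)."""
    k = equilibrium_constant(delta_g_kcal, temp_c)
    return k * probe_conc_molar / (1.0 + k * probe_conc_molar)

if __name__ == "__main__":
    # Compare a hypothetical perfect-match duplex with a mismatched one.
    for label, dg in [("perfect match", -14.0), ("single mismatch", -10.0)]:
        print(f"{label}: ~{bound_fraction(dg):.3f} of targets bound")
```

    In practice the free energies themselves come from nearest-neighbor thermodynamic parameters and mismatch corrections, which is the kind of bookkeeping the web tool automates.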

  19. Development and application of an actively controlled hybrid proton exchange membrane fuel cell - Lithium-ion battery laboratory test-bed based on off-the-shelf components

    Energy Technology Data Exchange (ETDEWEB)

    Yufit, V.; Brandon, N.P. [Dept. Earth Science and Engineering, Imperial College, London SW7 2AZ (United Kingdom)

    2011-01-15

    The use of commercially available components enables rapid prototyping and assembling of laboratory scale hybrid test-bed systems, which can be used to evaluate new hybrid configurations. The development of such a test-bed using an off-the-shelf PEM fuel cell, lithium-ion battery and DC/DC converter is presented here, and its application to a hybrid configuration appropriate for an unmanned underwater vehicle is explored. A control algorithm was implemented to regulate the power share between the fuel cell and the battery with a graphical interface to control, record and analyze the electrochemical and thermal parameters of the system. The results demonstrate the applicability of the test-bed and control algorithm for this application, and provide data on the dynamic electrical and thermal behaviour of the hybrid system. (author)

  20. NN-SITE: A remote monitoring testbed facility

    International Nuclear Information System (INIS)

    Kadner, S.; White, R.; Roman, W.; Sheely, K.; Puckett, J.; Ystesund, K.

    1997-01-01

    DOE, Aquila Technologies, LANL and SNL recently launched collaborative efforts to create a Non-Proliferation Network Systems Integration and Test (NN-Site, pronounced N-Site) facility. NN-Site will focus on wide area, local area, and local operating level network connectivity including Internet access. This facility will provide thorough and cost-effective integration, testing and development of information connectivity among diverse operating systems and network topologies prior to full-scale deployment. In concentrating on instrument interconnectivity, tamper indication, and data collection and review, NN-Site will facilitate efforts of equipment providers and system integrators in deploying systems that will meet nuclear non-proliferation and safeguards objectives. The following will discuss the objectives of ongoing remote monitoring efforts, as well as the prevalent policy concerns. An in-depth discussion of the Non-Proliferation Network Systems Integration and Test facility (NN-Site) will illuminate the role that this testbed facility can perform in meeting the objectives of remote monitoring efforts, and its potential contribution in promoting eventual acceptance of remote monitoring systems in facilities worldwide

  1. A numerical testbed for remote sensing of aerosols, and its demonstration for evaluating retrieval synergy from a geostationary satellite constellation of GEO-CAPE and GOES-R

    International Nuclear Information System (INIS)

    Wang, Jun; Xu, Xiaoguang; Ding, Shouguo; Zeng, Jing; Spurr, Robert; Liu, Xiong; Chance, Kelly; Mishchenko, Michael

    2014-01-01

    We present a numerical testbed for remote sensing of aerosols, together with a demonstration for evaluating retrieval synergy from a geostationary satellite constellation. The testbed combines inverse (optimal-estimation) software with a forward model containing linearized code for computing particle scattering (for both spherical and non-spherical particles), a kernel-based (land and ocean) surface bi-directional reflectance facility, and a linearized radiative transfer model for polarized radiance. Calculation of gas absorption spectra uses the HITRAN (HIgh-resolution TRANsmission molecular absorption) database of spectroscopic line parameters and other trace species cross-sections. The outputs of the testbed include not only the Stokes 4-vector elements and their sensitivities (Jacobians) with respect to the aerosol single scattering and physical parameters (such as size and shape parameters, refractive index, and plume height), but also DFS (Degree of Freedom for Signal) values for retrieval of these parameters. This testbed can be used as a tool to provide an objective assessment of aerosol information content that can be retrieved for any constellation of (planned or real) satellite sensors and for any combination of algorithm design factors (in terms of wavelengths, viewing angles, radiance and/or polarization to be measured or used). We summarize the components of the testbed, including the derivation and validation of analytical formulae for Jacobian calculations. Benchmark calculations from the forward model are documented. In the context of NASA's Decadal Survey Mission GEO-CAPE (GEOstationary Coastal and Air Pollution Events), we demonstrate the use of the testbed to conduct a feasibility study of using polarization measurements in and around the O2 A band for the retrieval of aerosol height information from space, as well as to assess the potential improvement in the retrieval of fine- and coarse-mode aerosol optical depth (AOD) through the
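
    The DFS values reported by the testbed are a standard quantity in optimal-estimation retrieval theory. As a hedged sketch (following the usual Rodgers-style formulation, not necessarily the authors' exact implementation), the snippet below computes DFS as the trace of the averaging kernel built from a Jacobian K, a prior covariance Sa and a measurement-noise covariance Se; the matrix sizes and covariance values are arbitrary placeholders.

```python
import numpy as np

def degrees_of_freedom_for_signal(K, S_a, S_e):
    """DFS = trace(A), with averaging kernel
    A = (K^T Se^-1 K + Sa^-1)^-1 K^T Se^-1 K."""
    S_e_inv = np.linalg.inv(S_e)
    S_a_inv = np.linalg.inv(S_a)
    gain = np.linalg.inv(K.T @ S_e_inv @ K + S_a_inv) @ K.T @ S_e_inv
    return np.trace(gain @ K)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n_obs, n_state = 20, 4                   # e.g. AOD, size, refractive index, height
    K = rng.normal(size=(n_obs, n_state))    # placeholder Jacobian
    S_a = np.eye(n_state)                    # unit prior variances (assumed)
    S_e = 0.01 * np.eye(n_obs)               # measurement variance (assumed)
    dfs = degrees_of_freedom_for_signal(K, S_a, S_e)
    print(f"DFS ~ {dfs:.2f} out of {n_state} state parameters")
```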

  2. Testbed for High-Acuity Imaging and Stable Photometry

    Science.gov (United States)

    Gregory, James

    This proposal from MIT Lincoln Laboratory (LL) accompanies the NASA/APRA proposal entitled THAI-SPICE: Testbed for High-Acuity Imaging - Stable Photometry and Image-Motion Compensation Experiment (submitted by Eliot Young, Southwest Research Institute). The goal of the THAI-SPICE project is to demonstrate three technologies that will help low-cost balloon-borne telescopes achieve diffraction-limited imaging: stable pointing, passive thermal stabilization and in-flight monitoring of the wave front error. This MIT LL proposal supplies a key element of the pointing stabilization component of THAI-SPICE: an electronic camera based on an orthogonal-transfer charge-coupled device (OTCCD). OTCCD cameras have been demonstrated with charge-transfer efficiencies >0.99999, noise of 90%. In addition to supplying a camera with an OTCCD detector, MIT LL will help with integration and testing of the OTCCD with the THAI-SPICE payload’s guide camera.

  3. A Versatile System for High-Throughput In Situ X-ray Screening and Data Collection of Soluble and Membrane-Protein Crystals

    Energy Technology Data Exchange (ETDEWEB)

    Broecker, Jana; Klingel, Viviane; Ou, Wei-Lin; Balo, Aidin R.; Kissick, David J.; Ogata, Craig M.; Kuo, Anling; Ernst, Oliver P.

    2016-10-12

    In recent years, in situ data collection has been a major focus of progress in protein crystallography. Here, we introduce the Mylar in situ method using Mylar-based sandwich plates that are inexpensive, easy to make and handle, and show significantly less background scattering than other setups. A variety of cognate holders for patches of Mylar in situ sandwich films corresponding to one or more wells makes the method robust and versatile, allows for storage and shipping of entire wells, and enables automated crystal imaging, screening, and goniometer-based X-ray diffraction data-collection at room temperature and under cryogenic conditions for soluble and membrane-protein crystals grown in or transferred to these plates. We validated the Mylar in situ method using crystals of the water-soluble proteins hen egg-white lysozyme and sperm whale myoglobin as well as the 7-transmembrane protein bacteriorhodopsin from Haloquadratum walsbyi. In conjunction with current developments at synchrotrons, this approach promises high-resolution structural studies of membrane proteins to become faster and more routine.

  4. Full-Scale Advanced Systems Testbed: Ensuring Success of Adaptive Control Research Through Project Lifecycle Risk Mitigation

    Science.gov (United States)

    Pavlock, Kate M.

    2011-01-01

    The National Aeronautics and Space Administration's Dryden Flight Research Center completed flight testing of adaptive controls research on the Full-Scale Advanced Systems Testbed (FAST) in January of 2011. The research addressed technical challenges involved with reducing risk in an increasingly complex and dynamic national airspace. Specific challenges lie with the development of validated, multidisciplinary, integrated aircraft control design tools and techniques to enable safe flight in the presence of adverse conditions such as structural damage, control surface failures, or aerodynamic upsets. The testbed is an F-18 aircraft serving as a full-scale vehicle to test and validate adaptive flight control research and lends significant confidence to the development, maturation, and acceptance process of incorporating adaptive control laws into follow-on research and the operational environment. The experimental systems integrated into FAST were designed to allow for flexible yet safe flight test evaluation and validation of modern adaptive control technologies and revolve around two major hardware upgrades: the modification of Production Support Flight Control Computers (PSFCC) and integration of two, fourth-generation Airborne Research Test Systems (ARTS). Verification and validation following hardware integration provided the foundation for safe flight test of Nonlinear Dynamic Inversion and Model Reference Aircraft Control adaptive control law experiments. To ensure flight success in terms of cost, schedule, and test results, emphasis on risk management was incorporated into early stages of design and flight test planning and continued through the execution of each flight test mission. Specific consideration was made to incorporate safety features within the hardware and software to alleviate user demands as well as into test processes and training to reduce human factors impacts on safe and successful flight testing. This paper describes the research configuration

  5. Development of an automated data acquisition and processing pipeline using multiple telescopes for observing transient phenomena

    Science.gov (United States)

    Savant, Vaibhav; Smith, Niall

    2016-07-01

    We report on the current status of the development of a pilot automated data acquisition and reduction pipeline built around the operation of two nodes of remotely operated robotic telescopes located in California, USA and Cork, Ireland. The observatories are primarily used as a testbed for automation and instrumentation and as a tool to facilitate STEM (Science Technology Engineering Mathematics) promotion. The Ireland node is situated at Blackrock Castle Observatory (operated by Cork Institute of Technology) and consists of two optical telescopes - 6" and 16" OTAs housed in two separate domes - while the node in California is its 6" replica. Together they form a pilot Telescope ARrAy known as TARA. QuickPhot is an automated data reduction pipeline designed primarily to throw more light on the microvariability of blazars, employing precision optical photometry and using data from the TARA telescopes as they constantly monitor predefined targets whenever observing conditions are favourable. After carrying out aperture photometry, if any variability above a given threshold is observed, the reporting telescope will communicate the source concerned and the other nodes will follow up with multi-band observations, taking advantage of the fact that they are located in strategically separated time-zones. Ultimately we wish to investigate the applicability of Shock-in-Jet and Geometric models, which try to explain the processes at work in AGNs that result in the formation of jets, by looking for temporal and spectral variability in TARA multi-band observations. We are also experimenting with a Two-channel Optical PHotometric Imaging CAMera (TOΦCAM) that we have developed and which has been optimised for simultaneous two-band photometry on our 16" OTA.
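
    The trigger logic described here reduces to aperture photometry followed by a variability threshold test. The sketch below is a minimal, hypothetical version of such a check using differential photometry against a comparison star; the function names, count values and the 0.05 mag threshold are illustrative, not QuickPhot internals.

```python
import math
import statistics

def differential_magnitudes(target_counts, comparison_counts):
    """Differential magnitudes of the target relative to a comparison star."""
    return [-2.5 * math.log10(t / c)
            for t, c in zip(target_counts, comparison_counts)]

def variability_alert(diff_mags, threshold_mag=0.05):
    """Flag the target when the RMS scatter of its differential light curve
    exceeds a fixed magnitude threshold (illustrative criterion)."""
    scatter = statistics.pstdev(diff_mags)
    return scatter > threshold_mag, scatter

if __name__ == "__main__":
    target = [10200, 10150, 10600, 11500, 10100]       # counts (made up)
    comparison = [50000, 50030, 49980, 50010, 49990]   # counts (made up)
    flagged, scatter = variability_alert(differential_magnitudes(target, comparison))
    print(f"variability flag: {flagged} (scatter = {scatter:.3f} mag)")
```

    A real implementation would also propagate photometric errors and require several consecutive deviant points before alerting the other nodes.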

  6. A Numerical Testbed for Remote Sensing of Aerosols, and its Demonstration for Evaluating Retrieval Synergy from a Geostationary Satellite Constellation of GEO-CAPE and GOES-R

    Science.gov (United States)

    Wang, Jun; Xu, Xiaoguang; Ding, Shouguo; Zeng, Jing; Spurr, Robert; Liu, Xiong; Chance, Kelly; Mishchenko, Michael I.

    2014-01-01

    We present a numerical testbed for remote sensing of aerosols, together with a demonstration for evaluating retrieval synergy from a geostationary satellite constellation. The testbed combines inverse (optimal-estimation) software with a forward model containing linearized code for computing particle scattering (for both spherical and non-spherical particles), a kernel-based (land and ocean) surface bi-directional reflectance facility, and a linearized radiative transfer model for polarized radiance. Calculation of gas absorption spectra uses the HITRAN (HIgh-resolution TRANsmission molecular absorption) database of spectroscopic line parameters and other trace species cross-sections. The outputs of the testbed include not only the Stokes 4-vector elements and their sensitivities (Jacobians) with respect to the aerosol single scattering and physical parameters (such as size and shape parameters, refractive index, and plume height), but also DFS (Degree of Freedom for Signal) values for retrieval of these parameters. This testbed can be used as a tool to provide an objective assessment of aerosol information content that can be retrieved for any constellation of (planned or real) satellite sensors and for any combination of algorithm design factors (in terms of wavelengths, viewing angles, radiance and/or polarization to be measured or used). We summarize the components of the testbed, including the derivation and validation of analytical formulae for Jacobian calculations. Benchmark calculations from the forward model are documented. In the context of NASA's Decadal Survey Mission GEO-CAPE (GEOstationary Coastal and Air Pollution Events), we demonstrate the use of the testbed to conduct a feasibility study of using polarization measurements in and around the O2 A band for the retrieval of aerosol height information from space, as well as to assess the potential improvement in the retrieval of fine- and coarse-mode aerosol optical depth (AOD) through the

  7. Link Adaptation for Mitigating Earth-To-Space Propagation Effects on the NASA SCaN Testbed

    Science.gov (United States)

    Kilcoyne, Deirdre K.; Headley, William C.; Leffke, Zach J.; Rowe, Sonya A.; Mortensen, Dale J.; Reinhart, Richard C.; McGwier, Robert W.

    2016-01-01

    In Earth-to-Space communications, well-known propagation effects such as path loss and atmospheric loss can lead to fluctuations in the strength of the communications link between a satellite and its ground station. Additionally, the typically unconsidered effect of shadowing due to the geometry of the satellite and its solar panels can also lead to link degradation. As a result of these anticipated channel impairments, NASA's communication links have been traditionally designed to handle the worst-case impact of these effects through high link margins and static, lower rate, modulation formats. The work presented in this paper aims to relax these constraints by providing an improved trade-off between data rate and link margin through utilizing link adaptation. More specifically, this work provides a simulation study on the propagation effects impacting NASA's SCaN Testbed flight software-defined radio (SDR) as well as proposes a link adaptation algorithm that varies the modulation format of a communications link as its signal-to-noise ratio fluctuates. Ultimately, the models developed in this work will be utilized to conduct real-time flight experiments on-board the NASA SCaN Testbed.
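
    The proposed link-adaptation idea can be made concrete with a simple threshold table: as the estimated SNR rises or falls, the highest-rate modulation whose threshold is still satisfied is selected. The thresholds and modulation set in the sketch below are placeholders for illustration only, not SCaN Testbed design values.

```python
# Hypothetical switching thresholds (dB) for each modulation format, ordered
# from highest data rate (needs the most SNR) to most robust fallback.
MODULATION_TABLE = [
    (12.0, "16-APSK"),
    (8.0, "8-PSK"),
    (4.0, "QPSK"),
    (0.0, "BPSK"),
]

def select_modulation(snr_db):
    """Return the highest-rate format whose SNR threshold is met."""
    for threshold, name in MODULATION_TABLE:
        if snr_db >= threshold:
            return name
    return MODULATION_TABLE[-1][1]  # below every threshold: stay on BPSK

if __name__ == "__main__":
    # Simulated SNR estimates as path loss, atmosphere and shadowing vary.
    for snr in [2.3, 5.1, 9.7, 13.2, 7.4, -1.0]:
        print(f"SNR {snr:5.1f} dB -> {select_modulation(snr)}")
```

    In practice some hysteresis around each threshold would be added so the link does not oscillate between formats when the SNR hovers near a boundary.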

  8. Miniaturized embryo array for automated trapping, immobilization and microperfusion of zebrafish embryos.

    Directory of Open Access Journals (Sweden)

    Jin Akagi

    Zebrafish (Danio rerio) has recently emerged as a powerful experimental model in drug discovery and environmental toxicology. Drug discovery screens performed on zebrafish embryos mirror with a high level of accuracy the tests usually performed on mammalian animal models, and the fish embryo toxicity assay (FET) is one of the most promising alternative approaches to acute ecotoxicity testing with adult fish. Notwithstanding this, automated in-situ analysis of zebrafish embryos is still in its infancy. This is mostly due to the inherent limitations of conventional techniques and the fact that metazoan organisms are not easily amenable to laboratory automation. In this work, we describe the development of an innovative miniaturized chip-based device for the in-situ analysis of zebrafish embryos. We present evidence that automatic, hydrodynamic positioning, trapping and long-term immobilization of single embryos inside the microfluidic chips can be combined with time-lapse imaging to provide real-time developmental analysis. Our platform, fabricated using biocompatible polymer molding technology, enables rapid trapping of embryos in low shear stress zones, uniform drug microperfusion and high-resolution imaging without the need for manual embryo handling at various developmental stages. The device provides a highly controllable fluidic microenvironment and post-analysis eleuthero-embryo stage recovery. Throughout the incubation, the position of individual embryos is registered. Importantly, we also show for the first time that microfluidic embryo array technology can be effectively used for the analysis of anti-angiogenic compounds using a transgenic zebrafish line (fli1a:EGFP). The work provides a new rationale for rapid and automated manipulation and analysis of developing zebrafish embryos at a large scale.

  9. Automation in Warehouse Development

    CERN Document Server

    Verriet, Jacques

    2012-01-01

    The warehouses of the future will come in a variety of forms, but with a few common ingredients. Firstly, human operational handling of items in warehouses is increasingly being replaced by automated item handling. Extended warehouse automation counteracts the scarcity of human operators and supports the quality of picking processes. Secondly, the development of models to simulate and analyse warehouse designs and their components facilitates the challenging task of developing warehouses that take into account each customer’s individual requirements and logistic processes. Automation in Warehouse Development addresses both types of automation from the innovative perspective of applied science. In particular, it describes the outcomes of the Falcon project, a joint endeavour by a consortium of industrial and academic partners. The results include a model-based approach to automate warehouse control design, analysis models for warehouse design, concepts for robotic item handling and computer vision, and auton...

  10. Altering users' acceptance of automation through prior automation exposure.

    Science.gov (United States)

    Bekier, Marek; Molesworth, Brett R C

    2017-06-01

    Air navigation service providers worldwide see increased use of automation as one solution to overcome the capacity constraints embedded in the present air traffic management (ATM) system. However, increased use of automation within any system is dependent on user acceptance. The present research sought to determine if the point at which an individual is no longer willing to accept or cooperate with automation can be manipulated. Forty participants underwent training on a computer-based air traffic control programme, followed by two ATM exercises (order counterbalanced), one with and one without the aid of automation. Results revealed that, after exposure to a task with automation assistance, user acceptance of high(er) levels of automation (the 'tipping point') decreased, suggesting it is indeed possible to alter automation acceptance. Practitioner Summary: This paper investigates whether the point at which a user of automation rejects automation (i.e. the 'tipping point') is constant or can be manipulated. The results revealed that, after exposure to a task with automation assistance, user acceptance of high(er) levels of automation decreased, suggesting it is possible to alter automation acceptance.

  11. Future Trends in Process Automation

    OpenAIRE

    Jämsä-Jounela, Sirkka-Liisa

    2007-01-01

    The importance of automation in the process industries has increased dramatically in recent years. In the highly industrialized countries, process automation serves to enhance product quality, master the whole range of products, improve process safety and plant availability, efficiently utilize resources and lower emissions. In the rapidly developing countries, mass production is the main motivation for applying process automation. The greatest demand for process automation is in the chemical...

  12. mathFISH, a Web Tool That Uses Thermodynamics-Based Mathematical Models for In Silico Evaluation of Oligonucleotide Probes for Fluorescence In Situ Hybridization

    OpenAIRE

    Yilmaz, L. Safak; Parnerkar, Shreyas; Noguera, Daniel R.

    2010-01-01

    Mathematical models of RNA-targeted fluorescence in situ hybridization (FISH) for perfectly matched and mismatched probe/target pairs are organized and automated in web-based mathFISH (http://mathfish.cee.wisc.edu). Offering the users up-to-date knowledge of hybridization thermodynamics within a theoretical framework, mathFISH is expected to maximize the probability of success during oligonucleotide probe design.

  13. Automation in organizations: Eternal conflict

    Science.gov (United States)

    Dieterly, D. L.

    1981-01-01

    Some ideas on and insights into the problems associated with automation in organizations are presented with emphasis on the concept of automation, its relationship to the individual, and its impact on system performance. An analogy is drawn, based on an American folk hero, to emphasize the extent of the problems encountered when dealing with automation within an organization. A model is proposed to focus attention on a set of appropriate dimensions. The function allocation process becomes a prominent aspect of the model. The current state of automation research is mentioned in relation to the ideas introduced. Proposed directions for an improved understanding of automation's effect on the individual's efficiency are discussed. The importance of understanding the individual's perception of the system in terms of the degree of automation is highlighted.

  14. Optical testbed for the LISA phasemeter

    Science.gov (United States)

    Schwarze, T. S.; Fernández Barranco, G.; Penkert, D.; Gerberding, O.; Heinzel, G.; Danzmann, K.

    2016-05-01

    The planned spaceborne gravitational wave detector LISA will allow the detection of gravitational waves at frequencies between 0.1 mHz and 1 Hz. A breadboard model for the metrology system aka the phasemeter was developed in the scope of an ESA technology development project by a collaboration between the Albert Einstein Institute, the Technical University of Denmark and the Danish industry partner Axcon Aps. It in particular provides the electronic readout of the main interferometer phases besides auxiliary functions. These include clock noise transfer, ADC pilot tone correction, inter-satellite ranging and data transfer. Besides in LISA, the phasemeter can also be applied in future satellite geodesy missions. Here we show the planning and advances in the implementation of an optical testbed for the full metrology chain. It is based on an ultra-stable hexagonal optical bench. This bench allows the generation of three unequal heterodyne beatnotes with a zero phase combination, thus providing the possibility to probe the phase readout for non-linearities in an optical three signal test. Additionally, the utilization of three independent phasemeters will allow the testing of the auxiliary functions. Once working, components can individually be replaced with flight-qualified hardware in this setup.
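
    The three-signal test rests on the fact that the three pairwise beatnotes of three lasers sum to zero in both frequency and phase, so any non-zero residual in the summed phase readout points to a readout non-linearity. The synthetic sketch below illustrates that null combination; the frequencies, sampling rate and cyclic-error model are assumptions for illustration, not the actual phasemeter chain.

```python
import numpy as np

fs = 80e6                                  # sampling rate, Hz (assumed)
t = np.arange(0, 1e-3, 1.0 / fs)

f1, f2, f3 = 15.0e6, 9.0e6, 2.0e6          # laser offsets, Hz (assumed)
beats = (f1 - f2, f2 - f3, f3 - f1)        # pairwise beatnotes; they sum to zero

def readout_phase(freq, t, cyclic_error=0.0):
    """Beatnote phase with an optional small cyclic error standing in for a
    readout non-linearity (illustrative)."""
    phi = 2 * np.pi * freq * t
    return phi + cyclic_error * np.sin(phi)

ideal = sum(readout_phase(f, t) for f in beats)
flawed = (readout_phase(beats[0], t, cyclic_error=1e-4)
          + readout_phase(beats[1], t) + readout_phase(beats[2], t))

print("ideal three-signal residual (rad, rms):", np.sqrt(np.mean(ideal**2)))
print("residual with 1e-4 cyclic error (rad, rms):", np.sqrt(np.mean(flawed**2)))
```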

  15. Optical testbed for the LISA phasemeter

    International Nuclear Information System (INIS)

    Schwarze, T S; Fernández Barranco, G; Penkert, D; Gerberding, O; Heinzel, G; Danzmann, K

    2016-01-01

    The planned spaceborne gravitational wave detector LISA will allow the detection of gravitational waves at frequencies between 0.1 mHz and 1 Hz. A breadboard model for the metrology system aka the phasemeter was developed in the scope of an ESA technology development project by a collaboration between the Albert Einstein Institute, the Technical University of Denmark and the Danish industry partner Axcon Aps. It in particular provides the electronic readout of the main interferometer phases besides auxiliary functions. These include clock noise transfer, ADC pilot tone correction, inter-satellite ranging and data transfer. Besides in LISA, the phasemeter can also be applied in future satellite geodesy missions. Here we show the planning and advances in the implementation of an optical testbed for the full metrology chain. It is based on an ultra-stable hexagonal optical bench. This bench allows the generation of three unequal heterodyne beatnotes with a zero phase combination, thus providing the possibility to probe the phase readout for non-linearities in an optical three signal test. Additionally, the utilization of three independent phasemeters will allow the testing of the auxiliary functions. Once working, components can individually be replaced with flight-qualified hardware in this setup. (paper)

  16. Novel in-situ lamella fabrication technique for in-situ TEM.

    Science.gov (United States)

    Canavan, Megan; Daly, Dermot; Rummel, Andreas; McCarthy, Eoin K; McAuley, Cathal; Nicolosi, Valeria

    2018-03-29

    In-situ transmission electron microscopy is rapidly emerging as the premier technique for characterising materials in a dynamic state on the atomic scale. The most important aspect of in-situ studies is specimen preparation: among other requirements, specimens must be electron transparent and representative of the material in its operational state. Here, a novel fabrication technique for the facile preparation of lamellae for in-situ transmission electron microscopy experimentation using focused ion beam milling is developed. This method involves the use of rotating microgrippers during the lift-out procedure, as opposed to the traditional micromanipulator needle and platinum weld. Using rotating grippers, and a unique adhesive substance, lamellae are mounted onto a MEMS device for in-situ TEM annealing experiments. We demonstrate how this technique can be used to avoid platinum deposition as well as to minimise damage to the MEMS device during the thinning process. Our technique is both a cost-effective and readily implementable alternative to the current generation of preparation methods for in-situ liquid, electrical, mechanical and thermal experimentation within the TEM, as well as for traditional cross-sectional lamella preparation.

  17. Optimization of automation: III. Development of optimization method for determining automation rate in nuclear power plants

    International Nuclear Information System (INIS)

    Lee, Seung Min; Kim, Jong Hyun; Kim, Man Cheol; Seong, Poong Hyun

    2016-01-01

    Highlights: • We propose an appropriate automation rate that enables the best human performance. • We analyze the shortest working time considering Situation Awareness Recovery (SAR). • The optimized automation rate is estimated by integrating the automation and ostracism rate estimation methods. • The process to derive the optimized automation rate is demonstrated through case studies. - Abstract: Automation has been introduced in various industries, including the nuclear field, because it is commonly believed to promise greater efficiency, lower workloads, and fewer operator errors, by reducing operator error and enhancing operator and system performance. However, the excessive introduction of automation has deteriorated operator performance due to the side effects of automation, referred to as Out-of-the-Loop (OOTL), and this is a critical issue that must be resolved. Thus, in order to determine the optimal level of automation that assures the best human operator performance, a quantitative method of optimizing the automation is proposed in this paper. To propose the optimization method for determining appropriate automation levels that enable the best human performance, the automation rate and ostracism rate, which are estimation methods that quantitatively analyze the positive and negative effects of automation, respectively, are integrated. The integration is conducted to derive the shortest working time by considering the concept of Situation Awareness Recovery (SAR), on the premise that the automation rate with the shortest working time assures the best human performance. The process to derive the optimized automation rate is demonstrated through an emergency operation scenario-based case study. In this case study, four types of procedures are assumed by redesigning the original emergency operating procedure according to the introduced automation and ostracism levels. Using the

  18. In-situ fluorescence hybridization applied to biological dosimetry: contribution of automation to the counting of radio-induced chromosome aberrations

    International Nuclear Information System (INIS)

    Germain Thomas Roy, Laurence

    1999-01-01

    The frequency of chromosome aberrations in peripheral blood lymphocytes is a dose indicator in cases of ionizing radiation over-exposure. Stable chromosome aberrations (translocations, insertions) are visualized after labelling of selected chromosomes using fluorescence in-situ hybridization (FISH). The use of the FISH technique in biological dosimetry is studied with dose-effect curves. It appears that a bias is introduced when observing chromosome aberrations involving only 3 pairs of chromosomes. In order to avoid this bias, it would be useful to test the feasibility of using the multi-FISH technique in biological dosimetry. Moreover, this type of chromosome aberration changes with the type of irradiation, so it is important to define which aberrations are to be considered when the FISH technique is used. In order to reduce image analysis time, the CYTOGEN system, developed by the IMSTAR company (Paris, France), has been adapted to the needs of biological dosimetry. This system automatically localizes the metaphases on the slide, which reduces observation time by a factor of 2 to 4. An automatic detection protocol for chromosome aberrations has been implemented; it comprises image capture, contour detection and the classification of some chromosome aberrations. The different steps of this protocol have been tested to check that no bias is introduced by the automation. However, because radiation-induced aberrations are rare events, a fully automatic system does not seem feasible; a semi-automatic analysis is more suitable. The use of Slit-Scan technology (Laboratory of Applied Physics, Heidelberg, Germany) in biological dosimetry has also been studied. This technique allows a very large number of chromosomes to be analyzed rapidly. A good correlation has been observed between the dicentric frequency measured automatically and by manual counting. The system is under development and should be adapted to the detection of

  19. In situ multi-axial loading frame to probe elastomers using X-ray scattering.

    Science.gov (United States)

    Pannier, Yannick; Proudhon, Henry; Mocuta, Cristian; Thiaudière, Dominique; Cantournet, Sabine

    2011-11-01

    An in situ tensile-shear loading device has been designed to study elastomer crystallization using synchrotron X-ray scattering at the Synchrotron Soleil on the DiffAbs beamline. Elastomer tape specimens of thickness 2 mm can be elongated by up to 500% in the longitudinal direction and sheared by up to 200% in the transverse direction. The device is fully automated and plugged into the TANGO control system of the beamline allowing synchronization between acquisition and loading sequences. Experimental results revealing the evolution of crystallization peaks under load are presented for several tension/shear loading sequences.

  20. Improving the driver-automation interaction: an approach using automation uncertainty.

    Science.gov (United States)

    Beller, Johannes; Heesen, Matthias; Vollrath, Mark

    2013-12-01

    The aim of this study was to evaluate whether communicating automation uncertainty improves the driver-automation interaction. A false system understanding of infallibility may provoke automation misuse and can lead to severe consequences in case of automation failure. The presentation of automation uncertainty may prevent this false system understanding and, as was shown by previous studies, may have numerous benefits. Few studies, however, have clearly shown the potential of communicating uncertainty information in driving. The current study fills this gap. We conducted a driving simulator experiment, varying the presented uncertainty information between participants (no uncertainty information vs. uncertainty information) and the automation reliability (high vs. low) within participants. Participants interacted with a highly automated driving system while engaging in secondary tasks and were required to cooperate with the automation to drive safely. Quantile regressions and multilevel modeling showed that the presentation of uncertainty information increases the time to collision in the case of automation failure. Furthermore, the data indicated improved situation awareness and better knowledge of fallibility for the experimental group. Consequently, the automation with the uncertainty symbol received higher trust ratings and increased acceptance. The presentation of automation uncertainty through a symbol improves overall driver-automation cooperation. Most automated systems in driving could benefit from displaying reliability information. This display might improve the acceptance of fallible systems and further enhance driver-automation cooperation.

  1. Bioprocessing automation in cell therapy manufacturing: Outcomes of special interest group automation workshop.

    Science.gov (United States)

    Ball, Oliver; Robinson, Sarah; Bure, Kim; Brindley, David A; Mccall, David

    2018-04-01

    Phacilitate held a Special Interest Group workshop event in Edinburgh, UK, in May 2017. The event brought together leading stakeholders in the cell therapy bioprocessing field to identify present and future challenges and propose potential solutions to automation in cell therapy bioprocessing. Here, we review and summarize discussions from the event. Deep biological understanding of a product, its mechanism of action and indication pathogenesis underpin many factors relating to bioprocessing and automation. To fully exploit the opportunities of bioprocess automation, therapeutics developers must closely consider whether an automation strategy is applicable, how to design an 'automatable' bioprocess and how to implement process modifications with minimal disruption. Major decisions around bioprocess automation strategy should involve all relevant stakeholders; communication between technical and business strategy decision-makers is of particular importance. Developers should leverage automation to implement in-process testing, in turn applicable to process optimization, quality assurance (QA)/quality control (QC), batch failure control, adaptive manufacturing and regulatory demands, but a lack of precedent and technical opportunities can complicate such efforts. Sparse standardization across product characterization, hardware components and software platforms is perceived to complicate efforts to implement automation. The use of advanced algorithmic approaches such as machine learning may have application to bioprocess and supply chain optimization. Automation can substantially de-risk the wider supply chain, including tracking and traceability, cryopreservation and thawing and logistics. The regulatory implications of automation are currently unclear because few hardware options exist and novel solutions require case-by-case validation, but automation can present attractive regulatory incentives.

  2. Radiation beamline testbeds for the simulation of planetary and spacecraft environments for human and robotic mission risk assessment

    Science.gov (United States)

    Wilkins, Richard

    The Center for Radiation Engineering and Science for Space Exploration (CRESSE) at Prairie View A&M University, Prairie View, Texas, USA, is establishing an integrated, multi-disciplinary research program on the scientific and engineering challenges faced by NASA and the international space community caused by space radiation. CRESSE focuses on space radiation research directly applicable to astronaut health and safety during future long term, deep space missions, including Martian, lunar, and other planetary body missions beyond low earth orbit. The research approach will consist of experimental and theoretical radiation modeling studies utilizing particle accelerator facilities including: 1. NASA Space Radiation Laboratory (NSRL) at Brookhaven National Laboratory; 2. Proton Synchrotron at Loma Linda University Medical Center; and 3. Los Alamos Neutron Science Center (LANSCE) at Los Alamos National Laboratory. Specifically, CRESSE investigators are designing, developing, and building experimental test beds that simulate the lunar and Martian radiation environments for experiments focused on risk assessment for astronauts and instrumentation. The testbeds have been designated the Bioastronautics Experimental Research Testbeds for Environmental Radiation Nostrum Investigations and Education (BERT and ERNIE). The designs of BERT and ERNIE will allow for a high degree of flexibility and adaptability to modify experimental configurations to simulate planetary surface environments, planetary habitats, and spacecraft interiors. In the nominal configuration, BERT and ERNIE will consist of a set of experimental zones that will simulate the planetary atmosphere (solid CO2 in the case of the Martian surface), the planetary surface, and sub-surface regions. These experimental zones can be used for dosimetry, shielding, biological, and electronic effects radiation studies in support of space exploration missions. BERT and ERNIE are designed to be compatible with the

  3. In-situ optical profilometry of CANDU fuel channels

    International Nuclear Information System (INIS)

    Jarvis, G.N.; Cornblum, E.O.; Grabish, M.G.

    1996-01-01

    Detailed knowledge of flaw geometry is crucial in the stress analysis of flaws found in the thin-walled Zirconium alloy pressure tubes of CANDU reactors. While ultrasonic inspection can provide much of the required data, the measurement of the sharpness, or root-radius, at the bottom of a flaw has not so far been possible in-situ. This paper will describe the application of optical profilometry techniques to directly measure the depth and root-radius of open inside-surface flaws within a flooded reactor pressure tube. The tool uses a rad-tolerant television camera, custom optics and light stripe generators to collect digitized image data from three different views of a flaw. Software has been developed to manage the collection of the image data and provide a full range of display and automated analysis options. The tool has recently been used successfully to measure fretting flaws in the 100-250 micron depth range
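
    The depth measurement in light-stripe profilometry comes from simple triangulation: the projected stripe shifts laterally in the camera image where it crosses a flaw, and that shift maps to depth through the stripe's incidence angle. The sketch below shows that relation for an assumed geometry (camera viewing along the surface normal); the pixel scale and angle are placeholders, not the tool's calibration.

```python
import math

def flaw_depth_mm(stripe_shift_px, pixel_size_mm, stripe_angle_deg):
    """Depth of a surface flaw from the lateral displacement of a projected
    light stripe, assuming the camera views along the surface normal and the
    stripe is incident at stripe_angle_deg from that normal."""
    return stripe_shift_px * pixel_size_mm / math.tan(math.radians(stripe_angle_deg))

if __name__ == "__main__":
    # Example: a 12-pixel shift with 5-micron pixels and a 30-degree stripe.
    depth = flaw_depth_mm(stripe_shift_px=12, pixel_size_mm=0.005, stripe_angle_deg=30.0)
    print(f"estimated flaw depth: {depth * 1000:.0f} microns")
```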

  4. An SDR-Based Real-Time Testbed for GNSS Adaptive Array Anti-Jamming Algorithms Accelerated by GPU

    Directory of Open Access Journals (Sweden)

    Hailong Xu

    2016-03-01

    Nowadays, software-defined radio (SDR) has become a common approach to evaluating new algorithms. However, in the field of Global Navigation Satellite System (GNSS) adaptive array anti-jamming, previous work has been limited by the high computational power demanded by adaptive algorithms, and often lacks flexibility and configurability. In this paper, the design and implementation of an SDR-based real-time testbed for GNSS adaptive array anti-jamming accelerated by a Graphics Processing Unit (GPU) are documented. This testbed stands out as a feature-rich and extensible platform with great flexibility and configurability, as well as high computational performance. Both Space-Time Adaptive Processing (STAP) and Space-Frequency Adaptive Processing (SFAP) are implemented with a wide range of parameters. Raw data from as many as eight antenna elements can be processed in real-time in either an adaptive nulling or beamforming mode. To fully take advantage of the parallelism provided by the GPU, a batched programming method is proposed. Tests and experiments are conducted to evaluate both the computational and anti-jamming performance. This platform can be used for research and prototyping, as well as a real product in certain applications.
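
    At the core of such adaptive-array processing is the computation of complex element weights from a sample covariance of the array snapshots. The sketch below shows a single-block, CPU-only MVDR-style weight solution with diagonal loading, as a hedged illustration of the kind of math the GPU batches in the testbed; it is not the authors' STAP/SFAP implementation, and the scenario numbers are made up.

```python
import numpy as np

def mvdr_weights(snapshots, steering, diagonal_loading=1e-3):
    """MVDR weights w = R^-1 s / (s^H R^-1 s) for one block of array data.
    snapshots: (n_elements, n_samples) complex baseband samples."""
    n_el, n_smp = snapshots.shape
    R = snapshots @ snapshots.conj().T / n_smp        # sample covariance
    R += diagonal_loading * np.real(np.trace(R)) / n_el * np.eye(n_el)
    r_inv_s = np.linalg.solve(R, steering)
    return r_inv_s / (steering.conj() @ r_inv_s)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n_el, n_smp = 4, 4096
    # Strong jammer from ~40 degrees plus unit white noise (the GNSS signal
    # itself sits below the noise floor and is omitted here).
    jam_steer = np.exp(1j * np.pi * np.arange(n_el) * np.sin(np.deg2rad(40.0)))
    x = (np.sqrt(1000.0) * jam_steer[:, None] * rng.standard_normal(n_smp)
         + (rng.standard_normal((n_el, n_smp))
            + 1j * rng.standard_normal((n_el, n_smp))) / np.sqrt(2.0))
    w = mvdr_weights(x, steering=np.ones(n_el, dtype=complex))  # boresight look
    jam_gain_db = 10.0 * np.log10(np.abs(np.vdot(w, jam_steer)) ** 2)
    print(f"residual gain toward the jammer: {jam_gain_db:.1f} dB")
```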

  5. Segmented Aperture Interferometric Nulling Testbed (SAINT) II: component systems update

    Science.gov (United States)

    Hicks, Brian A.; Bolcar, Matthew R.; Helmbrecht, Michael A.; Petrone, Peter; Burke, Elliot; Corsetti, James; Dillon, Thomas; Lea, Andrew; Pellicori, Samuel; Sheets, Teresa; Shiri, Ron; Agolli, Jack; DeVries, John; Eberhardt, Andrew; McCabe, Tyler

    2017-09-01

    This work presents updates to the coronagraph and telescope components of the Segmented Aperture Interferometric Nulling Testbed (SAINT). The project pairs an actively-controlled macro-scale segmented mirror with the Visible Nulling Coronagraph (VNC) towards demonstrating capabilities for the future space observatories needed to directly detect and characterize a significant sample of Earth-sized worlds around nearby stars in the quest for identifying those which may be habitable and possibly harbor life. Efforts to improve the VNC wavefront control optics and mechanisms towards repeating narrowband results are described. A narrative is provided for the design of new optical components aimed at enabling broadband performance. Initial work with the hardware and software interface for controlling the segmented telescope mirror is also presented.

  6. How robust are in situ observations for validating satellite-derived albedo over the dark zone of the Greenland Ice Sheet?

    Science.gov (United States)

    Ryan, J.; Hubbard, A., II; Irvine-Fynn, T. D.; Doyle, S. H.; Cook, J.; Stibal, M.; Smith, L. C.; Box, J. E.

    2017-12-01

    Calibration and validation of satellite-derived ice sheet albedo data require high-quality in situ measurements, commonly acquired by upward- and downward-facing pyranometers mounted on automated weather stations (AWS). However, direct comparison between ground- and satellite-derived albedo can only be justified when the measured surface is homogeneous at the length-scale of both the satellite pixel and the in situ footprint. We used digital imagery acquired by an unmanned aerial vehicle to evaluate point-to-pixel albedo comparisons across the western, ablating margin of the Greenland Ice Sheet. Our results reveal that in situ measurements overestimate albedo by up to 0.10 at the end of the melt season because the ground footprints of AWS-mounted pyranometers are insufficient to capture the spatial heterogeneity of the ice surface as it progressively ablates and darkens. Statistical analysis of 21 AWS across the entire Greenland Ice Sheet reveals that almost half suffer from this bias, including some AWS located within the wet snow zone.

  7. Termite: Emulation Testbed for Encounter Networks

    Directory of Open Access Journals (Sweden)

    Rodrigo Bruno

    2015-08-01

    Cutting-edge mobile devices like smartphones and tablets are equipped with various infrastructureless wireless interfaces, such as WiFi Direct and Bluetooth. Such technologies allow for novel mobile applications that take advantage of casual encounters between co-located users. However, the need to mimic the behavior of real-world encounter networks makes testing and debugging of such applications hard tasks. We present Termite, an emulation testbed for encounter networks. Our system allows developers to run their applications on a virtual encounter network emulated by software. Developers can model arbitrary encounter networks and specify user interactions on the emulated virtual devices. To facilitate testing and debugging, developers can place breakpoints, inspect the runtime state of virtual nodes, and run experiments in a stepwise fashion. Termite defines its own Petri Net variant to model the dynamically changing topology and synthesize user interactions with virtual devices. The system is designed to efficiently multiplex an underlying emulation hosting infrastructure across multiple developers, and to support heterogeneous mobile platforms. Our current system implementation supports virtual Android devices communicating over WiFi Direct networks and runs on top of a local cloud infrastructure. We evaluated our system using emulator network traces, and found that Termite is expressive and performs well.

  8. Toward designing for trust in database automation

    Energy Technology Data Exchange (ETDEWEB)

    Duez, P. P.; Jamieson, G. A. [Cognitive Engineering Laboratory, Univ. of Toronto, 5 King's College Rd., Toronto, Ont. M5S 3G8 (Canada)]

    2006-07-01

    Appropriate reliance on system automation is imperative for safe and productive work, especially in safety-critical systems. It is unsafe to rely on automation beyond its designed use; conversely, it can be both unproductive and unsafe to manually perform tasks that are better relegated to automated tools. Operator trust in automated tools mediates reliance, and trust appears to affect how operators use technology. As automated agents become more complex, the question of trust in automation is increasingly important. In order to achieve proper use of automation, we must engender an appropriate degree of trust that is sensitive to changes in operating functions and context. In this paper, we present research concerning trust in automation in the domain of automated tools for relational databases. Lee and See have provided models of trust in automation. One model developed by Lee and See identifies three key categories of information about the automation that lie along a continuum of attributional abstraction. Purpose-, process- and performance-related information serve, both individually and through inferences between them, to describe automation in such a way as to engender properly-calibrated trust. Thus, one can look at information from different levels of attributional abstraction as a general requirements analysis for information key to appropriate trust in automation. The model of information necessary to engender appropriate trust in automation [1] is a general one. Although it describes categories of information, it does not provide insight on how to determine the specific information elements required for a given automated tool. We have applied the Abstraction Hierarchy (AH) to this problem in the domain of relational databases. The AH serves as a formal description of the automation at several levels of abstraction, ranging from a very abstract purpose-oriented description to a more concrete description of the resources involved in the automated process

  9. Toward designing for trust in database automation

    International Nuclear Information System (INIS)

    Duez, P. P.; Jamieson, G. A.

    2006-01-01

    Appropriate reliance on system automation is imperative for safe and productive work, especially in safety-critical systems. It is unsafe to rely on automation beyond its designed use; conversely, it can be both unproductive and unsafe to manually perform tasks that are better relegated to automated tools. Operator trust in automated tools mediates reliance, and trust appears to affect how operators use technology. As automated agents become more complex, the question of trust in automation is increasingly important. In order to achieve proper use of automation, we must engender an appropriate degree of trust that is sensitive to changes in operating functions and context. In this paper, we present research concerning trust in automation in the domain of automated tools for relational databases. Lee and See have provided models of trust in automation. One model developed by Lee and See identifies three key categories of information about the automation that lie along a continuum of attributional abstraction. Purpose-, process- and performance-related information serve, both individually and through inferences between them, to describe automation in such a way as to engender properly-calibrated trust. Thus, one can look at information from different levels of attributional abstraction as a general requirements analysis for information key to appropriate trust in automation. The model of information necessary to engender appropriate trust in automation [1] is a general one. Although it describes categories of information, it does not provide insight on how to determine the specific information elements required for a given automated tool. We have applied the Abstraction Hierarchy (AH) to this problem in the domain of relational databases. The AH serves as a formal description of the automation at several levels of abstraction, ranging from a very abstract purpose-oriented description to a more concrete description of the resources involved in the automated process

  10. Design of a nickel-hydrogen battery simulator for the NASA EOS testbed

    Science.gov (United States)

    Gur, Zvi; Mang, Xuesi; Patil, Ashok R.; Sable, Dan M.; Cho, Bo H.; Lee, Fred C.

    1992-01-01

    The hardware and software design of a nickel-hydrogen (Ni-H2) battery simulator (BS) with application to the NASA Earth Observation System (EOS) satellite is presented. The battery simulator is developed as a part of a complete testbed for the EOS satellite power system. The battery simulator involves both hardware and software components. The hardware component includes the capability of sourcing and sinking current at a constant programmable voltage. The software component includes the capability of monitoring the battery's ampere-hours (Ah) and programming the battery voltage according to an empirical model of the nickel-hydrogen battery stored in a computer.
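
    The software side described above amounts to coulomb counting plus an empirical voltage model. The sketch below is a minimal, hypothetical version of that loop; the capacity, cell count, internal resistance and the voltage-versus-state-of-charge table are placeholders, not the EOS Ni-H2 model.

```python
import numpy as np

# Placeholder per-cell open-circuit voltage vs. state of charge (illustrative).
SOC_POINTS = [0.0, 0.2, 0.5, 0.8, 1.0]
VOC_POINTS = [1.15, 1.22, 1.27, 1.32, 1.40]

class BatterySimulator:
    def __init__(self, capacity_ah=50.0, cells=22, soc=0.7, r_internal_ohm=0.03):
        self.capacity_ah = capacity_ah
        self.cells = cells
        self.soc = soc                        # state of charge, 0..1
        self.r_internal_ohm = r_internal_ohm  # assumed total pack resistance

    def step(self, current_a, dt_s):
        """Coulomb-count one step (current > 0 charges the battery) and
        return the terminal voltage to program on the power stage."""
        self.soc += current_a * dt_s / 3600.0 / self.capacity_ah
        self.soc = min(max(self.soc, 0.0), 1.0)
        voc_cell = np.interp(self.soc, SOC_POINTS, VOC_POINTS)
        return self.cells * voc_cell + current_a * self.r_internal_ohm

if __name__ == "__main__":
    simulator = BatterySimulator()
    for minute in range(5):
        v = simulator.step(current_a=-10.0, dt_s=60.0)   # 10 A discharge
        print(f"t={minute + 1:2d} min  SOC={simulator.soc:.3f}  V={v:.2f} V")
```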

  11. Using the ISS as a testbed to prepare for the next generation of space-based telescopes

    Science.gov (United States)

    Postman, Marc; Sparks, William B.; Liu, Fengchuan; Ess, Kim; Green, Joseph; Carpenter, Kenneth G.; Thronson, Harley; Goullioud, Renaud

    2012-09-01

    The infrastructure available on the ISS provides a unique opportunity to develop the technologies necessary to assemble large space telescopes. Assembling telescopes in space is a game-changing approach to space astronomy. Using the ISS as a testbed enables a concentration of resources on reducing the technical risks associated with integrating the technologies, such as laser metrology and wavefront sensing and control (WFS&C), with the robotic assembly of major components including very light-weight primary and secondary mirrors and the alignment of the optical elements to a diffraction-limited optical system in space. The capability to assemble the optical system and remove and replace components via the existing ISS robotic systems such as the Special Purpose Dexterous Manipulator (SPDM), or by the ISS Flight Crew, allows for future experimentation as well as repair if necessary. In 2015, first light will be obtained by the Optical Testbed and Integration on ISS eXperiment (OpTIIX), a small 1.5-meter optical telescope assembled on the ISS. The primary objectives of OpTIIX include demonstrating telescope assembly technologies and end-to-end optical system technologies that will advance future large optical telescopes.

  12. A Functional Neuroimaging Analysis of the Trail Making Test-B: Implications for Clinical Application

    Directory of Open Access Journals (Sweden)

    Mark D. Allen

    2011-01-01

    Full Text Available Recent progress has been made using fMRI as a clinical assessment tool, often employing analogues of traditional “paper and pencil” tests. The Trail Making Test (TMT, popular for years as a neuropsychological exam, has been largely ignored in the realm of neuroimaging, most likely because its physical format and administration does not lend itself to straightforward adaptation as an fMRI paradigm. Likewise, there is relatively more ambiguity about the neural systems associated with this test than many other tests of comparable clinical use. In this study, we describe an fMRI version of Trail Making Test-B (TMTB that maintains the core functionality of the TMT while optimizing its use for both research and clinical settings. Subjects (N = 32 were administered the Functional Trail Making Test-B (f-TMTB. Brain region activations elicited by the f-TMTB were consistent with expectations given by prior TMT neurophysiological studies, including significant activations in the ventral and dorsal visual pathways and the medial pre-supplementary motor area. The f-TMTB was further evaluated for concurrent validity with the traditional TMTB using an additional sample of control subjects (N = 100. Together, these results support the f-TMTB as a viable neuroimaging adaptation of the TMT that is optimized to evoke maximally robust fMRI activation with minimal time and equipment requirements.

  13. Designing, Implementing and Documenting the Atlas Networking Test-bed.

    CERN Document Server

    Martinsen, Hans Åge

    The A Toroidal LHC ApparatuS (Atlas) experiment at the Large Hadron Collider (LHC) at the European Organization for Nuclear Research (CERN), Geneva, is a production environment. To develop new architectures, test new equipment and evaluate new technologies, a well supported test bench is needed. A new one is now being commissioned and I will take a leading role in its development, commissioning and operation. This thesis will cover the requirements, the implementation, the documentation and the approach to the different challenges in implementing the testbed. I will be joining the project in the early stages and start by following the work that my colleagues are doing and then, as I get a better understanding, more responsibility will be given to me. To be able to suggest and implement solutions I will have to understand what the requirements are and how to achieve these requirements with the given resources.

  14. Deployment of a Testbed in a Brazilian Research Network using IPv6 and Optical Access Technologies

    Science.gov (United States)

    Martins, Luciano; Ferramola Pozzuto, João; Olimpio Tognolli, João; Chaves, Niudomar Siqueira De A.; Reggiani, Atilio Eduardo; Hortêncio, Claudio Antonio

    2012-04-01

    This article presents the implementation of a testbed and the experimental results obtained with it on the Brazilian Experimental Network of the government-sponsored "GIGA Project." The use of IPv6 integrated with current and emerging optical architectures and technologies, such as dense wavelength division multiplexing and 10-gigabit Ethernet in the core and gigabit-capable passive optical network and optical distribution network in the access network, was tested. These protocols, architectures, and optical technologies are promising and form part of a new worldwide technological scenario that is steadily being adopted in enterprise and provider networks around the world.

  15. Comparison of some aspects of the in situ and in vitro methods in evaluation of neutral detergent fiber digestion.

    Science.gov (United States)

    Krizsan, S J; Jančík, F; Ramin, M; Huhtanen, P

    2013-02-01

    The objective of the present study was to compare digestion rates (kd) of NDF for different feeds estimated with the in situ method or derived from an automated gas in vitro system. A meta-analysis was conducted to evaluate how in situ-derived kd of NDF related to in vivo digestibility of NDF. Furthermore, in vitro true digestibility of the feed samples incubated within filter bags or dispersed in the medium was compared, and kd for insoluble and soluble components of those feeds were estimated. Four different concentrates and 4 forages were used in this study. Two lactating Swedish Red cows fed a diet of 60% grass silage and 40% concentrate on a DM basis were used for in situ incubations and for collection of rumen fluid. The feed samples were ground through a 2.0-mm screen before the in situ incubations and a 1.0-mm screen before the in vitro gas incubations. In situ nylon bags were introduced into the rumen for determination of kd of NDF. Additional kinetic data were produced from isolated NDF and intact samples subjected to in vitro incubations in which gas production was recorded for 72 h. Samples were weighed in the bottles or within filter bags (for fiber and in vitro studies) that were placed in the bottles. The interaction between feed and method was significant (P < 0.01) for kd of NDF estimated from the gas production recordings. The meta-analysis suggested that in situ-derived kd of NDF were biased and underestimated in vivo digestibility of NDF. Digestion rates of the intact samples were lower for all feeds, except for the hay, when incubated within the bags compared with dispersed in the medium (P < 0.01). Less OM and NDF were digested for all feeds when incubated within bags than dispersed in the medium (P < 0.01). It is concluded from the in vitro study that microbial activity within the bags is less than in the medium. Significant interactions between method (in situ vs. in vitro) and feed suggest that one or both methods result in biased estimates of digestion kinetics.
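
    As a rough, hypothetical illustration of how a digestion rate (kd) can be estimated from time-series incubation data such as cumulative gas production, the sketch below fits a first-order model with a lag term; the data values, the lag handling and the model form are assumptions for illustration only, not taken from the study.

```python
# Hypothetical sketch: estimate a first-order digestion rate (kd) from cumulative
# gas-production recordings. Data values, lag handling and model form are
# illustrative assumptions, not the study's method.
import numpy as np
from scipy.optimize import curve_fit

def first_order_with_lag(t, asymptote, kd, lag):
    """Cumulative gas volume for a first-order process that starts after a lag."""
    return np.where(t > lag, asymptote * (1.0 - np.exp(-kd * (t - lag))), 0.0)

# Incubation times (h) and cumulative gas volumes (ml) -- made-up example data.
t_obs = np.array([2.0, 4.0, 8.0, 12.0, 24.0, 36.0, 48.0, 72.0])
gas_obs = np.array([1.5, 4.0, 9.0, 13.0, 21.0, 25.0, 27.0, 28.5])

params, _ = curve_fit(first_order_with_lag, t_obs, gas_obs, p0=[30.0, 0.05, 2.0])
asymptote, kd, lag = params
print(f"kd = {kd:.3f} per h, asymptote = {asymptote:.1f} ml, lag = {lag:.1f} h")
```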

  16. An ODMG-compatible testbed architecture for scalable management and analysis of physics data

    International Nuclear Information System (INIS)

    Malon, D.M.; May, E.N.

    1997-01-01

    This paper describes a testbed architecture for the investigation and development of scalable approaches to the management and analysis of massive amounts of high energy physics data. The architecture has two components: an interface layer that is compliant with a substantial subset of the ODMG-93 Version 1.2 specification, and a lightweight object persistence manager that provides flexible storage and retrieval services on a variety of single- and multi-level storage architectures, and on a range of parallel and distributed computing platforms

  17. Introduction matters: Manipulating trust in automation and reliance in automated driving.

    Science.gov (United States)

    Körber, Moritz; Baseler, Eva; Bengler, Klaus

    2018-01-01

    Trust in automation is a key determinant for the adoption of automated systems and their appropriate use. Therefore, it constitutes an essential research area for the introduction of automated vehicles to road traffic. In this study, we investigated the influence of trust promoting (Trust promoted group) and trust lowering (Trust lowered group) introductory information on reported trust, reliance behavior and take-over performance. Forty participants encountered three situations in a 17-min highway drive in a conditionally automated vehicle (SAE Level 3). Situation 1 and Situation 3 were non-critical situations where a take-over was optional. Situation 2 represented a critical situation where a take-over was necessary to avoid a collision. A non-driving-related task (NDRT) was presented between the situations to record the allocation of visual attention. Participants reporting a higher trust level spent less time looking at the road or instrument cluster and more time looking at the NDRT. The manipulation of introductory information resulted in medium differences in reported trust and influenced participants' reliance behavior. Participants of the Trust promoted group looked less at the road or instrument cluster and more at the NDRT. The odds that participants of the Trust promoted group would overrule the automated driving system in the non-critical situations were 3.65 times (Situation 1) to 5 times (Situation 3) higher. In Situation 2, the Trust promoted group's mean take-over time was extended by 1154 ms and the mean minimum time-to-collision was 933 ms shorter. Six participants from the Trust promoted group, compared to no participants of the Trust lowered group, collided with the obstacle. The results demonstrate that the individual trust level influences how much drivers monitor the environment while performing an NDRT. Introductory information influences this trust level, reliance on an automated driving system, and whether a critical take-over situation can be managed safely.

  18. Designing a machinery control system (MCS) security testbed

    OpenAIRE

    Desso, Nathan H.

    2014-01-01

    Approved for public release; distribution is unlimited Industrial control systems (ICS) face daily cyber security threats, can have a significant impact to the security of our nation, and present a difficult challenge to defend. Critical infrastructures, including military systems like the machinery control systems (MCS) found onboard modern U.S. warships, are affected because of their use of commercial automation solutions. The increase of automated control systems within the U.S. Navy sa...

  19. Automation-aided Task Loads Index based on the Automation Rate Reflecting the Effects on Human Operators in NPPs

    International Nuclear Information System (INIS)

    Lee, Seungmin; Seong, Poonghyun; Kim, Jonghyun

    2013-01-01

    Many researchers have found that a high automation rate does not guarantee high performance. Therefore, to reflect the effects of automation on human performance, a new method of estimating the automation rate that considers the effects of automation on human operators in nuclear power plants (NPPs) was suggested. These measures express how much the automation supports human operators, but they cannot express whether the operators' workload is increased or decreased. Before considering automation rates, it should be estimated in advance whether the adopted automation is beneficial. In this study, to estimate the appropriateness of automation according to the change in the human operators' task loads, an automation-aided task loads index is suggested based on the concept of the suggested automation rate. To ensure plant safety and efficiency on behalf of human operators, various automation systems have been installed in NPPs, and much of the work previously performed by human operators can now be supported by computer-based operator aids. According to the characteristics of the automation types, estimation methods for the system automation rate and the cognitive automation rate were suggested. The proposed estimation method concentrates on the effects of introducing automation, so it directly expresses how much the automated system supports human operators. Based on the suggested automation rates, a way to estimate how much the automated system affects the human operators' cognitive task load is suggested in this study. When there is no automation, the calculated index is 1, meaning there is no change in the human operators' task load.

  20. Photovoltaic Engineering Testbed: A Facility for Space Calibration and Measurement of Solar Cells on the International Space Station

    Science.gov (United States)

    Landis, Geoffrey A.; Bailey, Sheila G.; Jenkins, Phillip; Sexton, J. Andrew; Scheiman, David; Christie, Robert; Charpie, James; Gerber, Scott S.; Johnson, D. Bruce

    2001-01-01

    The Photovoltaic Engineering Testbed ("PET") is a facility to be flown on the International Space Station to perform calibration, measurement, and qualification of solar cells in the space environment and then return the cells to Earth for laboratory use. PET will allow rapid-turnaround testing of new photovoltaic technology under AM0 conditions.

  1. Automation in Warehouse Development

    NARCIS (Netherlands)

    Hamberg, R.; Verriet, J.

    2012-01-01

    The warehouses of the future will come in a variety of forms, but with a few common ingredients. Firstly, human operational handling of items in warehouses is increasingly being replaced by automated item handling. Extended warehouse automation counteracts the scarcity of human operators and

  2. Autonomy and Automation

    Science.gov (United States)

    Shively, Jay

    2017-01-01

    A significant level of debate and confusion has surrounded the meaning of the terms autonomy and automation. Automation is a multi-dimensional concept, and we propose that Remotely Piloted Aircraft Systems (RPAS) automation should be described with reference to the specific system and task that has been automated, the context in which the automation functions, and other relevant dimensions. In this paper, we present definitions of automation, pilot in the loop, pilot on the loop and pilot out of the loop. We further propose that in future, the International Civil Aviation Organization (ICAO) RPAS Panel avoids the use of the terms autonomy and autonomous when referring to automated systems on board RPA. Work Group 7 proposes to develop, in consultation with other workgroups, a taxonomy of Levels of Automation for RPAS.

  3. Visible nulling coronagraph testbed results

    Science.gov (United States)

    Lyon, Richard G.; Clampin, Mark; Woodruff, Robert A.; Vasudevan, Gopal; Thompson, Patrick; Petrone, Peter; Madison, Timothy; Rizzo, Maxime; Melnick, Gary; Tolls, Volker

    2009-08-01

    We report on our recent laboratory results with the NASA/Goddard Space Flight Center (GSFC) Visible Nulling Coronagraph (VNC) testbed. We have experimentally achieved focal plane contrasts of 1 x 10^8 and approaching 10^9 at inner working angles of 2 λ/D and 4 λ/D, respectively, where D is the aperture diameter. The result was obtained using a broadband source with a narrowband spectral filter of width 10 nm centered on 630 nm. To date this is the deepest nulling result yet obtained with a visible nulling coronagraph. Also developed is a Null Control Breadboard (NCB) to assess and quantify MEMS-based segmented deformable mirror technology and to develop and assess closed-loop null sensing and control algorithm performance from both the pupil and focal planes. We have demonstrated closed-loop control at 27 Hz in the laboratory environment. Efforts are underway to bring the contrast first to > 10^9, necessary for the direct detection and characterization of jovian (Jupiter-like) exosolar planets, and then to > 10^10, necessary for terrestrial (Earth-like) ones. Short-term advancements are expected to broaden the spectral passband from 10 nm to 100 nm and to increase both the long-term stability to > 2 hours and the extent of the null out to ~ 10 λ/D via the use of MEMS-based segmented deformable mirror technology, a coherent fiber bundle, and achromatic phase shifters, all in a vacuum chamber at the GSFC VNC facility. Additionally, an extremely stable, textbook-sized compact VNC is under development.

  4. Filter-Adapted Fluorescent In Situ Hybridization (FA-FISH) for Filtration-Enriched Circulating Tumor Cells.

    Science.gov (United States)

    Oulhen, Marianne; Pailler, Emma; Faugeroux, Vincent; Farace, Françoise

    2017-01-01

    Circulating tumor cells (CTCs) may represent an easily accessible source of tumor material to assess genetic aberrations such as gene rearrangements or gene amplifications and to screen cancer patients eligible for targeted therapies. As the number of CTCs is a critical parameter for identifying such biomarkers, we developed fluorescent in situ hybridization (FISH) for CTCs enriched on filters (filter-adapted-FISH, FA-FISH). Here, we describe the FA-FISH protocol, the combination of immunofluorescent staining (DAPI/CD45) and FA-FISH techniques, as well as the semi-automated microscopy method that we developed to improve the feasibility and reliability of FISH analyses in filtration-enriched CTCs.

  5. Automation-aided Task Loads Index based on the Automation Rate Reflecting the Effects on Human Operators in NPPs

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seungmin; Seong, Poonghyun [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of); Kim, Jonghyun [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of)

    2013-05-15

    Many researchers have found that a high automation rate does not guarantee high performance. Therefore, to reflect the effects of automation on human performance, a new method of estimating the automation rate that considers the effects of automation on human operators in nuclear power plants (NPPs) was suggested. These measures express how much the automation supports human operators, but they cannot express whether the operators' workload is increased or decreased. Before considering automation rates, it should be estimated in advance whether the adopted automation is beneficial. In this study, to estimate the appropriateness of automation according to the change in the human operators' task loads, an automation-aided task loads index is suggested based on the concept of the suggested automation rate. To ensure plant safety and efficiency on behalf of human operators, various automation systems have been installed in NPPs, and much of the work previously performed by human operators can now be supported by computer-based operator aids. According to the characteristics of the automation types, estimation methods for the system automation rate and the cognitive automation rate were suggested. The proposed estimation method concentrates on the effects of introducing automation, so it directly expresses how much the automated system supports human operators. Based on the suggested automation rates, a way to estimate how much the automated system affects the human operators' cognitive task load is suggested in this study. When there is no automation, the calculated index is 1, meaning there is no change in the human operators' task load.
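
    One simple, purely illustrative way to realize a ratio-style index with that property is sketched below; the abstract does not give the authors' actual formula, so the function and its inputs here are assumptions, not the proposed method.

```python
# Purely illustrative sketch (not the authors' formula, which the abstract does
# not give): a ratio-style index comparing operator task load with and without a
# given automation design, so that a value of 1 means the task load is unchanged.
def task_load_index(task_load_with_automation, task_load_without_automation):
    """< 1: the automation off-loads the operators; > 1: it adds task load."""
    return task_load_with_automation / task_load_without_automation

print(task_load_index(8.0, 8.0))  # 1.0   -> no automation, or no change in load
print(task_load_index(5.0, 8.0))  # 0.625 -> automation reduces operator load
```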

  6. Development of Additive Construction Technologies for Application to Development of Lunar/Martian Surface Structures Using In-Situ Materials

    Science.gov (United States)

    Werkheiser, Niki J.; Fiske, Michael R.; Edmunson, Jennifer E.; Khoshnevis, Berokh

    2015-01-01

    For long-duration missions on other planetary bodies, the use of in situ materials will become increasingly critical. As human presence on these bodies expands, so must the breadth of the structures required to accommodate them including habitats, laboratories, berms, radiation shielding for natural radiation and surface reactors, garages, solar storm shelters, greenhouses, etc. Planetary surface structure manufacturing and assembly technologies that incorporate in situ resources provide options for autonomous, affordable, pre-positioned environments with radiation shielding features and protection from micrometeorites, exhaust plume debris, and other hazards. The ability to use in-situ materials to construct these structures will provide a benefit in the reduction of up-mass that would otherwise make long-term Moon or Mars structures cost prohibitive. The ability to fabricate structures in situ brings with it the ability to repair these structures, which allows for the self-sufficiency and sustainability necessary for long-duration habitation. Previously, under the auspices of the MSFC In-Situ Fabrication and Repair (ISFR) project and more recently, under the jointly-managed MSFC/KSC Additive Construction with Mobile Emplacement (ACME) project, the MSFC Surface Structures Group has been developing materials and construction technologies to support future planetary habitats with in-situ resources. One such additive construction technology is known as Contour Crafting. This paper presents the results to date of these efforts, including development of novel nozzle concepts for advanced layer deposition using this process. Conceived initially for rapid development of cementitious structures on Earth, it also lends itself exceptionally well to the automated fabrication of planetary surface structures using minimally processed regolith as aggregate, and binders developed from in situ materials as well. This process has been used successfully in the fabrication of

  7. Continuous Water Vapor Profiles from Operational Ground-Based Active and Passive Remote Sensors

    Science.gov (United States)

    Turner, D. D.; Feltz, W. F.; Ferrare, R. A.

    2000-01-01

    The Atmospheric Radiation Measurement program's Southern Great Plains Cloud and Radiation Testbed site central facility near Lamont, Oklahoma, offers unique operational water vapor profiling capabilities, including active and passive remote sensors as well as traditional in situ radiosonde measurements. Remote sensing technologies include an automated Raman lidar and an automated Atmospheric Emitted Radiance Interferometer (AERI), which are able to retrieve water vapor profiles operationally through the lower troposphere throughout the diurnal cycle. Comparisons of these two water vapor remote sensing methods to each other and to radiosondes over an 8-month period are presented and discussed, highlighting the accuracy and limitations of each method. Additionally, the AERI is able to retrieve profiles of temperature while the Raman lidar is able to retrieve aerosol extinction profiles operationally. These data, coupled with hourly wind profiles from a 915-MHz wind profiler, provide complete specification of the state of the atmosphere in noncloudy skies. Several case studies illustrate the utility of these high temporal resolution measurements in the characterization of mesoscale features within a 3-day time period in which passage of a dryline, warm air advection, and cold front occurred.

  8. Laboratory automation and LIMS in forensics

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Hansen, Anders Johannes; Morling, Niels

    2013-01-01

    Implementation of laboratory automation and LIMS in a forensic laboratory enables the laboratory to standardize sample processing. Automated liquid handlers can increase throughput and eliminate manual repetitive pipetting operations, known to result in occupational injuries to the technical staff. Furthermore, implementation of automated liquid handlers reduces the risk of sample misplacement. A LIMS can efficiently control the sample flow through the laboratory and manage the results of the conducted tests for each sample. Integration of automated liquid handlers with a LIMS provides the laboratory with the tools required for setting up automated production lines of complex laboratory processes and monitoring the whole process and the results. Combined, this enables processing of a large number of samples. Selection of the best automated solution for an individual laboratory should be based on user...

  9. Determination of the Optimized Automation Rate considering Effects of Automation on Human Operators in Nuclear Power Plants

    International Nuclear Information System (INIS)

    Lee, Seung Min; Seong, Poong Hyun; Kim, Jong Hyun; Kim, Man Cheol

    2015-01-01

    Automation refers to the use of a device or a system to perform a function previously performed by a human operator. It is introduced to reduce human errors and to enhance performance in various industrial fields, including the nuclear industry. However, these positive effects are not always achieved in complex systems such as nuclear power plants (NPPs). An excessive introduction of automation can generate new roles for human operators and change activities in unexpected ways. As more automation systems are accepted, the ability of human operators to detect automation failures and resume manual control is diminished. This disadvantage of automation is called the Out-of-the-Loop (OOTL) problem. We should consider the positive and negative effects of automation at the same time to determine the appropriate level of the introduction of automation. Thus, in this paper, we suggest an estimation method that considers the positive and negative effects of automation at the same time to determine the appropriate introduction of automation. This concept is limited in that it does not consider the effects of automation on human operators. Thus, a new estimation method for the automation rate was suggested to overcome this problem.

  10. Individual differences in the calibration of trust in automation.

    Science.gov (United States)

    Pop, Vlad L; Shrewsbury, Alex; Durso, Francis T

    2015-06-01

    The objective was to determine whether operators with an expectancy that automation is trustworthy are better at calibrating their trust to changes in the capabilities of automation, and if so, why. Studies suggest that individual differences in automation expectancy may be able to account for why changes in the capabilities of automation lead to a substantial change in trust for some, yet only a small change for others. In a baggage screening task, 225 participants searched for weapons in 200 X-ray images of luggage. Participants were assisted by an automated decision aid exhibiting different levels of reliability. Measures of expectancy that automation is trustworthy were used in conjunction with subjective measures of trust and perceived reliability to identify individual differences in trust calibration. Operators with high expectancy that automation is trustworthy were more sensitive to changes (both increases and decreases) in automation reliability. This difference was eliminated by manipulating the causal attribution of automation errors. Attributing the cause of automation errors to factors external to the automation fosters an understanding of tasks and situations in which automation differs in reliability and may lead to more appropriate trust. The development of interventions can lead to calibrated trust in automation. © 2014, Human Factors and Ergonomics Society.

  11. Effective Application of the Methanol-Based PreservCyt (TM) Fixative and the Cellient (TM) Automated Cell Block Processor to Diagnostic Cytopathology, Immunocytochemistry, and Molecular Biology

    NARCIS (Netherlands)

    van Hemel, Bettien M.; Suurmeijer, Albert J. H.

    We studied the feasibility of immunocytochemistry (ICC), in situ hybridization (ISH), and polymerase chain reaction (PCR) after Cellient automated cell block processing, and tested whether methanol-based PreservCyt fixation could replace formalin fixation, in an attempt to eliminate toxic formaldehyde from the laboratory.

  12. Experience of automation failures in training: effects on trust, automation bias, complacency and performance.

    Science.gov (United States)

    Sauer, Juergen; Chavaillaz, Alain; Wastell, David

    2016-06-01

    This work examined the effects of operators' exposure to various types of automation failures in training. Forty-five participants were trained for 3.5 h on a simulated process control environment. During training, participants either experienced a fully reliable, automatic fault repair facility (i.e. faults detected and correctly diagnosed), a misdiagnosis-prone one (i.e. faults detected but not correctly diagnosed) or a miss-prone one (i.e. faults not detected). One week after training, participants were tested for 3 h, experiencing two types of automation failures (misdiagnosis, miss). The results showed that automation bias was very high when operators trained on miss-prone automation encountered a failure of the diagnostic system. Operator errors resulting from automation bias were much higher when automation misdiagnosed a fault than when it missed one. Differences in trust levels that were instilled by the different training experiences disappeared during the testing session. Practitioner Summary: The experience of automation failures during training has some consequences. A greater potential for operator errors may be expected when an automatic system failed to diagnose a fault than when it failed to detect one.

  13. Toward biotechnology in space: High-throughput instruments for in situ biological research beyond Earth.

    Science.gov (United States)

    Karouia, Fathi; Peyvan, Kianoosh; Pohorille, Andrew

    2017-11-15

    Space biotechnology is a nascent field aimed at applying tools of modern biology to advance our goals in space exploration. These advances rely on our ability to exploit in situ high throughput techniques for amplification and sequencing DNA, and measuring levels of RNA transcripts, proteins and metabolites in a cell. These techniques, collectively known as "omics" techniques have already revolutionized terrestrial biology. A number of on-going efforts are aimed at developing instruments to carry out "omics" research in space, in particular on board the International Space Station and small satellites. For space applications these instruments require substantial and creative reengineering that includes automation, miniaturization and ensuring that the device is resistant to conditions in space and works independently of the direction of the gravity vector. Different paths taken to meet these requirements for different "omics" instruments are the subjects of this review. The advantages and disadvantages of these instruments and technological solutions and their level of readiness for deployment in space are discussed. Considering that effects of space environments on terrestrial organisms appear to be global, it is argued that high throughput instruments are essential to advance (1) biomedical and physiological studies to control and reduce space-related stressors on living systems, (2) application of biology to life support and in situ resource utilization, (3) planetary protection, and (4) basic research about the limits on life in space. It is also argued that carrying out measurements in situ provides considerable advantages over the traditional space biology paradigm that relies on post-flight data analysis. Published by Elsevier Inc.

  14. Four Models of In Situ Simulation

    DEFF Research Database (Denmark)

    Musaeus, Peter; Krogh, Kristian; Paltved, Charlotte

    2014-01-01

    Introduction: In situ simulation is characterized by being situated in the clinical environment as opposed to the simulation laboratory, but in situ simulation bears a family resemblance to other types of on-the-job training. We explore a typology of in situ simulation and suggest that there are four fruitful approaches to in situ simulation: (1) in situ simulation informed by reported critical incidents and adverse events from emergency departments (ED) in which team training is about to be conducted to write scenarios; (2) in situ simulation through ethnographic studies at the ED; (3) using ... the following processes: transition processes, action processes and interpersonal processes. Design and purpose: This abstract suggests four approaches to in situ simulation. A pilot study will evaluate the different approaches in two emergency departments in the Central Region of Denmark. Methods: The typology ...

  15. Myths in test automation

    Directory of Open Access Journals (Sweden)

    Jazmine Francis

    2014-12-01

    Myths in the automation of software testing are an issue of discussion that echoes throughout the software validation industry. Probably the first thought that occurs to a knowledgeable reader is: why this old topic again? What is new to discuss on the matter? Yet everyone agrees that automation testing today is undoubtedly not what it used to be ten or fifteen years ago, because it has evolved in scope and magnitude. What began as simple linear scripts for web applications today has a complex architecture and a hybrid framework to facilitate the implementation of tests for applications developed with various platforms and technologies. Undoubtedly automation has advanced, but so have the myths associated with it. The change in people's perspective on, and knowledge of, automation has altered the terrain. This article reflects the author's points of view and experience regarding how the original myths have been transformed into new versions and how they are derived; it also provides his thoughts on the new generation of myths.

  16. Myths in test automation

    Directory of Open Access Journals (Sweden)

    Jazmine Francis

    2015-01-01

    Myths in the automation of software testing are an issue of discussion that echoes throughout the software validation industry. Probably the first thought that occurs to a knowledgeable reader is: why this old topic again? What is new to discuss on the matter? Yet everyone agrees that automation testing today is undoubtedly not what it used to be ten or fifteen years ago, because it has evolved in scope and magnitude. What began as simple linear scripts for web applications today has a complex architecture and a hybrid framework to facilitate the implementation of tests for applications developed with various platforms and technologies. Undoubtedly automation has advanced, but so have the myths associated with it. The change in people's perspective on, and knowledge of, automation has altered the terrain. This article reflects the author's points of view and experience regarding how the original myths have been transformed into new versions and how they are derived; it also provides his thoughts on the new generation of myths.

  17. Home Automation

    OpenAIRE

    Ahmed, Zeeshan

    2010-01-01

    In this paper I briefly discuss the importance of home automation systems. Going into the details, I present a real-time, software- and hardware-oriented house automation research project that was designed and implemented, capable of automating a house's electricity and providing a security system that detects the presence of unexpected behavior.

  18. Development of a Dynamic Web Mapping Service for Vegetation Productivity Using Earth Observation and in situ Sensors in a Sensor Web Based Approach

    Directory of Open Access Journals (Sweden)

    Sytze de Bruin

    2009-03-01

    This paper describes the development of a sensor web based approach which combines earth observation and in situ sensor data to derive typical information offered by a dynamic web mapping service (WMS). A prototype has been developed which provides daily maps of vegetation productivity for the Netherlands with a spatial resolution of 250 m. Daily available MODIS surface reflectance products and meteorological parameters obtained through a Sensor Observation Service (SOS) were used as input for a vegetation productivity model. This paper presents the vegetation productivity model, the sensor data sources and the implementation of the automated processing facility. Finally, an evaluation is made of the opportunities and limitations of sensor web based approaches for the development of web services which combine both satellite and in situ sensor sources.
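
    As a minimal sketch of the kind of per-grid-cell calculation such a service might run, the following combines a satellite-derived fAPAR grid with in situ meteorology in a light-use-efficiency form; the model, the parameter values and the inputs are illustrative assumptions, not the paper's actual productivity model.

```python
# Hypothetical light-use-efficiency sketch: combine a satellite-derived fAPAR grid
# with in situ meteorology (e.g. from an SOS feed) to map daily gross primary
# production. The model form and all parameter values are illustrative only.
import numpy as np

def daily_gpp(fapar, par_mj_m2, temperature_c, eps_max=1.8):
    """GPP (g C m-2 d-1) = eps_max * temperature scalar * fAPAR * PAR."""
    t_scalar = np.clip(temperature_c / 20.0, 0.0, 1.0)  # crude temperature ramp
    return eps_max * t_scalar * fapar * par_mj_m2

# Made-up fAPAR grid (stand-in for a 250 m satellite product) and one day of
# single-station meteorology.
fapar_grid = np.random.default_rng(0).uniform(0.1, 0.9, size=(4, 4))
gpp_grid = daily_gpp(fapar_grid, par_mj_m2=9.5, temperature_c=14.0)
print(np.round(gpp_grid, 2))
```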

  19. Fast-FISH Detection and Semi-Automated Image Analysis of Numerical Chromosome Aberrations in Hematological Malignancies

    Directory of Open Access Journals (Sweden)

    Arif Esa

    1998-01-01

    A new fluorescence in situ hybridization (FISH) technique called Fast-FISH in combination with semi-automated image analysis was applied to detect numerical aberrations of chromosomes 8 and 12 in interphase nuclei of peripheral blood lymphocytes and bone marrow cells from patients with acute myelogenous leukemia (AML) and chronic lymphocytic leukemia (CLL). Commercially available α-satellite DNA probes specific for the centromere regions of chromosome 8 and chromosome 12, respectively, were used. After application of the Fast-FISH protocol, the microscopic images of the fluorescence-labelled cell nuclei were recorded by the true color CCD camera Kappa CF 15 MC and evaluated quantitatively by computer analysis on a PC. These results were compared to results obtained from the same type of specimens using the same analysis system but with a standard FISH protocol. In addition, automated spot counting after both FISH techniques was compared to visual spot counting after standard FISH. A total number of about 3,000 cell nuclei was evaluated. For quantitative brightness parameters, a good correlation between standard FISH labelling and Fast-FISH was found. Automated spot counting after Fast-FISH coincided within a few percent with automated and visual spot counting after standard FISH. The examples shown indicate the reliability and reproducibility of Fast-FISH and its potential for automated interphase cell diagnostics of numerical chromosome aberrations. Since the Fast-FISH technique requires a hybridization time as low as 1/20 of that of established standard FISH techniques, omitting most of the time-consuming working steps in the protocol, it may contribute considerably to clinical diagnostics. This may be especially interesting in cases where an accurate result is required within a few hours.
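
    The sketch below is a toy version of automated spot counting of the kind described: it thresholds a fluorescence channel and counts connected bright regions inside a nucleus mask. The synthetic image, the mask and the fixed threshold are assumptions for illustration, not the evaluation pipeline used in the study.

```python
# Toy sketch of automated FISH spot counting: threshold one fluorescence channel
# and count connected bright regions inside a nucleus mask. The synthetic image
# and the fixed threshold are illustrative only.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(1)
image = rng.normal(10.0, 2.0, size=(64, 64))            # background noise
for y, x in [(20, 20), (20, 40), (45, 30)]:             # three synthetic spots
    image[y - 1:y + 2, x - 1:x + 2] += 40.0

nucleus_mask = np.ones_like(image, dtype=bool)          # whole field as one "nucleus"
spots = (image > 25.0) & nucleus_mask                   # fixed intensity threshold
_, n_spots = ndimage.label(spots)
print(f"spots counted in nucleus: {n_spots}")           # expected: 3
```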

  20. Development of an Autonomous Navigation Technology Test Vehicle

    National Research Council Canada - National Science Library

    Tobler, Chad K

    2004-01-01

    .... In order to continue these research activities at CIMAR, a new Kawasaki Mule All-Terrain Vehicle was chosen to be automated as a test-bed for the purpose of developing and testing autonomous vehicle technologies...

  1. Simulation to Flight Test for a UAV Controls Testbed

    Science.gov (United States)

    Motter, Mark A.; Logan, Michael J.; French, Michael L.; Guerreiro, Nelson M.

    2006-01-01

    The NASA Flying Controls Testbed (FLiC) is a relatively small and inexpensive unmanned aerial vehicle developed specifically to test highly experimental flight control approaches. The most recent version of the FLiC is configured with 16 independent aileron segments, supports the implementation of C-coded experimental controllers, and is capable of fully autonomous flight from takeoff roll to landing, including flight test maneuvers. The test vehicle is basically a modified Army target drone, AN/FQM-117B, developed as part of a collaboration between the Aviation Applied Technology Directorate (AATD) at Fort Eustis, Virginia and NASA Langley Research Center. Several vehicles have been constructed and collectively have flown over 600 successful test flights, including a fully autonomous demonstration at the Association of Unmanned Vehicle Systems International (AUVSI) UAV Demo 2005. Simulations based on wind tunnel data are being used to further develop advanced controllers for implementation and flight test.

  2. Carrier Plus: A sensor payload for Living With a Star Space Environment Testbed (LWS/SET)

    Science.gov (United States)

    Marshall, Cheryl J.; Moss, Steven; Howard, Regan; LaBel, Kenneth A.; Grycewicz, Tom; Barth, Janet L.; Brewer, Dana

    2003-01-01

    The Defense Threat Reduction Agency (DTRA) and National Aeronautics and Space Administration (NASA) Goddard Space Flight Center are collaborating to develop the Carrier Plus sensor experiment platform as a capability of the Space Environment Testbed (SET). The Space Environment Testbed (SET) provides flight opportunities for technology experiments as part of NASA's Living With a Star (LWS) program. The Carrier Plus will provide new capability to characterize sensor technologies such as state-of-the-art visible focal plane arrays (FPAs) in a natural space radiation environment. The technical objectives include on-orbit validation of recently developed FPA technologies and performance prediction methodologies, as well as characterization of the FPA radiation response to total ionizing dose damage, displacement damage and transients. It is expected that the sensor experiment will carry 4-6 FPAs and associated radiation correlative environment monitors (CEMs) for a 2006-2007 launch. Sensor technology candidates may include n- and p-charge coupled devices (CCDs), active pixel sensors (APS), and hybrid CMOS arrays. The presentation will describe the Carrier Plus goals and objectives, as well as provide details about the architecture and design. More information on the LWS program can be found at http://lws.gsfc.nasa.gov/. Business announcements for LWS/SET and program briefings are posted at http://lws-set.gsfc.nasa.gov

  3. Contemporary management of ductal carcinoma in situ and lobular carcinoma in situ.

    Science.gov (United States)

    Obeng-Gyasi, Samilia; Ong, Cecilia; Hwang, E Shelley

    2016-06-01

    The management of the in situ lesions ductal carcinoma in situ (DCIS) and lobular carcinoma in situ (LCIS) continues to evolve. These diagnoses now comprise a large burden of mammographically diagnosed cancers, and with a global trend towards more population-based screening, the incidence of these lesions will continue to rise. Because outcomes following treatment for DCIS and LCIS are excellent, there is emerging controversy about what extent of treatment is optimal for both diseases. Here we review the current approaches to the diagnosis and treatment of both DCIS and LCIS. In addition, we will consider potential directions for the future management of these lesions.

  4. Process automation

    International Nuclear Information System (INIS)

    Moser, D.R.

    1986-01-01

    Process automation technology has been pursued in the chemical processing industries and to a very limited extent in nuclear fuel reprocessing. Its effective use has been restricted in the past by the lack of diverse and reliable process instrumentation and the unavailability of sophisticated software designed for process control. The Integrated Equipment Test (IET) facility was developed by the Consolidated Fuel Reprocessing Program (CFRP) in part to demonstrate new concepts for control of advanced nuclear fuel reprocessing plants. A demonstration of fuel reprocessing equipment automation using advanced instrumentation and a modern, microprocessor-based control system is nearing completion in the facility. This facility provides for the synergistic testing of all chemical process features of a prototypical fuel reprocessing plant that can be attained with unirradiated uranium-bearing feed materials. The unique equipment and mission of the IET facility make it an ideal test bed for automation studies. This effort will provide for the demonstration of the plant automation concept and for the development of techniques for similar applications in a full-scale plant. A set of preliminary recommendations for implementing process automation has been compiled. Some of these concepts are not generally recognized or accepted. The automation work now under way in the IET facility should be useful to others in helping avoid costly mistakes because of the underutilization or misapplication of process automation. 6 figs

  5. Low cost automation

    International Nuclear Information System (INIS)

    1987-03-01

    This book describes methods for building an automation plan and designing automation facilities. It covers the automation of cutting and chip processes (basics of cutting, NC processing machines and chip handling), automation units such as drilling units, tapping units, boring units, milling units and slide units, the application of oil pressure (hydraulics), including its characteristics and basic hydraulic circuits, the application of pneumatics, and the kinds of automation and their application to processes such as assembly, transportation, automatic machines and factory automation.

  6. Time Distribution Using SpaceWire in the SCaN Testbed on ISS

    Science.gov (United States)

    Lux, James P.

    2012-01-01

    A paper describes an approach for timekeeping and time transfer among the devices on the CoNNeCT project's SCaN Testbed. It also describes how the clocks may be synchronized with an external time reference; e.g., time tags from the International Space Station (ISS) or RF signals received by a radio (TDRSS time service or GPS). All the units have some sort of counter that is fed by an oscillator at some convenient frequency. The basic problem in timekeeping is relating the counter value to some external time standard such as UTC. With SpaceWire, there are two approaches possible: one is to just use SpaceWire to send a message, and use an external wire for the sync signal. This is much the same as with the RS-232 messages and 1 pps line from a GPS receiver. However, SpaceWire has an additional capability that was added to make it easier - it can insert and receive a special "timecode" word in the data stream.
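
    The following is a hypothetical sketch of the bookkeeping described above: relating a free-running counter to UTC once a sync event (a timecode word or a 1 pps pulse) has been latched against a known time. The 10 MHz tick rate, the class interface and the numbers are assumptions, not details from the paper.

```python
# Hypothetical sketch: relate a free-running local counter to UTC using a sync
# event (e.g. a SpaceWire timecode or a 1 pps pulse). Tick rate and interface
# are assumed for illustration.
from datetime import datetime, timedelta, timezone

class CounterClock:
    def __init__(self, ticks_per_second):
        self.ticks_per_second = ticks_per_second
        self.sync_ticks = None
        self.sync_utc = None

    def on_sync(self, counter_ticks, utc_time):
        """Record the counter value latched at a known UTC instant."""
        self.sync_ticks = counter_ticks
        self.sync_utc = utc_time

    def to_utc(self, counter_ticks):
        """Convert a later counter reading to UTC by extrapolating from the sync."""
        elapsed = (counter_ticks - self.sync_ticks) / self.ticks_per_second
        return self.sync_utc + timedelta(seconds=elapsed)

clock = CounterClock(ticks_per_second=10_000_000)
clock.on_sync(1_234_567, datetime(2012, 1, 1, tzinfo=timezone.utc))
print(clock.to_utc(1_234_567 + 25_000_000))   # 2.5 s after the sync event
```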

  7. LIBRARY AUTOMATION IN NIGERAN UNIVERSITIES

    African Journals Online (AJOL)

    facilitate services and access to information in libraries is widely acceptable. ... Moreover, Ugah (2001) reports that the automation process at the Abubakar ... blueprint in 1987 and a turn-key system of automation was suggested for the library.

  8. Context-Aware user interfaces in automation

    DEFF Research Database (Denmark)

    Olsen, Mikkel Holm

    2007-01-01

    Automation is deployed in a great range of different domains such as the chemical industry, the production of consumer goods, the production of energy (both in terms of power plants and in the petrochemical industry), transportation and several others. Through several decades the complexity...... of automation systems and the level of automation have been rising. This has caused problems regarding the operator's ability to comprehend the overall situation and state of the automation system, in particular in abnormal situations. The amount of data available to the operator results in information overload....... Since context-aware applications have been developed in other research areas it seems natural to analyze the findings of this research and examine how this can be applied to the domain of automation systems. By evaluating existing architectures for the development of context-aware applications we find...

  9. In-Situ Simulation

    DEFF Research Database (Denmark)

    Bjerregaard, Anders Thais; Slot, Susanne; Paltved, Charlotte

    2015-01-01

    Introduction: In situ simulation offers on-site training to healthcare professionals. It refers to a training strategy where simulation technology is integrated into the clinical encounter. Training in the simulation laboratory does not easily tap into situational resources, e.g. individual, team, and organisational characteristics. Therefore, it might fail to fully mimic real clinical team processes. Though research on in situ simulation in healthcare is in its infancy, literature is abundant on patient safety and team training. Patient safety reporting systems that identify risks to patients can improve patient safety if coupled with training and organisational support. This study explored the use of critical incidents and adverse events reports for in situ simulation, and short-term observations were used to create learning objectives and training scenarios. Method: This study used an interventional case ...

  10. Noise canceling in-situ detection

    Science.gov (United States)

    Walsh, David O.

    2014-08-26

    Technologies applicable to noise canceling in-situ NMR detection and imaging are disclosed. An example noise canceling in-situ NMR detection apparatus may comprise one or more of a static magnetic field generator, an alternating magnetic field generator, an in-situ NMR detection device, an auxiliary noise detection device, and a computer.
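
    The patent abstract mentions an auxiliary noise detection device; one generic way to use such a reference channel (not necessarily the patented method) is to subtract its least-squares projection from the primary channel, as in the sketch below with entirely synthetic signals.

```python
# Generic reference-channel noise cancellation sketch (not necessarily the
# patented method): remove the component of the primary signal that is
# correlated with an auxiliary, noise-only channel via a least-squares fit.
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0.0, 1.0, 2000)
signal = 0.2 * np.sin(2 * np.pi * 50.0 * t) * np.exp(-3.0 * t)   # toy decaying signal
noise = rng.normal(size=t.size)                                   # shared interference
primary = signal + 0.8 * noise
reference = noise + 0.05 * rng.normal(size=t.size)                # noise-only sensor

coupling = np.dot(reference, primary) / np.dot(reference, reference)
cleaned = primary - coupling * reference
print(f"residual noise power: {np.var(cleaned - signal):.4f} "
      f"(before cancellation: {np.var(primary - signal):.4f})")
```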

  11. In-situ uranium leaching

    International Nuclear Information System (INIS)

    Dotson, B.J.

    1986-01-01

    This invention provides a method for improving the recovery of mineral values from ore bodies subjected to in-situ leaching by controlling the flow behaviour of the leaching solution. In particular, the invention relates to an in-situ leaching operation employing a foam for mobility control of the leaching solution. A foam bank is either introduced into the ore bed or developed in-situ in the ore bed. The foam then becomes a diverting agent forcing the leaching fluid through the previously non-contacted regions of the deposit

  12. Space station as a vital focus for advancing the technologies of automation and robotics

    Science.gov (United States)

    Varsi, Giulio; Herman, Daniel H.

    1988-01-01

    A major guideline for the design of the U.S. Space Station is that the Space Station address a wide variety of functions. These functions include the servicing of unmanned assets in space, the support of commercial labs in space, and the efficient management of the Space Station itself, the largest space asset. The technologies of Automation and Robotics promise to help reduce Space Station operating costs and to achieve a highly efficient use of the human in space. The use of advanced automation and artificial intelligence techniques, such as expert systems, in Space Station subsystems for activity planning and failure mode management will enable us to reduce dependency on a mission control center and could ultimately result in breaking the umbilical link from Earth to the Space Station. The application of robotic technologies with advanced perception capability and hierarchical intelligent control to servicing systems will enable the servicing of assets either in space or in situ with a high degree of human efficiency. The results of studies leading toward the formulation of an automation and robotics plan for Space Station development are presented.

  13. System reliability, performance and trust in adaptable automation.

    Science.gov (United States)

    Chavaillaz, Alain; Wastell, David; Sauer, Jürgen

    2016-01-01

    The present study examined the effects of reduced system reliability on operator performance and automation management in an adaptable automation environment. 39 operators were randomly assigned to one of three experimental groups: low (60%), medium (80%), and high (100%) reliability of automation support. The support system provided five incremental levels of automation which operators could freely select according to their needs. After 3 h of training on a simulated process control task (AutoCAMS) in which the automation worked infallibly, operator performance and automation management were measured during a 2.5-h testing session. Trust and workload were also assessed through questionnaires. Results showed that although reduced system reliability resulted in lower levels of trust towards automation, there were no corresponding differences in the operators' reliance on automation. While operators showed overall a noteworthy ability to cope with automation failure, there were, however, decrements in diagnostic speed and prospective memory with lower reliability. Copyright © 2015. Published by Elsevier Ltd.

  14. Programmable automation systems in PSA

    International Nuclear Information System (INIS)

    Pulkkinen, U.

    1997-06-01

    The Finnish safety authority (STUK) requires plant-specific PSAs, and quantitative safety goals are set on different levels. The reliability analysis is more problematic when critical safety functions are realized by applying programmable automation systems. Conventional modeling techniques do not necessarily apply to the analysis of these systems, and the quantification seems to be impossible. However, it is important to analyze the contribution of programmable automation systems to plant safety, and PSA is the only method with a system-analytical view of safety. This report discusses the applicability of PSA methodology (fault tree analyses, failure modes and effects analyses) in the analysis of programmable automation systems. The problem of how to decompose programmable automation systems for reliability modeling purposes is discussed. In addition to the qualitative analysis and structural reliability modeling issues, the possibility of evaluating failure probabilities of programmable automation systems is considered. One solution to the quantification issue is the use of expert judgements, and the principles for applying expert judgements are discussed in the paper. A framework to apply expert judgements is outlined. Further, the impacts of subjective estimates on the interpretation of PSA results are discussed. (orig.) (13 refs.)
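
    A minimal example of the kind of fault-tree quantification mentioned in the report is sketched below: a top-event probability computed from basic-event probabilities through OR/AND gates, assuming independent events. All numbers and the system decomposition are invented.

```python
# Minimal fault-tree quantification sketch: a top-event probability computed from
# independent basic events through OR/AND gates. All probabilities are invented.
def p_and(*probs):
    out = 1.0
    for p in probs:
        out *= p
    return out

def p_or(*probs):
    out = 1.0
    for p in probs:
        out *= (1.0 - p)
    return 1.0 - out

software_fault = 1e-4                               # assumed per-demand probabilities
hardware_fault = 5e-5
channel_a = p_or(software_fault, hardware_fault)    # either fault fails a channel
channel_b = channel_a                               # identical redundant channel
top_event = p_and(channel_a, channel_b)             # both channels must fail
print(f"top-event probability: {top_event:.2e}")
```

    The independence assumption behind the AND gate is exactly what is questionable for identical software channels, which is one reason quantification of programmable automation tends to fall back on expert judgement.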

  15. Voltammetric, in-situ spectroelectrochemical and in-situ electrocolorimetric characterization of phthalocyanines

    Energy Technology Data Exchange (ETDEWEB)

    Koca, Atif [Department of Chemical Engineering, Faculty of Engineering, Marmara University, Goeztepe, 34722 Istanbul (Turkey)], E-mail: akoca@eng.marmara.edu.tr; Bayar, Serife; Dincer, Hatice A. [Department of Chemistry, Technical University of Istanbul, Maslak, 34469 Istanbul (Turkey); Gonca, Erguen [Department of Chemistry, Fatih University, TR34500 B.Cekmece, Istanbul (Turkey)

    2009-04-01

    In this work, electrochemical and in-situ spectroelectrochemical characterization of the metallophthalocyanines bearing tetra-(1,1-(dicarbethoxy)-2-(2-methylbenzyl))-ethyl 3,10,17,24-tetrachloro groups was performed. Voltammetric and in-situ spectroelectrochemical measurements show that while the cobalt phthalocyanine complex gives both metal-based and ring-based redox processes, the zinc and copper phthalocyanines show only ring-based reduction and oxidation processes. The redox processes are generally diffusion-controlled, reversible, one-electron transfer processes. In contrast, the lead phthalocyanine demetallized during the second oxidation reaction, while it was stable during the reduction processes. An in-situ electrocolorimetric method, based on the 1931 CIE (Commission Internationale de l'Eclairage) system of colorimetry, has been applied for the first time in this study to investigate the color of the electro-generated anionic and cationic forms of the complexes.
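
    As a crude illustration of the CIE 1931 colorimetric step, the sketch below integrates a transmittance spectrum against rough Gaussian stand-ins for the colour-matching functions to obtain chromaticity coordinates; real electrocolorimetry would use the tabulated CIE data and the measured spectra, so everything here is an assumption for illustration.

```python
# Crude sketch of the in-situ electrocolorimetric step: integrate a transmittance
# spectrum against rough Gaussian stand-ins for the CIE 1931 colour-matching
# functions to obtain chromaticity coordinates (x, y). Illustrative only.
import numpy as np

wl = np.arange(380.0, 781.0, 1.0)                       # wavelength grid, nm

def gauss(center, width):
    return np.exp(-0.5 * ((wl - center) / width) ** 2)

xbar = 1.06 * gauss(600, 38) + 0.36 * gauss(442, 22)    # rough CMF approximations
ybar = 1.00 * gauss(556, 45)
zbar = 1.78 * gauss(449, 26)

transmittance = 1.0 - 0.8 * gauss(670, 25)              # toy phthalocyanine Q-band dip
X, Y, Z = (np.trapz(cmf * transmittance, wl) for cmf in (xbar, ybar, zbar))
x, y = X / (X + Y + Z), Y / (X + Y + Z)
print(f"chromaticity: x = {x:.3f}, y = {y:.3f}")
```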

  16. Integrated Systems Health Management for Sustainable Habitats (Using Sustainability Base as a Testbed)

    Science.gov (United States)

    Martin, Rodney A.

    2017-01-01

    Habitation systems provide a safe place for astronauts to live and work in space and on planetary surfaces. They enable crews to live and work safely in deep space, and include integrated life support systems, radiation protection, fire safety, and systems to reduce logistics and the need for resupply missions. Innovative health management technologies are needed in order to increase the safety and mission-effectiveness for future space habitats on other planets, asteroids, or lunar surfaces. For example, off-nominal or failure conditions occurring in safety-critical life support systems may need to be addressed quickly by the habitat crew without extensive technical support from Earth due to communication delays. If the crew in the habitat must manage, plan and operate much of the mission themselves, operations support must be migrated from Earth to the habitat. Enabling monitoring, tracking, and management capabilities on-board the habitat and related EVA platforms for a small crew to use will require significant automation and decision support software.Traditional caution and warning systems are typically triggered by out-of-bounds sensor values, but can be enhanced by including machine learning and data mining techniques. These methods aim to reveal latent, unknown conditions while still retaining and improving the ability to provide highly accurate alerts for known issues. A few of these techniques will briefly described, along with performance targets for known faults and failures. Specific system health management capabilities required for habitat system elements (environmental control and life support systems, etc.) may include relevant subsystems such as water recycling systems, photovoltaic systems, electrical power systems, and environmental monitoring systems. Sustainability Base, the agency's flagship LEED-platinum certified green building acts as a living laboratory for testing advanced information and sustainable technologies that provides an
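
    As a toy illustration of the data-driven monitoring contrasted above with fixed caution-and-warning limits, the sketch below learns "normal" behaviour from training data and flags departures from it; the sensor channels, values and threshold are synthetic assumptions, not anything from the Sustainability Base systems.

```python
# Toy sketch of data-driven health monitoring: learn "normal" sensor behaviour
# from training data and flag departures from it, rather than relying only on
# fixed alarm limits. Channels, values and the threshold are synthetic.
import numpy as np

rng = np.random.default_rng(3)
train = rng.normal(loc=[22.0, 50.0], scale=[0.5, 2.0], size=(500, 2))  # temp, RH
mean, std = train.mean(axis=0), train.std(axis=0)

def anomaly_score(sample):
    """Largest absolute z-score across channels."""
    return np.max(np.abs((np.asarray(sample) - mean) / std))

for sample in ([22.3, 51.0], [22.1, 62.0]):             # second sample drifts in RH
    flag = "ANOMALY" if anomaly_score(sample) > 4.0 else "nominal"
    print(sample, flag)
```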

  17. How to assess sustainability in automated manufacturing

    DEFF Research Database (Denmark)

    Dijkman, Teunis Johannes; Rödger, Jan-Markus; Bey, Niki

    2015-01-01

    The aim of this paper is to describe how sustainability in automation can be assessed. The assessment method is illustrated using a case study of a robot. Three aspects of sustainability assessment in automation are identified. Firstly, we consider automation as part of a larger system...... that fulfills the market demand for a given functionality. Secondly, three aspects of sustainability have to be assessed: environment, economy, and society. Thirdly, automation is part of a system with many levels, with different actors on each level, resulting in meeting the market demand. In this system......, (sustainability) specifications move top-down, which helps avoiding sub-optimization and problem shifting. From these three aspects, sustainable automation is defined as automation that contributes to products that fulfill a market demand in a more sustainable way. The case study presents the carbon footprints...

  18. Sonographically guided core biopsy of the breast: comparison of 14-gauge automated gun and 11-gauge directional vacuum-assisted biopsy methods

    International Nuclear Information System (INIS)

    Cho, Nariya; Moon, Woo Kyung; Cha, Joo Hee

    2005-01-01

    To compare the outcomes of 14-gauge automated biopsy and 11-gauge vacuum-assisted biopsy for the sonographically guided core biopsies of breast lesions. We retrospectively reviewed all sonographically guided core biopsies performed from January 2002 to February 2004. The sonographically guided core biopsies were performed using a 14-gauge automated gun on 562 breast lesions or using an 11-gauge vacuum-assisted device on 417 lesions. The histologic findings were compared with the surgical, imaging and follow-up findings. The histologic underestimation rate, the repeat biopsy rate and the false negative rates were compared between the two groups. A repeat biopsy was performed on 49 benign lesions because of the core biopsy results of the high-risk lesions (n=24), the imaging-histologic discordance (n=5), and the imaging findings showing disease progression (n=20). The total underestimation rates, according to the biopsy device, were 55% (12/22) for the 14-gauge automated gun biopsies and 36% (8/22) for the 11-gauge vacuum-assisted device (p = 0.226). The atypical ductal hyperplasia (ADH) underestimation (i.e., atypical ductal hyperplasia at core biopsy and carcinoma at surgery) was 58% (7/12) for the 14-gauge automated gun biopsies and 20% (1/5) for the 11-gauge vacuum-assisted biopsies. The ductal carcinoma in situ (DCIS) underestimation rate (i.e., ductal carcinoma in situ upon core biopsy and invasive carcinoma found at surgery) was 50% (5/10) for the 14-gauge automated gun biopsies and 41% (7/17) for the 11-gauge vacuum-assisted biopsies. The repeat biopsy rates were 6% (33/562) for the 14-gauge automated gun biopsies and 3.5% (16/417) for the 11-gauge vacuum-assisted biopsies. Only 5 (0.5%) of the 979 core biopsies were believed to have missed the malignant lesions. The false-negative rate was 3% (4 of 128 cancers) for the 14-gauge automated gun biopsies and 1% (1 of 69 cancers) for the 11-gauge vacuum-assisted biopsies. The outcomes of the

  19. ADVANTAGES/DISADVANTAGES FOR ISCO METHODS IN-SITU FENTON OXIDATION IN-SITU PERMANGANATE OXIDATION

    Science.gov (United States)

    The advantages and disadvantages of in-situ Fenton oxidation and in-situ permanganate oxidation will be presented. This presentation will provide a brief overview of each technology and a detailed analysis of the advantages and disadvantages of each technology. Included in the ...

  20. The Employment-Impact of Automation in Canada

    OpenAIRE

    McLean, Colin Alexander

    2015-01-01

    Standard neoclassical models of labour demand predict that automation does not produce long-term increases in unemployment. Supporting evidence in Canada between 1970 and 2008 is explained by the reallocation of labour from industries with high levels of automation such as Manufacturing to industries with low levels of automation such as Retail and Wholesale Trade, and Business Services. Recent evidence indicates however that on-going technological advances are now driving labour automation i...

  1. Automated Analysis of Protein Expression and Gene Amplification within the Same Cells of Paraffin-Embedded Tumour Tissue

    Directory of Open Access Journals (Sweden)

    Timo Gaiser

    2010-01-01

    Full Text Available Background: The simultaneous detection of protein expression and gene copy number changes in patient samples, like paraffin-embedded tissue sections, is challenging, since the procedures of immunohistochemistry (IHC) and fluorescence in situ hybridization (FISH) negatively influence each other, which often results in suboptimal staining. Therefore, we developed a novel automated algorithm based on relocation which allows subsequent detection of protein content and gene copy number changes within the same cell.

  2. Trace Gas Measurements from the GeoTASO and GCAS Airborne Instruments: An Instrument and Algorithm Test-Bed for Air Quality Observations from Geostationary Orbit

    Science.gov (United States)

    Nowlan, C. R.; Liu, X.; Janz, S. J.; Leitch, J. W.; Al-Saadi, J. A.; Chance, K.; Cole, J.; Delker, T.; Follette-Cook, M. B.; Gonzalez Abad, G.; Good, W. S.; Kowalewski, M. G.; Loughner, C.; Pickering, K. E.; Ruppert, L.; Soo, D.; Szykman, J.; Valin, L.; Zoogman, P.

    2016-12-01

    The Geostationary Trace gas and Aerosol Sensor Optimization (GeoTASO) and the GEO-CAPE Airborne Simulator (GCAS) instruments are pushbroom sensors capable of making remote sensing measurements of air quality and ocean color. Originally developed as test-bed instruments for the Geostationary Coastal and Air Pollution Events (GEO-CAPE) decadal survey, these instruments are now also part of risk reduction for the upcoming Tropospheric Emissions: Monitoring of Pollution (TEMPO) and Geostationary Environment Monitoring Spectrometer (GEMS) geostationary satellite missions, and will provide validation capabilities after the satellite instruments are in orbit. GeoTASO and GCAS flew on two different aircraft in their first intensive air quality field campaigns during the DISCOVER-AQ missions over Texas in 2013 and Colorado in 2014. GeoTASO was also deployed in 2016 during the KORUS-AQ field campaign to make measurements of trace gases and aerosols over Korea. GeoTASO and GCAS collect spectra of backscattered solar radiation in the UV and visible that can be used to derive 2-D maps of trace gas columns below the aircraft at spatial resolutions on the order of 250 x 500 m. We present spatially resolved maps of trace gas retrievals of ozone, nitrogen dioxide, formaldehyde and sulfur dioxide over urban areas and power plants from flights during the field campaigns, and comparisons with data from ground-based spectrometers, in situ monitoring instruments, and satellites.
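
    The column retrievals summarized above are, in broad terms, differential optical absorption spectroscopy (DOAS)-type fits of backscattered radiance spectra. The sketch below is not the GeoTASO/GCAS algorithm; it is a minimal, hypothetical illustration of retrieving a slant column by least squares, with made-up cross sections, numbers, and noise.

        import numpy as np

        # Hypothetical DOAS-style slant-column fit (illustration only, not the
        # GeoTASO/GCAS retrieval): model ln(I0/I) as an absorber cross section
        # times a slant column plus a low-order polynomial for broadband effects.
        wavelengths = np.linspace(425.0, 465.0, 200)           # nm, assumed NO2 fitting window
        sigma = np.exp(-((wavelengths - 445.0) / 6.0) ** 2)    # fake cross-section shape
        true_scd = 2.5                                          # "true" column in arbitrary units

        rng = np.random.default_rng(0)
        y = (sigma * true_scd + 0.05 + 0.001 * (wavelengths - 445.0)
             + rng.normal(0.0, 0.01, wavelengths.size))         # synthetic optical depth

        # Design matrix: cross section plus polynomial terms for the smooth background.
        A = np.column_stack([sigma, np.ones_like(wavelengths), wavelengths - 445.0])
        coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
        print(f"retrieved column (arbitrary units): {coeffs[0]:.3f}")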

  3. Controls and automation in the SPIRAL project

    International Nuclear Information System (INIS)

    Bothner, U.; Boulot, A.; Maherault, J.; Martial, L.

    1999-01-01

    Within the framework of the SPIRAL collaboration, the control and automation team of the Accelerator-Exotic Beam R and D Department has had the following tasks: 1. automation of the resonator high-frequency equipment of the CIME cyclotron; 2. automation of the vacuum equipment, i.e. the low energy line (TBE), the CIME cyclotron, and the low energy line (BE); 3. automation of load safety for the power supply; 4. for each of these tasks, a circuitry file based on the SCHEMA software has been worked out. The programs required for the automation of load safety for the power supply (STEP5, PROTOOL, DESIGNER 4.1) were developed and implemented for PC.

  4. Are we There Yet? ... Developing In-Situ Fabrication and Repair (ISFR) Technologies to Explore and Live on the Moon and Mars

    Science.gov (United States)

    Bassler, Julie A.; Bodiford, Melanie P.; Fiske, Michael R.; Strong, Janet D.

    2005-01-01

    NASA's human exploration initiative poses great opportunity and great risk for manned missions to the Moon and Mars. Engineers and scientists at the Marshall Space Flight Center are evaluating current technologies for in situ exploration habitat, fabrication, and repair applications. Several technologies to be addressed in this paper have technology readiness levels (TRLs) that are currently mature enough to pursue for exploration purposes. However, many technologies offer promising applications that must be pulled along by the demands and applications of this great initiative. The In Situ Fabrication and Repair (ISFR) program will supply and push state-of-the-art technologies for applications such as habitat structure development, in situ resource utilization for tool and part fabrication, and repair and replacement of common life support elements. This paper will look at current and future habitat technology applications, such as the use of in situ environmental features (caves, rilles, and lava tubes), the development of lunar regolith concrete, structure design and development, and thin-film and inflatable technologies. We will address current rapid prototyping technologies, their ISFR applications, and near-term advancements. We will discuss the anticipated need to utilize in situ resources to produce replacement parts and fabricate repairs to vehicles, habitats, life support, and quality-of-life elements. All ISFR technology developments will incorporate automated deployment and robotic construction and fabrication techniques. The current state of the art for these applications is fascinating, but the future is out of this world.

  5. ERP processes automation in corporate environments

    OpenAIRE

    Antonoaie Victor; Irimeş Adrian; Chicoş Lucia-Antoneta

    2017-01-01

    The automation processes are used in organizations to speed up analysis processes and reduce manual labour. Robotic Automation of IT processes implemented in a modern corporate workspace provides an excellent tool for assisting professionals in making decisions, saving resources and serving as a know-how repository. This study presents the newest trends in process automation, its benefits such as security, ease of use, reduction of overall process duration, and provides examples of SAP ERP proj...

  6. Automated estimation of defects in magnetographic defectoscopy. 1. Automated magnetographic flaw detectors

    International Nuclear Information System (INIS)

    Mikhajlov, S.P.; Vaulin, S.L.; Shcherbinin, V.E.; Shur, M.L.

    1993-01-01

    Consideration is given to the specific features and possible functions of equipment for the automated estimation of stretched continuity defects in samples with a plane surface in magnetographic defectoscopy. Two models of automated magnetographic flaw detectors, one with a built-in microcomputer and one in the form of a computer attachment, are described. Directions for further research and development are discussed. 35 refs., 6 figs

  7. Both Automation and Paper.

    Science.gov (United States)

    Purcell, Royal

    1988-01-01

    Discusses the concept of a paperless society and the current situation in library automation. Various applications of automation and telecommunications are addressed, and future library automation is considered. Automation at the Monroe County Public Library in Bloomington, Indiana, is described as an example. (MES)

  8. Atmospheric Fluctuation Measurements with the Palomar Testbed Interferometer

    Science.gov (United States)

    Linfield, R. P.; Lane, B. F.; Colavita, M. M.; PTI Collaboration

    Observations of bright stars with the Palomar Testbed Interferometer, at a wavelength of 2.2 microns, have been used to measure atmospheric delay fluctuations. The delay structure function Dτ(Δt) was calculated for 66 scans (each >= 120 s in length) on seven nights in 1997 and one in 1998. For all except one scan, Dτ exhibited a clean power law shape over the time interval 50-500 msec. Over shorter time intervals, the effect of the delay line servo loop corrupts Dτ. Over longer time intervals (usually starting at > 1 s), the slope of Dτ decreases, presumably due to some combination of saturation (e.g. finite turbulent layer thickness) and the effect of the finite wind speed crossing time on our 110 m baseline. The mean power law slopes for the eight nights ranged from 1.16 to 1.36, substantially flatter than the value of 1.67 for three-dimensional Kolmogorov turbulence. Such sub-Kolmogorov slopes will result in atmospheric seeing (θ) that improves rapidly with increasing wavelength: θ ∝ λ^(1-2/β), where β is the observed power law slope of Dτ. The atmospheric errors in astrometric measurements with an interferometer will average down more quickly than in the Kolmogorov case.
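
    As a rough illustration of the quantities discussed in this record, the sketch below computes a delay structure function Dτ(Δt) from a synthetic delay time series and fits its power-law slope; the sampling rate and the random-walk "turbulence" are assumptions, not PTI data.

        import numpy as np

        # Synthetic delay structure function D(dt) and power-law slope fit.
        rng = np.random.default_rng(1)
        dt = 0.01                                          # 10 ms sampling (assumed)
        delay = np.cumsum(rng.normal(0.0, 1e-7, 20000))    # random-walk stand-in for delay (m)

        lags = np.arange(5, 51)                            # 50-500 ms, the range quoted above
        D = np.array([np.mean((delay[l:] - delay[:-l]) ** 2) for l in lags])

        slope, _ = np.polyfit(np.log(lags * dt), np.log(D), 1)
        print(f"fitted structure-function slope: {slope:.2f}")   # ~1 for a random walk

        # Seeing scales as theta ~ lambda**(1 - 2/beta); Kolmogorov (beta = 5/3) gives -1/5.
        beta = 5.0 / 3.0
        print(f"Kolmogorov seeing exponent: {1 - 2 / beta:.2f}")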

  9. DOE In Situ Remediation Integrated Program

    International Nuclear Information System (INIS)

    Yow, J.L. Jr.

    1993-01-01

    The In Situ Remediation Integrated Program (ISRP) supports and manages a balanced portfolio of applied research and development activities in support of DOE environmental restoration and waste management needs. ISRP technologies are being developed in four areas: containment, chemical and physical treatment, in situ bioremediation, and in situ manipulation (including electrokinetics). The focus of containment is to provide mechanisms to stop contaminant migration through the subsurface. In situ bioremediation and chemical and physical treatment both aim to destroy or eliminate contaminants in groundwater and soils. In situ manipulation (ISM) provides mechanisms to access contaminants or introduce treatment agents into the soil, and includes other technologies necessary to support the implementation of ISR methods. Descriptions of each major program area are provided to set the technical context of the ISM subprogram. Typical ISM needs for major areas of in situ remediation research and development are identified.

  10. The Soil Moisture Active Passive Mission (SMAP) Science Data Products: Results of Testing with Field Experiment and Algorithm Testbed Simulation Environment Data

    Science.gov (United States)

    Entekhabi, Dara; Njoku, Eni E.; O'Neill, Peggy E.; Kellogg, Kent H.; Entin, Jared K.

    2010-01-01

    Talk outline: 1. Derivation of SMAP basic and applied science requirements from the NRC Earth Science Decadal Survey applications; 2. Data products and latencies; 3. Algorithm highlights; 4. SMAP Algorithm Testbed; 5. SMAP Working Groups and community engagement.

  11. Adaptive Signal Processing Testbed: VME-based DSP board market survey

    Science.gov (United States)

    Ingram, Rick E.

    1992-04-01

    The Adaptive Signal Processing Testbed (ASPT) is a real-time multiprocessor system utilizing digital signal processor technology on VMEbus-based printed circuit boards installed on a Sun workstation. The ASPT has specific requirements, particularly with regard to the signal excision application, concerning interfacing with current and planned data generation equipment, processing of the data, storage of final and intermediate results to disk, and the development tools for applications development and integration into the overall EW/COM computing environment. A prototype ASPT was implemented using three VME-C-30 boards from Applied Silicon. Experience gained during the prototype development led to the conclusion that interprocessor communications capability is the most significant contributor to overall ASPT performance. In addition, host involvement should be minimized. Boards using different processors were evaluated with respect to the ASPT system requirements, pricing, and availability. Specific recommendations based on various priorities are made, as well as recommendations concerning the integration and interaction of various tools developed during the prototype implementation.

  12. Novel insights in agent-based complex automated negotiation

    CERN Document Server

    Lopez-Carmona, Miguel; Ito, Takayuki; Zhang, Minjie; Bai, Quan; Fujita, Katsuhide

    2014-01-01

    This book focuses on all aspects of complex automated negotiations, which are studied in the field of autonomous agents and multi-agent systems. The book consists of two parts: I, Agent-Based Complex Automated Negotiations, and II, Automated Negotiation Agents Competition. The chapters in Part I are extended versions of papers presented at the 2012 international workshop on Agent-Based Complex Automated Negotiation (ACAN), after peer review by three Program Committee members. Part II examines in detail ANAC 2012 (The Third Automated Negotiating Agents Competition), in which automated agents with different negotiation strategies, implemented by different developers, negotiate automatically in several negotiation domains. ANAC is an international competition in which automated negotiation strategies, submitted by a number of universities and research institutes across the world, are evaluated in tournament style. The purpose of the competition is to steer the research in the area of bilate...

  13. Cyclic olefin homopolymer-based microfluidics for protein crystallization and in situ X-ray diffraction

    International Nuclear Information System (INIS)

    Emamzadah, Soheila; Petty, Tom J.; De Almeida, Victor; Nishimura, Taisuke; Joly, Jacques; Ferrer, Jean-Luc; Halazonetis, Thanos D.

    2009-01-01

    A cyclic olefin homopolymer-based microfluidics system has been established for protein crystallization and in situ X-ray diffraction. Microfluidics is a promising technology for the rapid identification of protein crystallization conditions. However, most of the existing systems utilize silicone elastomers as the chip material which, despite its many benefits, is highly permeable to water vapour. This limits the time available for protein crystallization to less than a week. Here, the use of a cyclic olefin homopolymer-based microfluidics system for protein crystallization and in situ X-ray diffraction is described. Liquid handling in this system is performed in 2 mm thin transparent cards which contain 500 chambers, each with a volume of 320 nl. Microbatch, vapour-diffusion and free-interface diffusion protocols for protein crystallization were implemented and crystals were obtained of a number of proteins, including chicken lysozyme, bovine trypsin, a human p53 protein containing both the DNA-binding and oligomerization domains bound to DNA and a functionally important domain of Arabidopsis Morpheus’ molecule 1 (MOM1). The latter two polypeptides have not been crystallized previously. For X-ray diffraction analysis, either the cards were opened to allow mounting of the crystals on loops or the crystals were exposed to X-rays in situ. For lysozyme, an entire X-ray diffraction data set at 1.5 Å resolution was collected without removing the crystal from the card. Thus, cyclic olefin homopolymer-based microfluidics systems have the potential to further automate protein crystallization and structural genomics efforts

  14. Lean automation development : applying lean principles to the automation development process

    OpenAIRE

    Granlund, Anna; Wiktorsson, Magnus; Grahn, Sten; Friedler, Niklas

    2014-01-01

    A broad empirical study indicates that automation development shows potential for improvement. In the paper, 13 lean product development principles are contrasted with the automation development process, and it is suggested why and how these principles can facilitate, support and improve the automation development process. The paper summarises a description of what characterises a lean automation development process and what consequences it entails. Main differences compared to current pr...

  15. In Situ Stem Psychrometry: toward a Physiologically-Based Drought Monitoring Network

    Science.gov (United States)

    KOCH, G. W.; Williams, C.; Ambrose, A.

    2012-12-01

    Plant water potential is a synoptic variable that integrates soil and atmospheric moisture stress and interacts with plant-internal factors to regulate gas exchange and determine vulnerability to drought-induced hydraulic dysfunction. Despite its importance, methods for measuring water potential are labor intensive. This limitation reduces measurement frequency, likely causes important transient events to be overlooked, and restricts development of a richer understanding of the impacts of integrated water stress on plant and ecosystem function. Recent technological advances have enabled in-situ, automated measurement of branch water potential over periods of weeks to months using stem psychrometers. We evaluated this technology through laboratory and field comparisons to standard pressure chamber measurements and with field installations in temperate forest, semi-arid woodland, and chaparral ecosystems. Performance was highly sensitive to installation procedures. With proper sealing, insulation, and radiation shielding, psychrometers typically differed from pressure chamber measurements by less than 0.2 MPa down to water potentials as low as -7 MPa. Measurements in tall trees reaffirmed the influence of gravity on water potential as previously documented with the pressure chamber. Psychrometer performance in situ was stable for periods of several weeks to months, with tissue wound response degrading sensor operation over time. We conclude that stem psychrometer technology is now suitable to serve as the foundation for a physiologically-based drought monitoring network that can anticipate important ecosystem impacts including changes in whole-system fluxes and mortality events.

  16. Multichannel Mars Organic Analyzer (McMOA): Microfluidic Networks for the Automated In Situ Microchip Electrophoretic Analysis of Organic Biomarkers on Mars

    Science.gov (United States)

    Chiesl, T. N.; Benhabib, M.; Stockton, A. M.; Mathies, R. A.

    2010-04-01

    We present the Multichannel Mars Organic Analyzer (McMOA) for the analysis of amino acids, PAHs, and oxidized carbon. Microfluidic architectures integrating automated metering, mixing, on-chip reactions, and serial dilutions are also discussed.

  17. Participation through Automation: Fully Automated Critical PeakPricing in Commercial Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary Ann; Watson, David S.; Motegi, Naoya; Kiliccote,Sila; Linkugel, Eric

    2006-06-20

    California electric utilities have been exploring the use of dynamic critical peak prices (CPP) and other demand response programs to help reduce peaks in customer electric loads. CPP is a tariff design to promote demand response. Levels of automation in DR can be defined as follows: Manual Demand Response involves a potentially labor-intensive approach such as manually turning off or changing comfort set points at each equipment switch or controller. Semi-Automated Demand Response involves a pre-programmed demand response strategy initiated by a person via a centralized control system. Fully Automated Demand Response does not involve human intervention, but is initiated at a home, building, or facility through receipt of an external communications signal; the receipt of the external signal initiates pre-programmed demand response strategies. The authors refer to this as Auto-DR. This paper describes the development, testing, and results from automated CPP (Auto-CPP) as part of a utility project in California. The paper presents the project description and test methodology. This is followed by a discussion of Auto-DR strategies used in the field test buildings. A sample Auto-CPP load shape case study is presented, along with a selection of the Auto-CPP response data from September 29, 2005. If all twelve sites reached their maximum savings simultaneously, a total of approximately 2 MW of DR is available from these twelve sites, which represent about two million ft². The average DR was about half that value, at about 1 MW. These savings translate to about 0.5 to 1.0 W/ft² of demand reduction. The authors are continuing field demonstrations and economic evaluations to pursue increasing penetration of automated DR, which has demonstrated the ability to provide a valuable DR resource for California.
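
    The demand-reduction intensities quoted above follow directly from the aggregate figures; a minimal back-of-envelope check (using the abstract's approximate numbers) is:

        # Rough check of the W/ft^2 figures quoted above.
        floor_area_ft2 = 2_000_000         # ~two million square feet across twelve sites
        peak_dr_w = 2_000_000              # ~2 MW if all sites shed load simultaneously
        avg_dr_w = 1_000_000               # ~1 MW average observed

        print(peak_dr_w / floor_area_ft2)  # ~1.0 W/ft^2
        print(avg_dr_w / floor_area_ft2)   # ~0.5 W/ft^2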

  18. Automated Groundwater Screening

    International Nuclear Information System (INIS)

    Taylor, Glenn A.; Collard, Leonard B.

    2005-01-01

    The Automated Intruder Analysis has been extended to include an Automated Ground Water Screening option. This option screens 825 radionuclides while rigorously applying the National Council on Radiation Protection (NCRP) methodology. An extension to that methodology is presented to give a more realistic screening factor for those radionuclides which have significant daughters. The extension has the promise of reducing the number of radionuclides which must be tracked by the customer. By combining the Automated Intruder Analysis with the Automated Groundwater Screening, a consistent set of assumptions and databases is used. A method is proposed to eliminate trigger values by performing a rigorous calculation of the screening factor, thereby reducing the number of radionuclides sent to further analysis. Using the same problem definitions as in previous groundwater screenings, the automated groundwater screening found one additional nuclide, Ge-68, which failed the screening. It also found that 18 of the 57 radionuclides contained in NCRP Table 3.1 failed the screening. This report describes the automated groundwater screening computer application.

  19. Automation of orders in taxi service

    OpenAIRE

    Simčič, Matej

    2012-01-01

    Automation has been growing rapidly in recent years. The advantages it brings are cost reduction and faster, better performance of tasks that would otherwise be done by humans. It began in the manufacturing industry and later expanded to other sectors. Today's technology allows the implementation of automation in a wide range of areas. The thesis deals with the implementation of a system that allows automated ordering of a taxi. The system consists of four components. They are two mobile app...

  20. Oceanic Platform of the Canary Islands: an ocean testbed for ocean energy converters

    Science.gov (United States)

    González, Javier; Hernández-Brito, Joaquín.; Llinás, Octavio

    2010-05-01

    The Oceanic Platform of the Canary Islands (PLOCAN) is a governmental consortium aimed at building and operating an off-shore infrastructure to facilitate deep-sea research and speed up the associated technology. The Consortium is overseen by the Spanish Ministry of Science and Innovation and the Canarian Agency for Research and Innovation. The infrastructure consists of an oceanic platform located in an area with depths between 50 and 100 meters, close to the continental slope and four kilometers off the coast of Gran Canaria, in the archipelago of the Canary Islands. Construction will start during the first months of 2010 and is expected to be finished in mid-2011. PLOCAN serves five strategic lines: an integral observatory able to explore from the deep ocean to the atmosphere, an ocean technology testbed, a base for underwater vehicles, an innovation platform, and a highly specialized training centre. Ocean energy is a suitable source to contribute to the limited energy mix of the archipelago of the Canary Islands, which has a total population of around 2 million people unequally distributed across seven islands. The islands of Gran Canaria and Tenerife hold 80% of the total population, with about 800,000 people each. PLOCAN will contribute to developing the ocean energy sector by establishing a marine testbed that allows prototype testing at sea under the meticulous monitoring network provided by the integral observatory, generating valuable information for developers. Reducing costs through integral project management is an essential objective, providing services such as transportation, customs and administrative permits. The sea area available for testing is around 8 km², with depths from 50 to 100 meters, 4 km off the coast. Selected testing areas have off-shore wind power densities of around 500-600 W/m² and wave power levels of around 6 kW/m on the east coast and 10 kW/m on the north coast. Marine currents in the Canary Islands are
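
    For a sense of scale, the quoted wave power levels can be turned into an indicative annual energy per metre of wave front with a simple average-power calculation; this is only a rough illustration, not a resource assessment.

        # Indicative annual wave energy per metre of wave front from the quoted levels.
        hours_per_year = 8760
        for coast, kw_per_m in [("east coast", 6), ("north coast", 10)]:
            mwh = kw_per_m * hours_per_year / 1000
            print(f"{coast}: ~{mwh:.0f} MWh per metre of wave front per year")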

  1. Comparison of Size Modulation Standard Automated Perimetry and Conventional Standard Automated Perimetry with a 10-2 Test Program in Glaucoma Patients.

    Science.gov (United States)

    Hirasawa, Kazunori; Takahashi, Natsumi; Satou, Tsukasa; Kasahara, Masayuki; Matsumura, Kazuhiro; Shoji, Nobuyuki

    2017-08-01

    This prospective observational study compared the performance of size modulation standard automated perimetry with the Octopus 600 10-2 test program, in which the stimulus size is modulated during testing based on stimulus intensity, with that of conventional standard automated perimetry with the Humphrey 10-2 test program in glaucoma patients. Eighty-seven eyes of 87 glaucoma patients underwent size modulation standard automated perimetry with the Dynamic strategy and conventional standard automated perimetry using the SITA standard strategy. The main outcome measures were global indices, point-wise threshold, visual defect size and depth, reliability indices, and test duration; these were compared between size modulation standard automated perimetry and conventional standard automated perimetry. Global indices and point-wise threshold values between size modulation standard automated perimetry and conventional standard automated perimetry were moderately to strongly correlated (p ...), but the visual-field defect size was smaller (p ...) with size modulation standard automated perimetry than with conventional standard automated perimetry. The reliability indices, particularly the false-negative response, of size modulation standard automated perimetry were worse than those of conventional standard automated perimetry (p ...), and the test duration differed between size modulation standard automated perimetry and conventional standard automated perimetry (p = 0.02). Global indices and the point-wise threshold value of the two testing modalities correlated well. However, the potential of a large stimulus presented at an area with decreased sensitivity with size modulation standard automated perimetry could underestimate the actual threshold in the 10-2 test protocol, as compared with conventional standard automated perimetry.

  2. International Conference Automation : Challenges in Automation, Robotics and Measurement Techniques

    CERN Document Server

    Zieliński, Cezary; Kaliczyńska, Małgorzata

    2016-01-01

    This book presents the set of papers accepted for presentation at the International Conference Automation, held in Warsaw, 2-4 March 2016. It gathers research results presented by top experts in the fields of industrial automation, control, robotics and measurement techniques. Each chapter presents a thorough analysis of a specific technical problem, usually followed by a numerical analysis, simulation, and description of the results of implementing the solution to a real-world problem. The presented theoretical results, practical solutions and guidelines will be valuable both for researchers working in the area of engineering sciences and for practitioners solving industrial problems.

  3. Report of the Interagency Optical Network Testbeds Workshop 2, NASA Ames Research Center, September 12-14, 2005

    Science.gov (United States)

    2005-01-01

    The Optical Network Testbeds Workshop 2 (ONT2), held on September 12-14, 2005, was cosponsored by the Department of Energy Office of Science (DOE/SC) and the National Aeronautics and Space Administration (NASA), in cooperation with the Joint Engineering Team (JET) of the Federal Networking and Information Technology Research and Development (NITRD) Program's Large Scale Networking (LSN) Coordinating Group. The ONT2 workshop was a follow-on to an August 2004 Workshop on Optical Network Testbeds (ONT1). ONT1 recommended actions by the Federal agencies to assure timely development and implementation of optical networking technologies and infrastructure. Hosted by the NASA Ames Research Center in Mountain View, California, the ONT2 workshop brought together representatives of the U.S. advanced research and education (R&E) networks, regional optical networks (RONs), service providers, international networking organizations, and senior engineering and R&D managers from Federal agencies and national research laboratories. Its purpose was to develop a common vision of the optical network technologies, services, infrastructure, and organizations needed to enable widespread use of optical networks; recommend activities for transitioning the optical networking research community and its current infrastructure to leading-edge optical networks over the next three to five years; and present information enabling commercial network infrastructure providers to plan for and use leading-edge optical network services in that time frame.

  4. Automated model building

    CERN Document Server

    Caferra, Ricardo; Peltier, Nicholas

    2004-01-01

    This is the first book on automated model building, a discipline of automated deduction that is of growing importance. Although models and their construction are important per se, automated model building has appeared as a natural enrichment of automated deduction, especially in the attempt to capture the human way of reasoning. The book provides an historical overview of the field of automated deduction, and presents the foundations of different existing approaches to model construction, in particular those developed by the authors. Finite and infinite model building techniques are presented. The main emphasis is on calculi-based methods, and relevant practical results are provided. The book is of interest to researchers and graduate students in computer science, computational logic and artificial intelligence. It can also be used as a textbook in advanced undergraduate courses.

  5. Quantification of Eosinophilic Granule Protein Deposition in Biopsies of Inflammatory Skin Diseases by Automated Image Analysis of Highly Sensitive Immunostaining

    Directory of Open Access Journals (Sweden)

    Peter Kiehl

    1999-01-01

    Full Text Available Eosinophilic granulocytes are major effector cells in inflammation. Extracellular deposition of toxic eosinophilic granule proteins (EGPs), but not the presence of intact eosinophils, is crucial for their functional effect in situ. As even recent morphometric approaches to quantifying the involvement of eosinophils in inflammation have been based only on cell counting, we developed a new method for the cell-independent quantification of EGPs by image analysis of immunostaining. Highly sensitive, automated immunohistochemistry was done on paraffin sections of inflammatory skin diseases with 4 different primary antibodies against EGPs. Image analysis of immunostaining was performed by colour translation, linear combination and automated thresholding. Using strictly standardized protocols, the assay was proven to be specific and accurate concerning segmentation in 8916 fields of 520 sections, well reproducible in repeated measurements and reliable over a 16-week observation time. The method may be valuable for the cell-independent segmentation of immunostaining in other applications as well.
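
    The segmentation pipeline described (colour translation, linear combination of channels, automated thresholding) can be illustrated with a small NumPy sketch; the channel weights and the Otsu-style threshold search below are generic assumptions, not the published protocol.

        import numpy as np

        def stained_area_fraction(rgb, weights=(1.0, -0.5, -0.5)):
            """Toy colour translation + linear combination + automated threshold:
            project RGB onto a 'staining' axis, find a threshold that maximizes
            between-class variance, and return the stained-pixel fraction."""
            stain = sum(w * rgb[..., i].astype(float) for i, w in enumerate(weights))
            hist, edges = np.histogram(stain, bins=256)
            centers = 0.5 * (edges[:-1] + edges[1:])
            best_t, best_var, total = centers[0], -1.0, hist.sum()
            for k in range(1, 256):
                w0, w1 = hist[:k].sum(), hist[k:].sum()
                if w0 == 0 or w1 == 0:
                    continue
                m0 = (hist[:k] * centers[:k]).sum() / w0
                m1 = (hist[k:] * centers[k:]).sum() / w1
                between = w0 * w1 * (m0 - m1) ** 2 / total ** 2
                if between > best_var:
                    best_var, best_t = between, centers[k]
            return float((stain > best_t).mean())

        # Synthetic image: dark background with a small strongly "stained" patch.
        img = np.random.default_rng(2).integers(0, 60, (128, 128, 3)).astype(np.uint8)
        img[40:60, 40:60, 0] = 220
        print(f"stained area fraction ~ {stained_area_fraction(img):.3f}")   # ~0.024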

  6. A microscopy approach for in situ inspection of micro-coordinate measurement machine styli for contamination

    Science.gov (United States)

    Feng, Xiaobing; Pascal, Jonathan; Lawes, Simon

    2017-09-01

    During the process of measurement using a micro-coordinate measurement machine (µCMM), contamination gradually builds up on the surface of the stylus tip and affects the dimensional accuracy of the measurement. Regular inspection of the stylus for contamination is essential to determine the appropriate cleaning interval and prevent the dimensional error from becoming significant. However, in situ inspection of a µCMM stylus is challenging due to the size, spherical shape, material and surface properties of a typical stylus. To address this challenge, this study evaluates several non-contact measurement technologies for in situ stylus inspection and, based on those findings, proposes a cost-effective microscopy approach. The operational principle is then demonstrated by an automated prototype, coordinated directly by the CMM software MCOSMOS, with an effective threshold of detection as low as 400 nm and a large field of view and depth of field. The level of contamination on the stylus has been found to increase steadily with the number of measurement contacts made. Once excessive contamination is detected on the stylus, measurement should be stopped and a stylus cleaning procedure should be performed to avoid affecting measurement accuracy.

  7. A microscopy approach for in situ inspection of micro-coordinate measurement machine styli for contamination

    International Nuclear Information System (INIS)

    Feng, Xiaobing; Lawes, Simon; Pascal, Jonathan

    2017-01-01

    During the process of measurement using a micro-coordinate measurement machine (µCMM), contamination gradually builds up on the surface of the stylus tip and affects the dimensional accuracy of the measurement. Regular inspection of the stylus for contamination is essential to determine the appropriate cleaning interval and prevent the dimensional error from becoming significant. However, in situ inspection of a µCMM stylus is challenging due to the size, spherical shape, material and surface properties of a typical stylus. To address this challenge, this study evaluates several non-contact measurement technologies for in situ stylus inspection and, based on those findings, proposes a cost-effective microscopy approach. The operational principle is then demonstrated by an automated prototype, coordinated directly by the CMM software MCOSMOS, with an effective threshold of detection as low as 400 nm and a large field of view and depth of field. The level of contamination on the stylus has been found to increase steadily with the number of measurement contacts made. Once excessive contamination is detected on the stylus, measurement should be stopped and a stylus cleaning procedure should be performed to avoid affecting measurement accuracy. (paper)

  8. Techno-economic and uncertainty analysis of in situ and ex situ fast pyrolysis for biofuel production

    Energy Technology Data Exchange (ETDEWEB)

    Li, Boyan; Ou, Longwen; Dang, Qi; Meyer, Pimphan A.; Jones, Susanne B.; Brown, Robert C.; Wright, Mark

    2015-11-01

    This study evaluates the techno-economic uncertainty in cost estimates for two emerging biorefinery technologies for biofuel production: in situ and ex situ catalytic pyrolysis. Stochastic simulations based on process and economic parameter distributions are applied to calculate biorefinery performance and production costs. The probability distributions for the minimum fuel-selling price (MFSP) indicate that in situ catalytic pyrolysis has an expected MFSP of $4.20 per gallon with a standard deviation of 1.15, while the ex situ catalytic pyrolysis has a similar MFSP with a smaller deviation ($4.27 per gallon and 0.79 respectively). These results suggest that a biorefinery based on ex situ catalytic pyrolysis could have a lower techno-economic risk than in situ pyrolysis despite a slightly higher MFSP cost estimate. Analysis of how each parameter affects the NPV indicates that internal rate of return, feedstock price, total project investment, electricity price, biochar yield and bio-oil yield are significant parameters which have substantial impact on the MFSP for both in situ and ex situ catalytic pyrolysis.
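
    A stochastic simulation of this kind can be sketched as a simple Monte Carlo over parameter distributions; the distributions and the toy MFSP formula below are illustrative assumptions, not the study's process model.

        import numpy as np

        # Illustrative Monte Carlo over uncertain techno-economic parameters.
        rng = np.random.default_rng(3)
        n = 100_000
        feedstock_price = rng.normal(80, 10, n)    # $/dry ton (assumed)
        fuel_yield = rng.normal(60, 8, n)          # gal/dry ton (assumed)
        capital_charge = rng.normal(1.5, 0.3, n)   # $/gal, annualized capital + fixed costs
        other_opex = rng.normal(0.8, 0.2, n)       # $/gal

        mfsp = feedstock_price / fuel_yield + capital_charge + other_opex   # $/gal
        print(f"mean MFSP: ${mfsp.mean():.2f}/gal, std dev: {mfsp.std():.2f}")
        print(f"5th-95th percentile: ${np.percentile(mfsp, 5):.2f} - ${np.percentile(mfsp, 95):.2f}/gal")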

  9. Application of in situ diffraction in high-throughput structure determination platforms.

    Science.gov (United States)

    Aller, Pierre; Sanchez-Weatherby, Juan; Foadi, James; Winter, Graeme; Lobley, Carina M C; Axford, Danny; Ashton, Alun W; Bellini, Domenico; Brandao-Neto, Jose; Culurgioni, Simone; Douangamath, Alice; Duman, Ramona; Evans, Gwyndaf; Fisher, Stuart; Flaig, Ralf; Hall, David R; Lukacik, Petra; Mazzorana, Marco; McAuley, Katherine E; Mykhaylyk, Vitaliy; Owen, Robin L; Paterson, Neil G; Romano, Pierpaolo; Sandy, James; Sorensen, Thomas; von Delft, Frank; Wagner, Armin; Warren, Anna; Williams, Mark; Stuart, David I; Walsh, Martin A

    2015-01-01

    Macromolecular crystallography (MX) is the most powerful technique available to structural biologists to visualize in atomic detail the macromolecular machinery of the cell. Since the emergence of structural genomics initiatives, significant advances have been made in all key steps of the structure determination process. In particular, third-generation synchrotron sources and the application of highly automated approaches to data acquisition and analysis at these facilities have been the major factors in the rate of increase of macromolecular structures determined annually. A plethora of tools are now available to users of synchrotron beamlines to enable rapid and efficient evaluation of samples, collection of the best data, and in favorable cases structure solution in near real time. Here, we provide a short overview of the emerging use of collecting X-ray diffraction data directly from the crystallization experiment. These in situ experiments are now routinely available to users at a number of synchrotron MX beamlines. A practical guide to the use of the method on the MX suite of beamlines at Diamond Light Source is given.

  10. ERP processes automation in corporate environments

    Directory of Open Access Journals (Sweden)

    Antonoaie Victor

    2017-01-01

    Full Text Available The automation processes are used in organizations to speed up analysis processes and reduce manual labour. Robotic Automation of IT processes implemented in a modern corporate workspace provides an excellent tool for assisting professionals in making decisions, saving resources and serving as a know-how repository. This study presents the newest trends in process automation and its benefits, such as security, ease of use and reduction of overall process duration, and provides examples of SAP ERP projects where this technology was implemented and a meaningful impact was obtained.

  11. Automated data collection in single particle electron microscopy

    Science.gov (United States)

    Tan, Yong Zi; Cheng, Anchi; Potter, Clinton S.; Carragher, Bridget

    2016-01-01

    Automated data collection is an integral part of modern workflows in single particle electron microscopy (EM) research. This review surveys the software packages available for automated single particle EM data collection. The degree of automation at each stage of data collection is evaluated, and the capabilities of the software packages are described. Finally, future trends in automation are discussed. PMID:26671944

  12. Chromosomal Rearrangements in Post-Chernobyl Papillary Thyroid Carcinomas: Evaluation by Spectral Karyotyping and Automated Interphase FISH

    Directory of Open Access Journals (Sweden)

    Ludwig Hieber

    2011-01-01

    Full Text Available Structural genomic rearrangements are frequent findings in human cancers. Therefore, papillary thyroid carcinomas (PTCs) were investigated for chromosomal aberrations and rearrangements of the RET proto-oncogene. For this purpose, primary cultures from 23 PTC have been established and metaphase preparations were analysed by spectral karyotyping (SKY). In addition, interphase cell preparations of the same cases were investigated by fluorescence in situ hybridisation (FISH) for the presence of RET/PTC rearrangements using RET-specific DNA probes. SKY analysis of PTC revealed structural aberrations of chromosome 11 and several numerical aberrations with frequent loss of chromosomes 20, 21, and 22. FISH analysis for RET/PTC rearrangements showed a prevalence of this rearrangement in 72% (16 out of 22) of cases. However, only subpopulations of tumour cells exhibited this rearrangement, indicating genetic heterogeneity. The comparison of visual and automated scoring of FISH signals revealed concordant results in 19 out of 22 cases (87%), indicating reliable scoring results using the optimised scoring parameter for RET/PTC with the automated Metafer4 system. It can be concluded from this study that genomic rearrangements are frequent in PTC and therefore important events in thyroid carcinogenesis.

  13. An automated, noncontact laser profile meter for measuring soil roughness in situ

    International Nuclear Information System (INIS)

    Bertuzzi, P.; Caussignac, J.M.; Stengel, P.; Morel, G.; Lorendeau, J.Y.; Pelloux, G.

    1990-01-01

    This paper describes a new optical technique for measuring in situ soil surface roughness profiles using a laser profile meter. The described method uses a low-power HeNe (helium-neon) laser as the source and a matrix-array detector to record the laser image. The matrix-array detector gives a defect-of-focus laser image of the soil. Soil elevation is measured by projecting a laser beam normally onto the soil surface and measuring, on the matrix-array detector, the ratio (Ir/It) between the reference intensity of the returned laser beam (Ir), measured by the central cell of the detector, and the total intensity (It), measured by all the cells of the detector. The measured profile yields 1001 sampled values (volts, range 0 to 10 V) of the surface height profile, at a constant increment of 0.002 m, registered automatically on a microcomputer. A calibration is made in the laboratory in order to convert the electrical measurements into elevation data. The method is universal and can be adapted to different scales of soil surface roughness; changing the scale is done by changing the lens. Tests were carried out to improve this method for field use and to compare this technique with a reference method. This technique is considerably quicker and causes no disturbance to the soil. The accuracy of the height measurement depends on the choice of the lens. The small focal lens is convenient for smooth soil surfaces; the accuracy of height measurement is then less than 0.75 mm. The wide focal lens is convenient for rough soil surfaces; the accuracy of height measurement is then estimated at about 1.0 to 1.5 mm.
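
    As a small illustration of how such a profile might be post-processed, the sketch below converts sampled sensor voltages to elevations with an assumed linear calibration and computes a basic roughness statistic; the calibration constants and the synthetic signal are hypothetical.

        import numpy as np

        # Hypothetical post-processing of a 1001-point profile sampled at 0.002 m steps.
        n_points, step_m = 1001, 0.002
        x = np.arange(n_points) * step_m                       # 0 to 2 m along the profile
        volts = np.clip(5 + np.cumsum(np.random.default_rng(4).normal(0, 0.02, n_points)), 0, 10)

        # Assumed linear laboratory calibration: elevation = a * volts + b.
        a_m_per_volt, b_m = 0.004, -0.02
        z = a_m_per_volt * volts + b_m

        detrended = z - np.polyval(np.polyfit(x, z, 1), x)     # remove the overall slope
        rms_mm = 1000 * np.sqrt(np.mean(detrended ** 2))
        print(f"profile length: {x[-1]:.3f} m, RMS roughness: {rms_mm:.2f} mm")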

  14. Automation trust and attention allocation in multitasking workspace.

    Science.gov (United States)

    Karpinsky, Nicole D; Chancey, Eric T; Palmer, Dakota B; Yamani, Yusuke

    2018-07-01

    Previous research suggests that operators with high workload can distrust and then poorly monitor automation, which has been generally inferred from automation dependence behaviors. To test automation monitoring more directly, the current study measured operators' visual attention allocation, workload, and trust toward imperfect automation in a dynamic multitasking environment. Participants concurrently performed a manual tracking task with two levels of difficulty and a system monitoring task assisted by an unreliable signaling system. Eye movement data indicate that operators allocate less visual attention to monitor automation when the tracking task is more difficult. Participants reported reduced levels of trust toward the signaling system when the tracking task demanded more focused visual attention. Analyses revealed that trust mediated the relationship between the load of the tracking task and attention allocation in Experiment 1, an effect that was not replicated in Experiment 2. Results imply a complex process underlying task load, visual attention allocation, and automation trust during multitasking. Automation designers should consider operators' task load in multitasking workspaces to avoid reduced automation monitoring and distrust toward imperfect signaling systems. Copyright © 2018. Published by Elsevier Ltd.

  15. Automated Assessment in Massive Open Online Courses

    Science.gov (United States)

    Ivaniushin, Dmitrii A.; Shtennikov, Dmitrii G.; Efimchick, Eugene A.; Lyamin, Andrey V.

    2016-01-01

    This paper describes an approach to using automated assessments in online courses. The Open edX platform is used as the online course platform. The new assessment type uses Scilab as the learning and solution validation tool. This approach allows the use of automated individual variant generation and automated solution checks without involving the course…
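
    A minimal sketch of the general idea of per-student variant generation plus automated solution checking (written in Python here, whereas the platform described uses Scilab for validation) might look like:

        import random

        # Toy illustration of individual variant generation + automated checking;
        # this is not the Open edX/Scilab integration described above.
        def make_variant(student_id: str):
            rng = random.Random(student_id)            # deterministic per student
            a, b = rng.randint(2, 9), rng.randint(10, 99)
            return f"Solve a*x = b for x, with a={a}, b={b}.", {"a": a, "b": b}

        def check_solution(params, submitted_x, tol=1e-6):
            return abs(submitted_x - params["b"] / params["a"]) < tol

        prompt, params = make_variant("student-42")
        print(prompt)
        print("correct" if check_solution(params, params["b"] / params["a"]) else "incorrect")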

  16. I trust it, but I don't know why: effects of implicit attitudes toward automation on trust in an automated system.

    Science.gov (United States)

    Merritt, Stephanie M; Heimbaugh, Heather; LaChapell, Jennifer; Lee, Deborah

    2013-06-01

    This study is the first to examine the influence of implicit attitudes toward automation on users' trust in automation. Past empirical work has examined explicit (conscious) influences on user level of trust in automation but has not yet measured implicit influences. We examine concurrent effects of explicit propensity to trust machines and implicit attitudes toward automation on trust in an automated system. We examine differential impacts of each under varying automation performance conditions (clearly good, ambiguous, clearly poor). Participants completed both a self-report measure of propensity to trust and an Implicit Association Test measuring implicit attitude toward automation, then performed an X-ray screening task. Automation performance was manipulated within-subjects by varying the number and obviousness of errors. Explicit propensity to trust and implicit attitude toward automation did not significantly correlate. When the automation's performance was ambiguous, implicit attitude significantly affected automation trust, and its relationship with propensity to trust was additive: Increments in either were related to increases in trust. When errors were obvious, a significant interaction between the implicit and explicit measures was found, with those high in both having higher trust. Implicit attitudes have important implications for automation trust. Users may not be able to accurately report why they experience a given level of trust. To understand why users trust or fail to trust automation, measurements of implicit and explicit predictors may be necessary. Furthermore, implicit attitude toward automation might be used as a lever to effectively calibrate trust.

  17. Extrasolar Planetary Imaging Coronagraph: Visible Nulling Coronagraph Testbed Results

    Science.gov (United States)

    Lyon, Richard G.

    2008-01-01

    The Extrasolar Planetary Imaging Coronagraph (EPIC) is a proposed NASA Discovery mission to image and characterize extrasolar giant planets in orbits with semi-major axes between 2 and 10 AU. EPIC will provide insights into the physical nature of a variety of planets in other solar systems, complementing radial velocity (RV) and astrometric planet searches. It will detect and characterize the atmospheres of planets identified by radial velocity surveys, determine orbital inclinations and masses, characterize the atmospheres around A and F stars, and observe the inner spatial structure and colors of inner Spitzer-selected debris disks. EPIC would be launched to a heliocentric, Earth-trailing, drift-away orbit, with a 3-year mission lifetime (5-year goal), and will revisit planets at least three times at intervals of 9 months. The starlight suppression approach consists of a visible nulling coronagraph (VNC) that enables high-order starlight suppression in broadband light. To demonstrate the VNC approach and advance its technology readiness, the NASA Goddard Space Flight Center and Lockheed-Martin have developed a laboratory VNC and have demonstrated white light nulling. We will discuss our ongoing VNC work and show the latest results from the VNC testbed.

  18. In situ and ex situ modifications of bacterial cellulose for applications in tissue engineering.

    Science.gov (United States)

    Stumpf, Taisa Regina; Yang, Xiuying; Zhang, Jingchang; Cao, Xudong

    2018-01-01

    Bacterial cellulose (BC) is secreted by a few strains of bacteria and consists of a cellulose nanofiber network with unique characteristics. Because of its excellent mechanical properties, outstanding biocompatibility, and ability to form porous structures, BC has been studied for a variety of applications in different fields, including use as a biomaterial for scaffolds in tissue engineering. To extend its applications in tissue engineering, native BC is normally modified to enhance its properties. Generally, BC modifications can be made either by in situ modification during cell culture or by ex situ modification of existing BC microfibers. In this review we will first provide a brief introduction to BC and its attributes; this will set the stage for in-depth and up-to-date discussions on modified BC. Finally, the review will focus on in situ and ex situ modifications of BC and its applications in tissue engineering, particularly in bone regeneration and wound dressing. Copyright © 2016. Published by Elsevier B.V.

  19. Instrumentation and process control development for in situ coal gasification. Fourth quarterly report, September--November 1975

    Energy Technology Data Exchange (ETDEWEB)

    Northrop, D.A. (ed.)

    1976-01-01

    The instrumentation effort for Phases 2 and 3 of the Second Hanna In Situ Coal Gasification Experiment was fielded and background data obtained prior to the initiation of Phase 2 on November 25, 1975. A total of over 600 channels of instrumentation in 15 instrumentation wells and two surface arrays was fielded for the instrumentation techniques under evaluation. The feasibility of the passive acoustic technique to locate the source of process-related noises has been demonstrated; its utility is presently hampered by the inexact definition of signal arrivals and the lack of automated signal monitoring and analysis systems. A revised mathematical model for the electrical techniques has been developed which demonstrates the potential for remote monitoring. (auth)

  20. Fatigue and voluntary utilization of automation in simulated driving.

    Science.gov (United States)

    Neubauer, Catherine; Matthews, Gerald; Langheim, Lisa; Saxby, Dyani

    2012-10-01

    A driving simulator was used to assess the impact on fatigue, stress, and workload of full vehicle automation that was initiated by the driver. Previous studies have shown that mandatory use of full automation induces a state of "passive fatigue" associated with loss of alertness. By contrast, voluntary use of automation may enhance the driver's perceptions of control and ability to manage fatigue. Participants were assigned to one of two experimental conditions, automation optional (AO) and nonautomation (NA), and then performed a 35 min, monotonous simulated drive. In the last 5 min, automation was unavailable and drivers were required to respond to an emergency event. Subjective state and workload were evaluated before and after the drive. Making automation available to the driver failed to alleviate fatigue and stress states induced by driving in monotonous conditions. Drivers who were fatigued prior to the drive were more likely to choose to use automation, but automation use increased distress, especially in fatigue-prone drivers. Drivers in the AO condition were slower to initiate steering responses to the emergency event, suggesting optional automation may be distracting. Optional, driver-controlled automation appears to pose the same dangers to task engagement and alertness as externally initiated automation. Drivers of automated vehicles may be vulnerable to fatigue that persists when normal vehicle control is restored. It is important to evaluate automated systems' impact on driver fatigue, to seek design solutions to the issue of maintaining driver engagement, and to address the vulnerabilities of fatigue-prone drivers.

  1. A Rural Next Generation Network (R-NGN) and Its Testbed

    Directory of Open Access Journals (Sweden)

    Armein Z. R. Langi

    2007-05-01

    Full Text Available Rural Next Generation Network (R-NGN) technology allows Internet protocol (IP) based systems to be used in rural areas. This paper reports a testbed of R-NGN that uses low-cost Ethernet radio links, combined with media gateways and a softswitch. The network consists of a point-to-point IP Ethernet 2.4 GHz wireless link, IP switches and gateways in each community, and standard copper wires and telephone sets for users. It has low power consumption and is suitable for low-density users. This combination allows low-cost systems as well as multiservices (voice, data, and multimedia) for rural communications. An infrastructure has been deployed in two communities in Cipicung Girang, a village 10 km outside Bandung city, Indonesia. Two towers link the communities with a network of the Institut Teknologi Bandung (ITB) campus. In addition, local wirelines connect community houses to the network. Currently there are four houses connected to each community node (for a total of eight houses), upon which we can perform various tests and measurements.

  3. Preface to the special section on human factors and automation in vehicles: designing highly automated vehicles with the driver in mind.

    Science.gov (United States)

    Merat, Natasha; Lee, John D

    2012-10-01

    This special section brings together diverse research regarding driver interaction with advanced automotive technology to guide design of increasingly automated vehicles. Rapidly evolving vehicle automation will likely change cars and trucks more in the next 5 years than the preceding 50, radically redefining what it means to drive. This special section includes 10 articles from European and North American researchers reporting simulator and naturalistic driving studies. Little research has considered the consequences of fully automated driving, with most focusing on lane-keeping and speed control systems individually. The studies reveal two underlying design philosophies: automate driving versus support driving. Results of several studies, consistent with previous research in other domains, suggest that the automate philosophy can delay driver responses to incidents in which the driver has to intervene and take control from the automation. Understanding how to orchestrate the transfer or sharing of control between the system and the driver, particularly in critical incidents, emerges as a central challenge. Designers should not assume that automation can substitute seamlessly for a human driver, nor can they assume that the driver can safely accommodate the limitations of automation. Designers, policy makers, and researchers must give careful consideration to what role the person should have in highly automated vehicles and how to support the driver if the driver is to be responsible for vehicle control. As in other domains, driving safety increasingly depends on the combined performance of the human and automation, and successful designs will depend on recognizing and supporting the new roles of the driver.

  4. User's guide to the Reliability Estimation System Testbed (REST)

    Science.gov (United States)

    Nicol, David M.; Palumbo, Daniel L.; Rifkin, Adam

    1992-01-01

    The Reliability Estimation System Testbed is an X-window based reliability modeling tool that was created to explore the use of the Reliability Modeling Language (RML). RML was defined to support several reliability analysis techniques including modularization, graphical representation, Failure Mode Effects Simulation (FMES), and parallel processing. These techniques are most useful in modeling large systems. Using modularization, an analyst can create reliability models for individual system components. The modules can be tested separately and then combined to compute the total system reliability. Because a one-to-one relationship can be established between system components and the reliability modules, a graphical user interface may be used to describe the system model. RML was designed to permit message passing between modules. This feature enables reliability modeling based on a run time simulation of the system wide effects of a component's failure modes. The use of failure modes effects simulation enhances the analyst's ability to correctly express system behavior when using the modularization approach to reliability modeling. To alleviate the computation bottleneck often found in large reliability models, REST was designed to take advantage of parallel processing on hypercube processors.
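
    The modular idea described here (per-component reliability models combined into a system estimate) can be illustrated with a toy calculation; the failure rates below are assumptions, the combination is a simple series system, and none of this is RML itself, which also supports message passing and failure-mode effects simulation.

        import math

        # Toy modular reliability estimate: exponential components in series.
        component_failure_rates = {"cpu": 1e-5, "bus": 5e-6, "io": 2e-5}   # per hour (assumed)
        mission_time_h = 10.0

        module_reliability = {name: math.exp(-lam * mission_time_h)
                              for name, lam in component_failure_rates.items()}
        system_reliability = math.prod(module_reliability.values())
        print(module_reliability)
        print(f"series-system reliability over {mission_time_h} h: {system_reliability:.6f}")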

  5. Virtual automation.

    Science.gov (United States)

    Casis, E; Garrido, A; Uranga, B; Vives, A; Zufiaurre, C

    2001-01-01

    Total laboratory automation (TLA) can be substituted in mid-size laboratories by computerized sample workflow control (virtual automation). Such a solution has been implemented in our laboratory for this purpose using PSM, software developed in cooperation with Roche Diagnostics (Barcelona, Spain). This software is connected to the online analyzers and to the laboratory information system and is able to control and direct the samples, working as an intermediate station. The only difference from TLA is the replacement of transport belts by laboratory personnel. The implementation of this virtual automation system has allowed us to achieve the main advantages of TLA: a workload increase (64%) with a reduction in the cost per test (43%), a significant reduction in the number of biochemistry primary tubes (from 8 to 2), less aliquoting (from 600 to 100 samples/day), automation of functional testing, a drastic reduction of preanalytical errors (from 11.7 to 0.4% of the tubes), and better total response times for both inpatients (from up to 48 hours to up to 4 hours) and outpatients (from up to 10 days to up to 48 hours). As an additional advantage, virtual automation could be implemented without hardware investment and with a significant headcount reduction (15% in our lab).
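
    A minimal sketch of the sample-routing logic such a "virtual automation" layer performs, sitting between the laboratory information system and the analyzers, is given below. All names and test menus are hypothetical; this is not the PSM software described in the record.

        # Conceptual sketch only: route ordered tests to analyzers and print
        # transfer instructions for the personnel who replace the transport belt.
        # Analyzer names and test menus are hypothetical.

        ANALYZER_MENU = {
            "chem-1": {"GLU", "CREA", "ALT"},
            "chem-2": {"GLU", "CREA", "TSH"},
        }

        def route_sample(sample_id, ordered_tests):
            """Assign each ordered test to an analyzer that offers it."""
            worklist = {}
            for test in ordered_tests:
                target = next(
                    (name for name, menu in ANALYZER_MENU.items() if test in menu),
                    None,
                )
                if target is None:
                    print(f"{sample_id}: {test} -> manual handling")
                else:
                    worklist.setdefault(target, []).append(test)
            for analyzer, tests in worklist.items():
                print(f"Take sample {sample_id} to {analyzer} for {', '.join(tests)}")

        route_sample("S-1042", ["GLU", "TSH", "HBA1C"])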

  6. The SENSEI Generic In Situ Interface

    Energy Technology Data Exchange (ETDEWEB)

    Ayachit, Utkarsh [Kitware, Inc., Clifton Park, NY (United States); Whitlock, Brad [Intelligent Light, Rutherford, NJ (United States); Wolf, Matthew [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Loring, Burlen [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Geveci, Berk [Kitware, Inc., Clifton Park, NY (United States); Lonie, David [Kitware, Inc., Clifton Park, NY (United States); Bethel, E. Wes [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2017-04-11

    The SENSEI generic in situ interface is an API that promotes code portability and reusability. From the simulation view, a developer can instrument their code with the SENSEI API and then make use of any number of in situ infrastructures. From the method view, a developer can write an in situ method using the SENSEI API, then expect it to run in any number of in situ infrastructures, or be invoked directly from a simulation code, with little or no modification. This paper presents the design principles underlying the SENSEI generic interface, along with some simplified coding examples.
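
    The adapter pattern behind such a generic interface can be sketched as follows. SENSEI itself is a C++ API; the Python sketch below only illustrates the idea of instrumenting a simulation once and swapping analysis back-ends freely, and the class names are invented for this sketch rather than taken from the SENSEI API.

        # Illustrative sketch of a generic in situ bridge: the simulation
        # publishes data through one interface, and analysis back-ends can be
        # exchanged without touching the solver. Names are invented.

        class AnalysisBackend:
            def execute(self, step, time, fields):
                raise NotImplementedError

        class MinMaxBackend(AnalysisBackend):
            def execute(self, step, time, fields):
                t = fields["temperature"]
                print(f"step {step}: T in [{min(t):.2f}, {max(t):.2f}]")

        class InSituBridge:
            """Simulation-facing side: instrument the code once, reuse any backend."""
            def __init__(self, backends):
                self.backends = backends
            def publish(self, step, time, fields):
                for backend in self.backends:
                    backend.execute(step, time, fields)

        # Simulation main loop instrumented with the bridge.
        bridge = InSituBridge([MinMaxBackend()])
        temperature = [20.0, 21.5, 19.8]
        for step in range(3):
            temperature = [t + 0.1 * step for t in temperature]  # stand-in for a solver
            bridge.publish(step, time=0.01 * step, fields={"temperature": temperature})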

  7. Cross-check of ex-situ and in-situ metrology of a bendable temperature stabilized KB mirror

    International Nuclear Information System (INIS)

    Yuan Sheng; Goldberg, Kenneth A.; Yashchuk, Valeriy V.; Celestre, Richard; McKinney, Wayne R.; Morrison, Gregory; Macdougall, James; Mochi, Iacopo; Warwick, Tony

    2011-01-01

    At the Advanced Light Source (ALS), we are developing broadly applicable, high-accuracy, in-situ, at-wavelength wavefront slope measurement techniques for Kirkpatrick-Baez (KB) mirror nano-focusing. In this paper, we report an initial cross-check of ex-situ and in-situ metrology of a bendable temperature stabilized KB mirror. This cross-check provides a validation of the in-situ shearing interferometry, currently under development at the ALS.

  8. Measuring Technology and Mechatronics Automation in Electrical Engineering

    CERN Document Server

    2012-01-01

    Measuring Technology and Mechatronics Automation in Electrical Engineering includes select presentations on measuring technology and mechatronics automation related to electrical engineering, originally presented during the International Conference on Measuring Technology and Mechatronics Automation (ICMTMA2012). This Fourth ICMTMA, held at Sanya, China, offered a prestigious, international forum for scientists, engineers, and educators to present the state of the art of measuring technology and mechatronics automation research.

  9. SUPERSENSITIVE IN SITU HYBRIDIZATION BY TYRAMIDE SIGNAL AMPLIFICATION AND NANOGOLD SILVER STAINING: THE CONTRIBUTION OF AUTOMETALLOGRAPHY AND CATALYZED REPORTER DEPOSITION TO THE REJUVENATION OF IN SITU HYBRIDIZATION.

    Energy Technology Data Exchange (ETDEWEB)

    Tubbs, R. R.; Pettay, J.; Grogan, T.; Cheung, A. L. M.; Powell, R. D.; Hainfeld, J.; Hauser-Kronberger, C.; Hacker, G. W.

    2002-04-17

    It is peculiar that in situ hybridization (ISH), a technique with many similarities to immunohistochemistry (IHC), has not enjoyed the phenomenal growth in both basic research and clinical applications as has its sister technique IHC. Since the late 1970s, when immunoperoxidase techniques began to be applied to routine diagnostic material and to numerous research applications, there has been a natural evolution of the IHC procedure. Namely, only a few primary antibodies were available commercially at the onset, and only an indirect technique and the peroxidase-antiperoxidase (PAP) technique were in place as detection systems. With the advent of avidin-biotin detection systems and monoclonal antibodies, and a viable commercial market, extraordinary growth of the procedure's applications in clinical research and diagnostic pathology occurred during the subsequent two decades. Today, IHC is automated and widely used for research purposes and, to a large extent, has become a routine diagnostic "special stain" in most clinical laboratories. During the same period, ISH enjoyed very little growth in both research and diagnostic applications. What has accounted for this lack of maturation of the technique? The success of IHC is part of the reason: measuring a gene's encoded protein routinely and inexpensively, particularly as automation evolved, rendered IHC a more viable choice in many instances. The inherent comparative sensitivity of the procedures has also clearly been a factor. Unfortunately, the chromogenic procedures in place are often insufficiently sensitive to detect the relatively low DNA and RNA levels at which clinical utility is to be found.

  10. Advanced automation for in-space vehicle processing

    Science.gov (United States)

    Sklar, Michael; Wegerif, D.

    1990-01-01

    The primary objective of this 3-year planned study is to assure that the fully evolved Space Station Freedom (SSF) can support automated processing of exploratory mission vehicles. Current study assessments show that the extravehicular activity (EVA) and, to some extent, intravehicular activity (IVA) manpower required for the processing tasks far exceeds the available manpower. Furthermore, many processing tasks are either hazardous operations or they exceed EVA capability. Thus, automation is essential for SSF transportation node functionality. Here, advanced automation represents the replacement of human-performed tasks beyond the planned baseline automated tasks. Both physical tasks such as manipulation, assembly and actuation, and cognitive tasks such as visual inspection, monitoring and diagnosis, and task planning are considered. During this first year of activity, both the Phobos/Gateway Mars Expedition and Lunar Evolution missions proposed by the Office of Exploration have been evaluated. A methodology for choosing optimal tasks to be automated has been developed. Processing tasks for both missions have been ranked on the basis of automation potential. The underlying concept in evaluating and describing processing tasks has been the use of a common set of 'Primitive' task descriptions. Primitive or standard tasks have been developed both for manual or crew processing and for automated machine processing.
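
    The ranking step can be illustrated with a small sketch in which each processing task is decomposed into primitive operations and scored; the primitives, weights, and tasks below are invented for illustration and are not the study's actual methodology or data.

        # Hypothetical sketch of ranking processing tasks by automation potential.
        # Primitives, scores, and tasks are illustrative only.

        PRIMITIVE_SCORES = {              # higher = stronger case for automation
            "visual_inspection": 0.6,
            "manipulation": 0.7,
            "assembly": 0.5,
            "diagnosis": 0.4,
            "hazardous_handling": 0.9,    # strong incentive to remove the crew
        }

        TASKS = {
            "propellant transfer": ["hazardous_handling", "manipulation"],
            "truss assembly": ["assembly", "manipulation"],
            "thermal tile survey": ["visual_inspection", "diagnosis"],
        }

        def automation_potential(primitives):
            return sum(PRIMITIVE_SCORES[p] for p in primitives) / len(primitives)

        ranking = sorted(TASKS.items(),
                         key=lambda kv: automation_potential(kv[1]),
                         reverse=True)
        for task, prims in ranking:
            print(f"{task:22s} potential = {automation_potential(prims):.2f}")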

  11. Design, development, and demonstration of a fully LabVIEW controlled in situ electrochemical Fourier transform infrared setup combined with a wall-jet electrode to investigate the electrochemical interface of nanoparticulate electrocatalysts under reaction conditions.

    Science.gov (United States)

    Nesselberger, Markus; Ashton, Sean J; Wiberg, Gustav K H; Arenz, Matthias

    2013-07-01

    We present a detailed description of the construction of an in situ electrochemical ATR-FTIR setup combined with a wall-jet electrode to investigate the electrocatalytic properties of nanoparticulate catalysts in situ under controlled mass transport conditions. The presented setup allows the electrochemical interface to be probed while reaction rates are determined simultaneously. At the same time, the high level of automation allows it to be used as a standard tool in electrocatalysis research. The performance of the setup was demonstrated by probing the oxygen reduction reaction on a platinum black catalyst in sulfuric acid electrolyte.

  12. Development of Open Test-bed for Autonomous Operation in Nuclear Power Plants

    International Nuclear Information System (INIS)

    Kim, Seungmin; Heo, Gyunyoung

    2017-01-01

    The nuclear power industry also recognizes the need for automation. However, nuclear power is a technology whose failures can have a significant impact on human society, and, due to the uncertain legal responsibility for autonomous operation, the application and development of nuclear-related automation technology will progress significantly more slowly than in other industries. It is argued that AI and automation technology should not be applied to power plants prematurely, or without following the principle of applying proven technology, since nuclear power plants are operated as facilities of the highest safety and security level. The overall algorithm of the test bed is an autonomous operation algorithm (a rule-based algorithm, a learning-based algorithm, and a semi-autonomous operation algorithm) that judges the entry conditions of procedures through condition monitoring and enters the appropriate operating procedure. To build such a test bed, the heuristic parts of the existing procedures, as well as the heuristics used in circumstances not specified in the procedures, must be investigated. For the learning-based and semi-autonomous operation algorithms, MARS is used to extract operating data and operational logs and to try out the various diagnostic algorithms described above. Through the completion of these future tasks, a test bed whose performance can be compared with that of actual operators will be constructed, and its effectiveness can be checked and improved through competition with other research teams on the shared platform.
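
    The rule-based part of such a test bed can be sketched as a simple condition-monitoring check that decides whether the entry conditions of an operating procedure are met. The parameters, setpoints, and procedure names below are hypothetical and only illustrate the idea.

        # Minimal sketch (hypothetical parameters and setpoints) of a rule-based
        # procedure-entry check driven by condition monitoring.

        PROCEDURE_ENTRY_RULES = {
            "loss-of-feedwater procedure": lambda s: s["sg_level_pct"] < 25.0,
            "reactor-trip procedure": lambda s: (s["reactor_power_pct"] < 5.0
                                                 and s["rods_inserted"]),
        }

        def procedures_to_enter(plant_state):
            """Return the procedures whose entry conditions are currently met."""
            return [name for name, rule in PROCEDURE_ENTRY_RULES.items()
                    if rule(plant_state)]

        state = {"sg_level_pct": 21.3, "reactor_power_pct": 2.0, "rods_inserted": True}
        for procedure in procedures_to_enter(state):
            print(f"Entry condition met: {procedure}")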

  13. Automation of Test Cases for Web Applications : Automation of CRM Test Cases

    OpenAIRE

    Seyoum, Alazar

    2012-01-01

    The main theme of this project was to design a test automation framework for automating web-related test cases. Automating the test cases designed for testing a web interface provides a means of improving a software development process by shortening the testing phase of the software development life cycle. In this project, an existing AutoTester framework and the iMacros test automation tool were used. A CRM Test Agent was developed to integrate AutoTester with iMacros and to enable the AutoTester,...

  14. Active Sensing System with In Situ Adjustable Sensor Morphology

    Science.gov (United States)

    Nurzaman, Surya G.; Culha, Utku; Brodbeck, Luzius; Wang, Liyu; Iida, Fumiya

    2013-01-01

    Background: Despite the widespread use of sensors in engineering systems like robots and automation systems, the common paradigm is to have fixed sensor morphology tailored to fulfill a specific application. On the other hand, robotic systems are expected to operate in ever more uncertain environments. In order to cope with the challenge, it is worthy of note that biological systems show the importance of suitable sensor morphology and active sensing capability to handle different kinds of sensing tasks with particular requirements. Methodology: This paper presents a robotics active sensing system which is able to adjust its sensor morphology in situ in order to sense different physical quantities with desirable sensing characteristics. The approach taken is to use thermoplastic adhesive material, i.e. Hot Melt Adhesive (HMA). It will be shown that the thermoplastic and thermoadhesive nature of HMA enables the system to repeatedly fabricate, attach and detach mechanical structures with a variety of shape and size to the robot end effector for sensing purposes. Via active sensing capability, the robotic system utilizes the structure to physically probe an unknown target object with suitable motion and transduce the arising physical stimuli into information usable by a camera as its only built-in sensor. Conclusions/Significance: The efficacy of the proposed system is verified based on two results. Firstly, it is confirmed that suitable sensor morphology and active sensing capability enables the system to sense different physical quantities, i.e. softness and temperature, with desirable sensing characteristics. Secondly, given tasks of discriminating two visually indistinguishable objects with respect to softness and temperature, it is confirmed that the proposed robotic system is able to autonomously accomplish them. The way the results motivate new research directions which focus on in situ adjustment of sensor morphology will also be discussed. PMID:24416094

  15. Active sensing system with in situ adjustable sensor morphology.

    Science.gov (United States)

    Nurzaman, Surya G; Culha, Utku; Brodbeck, Luzius; Wang, Liyu; Iida, Fumiya

    2013-01-01

    Despite the widespread use of sensors in engineering systems like robots and automation systems, the common paradigm is to have fixed sensor morphology tailored to fulfill a specific application. On the other hand, robotic systems are expected to operate in ever more uncertain environments. In order to cope with the challenge, it is worthy of note that biological systems show the importance of suitable sensor morphology and active sensing capability to handle different kinds of sensing tasks with particular requirements. This paper presents a robotics active sensing system which is able to adjust its sensor morphology in situ in order to sense different physical quantities with desirable sensing characteristics. The approach taken is to use thermoplastic adhesive material, i.e. Hot Melt Adhesive (HMA). It will be shown that the thermoplastic and thermoadhesive nature of HMA enables the system to repeatedly fabricate, attach and detach mechanical structures with a variety of shape and size to the robot end effector for sensing purposes. Via active sensing capability, the robotic system utilizes the structure to physically probe an unknown target object with suitable motion and transduce the arising physical stimuli into information usable by a camera as its only built-in sensor. The efficacy of the proposed system is verified based on two results. Firstly, it is confirmed that suitable sensor morphology and active sensing capability enables the system to sense different physical quantities, i.e. softness and temperature, with desirable sensing characteristics. Secondly, given tasks of discriminating two visually indistinguishable objects with respect to softness and temperature, it is confirmed that the proposed robotic system is able to autonomously accomplish them. The way the results motivate new research directions which focus on in situ adjustment of sensor morphology will also be discussed.

  16. Atmosphere influence on in situ ion beam analysis of thin film growth

    International Nuclear Information System (INIS)

    Lin, Yuping; Krauss, A.R.; Gruen, D.M.; Chang, R.P.H.; Auciello, O.H.; Schultz, J.A.

    1994-01-01

    In situ, nondestructive surface characterization of thin-film growth processes in an environment of chemically active gas at pressures of several mTorr is required both for the understanding of growth processes in multicomponent films and layered heterostructures and for the improvement of process reproducibility and device reliability. The authors have developed a differentially pumped pulsed ion beam surface analysis system that includes ion scattering spectroscopy (ISS) and direct recoil spectroscopy (DRS), coupled to an automated ion beam sputter-deposition system (IBSD), to study film growth processes in an environment of chemically active gas, such as required for the growth of oxide, nitride, or diamond thin films. The influence of gas-phase scattering and gas-surface interactions on the ISS and DRS signal intensity and peak shape have been studied. From the intensity variation as a function of ambient gas pressure, the authors have calculated the mean free path and the scattering cross-section for a given combination of primary ion species and ambient gas. Depending on the system geometry and the combination of primary beam and background, it is shown that surface-specific data can be obtained during thin-film growth at pressures ranging from a few mtorr to approximately 1 Torr. Detailed information such as surface composition, structure, and film growth mechanism may be obtained in real-time, making ion beam analysis an ideal nondestructive, in situ probe of thin-film growth processes
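
    The pressure-dependence analysis mentioned above amounts to fitting an exponential attenuation law, I(p) = I0 exp(-p sigma L / (k T)), to the signal intensity versus ambient pressure and reading off an effective cross-section and mean free path. The sketch below uses synthetic numbers; the path length, temperature, and data points are assumptions, not values from the record.

        # Fit exponential attenuation of the ion signal versus ambient pressure
        # to extract an effective scattering cross-section and mean free path.
        # All numbers are synthetic/assumed.

        import math

        K_B = 1.380649e-23          # J/K
        T = 300.0                   # K, assumed gas temperature
        L = 0.30                    # m, assumed ion flight path through the gas

        # Synthetic (pressure in Pa, normalized intensity) pairs.
        data = [(0.1, 0.97), (0.5, 0.86), (1.0, 0.74), (2.0, 0.55), (4.0, 0.30)]

        # Linear least-squares fit of ln(I) = ln(I0) - slope * p.
        n = len(data)
        sx = sum(p for p, _ in data)
        sy = sum(math.log(i) for _, i in data)
        sxx = sum(p * p for p, _ in data)
        sxy = sum(p * math.log(i) for p, i in data)
        slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)   # = -sigma * L / (k * T)

        sigma = -slope * K_B * T / L                        # effective cross-section, m^2
        mfp_at_1pa = K_B * T / (1.0 * sigma)                # mean free path at 1 Pa, m
        print(f"cross-section ~ {sigma:.3e} m^2, mean free path at 1 Pa ~ {mfp_at_1pa:.2f} m")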

  17. Integrating In-Situ and Ex-Situ Data Management Processes for Biodiversity Conservation

    Directory of Open Access Journals (Sweden)

    Karin R. Schwartz

    2017-10-01

    Full Text Available There is an increasing need for a “one plan approach” for conservation strategies that integrate in-situ and ex-situ management processes. Zoological institutions contribute directly to threatened species conservation through paradigms, such as reintroduction, head-starting, supplementation, or rescue/rehabilitation/release. This in-situ/ex-situ integration necessitates collaboration at all levels of conservation action including planning, implementation, monitoring and assessment to drive adaptive management processes. Each component is dependent on the availability and accuracy of data for evidence to facilitate evaluation and adaptive management processes. The Zoological Information Management System (ZIMS, managed by Species360, is a centralized web-based information system used in zoological institutions worldwide to pool life history, behavior and health data and facilitate animal husbandry, health, and breeding management processes. Currently used for few integrated conservation programs, ZIMS is an innovative tool that offers a new opportunity to link data management processes for animals that spend a part of their lives under human care and part in their natural environment and has great potential for use in managed wild populations.

  18. Cost Accounting in the Automated Manufacturing Environment

    Science.gov (United States)

    1988-06-01

    DTIC report front matter (OCR-garbled in the source record); the recoverable information is the thesis title, Cost Accounting in the Automated Manufacturing Environment (Naval Postgraduate School, Monterey, California), and the subject terms: Cost Accounting; Product Costing; Automated Manufacturing; CAD/CAM-CIM.

  19. Automated high-pressure titration system with in situ infrared spectroscopic detection

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, Christopher J., E-mail: chris.thompson@pnnl.gov; Martin, Paul F.; Chen, Jeffrey; Schaef, Herbert T.; Rosso, Kevin M.; Felmy, Andrew R.; Loring, John S. [Pacific Northwest National Laboratory, Richland, Washington 99352 (United States); Benezeth, Pascale [Géosciences Environnement Toulouse (GET), CNRS-Université de Toulouse, 31400 Toulouse (France)

    2014-04-15

    A fully automated titration system with infrared detection was developed for investigating interfacial chemistry at high pressures. The apparatus consists of a high-pressure fluid generation and delivery system coupled to a high-pressure cell with infrared optics. A manifold of electronically actuated valves is used to direct pressurized fluids into the cell. Precise reagent additions to the pressurized cell are made with calibrated tubing loops that are filled with reagent and placed in-line with the cell and a syringe pump. The cell's infrared optics facilitate both transmission and attenuated total reflection (ATR) measurements to monitor bulk-fluid composition and solid-surface phenomena such as adsorption, desorption, complexation, dissolution, and precipitation. Switching between the two measurement modes is accomplished with moveable mirrors that direct the light path of a Fourier transform infrared spectrometer into the cell along transmission or ATR light paths. The versatility of the high-pressure IR titration system was demonstrated with three case studies. First, we titrated water into supercritical CO2 (scCO2) to generate an infrared calibration curve and determine the solubility of water in CO2 at 50 °C and 90 bar. Next, we characterized the partitioning of water between a montmorillonite clay and scCO2 at 50 °C and 90 bar. Transmission-mode spectra were used to quantify changes in the clay's sorbed water concentration as a function of scCO2 hydration, and ATR measurements provided insights into competitive residency of water and CO2 on the clay surface and in the interlayer. Finally, we demonstrated how time-dependent studies can be conducted with the system by monitoring the carbonation reaction of forsterite (Mg2SiO4) in water-bearing scCO2 at 50 °C and 90 bar. Immediately after water dissolved in the scCO2, a thin film of adsorbed water formed on the mineral surface, and the film thickness increased with time.
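
    The loop-injection sequence described above can be sketched as a short control script: fill the calibrated loop with reagent, switch the valves so the loop sits in line between the syringe pump and the cell, displace its contents into the cell, then record spectra in the requested mode. All class and method names below are invented for illustration; they are not the actual control software of this apparatus.

        # Hypothetical control sketch of a loop-injection titration step sequence.

        class TitrationRig:
            def __init__(self, loop_volume_ul):
                self.loop_volume_ul = loop_volume_ul
                self.delivered_ul = 0.0

            def fill_loop(self, reagent):
                print(f"Filling {self.loop_volume_ul} uL loop with {reagent}")

            def inject_loop(self):
                print("Switching valves: loop in line with cell and syringe pump")
                print("Advancing syringe pump to displace loop contents into cell")
                self.delivered_ul += self.loop_volume_ul

            def record_spectrum(self, mode):
                assert mode in ("transmission", "ATR")
                print(f"Moving mirrors to {mode} path; acquiring FTIR spectrum")

        rig = TitrationRig(loop_volume_ul=50.0)
        for step in range(3):                      # three incremental water additions
            rig.fill_loop("water")
            rig.inject_loop()
            rig.record_spectrum("transmission")    # bulk-fluid composition
            rig.record_spectrum("ATR")             # surface/interlayer response
        print(f"Total water delivered: {rig.delivered_ul} uL")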

  20. Automated high-pressure titration system with in situ infrared spectroscopic detection

    International Nuclear Information System (INIS)

    Thompson, Christopher J.; Martin, Paul F.; Chen, Jeffrey; Schaef, Herbert T.; Rosso, Kevin M.; Felmy, Andrew R.; Loring, John S.; Benezeth, Pascale

    2014-01-01

    A fully automated titration system with infrared detection was developed for investigating interfacial chemistry at high pressures. The apparatus consists of a high-pressure fluid generation and delivery system coupled to a high-pressure cell with infrared optics. A manifold of electronically actuated valves is used to direct pressurized fluids into the cell. Precise reagent additions to the pressurized cell are made with calibrated tubing loops that are filled with reagent and placed in-line with the cell and a syringe pump. The cell's infrared optics facilitate both transmission and attenuated total reflection (ATR) measurements to monitor bulk-fluid composition and solid-surface phenomena such as adsorption, desorption, complexation, dissolution, and precipitation. Switching between the two measurement modes is accomplished with moveable mirrors that direct the light path of a Fourier transform infrared spectrometer into the cell along transmission or ATR light paths. The versatility of the high-pressure IR titration system was demonstrated with three case studies. First, we titrated water into supercritical CO2 (scCO2) to generate an infrared calibration curve and determine the solubility of water in CO2 at 50 °C and 90 bar. Next, we characterized the partitioning of water between a montmorillonite clay and scCO2 at 50 °C and 90 bar. Transmission-mode spectra were used to quantify changes in the clay's sorbed water concentration as a function of scCO2 hydration, and ATR measurements provided insights into competitive residency of water and CO2 on the clay surface and in the interlayer. Finally, we demonstrated how time-dependent studies can be conducted with the system by monitoring the carbonation reaction of forsterite (Mg2SiO4) in water-bearing scCO2 at 50 °C and 90 bar. Immediately after water dissolved in the scCO2, a thin film of adsorbed water formed on the mineral surface, and the film thickness increased with time.

  1. Use of Bioregenerative Technologies for Advanced Life Support: Some Considerations for BIO-Plex and Related Testbeds

    Science.gov (United States)

    Wheeler, Raymond M.; Strayer, Richard F.

    1997-01-01

    A review of bioregenerative life support concepts is provided as a guide for developing ground-based testbeds for NASA's Advanced Life Support Program. Key among these concepts are the use of controlled environment plant culture for the production of food, oxygen, and clean water, and the use of bacterial bioreactors for degrading wastes and recycling nutrients. Candidate crops and specific bioreactor approaches are discussed based on experiences from the Kennedy Space Center Advanced Life Support Breadboard Project, and a review of related literature is provided.

  2. Automation in a material processing/storage facility

    International Nuclear Information System (INIS)

    Peterson, K.; Gordon, J.

    1997-01-01

    The Savannah River Site (SRS) is currently developing a new facility, the Actinide Packaging and Storage Facility (APSF), to process and store legacy materials from the United States nuclear stockpile. A variety of materials, with a variety of properties, packaging and handling/storage requirements, will be processed and stored at the facility. Since these materials are hazardous and radioactive, automation will be used to minimize worker exposure. Other benefits derived from automation of the facility include increased throughput capacity and enhanced security. The diversity of materials and packaging geometries to be handled poses challenges to the automation of facility processes. In addition, the nature of the materials to be processed underscores the need for safety, reliability and serviceability. The application of automation in this facility must, therefore, be accomplished in a rational and disciplined manner to satisfy the strict operational requirements of the facility. Among the functions to be automated are the transport of containers between process and storage areas via an Automatic Guided Vehicle (AGV), and various processes in the Shipping Package Unpackaging (SPU) area, the Accountability Measurements (AM) area, the Special Isotope Storage (SIS) vault and the Special Nuclear Materials (SNM) vault. Other areas of the facility are also being automated, but are outside the scope of this paper

  3. Effect of silica particles modified by in-situ and ex-situ methods on the reinforcement of silicone rubber

    International Nuclear Information System (INIS)

    Song, Yingze; Yu, Jinhong; Dai, Dan; Song, Lixian; Jiang, Nan

    2014-01-01

    Highlights: • In-situ and ex-situ methods were applied to modify silica particles. • The in-situ method was more beneficial for preparing silica particles with a high BET surface area. • Silicone rubber filled with in-situ modified silica exhibits excellent mechanical and thermal properties. Abstract: In-situ and ex-situ methods were applied to modify silica particles in order to investigate their effects on the reinforcement of silicone rubber. A surface area and pore analyzer, a laser particle size analyzer, Fourier-transform infrared spectroscopy (FTIR), a contact-angle instrument, and transmission electron microscopy (TEM) were utilized to investigate the structure and properties of the modified silica particles. A dynamic mechanical thermal analyzer (DMTA) was employed to characterize the vulcanizing behavior and mechanical properties of the composites. Thermogravimetric analysis (TGA) was performed to test the thermal stability of the composites. FTIR and contact-angle analysis indicated that the silica particles were successfully modified by both methods. The BET surface area and TEM results showed that in-situ modification was more beneficial for preparing silica particles with irregular shape and higher BET surface area in comparison with ex-situ modification. The DMTA and TGA data revealed that, compared with ex-situ modification, in-situ modification produced a positive influence on the reinforcement of silicone rubber.

  4. Automation of analytical systems in power cycles

    International Nuclear Information System (INIS)

    Staub Lukas

    2008-01-01

    'Automation' is a widely used term in instrumentation and is often applied to signal exchange, PLC and SCADA systems. Common use, however, does not necessarily describe autonomous operation of analytical devices. We define an automated analytical system as a black box with an input (sample) and an output (measured value). In addition, we need dedicated status lines for assessing the validities of the input for our black box and the output for subsequent systems. We will discuss input parameters, automated analytical processes and output parameters. Further considerations will be given to signal exchange and integration into the operating routine of a power plant. Local control loops (chemical dosing) and the automation of sampling systems are not discussed here. (author)
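
    The "black box with status lines" definition can be made concrete with a small sketch: a result is forwarded only if both the input-validity and output-validity flags are good. Names, the calibration factor, and limits are illustrative assumptions.

        # Sketch of an automated analytical system as a black box with status lines.

        from dataclasses import dataclass

        @dataclass
        class AnalyserResult:
            value_ppb: float
            input_valid: bool    # e.g. sample flow and conditioning were in range
            output_valid: bool   # e.g. measured value within the calibrated range

        def analyse(sample_flow_ok: bool, raw_signal: float) -> AnalyserResult:
            value = 2.5 * raw_signal            # illustrative calibration factor
            return AnalyserResult(
                value_ppb=value,
                input_valid=sample_flow_ok,
                output_valid=0.0 <= value <= 1000.0,
            )

        result = analyse(sample_flow_ok=True, raw_signal=3.1)
        if result.input_valid and result.output_valid:
            print(f"Forward to the plant systems: {result.value_ppb:.1f} ppb")
        else:
            print("Flag the value as invalid for downstream use")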

  5. MODELING CIRCUMSTELLAR DISKS OF B-TYPE STARS WITH OBSERVATIONS FROM THE PALOMAR TESTBED INTERFEROMETER

    International Nuclear Information System (INIS)

    Grzenia, B. J.; Tycner, C.; Jones, C. E.; Sigut, T. A. A.; Rinehart, S. A.; Van Belle, G. T.

    2013-01-01

    Geometrical (uniform disk) and numerical models were calculated for a set of B-emission (Be) stars observed with the Palomar Testbed Interferometer (PTI). Physical extents have been estimated for the disks of a total of 15 stars via uniform disk models. Our numerical non-LTE models used parameters for the B0, B2, B5, and B8 spectral classes and, following the framework laid out by previous studies, we have compared them to infrared K-band interferometric observations taken at PTI. This is the first time such an extensive set of Be stars observed with long-baseline interferometry has been analyzed with self-consistent non-LTE numerical disk models.
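
    The uniform disk model referred to above has a standard closed form for the squared visibility, V^2 = [2 J1(x)/x]^2 with x = pi * theta * B / lambda, which is how angular extents are estimated from interferometric data. The baseline, wavelength, and disk diameter in the sketch below are placeholder values, not PTI measurements.

        # Uniform-disk squared visibility; placeholder numbers, not PTI data.

        import math
        from scipy.special import j1

        MAS_TO_RAD = math.pi / (180.0 * 3600.0 * 1000.0)

        def uniform_disk_v2(theta_mas, baseline_m, wavelength_m):
            """Squared visibility of a uniform disk of angular diameter theta_mas."""
            x = math.pi * theta_mas * MAS_TO_RAD * baseline_m / wavelength_m
            if x == 0.0:
                return 1.0
            return (2.0 * j1(x) / x) ** 2

        # Example: a 1.0 mas disk on a 100 m baseline in the K band (2.2 microns).
        print(f"V^2 = {uniform_disk_v2(1.0, 100.0, 2.2e-6):.3f}")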

  6. Study on the fabrication of Al matrix composites strengthened by combined in-situ alumina particle and in-situ alloying elements

    International Nuclear Information System (INIS)

    Huang Zanjun; Yang Bin; Cui Hua; Zhang Jishan

    2003-01-01

    A new idea to fabricate aluminum matrix composites strengthened by combined in-situ particle strengthening and in-situ alloying has been proposed. Following the concept of in-situ alloying and in-situ particle strengthening, aluminum matrix composites reinforced by Cu and α-Al2O3 particulate (material I) and the same matrix reinforced by Cu, Si alloying elements and α-Al2O3 particulate (material II) have been obtained. SEM observation, EDS and XRD analysis show that the alloy elements Cu and Si exist in the two materials, respectively. The in-situ Al2O3 particulates are generally spherical and their mean size is less than 0.5 μm. TEM observation shows that the in-situ α-Al2O3 particulates have good cohesion with the matrix. The reaction mechanism of the Al2O3 particulate obtained by this method was studied. Thermodynamic considerations are given to the in-situ reactions, and the distribution characteristic of the in-situ α-Al2O3 particulate during solidification is also discussed.

  7. Breeding of in-situ Petroleum Degrading Bacteria in Hangzhou Bay and evaluating for the In-situ repair effect

    Science.gov (United States)

    Lan, Ru; Lin, Hai; Qiao, Bing; Dong, Yingbo; Zhang, Wei; Chang, Wen

    2018-02-01

    In this paper, the response of the in-situ microorganisms in seawater and sediments to a marine accidental oil spill was researched. An experimental study on the breeding of in-situ petroleum-degrading bacteria in the seawater and sediments of Hangzhou Bay and on the restoration of the oil spill was carried out. Making use of the reinforced microbial flora, combined with physical and chemical methods in the field environment, petroleum degradation and restoration experiments were performed, the effect of the breeding of in-situ degrading bacteria was evaluated, and a standard process of in-situ bacteria sampling, laboratory screening, domestication, and degradation-efficiency testing was formed. This study lays a foundation for further evaluation of the advantages and disadvantages of the petroleum-degrading bacteria of Hangzhou Bay during in-situ restoration. The results showed that the in-situ microbes of Hangzhou Bay could reach their growth peak in 5 days given suitable environmental factors and sufficient nutrient elements, and the degradation efficiency could reach 65.2% (or 74.8% after acclimation). The microbes could also adapt to the local seawater and environmental conditions, with a certain degree of degradation. These results can provide parameter support for causal judgment and quantitative assessment of oil spill damage.

  8. In Situ Hybridization Pada Kanker Payudara

    OpenAIRE

    Diah Witari, Ni Putu

    2014-01-01

    A difficulty encountered in the management of breast cancer is the occurrence of recurrence or relapse. Detection of a patient's HER2 status is one effort to detect relapse and also to determine the type of therapy to be given. HER2 protein expression can be detected with immunohistochemistry (IHC), whereas HER2 gene mutations can be detected with in situ hybridization techniques, either fluorescence in situ hybridization (FISH) or chromogenic in situ hybridization...

  9. Use of In-Situ and Remotely Sensed Snow Observations for the National Water Model in Both an Analysis and Calibration Framework.

    Science.gov (United States)

    Karsten, L. R.; Gochis, D.; Dugger, A. L.; McCreight, J. L.; Barlage, M. J.; Fall, G. M.; Olheiser, C.

    2017-12-01

    Since version 1.0 of the National Water Model (NWM) went operational in summer 2016, several upgrades to the model have occurred to improve hydrologic prediction for the continental United States. Version 1.1 of the NWM (spring 2017) includes upgrades to parameter datasets impacting land surface hydrologic processes. These parameter datasets were upgraded using an automated calibration workflow that utilizes the Dynamically Dimensioned Search (DDS) algorithm to adjust parameter values using observed streamflow. As such, these upgrades to parameter values took advantage of various observations collected for snow analysis, in particular in-situ SNOTEL observations in the Western US, volunteer in-situ observations across the entire US, gamma-derived snow water equivalent (SWE) observations courtesy of the NWS NOAA Corps program, gridded snow depth and SWE products from the Jet Propulsion Laboratory (JPL) Airborne Snow Observatory (ASO), gridded remotely sensed satellite-based snow products (MODIS, AMSR2, VIIRS, ATMS), and gridded SWE from the NWS Snow Data Assimilation System (SNODAS). This study explores the use of these observations to quantify NWM error and improvements from version 1.0 to version 1.1, along with subsequent work since then. In addition, this study explores the use of snow observations within the automated calibration workflow. Gridded parameter fields impacting the accumulation and ablation of snow states in the NWM were adjusted and calibrated using gridded remotely sensed snow states, SNODAS products, and in-situ snow observations. This calibration adjustment took place over various ecological regions in snow-dominated parts of the US for a retrospective period to capture a variety of climatological conditions. Specifically, the latest calibrated parameters impacting streamflow were held constant and only parameters impacting snow physics were tuned using snow observations and analyses. The adjusted parameter datasets were then used to
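
    DDS here refers to the Dynamically Dimensioned Search of Tolson and Shoemaker, whose core loop is sketched below: perturb a randomly chosen, gradually shrinking subset of parameters and keep the best candidate. The objective function in the sketch is a stand-in; in the workflow described above it would be an error metric computed against observed streamflow or snow analyses.

        # Minimal sketch of the Dynamically Dimensioned Search (DDS) loop.
        # The objective below is a stand-in for a hydrologic error metric.

        import math
        import random

        def dds(objective, bounds, max_iter=200, r=0.2, seed=1):
            rng = random.Random(seed)
            lo = [b[0] for b in bounds]
            hi = [b[1] for b in bounds]
            best = [rng.uniform(l, h) for l, h in zip(lo, hi)]
            best_val = objective(best)
            for i in range(1, max_iter + 1):
                p = 1.0 - math.log(i) / math.log(max_iter)   # fewer dims over time
                dims = [d for d in range(len(bounds)) if rng.random() < p]
                if not dims:
                    dims = [rng.randrange(len(bounds))]
                cand = best[:]
                for d in dims:
                    cand[d] += r * (hi[d] - lo[d]) * rng.gauss(0.0, 1.0)
                    # simple reflection at the bounds, then clamp
                    if cand[d] < lo[d]:
                        cand[d] = min(hi[d], 2 * lo[d] - cand[d])
                    elif cand[d] > hi[d]:
                        cand[d] = max(lo[d], 2 * hi[d] - cand[d])
                val = objective(cand)
                if val < best_val:                           # minimize the error metric
                    best, best_val = cand, val
            return best, best_val

        # Stand-in objective: pretend the "true" snow parameters are (2.5, 0.6).
        error = lambda x: (x[0] - 2.5) ** 2 + (x[1] - 0.6) ** 2
        params, err = dds(error, bounds=[(0.5, 5.0), (0.1, 1.0)])
        print(f"calibrated parameters ~ {params}, error = {err:.4f}")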

  10. Automation, consolidation, and integration in autoimmune diagnostics.

    Science.gov (United States)

    Tozzoli, Renato; D'Aurizio, Federica; Villalta, Danilo; Bizzaro, Nicola

    2015-08-01

    Over the past two decades, we have witnessed an extraordinary change in autoimmune diagnostics, characterized by the progressive evolution of analytical technologies, the availability of new tests, and the explosive growth of molecular biology and proteomics. Aside from these huge improvements, organizational changes have also occurred which brought about a more modern vision of the autoimmune laboratory. The introduction of automation (for harmonization of testing, reduction of human error, reduction of handling steps, increase of productivity, decrease of turnaround time, improvement of safety), consolidation (combining different analytical technologies or strategies on one instrument or on one group of connected instruments) and integration (linking analytical instruments or group of instruments with pre- and post-analytical devices) opened a new era in immunodiagnostics. In this article, we review the most important changes that have occurred in autoimmune diagnostics and present some models related to the introduction of automation in the autoimmunology laboratory, such as automated indirect immunofluorescence and changes in the two-step strategy for detection of autoantibodies; automated monoplex immunoassays and reduction of turnaround time; and automated multiplex immunoassays for autoantibody profiling.

  11. Working toward Transparency in Library Automation

    Science.gov (United States)

    Breeding, Marshall

    2007-01-01

    In this article, the author argues the need for transparency with regard to the automation systems used in libraries. As librarians make decisions regarding automation software and services, they should have convenient access to information about the organizations it will potentially acquire technology from and about the collective experiences of…

  12. PROSCARA Inc. in-situ burning summary paper

    International Nuclear Information System (INIS)

    1994-06-01

    In-situ burning as a viable response tactic in the event of an oil spill was discussed. Key factors which influence a decision to use burning were enumerated, including a detailed analysis of the environmental effects of in-situ burning on soils. The critical parameters were time, soil heating, and the extent of oil penetration into the soil. It was noted that on water-saturated and frozen soil, in-situ burning had no adverse effects. The advantages and disadvantages of in-situ burning vis-a-vis conventional mechanical recovery were discussed. Factors that do, and factors that do not, support decisions in favour of in-situ burning were listed. 4 refs., 2 tabs

  13. Ex-situ and in-situ mineral carbonation as a means to sequester carbon dioxide

    Energy Technology Data Exchange (ETDEWEB)

    Gerdemann, Stephen J.; Dahlin, David C.; O'Connor, William K.; Penner, Larry R.; Rush, G.E.

    2004-01-01

    The U. S. Department of Energy's Albany Research Center is investigating mineral carbonation as a method of sequestering CO2 from coal-fired-power plants. Magnesium-silicate minerals such as serpentine [Mg3Si2O5(OH)4] and olivine (Mg2SiO4) react with CO2 to produce magnesite (MgCO3), and the calcium-silicate mineral, wollastonite (CaSiO3), reacts to form calcite (CaCO3). It is possible to carry out these reactions either ex situ (above ground in a traditional chemical processing plant) or in situ (storage underground and subsequent reaction with the host rock to trap CO2 as carbonate minerals). For ex situ mineral carbonation to be economically attractive, the reaction must proceed quickly to near completion. The reaction rate is accelerated by raising the activity of CO2 in solution, heat (but not too much), reducing the particle size, high-intensity grinding to disrupt the crystal structure, and, in the case of serpentine, heat-treatment to remove the chemically bound water. All of these carry energy/economic penalties. An economic study illustrates the impact of mineral availability and process parameters on the cost of ex situ carbon sequestration. In situ carbonation offers economic advantages over ex situ processes, because no chemical plant is required. Knowledge gained from the ex situ work was applied to long-term experiments designed to simulate in situ CO2 storage conditions. The Columbia River Basalt Group (CRBG), a multi-layered basaltic lava formation, has potentially favorable mineralogy (up to 25% combined concentration of Ca, Fe2+, and Mg cations) for storage of CO2. However, more information about the interaction of CO2 with aquifers and the host rock is needed. Core samples from the CRBG, as well as samples of olivine, serpentine, and sandstone, were reacted in an autoclave for up to 2000 hours at elevated temperatures and pressures. Changes in core porosity, secondary mineralizations, and both solution and solid chemistry were measured.
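
    For reference, the idealized carbonation reactions for olivine, serpentine, and wollastonite mentioned above can be written with their standard stoichiometry (these balanced equations are textbook chemistry, not quoted verbatim from the report):

        % Idealized mineral-carbonation reactions (standard stoichiometry)
        \begin{align*}
        \mathrm{Mg_2SiO_4} + 2\,\mathrm{CO_2} &\rightarrow 2\,\mathrm{MgCO_3} + \mathrm{SiO_2} \\
        \mathrm{Mg_3Si_2O_5(OH)_4} + 3\,\mathrm{CO_2} &\rightarrow 3\,\mathrm{MgCO_3} + 2\,\mathrm{SiO_2} + 2\,\mathrm{H_2O} \\
        \mathrm{CaSiO_3} + \mathrm{CO_2} &\rightarrow \mathrm{CaCO_3} + \mathrm{SiO_2}
        \end{align*}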

  14. Laboratory automation in clinical bacteriology: what system to choose?

    Science.gov (United States)

    Croxatto, A; Prod'hom, G; Faverjon, F; Rochais, Y; Greub, G

    2016-03-01

    Automation was introduced many years ago in several diagnostic disciplines such as chemistry, haematology and molecular biology. The first laboratory automation system for clinical bacteriology was released in 2006, and it rapidly proved its value by increasing productivity, allowing a continuous increase in sample volumes despite limited budgets and personnel shortages. Today, two major manufacturers, BD Kiestra and Copan, are commercializing partial or complete laboratory automation systems for bacteriology. The laboratory automation systems are rapidly evolving to provide improved hardware and software solutions to optimize laboratory efficiency. However, the complex parameters of the laboratory and automation systems must be considered to determine the best system for each given laboratory. We address several topics on laboratory automation that may help clinical bacteriologists to understand the particularities and operative modalities of the different systems. We present (a) a comparison of the engineering and technical features of the various elements composing the two different automated systems currently available, (b) the system workflows of partial and complete laboratory automation, which define the basis for laboratory reorganization required to optimize system efficiency, (c) the concept of digital imaging and telebacteriology, (d) the connectivity of laboratory automation to the laboratory information system, (e) the general advantages and disadvantages as well as the expected impacts provided by laboratory automation and (f) the laboratory data required to conduct a workflow assessment to determine the best configuration of an automated system for the laboratory activities and specificities. Copyright © 2016 The Authors. Published by Elsevier Ltd.. All rights reserved.

  15. Facilitating Automation Development in Internal Logistics Systems

    OpenAIRE

    Granlund, Anna

    2014-01-01

    The internal logistics system includes all activities connected with managing the flow of materials within the physical limits of a facility. This system is an important part of operations in need of increased focus and continuous improvements. Automation is one possible tool with a previously confirmed great potential to improve internal logistics. Despite this great potential and a growing trend of using automation in the area, internal logistics activities are still not automated to the sa...

  16. Automation of a Versatile Crane (the LSMS) for Lunar Outpost Construction, Maintenance and Inspection

    Science.gov (United States)

    Doggett, William R.; Roithmayr, Carlos M.; Dorsey, John T.; Jones, Thomas C.; Shen, Haijun; Seywald, Hans; King, Bruce D.; Mikulas, Martin M., Jr.

    2009-01-01

    Devices for lifting, translating and precisely placing payloads are critical for efficient Earth-based construction operations. Both recent and past studies have demonstrated that devices with similar functionality will be needed to support lunar outpost operations. Although several designs have been developed for Earth-based applications, these devices lack unique design characteristics necessary for transport to and use on the harsh lunar surface. These design characteristics include: a) lightweight components, b) compact packaging for launch, c) automated deployment, d) simple in-field reconfiguration and repair, and e) support for tele-operated or automated operations. Also, because the cost to transport mass to the lunar surface is very high, the number of devices that can be dedicated to surface operations will be limited. Thus, in contrast to Earth-based construction, where many single-purpose devices dominate a construction site, a lunar outpost will require a limited number of versatile devices that provide operational benefit from initial construction through sustained operations. The first-generation test-bed of a new high-performance device, the Lunar Surface Manipulation System (LSMS), has been designed, built and field tested. The LSMS has many unique features resulting in a mass-efficient solution to payload handling on the lunar surface. Typically, the LSMS device mass is estimated at approximately 3% of the mass of the heaviest payload lifted at the tip, or 1.8% of the mass of the heaviest payload lifted at the elbow or mid-span of the boom for a high-performance variant incorporating advanced structural components. Initial operational capabilities of the LSMS were successfully demonstrated during field tests at Moses Lake, Washington, using a tele-operated approach. Joint angle sensors have been developed for the LSMS to improve operator situational awareness. These same sensors provide the necessary information to support fully automated operations.

  17. An overview of in situ waste treatment technologies

    International Nuclear Information System (INIS)

    Walker, S.; Hyde, R.A.; Piper, R.B.; Roy, M.W.

    1992-01-01

    In situ technologies are becoming an attractive remedial alternative for eliminating environmental problems. In situ treatments typically reduce risks and costs associated with retrieving, packaging, and storing or disposing of waste and are generally preferred over ex situ treatments. Each in situ technology has specific applications, and, in order to provide the most economical and practical solution to a waste problem, these applications must be understood. This paper presents an overview of thirty different in situ remedial technologies for buried wastes or contaminated soil areas. The objective of this paper is to familiarize those involved in waste remediation activities with available and emerging in situ technologies so that they may consider these options in the remediation of hazardous and/or radioactive waste sites. Several types of in situ technologies are discussed, including biological treatments, containment technologies, physical/chemical treatments, solidification/stabilization technologies, and thermal treatments. Each category of in situ technology is briefly examined in this paper. Specific treatments belonging to these categories are also reviewed. Much of the information on in situ treatment technologies in this paper was obtained directly from vendors and universities, and this information has not been verified

  18. Automation in airport security X-ray screening of cabin baggage: Examining benefits and possible implementations of automated explosives detection.

    Science.gov (United States)

    Hättenschwiler, Nicole; Sterchi, Yanik; Mendes, Marcia; Schwaninger, Adrian

    2018-10-01

    Bomb attacks on civil aviation make detecting improvised explosive devices and explosive material in passenger baggage a major concern. In the last few years, explosive detection systems for cabin baggage screening (EDSCB) have become available. Although used by a number of airports, most countries have not yet implemented these systems on a wide scale. We investigated the benefits of EDSCB with two different levels of automation currently being discussed by regulators and airport operators: automation as a diagnostic aid with on-screen alarm resolution by the airport security officer (screener), or EDSCB with an automated decision by the machine. The two experiments reported here tested and compared both scenarios and a condition without automation as baseline. Participants were screeners at two international airports who differed in both years of work experience and familiarity with automation aids. Results showed that experienced screeners were good at detecting improvised explosive devices even without EDSCB; EDSCB increased only their detection of bare explosives. In contrast, for screeners with less experience (shorter tenure), EDSCB with an automated decision provided better human-machine detection performance than on-screen alarm resolution and no automation. This came at the cost of slightly higher false alarm rates on the human-machine system level, which would still be acceptable from an operational point of view. Results indicate that a wide-scale implementation of EDSCB would increase the detection of explosives in passenger bags, and that automated decision, rather than automation as a diagnostic aid with on-screen alarm resolution, should be considered. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.

  19. Asleep at the automated wheel-Sleepiness and fatigue during highly automated driving.

    Science.gov (United States)

    Vogelpohl, Tobias; Kühn, Matthias; Hummel, Thomas; Vollrath, Mark

    2018-03-20

    Due to the lack of active involvement in the driving situation and due to monotonous driving environments, drivers with automation may be prone to become fatigued faster than manual drivers (e.g. Schömig et al., 2015). However, little is known about the progression of fatigue during automated driving and its effects on the ability to take back manual control after a take-over request. In this driving simulator study with N = 60 drivers we used a three-factorial 2 × 2 × 12 mixed design to analyze the progression (12 × 5 min; within subjects) of driver fatigue in drivers with automation compared to manual drivers (between subjects). Driver fatigue was induced as either mainly sleep-related or mainly task-related fatigue (between subjects). Additionally, we investigated the drivers' reactions to a take-over request in a critical driving scenario to gain insights into the ability of fatigued drivers to regain manual control and situation awareness after automated driving. Drivers in the automated driving condition exhibited facial indicators of fatigue after 15 to 35 min of driving. Manual drivers only showed similar indicators of fatigue if they suffered from a lack of sleep and then only after a longer period of driving (approx. 40 min). Several drivers in the automated condition closed their eyes for extended periods of time. In the driving with automation condition, mean automation deactivation times after a take-over request were slower for a certain percentage (about 30%) of the drivers with a lack of sleep (M = 3.2; SD = 2.1 s) compared to the reaction times after a long drive (M = 2.4; SD = 0.9 s). Drivers with automation also took longer than manual drivers to first glance at the speed display after a take-over request and were more likely to stay behind a braking lead vehicle instead of overtaking it. Drivers are unable to stay alert during extended periods of automated driving without non-driving related tasks. Fatigued drivers could

  20. The influence of the roll diameter in flat rolling of superconducting in situ and ex situ MgB2 tape

    DEFF Research Database (Denmark)

    Hancock, Michael Halloway; Bay, Niels

    2007-01-01

    Applying the powder in tube (PIT) method, single-filament MgB2/Fe wire and tape has been manufactured applying both the ex situ and the in situ approach. The influence of the roll diameter in three-step flat rolling on the powder density and critical temperature has been examined using rolls of 70, 150 and 210 mm in each step. The investigation has shown that the in situ powder is more readily compacted than the ex situ powder, with an average increase of relative density after mechanical processing of 37% for in situ powder and 19% for ex situ powder. Statistical analysis showed that the choice of the 70 mm roll in the first and second reductions followed by the 150 mm or 210 mm roll in the last reduction was the optimum strategy for both powder types. AC susceptibility testing showed that for the in situ tapes there was no correlation between the powder density and the critical temperature. For ex situ...

  1. Procedure automation: the effect of automated procedure execution on situation awareness and human performance

    International Nuclear Information System (INIS)

    Andresen, Gisle; Svengren, Haakan; Heimdal, Jan O.; Nilsen, Svein; Hulsund, John-Einar; Bisio, Rossella; Debroise, Xavier

    2004-04-01

    As advised by the procedure workshop convened in Halden in 2000, the Halden Project conducted an experiment on the effect of automation of Computerised Procedure Systems (CPS) on situation awareness and human performance. The expected outcome of the study was to provide input for guidance on CPS design, and to support the Halden Project's ongoing research on human reliability analysis. The experiment was performed in HAMMLAB using the HAMBO BWR simulator and the COPMA-III CPS. Eight crews of operators from Forsmark 3 and Oskarshamn 3 participated. Three research questions were investigated: 1) Does procedure automation create Out-Of-The-Loop (OOTL) performance problems? 2) Does procedure automation affect situation awareness? 3) Does procedure automation affect crew performance? The independent variable, 'procedure configuration', had four levels: paper procedures, manual CPS, automation with breaks, and full automation. The results showed that the operators experienced OOTL problems in full automation, but that situation awareness and crew performance (response time) were not affected. One possible explanation for this is that the operators monitored the automated procedure execution conscientiously, something which may have prevented the OOTL problems from having negative effects on situation awareness and crew performance. In a debriefing session, the operators clearly expressed their dislike for the full automation condition, but that automation with breaks could be suitable for some tasks. The main reason why the operators did not like the full automation was that they did not feel being in control. A qualitative analysis addressing factors contributing to response time delays revealed that OOTL problems did not seem to cause delays, but that some delays could be explained by the operators having problems with the freeze function of the CPS. Also other factors such as teamwork and operator tendencies were of importance. Several design implications were drawn

  2. Automated Mobility Transitions: Governing Processes in the UK

    Directory of Open Access Journals (Sweden)

    Debbie Hopkins

    2018-03-01

    Full Text Available Contemporary systems of mobility are undergoing a transition towards automation. In the UK, this transition is being led by (often new) partnerships between incumbent manufacturers and new entrants, in collaboration with national governments, local/regional councils, and research institutions. This paper first offers a framework for analyzing the governance of the transition, adapting ideas from the Transition Management (TM) perspective, and then applies the framework to ongoing automated vehicle transition dynamics in the UK. The empirical analysis suggests that the UK has adopted a reasonably comprehensive approach to the governing of automated vehicle innovation but that this approach cannot be characterized as sufficiently inclusive, democratic, diverse and open. The lack of inclusivity, democracy, diversity and openness is symptomatic of the post-political character of how the UK's automated mobility transition is being governed. The paper ends with a call for a reconfiguration of the automated vehicle transition in the UK and beyond, so that much more space is created for dissent and for reflexive and comprehensive big picture thinking on (automated) mobility futures.

  3. Virtual Machine in Automation Projects

    OpenAIRE

    Xing, Xiaoyuan

    2010-01-01

    Virtual machines, as an engineering tool, have recently been introduced into automation projects at Tetra Pak Processing System AB. The goal of this paper is to examine how to better utilize virtual machines for automation projects. This paper designs different project scenarios using virtual machines. It analyzes the installability, performance, and stability of virtual machines based on the test results. Technical solutions concerning virtual machines are discussed, such as the conversion with physical...

  4. TORCH Computational Reference Kernels - A Testbed for Computer Science Research

    Energy Technology Data Exchange (ETDEWEB)

    Kaiser, Alex; Williams, Samuel Webb; Madduri, Kamesh; Ibrahim, Khaled; Bailey, David H.; Demmel, James W.; Strohmaier, Erich

    2010-12-02

    For decades, computer scientists have sought guidance on how to evolve architectures, languages, and programming models in order to improve application performance, efficiency, and productivity. Unfortunately, without overarching advice about future directions in these areas, individual guidance is inferred from the existing software/hardware ecosystem, and each discipline often conducts its research independently, assuming all other technologies remain fixed. In today's rapidly evolving world of on-chip parallelism, isolated and iterative improvements to performance may miss superior solutions in the same way gradient descent optimization techniques may get stuck in local minima. To combat this, we present TORCH: A Testbed for Optimization ResearCH. These computational reference kernels define the core problems of interest in scientific computing without mandating a specific language, algorithm, programming model, or implementation. To complement the kernel (problem) definitions, we provide a set of algorithmically-expressed verification tests that can be used to verify that a hardware/software co-designed solution produces an acceptable answer. Finally, to provide some illumination as to how researchers have implemented solutions to these problems in the past, we provide a set of reference implementations in C and MATLAB.
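
    The kernel/verifier split described above can be illustrated with a small sketch. The following Python fragment is illustrative only; the actual TORCH reference implementations are in C and MATLAB, and the function names here are invented. It pairs a candidate implementation of a simple problem, a dense matrix-vector product, with an algorithmically expressed verification test that accepts any implementation whose residual is small, without mandating a particular algorithm or language.

      import numpy as np

      def kernel_matvec(A, x):
          """Candidate 'solution' to the problem definition: dense matrix-vector product."""
          m, n = A.shape
          y = np.zeros(m)
          for i in range(m):              # any algorithm is allowed; only the answer is checked
              for j in range(n):
                  y[i] += A[i, j] * x[j]
          return y

      def verify_matvec(A, x, y, tol=1e-10):
          """Algorithmically expressed verification test: accept any implementation
          whose relative residual against a trusted computation is small."""
          residual = np.linalg.norm(y - A @ x)
          scale = np.linalg.norm(A) * np.linalg.norm(x) + 1e-300
          return residual / scale < tol

      rng = np.random.default_rng(0)
      A = rng.standard_normal((64, 64))
      x = rng.standard_normal(64)
      assert verify_matvec(A, x, kernel_matvec(A, x))
      print("verification passed")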

  5. Unmet needs in automated cytogenetics

    International Nuclear Information System (INIS)

    Bender, M.A.

    1976-01-01

    Though some, at least, of the goals of automation systems for analysis of clinical cytogenetic material seem either at hand, like automatic metaphase finding, or at least likely to be met in the near future, like operator-assisted semi-automatic analysis of banded metaphase spreads, important areas of cytogenetic analysis, most importantly the determination of chromosomal aberration frequencies in populations of cells or in samples of cells from people exposed to environmental mutagens, still await practical methods of automation. Important as the clinical diagnostic applications are, it is apparent that increasing concern over the clastogenic effects of the multitude of potentially clastogenic chemical and physical agents to which human populations are being increasingly exposed, and the resulting emergence of extensive cytogenetic testing protocols, makes the development of automation not only economically feasible but almost mandatory. The nature of the problems involved, and actual or possible approaches to their solution, are discussed.

  6. Recent advances in automated system model extraction (SME)

    International Nuclear Information System (INIS)

    Narayanan, Nithin; Bloomsburgh, John; He Yie; Mao Jianhua; Patil, Mahesh B; Akkaraju, Sandeep

    2006-01-01

    In this paper we present two different techniques for the automated extraction of system models from FEA models. We discuss two algorithms: (i) automated N-DOF SME for electrostatically actuated MEMS and (ii) automated N-DOF SME for MEMS inertial sensors. Case studies are presented for both algorithms.

  7. Digital pathology: DICOM-conform draft, testbed, and first results.

    Science.gov (United States)

    Zwönitzer, Ralf; Kalinski, Thomas; Hofmann, Harald; Roessner, Albert; Bernarding, Johannes

    2007-09-01

    Hospital information systems are state of the art nowadays. Therefore, Digital Pathology, also labelled Virtual Microscopy, has gained increased attention. Triggered by radiology, standardized information models and workflows have been defined worldwide based on DICOM. However, DICOM-conform integration of Digital Pathology into existing clinical information systems imposes new problems requiring specific solutions, concerning the huge amount of data as well as the special structure of the data to be managed, transferred, and stored. We implemented a testbed to realize and evaluate the workflow of digitized slides from acquisition to archiving. The experiences led to the draft of a DICOM-conform information model that accounts for the extensions, definitions, and technical requirements necessary to integrate digital pathology in a hospital-wide DICOM environment. Slides were digitized, compressed, and could be viewed remotely. Real-time transfer of the huge amount of data was optimized using streaming techniques. Compared with a recent discussion in the DICOM Working Group for Digital Pathology (WG26), our experiences led to a preference for JPEG2000/JPIP-based streaming of the whole slide image. The results showed that digital pathology is feasible, but strong efforts by users and vendors are still necessary to integrate Digital Pathology into existing information systems.
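
    A whole-slide image is far too large to transfer as a single object, which is why the record above favours JPEG2000/JPIP-style streaming. The sketch below makes some simplifying assumptions (a dyadic resolution pyramid and a fixed tile size) and is not the DICOM WG26 information model or any JPIP API; it only shows the kind of region-to-tile addressing a viewer performs so that just the tiles covering the requested viewport need to be streamed.

      from math import floor

      def tiles_for_region(x, y, w, h, level, tile=512):
          """Return (level, col, row) indices of the tiles covering a viewport given in
          level-0 (full-resolution) pixel units, assuming dyadic downsampling per level."""
          scale = 2 ** level
          col0, row0 = floor(x / scale / tile), floor(y / scale / tile)
          col1 = floor((x + w - 1) / scale / tile)
          row1 = floor((y + h - 1) / scale / tile)
          return [(level, c, r) for r in range(row0, row1 + 1) for c in range(col0, col1 + 1)]

      # A 2000 x 1500 px viewport at pyramid level 2 touches only a couple of 512 px tiles,
      # so only those need to be streamed to the remote viewer.
      print(tiles_for_region(100_000, 80_000, 2000, 1500, level=2))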

  8. Conceptual Design for a Dual-Bell Rocket Nozzle System Using a NASA F-15 Airplane as the Flight Testbed

    Science.gov (United States)

    Jones, Daniel S.; Ruf, Joseph H.; Bui, Trong T.; Martinez, Martel; St. John, Clinton W.

    2014-01-01

    The dual-bell rocket nozzle was first proposed in 1949, offering a potential improvement in rocket nozzle performance over the conventional-bell nozzle. Despite the performance advantages that have been predicted, both analytically and through static test data, the dual-bell nozzle has still not been adequately tested in a relevant flight environment. In 2013 a proposal was constructed that offered a National Aeronautics and Space Administration (NASA) F-15 airplane as the flight testbed, with the plan to operate a dual-bell rocket nozzle during captive-carried flight. If implemented, this capability will permit nozzle operation into an external flow field similar to that of a launch vehicle, and facilitate an improved understanding of dual-bell nozzle plume sensitivity to external flow-field effects. More importantly, this flight testbed can be utilized to help quantify the performance benefit with the dual-bell nozzle, as well as to advance its technology readiness level. Toward this ultimate goal, this report provides plans for future flights to quantify the external flow field of the airplane near the nozzle experiment, as well as details on the conceptual design for the dual-bell nozzle cold-flow propellant feed system integration within the NASA F-15 Propulsion Flight Test Fixture. The current study shows that this concept of flight research is feasible, and could result in valuable flight data for the dual-bell nozzle.

  9. Wind power integration in island-based smart grid projects : A comparative study between Jeju Smart Grid Test-bed and Smart Grid Gotland

    OpenAIRE

    Piehl, Hampus

    2014-01-01

    Smart grids seem to be the solution for using energy from renewable and intermittent sources in an efficient manner. There are many research projects around the world, and two of them are the Jeju Smart Grid Test-bed and Smart Grid Gotland. They have in common that they are both island-based projects connected to the power grid on the mainland by an HVDC link. The purpose of this thesis is to compare the two projects and find out what challenges and strategies they have related to wind power i...

  10. Development of small-bore, high-current-density railgun as testbed for study of plasma-materials interaction. Progress report for October 16, 2000 - May 13, 2003

    International Nuclear Information System (INIS)

    Kyekyoon, Kim-Kevin

    2003-01-01

    The present document is a final technical report summarizing the progress made during 10/16/2000 - 05/13/2003 toward the development of a small-bore railgun with transaugmentation as a testbed for investigating plasma-materials interaction

  11. Automation and Integration in Semiconductor Manufacturing

    OpenAIRE

    Liao, Da-Yin

    2010-01-01

    Semiconductor automation originates from the prevention and avoidance of frauds in daily fab operations. As semiconductor technology and business continuously advance and grow, manufacturing systems must aggressively evolve to meet the changing technical and business requirements in this industry. Semiconductor manufacturing has been suffering pains from islands of automation. The problems associated with these systems are limited

  12. Automation in biological crystallization.

    Science.gov (United States)

    Stewart, Patrick Shaw; Mueller-Dieckmann, Jochen

    2014-06-01

    Crystallization remains the bottleneck in the crystallographic process leading from a gene to a three-dimensional model of the encoded protein or RNA. Automation of the individual steps of a crystallization experiment, from the preparation of crystallization cocktails for initial or optimization screens to the imaging of the experiments, has been the response to address this issue. Today, large high-throughput crystallization facilities, many of them open to the general user community, are capable of setting up thousands of crystallization trials per day. It is thus possible to test multiple constructs of each target for their ability to form crystals on a production-line basis. This has improved success rates and made crystallization much more convenient. High-throughput crystallization, however, cannot relieve users of the task of producing samples of high quality. Moreover, the time gained from eliminating manual preparations must now be invested in the careful evaluation of the increased number of experiments. The latter requires a sophisticated data and laboratory information-management system. A review of the current state of automation at the individual steps of crystallization with specific attention to the automation of optimization is given.

  13. Rapid access to compound libraries through flow technology: fully automated synthesis of a 3-aminoindolizine library via orthogonal diversification.

    Science.gov (United States)

    Lange, Paul P; James, Keith

    2012-10-08

    A novel methodology for the synthesis of druglike heterocycle libraries has been developed through the use of flow reactor technology. The strategy employs orthogonal modification of a heterocyclic core, which is generated in situ, and was used to construct both a 25-membered library of druglike 3-aminoindolizines, and selected examples of a 100-member virtual library. This general protocol allows a broad range of acylation, alkylation and sulfonamidation reactions to be performed in conjunction with a tandem Sonogashira coupling/cycloisomerization sequence. All three synthetic steps were conducted under full automation in the flow reactor, with no handling or isolation of intermediates, to afford the desired products in good yields. This fully automated, multistep flow approach opens the way to highly efficient generation of druglike heterocyclic systems as part of a lead discovery strategy or within a lead optimization program.

  14. Logistic control in automated transportation networks

    NARCIS (Netherlands)

    Ebben, Mark

    2001-01-01

    Increasing congestion problems lead to a search for alternative transportation systems. Automated transportation networks, possibly underground, are an option. Logistic control systems are essential for future implementations of such automated transportation networks. This book contributes to the

  15. Realtime Automation Networks in moVing industrial Environments

    Directory of Open Access Journals (Sweden)

    Rafael Leidinger

    2012-04-01

    Full Text Available Radio-based wireless data communication has made new technical solutions possible in many fields of automation technology (AT). For about ten years, a constant, disproportionate growth of wireless technologies has been observed in automation technology. However, it has become apparent that conventional office-automation technologies are unsuitable and/or not manageable for the AT in particular. The employment of mobile services in industrial automation technology has the potential for significant cost and time savings. This leads to increased productivity in various fields of the AT, for example in factory and process automation or in production logistics. In this paper, technologies and solutions for an automation-suited supply of mobile wireless services are introduced under the criteria of real-time suitability, IT security and service orientation. Emphasis is put on the investigation and development of wireless convergence layers for different radio technologies, on the central provision of support services for an easy-to-use, central, backup-enabled management of combined wired/wireless networks, and on the study of integrability into a Profinet real-time Ethernet network [1].

  16. Automated fault-management in a simulated spaceflight micro-world

    Science.gov (United States)

    Lorenz, Bernd; Di Nocera, Francesco; Rottger, Stefan; Parasuraman, Raja

    2002-01-01

    BACKGROUND: As human spaceflight missions extend in duration and distance from Earth, a self-sufficient crew will bear far greater onboard responsibility and authority for mission success. This will increase the need for automated fault management (FM). Human factors issues in the use of such systems include maintenance of cognitive skill, situational awareness (SA), trust in automation, and workload. This study examined the human performance consequences of operator use of intelligent FM support in interaction with an autonomous, space-related, atmospheric control system. METHODS: An expert system representing a model-based reasoning agent supported operators at a low level of automation (LOA) through a computerized fault-finding guide, at a medium LOA through an automated diagnosis and recovery advisory, and at a high LOA through automated diagnosis and recovery implementation, subject to operator approval or veto. Ten percent of the experimental trials involved complete failure of FM support. RESULTS: Benefits of automation were reflected in more accurate diagnoses, shorter fault identification times, and reduced subjective operator workload. Unexpectedly, fault identification times deteriorated more at the medium than at the high LOA during automation failure. Analyses of information sampling behavior showed that offloading operators from recovery implementation during reliable automation enabled operators at the high LOA to engage in fault assessment activities. CONCLUSIONS: The potential threat to SA imposed by high-level automation, in which decision advisories are automatically generated, need not inevitably be counteracted by choosing a lower LOA. Instead, freeing operator cognitive resources through automatic implementation of recovery plans at a higher LOA can promote better fault comprehension, so long as the automation interface is designed to support efficient information sampling.
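
    The three levels of automation compared in the study can be summarised as a simple gating policy: a fault-finding guide only, an automated diagnosis and recovery advisory, or automatic recovery implementation subject to operator veto. The Python sketch below is purely illustrative; the class and function names are hypothetical and do not represent the experiment's software. It only shows how such an LOA policy is often structured.

      from dataclasses import dataclass

      @dataclass
      class Fault:
          subsystem: str
          symptom: str

      def diagnose(fault):
          """Stand-in for the model-based reasoning agent."""
          return f"isolate and restart {fault.subsystem}"

      def handle_fault(fault, loa, operator_approves):
          if loa == "low":        # computerized fault-finding guide only
              return {"advice": "consult fault-finding guide", "action": None}
          recovery = diagnose(fault)
          if loa == "medium":     # automated diagnosis and recovery advisory
              return {"advice": recovery, "action": None}
          if loa == "high":       # automatic implementation, subject to operator approval/veto
              approved = operator_approves(recovery)
              return {"advice": recovery, "action": recovery if approved else "vetoed"}
          raise ValueError(f"unknown level of automation: {loa}")

      print(handle_fault(Fault("O2 generator", "low flow"), "high", lambda plan: True))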

  17. 78 FR 53466 - Modification of Two National Customs Automation Program (NCAP) Tests Concerning Automated...

    Science.gov (United States)

    2013-08-29

    ... Customs Automation Program (NCAP) Tests Concerning Automated Commercial Environment (ACE) Document Image... National Customs Automation Program (NCAP) tests concerning document imaging, known as the Document Image... the National Customs Automation Program (NCAP) tests concerning document imaging, known as the...

  18. Triplex in-situ hybridization

    Science.gov (United States)

    Fresco, Jacques R.; Johnson, Marion D.

    2002-01-01

    Disclosed are methods for detecting in situ the presence of a target sequence in a substantially double-stranded nucleic acid segment, which comprise: a) contacting in situ, under conditions suitable for hybridization, a substantially double-stranded nucleic acid segment with a detectable third strand, said third strand being capable of hybridizing to at least a portion of the target sequence to form a triple-stranded structure, if said target sequence is present; and b) detecting whether hybridization between the third strand and the target sequence has occurred.

  19. Unintended and in situ amorphisation of pharmaceuticals.

    Science.gov (United States)

    Priemel, P A; Grohganz, H; Rades, T

    2016-05-01

    Amorphisation of poorly water-soluble drugs is one approach that can be applied to improve their solubility and thus their bioavailability. Amorphisation is a process that usually requires deliberate external energy input. However, amorphisation can happen both unintentionally, as in process-induced amorphisation during manufacturing, or in situ during dissolution, vaporisation, or lipolysis. The systems in which unintended and in situ amorphisation has been observed normally contain a drug and a carrier. Common carriers include polymers and mesoporous silica particles. However, the precise mechanisms by which in situ amorphisation occurs are often not fully understood. In situ amorphisation can be exploited and performed before administration of the drug or possibly even within the gastrointestinal tract, as can be inferred from in situ amorphisation observed during in vitro lipolysis. The use of in situ amorphisation can thus confer the advantages of the amorphous form, such as higher apparent solubility and faster dissolution rate, without the disadvantage of its physical instability. Copyright © 2015 Elsevier B.V. All rights reserved.

  20. Welding process automation in power machine building

    International Nuclear Information System (INIS)

    Mel'bard, S.N.; Shakhnov, A.F.; Shergov, I.V.

    1977-01-01

    The level of automation of welding operations in power engineering and ways of enhancing it are highlighted. Examples of complex automation include an apparatus for the horizontal welding of turbine rotors, a remotely controlled automatic machine for welding the ring joints of large-sized vessels, and equipment for the electron-beam welding of steam turbine assemblies of alloyed steels. The prospects of industrial robots are noted. The importance of the complex automation of the technological process, including stocking, assembly, transportation and auxiliary operations, is emphasized

  1. Application of magnetic sensors in automation control

    Energy Technology Data Exchange (ETDEWEB)

    Hou Chunhong [AMETEK Inc., Paoli, PA 19301 (United States); Qian Zhenghong, E-mail: zqian@hdu.edu.cn [Center For Integrated Spintronic Devices (CISD), Hangzhou Dianzi University, Hangzhou, ZJ 310018 (China)

    2011-01-01

    Controls in automation need speed and position feedback. The feedback device is often referred to as an encoder. Feedback technologies include mechanical, optical, and magnetic approaches, all of which advance with new inventions and discoveries. Magnetic sensing as a feedback technology offers certain advantages over other technologies such as optical sensing. With discoveries like GMR (Giant Magneto-Resistance) and TMR (Tunneling Magneto-Resistance) becoming feasible for commercialization, more and more applications will use advanced magnetic sensors in automation. This paper offers a general review of encoders and of the applications of magnetic sensors in automation control.
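
    A common way magnetic (or optical) encoders turn sensor signals into the speed and position feedback mentioned above is incremental quadrature encoding: two channels 90 degrees out of phase produce a four-state cycle whose transition direction gives the sign of motion. The following minimal sketch is illustrative and not tied to any particular sensor product; it decodes sampled A/B levels into a signed position count.

      # Gray-code state is (A << 1) | B; the table lists legal forward (+1) and reverse (-1) steps.
      STEP = {(0, 1): +1, (1, 3): +1, (3, 2): +1, (2, 0): +1,
              (1, 0): -1, (3, 1): -1, (2, 3): -1, (0, 2): -1}

      def decode(samples):
          """samples: iterable of (A, B) logic levels sampled over time; returns a signed count."""
          position, prev = 0, None
          for a, b in samples:
              state = (a << 1) | b
              if prev is not None and state != prev:
                  position += STEP.get((prev, state), 0)   # ignore illegal double transitions
              prev = state
          return position

      # One full forward electrical cycle gives +4 counts.
      print(decode([(0, 0), (0, 1), (1, 1), (1, 0), (0, 0)]))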

  2. Design and construction of a testbed for the application of real volcanic ash from the Eyjafjallajökull and Grimsvötn eruptions to microgas turbines

    Science.gov (United States)

    Weber, Konradin; Fischer, Christian; Lange, Martin; Schulz, Uwe; Naraparaju, Ravisankar; Kramer, Dietmar

    2017-04-01

    It is well known that volcanic ash clouds emitted by erupting volcanoes pose a considerable threat to aviation. The volcanic ash particles can damage the turbine blades and their thermal barrier coatings as well as the bearings of the turbine. For a detailed investigation of this damaging effect, a testbed was designed and constructed that allowed the damaging effects of real volcanic ash on a microgas turbine, specially modified for these investigations, to be studied. Using this microgas turbine has the advantage of delivering near-reality conditions, burning kerosene and operating at temperatures similar to those of large turbines, but at very low cost. The testbed consisted of a disperser for the real volcanic ash and all the equipment needed to control the microgas turbine. Moreover, the concentration and size distribution of the volcanic ash were measured online in front of and behind the microgas turbine by optical particle counters (OPCs). The particle concentration and size distribution of the volcanic ash in the intake, in front of the microgas turbine, were measured by an OPC combined with an isokinetic inlet. In the exhaust gas behind the microgas turbine, in addition to the measurement with a second OPC, ash particles were collected with an impactor to enable later electron-microscope analysis of their morphology and to verify possible melting of the ash particles. This testbed is of high importance as it allows detailed investigation of the impact of volcanic ash on jet turbines and of appropriate countermeasures.
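
    With OPC measurements available both in the intake and in the exhaust, a simple quantity of interest is the size-resolved fraction of ash that passes through (or is retained inside) the turbine. The sketch below uses hypothetical concentration values purely to illustrate this calculation; it is not data from the testbed.

      # Hypothetical OPC number concentrations (particles/cm^3) per size bin.
      intake_cn  = {"0.3-1 um": 1200.0, "1-2.5 um": 450.0, "2.5-10 um": 80.0}
      exhaust_cn = {"0.3-1 um": 1100.0, "1-2.5 um": 300.0, "2.5-10 um": 25.0}

      for size_bin, c_in in intake_cn.items():
          penetration = exhaust_cn[size_bin] / c_in    # fraction of particles leaving the turbine
          retention = 1.0 - penetration                # fraction deposited/retained inside
          print(f"{size_bin}: penetration {penetration:.2f}, retained {retention:.2f}")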

  3. Laboratory Automation and Middleware.

    Science.gov (United States)

    Riben, Michael

    2015-06-01

    The practice of surgical pathology is under constant pressure to deliver the highest quality of service, reduce errors, increase throughput, and decrease turnaround time while at the same time dealing with an aging workforce, increasing financial constraints, and economic uncertainty. Although total laboratory automation has not been achievable, great progress continues to be made in workstation automation in all areas of the pathology laboratory. This report highlights the benefits and challenges of pathology automation, reviews middleware and its use to facilitate automation, and reviews the progress so far in the anatomic pathology laboratory. Copyright © 2015 Elsevier Inc. All rights reserved.

  4. 77 FR 48527 - National Customs Automation Program (NCAP) Test Concerning Automated Commercial Environment (ACE...

    Science.gov (United States)

    2012-08-14

    ... National Customs Automation Program (NCAP) test concerning the simplified entry functionality in the... DEPARTMENT OF HOMELAND SECURITY U.S. Customs and Border Protection National Customs Automation Program (NCAP) Test Concerning Automated Commercial Environment (ACE) Simplified Entry: Modification of...

  5. World-wide distribution automation systems

    International Nuclear Information System (INIS)

    Devaney, T.M.

    1994-01-01

    A worldwide power distribution automation system is outlined. Distribution automation is defined and the status of utility automation is discussed. Other topics discussed include a distribution management system; substation, feeder, and customer functions; potential benefits; automation costs; planning and engineering considerations; automation trends; databases; system operation; computer modeling of the system; and distribution management systems

  6. WIDAFELS flexible automation systems

    International Nuclear Information System (INIS)

    Shende, P.S.; Chander, K.P.; Ramadas, P.

    1990-01-01

    After a discussion of the various aspects of automation, some typical examples of various levels of automation are given. One of the examples is an automated production line for ceramic fuel pellets. (M.G.B.)

  7. A comparison of damage profiling of automated tap testers on aircraft CFRP panel

    Science.gov (United States)

    Mohd Aris, K. D.; Shariff, M. F.; Abd Latif, B. R.; Mohd Haris, M. Y.; Baidzawi, I. J.

    2017-12-01

    The use of composite materials is becoming more and more prominent. The combination of reinforcing fibers and matrices produces the desired strength orientation and tailorability, not to mention complex shapes that are hard to form in metallic structures. The weight percentage of composite materials used in aerospace, civil, marine and other applications has increased tremendously. Since composites are stacked in layers, delamination and/or disbond defects are highly likely in both monolithic and sandwich structures. The tap test is the cheapest form of nondestructive testing for identifying the presence of such damage. However, its inconsistency over wide coverage areas can reduce its effectiveness, since it is carried out manually. The indigenous automated tap tester known as KETOK was used to detect damage due to trapped voids and air pockets. Detection is achieved by automatically controlling the tapping on the surface at a constant rate. A manual tap tester, the RD-3 from Wichitech Industries Inc., was used as a reference. The acquired data were translated into damage profiles and the results were compared. The results show that the indigenous automated tester profiles the damage better than the existing tap tester. In conclusion, the indigenous automated tap tester has potential as an in-situ damage-detection tool for detecting delamination and disbond damage on composite panels. However, more conclusive tests need to be done in order to make the unit available to conventional users.
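
    An automated tap tester of this kind typically infers local stiffness from the contact (impact) duration of each tap: over a delamination or disbond the laminate is locally more compliant and the contact time lengthens. The following sketch does not reproduce the KETOK processing; the threshold and readings are invented for illustration. It turns a grid of tap durations into a simple damage map by comparing each point against a pristine baseline.

      import numpy as np

      def damage_map(impact_us, baseline_us, threshold=1.15):
          """Flag grid points whose tap contact duration exceeds the pristine baseline
          by more than the given ratio (longer contact = locally softer laminate)."""
          return np.asarray(impact_us) / baseline_us > threshold

      # Hypothetical contact durations in microseconds, one value per tap on a scan grid.
      readings_us = [[410, 412, 415, 480],
                     [409, 500, 520, 470],
                     [411, 495, 415, 413]]
      print(damage_map(readings_us, baseline_us=410.0).astype(int))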

  8. Continuous Calibration of Trust in Automated Systems

    Science.gov (United States)

    2014-01-01

    ...1997). Misuse and disuse can have fatal consequences; for example, inappropriate automation reliance has been implicated in the recent crash of Asiana Airlines Flight 214 in San Francisco. Therefore, understanding how users form, lose, and recover trust in imperfect automation is of critical...

  9. 76 FR 34246 - Automated Commercial Environment (ACE); Announcement of National Customs Automation Program Test...

    Science.gov (United States)

    2011-06-13

    ... Environment (ACE); Announcement of National Customs Automation Program Test of Automated Procedures for In... Customs Automation Program (NCAP) test relating to highway movements of commercial goods that are transported in-bond through the United States from one point in Canada to another point in Canada. The NCAP...

  10. Study of the Integration of LIDAR and Photogrammetric Datasets by in Situ Camera Calibration and Integrated Sensor Orientation

    Science.gov (United States)

    Mitishita, E.; Costa, F.; Martins, M.

    2017-05-01

    Photogrammetric and Lidar datasets should be in the same mapping or geodetic frame to be used simultaneously in an engineering project. Nowadays, direct sensor orientation is a common procedure in simultaneous photogrammetric and Lidar surveys. Although direct sensor orientation provides a high degree of automation thanks to GNSS/INS technologies, the accuracies of the results obtained from photogrammetric and Lidar surveys depend on the quality of a group of parameters that accurately models the conditions of the system at the moment the job is performed. This paper presents a study performed to verify the importance of in situ camera calibration and Integrated Sensor Orientation without control points for increasing the accuracy of the integration of photogrammetric and Lidar datasets. The horizontal and vertical accuracies of the integration of photogrammetric and Lidar datasets by the photogrammetric procedure improved significantly when the Integrated Sensor Orientation (ISO) approach was performed using Interior Orientation Parameter (IOP) values estimated from the in situ camera calibration. The horizontal and vertical accuracies, estimated by the Root Mean Square Error (RMSE) of the 3D discrepancies from the Lidar check points, improved by around 37% and 198%, respectively.
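
    The accuracy figures quoted above are Root Mean Square Errors of the 3D discrepancies between photogrammetrically derived coordinates and Lidar check points. As a minimal illustration of that metric (the discrepancy values below are hypothetical, not the study's data), the horizontal component can be taken as the RMSE of the planimetric distances and the vertical component as the RMSE of the height differences.

      import math

      def rmse(values):
          return math.sqrt(sum(v * v for v in values) / len(values))

      # (dE, dN, dH) discrepancies in metres at each check point: photogrammetric minus Lidar.
      discrepancies = [(0.04, -0.03, 0.09), (-0.05, 0.02, -0.07), (0.03, 0.04, 0.11)]

      rmse_horizontal = rmse([math.hypot(de, dn) for de, dn, _ in discrepancies])
      rmse_vertical = rmse([dh for _, _, dh in discrepancies])
      print(f"RMSE horizontal: {rmse_horizontal:.3f} m, vertical: {rmse_vertical:.3f} m")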

  11. Experimental aerodynamic and acoustic model testing of the Variable Cycle Engine (VCE) testbed coannular exhaust nozzle system: Comprehensive data report

    Science.gov (United States)

    Nelson, D. P.; Morris, P. M.

    1980-01-01

    The detailed component design drawings of the one-sixth-scale model of the variable cycle engine testbed demonstrator exhaust system that was tested are presented. Also provided are the basic acoustic and aerodynamic data acquired during the experimental model tests. The model drawings, an index to the acoustic data, an index to the aerodynamic data, tabulated and graphical acoustic data, and the tabulated aerodynamic data and graphs are discussed.

  12. Open Orchestration Cloud Radio Access Network (OOCRAN) Testbed

    OpenAIRE

    Floriach-Pigem, Marti; Xercavins-Torregrosa, Guillem; Marojevic, Vuk; Gelonch-Bosch, Antoni

    2017-01-01

    The Cloud radio access network (C-RAN) offers a revolutionary approach to cellular network deployment, management and evolution. Advances in software-defined radio (SDR) and networking technology, moreover, enable delivering software-defined everything through the Cloud. Resources will be pooled and dynamically allocated leveraging abstraction, virtualization, and consolidation techniques; processes will be automated using common application programming interfaces; and network functions and s...

  13. Development of design principles for automated systems in transport control.

    Science.gov (United States)

    Balfe, Nora; Wilson, John R; Sharples, Sarah; Clarke, Theresa

    2012-01-01

    This article reports the results of a qualitative study investigating attitudes towards and opinions of an advanced automation system currently used in UK rail signalling. In-depth interviews were held with 10 users, key issues associated with automation were identified and the automation's impact on the signalling task investigated. The interview data highlighted the importance of the signallers' understanding of the automation and their (in)ability to predict its outputs. The interviews also covered the methods used by signallers to interact with and control the automation, and the perceived effects on their workload. The results indicate that despite a generally low level of understanding and ability to predict the actions of the automation system, signallers have developed largely successful coping mechanisms that enable them to use the technology effectively. These findings, along with parallel work identifying desirable attributes of automation from the literature in the area, were used to develop 12 principles of automation which can be used to help design new systems which better facilitate cooperative working. The work reported in this article was completed with the active involvement of operational rail staff who regularly use automated systems in rail signalling. The outcomes are currently being used to inform decisions on the extent and type of automation and user interfaces in future generations of rail control systems.

  14. Automation of radioimmunoassay

    International Nuclear Information System (INIS)

    Yamaguchi, Chisato; Yamada, Hideo; Iio, Masahiro

    1974-01-01

    Automation systems under development for measuring Australian antigen by radioimmunoassay are discussed. Samples were processed as follows: blood serum was dispensed by an automated sampler into test tubes and incubated under controlled time and temperature; the first counting was omitted; labelled antibody was dispensed into the serum after washing; the samples were incubated and then centrifuged; the radioactivity in the precipitate was counted by an auto-well counter; and the measurements were tabulated by an automated typewriter. Not only a well-type counter but also a position counter was studied. (Kanao, N.)

  15. Layered distributed architecture for plant automation

    International Nuclear Information System (INIS)

    Aravamuthan, G.; Verma, Yachika; Ranjan, Jyoti; Chachondia, Alka S.; Ganesh, G.

    2005-01-01

    The development of plant automation systems and associated software remains one of the greatest challenges to the widespread implementation of highly adaptive, re-configurable automation technology. This paper presents a layered distributed architecture for a plant automation system designed to support rapid reconfiguration and redeployment of automation components. The paper first reviews the evolution of automation architectures and their associated environments over the past few decades, and then presents the concept of a layered system architecture and the use of automation components to support the construction of a wide variety of automation systems. It also highlights the role of standards and technology that can be used in the development of automation components. We have attempted to adhere to open standards and technology in the development of automation components at the various layers. The paper also highlights the application of this concept in the development of an Operator Information System (OIS) for the Advanced Heavy Water Reactor (AHWR). (author)
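
    The reconfigurability argument above rests on each layer exposing only a narrow interface to the layer above, so that a component can be replaced or redeployed without disturbing the rest of the stack. The sketch below is a deliberately simplified illustration of that idea; the layer and tag names are invented and do not describe the AHWR Operator Information System.

      class FieldLayer:
          """Bottom layer: access to sensors/actuators (here a stub returning a fixed value)."""
          def read(self, tag):
              return 42.0

      class ControlLayer:
          """Middle layer: control/alarm logic built only on the FieldLayer interface."""
          def __init__(self, field):
              self.field = field
          def alarm_active(self, tag, limit):
              return self.field.read(tag) > limit

      class OperatorInformationLayer:
          """Top layer: presentation built only on the ControlLayer interface."""
          def __init__(self, control):
              self.control = control
          def summary(self, tag, limit):
              return f"{tag}: {'ALARM' if self.control.alarm_active(tag, limit) else 'normal'}"

      # Swapping the FieldLayer (e.g., a simulator for a real fieldbus driver) leaves the
      # upper layers untouched, which is the essence of the reconfigurability claim.
      ois = OperatorInformationLayer(ControlLayer(FieldLayer()))
      print(ois.summary("PT-101", limit=40.0))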

  16. Automated quantification of proliferation with automated hot-spot selection in phosphohistone H3/MART1 dual-stained stage I/II melanoma.

    Science.gov (United States)

    Nielsen, Patricia Switten; Riber-Hansen, Rikke; Schmidt, Henrik; Steiniche, Torben

    2016-04-09

    Staging of melanoma includes quantification of a proliferation index, i.e., presumed melanocytic mitoses are counted manually in hot spots on H&E-stained sections. Yet, its reproducibility and prognostic impact increase with immunohistochemical dual staining for phosphohistone H3 (PHH3) and MART1, which may also enable fully automated quantification by image analysis. To ensure manageable workloads and repeatable measurements in modern pathology, the study aimed to present an automated quantification of proliferation with automated hot-spot selection in PHH3/MART1-stained melanomas. Formalin-fixed, paraffin-embedded tissue from 153 consecutive stage I/II melanoma patients was immunohistochemically dual-stained for PHH3 and MART1. Whole slide images were captured, and the number of PHH3/MART1-positive cells was manually and automatically counted in the global tumor area and in a manually and automatically selected hot spot, i.e., a fixed 1-mm(2) square. Bland-Altman plots and hypothesis tests compared the manual and automated procedures, and the Cox proportional hazards model established their prognostic impact. The mean difference between manual and automated global counts was 2.9 cells/mm(2) (P = 0.0071) and 0.23 cells per hot spot (P = 0.96) for automated counts in manually and automatically selected hot spots. In 77 % of cases, manual and automated hot spots overlapped. Fully manual hot-spot counts yielded the highest prognostic performance, with an adjusted hazard ratio of 5.5 (95 % CI, 1.3-24, P = 0.024), as opposed to 1.3 (95 % CI, 0.61-2.9, P = 0.47) for automated counts with automated hot spots. The automated index and automated hot-spot selection were highly correlated with their manual counterparts, but altogether their prognostic impact was noticeably reduced. Because correct recognition of even a single PHH3/MART1-positive cell is important, extremely high sensitivity and specificity of the algorithm are required for prognostic purposes. Thus, automated
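
    The automated hot spot in the study is a fixed 1-mm(2) square placed by image analysis. One straightforward way to implement such a selection is shown in the sketch below purely for illustration; the scanning step, square size and coordinates are invented and this is not the study's algorithm. It scans candidate square positions on a grid and keeps the one containing the most positive-cell centroids.

      def select_hot_spot(centroids, side=1.0, step=0.25):
          """centroids: (x, y) positions of positive cells, in the same unit as `side` (e.g. mm).
          Returns (count, (x0, y0)): the densest square of edge `side` found on the scan grid."""
          xs = [x for x, _ in centroids]
          ys = [y for _, y in centroids]
          best = (0, (min(xs), min(ys)))
          gx = min(xs)
          while gx <= max(xs):
              gy = min(ys)
              while gy <= max(ys):
                  count = sum(gx <= x < gx + side and gy <= y < gy + side for x, y in centroids)
                  if count > best[0]:
                      best = (count, (gx, gy))
                  gy += step
              gx += step
          return best

      cells = [(0.2, 0.3), (0.25, 0.35), (0.9, 1.4), (0.95, 1.45), (0.97, 1.5), (2.1, 0.2)]
      print(select_hot_spot(cells))   # -> most positive cells found in one 1 x 1 window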

  17. Automation, Performance and International Competition

    DEFF Research Database (Denmark)

    Kromann, Lene; Sørensen, Anders

    This paper presents new evidence on trade‐induced automation in manufacturing firms using unique data combining a retrospective survey that we have assembled with register data for 2005‐2010. In particular, we establish a causal effect whereby firms that have specialized in product types for which Chinese exports to the world market have risen sharply invest more in automated capital compared to firms that have specialized in other product types. We also study the relationship between automation and firm performance and find that firms with high increases in the scale and scope of automation have faster productivity growth than other firms. Moreover, automation improves the efficiency of all stages of the production process by reducing setup time, run time, and inspection time and increasing uptime and quantity produced per worker. The efficiency improvement varies by type of automation.

  18. Mobile home automation-merging mobile value added services and home automation technologies

    OpenAIRE

    Rosendahl, Andreas; Hampe, Felix J.; Botterweck, Goetz

    2007-01-01

    non-peer-reviewed In this paper we study mobile home automation, a field that emerges from an integration of mobile application platforms and home automation technologies. In a conceptual introduction we first illustrate the need for such applications by introducing a two-dimensional conceptual model of mobility. Subsequently we suggest an architecture and discuss different options of how a user might access a mobile home automation service and the controlled devices. As another contrib...

  19. Polluted soils with heavy metals. Stabilization by magnesium oxide. Ex-situ and in-situ testings; Suelos contaminados con metales pesados. Estabilizacion con oxido de magnesio. Ensayos ex situ-in situ

    Energy Technology Data Exchange (ETDEWEB)

    Cenoz, S.; Hernandez, J.; Gangutia, N.

    2004-07-01

    This work describes the use of Low-Grade MgO as a stabilising agent for the reclamation of polluted soils. Low-Grade MgO may be an economically feasible alternative for the stabilisation of heavy metals in heavily contaminated soils. The effectiveness of Low-Grade MgO has been studied in three ex-situ stabilisations of soils heavily polluted by the flue dust of pyrite roasting. LG-MgO provides an alkali reservoir guaranteeing long-term stabilisation without varying the pH conditions. The success of the ex-situ stabilisation was corroborated by the analysis of heavy metals in the leachates collected from the landfill over a long period of time. The study also includes the results obtained in an in-situ pilot-scale stabilisation of contaminated soil. (Author) 17 refs.

  20. In Situ Remediation Integrated Program. In situ physical/chemical treatment technologies for remediation of contaminated sites: Applicability, developing status, and research needs

    International Nuclear Information System (INIS)

    Siegrist, R.L.; Gates, D.D.; West, O.R.; Liang, L.; Donaldson, T.L.; Webb, O.F.; Corder, S.L.; Dickerson, K.S.

    1994-06-01

    The U.S. Department of Energy (DOE) In Situ Remediation Integrated Program (ISR IP) was established in June 1991 to facilitate the development and implementation of in situ remediation technologies for environmental restoration within the DOE complex. Within the ISR IP, four subareas of research have been identified: (1) in situ containment, (2) in situ physical/chemical treatment (ISPCT), (3) in situ bioremediation, and (4) subsurface manipulation/electrokinetics. Although set out as individual focus areas, these four are interrelated, and successful developments in one will often necessitate successful developments in another. In situ remediation technologies are increasingly being sought for environmental restoration due to the potential advantages that in situ technologies can offer as opposed to more traditional ex situ technologies. These advantages include limited site disruption, lower cost, reduced worker exposure, and treatment at depth under structures. While in situ remediation technologies can offer great advantages, many technology gaps exist in their application. This document presents an overview of ISPCT technologies and describes their applicability to DOE-complex needs, their development status, and relevant ongoing research. It also highlights research needs that the ISR IP should consider when making funding decisions