WorldWideScience

Sample records for scdu testbed automated

  1. SCDU Testbed Automated In-Situ Alignment, Data Acquisition and Analysis

    Science.gov (United States)

    Werne, Thomas A.; Wehmeier, Udo J.; Wu, Janet P.; An, Xin; Goullioud, Renaud; Nemati, Bijan; Shao, Michael; Shen, Tsae-Pyng J.; Wang, Xu; Weilert, Mark A.

    2010-01-01

    In the course of fulfilling its mandate, the Spectral Calibration Development Unit (SCDU) testbed for SIM-Lite produces copious amounts of raw data. To effectively spend time attempting to understand the science driving the data, the team devised computerized automations to limit the time spent bringing the testbed to a healthy state and commanding it, and instead focus on analyzing the processed results. We developed a multi-layered scripting language that emphasized the scientific experiments we conducted, which drastically shortened our experiment scripts, improved their readability, and all-but-eliminated testbed operator errors. In addition to scientific experiment functions, we also developed a set of automated alignments that bring the testbed up to a well-aligned state with little more than the push of a button. These scripts were written in the scripting language, and in Matlab via an interface library, allowing all members of the team to augment the existing scripting language with complex analysis scripts. To keep track of these results, we created an easily-parseable state log in which we logged both the state of the testbed and relevant metadata. Finally, we designed a distributed processing system that allowed us to farm lengthy analyses to a collection of client computers which reported their results in a central log. Since these logs were parseable, we wrote query scripts that gave us an effortless way to compare results collected under different conditions. This paper serves as a case-study, detailing the motivating requirements for the decisions we made and explaining the implementation process.
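
    The easily parseable state log and query scripts described above can be illustrated with a small sketch. The snippet below is a minimal, hypothetical example (the JSON-lines format, field names, and query criteria are assumptions for illustration, not the SCDU team's actual implementation) of logging testbed state with metadata and then comparing results collected under different conditions.

      import json
      from datetime import datetime, timezone

      LOG_FILE = "scdu_state_log.jsonl"  # hypothetical file name and record format

      def log_state(state, metadata, results):
          """Append one parseable record of testbed state, metadata, and analysis results."""
          record = {
              "timestamp": datetime.now(timezone.utc).isoformat(),
              "state": state,        # e.g. alignment status, active experiment script
              "metadata": metadata,  # e.g. experiment name, run number, operator
              "results": results,    # processed analysis outputs
          }
          with open(LOG_FILE, "a") as f:
              f.write(json.dumps(record) + "\n")

      def query_log(**criteria):
          """Return all records whose metadata matches the given key/value criteria."""
          with open(LOG_FILE) as f:
              records = [json.loads(line) for line in f]
          return [r for r in records
                  if all(r["metadata"].get(k) == v for k, v in criteria.items())]

      # Compare results collected under two different alignment conditions.
      log_state({"aligned": True}, {"experiment": "spectral_cal", "run": 1}, {"rms_error_nm": 0.8})
      log_state({"aligned": False}, {"experiment": "spectral_cal", "run": 2}, {"rms_error_nm": 3.2})
      for rec in query_log(experiment="spectral_cal"):
          print(rec["metadata"]["run"], rec["state"]["aligned"], rec["results"]["rms_error_nm"])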

  2. Future Autonomous and Automated Systems Testbed

    Data.gov (United States)

    National Aeronautics and Space Administration — Trust is the greatest obstacle to implementing greater autonomy and automation (A&A) in the human spaceflight program. The Future Autonomous and Automated...

  3. Automated tools and techniques for distributed Grid Software Development of the testbed infrastructure

    CERN Document Server

    Aguado Sanchez, C

    2007-01-01

    Grid technology is becoming more and more important as the new paradigm for sharing computational resources across different organizations in a secure way. The great powerfulness of this solution, requires the definition of a generic stack of services and protocols and this is the scope of the different Grid initiatives. As a result of international collaborations for its development, the Open Grid Forum created the Open Grid Services Architecture (OGSA) which aims to define the common set of services that will enable interoperability across the different implementations. This master thesis has been developed in this framework, as part of the two European-funded projects ETICS and OMII-Europe. The main objective is to contribute to the design and maintenance of large distributed development projects with the automated tool that enables to implement Software Engineering techniques oriented to achieve an acceptable level of quality at the release process. Specifically, this thesis develops the testbed concept a...

  4. Implementation strategies for load center automation on the space station module/power management and distribution testbed

    Science.gov (United States)

    Watson, Karen

    1990-01-01

    The Space Station Module/Power Management and Distribution (SSM/PMAD) testbed was developed to study tertiary power management on modules in large spacecraft. The main goal was to study automation techniques, not necessarily to develop flight-ready systems. Because of the confidence gained in many of the automation strategies investigated, it is appropriate to study implementation strategies in more detail in order to find better trade-offs for nearer-to-flight-ready systems. These trade-offs particularly concern the weight, volume, power consumption, and performance of the automation system. These systems, in their present implementation, are described.

  5. An overview of the U.S. Army Research Laboratory's Sensor Information Testbed for Collaborative Research Environment (SITCORE) and Automated Online Data Repository (AODR) capabilities

    Science.gov (United States)

    Ward, Dennis W.; Bennett, Kelly W.

    2017-05-01

    The Sensor Information Testbed COllaborative Research Environment (SITCORE) and the Automated Online Data Repository (AODR) are significant enablers of the U.S. Army Research Laboratory (ARL)'s Open Campus Initiative and together create a highly collaborative research laboratory and testbed environment focused on sensor data and information fusion. SITCORE creates a virtual research and development environment allowing collaboration from other locations, including DoD, industry, academia, and coalition facilities. SITCORE combined with AODR provides end-to-end algorithm development, experimentation, demonstration, and validation. The AODR enterprise allows the U.S. Army Research Laboratory (ARL), as well as other government organizations, industry, and academia, to store and disseminate multiple intelligence (Multi-INT) datasets collected at field exercises and demonstrations, and to facilitate research and development (R&D) and the advancement of analytical tools and algorithms supporting the Intelligence, Surveillance, and Reconnaissance (ISR) community. The AODR provides a potential central repository for standards-compliant datasets to serve as the "go-to" location for lessons learned and reference products. Many of the AODR datasets have associated ground truth and other metadata, which provides a rich and robust data suite for researchers to develop, test, and refine their algorithms. Researchers download the test data to their own environments using a sophisticated web interface. The AODR allows researchers to request copies of stored datasets and for the government to process the requests and approvals in an automated fashion. Access to the AODR requires two-factor authentication in the form of a Common Access Card (CAC) or External Certificate Authority (ECA).

  6. Trace explosives sensor testbed (TESTbed)

    Science.gov (United States)

    Collins, Greg E.; Malito, Michael P.; Tamanaha, Cy R.; Hammond, Mark H.; Giordano, Braden C.; Lubrano, Adam L.; Field, Christopher R.; Rogers, Duane A.; Jeffries, Russell A.; Colton, Richard J.; Rose-Pehrsson, Susan L.

    2017-03-01

    A novel vapor delivery testbed, referred to as the Trace Explosives Sensor Testbed, or TESTbed, is demonstrated that is amenable to both high- and low-volatility explosives vapors including nitromethane, nitroglycerine, ethylene glycol dinitrate, triacetone triperoxide, 2,4,6-trinitrotoluene, pentaerythritol tetranitrate, and hexahydro-1,3,5-trinitro-1,3,5-triazine. The TESTbed incorporates a six-port dual-line manifold system allowing for rapid actuation between a dedicated clean air source and a trace explosives vapor source. Explosives and explosives-related vapors can be sourced through a number of means including gas cylinders, permeation tube ovens, dynamic headspace chambers, and a Pneumatically Modulated Liquid Delivery System coupled to a perfluoroalkoxy total-consumption microflow nebulizer. Key features of the TESTbed include continuous and pulseless control of trace vapor concentrations with wide dynamic range of concentration generation, six sampling ports with reproducible vapor profile outputs, limited low-volatility explosives adsorption to the manifold surface, temperature and humidity control of the vapor stream, and a graphical user interface for system operation and testing protocol implementation.
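
    The delivered trace concentration from a permeation-tube source follows from its gravimetric emission rate and the dilution flow. The sketch below applies that standard relationship; it is illustrative only (the numbers and the ppb conversion at 25 °C are assumptions, not values taken from the TESTbed paper).

      def permeation_vapor_concentration(emission_ng_per_min, dilution_flow_ml_per_min,
                                         molar_mass_g_per_mol):
          """Estimate trace vapor concentration delivered by a permeation source.

          Mass concentration: C [ng/L] = emission rate / dilution flow.
          Mixing ratio (ppb_v) uses the ideal-gas molar volume at 25 C (~24.45 L/mol).
          """
          c_ng_per_liter = emission_ng_per_min / (dilution_flow_ml_per_min / 1000.0)
          ppb_v = c_ng_per_liter * 24.45 / molar_mass_g_per_mol
          return c_ng_per_liter, ppb_v

      # Illustrative numbers only: 100 ng/min of TNT emitted into 1 L/min of clean air.
      c, ppb = permeation_vapor_concentration(100.0, 1000.0, 227.13)
      print(f"{c:.1f} ng/L ~ {ppb:.2f} ppb_v")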

  7. NASA Robotic Neurosurgery Testbed

    Science.gov (United States)

    Mah, Robert

    1997-01-01

    The detection of tissue interface (e.g., normal tissue, cancer, tumor) has been limited clinically to tactile feedback, temperature monitoring, and the use of a miniature ultrasound probe for tissue differentiation during surgical operations. In neurosurgery, the needle used in the standard stereotactic CT or MRI guided brain biopsy provides no information about the tissue being sampled. The tissue sampled depends entirely upon the accuracy with which the localization provided by the preoperative CT or MRI scan is translated to the intracranial biopsy site. In addition, no information about the tissue being traversed by the needle (e.g., a blood vessel) is provided. Hemorrhage due to the biopsy needle tearing a blood vessel within the brain is the most devastating complication of stereotactic CT/MRI guided brain biopsy. A robotic neurosurgery testbed has been developed at NASA Ames Research Center as a spin-off of technologies from space, aeronautics and medical programs. The invention entitled "Robotic Neurosurgery Leading to Multimodality Devices for Tissue Identification" is nearing a state ready for commercialization. The devices will: 1) improve diagnostic accuracy and precision of general surgery, with near term emphasis on stereotactic brain biopsy, 2) automate tissue identification, with near term emphasis on stereotactic brain biopsy, to permit remote control of the procedure, and 3) reduce morbidity for stereotactic brain biopsy. The commercial impact from this work is the potential development of a whole new generation of smart surgical tools to increase the safety, accuracy and efficiency of surgical procedures. Other potential markets include smart surgical tools for tumor ablation in neurosurgery, general exploratory surgery, prostate cancer surgery, and breast cancer surgery.

  8. Virtual Factory Testbed

    Data.gov (United States)

    Federal Laboratory Consortium — The Virtual Factory Testbed (VFT) is comprised of three physical facilities linked by a standalone network (VFNet). The three facilities are the Smart and Wireless...

  9. Wireless Testbed Bonsai

    Science.gov (United States)

    2006-02-01

    wireless sensor device network, and a higher-tier multi-hop peer-to-peer 802.11b wireless network of about 200 Stargate nodes. Leading up to the full ExScal... deployment, we conducted spatial scaling tests of our higher-tier protocols on a 7 × 7 grid of Stargate nodes with 45 m and 90 m separations, respectively... on W and its scaled version W̃. III. EXPERIMENTAL SETUP. Description of the Kansei testbed: a Stargate is a single-board Linux-based computer [7]. It uses a

  10. Holodeck Testbed Project

    Science.gov (United States)

    Arias, Adriel (Inventor)

    2016-01-01

    The main objective of the Holodeck Testbed is to create a cost effective, realistic, and highly immersive environment that can be used to train astronauts, carry out engineering analysis, develop procedures, and support various operations tasks. Currently, the Holodeck Testbed allows users to step into a simulated ISS (International Space Station) and interact with objects, as well as perform Extra Vehicular Activities (EVA) on the surface of the Moon or Mars. The Holodeck Testbed is using the products being developed in the Hybrid Reality Lab (HRL). The HRL is combining technologies related to merging physical models with photo-realistic visuals to create a realistic and highly immersive environment. The lab also investigates technologies and concepts that are needed to allow it to be integrated with other testbeds, such as the gravity offload capability provided by the Active Response Gravity Offload System (ARGOS). My two main duties were to develop and animate models for use in the HRL environments and to work on a new way to interface with computers using Brain Computer Interface (BCI) technology. On my first task, I was able to create precise computer virtual tool models (accurate down to the thousandths or hundredths of an inch). To make these tools even more realistic, I produced animations for them so they would have the same mechanical features as the tools in real life. The computer models were also used to create 3D printed replicas that will be outfitted with tracking sensors. The sensors will allow the 3D printed models to align precisely with the computer models in the physical world and provide people with haptic/tactile feedback while wearing a VR (Virtual Reality) headset and interacting with the tools. Getting close to the end of my internship, the lab bought a professional grade 3D scanner. With this, I was able to replicate more intricate tools at a much more time-effective rate. The second task was to investigate the use of BCI to control

  11. Optical Network Testbeds Workshop

    Energy Technology Data Exchange (ETDEWEB)

    Joe Mambretti

    2007-06-01

    This is the summary report of the third annual Optical Networking Testbed Workshop (ONT3), which brought together leading members of the international advanced research community to address major challenges in creating next generation communication services and technologies. Networking research and development (R&D) communities throughout the world continue to discover new methods and technologies that are enabling breakthroughs in advanced communications. These discoveries are keystones for building the foundation of the future economy, which requires the sophisticated management of extremely large quantities of digital information through high performance communications. This innovation is made possible by basic research and experiments within laboratories and on specialized testbeds. Initial network research and development initiatives are driven by diverse motives, including attempts to solve existing complex problems, the desire to create powerful new technologies that do not exist using traditional methods, and the need to create tools to address specific challenges, including those mandated by large scale science or government agency mission agendas. Many new discoveries related to communications technologies transition to widespread deployment through standards organizations and commercialization. These transition paths allow for new communications capabilities that drive many sectors of the digital economy. In the last few years, networking R&D has increasingly focused on advancing multiple new capabilities enabled by next generation optical networking. Both US Federal networking R&D and other national R&D initiatives, such as those organized by the National Institute of Information and Communications Technology (NICT) of Japan, are creating optical networking technologies that allow for new, powerful communication services. Among the most promising services are those based on new types of multi-service or hybrid networks, which use new optical networking

  12. Environment Emulation For Wsn Testbed

    Directory of Open Access Journals (Sweden)

    Radosław Kapłoniak

    2012-01-01

    Full Text Available The development of applications for wireless sensor networks is a challenging task. For this reason, several testbed platforms have been created. They simplify the manageability of nodes by offering easy ways of programming and debugging sensor nodes. These platforms, sometimes composed of dozens of sensors, provide a convenient way for carrying out research on medium access control and data exchange between nodes. In this article, we propose the extension of the WSN testbed, which could be used for evaluating and testing the functionality of sensor networks applications by emulating a real-world environment.

  13. Advanced Artificial Intelligence Technology Testbed

    Science.gov (United States)

    Anken, Craig S.

    1993-01-01

    The Advanced Artificial Intelligence Technology Testbed (AAITT) is a laboratory testbed for the design, analysis, integration, evaluation, and exercising of large-scale, complex, software systems, composed of both knowledge-based and conventional components. The AAITT assists its users in the following ways: configuring various problem-solving application suites; observing and measuring the behavior of these applications and the interactions between their constituent modules; gathering and analyzing statistics about the occurrence of key events; and flexibly and quickly altering the interaction of modules within the applications for further study.

  14. Implementation of a virtual link between power system testbeds at Marshall Spaceflight Center and Lewis Research Center

    Science.gov (United States)

    Doreswamy, Rajiv

    1990-01-01

    The Marshall Space Flight Center (MSFC) owns and operates a space station module power management and distribution (SSM-PMAD) testbed. This system, managed by expert systems, is used to analyze and develop power system automation techniques for Space Station Freedom. The Lewis Research Center (LeRC), Cleveland, Ohio, has developed and implemented a space station electrical power system (EPS) testbed. This system and its power management controller are representative of the overall Space Station Freedom power system. A virtual link is being implemented between the testbeds at MSFC and LeRC. This link would enable configuration of SSM-PMAD as a load center for the EPS testbed at LeRC. This connection will add to the versatility of both systems, and provide an environment of enhanced realism for operation of both testbeds.

  15. A remote integrated testbed for cooperating objects

    CERN Document Server

    Dios, Jose Ramiro Martinez-de; Bernabe, Alberto de San; Ollero, Anibal

    2013-01-01

    Testbeds are gaining increasing relevance in research domains and also in industrial applications. However, very few books devoted to testbeds have been published; to the best of my knowledge, no book on this topic has been published. This book is particularly interesting for the growing community of testbed developers. I believe the book is also very interesting for researchers in robot-WSN cooperation. This book provides a detailed description of a system that can be considered the first testbed that allows full peer-to-peer interoperability between heterogeneous robots and ubiquitous systems su

  16. Fast Physics Testbed for the FASTER Project

    Energy Technology Data Exchange (ETDEWEB)

    Lin, W.; Liu, Y.; Hogan, R.; Neggers, R.; Jensen, M.; Fridlind, A.; Lin, Y.; Wolf, A.

    2010-03-15

    This poster describes the Fast Physics Testbed for the new FAst-physics System Testbed and Research (FASTER) project. The overall objective is to provide a convenient and comprehensive platform for fast turn-around model evaluation against ARM observations and to facilitate development of parameterizations for cloud-related fast processes represented in global climate models. The testbed features three major components: a single column model (SCM) testbed, an NWP-Testbed, and high-resolution modeling (HRM). The web-based SCM-Testbed features multiple SCMs from major climate modeling centers and aims to maximize the potential of the SCM approach to enhance and accelerate the evaluation and improvement of fast physics parameterizations through continuous evaluation of existing and evolving models against historical as well as new/improved ARM and other complementary measurements. The NWP-Testbed aims to capitalize on the large pool of operational numerical weather prediction products. Continuous evaluations of NWP forecasts against observations at ARM sites are carried out to systematically identify the biases and skills of physical parameterizations under all weather conditions. The high-resolution modeling (HRM) activities aim to simulate the fast processes at high resolution to aid in the understanding of the fast processes and their parameterizations. A four-tier HRM framework is established to augment the SCM- and NWP-Testbeds towards eventual improvement of the parameterizations.
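
    Evaluation of SCM and NWP output against ARM observations ultimately reduces to summary statistics such as bias, RMSE, and correlation. The sketch below is a minimal illustration of that step; the chosen variable (liquid water path) and the numbers are assumptions, not FASTER data.

      import numpy as np

      def evaluation_stats(model, obs):
          """Basic model-versus-observation skill statistics."""
          model, obs = np.asarray(model, float), np.asarray(obs, float)
          diff = model - obs
          return {
              "bias": diff.mean(),
              "rmse": np.sqrt((diff ** 2).mean()),
              "corr": np.corrcoef(model, obs)[0, 1],
          }

      # Illustrative example: hourly liquid water path (g/m^2) from an SCM vs. ARM retrievals.
      scm_lwp = [55.0, 80.0, 120.0, 90.0, 60.0]
      arm_lwp = [50.0, 95.0, 110.0, 85.0, 70.0]
      print(evaluation_stats(scm_lwp, arm_lwp))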

  17. INFN Tier-1 Testbed Facility

    International Nuclear Information System (INIS)

    Gregori, Daniele; Cavalli, Alessandro; Dell'Agnello, Luca; Dal Pra, Stefano; Prosperini, Andrea; Ricci, Pierpaolo; Ronchieri, Elisabetta; Sapunenko, Vladimir

    2012-01-01

    INFN-CNAF, located in Bologna, is the Information Technology Center of the National Institute of Nuclear Physics (INFN). In the framework of the Worldwide LHC Computing Grid, INFN-CNAF is one of the eleven worldwide Tier-1 centers that store and reprocess Large Hadron Collider (LHC) data. The Italian Tier-1 provides the storage resources (i.e., disk space for short-term needs and tapes for long-term needs) and computing power that are needed for data processing and analysis by the LHC scientific community. Furthermore, the INFN Tier-1 houses computing resources for other particle physics experiments, like CDF at Fermilab and SuperB at Frascati, as well as for astroparticle and space physics experiments. The computing center is a very complex infrastructure: the hardware layer includes the network, storage and farming areas, while the software layer includes open source and proprietary software. Software updates and the addition of new hardware can unexpectedly degrade the production activity of the center; therefore a testbed facility has been set up in order to reproduce and certify the various layers of the Tier-1. In this article we describe the testbed and the checks performed.

  18. LTE-Advanced/WLAN testbed

    OpenAIRE

    Plaisner, Denis

    2017-01-01

    This thesis deals with examining and evaluating communication under the LTE-Advanced and WiFi (IEEE 802.11n/ac) standards. For each standard, the error vector magnitude (EVM) parameter is examined. A universal workstation (testbed) is designed for working with the individual standards. This universal testbed is used to configure the transmitting and receiving devices and to process and evaluate the transmitted signals. The Matlab environment was chosen for this work; it is used to control the instruments employed, such as...

  19. Wireless Sensor Networks TestBed: ASNTbed

    CSIR Research Space (South Africa)

    Dludla, AG

    2013-05-01

    Full Text Available Wireless sensor networks (WSNs) have been used in different types of applications and deployed within various environments. Simulation tools are essential for studying WSNs, especially for exploring large-scale networks. However, WSN testbeds...

  20. AMS San Diego Testbed - Calibration Data

    Data.gov (United States)

    Department of Transportation — The data in this repository were collected from the San Diego, California testbed, namely, I-15 from the interchange with SR-78 in the north to the interchange with...

  1. University of Florida Advanced Technologies Campus Testbed

    Science.gov (United States)

    2017-09-21

    The University of Florida (UF) and its Transportation Institute (UFTI), the Florida Department of Transportation (FDOT) and the City of Gainesville (CoG) are cooperating to develop a smart transportation testbed on the University of Florida (UF) main...

  2. Versatile Electric Propulsion Aircraft Testbed, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — An all-electric aircraft testbed is proposed to provide a dedicated development environment for the rigorous study and advancement of electrically powered aircraft....

  3. Automatic Integration Testbeds validation on Open Science Grid

    Science.gov (United States)

    Caballero, J.; Thapa, S.; Gardner, R.; Potekhin, M.

    2011-12-01

    A recurring challenge in deploying high quality production middleware is the extent to which realistic testing occurs before release of the software into the production environment. We describe here an automated system for validating releases of the Open Science Grid software stack that leverages the (pilot-based) PanDA job management system developed and used by the ATLAS experiment. The system was motivated by a desire to subject the OSG Integration Testbed to more realistic validation tests, in particular tests that resemble as closely as possible the actual job workflows used by the experiments, thus exercising job scheduling at the compute element (CE), use of the worker node execution environment, transfer of data to/from the local storage element (SE), etc. The context is that candidate releases of OSG compute and storage elements can be tested by injecting large numbers of synthetic jobs varying in complexity and coverage of services tested. The native capabilities of the PanDA system can thus be used to define jobs, monitor their execution, and archive the resulting run statistics including success and failure modes. A repository of generic workflows and job types to measure various metrics of interest has been created. A command-line toolset has been developed so that testbed managers can quickly submit "VO-like" jobs into the system when newly deployed services are ready for testing. A system for automatic submission has been crafted to send jobs to integration testbed sites, collecting the results in a central service and generating regular reports for performance and reliability.
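
    The automatic-submission and reporting loop described above can be sketched in a few lines. The snippet below is a hypothetical stand-in (the job types, site names, and success logic are invented for illustration; the real system defines and tracks jobs through PanDA rather than this placeholder function).

      import random
      from collections import Counter

      # Hypothetical synthetic job types and integration-testbed site names.
      JOB_TYPES = ["ce_scheduling", "wn_environment", "se_transfer_in", "se_transfer_out"]
      TESTBED_SITES = ["ITB_SITE_A", "ITB_SITE_B"]

      def run_synthetic_job(site, job_type):
          """Stand-in for submitting one 'VO-like' job and collecting its outcome."""
          succeeded = random.random() > 0.1  # placeholder for the real execution result
          return {"site": site, "type": job_type, "ok": succeeded}

      def validation_report(n_jobs_per_type=5):
          """Submit a batch of synthetic jobs to each site and summarize success rates."""
          results = [run_synthetic_job(site, jt)
                     for site in TESTBED_SITES
                     for jt in JOB_TYPES
                     for _ in range(n_jobs_per_type)]
          by_site = Counter((r["site"], r["ok"]) for r in results)
          for site in TESTBED_SITES:
              ok, fail = by_site[(site, True)], by_site[(site, False)]
              print(f"{site}: {ok}/{ok + fail} jobs succeeded")
          return results

      validation_report()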

  4. Automatic Integration Testbeds validation on Open Science Grid

    International Nuclear Information System (INIS)

    Caballero, J; Potekhin, M; Thapa, S; Gardner, R

    2011-01-01

    A recurring challenge in deploying high quality production middleware is the extent to which realistic testing occurs before release of the software into the production environment. We describe here an automated system for validating releases of the Open Science Grid software stack that leverages the (pilot-based) PanDA job management system developed and used by the ATLAS experiment. The system was motivated by a desire to subject the OSG Integration Testbed to more realistic validation tests, in particular tests that resemble as closely as possible the actual job workflows used by the experiments, thus exercising job scheduling at the compute element (CE), use of the worker node execution environment, transfer of data to/from the local storage element (SE), etc. The context is that candidate releases of OSG compute and storage elements can be tested by injecting large numbers of synthetic jobs varying in complexity and coverage of services tested. The native capabilities of the PanDA system can thus be used to define jobs, monitor their execution, and archive the resulting run statistics including success and failure modes. A repository of generic workflows and job types to measure various metrics of interest has been created. A command-line toolset has been developed so that testbed managers can quickly submit 'VO-like' jobs into the system when newly deployed services are ready for testing. A system for automatic submission has been crafted to send jobs to integration testbed sites, collecting the results in a central service and generating regular reports for performance and reliability.

  5. A Reconfigurable Testbed Environment for Spacecraft Autonomy

    Science.gov (United States)

    Biesiadecki, Jeffrey; Jain, Abhinandan

    1996-01-01

    A key goal of NASA's New Millennium Program is the development of technology for increased spacecraft on-board autonomy. Achievement of this objective requires the development of a new class of ground-based autonomy testbeds that can enable the low-cost and rapid design, test, and integration of the spacecraft autonomy software. This paper describes the development of an Autonomy Testbed Environment (ATBE) for the NMP Deep Space 1 comet/asteroid rendezvous mission.

  6. Implementation of standard testbeds for numerical relativity

    Energy Technology Data Exchange (ETDEWEB)

    Babiuc, M C [Department of Physics and Physical Science, Marshall University, Huntington, WV 25755 (United States); Husa, S [Friedrich Schiller University Jena, Max-Wien-Platz 1, 07743 Jena (Germany); Alic, D [Department of Physics, University of the Balearic Islands, Cra Valldemossa km 7.5, 07122 Palma de Mallorca (Spain); Hinder, I [Center for Gravitational Wave Physics, Pennsylvania State University, University Park, PA 16802 (United States); Lechner, C [Weierstrass Institute for Applied Analysis and Stochastics (WIAS), Mohrenstrasse 39, 10117 Berlin (Germany); Schnetter, E [Center for Computation and Technology, 216 Johnston Hall, Louisiana State University, Baton Rouge, LA 70803 (United States); Szilagyi, B; Dorband, N; Pollney, D; Winicour, J [Max-Planck-Institut fuer Gravitationsphysik (Albert-Einstein-Institut), Am Muehlenberg 1, 14076 Golm (Germany); Zlochower, Y [Center for Computational Relativity and Gravitation, School of Mathematical Sciences, Rochester Institute of Technology, 78 Lomb Memorial Drive, Rochester, New York 14623 (United States)

    2008-06-21

    We discuss results that have been obtained from the implementation of the initial round of testbeds for numerical relativity which was proposed in the first paper of the Apples with Apples Alliance. We present benchmark results for various codes which provide templates for analyzing the testbeds and for drawing conclusions about various features of the codes. This allows us to sharpen the initial test specifications, design a new test and add theoretical insight.

  7. A demonstration of remote survey and characterization of a buried waste site using the SRIP [Soldier Robot Interface Project] testbed

    International Nuclear Information System (INIS)

    Burks, B.L.; Richardson, B.S.; Armstrong, G.A.; Hamel, W.R.; Jansen, J.F.; Killough, S.M.; Thompson, D.H.; Emery, M.S.

    1990-01-01

    During FY 1990, the Oak Ridge National Laboratory (ORNL) supported the Department of Energy (DOE) Environmental Restoration and Waste Management (ER&WM) Office of Technology Development through several projects including the development of a semiautonomous survey of a buried waste site using a remotely operated all-terrain robotic testbed borrowed from the US Army. The testbed was developed for the US Army's Human Engineering Laboratory (HEL) for the US Army's Soldier Robot Interface Project (SRIP). Initial development of the SRIP testbed was performed by a team including ORNL, HEL, Tooele Army Depot, and Odetics, Inc., as an experimental testbed for a variety of human factors issues related to military applications of robotics. The SRIP testbed was made available to the DOE and ORNL for the further development required for a remote landfill survey. The robot was modified extensively, equipped with environmental sensors, and used to demonstrate an automated remote survey of Solid Waste Storage Area No. 3 (SWSA 3) at ORNL on Tuesday, September 18, 1990. Burial trenches in this area containing contaminated materials were covered with soil nearly twenty years ago. This paper describes the SRIP testbed and work performed in FY 1990 to demonstrate a semiautonomous landfill survey at ORNL. 5 refs

  8. Benchmarking Diagnostic Algorithms on an Electrical Power System Testbed

    Science.gov (United States)

    Kurtoglu, Tolga; Narasimhan, Sriram; Poll, Scott; Garcia, David; Wright, Stephanie

    2009-01-01

    Diagnostic algorithms (DAs) are key to enabling automated health management. These algorithms are designed to detect and isolate anomalies of either a component or the whole system based on observations received from sensors. In recent years a wide range of algorithms, both model-based and data-driven, have been developed to increase autonomy and improve system reliability and affordability. However, the lack of support to perform systematic benchmarking of these algorithms continues to create barriers for effective development and deployment of diagnostic technologies. In this paper, we present our efforts to benchmark a set of DAs on a common platform using a framework that was developed to evaluate and compare various performance metrics for diagnostic technologies. The diagnosed system is an electrical power system, namely the Advanced Diagnostics and Prognostics Testbed (ADAPT) developed and located at the NASA Ames Research Center. The paper presents the fundamentals of the benchmarking framework, the ADAPT system, description of faults and data sets, the metrics used for evaluation, and an in-depth analysis of benchmarking results obtained from testing ten diagnostic algorithms on the ADAPT electrical power system testbed.
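
    The performance metrics used in such a benchmarking framework can be illustrated with a short sketch. The run-log format and metric set below (detection rate, isolation accuracy, false alarm rate, mean detection delay) are assumptions in the general spirit of diagnostic benchmarking, not the exact ADAPT framework, and the sample data are invented.

      def benchmark_diagnoser(runs):
          """Compute simple benchmarking metrics for a diagnostic algorithm (DA).

          Each run is a dict with:
            fault_injected  -- component name, or None for a nominal run
            fault_detected  -- True/False reported by the DA
            fault_isolated  -- component name reported by the DA, or None
            detection_delay -- seconds from injection to detection, or None if missed
          """
          faulty = [r for r in runs if r["fault_injected"] is not None]
          nominal = [r for r in runs if r["fault_injected"] is None]
          detected = [r for r in faulty if r["fault_detected"]]
          isolated = [r for r in detected if r["fault_isolated"] == r["fault_injected"]]
          delays = [r["detection_delay"] for r in detected if r["detection_delay"] is not None]
          return {
              "detection_rate": len(detected) / len(faulty) if faulty else None,
              "isolation_accuracy": len(isolated) / len(detected) if detected else None,
              "false_alarm_rate": (sum(r["fault_detected"] for r in nominal) / len(nominal)
                                   if nominal else None),
              "mean_detection_delay_s": sum(delays) / len(delays) if delays else None,
          }

      # Invented example data (not ADAPT results).
      runs = [
          {"fault_injected": "relay_1", "fault_detected": True,
           "fault_isolated": "relay_1", "detection_delay": 2.1},
          {"fault_injected": "sensor_3", "fault_detected": False,
           "fault_isolated": None, "detection_delay": None},
          {"fault_injected": None, "fault_detected": False,
           "fault_isolated": None, "detection_delay": None},
      ]
      print(benchmark_diagnoser(runs))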

  9. The CMS integration grid testbed

    Energy Technology Data Exchange (ETDEWEB)

    Graham, Gregory E.

    2004-08-26

    The CMS Integration Grid Testbed (IGT) comprises USCMS Tier-1 and Tier-2 hardware at the following sites: the California Institute of Technology, Fermi National Accelerator Laboratory, the University of California at San Diego, and the University of Florida at Gainesville. The IGT runs jobs using the Globus Toolkit with a DAGMan and Condor-G front end. The virtual organization (VO) is managed using VO management scripts from the European Data Grid (EDG). Gridwide monitoring is accomplished using local tools such as Ganglia interfaced into the Globus Metadata Directory Service (MDS) and the agent based Mona Lisa. Domain specific software is packaged and installed using the Distribution After Release (DAR) tool of CMS, while middleware under the auspices of the Virtual Data Toolkit (VDT) is distributed using Pacman. During a continuous two month span in Fall of 2002, over 1 million official CMS GEANT based Monte Carlo events were generated and returned to CERN for analysis while being demonstrated at SC2002. In this paper, we describe the process that led to one of the world's first continuously available, functioning grids.

  10. The CMS Integration Grid Testbed

    CERN Document Server

    Graham, G E; Aziz, Shafqat; Bauerdick, L.A.T.; Ernst, Michael; Kaiser, Joseph; Ratnikova, Natalia; Wenzel, Hans; Wu, Yu-jun; Aslakson, Erik; Bunn, Julian; Iqbal, Saima; Legrand, Iosif; Newman, Harvey; Singh, Suresh; Steenberg, Conrad; Branson, James; Fisk, Ian; Letts, James; Arbree, Adam; Avery, Paul; Bourilkov, Dimitri; Cavanaugh, Richard; Rodriguez, Jorge Luis; Kategari, Suchindra; Couvares, Peter; DeSmet, Alan; Livny, Miron; Roy, Alain; Tannenbaum, Todd

    2003-01-01

    The CMS Integration Grid Testbed (IGT) comprises USCMS Tier-1 and Tier-2 hardware at the following sites: the California Institute of Technology, Fermi National Accelerator Laboratory, the University of California at San Diego, and the University of Florida at Gainesville. The IGT runs jobs using the Globus Toolkit with a DAGMan and Condor-G front end. The virtual organization (VO) is managed using VO management scripts from the European Data Grid (EDG). Gridwide monitoring is accomplished using local tools such as Ganglia interfaced into the Globus Metadata Directory Service (MDS) and the agent based Mona Lisa. Domain specific software is packaged and installed using the Distribution After Release (DAR) tool of CMS, while middleware under the auspices of the Virtual Data Toolkit (VDT) is distributed using Pacman. During a continuous two month span in Fall of 2002, over 1 million official CMS GEANT based Monte Carlo events were generated and returned to CERN for analysis while being demonstrated at SC2002. ...

  11. Visible nulling coronagraph testbed results

    Science.gov (United States)

    Lyon, Richard G.; Clampin, Mark; Woodruff, Robert A.; Vasudevan, Gopal; Thompson, Patrick; Petrone, Peter; Madison, Timothy; Rizzo, Maxime; Melnick, Gary; Tolls, Volker

    2009-08-01

    We report on our recent laboratory results with the NASA/Goddard Space Flight Center (GSFC) Visible Nulling Coronagraph (VNC) testbed. We have experimentally achieved focal plane contrasts of 1 x 10^8 and approaching 10^9 at inner working angles of 2 * wavelength/D and 4 * wavelength/D respectively, where D is the aperture diameter. The result was obtained using a broadband source with a narrowband spectral filter of width 10 nm centered on 630 nm. To date this is the deepest nulling result with a visible nulling coronagraph yet obtained. Developed also is a Null Control Breadboard (NCB) to assess and quantify MEMS based segmented deformable mirror technology and develop and assess closed-loop null sensing and control algorithm performance from both the pupil and focal planes. We have demonstrated closed-loop control at 27 Hz in the laboratory environment. Efforts are underway to first bring the contrast to > 10^9 necessary for the direct detection and characterization of jovian (Jupiter-like) and then to > 10^10 necessary for terrestrial (Earth-like) exosolar planets. Short term advancements are expected to both broaden the spectral passband from 10 nm to 100 nm and to increase both the long-term stability to > 2 hours and the extent of the null out to ~ 10 * wavelength/D via the use of MEMS based segmented deformable mirror technology, a coherent fiber bundle, achromatic phase shifters, all in a vacuum chamber at the GSFC VNC facility. Additionally, an extreme-stability, textbook-sized compact VNC is under development.

  12. The DataTAG transatlantic testbed

    CERN Document Server

    Martin, O; Martin-Flatin, J P; Moroni, P; Nae, D; Newman, H; Ravot, S

    2005-01-01

    Wide area network testbeds allow researchers and engineers to test out new equipment, protocols and services in real-life situations, without jeopardizing the stability and reliability of production networks. The Data TransAtlantic Grid (DataTAG) testbed, deployed in 2002 between CERN, Geneva, Switzerland and StarLight, Chicago, IL, USA, is probably the largest testbed built to date. Jointly managed by CERN and Caltech, it is funded by the European Commission, the U.S. Department of Energy and the U.S. National Science Foundation. The main objectives of this testbed are to improve the Grid community's understanding of the networking issues posed by data-intensive Grid applications over transoceanic gigabit networks, design and develop new Grid middleware services, and improve the interoperability of European and U.S. Grid applications in High-Energy and Nuclear Physics. In this paper, we give an overview of this testbed, describe its various topologies over time, and summarize the main lessons learned after...

  13. Exploration Systems Health Management Facilities and Testbed Workshop

    Science.gov (United States)

    Wilson, Scott; Waterman, Robert; McCleskey, Carey

    2004-01-01

    Presentation Agenda: (1) Technology Maturation Pipeline (The Plan) (2) Cryogenic testbed (and other KSC Labs) (2a) Component / Subsystem technologies (3) Advanced Technology Development Center (ATDC) (3a) System / Vehicle technologies (4) ELV Flight Experiments (Flight Testbeds).

  14. The design and implementation of the LLNL gigabit testbed

    Energy Technology Data Exchange (ETDEWEB)

    Garcia, D. [Lawrence Livermore National Labs., CA (United States)

    1994-12-01

    This paper will look at the design and implementation of the LLNL Gigabit testbed (LGTB), where various high speed networking products can be tested in one environment. The paper will discuss the philosophy behind the design of and the need for the testbed, the tests that are performed in the testbed, and the tools used to implement those tests.

  15. SSERVI Analog Regolith Simulant Testbed Facility

    Science.gov (United States)

    Minafra, J.; Schmidt, G. K.

    2016-12-01

    SSERVI's goals include supporting planetary researchers within NASA and other government agencies; private sector and hardware developers; competitors in focused prize design competitions; and academic sector researchers. The SSERVI Analog Regolith Simulant Testbed provides opportunities for research scientists and engineers to study the effects of regolith analog testbed research in the planetary exploration field. This capability is essential to help understand the basic effects of continued long-term exposure to a simulated analog test environment. The current facility houses approximately eight tons of JSC-1A lunar regolith simulant in a test bin consisting of a 4 meter by 4 meter area. SSERVI provides a bridge between several groups, joining together researchers from: 1) scientific and exploration communities, 2) multiple disciplines across a wide range of planetary sciences, and 3) domestic and international communities and partnerships. This testbed provides a means of consolidating the tasks of acquisition, storage and safety mitigation in handling large quantities of regolith simulant. Facility hardware and environment testing scenarios include, but are not limited to, the following: lunar surface mobility, dust exposure and mitigation, regolith handling and excavation, solar-like illumination, lunar surface compaction profile, lofted dust, mechanical properties of lunar regolith, and surface features (i.e., grades and rocks). Benefits range from easy access to a controlled analog regolith simulant testbed and planetary exploration activities at NASA Research Park, to academia and expanded commercial opportunities in California's Silicon Valley, as well as public outreach and education opportunities.

  16. Cognitive Medical Wireless Testbed System (COMWITS)

    Science.gov (United States)

    2016-11-01

    This testbed merges two ARO grants... Hardware includes 64-bit CPUs: an Intel Xeon Processor E5-1650v3 (6C, 3.5 GHz, Turbo, HT, 15M, 140W), an Intel Core i7-3770 (3.4 GHz Quad Core, 77W), and dual Intel Xeon

  17. A Business-to-Business Interoperability Testbed: An Overview

    Energy Technology Data Exchange (ETDEWEB)

    Kulvatunyou, Boonserm [ORNL; Ivezic, Nenad [ORNL; Monica, Martin [Sun Microsystems, Inc.; Jones, Albert [National Institute of Standards and Technology (NIST)

    2003-10-01

    In this paper, we describe a business-to-business (B2B) testbed co-sponsored by the Open Applications Group, Inc. (OAGI) and the National Institute of Standards and Technology (NIST) to advance enterprise e-commerce standards. We describe the business and technical objectives and initial activities within the B2B Testbed. We summarize our initial lessons learned to form the requirements that drive the next generation testbed development. We also give an overview of a promising testing framework architecture in which to drive the testbed developments. We outline the future plans for the testbed development.

  18. Mini-mast CSI testbed user's guide

    Science.gov (United States)

    Tanner, Sharon E.; Pappa, Richard S.; Sulla, Jeffrey L.; Elliott, Kenny B.; Miserentino, Robert; Bailey, James P.; Cooper, Paul A.; Williams, Boyd L., Jr.; Bruner, Anne M.

    1992-01-01

    The Mini-Mast testbed is a 20 m generic truss highly representative of future deployable trusses for space applications. It is fully instrumented for system identification and active vibrations control experiments and is used as a ground testbed at NASA-Langley. The facility has actuators and feedback sensors linked via fiber optic cables to the Advanced Real Time Simulation (ARTS) system, where user defined control laws are incorporated into generic controls software. The object of the facility is to conduct comprehensive active vibration control experiments on a dynamically realistic large space structure. A primary goal is to understand the practical effects of simplifying theoretical assumptions. This User's Guide describes the hardware and its primary components, the dynamic characteristics of the test article, the control law implementation process, and the necessary safeguards employed to protect the test article. Suggestions for a strawman controls experiment are also included.
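
    As a flavor of the kind of user-defined control law the guide covers, the sketch below simulates simple rate (velocity) feedback adding damping to a single lightly damped bending mode, with a command limit standing in for the safeguards mentioned above. All numbers (mode frequency, damping, gain, limits) are illustrative assumptions; the actual control laws run in the ARTS system, not in Python.

      import numpy as np

      DT = 0.005                 # 200 Hz control loop (assumed)
      OMEGA = 2 * np.pi * 0.85   # ~0.85 Hz first bending mode (illustrative)
      ZETA = 0.005               # open-loop damping ratio (illustrative)
      GAIN = 8.0                 # rate-feedback gain (illustrative)
      U_MAX = 1.0                # actuator command limit, a stand-in safeguard

      x, v = 0.01, 0.0           # initial modal displacement and velocity
      for _ in range(2000):      # 10 s of closed-loop response
          u = float(np.clip(-GAIN * v, -U_MAX, U_MAX))   # rate feedback adds damping
          a = -2 * ZETA * OMEGA * v - OMEGA**2 * x + u   # unit-mass modal dynamics
          v += a * DT
          x += v * DT
      print(f"modal displacement after 10 s: {x:.5f}")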

  19. SSERVI Analog Regolith Simulant Testbed Facility

    Science.gov (United States)

    Minafra, Joseph; Schmidt, Gregory; Bailey, Brad; Gibbs, Kristina

    2016-10-01

    The Solar System Exploration Research Virtual Institute (SSERVI) at NASA's Ames Research Center in California's Silicon Valley was founded in 2013 to act as a virtual institute that provides interdisciplinary research centered on the goals of its supporting directorates: NASA Science Mission Directorate (SMD) and the Human Exploration & Operations Mission Directorate (HEOMD). Primary research goals of the Institute revolve around the integration of science and exploration to gain knowledge required for the future of human space exploration beyond low Earth orbit. SSERVI intends to leverage existing JSC-1A regolith simulant resources into the creation of a regolith simulant testbed facility. The purpose of this testbed concept is to provide the planetary exploration community with a readily available capability to test hardware and conduct research in a large simulant environment. SSERVI's goals include supporting planetary researchers within NASA, other government agencies; private sector and hardware developers; competitors in focused prize design competitions; and academic sector researchers. SSERVI provides opportunities for research scientists and engineers to study the effects of regolith analog testbed research in the planetary exploration field. This capability is essential to help to understand the basic effects of continued long-term exposure to a simulated analog test environment. The current facility houses approximately eight tons of JSC-1A lunar regolith simulant in a test bin consisting of a 4 meter by 4 meter area, including dust mitigation and safety oversight. Facility hardware and environment testing scenarios could include: lunar surface mobility, dust exposure and mitigation, regolith handling and excavation, solar-like illumination, lunar surface compaction profile, lofted dust, mechanical properties of lunar regolith, and surface features (i.e., grades and rocks). Numerous benefits vary from easy access to a controlled analog regolith simulant testbed, and

  20. Current Developments in DETER Cybersecurity Testbed Technology

    Science.gov (United States)

    2015-12-08

    Experimental cybersecurity research is often inherently risky. An experiment may involve releasing live malware code, operating a real botnet... imagine a worm that can only propagate by first contacting a “propagation service” (T1 constraint), composed with a testbed firewall (T2... experiment. Finally, T1 constraints might be enforced by (1) explicit modification of malware to constrain its behavior, (2) implicit constraints

  1. The Airborne Optical Systems Testbed (AOSTB)

    Science.gov (United States)

    2017-05-31

    are the Atlantic Ocean and coastal waterways, which reflect back very little light at our SWIR operating wavelength of 1064 nm. ... To demonstrate our typical FOPEN capabilities, figure 5 shows two images taken over a forested area near Burlington, VT. Figure 5(a) is a 3D point... [Fig. 5: Ladar target scan of a forested area in northern Vermont.]

  2. Towards standard testbeds for numerical relativity

    International Nuclear Information System (INIS)

    Alcubierre, Miguel; Allen, Gabrielle; Bona, Carles; Fiske, David; Goodale, Tom; Guzman, F Siddhartha; Hawke, Ian; Hawley, Scott H; Husa, Sascha; Koppitz, Michael; Lechner, Christiane; Pollney, Denis; Rideout, David; Salgado, Marcelo; Schnetter, Erik; Seidel, Edward; Shinkai, Hisa-aki; Shoemaker, Deirdre; Szilagyi, Bela; Takahashi, Ryoji; Winicour, Jeff

    2004-01-01

    In recent years, many different numerical evolution schemes for Einstein's equations have been proposed to address stability and accuracy problems that have plagued the numerical relativity community for decades. Some of these approaches have been tested on different spacetimes, and conclusions have been drawn based on these tests. However, differences in results originate from many sources, including not only formulations of the equations, but also gauges, boundary conditions, numerical methods and so on. We propose to build up a suite of standardized testbeds for comparing approaches to the numerical evolution of Einstein's equations that are designed to both probe their strengths and weaknesses and to separate out different effects, and their causes, seen in the results. We discuss general design principles of suitable testbeds, and we present an initial round of simple tests with periodic boundary conditions. This is a pivotal first step towards building a suite of testbeds to serve the numerical relativists and researchers from related fields who wish to assess the capabilities of numerical relativity codes. We present some examples of how these tests can be quite effective in revealing various limitations of different approaches, and illustrating their differences. The tests are presently limited to vacuum spacetimes, can be run on modest computational resources and can be used with many different approaches used in the relativity community

  3. Towards standard testbeds for numerical relativity

    Energy Technology Data Exchange (ETDEWEB)

    Alcubierre, Miguel [Inst. de Ciencias Nucleares, Univ. Nacional Autonoma de Mexico, Apartado Postal 70-543, Mexico Distrito Federal 04510 (Mexico); Allen, Gabrielle; Goodale, Tom; Guzman, F Siddhartha; Hawke, Ian; Husa, Sascha; Koppitz, Michael; Lechner, Christiane; Pollney, Denis; Rideout, David [Max-Planck-Inst. fuer Gravitationsphysik, Albert-Einstein-Institut, 14476 Golm (Germany); Bona, Carles [Departament de Fisica, Universitat de les Illes Balears, Ctra de Valldemossa km 7.5, 07122 Palma de Mallorca (Spain); Fiske, David [Dept. of Physics, Univ. of Maryland, College Park, MD 20742-4111 (United States); Hawley, Scott H [Center for Relativity, Univ. of Texas at Austin, Austin, Texas 78712 (United States); Salgado, Marcelo [Inst. de Ciencias Nucleares, Univ. Nacional Autonoma de Mexico, Apartado Postal 70-543, Mexico Distrito Federal 04510 (Mexico); Schnetter, Erik [Inst. fuer Astronomie und Astrophysik, Universitaet Tuebingen, 72076 Tuebingen (Germany); Seidel, Edward [Max-Planck-Inst. fuer Gravitationsphysik, Albert-Einstein-Inst., 14476 Golm (Germany); Shinkai, Hisa-aki [Computational Science Div., Inst. of Physical and Chemical Research (RIKEN), Hirosawa 2-1, Wako, Saitama 351-0198 (Japan); Shoemaker, Deirdre [Center for Radiophysics and Space Research, Cornell Univ., Ithaca, NY 14853 (United States); Szilagyi, Bela [Dept. of Physics and Astronomy, Univ. of Pittsburgh, Pittsburgh, PA 15260 (United States); Takahashi, Ryoji [Theoretical Astrophysics Center, Juliane Maries Vej 30, 2100 Copenhagen, (Denmark); Winicour, Jeff [Max-Planck-Inst. fuer Gravitationsphysik, Albert-Einstein-Institut, 14476 Golm (Germany)

    2004-01-21

    In recent years, many different numerical evolution schemes for Einstein's equations have been proposed to address stability and accuracy problems that have plagued the numerical relativity community for decades. Some of these approaches have been tested on different spacetimes, and conclusions have been drawn based on these tests. However, differences in results originate from many sources, including not only formulations of the equations, but also gauges, boundary conditions, numerical methods and so on. We propose to build up a suite of standardized testbeds for comparing approaches to the numerical evolution of Einstein's equations that are designed to both probe their strengths and weaknesses and to separate out different effects, and their causes, seen in the results. We discuss general design principles of suitable testbeds, and we present an initial round of simple tests with periodic boundary conditions. This is a pivotal first step towards building a suite of testbeds to serve the numerical relativists and researchers from related fields who wish to assess the capabilities of numerical relativity codes. We present some examples of how these tests can be quite effective in revealing various limitations of different approaches, and illustrating their differences. The tests are presently limited to vacuum spacetimes, can be run on modest computational resources and can be used with many different approaches used in the relativity community.
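
    For context, one member of the Apples with Apples family of tests referred to above is a one-dimensional gauge wave: flat spacetime written in a wave-like gauge and evolved with periodic boundary conditions. The line element below is quoted from memory as an illustration of the kind of analytic testbed data involved; the exact sign and amplitude conventions should be checked against the original test specification.

      % Illustrative 1D gauge-wave testbed metric (flat spacetime in a wave-like gauge);
      % the amplitude A and period d are test parameters (e.g. A of order 0.01-0.1).
      \begin{equation}
        ds^2 = -H\,dt^2 + H\,dx^2 + dy^2 + dz^2,
        \qquad
        H(x,t) = 1 - A\,\sin\!\left(\frac{2\pi (x - t)}{d}\right).
      \end{equation}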

  4. A Testbed Environment for Buildings-to-Grid Cyber Resilience Research and Development

    Energy Technology Data Exchange (ETDEWEB)

    Sridhar, Siddharth; Ashok, Aditya; Mylrea, Michael E.; Pal, Seemita; Rice, Mark J.; Gourisetti, Sri Nikhil Gup

    2017-09-19

    The Smart Grid is characterized by the proliferation of advanced digital controllers at all levels of its operational hierarchy from generation to end consumption. Such controllers within modern residential and commercial buildings enable grid operators to exercise fine-grained control over energy consumption through several emerging Buildings-to-Grid (B2G) applications. Though this capability promises significant benefits in terms of operational economics and improved reliability, cybersecurity weaknesses in the supporting infrastructure could be exploited to cause a detrimental effect and this necessitates focused research efforts on two fronts. First, the understanding of how cyber attacks in the B2G space could impact grid reliability and to what extent. Second, the development and validation of cyber-physical application-specific countermeasures that are complementary to traditional infrastructure cybersecurity mechanisms for enhanced cyber attack detection and mitigation. The PNNL B2G testbed is currently being developed to address these core research needs. Specifically, the B2G testbed combines high-fidelity buildings+grid simulators, industry-grade building automation and Supervisory Control and Data Acquisition (SCADA) systems in an integrated, realistic, and reconfigurable environment capable of supporting attack-impact-detection-mitigation experimentation. In this paper, we articulate the need for research testbeds to model various B2G applications broadly by looking at the end-to-end operational hierarchy of the Smart Grid. Finally, the paper not only describes the architecture of the B2G testbed in detail, but also addresses the broad spectrum of B2G resilience research it is capable of supporting based on the smart grid operational hierarchy identified earlier.

  5. Building a ROS-Based Testbed for Realistic Multi-Robot Simulation: Taking the Exploration as an Example

    Directory of Open Access Journals (Sweden)

    Zhi Yan

    2017-09-01

    Full Text Available While the robotics community agrees that benchmarking is of high importance for objectively comparing different solutions, there are only a few, limited tools to support it. To address this issue in the context of multi-robot systems, we have defined a benchmarking process based on experimental designs, aimed at improving the reproducibility of experiments by making explicit all elements of a benchmark such as parameters, measurements and metrics. We have also developed a ROS (Robot Operating System)-based testbed with the goal of making it easy for users to validate, benchmark, and compare different algorithms including coordination strategies. Our testbed uses the MORSE (Modular OpenRobots Simulation Engine) simulator for realistic simulation and a computer cluster for decentralized computation. In this paper, we present our testbed in detail, covering the architecture and infrastructure, the issues encountered in implementing the infrastructure, and the automation of the deployment. We also report a series of experiments on multi-robot exploration, in order to demonstrate the capabilities of our testbed.
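
    The benchmarking process mentioned above makes parameters, measurements and metrics explicit; the sketch below shows how two typical multi-robot exploration metrics (time to reach a coverage threshold and total distance traveled) might be computed from logged data. The log format, coverage goal, and sample values are assumptions for illustration, not the authors' testbed output.

      import math

      def exploration_metrics(coverage_samples, robot_paths, coverage_goal=0.99):
          """coverage_samples: list of (t_seconds, explored_ratio); robot_paths: {robot: [(x, y), ...]}."""
          # Exploration time: first time the explored ratio reaches the coverage goal.
          t_complete = next((t for t, ratio in coverage_samples if ratio >= coverage_goal), None)
          # Total distance traveled, summed over all robots.
          dist = sum(math.dist(p, q)
                     for path in robot_paths.values()
                     for p, q in zip(path, path[1:]))
          return {"exploration_time_s": t_complete, "total_distance_m": dist}

      # Invented log of a two-robot run (not real experimental data).
      samples = [(0, 0.0), (60, 0.4), (120, 0.8), (180, 0.99)]
      paths = {"robot_0": [(0, 0), (3, 4), (6, 8)], "robot_1": [(0, 0), (-4, 3)]}
      print(exploration_metrics(samples, paths))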

  6. A commercial space technology testbed on ISS

    Science.gov (United States)

    Boyle, David R.

    2000-01-01

    There is a significant and growing commercial market for new, more capable communications and remote sensing satellites. Competition in this market strongly motivates satellite manufacturers and spacecraft component developers to test and demonstrate new space hardware in a realistic environment. External attach points on the International Space Station allow it to function uniquely as a space technology testbed to satisfy this market need. However, space industry officials have identified three critical barriers to their commercial use of the ISS: unpredictable access, cost risk, and schedule uncertainty. Appropriate NASA policy initiatives and business/technical assistance for industry from the Commercial Space Center for Engineering can overcome these barriers.

  7. Use of Tabu Search in a Solver to Map Complex Networks onto Emulab Testbeds

    National Research Council Canada - National Science Library

    MacDonald, Jason E

    2007-01-01

    The University of Utah's solver for the testbed mapping problem uses a simulated annealing metaheuristic algorithm to map a researcher's experimental network topology onto available testbed resources...

  8. Nuclear Instrumentation and Control Cyber Testbed Considerations – Lessons Learned

    Energy Technology Data Exchange (ETDEWEB)

    Jonathan Gray; Robert Anderson; Julio G. Rodriguez; Cheol-Kwon Lee

    2014-08-01

    Abstract: Identifying and understanding digital instrumentation and control (I&C) cyber vulnerabilities within nuclear power plants and other nuclear facilities is critical if nation states desire to operate nuclear facilities safely, reliably, and securely. In order to demonstrate objective evidence that cyber vulnerabilities have been adequately identified and mitigated, a testbed representing a facility’s critical nuclear equipment must be replicated. Idaho National Laboratory (INL) has built and operated similar testbeds for common critical infrastructure I&C for over ten years. This experience developing, operating, and maintaining an I&C testbed in support of research identifying cyber vulnerabilities has led the Korean Atomic Energy Research Institute of the Republic of Korea to solicit the experiences of INL to help mitigate problems early in the design, development, operation, and maintenance of a similar testbed. The following information will discuss I&C testbed lessons learned and the impact of these experiences on KAERI.

  9. 77 FR 18793 - Spectrum Sharing Innovation Test-Bed Pilot Program

    Science.gov (United States)

    2012-03-28

    .... 120322212-2212-01] Spectrum Sharing Innovation Test-Bed Pilot Program AGENCY: National Telecommunications... Innovation Test-Bed pilot program to assess whether devices employing Dynamic Spectrum Access techniques can... Spectrum Sharing Innovation Test-Bed (Test-Bed) pilot program to examine the feasibility of increased...

  10. Development of an autonomous power system testbed

    International Nuclear Information System (INIS)

    Barton, J.R.; Adams, T.; Liffring, M.E.

    1985-01-01

    A power system testbed has been assembled to advance the development of large autonomous electrical power systems required for the space station, spacecraft, and aircraft. The power system for this effort was designed to simulate single- or dual-bus autonomous power systems, or autonomous systems that reconfigure from a single bus to a dual bus following a severe fault. The approach taken was to provide a flexible power system design with two computer systems for control and management. One computer operates as the control system and performs basic control functions, data and command processing, charge control, and provides status to the second computer. The second computer contains expert system software for mission planning, load management, fault identification and recovery, and sends load and configuration commands to the control system

  11. Aerodynamic design of the National Rotor Testbed.

    Energy Technology Data Exchange (ETDEWEB)

    Kelley, Christopher Lee [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-10-01

    A new wind turbine blade has been designed for the National Rotor Testbed (NRT) project and for future experiments at the Scaled Wind Farm Technology (SWiFT) facility, with a specific focus on scaled wakes. This report shows the aerodynamic design of new blades that can produce a wake that has similitude to utility-scale blades despite the difference in size and location in the atmospheric boundary layer. The dimensionless quantities of circulation, induction, thrust coefficient, and tip-speed ratio were kept equal between rotor scales in region 2 of operation. The new NRT design matched the aerodynamic quantities of the most common wind turbine in the United States, the GE 1.5sle turbine with 37c model blades. The NRT blade design is presented along with its performance subject to the winds at SWiFT. The design requirements determined by the SWiFT experimental test campaign are shown to be met.
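
    The scaling argument rests on holding standard dimensionless rotor quantities fixed between the full-scale rotor and the NRT rotor. Two of them, in their textbook definitions (generic definitions, not expressions taken from the report), are the tip-speed ratio and the thrust coefficient:

        \lambda = \frac{\Omega R}{U_\infty}, \qquad
        C_T = \frac{T}{\tfrac{1}{2}\,\rho\,U_\infty^{2}\,\pi R^{2}}

    where \Omega is the rotor speed, R the rotor radius, U_\infty the freestream wind speed, T the rotor thrust and \rho the air density. Matching these quantities (together with circulation and induction) in region 2 is what allows the smaller rotor to shed a wake with similitude to the utility-scale wake.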

  12. Testbed model and data assimilation for ARM

    International Nuclear Information System (INIS)

    Louis, J.F.

    1992-01-01

    The objectives of this contract are to further develop and test the ALFA (AER Local Forecast and Assimilation) model originally designed at AER for local weather prediction and apply it to three distinct but related purposes in connection with the Atmospheric Radiation Measurement (ARM) program: (a) to provide a testbed that simulates a global climate model in order to facilitate the development and testing of new cloud parametrizations and radiation models; (b) to assimilate the ARM data continuously at the scale of a climate model, using the adjoint method, thus providing the initial conditions and verification data for testing parametrizations; (c) to study the sensitivity of a radiation scheme to cloud parameters, again using the adjoint method, thus demonstrating the usefulness of the testbed model. The data assimilation will use a variational technique that minimizes the difference between the model results and the observations during the analysis period. The adjoint model is used to compute the gradient of a measure of the model errors with respect to nudging terms that are added to the equations to force the model output closer to the data. The radiation scheme that will be included in the basic ALFA model makes use of a generalized two-stream approximation, and is designed for vertically inhomogeneous, multiple-scattering atmospheres. The sensitivity of this model to the definition of cloud parameters will be studied. The adjoint technique will also be used to compute the sensitivities. This project is designed to provide the Science Team members with the appropriate tools and modeling environment for proper testing and tuning of new radiation models and cloud parametrization schemes
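
    The assimilation described above minimizes a model-observation misfit with respect to nudging terms, with the gradient supplied by the adjoint model. A generic form of such a cost function (a standard nudging-type variational formulation, not the contract's exact expression) is

        J(\eta) = \tfrac{1}{2}\int_{t_0}^{t_1} \big( H x(t) - y(t) \big)^{\top} R^{-1} \big( H x(t) - y(t) \big)\,dt \;+\; \tfrac{\alpha}{2}\int_{t_0}^{t_1} \lVert \eta(t) \rVert^{2}\,dt, \qquad \dot{x} = M(x) + \eta,

    where x is the model state evolved by the model M, \eta the nudging term added to the equations, y the ARM observations, H the observation operator, R the observation-error covariance, and \alpha a penalty weight; integrating the adjoint of M backwards in time yields the gradient \nabla_{\eta} J used by the minimization.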

  13. An Approach for Smart Antenna Testbed

    Science.gov (United States)

    Kawitkar, R. S.; Wakde, D. G.

    2003-07-01

    The use of wireless, mobile, personal communications services is expanding rapidly. Adaptive or "Smart" antenna arrays can increase channel capacity through spatial division. Adaptive antennas can also track mobile users, improving both signal range and quality. For these reasons, smart antenna systems have attracted widespread interest in the telecommunications industry for applications to third generation wireless systems. This paper aims to design and develop an advanced antenna testbed to serve as a common reference for testing adaptive antenna arrays and signal combining algorithms, as well as complete systems. A flexible suite of off-line processing software should be written using Matlab to perform system calibration, testbed initialization, data acquisition control, data storage/transfer, off-line signal processing and analysis, and graph plotting. The goal of this paper is to develop low-complexity smart antenna structures for 3G systems. The emphasis will be on ease of implementation in a multichannel/multi-user environment. A smart antenna test bed will be developed, and various state-of-the-art DSP structures and algorithms will be investigated. Facing the soaring demand for mobile communications, the use of smart antenna arrays in mobile communications systems to exploit spatial diversity to further improve spectral efficiency has recently received considerable attention. Basically, a smart antenna array comprises a number of antenna elements combined via a beamforming network (amplitude and phase control network). Some of the benefits that can be achieved by using SAS (Smart Antenna System) include lower mobile terminal power consumption, range extension, ISI reduction, higher data rate support, and ease of integration into the existing base station system. In terms of economic benefits, adaptive antenna systems employed at the base station, though they increase the per-base-station cost, can increase the coverage area of each cell site, thereby reducing
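
    The beamforming network mentioned above is, at its core, an amplitude and phase weighting applied across the array elements before summation. A minimal narrowband illustration in Python follows (generic textbook beamforming, not the authors' DSP implementation):

        import numpy as np

        def steering_vector(n_elem, d_over_lambda, theta_deg):
            """Response of an n-element uniform linear array toward angle theta."""
            theta = np.deg2rad(theta_deg)
            k = np.arange(n_elem)
            return np.exp(1j * 2 * np.pi * d_over_lambda * k * np.sin(theta))

        # Conventional (delay-and-sum) weights steered to 20 degrees
        w = steering_vector(4, 0.5, 20.0) / 4.0

        # One array snapshot: desired signal from 20 deg plus interference from -40 deg
        x = steering_vector(4, 0.5, 20.0) + 0.5 * steering_vector(4, 0.5, -40.0)

        y = np.vdot(w, x)   # beamformer output: conjugate-weighted sum of elements
        print(abs(y))       # close to 1, the desired signal amplitude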

  14. Development of a space-systems network testbed

    Science.gov (United States)

    Lala, Jaynarayan; Alger, Linda; Adams, Stuart; Burkhardt, Laura; Nagle, Gail; Murray, Nicholas

    1988-01-01

    This paper describes a communications network testbed which has been designed to allow the development of architectures and algorithms that meet the functional requirements of future NASA communication systems. The central hardware components of the Network Testbed are programmable circuit switching communication nodes which can be adapted by software or firmware changes to customize the testbed to particular architectures and algorithms. Fault detection, isolation, and reconfiguration has been implemented in the Network with a hybrid approach which utilizes features of both centralized and distributed techniques to provide efficient handling of faults within the Network.

  15. Wavefront control performance modeling with WFIRST shaped pupil coronagraph testbed

    Science.gov (United States)

    Zhou, Hanying; Nemati, Bijian; Krist, John; Cady, Eric; Kern, Brian; Poberezhskiy, Ilya

    2017-09-01

    NASA's WFIRST mission includes a coronagraph instrument (CGI) for direct imaging of exoplanets. Significant improvement in CGI model fidelity has been made recently, alongside a testbed high-contrast demonstration in a simulated dynamic environment at JPL. We present our modeling method and the results of comparisons to the testbed's high-order wavefront correction performance for the shaped pupil coronagraph. Agreement between model prediction and testbed result at better than a factor of 2 has been consistently achieved in raw contrast (contrast floor, chromaticity, and convergence), and with that comes good agreement in contrast sensitivity to wavefront perturbations and mask lateral shear.

  16. Improving Flight Software Module Validation Efforts : a Modular, Extendable Testbed Software Framework

    Science.gov (United States)

    Lange, R. Connor

    2012-01-01

    Ever since Explorer-1, the United States' first Earth satellite, was developed and launched in 1958, JPL has developed many more spacecraft, including landers and orbiters. While these spacecraft vary greatly in their missions, capabilities, and destinations, they all have something in common. All of the components of these spacecraft had to be comprehensively tested. While thorough testing is important to mitigate risk, it is also a very expensive and time-consuming process. Thankfully, since virtually all of the software testing procedures for SMAP are computer controlled, these procedures can be automated. Most people testing SMAP flight software (FSW) would only need to write tests that exercise specific requirements and then check the filtered results to verify everything occurred as planned. This gives developers the ability to automatically launch tests on the testbed, distill the resulting logs into only the important information, generate validation documentation, and then deliver the documentation to management. With many of the steps in FSW testing automated, developers can use their limited time more effectively, validate SMAP FSW modules more quickly, and test them more rigorously. As a result of the various benefits of automating much of the testing process, management is considering the use of these automated tools in future FSW validation efforts.
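
    The workflow sketched above (launch a test on the testbed, distill the resulting log, and produce validation documentation) can be pictured with a small driver script. The sketch below is purely hypothetical: the testbed_run command, file layout and filter tags are invented for illustration and are not SMAP ground-software interfaces.

        import subprocess
        from pathlib import Path

        def run_and_distill(test_script, keep=("ERROR", "WARN", "REQ-")):
            """Launch one testbed procedure and keep only the log lines that matter."""
            result = subprocess.run(["testbed_run", str(test_script)],  # hypothetical CLI
                                    capture_output=True, text=True)
            kept = [line for line in result.stdout.splitlines()
                    if any(tag in line for tag in keep)]
            report = Path(test_script).with_suffix(".report.txt")
            report.write_text("\n".join(kept))
            return report

        # Batch a regression suite and collect the distilled reports for review
        reports = [run_and_distill(t) for t in sorted(Path("tests").glob("*.proc"))]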

  17. Application of automation for low cost aircraft cabin simulator

    NARCIS (Netherlands)

    Tan, C.F.; Chen, W.; Boomen, van den G.J.A.; Rauterberg, G.W.M.

    2010-01-01

    This paper presents an application of automation for a low-cost aircraft cabin simulator. The aircraft cabin simulator is a testbed that was designed for research on aircraft passenger comfort improvement products. The simulator consists of an economy class section, a business class section, a lavatory

  18. Prognostics-Enabled Power Supply for ADAPT Testbed, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Ridgetop's role is to develop electronic prognostics for sensing power systems in support of the NASA/Ames ADAPT testbed. The prognostic-enabled power systems from...

  19. The Living With a Star Space Environment Testbed Payload

    Science.gov (United States)

    Xapsos, Mike

    2015-01-01

    This presentation outlines a brief description of the Living With a Star (LWS) Program missions and detailed information about the Space Environment Testbed (SET) payload consisting of a space weather monitor and carrier containing 4 board experiments.

  20. Integrating Simulated Physics and Device Virtualization in Control System Testbeds

    OpenAIRE

    Redwood , Owen; Reynolds , Jason; Burmester , Mike

    2016-01-01

    Part 3: INFRASTRUCTURE MODELING AND SIMULATION; International audience; Malware and forensic analyses of embedded cyber-physical systems are tedious, manual processes that testbeds are commonly not designed to support. Additionally, attesting the physics impact of embedded cyber-physical system malware has no formal methodologies and is currently an art. This chapter describes a novel testbed design methodology that integrates virtualized embedded industrial control systems and physics simula...

  1. A Novel UAV Electric Propulsion Testbed for Diagnostics and Prognostics

    Science.gov (United States)

    Gorospe, George E., Jr.; Kulkarni, Chetan S.

    2017-01-01

    This paper presents a novel hardware-in-the-loop (HIL) testbed for systems-level diagnostics and prognostics of an electric propulsion system used in UAVs (unmanned aerial vehicles). Referencing the all-electric Edge 540T aircraft used in science and research by NASA Langley Flight Research Center, the HIL testbed includes an identical propulsion system, consisting of motors, speed controllers and batteries. Isolated under a controlled laboratory environment, the propulsion system has been instrumented for advanced diagnostics and prognostics. To produce flight-like loading on the system, a slave motor is coupled to the motor under test (MUT) and provides variable mechanical resistance, as well as the capability of introducing nondestructive, mechanical wear-like frictional loads on the system. This testbed enables the verification of mathematical models of each component of the propulsion system, the repeatable generation of flight-like loads on the system for fault analysis, test-to-failure scenarios, and the development of advanced system-level diagnostics and prognostics methods. The capabilities of the testbed are extended through the integration of a LabVIEW-based client for the Live Virtual Constructive Distributed Environment (LVCDC) Gateway, which enables both the publishing of generated data for remotely located observers and prognosers and the synchronization of the testbed propulsion system with vehicles in the air. The developed HIL testbed gives researchers easy access to a scientifically relevant portion of the aircraft without the overhead and dangers encountered during actual flight.

  2. The Science of Home Automation

    Science.gov (United States)

    Thomas, Brian Louis

    Smart home technologies and the concept of home automation have become more popular in recent years. This popularity has been accompanied by social acceptance of passive sensors installed throughout the home. The subsequent increase in smart homes facilitates the creation of home automation strategies. We believe that home automation strategies can be generated intelligently by utilizing smart home sensors and activity learning. In this dissertation, we hypothesize that home automation can benefit from activity awareness. To test this, we develop our activity-aware smart automation system, CARL (CASAS Activity-aware Resource Learning). CARL learns the associations between activities and device usage from historical data and utilizes the activity-aware capabilities to control the devices. To help validate CARL we deploy and test three different versions of the automation system in a real-world smart environment. To provide a foundation of activity learning, we integrate existing activity recognition and activity forecasting into CARL home automation. We also explore two alternatives to using human-labeled data to train the activity learning models. The first unsupervised method is Activity Detection, and the second is a modified DBSCAN algorithm that utilizes Dynamic Time Warping (DTW) as a distance metric. We compare the performance of activity learning with human-defined labels and with automatically-discovered activity categories. To provide evidence in support of our hypothesis, we evaluate CARL automation in a smart home testbed. Our results indicate that home automation can be boosted through activity awareness. We also find that the resulting automation has a high degree of usability and comfort for the smart home resident.
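
    One of the unsupervised alternatives mentioned above combines DBSCAN with Dynamic Time Warping (DTW) as the distance metric. The sketch below shows the generic combination using scikit-learn's precomputed-distance mode and a naive DTW; it illustrates the idea only, not the dissertation's modified algorithm, and the toy sequences are invented.

        import numpy as np
        from sklearn.cluster import DBSCAN

        def dtw(a, b):
            """Naive O(len(a)*len(b)) dynamic time warping distance."""
            n, m = len(a), len(b)
            D = np.full((n + 1, m + 1), np.inf)
            D[0, 0] = 0.0
            for i in range(1, n + 1):
                for j in range(1, m + 1):
                    cost = abs(a[i - 1] - b[j - 1])
                    D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
            return D[n, m]

        # Toy daily sensor-activation sequences, one per day
        days = [np.array(s, float) for s in ([1, 2, 2, 3], [1, 2, 3], [5, 5, 6], [5, 6, 6, 6])]
        dist = np.array([[dtw(a, b) for b in days] for a in days])

        labels = DBSCAN(eps=2.0, min_samples=2, metric="precomputed").fit_predict(dist)
        print(labels)   # cluster labels stand in for discovered activity categories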

  3. Optical testbed for the LISA phasemeter

    International Nuclear Information System (INIS)

    Schwarze, T S; Fernández Barranco, G; Penkert, D; Gerberding, O; Heinzel, G; Danzmann, K

    2016-01-01

    The planned spaceborne gravitational wave detector LISA will allow the detection of gravitational waves at frequencies between 0.1 mHz and 1 Hz. A breadboard model for the metrology system aka the phasemeter was developed in the scope of an ESA technology development project by a collaboration between the Albert Einstein Institute, the Technical University of Denmark and the Danish industry partner Axcon Aps. It in particular provides the electronic readout of the main interferometer phases besides auxiliary functions. These include clock noise transfer, ADC pilot tone correction, inter-satellite ranging and data transfer. Besides in LISA, the phasemeter can also be applied in future satellite geodesy missions. Here we show the planning and advances in the implementation of an optical testbed for the full metrology chain. It is based on an ultra-stable hexagonal optical bench. This bench allows the generation of three unequal heterodyne beatnotes with a zero phase combination, thus providing the possibility to probe the phase readout for non-linearities in an optical three signal test. Additionally, the utilization of three independent phasemeters will allow the testing of the auxiliary functions. Once working, components can individually be replaced with flight-qualified hardware in this setup. (paper)
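
    The "zero phase combination" behind the three-signal test can be stated compactly. With beatnotes formed pairwise between three lasers whose optical phases are \phi_1, \phi_2, \phi_3 (standard heterodyne bookkeeping, not a quotation from the paper), the beat phases satisfy

        \varphi_{12} + \varphi_{23} + \varphi_{31} = (\phi_1 - \phi_2) + (\phi_2 - \phi_3) + (\phi_3 - \phi_1) = 0,

    so any residual in the measured sum of the three phasemeter readouts exposes non-linearity in the readout itself rather than a real signal.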

  4. Termite: Emulation Testbed for Encounter Networks

    Directory of Open Access Journals (Sweden)

    Rodrigo Bruno

    2015-08-01

    Full Text Available Cutting-edge mobile devices like smartphones and tablets are equipped with various infrastructureless wireless interfaces, such as WiFi Direct and Bluetooth. Such technologies allow for novel mobile applications that take advantage of casual encounters between co-located users. However, the need to mimic the behavior of real-world encounter networks makes testing and debugging of such applications hard tasks. We present Termite, an emulation testbed for encounter networks. Our system allows developers to run their applications on a virtual encounter network emulated by software. Developers can model arbitrary encounter networks and specify user interactions on the emulated virtual devices. To facilitate testing and debugging, developers can place breakpoints, inspect the runtime state of virtual nodes, and run experiments in a stepwise fashion. Termite defines its own Petri Net variant to model the dynamically changing topology and synthesize user interactions with virtual devices. The system is designed to efficiently multiplex an underlying emulation hosting infrastructure across multiple developers, and to support heterogeneous mobile platforms. Our current system implementation supports virtual Android devices communicating over WiFi Direct networks and runs on top of a local cloud infrastructure. We evaluated our system using emulator network traces, and found that Termite is expressive and performs well.

  5. Optical testbed for the LISA phasemeter

    Science.gov (United States)

    Schwarze, T. S.; Fernández Barranco, G.; Penkert, D.; Gerberding, O.; Heinzel, G.; Danzmann, K.

    2016-05-01

    The planned spaceborne gravitational wave detector LISA will allow the detection of gravitational waves at frequencies between 0.1 mHz and 1 Hz. A breadboard model for the metrology system aka the phasemeter was developed in the scope of an ESA technology development project by a collaboration between the Albert Einstein Institute, the Technical University of Denmark and the Danish industry partner Axcon Aps. It in particular provides the electronic readout of the main interferometer phases besides auxiliary functions. These include clock noise transfer, ADC pilot tone correction, inter-satellite ranging and data transfer. Besides in LISA, the phasemeter can also be applied in future satellite geodesy missions. Here we show the planning and advances in the implementation of an optical testbed for the full metrology chain. It is based on an ultra-stable hexagonal optical bench. This bench allows the generation of three unequal heterodyne beatnotes with a zero phase combination, thus providing the possibility to probe the phase readout for non-linearities in an optical three signal test. Additionally, the utilization of three independent phasemeters will allow the testing of the auxiliary functions. Once working, components can individually be replaced with flight-qualified hardware in this setup.

  6. Ames life science telescience testbed evaluation

    Science.gov (United States)

    Haines, Richard F.; Johnson, Vicki; Vogelsong, Kristofer H.; Froloff, Walt

    1989-01-01

    Eight surrogate spaceflight mission specialists participated in a real-time evaluation of remote coaching using the Ames Life Science Telescience Testbed facility. This facility consisted of three remotely located nodes: (1) a prototype Space Station glovebox; (2) a ground control station; and (3) a principal investigator's (PI) work area. The major objective of this project was to evaluate the effectiveness of telescience techniques and hardware to support three realistic remote coaching science procedures: plant seed germinator charging, plant sample acquisition and preservation, and remote plant observation with ground coaching. Each scenario was performed by a subject acting as flight mission specialist, interacting with a payload operations manager and a principal investigator expert. All three groups were physically isolated from each other yet linked by duplex audio and color video communication channels and networked computer workstations. Workload ratings were made by the flight and ground crewpersons immediately after completing their assigned tasks. Time to complete each scientific procedural step was recorded automatically. Two expert observers also made performance ratings and various error assessments. The results are presented and discussed.

  7. The Impact of Automation Reliability and Operator Fatigue on Performance and Reliance

    Science.gov (United States)

    2016-09-23

    Cummings et al., 2007). Automation designed to assist operators in overload situations may promote operator disengagement during periods of low...Calhoun et al., 2011). This testbed offers several tasks designed to emulate the cognitive demands that an operator managing multiple UAVs is likely...reliable (Cronbach’s α = 0.94) measure of affective and cognitive components of trust in automation. Items gauge confidence in an automation and

  8. Development of a Tethered Formation Flight Testbed for ISS, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The development of a testbed for the development and demonstration of technologies needed by tethered formation flying satellites is proposed. Such a testbed would...

  9. The Living With a Star Space Environment Testbed Program

    Science.gov (United States)

    Barth, Janet; LaBel, Kenneth; Day, John H. (Technical Monitor)

    2001-01-01

    NASA has initiated the Living with a Star (LWS) Program to develop the scientific understanding needed to address the aspects of the Connected Sun-Earth system that affect life and society. The Program Architecture includes science missions, theory and modeling, and Space Environment Testbeds (SET). This current paper discusses the Space Environment Testbeds. The goal of the SET program is to improve the engineering approach to accommodate and/or mitigate the effects of solar variability on spacecraft design and operations. The SET Program will infuse new technologies into the space programs through the collection of data in space and the subsequent design and validation of technologies. Examples of these technologies are cited and discussed.

  10. Dr. Tulga Ersal at NSF Workshop Accessible Remote Testbeds ART'15

    Science.gov (United States)

    On November 12th, Dr. Tulga Ersal took part in the NSF Workshop on Accessible Remote Testbeds (ART'15) at Georgia Tech. From the event website: The rationale behind the ART'15 workshop is that remote-access testbeds could, if done right, significantly change how

  11. Smart Antenna UKM Testbed for Digital Beamforming System

    Directory of Open Access Journals (Sweden)

    2009-03-01

    Full Text Available A new design of smart antenna testbed developed at UKM for digital beamforming purposes is proposed. The smart antenna UKM testbed is developed based on a modular design employing two novel designs: an L-probe fed inverted hybrid E-H (LIEH) array antenna and a software-reconfigurable digital beamforming system (DBS). The antenna is developed using the novel LIEH microstrip patch element design arranged into a 4×1 uniform linear array antenna. The modular concept of the system provides the capability to test the antenna hardware, beamforming unit, and beamforming algorithm in an independent manner, thus allowing the smart antenna system to be developed and tested in parallel, hence reducing the design time. The DBS was developed using a high-performance TMS320C6711TM floating-point DSP board and a 4-channel RF front-end receiver developed in-house. An interface board is designed to interface the ADC board with the RF front-end receiver. A four-element receiving array testbed at 1.88–2.22 GHz frequency is constructed, and digital beamforming on this testbed is successfully demonstrated.

  12. ASE-BAN, a Wireless Body Area Network Testbed

    DEFF Research Database (Denmark)

    Madsen, Jens Kargaard; Karstoft, Henrik; Toftegaard, Thomas Skjødeberg

    2010-01-01

    /actuators attached to the body and a host server application. The gateway uses the BlackFin BF533 processor from Analog Devices, and uses Bluetooth for wireless communication. Two types of sensors are attached to the network: an electro-cardio-gram sensor and an oximeter sensor. The testbed has been successfully...

  13. Towards a Perpetual Sensor Network Testbed without Backchannel

    DEFF Research Database (Denmark)

    Johansen, Aslak; Bonnet, Philippe; Sørensen, Thomas

    2012-01-01

    The sensor network testbeds available today rely on a communication channel different from the mote radio - a backchannel - to facilitate mote reprogramming, health monitoring and performance analysis. Such backchannels are either supported as wired communication channels (USB or Ethernet), or vi...

  14. Torpedo and countermeasures modelling in the Torpedo Defence System Testbed

    NARCIS (Netherlands)

    Benders, F.P.A.; Witberg, R.R.; H.J. Grootendorst, H.J.

    2002-01-01

    Several years ago, TNO-FEL started the development of the Torpedo Defence System Testbed (TDSTB) based on the TORpedo SIMulation (TORSIM) model and the Maritime Operations Simulation and Evaluation System (MOSES). MOSES provides the simulation and modelling environment for the evaluation and

  15. Operation Duties on the F-15B Research Testbed

    Science.gov (United States)

    Truong, Samson S.

    2010-01-01

    This presentation entails what I have done this past summer for my Co-op tour in the Operations Engineering Branch. Activities included supporting the F-15B Research Testbed, supporting the incoming F-15D models, design work, and other operations engineering duties.

  16. An Intelligent Archive Testbed Incorporating Data Mining

    Science.gov (United States)

    Ramapriyan, H.; Isaac, D.; Yang, W.; Bonnlander, B.; Danks, D.

    2009-01-01

    interoperability, and being able to convert data to information and usable knowledge in an efficient, convenient manner, aided significantly by automation (Ramapriyan et al. 2004; NASA 2005). We can look upon the distributed provider environment with capabilities to convert data to information and to knowledge as an Intelligent Archive in the Context of a Knowledge Building system (IA-KBS). Some of the key capabilities of an IA-KBS are: Virtual Product Generation, Significant Event Detection, Automated Data Quality Assessment, Large-Scale Data Mining, Dynamic Feedback Loop, and Data Discovery and Efficient Requesting (Ramapriyan et al. 2004).

  17. Development of a Testbed for Wireless Underground Sensor Networks

    Directory of Open Access Journals (Sweden)

    Mehmet C. Vuran

    2010-01-01

    Full Text Available Wireless Underground Sensor Networks (WUSNs) constitute one of the promising application areas of the recently developed wireless sensor networking techniques. A WUSN is a specialized kind of Wireless Sensor Network (WSN) that mainly focuses on the use of sensors that communicate through soil. Recent models for the wireless underground communication channel have been proposed, but few field experiments have been realized to verify the accuracy of these models. The realization of field WUSN experiments proved to be extremely complex and time-consuming in comparison with the traditional wireless environment. To the best of our knowledge, this is the first work that proposes guidelines for the development of an outdoor WUSN testbed with the goals of improving the accuracy of, and reducing the time required for, WUSN experiments. Although the work mainly targets WUSNs, many of the presented practices can also be applied to generic WSN testbeds.

  18. A MIMO-OFDM Testbed for Wireless Local Area Networks

    Directory of Open Access Journals (Sweden)

    Conrat Jean-Marc

    2006-01-01

    Full Text Available We describe the design steps and final implementation of a MIMO OFDM prototype platform developed to enhance the performance of wireless LAN standards such as HiperLAN/2 and 802.11, using multiple transmit and multiple receive antennas. We first describe the channel measurement campaign used to characterize the indoor operational propagation environment, and analyze the influence of the channel on code design through a ray-tracing channel simulator. We also comment on some antenna and RF issues which are of importance for the final realization of the testbed. Multiple coding, decoding, and channel estimation strategies are discussed and their respective performance-complexity trade-offs are evaluated over the realistic channel obtained from the propagation studies. Finally, we present the design methodology, including cross-validation of the Matlab, C++, and VHDL components, and the final demonstrator architecture. We highlight the increased measured performance of the MIMO testbed over the single-antenna system.

  19. Design and Prototyping of a Satellite Antenna Slew Testbed

    Science.gov (United States)

    2013-12-01

    The position and velocity results of the computed trajectory were then implemented on the testbed motors for comparison of actual versus commanded values.

  20. A technical description of the FlexHouse Project Testbed

    DEFF Research Database (Denmark)

    Sørensen, Jens Otto

    2000-01-01

    This paper describes the FlexHouse project testbed; a server dedicated to experiments within the FlexHouse project. The FlexHouse project is a project originating from The Business Computing Research Group at The Aarhus School of Business. The purpose of the project is to identify and develop...... methods that satisfy the following three requirements. Flexibility with respect to evolving data sources. Flexibility with respect to change of information needs. Efficiency with respect to view management....

  1. Testbed for a LiFi system integrated in streetlights

    OpenAIRE

    Monzón Baeza, Victor; Sánchez Fernández, Matilde Pilar; García-Armada, Ana; Royo, A.

    2015-01-01

    Proceeding at: 2015 European Conference on Networks and Communications (EuCNC) took place June 29 - July 2 in Paris, France. In this paper, a functional LiFi real-time testbed implemented on FPGAs is presented. The setup evaluates the performance of our design in a downlink scenario where the transmitter is embedded on the streetlights and a mobile phone’s camera is used as receiver, therefore achieving the goal of lighting and communicating simultaneously. To validate the ...

  2. Development and experimentation of an eye/brain/task testbed

    Science.gov (United States)

    Harrington, Nora; Villarreal, James

    1987-01-01

    The principal objective is to develop a laboratory testbed that will provide a unique capability to elicit, control, record, and analyze the relationship of operator task loading, operator eye movement, and operator brain wave data in a computer system environment. The ramifications of an integrated eye/brain monitor for the man-machine interface are staggering. The success of such a system would benefit users in space and defense, paraplegics, and those monitoring boring screens (nuclear power plants, air defense, etc.)

  3. Home Automation

    OpenAIRE

    Ahmed, Zeeshan

    2010-01-01

    In this paper I briefly discuss the importance of home automation systems. Going into the details, I briefly present a real-time, software- and hardware-oriented house automation research project that was designed and implemented, capable of automating a house's electricity and providing a security system to detect the presence of unexpected behavior.

  4. Visible nulling coronagraphy testbed development for exoplanet detection

    Science.gov (United States)

    Lyon, Richard G.; Clampin, Mark; Woodruff, Robert A.; Vasudevan, Gopal; Thompson, Patrick; Chen, Andrew; Petrone, Peter; Booth, Andrew; Madison, Timothy; Bolcar, Matthew; Noecker, M. Charley; Kendrick, Stephen; Melnick, Gary; Tolls, Volker

    2010-07-01

    Three of the recently completed NASA Astrophysics Strategic Mission Concept (ASMC) studies addressed the feasibility of using a Visible Nulling Coronagraph (VNC) as the prime instrument for exoplanet science. The VNC approach is one of the few approaches that works with filled, segmented, and sparse or diluted aperture telescope systems and thus spans the space of potential ASMC exoplanet missions. NASA/Goddard Space Flight Center (GSFC) has a well-established effort to develop VNC technologies and has developed an incremental sequence of VNC testbeds to advance this approach and the technologies associated with it. Herein we report on the continued development of the vacuum Visible Nulling Coronagraph testbed (VNT). The VNT is an ultra-stable, vibration-isolated testbed that operates under high-bandwidth closed-loop control within a vacuum chamber. It will be used to achieve an incremental sequence of three visible-light nulling milestones of sequentially higher contrasts of 10^8, 10^9 and 10^10 at an inner working angle of 2λ/D, ultimately culminating in spectrally broadband (>20%) high-contrast imaging. Each of the milestones, one per year, is traceable to one or more of the ASMC studies. The VNT uses a Mach-Zehnder nulling interferometer, modified with a "W" configuration to accommodate a hex-packed MEMS-based deformable mirror, a coherent fiber bundle and achromatic phase shifters. Discussed will be the optical configuration, laboratory results, critical technologies and the null sensing and control approach.

  5. Easy as Pi: A Network Coding Raspberry Pi Testbed

    Directory of Open Access Journals (Sweden)

    Chres W. Sørensen

    2016-10-01

    Full Text Available In the near future, upcoming communications and storage networks are expected to cope with major difficulties produced by the huge amounts of data being generated from the Internet of Things (IoT). For these types of networks, strategies and mechanisms based on network coding have appeared as an alternative to overcome these difficulties in a holistic manner, e.g., without sacrificing the benefit of a given network metric when improving another. There have been recurrent issues with: (i) making large-scale deployments akin to the Internet of Things; (ii) assessing and (iii) replicating the results obtained in preliminary studies. Therefore, testbeds that can deal with large-scale deployments and not lose historic data in order to evaluate these mechanisms are greatly needed and desirable from a research perspective. However, this can be hard to manage, not only due to the inherent costs of the hardware, but also due to maintenance challenges. In this paper, we present the key steps required to design, set up and maintain an inexpensive testbed using Raspberry Pi devices for communications and storage networks with network coding capabilities. This testbed can be utilized for any application requiring replicable results.
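
    The mechanisms being evaluated are based on network coding, whose core operation is forming random linear combinations of source packets. A minimal Python illustration over GF(2), i.e. plain XOR combinations, is given below; the paper's implementation and field size may well differ, so this is only a conceptual sketch.

        import random

        def rlnc_encode_gf2(packets, n_coded):
            """Produce n_coded random XOR combinations of equally sized source packets."""
            coded = []
            for _ in range(n_coded):
                coeffs = [random.randint(0, 1) for _ in packets]
                if not any(coeffs):             # avoid the useless all-zero combination
                    coeffs[0] = 1
                combo = bytes(len(packets[0]))  # zero-filled buffer
                for c, p in zip(coeffs, packets):
                    if c:
                        combo = bytes(a ^ b for a, b in zip(combo, p))
                coded.append((coeffs, combo))
            return coded

        src = [b"ABCD", b"EFGH", b"IJKL"]       # three equally sized source packets
        for coeffs, payload in rlnc_encode_gf2(src, 4):
            print(coeffs, payload)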

  6. CanOpen on RASTA: The Integration of the CanOpen IP Core in the Avionics Testbed

    Science.gov (United States)

    Furano, Gianluca; Guettache, Farid; Magistrati, Giorgio; Tiotto, Gabriele; Ortega, Carlos Urbina; Valverde, Alberto

    2013-08-01

    This paper presents the work done within the ESA ESTEC Data Systems Division, targeting the integration of the CanOpen IP Core with the existing Reference Architecture Test-bed for Avionics (RASTA). RASTA is the reference testbed system of the ESA Avionics Lab, designed to integrate the main elements of a typical data handling system. It aims at simulating a scenario where a Mission Control Center communicates with on-board computers and systems through a TM/TC link, thus providing data management through qualified processors and interfaces such as Leon2 core processors, CAN bus controllers, MIL-STD-1553 and SpaceWire. This activity aims at extending RASTA with two boards equipped with the HurriCANe controller, acting as CANopen slaves. CANopen software modules have been ported to the RASTA system I/O boards equipped with the Gaisler GR-CAN controller, which acts as the master communicating with the CCIPC boards. CANopen serves as the upper application layer for CAN-based systems, is defined within the CAN-in-Automation standard, and can be regarded as the definitive standard for the implementation of CAN-based system solutions. The development and integration of CCIPC, performed by SITAEL S.p.A., is the first application that aims to bring the CANopen standard to space applications. The definition of CANopen within the European Cooperation for Space Standardization (ECSS) is under development.

  7. Development of Liquid Propulsion Systems Testbed at MSFC

    Science.gov (United States)

    Alexander, Reginald; Nelson, Graham

    2016-01-01

    As NASA, the Department of Defense and the aerospace industry in general strive to develop capabilities to explore near-Earth, Cis-lunar and deep space, the need to create more cost effective techniques of propulsion system design, manufacturing and test is imperative in the current budget constrained environment. The physics of space exploration have not changed, but the manner in which systems are developed and certified needs to change if there is going to be any hope of designing and building the high performance liquid propulsion systems necessary to deliver crew and cargo to the further reaches of space. To further the objective of developing these systems, the Marshall Space Flight Center is currently in the process of formulating a Liquid Propulsion Systems testbed, which will enable rapid integration of components to be tested and assessed for performance in integrated systems. The manifestation of this testbed is a breadboard engine configuration (BBE) with facility support for consumables and/or other components as needed. The goal of the facility is to test NASA developed elements, but can be used to test articles developed by other government agencies, industry or academia. Joint government/private partnership is likely the approach that will be required to enable efficient propulsion system development. MSFC has recently tested its own additively manufactured liquid hydrogen pump, injector, and valves in a BBE hot firing. It is rapidly building toward testing the pump and a new CH4 injector in the BBE configuration to demonstrate a 22,000 lbf, pump-fed LO2/LCH4 engine for the Mars lander or in-space transportation. The value of having this BBE testbed is that as components are developed they may be easily integrated in the testbed and tested. MSFC is striving to enhance its liquid propulsion system development capability. Rapid design, analysis, build and test will be critical to fielding the next high thrust rocket engine. With the maturity of the

  8. Mounted Smartphones as Measurement and Control Platforms for Motor-Based Laboratory Test-Beds

    OpenAIRE

    Jared A. Frank; Anthony Brill; Vikram Kapila

    2016-01-01

    Laboratory education in science and engineering often entails the use of test-beds equipped with costly peripherals for sensing, acquisition, storage, processing, and control of physical behavior. However, costly peripherals are no longer necessary to obtain precise measurements and achieve stable feedback control of test-beds. With smartphones performing diverse sensing and processing tasks, this study examines the feasibility of mounting smartphones directly to test-beds to exploit their em...

  9. Analysis, Modeling, and Simulation (AMS) Testbed Development and Evaluation to Support Dynamic Mobility Applications (DMA) and Active Transportation and Demand Management (ATDM) Programs - calibration Report for Phoenix Testbed : Final Report. [supporting datasets - Phoenix Testbed

    Science.gov (United States)

    2017-07-26

    The datasets in this zip file are in support of FHWA-JPO-16-379, Analysis, Modeling, and Simulation (AMS) Testbed Development and Evaluation to Support Dynamic Mobility Applications (DMA) and Active Transportation and Demand Management (ATDM) Program...

  10. COLUMBUS as Engineering Testbed for Communications and Multimedia Equipment

    Science.gov (United States)

    Bank, C.; Anspach von Broecker, G. O.; Kolloge, H.-G.; Richters, M.; Rauer, D.; Urban, G.; Canovai, G.; Oesterle, E.

    2002-01-01

    The paper presents ongoing activities to prepare COLUMBUS for communications and multimedia technology experiments. For this purpose, Astrium SI, Bremen, has studied several options for how best to combine the given system architecture with flexible and state-of-the-art interface avionics and software. These activities have been conducted in coordination with, and partially under contract to, DLR and ESA/ESTEC. Moreover, Astrium SI has realized three testbeds for multimedia software and hardware testing under its own funding. The experimental core avionics unit - about a half double rack - establishes the core of a new multi-user experiment facility for this type of investigation onboard COLUMBUS, which shall be available to all users of COLUMBUS. It allows for the connection of 2nd-generation payloads, that is, payloads requiring broadband data transfer and near-real-time access by the Principal Investigator on the ground, to test highly interactive and near-real-time payload operation. The facility is also foreseen to test new equipment to provide the astronauts onboard the ISS/COLUMBUS with bi-directional hi-fi voice and video connectivity to ground, private voice coms and e-mail, and a multimedia workstation for ops training and recreation. Connection to an appropriate Wide Area Network (WAN) on Earth is possible. The facility will include a broadband data transmission front-end terminal, which is mounted externally on the COLUMBUS module. This equipment provides high flexibility due to the completely transparent transmit and receive chains, the steerable multi-frequency antenna system and its own thermal and power control and distribution. The equipment is monitored and controlled via the COLUMBUS internal facility. It combines several new hardware items, which are newly developed for the next generation of broadband communication satellites, and operates in Ka-band with the experimental ESA data relay satellite ARTEMIS. The equipment is also TDRSS compatible; the open loop

  11. EMERGE - ESnet/MREN Regional Science Grid Experimental NGI Testbed

    Energy Technology Data Exchange (ETDEWEB)

    Mambretti, Joe; DeFanti, Tom; Brown, Maxine

    2001-07-31

    This document is the final report on the EMERGE Science Grid testbed research project from the perspective of the International Center for Advanced Internet Research (iCAIR) at Northwestern University, which was a subcontractor to this UIC project. This report is a compilation of information gathered from a variety of materials related to this project produced by multiple EMERGE participants, especially those at Electronic Visualization Lab (EVL) at the University of Illinois at Chicago (UIC), Argonne National Lab and iCAIR. The EMERGE Science Grid project was managed by Tom DeFanti, PI from EVL at UIC.

  12. The Living With a Star Space Environment Testbed Experiments

    Science.gov (United States)

    Xapsos, Michael A.

    2014-01-01

    The focus of the Living With a Star (LWS) Space Environment Testbed (SET) program is to improve the performance of hardware in the space radiation environment. The program has developed a payload for the Air Force Research Laboratory (AFRL) Demonstration and Science Experiments (DSX) spacecraft that is scheduled for launch in August 2015 on the SpaceX Falcon Heavy rocket. The primary structure of DSX is an Evolved Expendable Launch Vehicle (EELV) Secondary Payload Adapter (ESPA) ring. DSX will be in a Medium Earth Orbit (MEO). This oral presentation will describe the SET payload.

  13. SCaN Testbed Software Development and Lessons Learned

    Science.gov (United States)

    Kacpura, Thomas J.; Varga, Denise M.

    2012-01-01

    National Aeronautics and Space Administration (NASA) has developed an on-orbit, adaptable, Software Defined Radio (SDR) Space Telecommunications Radio System (STRS)-based testbed facility to conduct a suite of experiments to advance technologies, reduce risk, and enable future mission capabilities on the International Space Station (ISS). The SCAN Testbed Project will provide NASA, industry, other Government agencies, and academic partners the opportunity to develop and field communications, navigation, and networking technologies in the laboratory and space environment based on reconfigurable SDR platforms and the STRS Architecture. The SDRs are a new technology for NASA, and the support infrastructure they require is different from legacy, fixed-function radios. SDRs offer the ability to reconfigure on-orbit communications by changing software for new waveforms and operating systems to enable new capabilities or fix any anomalies, which was not previously an option. They are not stand-alone devices, but require a new approach to effectively control them and flow data. This requires extensive software to be developed to utilize the full potential of these reconfigurable platforms. The paper focuses on development, integration and testing as related to the avionics processor system, and the software required to command, control, monitor, and interact with the SDRs, as well as the other communication payload elements. An extensive effort was required to develop the flight software and meet the NASA requirements for software quality and safety. The flight avionics must be radiation tolerant, and these processors have limited capability in comparison to terrestrial counterparts. A big challenge was that there are three SDRs onboard, and interfacing with multiple SDRs simultaneously complicated the effort. The effort also includes ground software, which is a key element for both the command of the payload and the display of data created by the payload. The verification of

  14. Software Testbed for Developing and Evaluating Integrated Autonomous Subsystems

    Science.gov (United States)

    Ong, James; Remolina, Emilio; Prompt, Axel; Robinson, Peter; Sweet, Adam; Nishikawa, David

    2015-01-01

    To implement fault-tolerant autonomy in future space systems, it will be necessary to integrate planning, adaptive control, and state estimation subsystems. However, integrating these subsystems is difficult, time-consuming, and error-prone. This paper describes Intelliface/ADAPT, a software testbed that helps researchers develop and test alternative strategies for integrating planning, execution, and diagnosis subsystems more quickly and easily. The testbed's architecture, graphical data displays, and implementations of the integrated subsystems support easy plug and play of alternate components to support research and development in fault-tolerant control of autonomous vehicles and operations support systems. Intelliface/ADAPT controls NASA's Advanced Diagnostics and Prognostics Testbed (ADAPT), which comprises batteries, electrical loads (fans, pumps, and lights), relays, circuit breakers, inverters, and sensors. During plan execution, an experimenter can inject faults into the ADAPT testbed by tripping circuit breakers, changing fan speed settings, and closing valves to restrict fluid flow. The diagnostic subsystem, based on NASA's Hybrid Diagnosis Engine (HyDE), detects and isolates these faults to determine the new state of the plant, ADAPT. Intelliface/ADAPT then updates its model of the ADAPT system's resources and determines whether the current plan can be executed using the reduced resources. If not, the planning subsystem generates a new plan that reschedules tasks, reconfigures ADAPT, and reassigns the use of ADAPT resources as needed to work around the fault. The resource model, planning domain model, and planning goals are expressed using NASA's Action Notation Modeling Language (ANML). Parts of the ANML model are generated automatically, and other parts are constructed by hand using the Planning Model Integrated Development Environment, a visual Eclipse-based IDE that accelerates ANML model development. Because native ANML planners are currently

  15. The Living With a Star Program Space Environment Testbed

    Science.gov (United States)

    Barth, Janet; Day, John H. (Technical Monitor)

    2001-01-01

    This viewgraph presentation describes the objective, approach, and scope of the Living With a Star (LWS) program at the Marshall Space Flight Center. Scientists involved in the project seek to refine the understanding of space weather and the role of solar variability in terrestrial climate change. Research and the development of improved analytic methods have led to increased predictive capabilities and the improvement of environment specification models. Specifically, the Space Environment Testbed (SET) project of LWS is responsible for the implementation of improved engineering approaches to observing solar effects on climate change. This responsibility includes technology development, ground test protocol development, and the development of a technology application model/engineering tool.

  16. Smart Grid: Network simulator for smart grid test-bed

    International Nuclear Information System (INIS)

    Lai, L C; Ong, H S; Che, Y X; Do, N Q; Ong, X J

    2013-01-01

    As the Smart Grid becomes more popular, a smaller-scale smart grid test-bed has been set up at UNITEN to investigate its performance and to identify future enhancements of the smart grid in Malaysia. The fundamental requirement in this project is to design a network with low delay, no packet drops, and a high data rate. Different types of traffic have their own characteristics and are suitable for different types of networks and requirements. However, the nature of the traffic in a smart grid is not yet well understood. This paper presents a comparison between different types of traffic to find the most suitable traffic type for optimal network performance.

  17. Analysis, modeling, and simulation (AMS) testbed development and evaluation to support dynamic mobility applications (DMA) and active transportation and demand management (ATDM) programs : Dallas testbed analysis plan.

    Science.gov (United States)

    2016-06-16

    The primary objective of this project is to develop multiple simulation Testbeds/transportation models to evaluate the impacts of DMA connected vehicle applications and the active and dynamic transportation management (ATDM) strategies. The outputs (mo...

  18. Growth plan for an inspirational test-bed of smart textile services

    NARCIS (Netherlands)

    Wensveen, S.A.G.; Tomico, O.; Bhomer, ten M.; Kuusk, K.

    2015-01-01

    In this pictorial we visualize the growth plan for an inspirational test-bed of smart textile product service systems. The goal of the test-bed is to inspire and inform the Dutch creative industries of textile, interaction and service design to combine their strengths and share opportunities. The

  19. Development of a smart-antenna test-bed, demonstrating software defined digital beamforming

    NARCIS (Netherlands)

    Kluwer, T.; Slump, Cornelis H.; Schiphorst, Roelof; Hoeksema, F.W.

    2001-01-01

    This paper describes a smart-antenna test-bed consisting of commercial off-the-shelf (COTS) hardware and software defined radio components. The use of software radio components enables a flexible platform to implement and test mobile communication systems as a real-world system. The test-bed is

  20. Modular Algorithm Testbed Suite (MATS): A Software Framework for Automatic Target Recognition

    Science.gov (United States)

    2017-01-01

    Technical report NSWC PCD TR-2017-004, Naval Surface Warfare Center Panama City Division, Panama City, FL 32407-7001, 31-01-2017: Modular Algorithm Testbed Suite (MATS): A Software Framework for Automatic Target Recognition. A flexible platform to facilitate the development and testing of ATR algorithms. To that end, NSWC PCD has created the Modular Algorithm Testbed Suite

  1. Context-aware local Intrusion Detection in SCADA systems : a testbed and two showcases

    NARCIS (Netherlands)

    Chromik, Justyna Joanna; Haverkort, Boudewijn R.H.M.; Remke, Anne Katharina Ingrid; Pilch, Carina; Brackmann, Pascal; Duhme, Christof; Everinghoff, Franziska; Giberlein, Artur; Teodorowicz, Thomas; Wieland, Julian

    2017-01-01

    This paper illustrates the use of a testbed that we have developed for context-aware local intrusion detection. This testbed is based on the co-simulation framework Mosaik and allows for the validation of local intrusion detection mechanisms at field stations in power distribution networks. For two

  2. Design of aircraft cabin testbed for stress free air travel experiment

    NARCIS (Netherlands)

    Tan, C.F.; Chen, W.; Rauterberg, G.W.M.

    2009-01-01

    The paper presents an aircraft cabin testbed that is designed and built for the stress-free air travel experiment. The project is funded by the European Union with the aim of improving air travel comfort during long-haul flights. The testbed is used to test and validate the adaptive system that is capable

  3. User interface design principles for the SSM/PMAD automated power system

    Science.gov (United States)

    Jakstas, Laura M.; Myers, Chris J.

    1991-01-01

    Martin Marietta has developed a user interface for the space station module power management and distribution (SSM/PMAD) automated power system testbed which provides human access to the functionality of the power system, as well as exemplifying current techniques in user interface design. The testbed user interface was designed to enable an engineer to operate the system easily without having significant knowledge of computer systems, as well as provide an environment in which the engineer can monitor and interact with the SSM/PMAD system hardware. The design of the interface supports a global view of the most important data from the various hardware and software components, as well as enabling the user to obtain additional or more detailed data when needed. The components and representations of the SSM/PMAD testbed user interface are examined. An engineer's interactions with the system are also described.

  4. The University of Canberra quantum key distribution testbed

    International Nuclear Information System (INIS)

    Ganeshkumar, G.; Edwards, P.J.; Cheung, W.N.; Barbopoulos, L.O.; Pham, H.; Hazel, J.C.

    1999-01-01

    Full text: We describe the design, operation and preliminary results obtained from a quantum key distribution (QKD) testbed constructed at the University of Canberra. Quantum cryptographic systems use shared secret keys exchanged in the form of sequences of polarisation-coded or phase-encoded single photons transmitted over an optical communications channel. Secrecy of this quantum key rests upon fundamental laws of quantum physics: measurements of linear or circular photon polarisation states introduce noise into the conjugate variable and so reveal eavesdropping. In its initial realisation reported here, pulsed light from a 650 nm laser diode is attenuated by a factor of 10^6, plane-polarised and then transmitted through a birefringent liquid crystal modulator (LCM) to a polarisation-sensitive single-photon receiver. The transmitted key sequence consists of a 1 kHz train of weak coherent 100 ns wide light pulses, polarisation-coded according to the BB84 protocol. Each pulse is randomly assigned one of four polarisation states (two orthogonal linear and two orthogonal circular) by a computer (PCA) operated by the sender ('Alice'). This quaternary polarisation-shift-keyed photon stream is detected by the receiver ('Bob'), whose computer (PCB) randomly chooses either a linear or a circular polarisation basis. Computer PCB is also used for final key selection, authentication, privacy amplification and eavesdropping detection. We briefly discuss the realisation of a mesoscopic single-photon QKD source and the use of the testbed to simulate a global quantum key distribution system using Earth satellites. Copyright (1999) Australian Optical Society
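
    The record above sketches the BB84 exchange: random basis choices on both ends followed by public sifting of the matching bases. As a rough illustration of that sifting logic only (this is not the Canberra testbed's software, and every name below is hypothetical), a minimal simulation could look like:

```python
import secrets

def bb84_sift(n_pulses=1000):
    """Toy BB84 key sifting over an ideal, eavesdropper-free channel."""
    # Alice picks a random bit and a random basis (0 = linear, 1 = circular) per pulse.
    alice_bits = [secrets.randbits(1) for _ in range(n_pulses)]
    alice_bases = [secrets.randbits(1) for _ in range(n_pulses)]
    # Bob independently picks a measurement basis per pulse.
    bob_bases = [secrets.randbits(1) for _ in range(n_pulses)]
    # Ideal detection: a matching basis reproduces Alice's bit, a mismatched
    # basis (the conjugate variable) gives a random result.
    bob_bits = [a if ab == bb else secrets.randbits(1)
                for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)]
    # Sifting: both sides publicly announce bases and keep only matching pulses.
    sifted = [(a, b) for a, ab, bb, b in zip(alice_bits, alice_bases, bob_bases, bob_bits)
              if ab == bb]
    alice_key = [a for a, _ in sifted]
    bob_key = [b for _, b in sifted]
    # On this ideal channel the keys agree; a real system compares a sample to
    # estimate the error rate (eavesdropping check) before privacy amplification.
    assert alice_key == bob_key
    return alice_key

if __name__ == "__main__":
    print(len(bb84_sift()), "sifted key bits from 1000 pulses")
```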

  5. Distributed computing testbed for a remote experimental environment

    International Nuclear Information System (INIS)

    Butner, D.N.; Casper, T.A.; Howard, B.C.; Henline, P.A.; Davis, S.L.; Barnes, D.

    1995-01-01

    Collaboration is increasing as physics research becomes concentrated on a few large, expensive facilities, particularly in magnetic fusion energy research, with national and international participation. These facilities are designed for steady state operation and interactive, real-time experimentation. We are developing tools to provide for the establishment of geographically distant centers for interactive operations; such centers would allow scientists to participate in experiments from their home institutions. A testbed is being developed for a Remote Experimental Environment (REE), a ''Collaboratory.'' The testbed will be used to evaluate the ability of a remotely located group of scientists to conduct research on the DIII-D Tokamak at General Atomics. The REE will serve as a testing environment for advanced control and collaboration concepts applicable to future experiments. Process-to-process communications over high speed wide area networks provide real-time synchronization and exchange of data among multiple computer networks, while the ability to conduct research is enhanced by adding audio/video communication capabilities. The Open Software Foundation's Distributed Computing Environment is being used to test concepts in distributed control, security, naming, remote procedure calls and distributed file access using the Distributed File Services. We are exploring the technology and sociology of remotely participating in the operation of a large scale experimental facility

  6. A Battery Certification Testbed for Small Satellite Missions

    Science.gov (United States)

    Cameron, Zachary; Kulkarni, Chetan S.; Luna, Ali Guarneros; Goebel, Kai; Poll, Scott

    2015-01-01

    A battery pack consisting of standard cylindrical 18650 lithium-ion cells has been chosen for small satellite missions based on previous flight heritage and compliance with NASA battery safety requirements. However, for batteries that transit through the International Space Station (ISS), additional certification tests are required for individual cells as well as the battery packs. In this manuscript, we discuss the development of generalized testbeds for testing and certifying different types of batteries critical to small satellite missions. Test procedures developed and executed for this certification effort include: a detailed physical inspection before and after experiments; electrical cycling characterization at the cell and pack levels; battery-pack overcharge, over-discharge, external short testing; battery-pack vacuum leak and vibration testing. The overall goals of these certification procedures are to conform to requirements set forth by the agency and identify unique safety hazards. The testbeds, procedures, and experimental results are discussed for batteries chosen for small satellite missions to be launched from the ISS.

  7. An Overview of NASA's Subsonic Research Aircraft Testbed (SCRAT)

    Science.gov (United States)

    Baumann, Ethan; Hernandez, Joe; Ruhf, John C.

    2013-01-01

    National Aeronautics and Space Administration Dryden Flight Research Center acquired a Gulfstream III (GIII) aircraft to serve as a testbed for aeronautics flight research experiments. The aircraft is referred to as SCRAT, which stands for SubsoniC Research Aircraft Testbed. The aircraft's mission is to perform aeronautics research; more specifically raising the Technology Readiness Level (TRL) of advanced technologies through flight demonstrations and gathering high-quality research data suitable for verifying the technologies, and validating design and analysis tools. The SCRAT has the ability to conduct a range of flight research experiments throughout a transport class aircraft's flight envelope. Experiments ranging from flight-testing of a new aircraft system or sensor to those requiring structural and aerodynamic modifications to the aircraft can be accomplished. The aircraft has been modified to include an instrumentation system and sensors necessary to conduct flight research experiments along with a telemetry capability. An instrumentation power distribution system was installed to accommodate the instrumentation system and future experiments. An engineering simulation of the SCRAT has been developed to aid in integrating research experiments. A series of baseline aircraft characterization flights has been flown that gathered flight data to aid in developing and integrating future research experiments. This paper describes the SCRAT's research systems and capabilities.

  8. Development of optical packet and circuit integrated ring network testbed.

    Science.gov (United States)

    Furukawa, Hideaki; Harai, Hiroaki; Miyazawa, Takaya; Shinada, Satoshi; Kawasaki, Wataru; Wada, Naoya

    2011-12-12

    We developed novel integrated optical packet and circuit switch-node equipment. Compared with our previous equipment, a polarization-independent 4 × 4 semiconductor optical amplifier switch subsystem, gain-controlled optical amplifiers, and one 100 Gbps optical packet transponder and seven 10 Gbps optical path transponders with 10 Gigabit Ethernet (10GbE) client-interfaces were newly installed in the present system. The switch and amplifiers can provide more stable operation without equipment adjustments for the frequent polarization-rotations and dynamic packet-rate changes of optical packets. We constructed an optical packet and circuit integrated ring network testbed consisting of two switch nodes for accelerating network development, and we demonstrated 66 km fiber transmission and switching operation of multiplexed 14-wavelength 10 Gbps optical paths and 100 Gbps optical packets encapsulating 10GbE frames. Error-free transmission (in terms of frame error rate) was confirmed for optical packets of various packet lengths and packet rates, together with stable operation of the network testbed. In addition, 4K uncompressed video streaming over OPS links was successfully demonstrated. © 2011 Optical Society of America

  9. Automated tools and techniques for distributed Grid Software: Development of the testbed infrastructure

    OpenAIRE

    Aguado Sanchez, C; Di Meglio, A

    2007-01-01

    Grid technology is becoming more and more important as the new paradigm for sharing computational resources across different organizations in a secure way. The great powerfulness of this solution, requires the definition of a generic stack of services and protocols and this is the scope of the different Grid initiatives. As a result of international collaborations for its development, the Open Grid Forum created the Open Grid Services Architecture (OGSA) which aims to define the common set of...

  10. Automating testbed documentation and database access using World Wide Web (WWW) tools

    Science.gov (United States)

    Ames, Charles; Auernheimer, Brent; Lee, Young H.

    1994-01-01

    A method for providing uniform transparent access to disparate distributed information systems was demonstrated. A prototype testing interface was developed to access documentation and information using publicly available hypermedia tools. The prototype gives testers a uniform, platform-independent user interface to on-line documentation, user manuals, and mission-specific test and operations data. Mosaic was the common user interface, and HTML (Hypertext Markup Language) provided hypertext capability.

  11. Process automation

    International Nuclear Information System (INIS)

    Moser, D.R.

    1986-01-01

    Process automation technology has been pursued in the chemical processing industries and to a very limited extent in nuclear fuel reprocessing. Its effective use has been restricted in the past by the lack of diverse and reliable process instrumentation and the unavailability of sophisticated software designed for process control. The Integrated Equipment Test (IET) facility was developed by the Consolidated Fuel Reprocessing Program (CFRP) in part to demonstrate new concepts for control of advanced nuclear fuel reprocessing plants. A demonstration of fuel reprocessing equipment automation using advanced instrumentation and a modern, microprocessor-based control system is nearing completion in the facility. This facility provides for the synergistic testing of all chemical process features of a prototypical fuel reprocessing plant that can be attained with unirradiated uranium-bearing feed materials. The unique equipment and mission of the IET facility make it an ideal test bed for automation studies. This effort will provide for the demonstration of the plant automation concept and for the development of techniques for similar applications in a full-scale plant. A set of preliminary recommendations for implementing process automation has been compiled. Some of these concepts are not generally recognized or accepted. The automation work now under way in the IET facility should be useful to others in helping avoid costly mistakes because of the underutilization or misapplication of process automation. 6 figs

  12. Space Station technology testbed: 2010 deep space transport

    Science.gov (United States)

    Holt, Alan C.

    1993-01-01

    A space station in a crew-tended or permanently crewed configuration will provide major R&D opportunities for innovative technology and materials development and advanced space systems testing. A space station should be designed with the basic infrastructure elements required to grow into a major systems technology testbed. This space-based technology testbed can and should be used to support the development of technologies required to expand our utilization of near-Earth space, the Moon and the Earth-to-Jupiter region of the Solar System. Space station support of advanced technology and materials development will result in new techniques for high priority scientific research and the knowledge and R&D base needed for the development of major, new commercial product thrusts. To illustrate the technology testbed potential of a space station and to point the way to a bold, innovative approach to advanced space systems' development, a hypothetical deep space transport development and test plan is described. Key deep space transport R&D activities are described that would lead to the readiness certification of an advanced, reusable interplanetary transport capable of supporting eight crewmembers or more. With the support of a focused and highly motivated, multi-agency ground R&D program, a deep space transport of this type could be assembled and tested by 2010. Key R&D activities on a space station would include: (1) experimental research investigating the microgravity-assisted restructuring of micro-engineered materials (to develop and verify the in-space and in-situ 'tuning' of materials for use in debris and radiation shielding and other protective systems), (2) exposure of microengineered materials to the space environment for passive and operational performance tests (to develop in-situ maintenance and repair techniques and to support the development, enhancement, and implementation of protective systems, data and bio-processing systems, and virtual reality and

  13. An agent-oriented approach to automated mission operations

    Science.gov (United States)

    Truszkowski, Walt; Odubiyi, Jide

    1994-01-01

    As we plan for the next generation of Mission Operations Control Center (MOCC) systems, there are many opportunities for the increased utilization of innovative knowledge-based technologies. The innovative technology discussed is an advanced use of agent-oriented approaches to the automation of mission operations. The paper presents an overview of this technology and discusses applied operational scenarios currently being investigated and prototyped. A major focus of the current work is the development of a simple user mechanism that would empower operations staff members to create, in real time, software agents to assist them in common, labor intensive operations tasks. These operational tasks would include: handling routine data and information management functions; amplifying the capabilities of a spacecraft analyst/operator to rapidly identify, analyze, and correct spacecraft anomalies by correlating complex data/information sets and filtering error messages; improving routine monitoring and trend analysis by detecting common failure signatures; and serving as a sentinel for spacecraft changes during critical maneuvers enhancing the system's capabilities to support nonroutine operational conditions with minimum additional staff. An agent-based testbed is under development. This testbed will allow us to: (1) more clearly understand the intricacies of applying agent-based technology in support of the advanced automation of mission operations and (2) access the full set of benefits that can be realized by the proper application of agent-oriented technology in a mission operations environment. The testbed under development addresses some of the data management and report generation functions for the Explorer Platform (EP)/Extreme UltraViolet Explorer (EUVE) Flight Operations Team (FOT). We present an overview of agent-oriented technology and a detailed report on the operation's concept for the testbed.

  14. Distribution automation

    International Nuclear Information System (INIS)

    Gruenemeyer, D.

    1991-01-01

    This paper reports on how a Distribution Automation (DA) system enhances the efficiency and productivity of a utility. It also provides intangible benefits such as improved public image and market advantages. A utility should evaluate the benefits and costs of such a system before committing funds. The expenditure for distribution automation is economical when justified by the deferral of a capacity increase, a decrease in peak power demand, or a reduction in O and M requirements

  15. Testbed for High-Acuity Imaging and Stable Photometry

    Science.gov (United States)

    Gregory, James

    This proposal from MIT Lincoln Laboratory (LL) accompanies the NASA/APRA proposal entitled THAI-SPICE: Testbed for High-Acuity Imaging - Stable Photometry and Image-Motion Compensation Experiment (submitted by Eliot Young, Southwest Research Institute). The goal of the THAI-SPICE project is to demonstrate three technologies that will help low-cost balloon-borne telescopes achieve diffraction-limited imaging: stable pointing, passive thermal stabilization and in-flight monitoring of the wave front error. This MIT LL proposal supplies a key element of the pointing stabilization component of THAI-SPICE: an electronic camera based on an orthogonal-transfer charge-coupled device (OTCCD). OTCCD cameras have been demonstrated with charge-transfer efficiencies >0.99999, noise of 90%. In addition to supplying a camera with an OTCCD detector, MIT LL will help with integration and testing of the OTCCD with the THAI-SPICE payload’s guide camera.

  16. Designing, Implementing and Documenting the Atlas Networking Test-bed.

    CERN Document Server

    Martinsen, Hans Åge

    The A Toroidal LHC ApparatuS (Atlas) experiment at the Large Hadron Collider (LHC) at the European Organization for Nuclear Research (CERN) in Geneva is a production environment. To develop new architectures, test new equipment and evaluate new technologies, a well-supported test bench is needed. A new one is now being commissioned and I will take a leading role in its development, commissioning and operation. This thesis will cover the requirements, the implementation, the documentation and the approach to the different challenges in implementing the testbed. I will be joining the project in the early stages and will start by following the work that my colleagues are doing; then, as I gain a better understanding, more responsibility will be given to me. To be able to suggest and implement solutions I will have to understand what the requirements are and how to meet them with the given resources.

  17. Development of a Remotely Operated Vehicle Test-bed

    Directory of Open Access Journals (Sweden)

    Biao WANG

    2013-06-01

    Full Text Available This paper presents the development of a remotely operated vehicle (ROV), designed to serve as a convenient, cost-effective platform for research and experimental validation of hardware, sensors and control algorithms. Both the mechanical and control system designs are introduced. The vehicle, 0.65 m long and 0.45 m wide, has been designed with a frame structure that allows modification of mounted devices and thruster allocation. For the control system, STM32-based MCU boards specially designed for this project are used as the core processing boards, and open-source, modular, flexible software has been developed. Experiment results demonstrate the effectiveness of the test-bed.

  18. SABA: A Testbed for a Real-Time MIMO System

    Directory of Open Access Journals (Sweden)

    Brühl Lars

    2006-01-01

    Full Text Available The growing demand for high data rates in wireless communication systems drives the development of new technologies that increase channel capacity and thus the data rate. MIMO (multiple-input multiple-output) systems are well suited for these applications. In this paper, we present a MIMO test environment for high-data-rate transmissions in frequency-selective environments. An overview of the testbed is given, including the analyzed algorithms, the digital signal processing with a new highly parallel processor to perform the algorithms in real time, as well as the analog front-ends. A brief overview of the influence of polarization on the channel capacity is given as well.

  19. Segmented Aperture Interferometric Nulling Testbed (SAINT) II: component systems update

    Science.gov (United States)

    Hicks, Brian A.; Bolcar, Matthew R.; Helmbrecht, Michael A.; Petrone, Peter; Burke, Elliot; Corsetti, James; Dillon, Thomas; Lea, Andrew; Pellicori, Samuel; Sheets, Teresa; Shiri, Ron; Agolli, Jack; DeVries, John; Eberhardt, Andrew; McCabe, Tyler

    2017-09-01

    This work presents updates to the coronagraph and telescope components of the Segmented Aperture Interferometric Nulling Testbed (SAINT). The project pairs an actively-controlled macro-scale segmented mirror with the Visible Nulling Coronagraph (VNC) towards demonstrating capabilities for the future space observatories needed to directly detect and characterize a significant sample of Earth-sized worlds around nearby stars in the quest for identifying those which may be habitable and possibly harbor life. Efforts to improve the VNC wavefront control optics and mechanisms towards repeating narrowband results are described. A narrative is provided for the design of new optical components aimed at enabling broadband performance. Initial work with the hardware and software interface for controlling the segmented telescope mirror is also presented.

  20. Telescience testbed: Operational support functions for biomedical experiments

    Science.gov (United States)

    Yamashita, Masamichi; Watanabe, Satoru; Shoji, Takatoshi; Clarke, Andrew H.; Suzuki, Hiroyuki; Yanagihara, Dai

    A telescience testbed experiment was conducted to study the methodology of space biomedicine under the simulated constraints imposed on space experiments. The experiment selected for this testbedding involved elaborate animal surgery and electrophysiological measurements conducted by an onboard operator. The standing potential in the ampulla of the pigeon's semicircular canal was measured during gravitational and caloric stimulation. A principal investigator, isolated from the operation site, participated in the experiment interactively through telecommunication links. Reliability analysis was applied to all layers of the experimentation, including the design of experimental objectives and operational procedures. Engineering and technological aspects of telescience are discussed in terms of reliability to assure the quality of science. The feasibility of robotics was examined for supportive functions to reduce the workload of the onboard operator.

  1. Simulation to Flight Test for a UAV Controls Testbed

    Science.gov (United States)

    Motter, Mark A.; Logan, Michael J.; French, Michael L.; Guerreiro, Nelson M.

    2006-01-01

    The NASA Flying Controls Testbed (FLiC) is a relatively small and inexpensive unmanned aerial vehicle developed specifically to test highly experimental flight control approaches. The most recent version of the FLiC is configured with 16 independent aileron segments, supports the implementation of C-coded experimental controllers, and is capable of fully autonomous flight from takeoff roll to landing, including flight test maneuvers. The test vehicle is basically a modified Army target drone, AN/FQM-117B, developed as part of a collaboration between the Aviation Applied Technology Directorate (AATD) at Fort Eustis, Virginia and NASA Langley Research Center. Several vehicles have been constructed and collectively have flown over 600 successful test flights, including a fully autonomous demonstration at the Association of Unmanned Vehicle Systems International (AUVSI) UAV Demo 2005. Simulations based on wind tunnel data are being used to further develop advanced controllers for implementation and flight test.

  2. X-ray Pulsar Navigation Algorithms and Testbed for SEXTANT

    Science.gov (United States)

    Winternitz, Luke M. B.; Hasouneh, Monther A.; Mitchell, Jason W.; Valdez, Jennifer E.; Price, Samuel R.; Semper, Sean R.; Yu, Wayne H.; Ray, Paul S.; Wood, Kent S.; Arzoumanian, Zaven; hide

    2015-01-01

    The Station Explorer for X-ray Timing and Navigation Technology (SEXTANT) is a NASA-funded technology demonstration. SEXTANT will, for the first time, demonstrate real-time, on-board X-ray Pulsar-based Navigation (XNAV), a significant milestone in the quest to establish a GPS-like navigation capability available throughout our Solar System and beyond. This paper describes the basic design of the SEXTANT system with a focus on core models and algorithms, and the design and continued development of the GSFC X-ray Navigation Laboratory Testbed (GXLT) with its dynamic pulsar emulation capability. We also present early results from GXLT modeling of the combined NICER X-ray timing instrument hardware and SEXTANT flight software algorithms.
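
    One basic relation behind any X-ray pulsar navigation scheme is that a pulse-phase residual maps to a range offset along the pulsar's line of sight. The sketch below states only that relation; it is not the SEXTANT flight estimator, and the numbers are placeholders.

```python
# Basic X-ray pulsar navigation (XNAV) geometry only: a pulse-phase residual
# maps to a range offset along the pulsar's line of sight. This is not the
# SEXTANT flight estimator; the numbers below are placeholders.
C = 299_792_458.0  # speed of light, m/s

def range_correction(measured_phase, predicted_phase, pulse_period_s):
    """Line-of-sight offset (metres) implied by a pulse-phase residual.

    Phases are in cycles (0..1); pulse_period_s is the pulsar spin period.
    """
    # Wrap the residual into [-0.5, 0.5) cycles so the cycle ambiguity is explicit.
    dphi = (measured_phase - predicted_phase + 0.5) % 1.0 - 0.5
    dt = dphi * pulse_period_s  # timing residual in seconds
    return C * dt

# Example: a 2-microsecond residual on a 5 ms pulsar corresponds to roughly
# 600 m along that pulsar's direction.
print(range_correction(0.1302, 0.1298, 5e-3))
```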

  3. Vacuum Nuller Testbed Performance, Characterization and Null Control

    Science.gov (United States)

    Lyon, R. G.; Clampin, M.; Petrone, P.; Mallik, U.; Madison, T.; Bolcar, M.; Noecker, C.; Kendrick, S.; Helmbrecht, M. A.

    2011-01-01

    The Visible Nulling Coronagraph (VNC) can detect and characterize exoplanets with filled, segmented and sparse aperture telescopes, thereby spanning the choice of future internal coronagraph exoplanet missions. NASA/Goddard Space Flight Center (GSFC) has developed a Vacuum Nuller Testbed (VNT) to advance this approach, and to assess and advance technologies needed to realize a VNC as a flight instrument. The VNT is an ultra-stable testbed operating at 15 Hz in vacuum. It consists of a Mach-Zehnder nulling interferometer, modified with a "W" configuration to accommodate a hex-packed MEMS-based deformable mirror (DM), coherent fiber bundle and achromatic phase shifters. The two output channels are imaged with a vacuum photon counting camera and a conventional camera. Error-sensing and feedback to the DM and delay line with control algorithms are implemented in a real-time architecture. The inherent advantage of the VNC is that it is its own interferometer and directly controls its errors by exploiting images from bright and dark channels simultaneously. Conservation of energy requires the sum total of the photon counts be conserved independent of the VNC state. Thus sensing and control bandwidth is limited by the target star's throughput, with the net effect that the higher bandwidth offloads stressing stability tolerances within the telescope. We report our recent progress with the VNT towards achieving an incremental sequence of contrast milestones of 10^8, 10^9 and 10^10 respectively at inner working angles approaching 2λ/D. Discussed will be the optics, lab results, technologies, and null control. Shown will be evidence that the milestones have been achieved.

  4. High Precision Testbed to Evaluate Ethernet Performance for In-Car Networks

    DEFF Research Database (Denmark)

    Revsbech, Kasper; Madsen, Tatiana Kozlova; Schiøler, Henrik

    2012-01-01

    Validating safety-critical real-time systems such as in-car networks often involves a model-based performance analysis of the network. An important issue in performing such analysis is to provide precise model parameters matching the actual equipment. One way to obtain such parameters is to derive them by measurements of the equipment. In this work we describe the design of a testbed enabling active measurements on up to 1 Gb/s copper-based Ethernet switches. By use of the testbed itself, we conduct a series of tests in which the precision of the testbed is estimated. We find a maximum error...

  5. Designing a machinery control system (MCS) security testbed

    OpenAIRE

    Desso, Nathan H.

    2014-01-01

    Approved for public release; distribution is unlimited. Industrial control systems (ICS) face daily cyber security threats, can have a significant impact on the security of our nation, and present a difficult challenge to defend. Critical infrastructures, including military systems like the machinery control systems (MCS) found onboard modern U.S. warships, are affected because of their use of commercial automation solutions. The increase of automated control systems within the U.S. Navy sa...

  6. Virtual automation.

    Science.gov (United States)

    Casis, E; Garrido, A; Uranga, B; Vives, A; Zufiaurre, C

    2001-01-01

    Total laboratory automation (TLA) can be replaced in mid-size laboratories by computer-controlled sample workflow (virtual automation). Such a solution has been implemented in our laboratory using PSM, software developed in cooperation with Roche Diagnostics (Barcelona, Spain) for this purpose. This software is connected to the online analyzers and to the laboratory information system and is able to control and direct the samples, working as an intermediate station. The only difference from TLA is the replacement of transport belts by laboratory personnel. The implementation of this virtual automation system has allowed us to achieve the main advantages of TLA: a workload increase (64%) with a reduction in the cost per test (43%), a significant reduction in the number of biochemistry primary tubes (from 8 to 2), less aliquoting (from 600 to 100 samples/day), automation of functional testing, a drastic reduction of preanalytical errors (from 11.7 to 0.4% of the tubes) and better total response times for both inpatients (from up to 48 hours to up to 4 hours) and outpatients (from up to 10 days to up to 48 hours). As an additional advantage, virtual automation could be implemented without hardware investment and significant headcount reduction (15% in our lab).

  7. A Testbed For Validating the LHC Controls System Core Before Deployment

    CERN Document Server

    Nguyen Xuan, J

    2011-01-01

    Since the start-up of the LHC, it is crucial to carefully test core controls components before deploying them operationally. The Testbed of the CERN accelerator controls group was developed for this purpose. It contains different hardware (PPC, i386) running various operating systems (Linux and LynxOS) and core software components running on front-ends, communication middleware and client libraries. The Testbed first executes integration tests to verify that the components delivered by individual teams interoperate, and then system tests, which verify high-level, end-user functionality. It also verifies that different versions of components are compatible, which is vital, because not all parts of the operational LHC control system can be upgraded simultaneously. In addition, the Testbed can be used for performance and stress tests. Internally, the Testbed is driven by Atlassian Bamboo, a Continuous Integration server, which builds and deploys automatically new software versions into the Test...

  8. Construction of test-bed system of voltage management system to ...

    African Journals Online (AJOL)

    Construction of a test-bed system of a voltage management system (VMS) in order to apply it to a physical power system. Journal of Fundamental and Applied Sciences.

  9. Cooperative Search with Autonomous Vehicles in a 3D Aquatic Testbed

    Science.gov (United States)

    2012-01-01

    Cooperative Search with Autonomous Vehicles in a 3D Aquatic Testbed (Matthew Keeter, Daniel Moore, Ryan Muller, Eric Nieters, Jennifer...). Many applications for autonomous vehicles involve three-dimensional domains, notably aerial and aquatic environments. Such applications include mon...

  10. Closing the contrast gap between testbed and model prediction with WFIRST-CGI shaped pupil coronagraph

    Science.gov (United States)

    Zhou, Hanying; Nemati, Bijan; Krist, John; Cady, Eric; Prada, Camilo M.; Kern, Brian; Poberezhskiy, Ilya

    2016-07-01

    JPL has recently passed an important milestone in its technology development for a proposed NASA WFIRST mission coronagraph: demonstration of better than 1x10^-8 contrast over broad bandwidth (10%) on both shaped pupil coronagraph (SPC) and hybrid Lyot coronagraph (HLC) testbeds with the WFIRST obscuration pattern. Challenges remain, however, in the technology readiness for the proposed mission. One is the discrepancy between the achieved contrasts on the testbeds and their corresponding model predictions. A series of testbed diagnoses and modeling activities were planned and carried out on the SPC testbed in order to close the gap. A very useful tool we developed was a derived "measured" testbed wavefront control Jacobian matrix that could be compared with the model-predicted "control" version that was used to generate the high contrast dark hole region in the image plane. The difference between these two is an estimate of the error in the control Jacobian. When the control matrix, which includes both amplitude and phase, was modified to reproduce the error, the simulated performance closely matched the SPC testbed behavior in both contrast floor and contrast convergence speed. This is a step closer toward model validation for high contrast coronagraphs. Further Jacobian analysis and modeling provided clues to the possible sources for the mismatch: deformable mirror (DM) misregistration, testbed optical wavefront error (WFE), and the DM setting for correcting this WFE. These analyses suggested that a high contrast coronagraph has a tight tolerance in the accuracy of its control Jacobian. Modifications to both the testbed control model and the prediction model are being implemented, and future work is discussed.
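
    The abstract's key step, reproducing the testbed's contrast floor by perturbing the control Jacobian, can be illustrated with a deliberately tiny toy model. Random matrices stand in for the real optical response; none of the dimensions or error levels below come from the JPL testbed.

```python
import numpy as np

rng = np.random.default_rng(0)
n_pix, n_act = 200, 48                          # dark-hole pixels and DM actuators (arbitrary)
G_true = rng.normal(size=(n_pix, n_act))        # "testbed" response, unknown in practice
E0 = G_true @ (1e-4 * rng.normal(size=n_act))   # an initial field the DM could in principle null

def contrast_floor(rel_jacobian_error, iters=30):
    """Iterate a least-squares nulling loop whose control Jacobian is slightly wrong."""
    G_ctrl = G_true + rel_jacobian_error * rng.normal(size=G_true.shape)
    e = E0.copy()
    for _ in range(iters):
        du = -np.linalg.pinv(G_ctrl) @ e        # command computed from the control model
        e = e + G_true @ du                     # field actually responds via the true Jacobian
    return float(np.mean(np.abs(e) ** 2))       # proxy for mean "contrast" in the dark hole

for err in (0.0, 0.05, 0.2):
    print(f"relative Jacobian error {err:4.2f} -> contrast floor {contrast_floor(err):.2e}")
```

    The point of the toy is qualitative only: as the assumed Jacobian error grows, the achievable floor rises, which is the behavior the paper matches against the SPC testbed.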

  11. PEER Testbed Study on a Laboratory Building: Exercising Seismic Performance Assessment

    OpenAIRE

    Comerio, Mary C.; Stallmeyer, John C.; Smith, Ryan; Makris, Nicos; Konstantinidis, Dimitrios; Mosalam, Khalid; Lee, Tae-Hyung; Beck, James L.; Porter, Keith A.; Shaikhutdinov, Rustem; Hutchinson, Tara; Chaudhuri, Samit Ray; Chang, Stephanie E.; Falit-Baiamonte, Anthony; Holmes, William T.

    2005-01-01

    From 2002 to 2004 (years five and six of a ten-year funding cycle), the PEER Center organized the majority of its research around six testbeds. Two buildings and two bridges, a campus, and a transportation network were selected as case studies to “exercise” the PEER performance-based earthquake engineering methodology. All projects involved interdisciplinary teams of researchers, each producing data to be used by other colleagues in their research. The testbeds demonstrat...

  12. Mounted Smartphones as Measurement and Control Platforms for Motor-Based Laboratory Test-Beds

    Science.gov (United States)

    Frank, Jared A.; Brill, Anthony; Kapila, Vikram

    2016-01-01

    Laboratory education in science and engineering often entails the use of test-beds equipped with costly peripherals for sensing, acquisition, storage, processing, and control of physical behavior. However, costly peripherals are no longer necessary to obtain precise measurements and achieve stable feedback control of test-beds. With smartphones performing diverse sensing and processing tasks, this study examines the feasibility of mounting smartphones directly to test-beds to exploit their embedded hardware and software in the measurement and control of the test-beds. This approach is a first step towards replacing laboratory-grade peripherals with more compact and affordable smartphone-based platforms, whose interactive user interfaces can engender wider participation and engagement from learners. Demonstrative cases are presented in which the sensing, computation, control, and user interaction with three motor-based test-beds are handled by a mounted smartphone. Results of experiments and simulations are used to validate the feasibility of mounted smartphones as measurement and feedback control platforms for motor-based laboratory test-beds, report the measurement precision and closed-loop performance achieved with such platforms, and address challenges in the development of platforms to maintain system stability. PMID:27556464

  13. Mounted Smartphones as Measurement and Control Platforms for Motor-Based Laboratory Test-Beds

    Directory of Open Access Journals (Sweden)

    Jared A. Frank

    2016-08-01

    Full Text Available Laboratory education in science and engineering often entails the use of test-beds equipped with costly peripherals for sensing, acquisition, storage, processing, and control of physical behavior. However, costly peripherals are no longer necessary to obtain precise measurements and achieve stable feedback control of test-beds. With smartphones performing diverse sensing and processing tasks, this study examines the feasibility of mounting smartphones directly to test-beds to exploit their embedded hardware and software in the measurement and control of the test-beds. This approach is a first step towards replacing laboratory-grade peripherals with more compact and affordable smartphone-based platforms, whose interactive user interfaces can engender wider participation and engagement from learners. Demonstrative cases are presented in which the sensing, computation, control, and user interaction with three motor-based test-beds are handled by a mounted smartphone. Results of experiments and simulations are used to validate the feasibility of mounted smartphones as measurement and feedback control platforms for motor-based laboratory test-beds, report the measurement precision and closed-loop performance achieved with such platforms, and address challenges in the development of platforms to maintain system stability.

  14. Mounted Smartphones as Measurement and Control Platforms for Motor-Based Laboratory Test-Beds.

    Science.gov (United States)

    Frank, Jared A; Brill, Anthony; Kapila, Vikram

    2016-08-20

    Laboratory education in science and engineering often entails the use of test-beds equipped with costly peripherals for sensing, acquisition, storage, processing, and control of physical behavior. However, costly peripherals are no longer necessary to obtain precise measurements and achieve stable feedback control of test-beds. With smartphones performing diverse sensing and processing tasks, this study examines the feasibility of mounting smartphones directly to test-beds to exploit their embedded hardware and software in the measurement and control of the test-beds. This approach is a first step towards replacing laboratory-grade peripherals with more compact and affordable smartphone-based platforms, whose interactive user interfaces can engender wider participation and engagement from learners. Demonstrative cases are presented in which the sensing, computation, control, and user interaction with three motor-based test-beds are handled by a mounted smartphone. Results of experiments and simulations are used to validate the feasibility of mounted smartphones as measurement and feedback control platforms for motor-based laboratory test-beds, report the measurement precision and closed-loop performance achieved with such platforms, and address challenges in the development of platforms to maintain system stability.
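
    Across these three records the recurring technical claim is that a phone's sensing and computation can close the feedback loop on a motor test-bed. The following sketch shows only the shape of such a loop, with an assumed first-order motor model and made-up PID gains; it is not code from the paper and uses no phone API.

```python
# Shape of a discrete feedback loop a mounted smartphone could run against a
# motor test-bed. The first-order motor model and the PID gains are illustrative
# assumptions, not values from the paper.

def simulate(kp=2.0, ki=1.5, kd=0.05, dt=0.02, t_end=3.0, setpoint=1.0):
    tau, gain = 0.5, 1.0                  # assumed motor time constant and DC gain
    speed, integral, prev_err = 0.0, 0.0, 0.0
    history = []
    for _ in range(int(t_end / dt)):
        err = setpoint - speed            # "sensing" step (e.g. phone camera or IMU)
        integral += err * dt
        derivative = (err - prev_err) / dt
        u = kp * err + ki * integral + kd * derivative   # command sent to the motor driver
        prev_err = err
        # First-order plant update: tau * dspeed/dt = -speed + gain * u
        speed += dt * (-speed + gain * u) / tau
        history.append(speed)
    return history

if __name__ == "__main__":
    trace = simulate()
    print(f"final speed: {trace[-1]:.3f} (setpoint 1.0)")
```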

  15. Microgrid testbeds around the world: State of art

    International Nuclear Information System (INIS)

    Hossain, Eklas; Kabalci, Ersan; Bayindir, Ramazan; Perez, Ronald

    2014-01-01

    Highlights: • A detailed discussion of microgrid projects around the world, including North America, Europe, and Japan. • Key benefits of microgrids, issues with on-site generation, and features. • Why we need distributed generation systems, with a brief introduction. • Distributed generation technologies with cost analysis. • An overview of existing distribution networks. - Abstract: This paper deals with the recent evolution of microgrids being used around the world, both in real-life applications and in laboratory settings for research. This study introduces the subject by reviewing the component level, structure and types of microgrid applications installed as a plant or modeled in a simulation environment. The paper also presents a survey of published papers on why the microgrid is required and on the components and control systems that constitute actual microgrid studies. It leads the researcher to see the microgrid in terms of today's bigger picture and creates a new outlook on potential developments. Additionally, comparison of microgrids in various regions based on several parameters allows researchers to define the required criteria and features of a particular microgrid chosen for a given scenario. The authors also tabulated the necessary information about microgrids and proposed a standard microgrid for better power quality and optimized energy generation. Consequently, the paper focuses on inadequate knowledge and technology gaps in the power system field with regard to the future, and illustrates these for the reader. Existing microgrid testbeds around the world have been studied and analyzed, and several of them are explained as examples in this study. Those investigated distribution systems are then classified by region (North America, Europe and Asia) and, as presented in the literature, a significant amount of deviation has been found

  16. Automating Finance

    Science.gov (United States)

    Moore, John

    2007-01-01

    In past years, higher education's financial management side has been riddled with manual processes and aging mainframe applications. This article discusses schools which had taken advantage of an array of technologies that automate billing, payment processing, and refund processing in the case of overpayment. The investments are well worth it:…

  17. Library Automation.

    Science.gov (United States)

    Husby, Ole

    1990-01-01

    The challenges and potential benefits of automating university libraries are reviewed, with special attention given to cooperative systems. Aspects discussed include database size, the role of the university computer center, storage modes, multi-institutional systems, resource sharing, cooperative system management, networking, and intelligent…

  18. Towards an automated checked baggage inspection system augmented with robots

    Science.gov (United States)

    DeDonato, Matthew P.; Dimitrov, Velin; Padır, Taskin

    2014-05-01

    We present a novel system for enhancing the efficiency and accuracy of checked baggage screening process at airports. The system requirements address the identification and retrieval of objects of interest that are prohibited in a checked luggage. The automated testbed is comprised of a Baxter research robot designed by Rethink Robotics for luggage and object manipulation, and a down-looking overhead RGB-D sensor for inspection and detection. We discuss an overview of current system implementations, areas of opportunity for improvements, robot system integration challenges, details of the proposed software architecture and experimental results from a case study for identifying various kinds of lighters in checked bags.
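
    As a hedged illustration of the kind of colour-plus-depth gating an overhead RGB-D sensor enables (this is not the authors' detection pipeline; the thresholds, image sizes and helper names are invented), one might write:

```python
import numpy as np
from scipy import ndimage

def detect_candidates(rgb, depth, color_lo, color_hi, depth_max):
    """Return (row, col) centroids of connected regions that match a colour
    range and sit closer to the camera than depth_max (metres)."""
    in_color = np.all((rgb >= color_lo) & (rgb <= color_hi), axis=-1)
    in_range = depth < depth_max
    mask = in_color & in_range
    labels, n = ndimage.label(mask)                       # connected components
    return ndimage.center_of_mass(mask, labels, range(1, n + 1))

# Synthetic example: a bright orange patch 0.4 m below the overhead sensor.
rgb = np.zeros((120, 160, 3), dtype=np.uint8)
depth = np.full((120, 160), 1.0)
rgb[40:60, 70:90] = (230, 120, 30)
depth[40:60, 70:90] = 0.4
print(detect_candidates(rgb, depth,
                        color_lo=(200, 80, 0), color_hi=(255, 160, 80),
                        depth_max=0.6))
```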

  19. Digital Preservation Theory and Application: Transcontinental Persistent Archives Testbed Activity

    Directory of Open Access Journals (Sweden)

    Paul Watry

    2007-12-01

    Full Text Available The National Archives and Records Administration (NARA) and EU SHAMAN projects are working with multiple research institutions on tools and technologies that will supply a comprehensive, systematic, and dynamic means for preserving virtually any type of electronic record, free from dependence on any specific hardware or software. This paper describes the joint development work between the University of Liverpool and the San Diego Supercomputer Center (SDSC) at the University of California, San Diego on the NARA and SHAMAN prototypes. The aim is to provide technologies in support of the required generic data management infrastructure. We describe a Theory of Preservation that quantifies how communication can be accomplished when future technologies are different from those available at present. This includes not only different hardware and software, but also different standards for encoding information. We describe the concept of a “digital ontology” to characterize preservation processes; this is an advance on the current OAIS Reference Model of providing representation information about records. To realize a comprehensive Theory of Preservation, we describe the ongoing integration of distributed shared collection management technologies, digital library browsing, and presentation technologies for the NARA and SHAMAN Persistent Archive Testbeds.

  20. Event metadata records as a testbed for scalable data mining

    International Nuclear Information System (INIS)

    Gemmeren, P van; Malon, D

    2010-01-01

    At a data rate of 200 hertz, event metadata records ('TAGs,' in ATLAS parlance) provide fertile grounds for development and evaluation of tools for scalable data mining. It is easy, of course, to apply HEP-specific selection or classification rules to event records and to label such an exercise 'data mining,' but our interest is different. Advanced statistical methods and tools such as classification, association rule mining, and cluster analysis are common outside the high energy physics community. These tools can prove useful, not for discovery physics, but for learning about our data, our detector, and our software. A fixed and relatively simple schema makes TAG export to other storage technologies such as HDF5 straightforward. This simplifies the task of exploiting very-large-scale parallel platforms such as Argonne National Laboratory's BlueGene/P, currently the largest supercomputer in the world for open science, in the development of scalable tools for data mining. Using a domain-neutral scientific data format may also enable us to take advantage of existing data mining components from other communities. There is, further, a substantial literature on the topic of one-pass algorithms and stream mining techniques, and such tools may be inserted naturally at various points in the event data processing and distribution chain. This paper describes early experience with event metadata records from ATLAS simulation and commissioning as a testbed for scalable data mining tool development and evaluation.
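
    The abstract's point that a fixed, simple schema makes export to HDF5 and reuse of generic mining tools straightforward can be illustrated with a small sketch. The field names below are invented stand-ins, not the ATLAS TAG schema, and h5py and scikit-learn are assumed to be available.

```python
import numpy as np
import h5py
from sklearn.cluster import KMeans

# Hypothetical TAG-like event metadata: a flat, fixed-schema table of per-event
# summary quantities. The field names are invented, not the ATLAS TAG schema.
n_events = 10_000
rng = np.random.default_rng(1)
records = np.rec.fromarrays(
    [rng.integers(0, 20, n_events),        # e.g. number of jets
     rng.exponential(50.0, n_events),      # e.g. missing transverse energy (GeV)
     rng.normal(0.0, 2.5, n_events)],      # e.g. leading-track eta
    names="n_jets,met,eta")

# Export the fixed-schema records to HDF5, as the abstract suggests.
with h5py.File("tags.h5", "w") as f:
    f.create_dataset("tags", data=records)

# Re-open the file and run a generic (HEP-agnostic) clustering pass.
with h5py.File("tags.h5", "r") as f:
    t = f["tags"][...]
features = np.column_stack([t["n_jets"], t["met"], t["eta"]])
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(features)
print(np.bincount(labels))
```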

  1. TORCH Computational Reference Kernels - A Testbed for Computer Science Research

    Energy Technology Data Exchange (ETDEWEB)

    Kaiser, Alex; Williams, Samuel Webb; Madduri, Kamesh; Ibrahim, Khaled; Bailey, David H.; Demmel, James W.; Strohmaier, Erich

    2010-12-02

    For decades, computer scientists have sought guidance on how to evolve architectures, languages, and programming models in order to improve application performance, efficiency, and productivity. Unfortunately, without overarching advice about future directions in these areas, individual guidance is inferred from the existing software/hardware ecosystem, and each discipline often conducts its research independently, assuming all other technologies remain fixed. In today's rapidly evolving world of on-chip parallelism, isolated and iterative improvements to performance may miss superior solutions in the same way gradient descent optimization techniques may get stuck in local minima. To combat this, we present TORCH: A Testbed for Optimization ResearCH. These computational reference kernels define the core problems of interest in scientific computing without mandating a specific language, algorithm, programming model, or implementation. To complement the kernel (problem) definitions, we provide a set of algorithmically-expressed verification tests that can be used to verify that a hardware/software co-designed solution produces an acceptable answer. Finally, to provide some illumination as to how researchers have implemented solutions to these problems in the past, we provide a set of reference implementations in C and MATLAB.
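
    To make the "kernel definition plus algorithmically expressed verification test" pairing concrete, here is a minimal sketch in that spirit; the kernel shown (a dense matrix-vector product) is a stand-in of my own, not one of the TORCH reference kernels.

```python
import numpy as np

def kernel_matvec(A, x):
    """Candidate implementation under test: y = A x, written naively on purpose
    (a real entry could use any algorithm, language, or hardware)."""
    y = np.zeros(len(x))
    for i in range(len(x)):
        y[i] = np.dot(A[i], x)
    return y

def verify(A, x, y, rtol=1e-12):
    """Algorithmically expressed verification test: accept any answer whose
    residual is small, so alternative algorithms and precisions can still pass."""
    reference = A @ x
    return np.linalg.norm(y - reference) <= rtol * np.linalg.norm(reference)

rng = np.random.default_rng(42)
A = rng.normal(size=(256, 256))
x = rng.normal(size=256)
print("verified:", verify(A, x, kernel_matvec(A, x)))
```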

  2. NN-SITE: A remote monitoring testbed facility

    International Nuclear Information System (INIS)

    Kadner, S.; White, R.; Roman, W.; Sheely, K.; Puckett, J.; Ystesund, K.

    1997-01-01

    DOE, Aquila Technologies, LANL and SNL recently launched collaborative efforts to create a Non-Proliferation Network Systems Integration and Test (NN-Site, pronounced N-Site) facility. NN-Site will focus on wide area, local area, and local operating level network connectivity including Internet access. This facility will provide thorough and cost-effective integration, testing and development of information connectivity among diverse operating systems and network topologies prior to full-scale deployment. In concentrating on instrument interconnectivity, tamper indication, and data collection and review, NN-Site will facilitate efforts of equipment providers and system integrators in deploying systems that will meet nuclear non-proliferation and safeguards objectives. The following will discuss the objectives of ongoing remote monitoring efforts, as well as the prevalent policy concerns. An in-depth discussion of the Non-Proliferation Network Systems Integration and Test facility (NN-Site) will illuminate the role that this testbed facility can perform in meeting the objectives of remote monitoring efforts, and its potential contribution in promoting eventual acceptance of remote monitoring systems in facilities worldwide

  3. Digital pathology: DICOM-conform draft, testbed, and first results.

    Science.gov (United States)

    Zwönitzer, Ralf; Kalinski, Thomas; Hofmann, Harald; Roessner, Albert; Bernarding, Johannes

    2007-09-01

    Hospital information systems are state of the art nowadays. Therefore, Digital Pathology, also labelled Virtual Microscopy, has gained increased attention. Triggered by radiology, standardized information models and workflows have been defined worldwide based on DICOM. However, DICOM-conform integration of Digital Pathology into existing clinical information systems imposes new problems requiring specific solutions concerning the huge amount of data as well as the special structure of the data to be managed, transferred, and stored. We implemented a testbed to realize and evaluate the workflow of digitized slides from acquisition to archiving. The experience led to the draft of a DICOM-conform information model that accounted for the extensions, definitions, and technical requirements necessary to integrate Digital Pathology into a hospital-wide DICOM environment. Slides were digitized, compressed, and could be viewed remotely. Real-time transfer of the huge amount of data was optimized using streaming techniques. Compared with a recent discussion in the DICOM Working Group for Digital Pathology (WG26), our experience led to a preference for JPEG2000/JPIP-based streaming of the whole slide image. The results showed that digital pathology is feasible, but strong efforts by users and vendors are still necessary to integrate Digital Pathology into existing information systems.

  4. Extrasolar Planetary Imaging Coronagraph (EPIC): visible nulling cornagraph testbed results

    Science.gov (United States)

    Lyon, Richard G.; Clampin, Mark; Melnick, Gary; Tolls, Volker; Woodruff, Robert; Vasudevan, Gopal

    2008-07-01

    The Extrasolar Planetary Imaging Coronagraph (EPIC) is a NASA Astrophysics Strategic Mission Concept under study for the upcoming Exoplanet Probe. EPIC's mission would be to image and characterize extrasolar giant planets, and potential super-Earths, in orbits with semi-major axes between 2 and 10 AU. EPIC will provide insights into the physical nature of a variety of planets in other solar systems, complementing radial velocity (RV) and astrometric planet searches. It will detect and characterize the atmospheres of planets identified by radial velocity surveys and potentially some transits, determine orbital inclinations and masses, characterize the atmospheres of gas giants around A and F stars, and observe the inner spatial structure and colors of inner Spitzer-selected debris disks. EPIC would be launched into a heliocentric Earth-trailing drift-away orbit, with a 3-year mission lifetime (5-year goal), and will revisit planets at least three times. The starlight suppression approach consists of a visible nulling coronagraph (VNC) that enables high order starlight suppression in broadband light. To demonstrate the VNC approach and advance its technology readiness, the NASA/Goddard Space Flight Center and Lockheed-Martin have developed a laboratory VNC and have demonstrated white light nulling. We will discuss our ongoing VNC work and show the latest results from the VNC testbed.

  5. Extrasolar Planetary Imaging Coronagraph: Visible Nulling Coronagraph Testbed Results

    Science.gov (United States)

    Lyon, Richard G.

    2008-01-01

    The Extrasolar Planetary Imaging Coronagraph (EPIC) is a proposed NASA Discovery mission to image and characterize extrasolar giant planets in orbits with semi-major axes between 2 and 10 AU. EPIC will provide insights into the physical nature of a variety of planets in other solar systems, complementing radial velocity (RV) and astrometric planet searches. It will detect and characterize the atmospheres of planets identified by radial velocity surveys, determine orbital inclinations and masses, characterize the atmospheres of planets around A and F stars, and observe the inner spatial structure and colors of inner Spitzer-selected debris disks. EPIC would be launched into a heliocentric Earth-trailing drift-away orbit, with a 3-year mission lifetime (5-year goal), and will revisit planets at least three times at intervals of 9 months. The starlight suppression approach consists of a visible nulling coronagraph (VNC) that enables high order starlight suppression in broadband light. To demonstrate the VNC approach and advance its technology readiness, the NASA Goddard Space Flight Center and Lockheed-Martin have developed a laboratory VNC and have demonstrated white light nulling. We will discuss our ongoing VNC work and show the latest results from the VNC testbed.

  6. Atmospheric Fluctuation Measurements with the Palomar Testbed Interferometer

    Science.gov (United States)

    Linfield, R. P.; Lane, B. F.; Colavita, M. M.; PTI Collaboration

    Observations of bright stars with the Palomar Testbed Interferometer, at a wavelength of 2.2 microns, have been used to measure atmospheric delay fluctuations. The delay structure function Dτ(Δt) was calculated for 66 scans (each >= 120 s in length) on seven nights in 1997 and one in 1998. For all except one scan, Dτ exhibited a clean power-law shape over the time interval 50-500 msec. Over shorter time intervals, the effect of the delay-line servo loop corrupts Dτ. Over longer time intervals (usually starting at > 1 s), the slope of Dτ decreases, presumably due to some combination of saturation (e.g., finite turbulent layer thickness) and the effect of the finite wind-speed crossing time on our 110 m baseline. The mean power-law slopes for the eight nights ranged from 1.16 to 1.36, substantially flatter than the value of 1.67 for three-dimensional Kolmogorov turbulence. Such sub-Kolmogorov slopes will result in atmospheric seeing (θ) that improves rapidly with increasing wavelength: θ ∝ λ^(1 - 2/β), where β is the observed power-law slope of Dτ. The atmospheric errors in astrometric measurements with an interferometer will average down more quickly than in the Kolmogorov case.
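
    Written out explicitly (restating the relations quoted above under the usual frozen-flow assumption), the measured delay power law propagates to the seeing as:

```latex
% Delay structure function, Fried parameter, and seeing (frozen-flow assumption):
\[
  D_\tau(\Delta t) \propto \Delta t^{\beta},
  \qquad
  r_0 \propto \lambda^{2/\beta},
  \qquad
  \theta \simeq \frac{\lambda}{r_0} \propto \lambda^{\,1 - 2/\beta}.
\]
% Kolmogorov turbulence (beta = 5/3) gives theta ~ lambda^(-1/5); the measured
% slopes beta = 1.2-1.4 make the exponent more negative, so the seeing improves
% faster with increasing wavelength.
```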

  7. Aerospace Engineering Systems and the Advanced Design Technologies Testbed Experience

    Science.gov (United States)

    VanDalsem, William R.; Livingston, Mary E.; Melton, John E.; Torres, Francisco J.; Stremel, Paul M.

    1999-01-01

    Continuous improvement of aerospace product development processes is a driving requirement across much of the aerospace community. As up to 90% of the cost of an aerospace product is committed during the first 10% of the development cycle, there is a strong emphasis on capturing, creating, and communicating better information (both requirements and performance) early in the product development process. The community has responded by pursuing the development of computer-based systems designed to enhance the decision-making capabilities of product development individuals and teams. Recently, the historical foci on sharing the geometrical representation and on configuration management are being augmented: 1) Physics-based analysis tools for filling the design space database; 2) Distributed computational resources to reduce response time and cost; 3) Web-based technologies to relieve machine-dependence; and 4) Artificial intelligence technologies to accelerate processes and reduce process variability. The Advanced Design Technologies Testbed (ADTT) activity at NASA Ames Research Center was initiated to study the strengths and weaknesses of the technologies supporting each of these trends, as well as the overall impact of the combination of these trends on a product development event. Lessons learned and recommendations for future activities are reported.

  8. User's guide to the Reliability Estimation System Testbed (REST)

    Science.gov (United States)

    Nicol, David M.; Palumbo, Daniel L.; Rifkin, Adam

    1992-01-01

    The Reliability Estimation System Testbed is an X-window based reliability modeling tool that was created to explore the use of the Reliability Modeling Language (RML). RML was defined to support several reliability analysis techniques including modularization, graphical representation, Failure Mode Effects Simulation (FMES), and parallel processing. These techniques are most useful in modeling large systems. Using modularization, an analyst can create reliability models for individual system components. The modules can be tested separately and then combined to compute the total system reliability. Because a one-to-one relationship can be established between system components and the reliability modules, a graphical user interface may be used to describe the system model. RML was designed to permit message passing between modules. This feature enables reliability modeling based on a run time simulation of the system wide effects of a component's failure modes. The use of failure modes effects simulation enhances the analyst's ability to correctly express system behavior when using the modularization approach to reliability modeling. To alleviate the computation bottleneck often found in large reliability models, REST was designed to take advantage of parallel processing on hypercube processors.
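
    As a minimal illustration of the modularization idea described above (not the RML language itself), the sketch below combines per-module reliabilities into a system reliability; the module names and values are hypothetical.

        import numpy as np

        def series(*reliabilities):
            """The system works only if every module in the chain works."""
            return float(np.prod(reliabilities))

        def parallel(*reliabilities):
            """The system works if at least one of the redundant modules works."""
            return 1.0 - float(np.prod([1.0 - r for r in reliabilities]))

        # Hypothetical mission-time reliabilities of individual modules
        sensor, computer_a, computer_b, actuator = 0.999, 0.990, 0.990, 0.995

        # Two redundant computers wrapped in parallel, the rest in series
        system_reliability = series(sensor, parallel(computer_a, computer_b), actuator)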

  9. Automated Operations Development for Advanced Exploration Systems

    Science.gov (United States)

    Haddock, Angie T.; Stetson, Howard

    2012-01-01

    Automated space operations command and control software development and its implementation must be an integral part of the vehicle design effort. The software design must encompass autonomous fault detection, isolation, and recovery capabilities and also provide "single button" intelligent functions for the crew. Development, operations and safety approval experience with the Timeliner system onboard the International Space Station (ISS), which provided autonomous monitoring with response and single-command functionality for payload systems, can be built upon for future automated operations: the ISS Payload effort was the first and only autonomous command and control system to run continuously (6 years), 24 hours a day, 7 days a week, within a crewed spacecraft environment. Utilizing proven capabilities from the ISS Higher Active Logic (HAL) System, along with the execution component design from within the HAL 9000 Space Operating System, this design paper will detail the initial HAL System software architecture and interfaces as applied to NASA's Habitat Demonstration Unit (HDU) in support of the Advanced Exploration Systems, Autonomous Mission Operations project. The development and implementation of integrated simulators within this development effort will also be detailed; this is the first step in verifying the effectiveness of the HAL 9000 Integrated Test-Bed Component [2] designs. This design paper will conclude with a summary of the current development status and future development goals as they pertain to automated command and control for the HDU.

  10. Building a framework to manage trust in automation

    Science.gov (United States)

    Metcalfe, J. S.; Marathe, A. R.; Haynes, B.; Paul, V. J.; Gremillion, G. M.; Drnec, K.; Atwater, C.; Estepp, J. R.; Lukos, J. R.; Carter, E. C.; Nothwang, W. D.

    2017-05-01

    All automations must, at some point in their lifecycle, interface with one or more humans. Whether operators, end-users, or bystanders, human responses can determine the perceived utility and acceptance of an automation. It has long been believed that human trust is a primary determinant of human-automation interactions and further presumed that calibrating trust can lead to appropriate choices regarding automation use. However, attempts to improve joint system performance by calibrating trust have not yet provided a generalizable solution. To address this, we identified several factors limiting the direct integration of trust, or metrics thereof, into an active mitigation strategy. The present paper outlines our approach to addressing this important issue, its conceptual underpinnings, and practical challenges encountered in execution. Among the most critical outcomes has been a shift in focus from trust to basic interaction behaviors and their antecedent decisions. This change in focus inspired the development of a testbed and paradigm that was deployed in two experiments of human interactions with driving automation that were executed in an immersive, full-motion simulation environment. Moreover, by integrating a behavior- and physiology-based predictor within a novel consequence-based control system, we demonstrated that it is possible to anticipate particular interaction behaviors and influence humans towards more optimal choices about automation use in real time. Importantly, this research provides a fertile foundation for the development and integration of advanced, wearable technologies for sensing and inferring critical state variables for better integration of human elements into otherwise fully autonomous systems.

  11. Laboratory Spacecraft Data Processing and Instrument Autonomy: AOSAT as Testbed

    Science.gov (United States)

    Lightholder, Jack; Asphaug, Erik; Thangavelautham, Jekan

    2015-11-01

    Recent advances in small spacecraft allow for their use as orbiting microgravity laboratories (e.g. Asphaug and Thangavelautham LPSC 2014) that will produce substantial amounts of data. Power, bandwidth and processing constraints impose limitations on the number of operations which can be performed on this data as well as the data volume the spacecraft can downlink. We show that instrument autonomy and machine learning techniques can intelligently conduct data reduction and downlink queueing to meet data storage and downlink limitations. As small spacecraft laboratory capabilities increase, we must find techniques to increase instrument autonomy and spacecraft scientific decision making. The Asteroid Origins Satellite (AOSAT) CubeSat centrifuge will act as a testbed for further proving these techniques. Lightweight algorithms, such as connected components analysis, centroid tracking, K-means clustering, edge detection, convex hull analysis and intelligent cropping routines, can be coupled with traditional packet compression routines to reduce data transfer per image as well as provide a first-order filtering of what data is most relevant to downlink. This intelligent queueing provides timelier downlink of scientifically relevant data while reducing the amount of irrelevant downlinked data. Resulting algorithms allow scientists to throttle the amount of data downlinked based on initial experimental results. The data downlink pipeline, prioritized for scientific relevance based on incorporated scientific objectives, can continue from the spacecraft until the data is no longer fruitful. Coupled with data compression and cropping strategies at the data packet level, bandwidth reductions exceeding 40% can be achieved while still downlinking data deemed to be most relevant in a double blind study between scientist and algorithm. Applications of this technology allow for the incorporation of instrumentation which produces significant data volumes on small spacecraft.
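
    The sketch below illustrates one way the lightweight algorithms listed above could feed a prioritized downlink queue: frames are scored with a connected-components pass, cropped to the region of interest, and queued until a downlink budget is spent. The threshold, scoring rule and budget are assumptions for illustration, not the AOSAT flight algorithms.

        import numpy as np
        from scipy import ndimage

        def score_and_crop(image, threshold):
            """Score a frame by the number of bright clumps and crop to their bounding box."""
            mask = image > threshold
            _, n_clumps = ndimage.label(mask)              # connected-components analysis
            if n_clumps == 0:
                return 0, None
            rows, cols = np.where(mask)
            crop = image[rows.min():rows.max() + 1, cols.min():cols.max() + 1]
            return n_clumps, crop

        def build_downlink_queue(frames, threshold, budget_bytes):
            """Queue the highest-scoring crops first until the downlink budget is exhausted."""
            scored = []
            for frame_id, img in frames:                   # frames: iterable of (id, 2-D array)
                score, crop = score_and_crop(img, threshold)
                if crop is not None:
                    scored.append((score, frame_id, crop))
            scored.sort(key=lambda item: item[0], reverse=True)
            queue, used = [], 0
            for score, frame_id, crop in scored:
                if used + crop.nbytes > budget_bytes:
                    break
                queue.append((frame_id, crop))
                used += crop.nbytes
            return queue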

  12. A test-bed modeling study for wave resource assessment

    Science.gov (United States)

    Yang, Z.; Neary, V. S.; Wang, T.; Gunawan, B.; Dallman, A.

    2016-02-01

    Hindcasts from phase-averaged wave models are commonly used to estimate standard statistics used in wave energy resource assessments. However, the research community and wave energy converter industry are lacking a well-documented and consistent modeling approach for conducting these resource assessments at different phases of WEC project development, and at different spatial scales, e.g., from small-scale pilot study to large-scale commercial deployment. Therefore, it is necessary to evaluate current wave model codes, as well as limitations and knowledge gaps for predicting sea states, in order to establish best wave modeling practices, and to identify future research needs to improve wave prediction for resource assessment. This paper presents the first phase of an on-going modeling study to address these concerns. The modeling study is being conducted at a test-bed site off the Central Oregon Coast using two of the most widely-used third-generation wave models - WaveWatchIII and SWAN. A nested-grid modeling approach, with domain dimension ranging from global to regional scales, was used to provide wave spectral boundary conditions to a local-scale model domain, which has a spatial dimension of around 60 km by 60 km and a grid resolution of 250 m - 300 m. Model results simulated by WaveWatchIII and SWAN in a structured-grid framework are compared to NOAA wave buoy data for six wave parameters, including omnidirectional wave power, significant wave height, energy period, spectral width, direction of maximum directionally resolved wave power, and directionality coefficient. Model performance and computational efficiency are evaluated, and best practices for wave resource assessments are discussed, based on a set of standard error statistics and model run times.
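
    For reference, three of the six parameters named above can be computed from a one-dimensional frequency spectrum with standard spectral-moment formulas (deep water assumed); the sketch below is illustrative and is not the assessment code used in the study.

        import numpy as np

        RHO, G = 1025.0, 9.81                      # sea-water density [kg/m^3], gravity [m/s^2]

        def spectral_moment(f, S, n):
            """m_n = integral of f**n * S(f) df (f in Hz, S in m^2/Hz; f must stay above 0 for n = -1)."""
            return np.trapz(f ** n * S, f)

        def wave_parameters(f, S):
            m0 = spectral_moment(f, S, 0)
            m_neg1 = spectral_moment(f, S, -1)
            hm0 = 4.0 * np.sqrt(m0)                # significant wave height [m]
            te = m_neg1 / m0                       # energy period [s]
            # Deep-water omnidirectional wave power per unit crest length [W/m]
            j = RHO * G ** 2 / (64.0 * np.pi) * hm0 ** 2 * te
            return hm0, te, j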

  13. Earthbound Unmanned Autonomous Vehicles (UAVS) As Planetary Science Testbeds

    Science.gov (United States)

    Pieri, D. C.; Bland, G.; Diaz, J. A.; Fladeland, M. M.

    2014-12-01

    Recent advances in the technology of unmanned vehicles have greatly expanded the range of contemplated terrestrial operational environments for their use, including aerial, surface, and submarine. The advances have been most pronounced in the areas of autonomy, miniaturization, durability, standardization, and ease of operation, most notably (especially in the popular press) for airborne vehicles. Of course, for a wide range of planetary venues, autonomy, at high cost of both money and risk, has always been a requirement. Most recently, missions to Mars have also featured an unprecedented degree of mobility. Combining the traditional planetary surface deployment operational and science imperatives with emerging, very accessible, and relatively economical small UAV platforms on Earth can provide flexible, rugged, self-directed, test-bed platforms for landed instruments and strategies that will ultimately be directed elsewhere, and, in the process, provide valuable earth science data. While the most direct transfer of technology from terrestrial to planetary venues is perhaps for bodies with atmospheres (and oceans), with appropriate technology and strategy accommodations, single and networked UAVs can be designed to operate on even airless bodies, under a variety of gravities. In this presentation, we present and use results and lessons learned from our recent earth-bound UAV volcano deployments, as well as our future plans for such, to conceptualize a range of planetary and small-body missions. We gratefully acknowledge the assistance of students and colleagues at our home institutions, and the government of Costa Rica, without which our UAV deployments would not have been possible. This work was carried out, in part, at the Jet Propulsion Laboratory of the California Institute of Technology under contract to NASA.

  14. High-contrast imager for Complex Aperture Telescopes (HiCAT): testbed design and coronagraph developments

    Science.gov (United States)

    N'Diaye, Mamadou; Choquet, E.; Pueyo, L.; Elliot, E.; Perrin, M. D.; Wallace, J.; Anderson, R. E.; Carlotti, A.; Groff, T. D.; Hartig, G. F.; Kasdin, J.; Lajoie, C.; Levecq, O.; Long, C.; Macintosh, B.; Mawet, D.; Norman, C. A.; Shaklan, S.; Sheckells, M.; Sivaramakrishnan, A.; Soummer, R.

    2014-01-01

    We present a new high-contrast imaging testbed designed to provide complete solutions for wavefront sensing and control and starlight suppression with complex aperture telescopes (NASA APRA; Soummer PI). This includes geometries with central obstruction, support structures, and/or primary mirror segmentation. Complex aperture telescopes are often associated with large telescope designs, which are considered for future space missions. However, these designs make high-contrast imaging challenging because of additional diffraction features in the point spread function. We present a novel optimization approach for the testbed optical and opto-mechanical design that minimizes the impact of both phase and amplitude errors from the wave propagation of testbed optics surface errors. This design approach allows us to define the specification for the bench optics, which we then compare to the manufactured parts. We discuss the testbed alignment and first results. We also present our coronagraph design for different testbed pupil shapes (AFTA or ATLAST), which involves a new method for the optimization of Apodized Pupil Lyot Coronagraphs (APLC).

  15. Graphical interface between the CIRSSE testbed and CimStation software with MCS/CTOS

    Science.gov (United States)

    Hron, Anna B.

    1992-01-01

    This research is concerned with developing a graphical simulation of the testbed at the Center for Intelligent Robotic Systems for Space Exploration (CIRSSE) and the interface which allows for communication between the two. Such an interface is useful in telerobotic operations, and as a functional interaction tool for testbed users. Creating a simulated model of a real-world system generates inevitable calibration discrepancies between the two. This thesis gives a brief overview of the work done to date in the area of workcell representation and communication, describes the development of the CIRSSE interface, and gives a direction for future work in the area of system calibration. The CimStation software used for development of this interface is a highly versatile robotic workcell simulation package which has been programmed for this application with a scale graphical model of the testbed and supporting interface menu code. A need for this tool has been identified for the reasons of path previewing, as a window on teleoperation, and for calibration of simulated vs. real-world models. The interface allows information (i.e., joint angles) generated by CimStation to be sent as motion goal positions to the testbed robots. An option of the interface has been established such that joint angle information generated by supporting testbed algorithms (i.e., TG, collision avoidance) can be piped through CimStation as a visual preview of the path.

  16. Integrated Systems Health Management for Sustainable Habitats (Using Sustainability Base as a Testbed)

    Science.gov (United States)

    Martin, Rodney A.

    2017-01-01

    Habitation systems provide a safe place for astronauts to live and work in space and on planetary surfaces. They enable crews to live and work safely in deep space, and include integrated life support systems, radiation protection, fire safety, and systems to reduce logistics and the need for resupply missions. Innovative health management technologies are needed in order to increase the safety and mission-effectiveness for future space habitats on other planets, asteroids, or lunar surfaces. For example, off-nominal or failure conditions occurring in safety-critical life support systems may need to be addressed quickly by the habitat crew without extensive technical support from Earth due to communication delays. If the crew in the habitat must manage, plan and operate much of the mission themselves, operations support must be migrated from Earth to the habitat. Enabling monitoring, tracking, and management capabilities on-board the habitat and related EVA platforms for a small crew to use will require significant automation and decision support software. Traditional caution and warning systems are typically triggered by out-of-bounds sensor values, but can be enhanced by including machine learning and data mining techniques. These methods aim to reveal latent, unknown conditions while still retaining and improving the ability to provide highly accurate alerts for known issues. A few of these techniques will be briefly described, along with performance targets for known faults and failures. Specific system health management capabilities required for habitat system elements (environmental control and life support systems, etc.) may include relevant subsystems such as water recycling systems, photovoltaic systems, electrical power systems, and environmental monitoring systems. Sustainability Base, the agency's flagship LEED-platinum certified green building, acts as a living laboratory for testing advanced information and sustainable technologies that provides an

  17. Open Orchestration Cloud Radio Access Network (OOCRAN) Testbed

    OpenAIRE

    Floriach-Pigem, Marti; Xercavins-Torregrosa, Guillem; Marojevic, Vuk; Gelonch-Bosch, Antoni

    2017-01-01

    The Cloud radio access network (C-RAN) offers a revolutionary approach to cellular network deployment, management and evolution. Advances in software-defined radio (SDR) and networking technology, moreover, enable delivering software-defined everything through the Cloud. Resources will be pooled and dynamically allocated leveraging abstraction, virtualization, and consolidation techniques; processes will be automated using common application programming interfaces; and network functions and s...

  18. Plant automation

    International Nuclear Information System (INIS)

    Christensen, L.J.; Sackett, J.I.; Dayal, Y.; Wagner, W.K.

    1989-01-01

    This paper describes work at EBR-II in the development and demonstration of new control equipment and methods and associated schemes for plant prognosis, diagnosis, and automation. The development work has attracted the interest of other national laboratories, universities, and commercial companies. New initiatives include use of new control strategies, expert systems, advanced diagnostics, and operator displays. The unique opportunity offered by EBR-II is as a test bed where a total integrated approach to automatic reactor control can be directly tested under real power plant conditions

  19. Conceptual Design and Cost Estimate of a Subsonic NASA Testbed Vehicle (NTV) for Aeronautics Research

    Science.gov (United States)

    Nickol, Craig L.; Frederic, Peter

    2013-01-01

    A conceptual design and cost estimate for a subsonic flight research vehicle designed to support NASA's Environmentally Responsible Aviation (ERA) project goals is presented. To investigate the technical and economic feasibility of modifying an existing aircraft, a highly modified Boeing 717 was developed for maturation of technologies supporting the three ERA project goals of reduced fuel burn, noise, and emissions. This modified 717 utilizes midfuselage mounted modern high bypass ratio engines in conjunction with engine exhaust shielding structures to provide a low noise testbed. The testbed also integrates a natural laminar flow wing section and active flow control for the vertical tail. An eight year program plan was created to incrementally modify and test the vehicle, enabling the suite of technology benefits to be isolated and quantified. Based on the conceptual design and programmatic plan for this testbed vehicle, a full cost estimate of $526M was developed, representing then-year dollars at a 50% confidence level.

  20. Definition study for variable cycle engine testbed engine and associated test program

    Science.gov (United States)

    Vdoviak, J. W.

    1978-01-01

    The product/study double bypass variable cycle engine (VCE) was updated to incorporate recent improvements. The effect of these improvements on mission range and noise levels was determined. This engine design was then compared with current existing high-technology core engines in order to define a subscale testbed configuration that simulated many of the critical technology features of the product/study VCE. Detailed preliminary program plans were then developed for the design, fabrication, and static test of the selected testbed engine configuration. These plans included estimated costs and schedules for the detail design, fabrication and test of the testbed engine and the definition of a test program, test plan, schedule, instrumentation, and test stand requirements.

  1. Comparison of two matrix data structures for advanced CSM testbed applications

    Science.gov (United States)

    Regelbrugge, M. E.; Brogan, F. A.; Nour-Omid, B.; Rankin, C. C.; Wright, M. A.

    1989-01-01

    The first section describes data storage schemes presently used by the Computational Structural Mechanics (CSM) testbed sparse matrix facilities and similar skyline (profile) matrix facilities. The second section contains a discussion of certain features required for the implementation of particular advanced CSM algorithms, and how these features might be incorporated into the data storage schemes described previously. The third section presents recommendations, based on the discussions of the prior sections, for directing future CSM testbed development to provide necessary matrix facilities for advanced algorithm implementation and use. The objective is to lend insight into the matrix structures discussed and to help explain the process of evaluating alternative matrix data structures and utilities for subsequent use in the CSM testbed.
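
    As an illustration of the skyline (profile) storage mentioned above, the sketch below packs the upper triangle of a symmetric matrix column by column, keeping only the entries from the first nonzero row down to the diagonal; it is a simplified stand-in for the testbed's actual matrix facilities.

        import numpy as np

        def to_skyline(a):
            """Pack a symmetric matrix into skyline (profile) form, column by column."""
            n = a.shape[0]
            values, col_ptr = [], [0]
            for j in range(n):
                first = next(i for i in range(j + 1) if a[i, j] != 0 or i == j)
                values.extend(a[first:j + 1, j])   # first nonzero row down to the diagonal
                col_ptr.append(len(values))
            return np.array(values), np.array(col_ptr)

        def skyline_get(values, col_ptr, i, j):
            """Read entry (i, j) with i <= j; entries outside the stored profile are zero."""
            height = col_ptr[j + 1] - col_ptr[j]   # number of stored entries in column j
            first = j - height + 1                 # first stored row of column j
            return 0.0 if i < first else values[col_ptr[j] + (i - first)]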

  2. Expert systems and advanced automation for space missions operations

    Science.gov (United States)

    Durrani, Sajjad H.; Perkins, Dorothy C.; Carlton, P. Douglas

    1990-01-01

    Increased complexity of space missions during the 1980s led to the introduction of expert systems and advanced automation techniques in mission operations. This paper describes several technologies in operational use or under development at the National Aeronautics and Space Administration's Goddard Space Flight Center. Several expert systems are described that diagnose faults, analyze spacecraft operations and onboard subsystem performance (in conjunction with neural networks), and perform data quality and data accounting functions. The design of customized user interfaces is discussed, with examples of their application to space missions. Displays, which allow mission operators to see the spacecraft position, orientation, and configuration under a variety of operating conditions, are described. Automated systems for scheduling are discussed, and a testbed that allows tests and demonstrations of the associated architectures, interface protocols, and operations concepts is described. Lessons learned are summarized.

  3. Multi-level infrastructure of interconnected testbeds of large-scale wireless sensor networks (MI2T-WSN)

    CSIR Research Space (South Africa)

    Abu-Mahfouz, Adnan M

    2012-06-01

    Full Text Available are still required for further testing before the real implementation. In this paper we propose a multi-level infrastructure of interconnected testbeds of large- scale WSNs. This testbed consists of 1000 sensor motes that will be distributed into four...

  4. NBodyLab: A Testbed for Undergraduates Utilizing a Web Interface to NEMO and MD-GRAPE2 Hardware

    Science.gov (United States)

    Johnson, V. L.; Teuben, P. J.; Penprase, B. E.

    An N-body simulation testbed called NBodyLab was developed at Pomona College as a teaching tool for undergraduates. The testbed runs under Linux and provides a web interface to selected back-end NEMO modeling and analysis tools, and several integration methods which can optionally use an MD-GRAPE2 supercomputer card in the server to accelerate calculation of particle-particle forces. The testbed provides a framework for using and experimenting with the main components of N-body simulations: data models and transformations, numerical integration of the equations of motion, analysis and visualization products, and acceleration techniques (in this case, special purpose hardware). The testbed can be used by students with no knowledge of programming or Unix, freeing such students and their instructor to spend more time on scientific experimentation. The advanced student can extend the testbed software and/or more quickly transition to the use of more advanced Unix-based toolsets such as NEMO, Starlab and model builders such as GalactICS. Cosmology students at Pomona College used the testbed to study collisions of galaxies with different speeds, masses, densities, collision angles, angular momentum, etc., attempting to simulate, for example, the Tadpole Galaxy and the Antenna Galaxies. The testbed framework is available as open-source to assist other researchers and educators. Recommendations are made for testbed enhancements.
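
    The main components listed above (data model, integrator, particle-particle forces) can be illustrated with a few lines of direct-summation code; the sketch below is a generic softened-gravity leapfrog in normalized units (G = 1), not the NEMO or MD-GRAPE2 implementation.

        import numpy as np

        def accelerations(pos, mass, eps=1e-3):
            """Direct particle-particle gravitational accelerations with Plummer softening (G = 1)."""
            acc = np.zeros_like(pos)
            for i in range(len(mass)):
                dx = pos - pos[i]                              # vectors from particle i to the others
                r3 = (np.sum(dx ** 2, axis=1) + eps ** 2) ** 1.5
                r3[i] = np.inf                                 # suppress the self-force term
                acc[i] = np.sum(mass[:, None] * dx / r3[:, None], axis=0)
            return acc

        def leapfrog(pos, vel, mass, dt, n_steps):
            """Kick-drift-kick integration of the equations of motion (arrays must be float)."""
            acc = accelerations(pos, mass)
            for _ in range(n_steps):
                vel += 0.5 * dt * acc
                pos += dt * vel
                acc = accelerations(pos, mass)
                vel += 0.5 * dt * acc
            return pos, vel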

  5. WIDAFELS flexible automation systems

    International Nuclear Information System (INIS)

    Shende, P.S.; Chander, K.P.; Ramadas, P.

    1990-01-01

    After discussing the various aspects of automation, some typical examples of various levels of automation are given. One of the examples is of automated production line for ceramic fuel pellets. (M.G.B.)

  6. An Automation Planning Primer.

    Science.gov (United States)

    Paynter, Marion

    1988-01-01

    This brief planning guide for library automation incorporates needs assessment and evaluation of options to meet those needs. A bibliography of materials on automation planning and software reviews, library software directories, and library automation journals is included. (CLB)

  7. Advanced Diagnostic and Prognostic Testbed (ADAPT) Testability Analysis Report

    Science.gov (United States)

    Ossenfort, John

    2008-01-01

    As system designs become more complex, determining the best locations to add sensors and test points for the purpose of testing and monitoring these designs becomes more difficult. Not only must the designer take into consideration all real and potential faults of the system, he or she must also find efficient ways of detecting and isolating those faults. Because sensors and cabling take up valuable space and weight on a system, and given constraints on bandwidth and power, it is even more difficult to add sensors into these complex designs after the design has been completed. As a result, a number of software tools have been developed to assist the system designer in proper placement of these sensors during the system design phase of a project. One of the key functions provided by many of these software programs is a testability analysis of the system, essentially an evaluation of how observable the system's behavior is using the available tests. During the design phase, testability metrics can help guide the designer in improving the inherent testability of the design. This may include adding, removing, or modifying tests; breaking up feedback loops; or changing the system to reduce fault propagation. Given a set of test requirements, the analysis can also help to verify that the system will meet those requirements. Of course, a testability analysis requires that a software model of the physical system is available. For the analysis to be most effective in guiding system design, this model should ideally be constructed in parallel with these efforts. The purpose of this paper is to present the final testability results of the Advanced Diagnostic and Prognostic Testbed (ADAPT) after the system model was completed. The tool chosen to build the model and perform the testability analysis is the Testability Engineering and Maintenance System Designer (TEAMS-Designer). The TEAMS toolset is intended to be a solution to span all phases of the system, from design and
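
    As a minimal sketch of the kind of metric such an analysis produces (not the TEAMS-Designer algorithm), the code below takes a binary fault-to-test dependency matrix and reports the fraction of faults that are detectable and the fraction that can be isolated to a single fault; the example matrix is hypothetical.

        import numpy as np

        def testability_metrics(d_matrix):
            """d_matrix[i, j] = 1 if test j fires when fault i is present.
            Returns (fraction of faults detected, fraction isolable to a single fault)."""
            d = np.asarray(d_matrix, dtype=int)
            detected = d.any(axis=1)                     # fault seen by at least one test
            signatures = [tuple(row) for row in d]
            unique = {s for s in signatures if signatures.count(s) == 1 and any(s)}
            isolable = np.array([s in unique for s in signatures])
            return detected.mean(), isolable.mean()

        # Three faults, three tests: faults 2 and 3 share a signature, so both are detected but not isolated
        detection, isolation = testability_metrics([[1, 0, 0],
                                                    [1, 1, 0],
                                                    [1, 1, 0]])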

  8. Aeronautics Autonomy Testbed Capability (AATC) Team Developed Concepts

    Science.gov (United States)

    Smith, Phillip J.

    2018-01-01

    In 2015, the National Aeronautics and Space Administration (NASA) formed a multi-center, interdisciplinary team of engineers from three different aeronautics research centers who were tasked with improving NASA autonomy research capabilities. This group was subsequently named the Aeronautics Autonomy Testbed Capability (AATC) team. To aid in confronting the autonomy research directive, NASA contracted IDEO, a design firm, to provide consultants and guides to educate NASA engineers through the practice of design thinking, which is an unconventional method for aerospace design processes. The team then began learning about autonomy research challenges by conducting interviews with a diverse group of researchers and pilots, military personnel and civilians, experts and amateurs. Part of this design thinking process involved developing ideas for products or programs, known as concepts, that could enable real-world fulfillment of the most important latent needs identified through analysis of the interviews. The concepts are intended to be sacrificial, intermediate steps in the design thinking process and are presented in this report to record the efforts of the AATC group. Descriptions are provided in the present tense to allow for further ideation and imagining the concept as reality, as was attempted during the team's discussions and interviews. This does not indicate that the concepts are actually in practice within NASA, though there may be similar existing programs independent of AATC. These concepts were primarily created at two distinct stages during the design thinking process. After the initial interviews, there was a workshop for concept development, and the resulting ideas are shown in this work as being from the First Round. As part of succeeding interviews, the team members presented the First Round concepts to refine the understanding of existing research needs. This knowledge was then used to generate an additional set of concepts denoted as the Second Round. Some

  9. Towards Autonomous Operations of the Robonaut 2 Humanoid Robotic Testbed

    Science.gov (United States)

    Badger, Julia; Nguyen, Vienny; Mehling, Joshua; Hambuchen, Kimberly; Diftler, Myron; Luna, Ryan; Baker, William; Joyce, Charles

    2016-01-01

    The Robonaut project has been conducting research in robotics technology on board the International Space Station (ISS) since 2012. Recently, the original upper body humanoid robot was upgraded by the addition of two climbing manipulators ("legs"), more capable processors, and new sensors, as shown in Figure 1. While Robonaut 2 (R2) has been working through checkout exercises on orbit following the upgrade, technology development on the ground has continued to advance. Through the Active Reduced Gravity Offload System (ARGOS), the Robonaut team has been able to develop technologies that will enable full operation of the robotic testbed on orbit using similar robots located at the Johnson Space Center. Once these technologies have been vetted in this way, they will be implemented and tested on the R2 unit on board the ISS. The goal of this work is to create a fully-featured robotics research platform on board the ISS to increase the technology readiness level of technologies that will aid in future exploration missions. Technology development has thus far followed two main paths, autonomous climbing and efficient tool manipulation. Central to both technologies has been the incorporation of a human robotic interaction paradigm that involves the visualization of sensory and pre-planned command data with models of the robot and its environment. Figure 2 shows screenshots of these interactive tools, built in rviz, that are used to develop and implement these technologies on R2. Robonaut 2 is designed to move along the handrails and seat track around the US lab inside the ISS. This is difficult for many reasons, namely the environment is cluttered and constrained, the robot has many degrees of freedom (DOF) it can utilize for climbing, and remote commanding for precision tasks such as grasping handrails is time-consuming and difficult. Because of this, it is important to develop the technologies needed to allow the robot to reach operator-specified positions as

  10. LOS Throughput Measurements in Real-Time with a 128-Antenna Massive MIMO Testbed

    OpenAIRE

    Harris, Paul; Zhang, Siming; Beach, Mark; Mellios, Evangelos; Nix, Andrew; Armour, Simon; Doufexi, Angela; Nieman, Karl; Kundargi, Nikhil

    2017-01-01

    This paper presents initial results for a novel 128-antenna massive Multiple-Input, Multiple- Output (MIMO) testbed developed through Bristol Is Open in collaboration with National Instruments and Lund University. We believe that the results presented here validate the adoption of massive MIMO as a key enabling technology for 5G and pave the way for further pragmatic research by the massive MIMO community. The testbed operates in real-time with a Long-Term Evolution (LTE)-like PHY in Time Div...

  11. Implementation of a Wireless Time Distribution Testbed Protected with Quantum Key Distribution

    Energy Technology Data Exchange (ETDEWEB)

    Bonior, Jason D [ORNL; Evans, Philip G [ORNL; Sheets, Gregory S [ORNL; Jones, John P [ORNL; Flynn, Toby H [ORNL; O' Neil, Lori Ross [Pacific Northwest National Laboratory (PNNL); Hutton, William [Pacific Northwest National Laboratory (PNNL); Pratt, Richard [Pacific Northwest National Laboratory (PNNL); Carroll, Thomas E. [Pacific Northwest National Laboratory (PNNL)

    2017-01-01

    Secure time transfer is critical for many time-sensitive applications. The Global Positioning System (GPS), which is often used for this purpose, has been shown to be susceptible to spoofing attacks. Quantum Key Distribution (QKD) offers a way to securely generate encryption keys at two locations. Through careful use of this information it is possible to create a system that is more resistant to spoofing attacks. In this paper we describe our work to create a testbed which utilizes QKD and traditional RF links. This testbed will be used for the development of more secure and spoofing-resistant time distribution protocols.

  12. Accelerating Innovation that Enhances Resource Recovery in the Wastewater Sector: Advancing a National Testbed Network.

    Science.gov (United States)

    Mihelcic, James R; Ren, Zhiyong Jason; Cornejo, Pablo K; Fisher, Aaron; Simon, A J; Snyder, Seth W; Zhang, Qiong; Rosso, Diego; Huggins, Tyler M; Cooper, William; Moeller, Jeff; Rose, Bob; Schottel, Brandi L; Turgeon, Jason

    2017-07-18

    This Feature examines significant challenges and opportunities to spur innovation and accelerate adoption of reliable technologies that enhance integrated resource recovery in the wastewater sector through the creation of a national testbed network. The network is a virtual entity that connects appropriate physical testing facilities, and other components needed for a testbed network, with researchers, investors, technology providers, utilities, regulators, and other stakeholders to accelerate the adoption of innovative technologies and processes that are needed for the water resource recovery facility of the future. Here we summarize and extract key issues and developments, to provide a strategy for the wastewater sector to accelerate a path forward that leads to new sustainable water infrastructures.

  13. Data dissemination in the wild: A testbed for high-mobility MANETs

    DEFF Research Database (Denmark)

    Vingelmann, Peter; Pedersen, Morten Videbæk; Heide, Janus

    2012-01-01

    This paper investigates the problem of efficient data dissemination in Mobile Ad hoc NETworks (MANETs) with high mobility. A testbed is presented which provides a high degree of mobility in experiments. The testbed consists of 10 autonomous robots with mobile phones mounted on them. The mobile... information, and the goal is to convey that information to all devices. A strategy is proposed that uses UDP broadcast transmissions and random linear network coding to facilitate the efficient exchange of information in the network. An application is introduced that implements this strategy on Nokia phones...
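
    To illustrate the random linear network coding step mentioned above (a GF(2) sketch, not the code that ran on the phones), each transmission is a random XOR combination of the source packets, and a receiver can decode once its collected coefficient vectors reach full rank:

        import numpy as np

        rng = np.random.default_rng()

        def encode(packets):
            """One coded packet: a random GF(2) combination (XOR) of equal-length uint8 source packets."""
            coeffs = rng.integers(0, 2, size=len(packets), dtype=np.uint8)
            if not coeffs.any():
                coeffs[rng.integers(len(packets))] = 1     # avoid the useless all-zero combination
            payload = np.zeros_like(packets[0])
            for c, p in zip(coeffs, packets):
                if c:
                    payload ^= p
            return coeffs, payload

        def decodable(coeff_rows, n_source):
            """True once the collected coefficient vectors have rank n_source (GF(2) elimination)."""
            m = np.array(coeff_rows, dtype=np.uint8) % 2
            rank = 0
            for col in range(m.shape[1]):
                pivot = next((r for r in range(rank, m.shape[0]) if m[r, col]), None)
                if pivot is None:
                    continue
                m[[rank, pivot]] = m[[pivot, rank]]
                for r in range(m.shape[0]):
                    if r != rank and m[r, col]:
                        m[r] ^= m[rank]
                rank += 1
            return rank >= n_source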

  14. The Segmented Aperture Interferometric Nulling Testbed (SAINT) I: overview and air-side system description

    Science.gov (United States)

    Hicks, Brian A.; Lyon, Richard G.; Petrone, Peter; Ballard, Marlin; Bolcar, Matthew R.; Bolognese, Jeff; Clampin, Mark; Dogoda, Peter; Dworzanski, Daniel; Helmbrecht, Michael A.; Koca, Corina; Shiri, Ron

    2016-07-01

    This work presents an overview of the Segmented Aperture Interferometric Nulling Testbed (SAINT), a project that will pair an actively-controlled macro-scale segmented mirror with the Visible Nulling Coronagraph (VNC). SAINT will incorporate the VNC's demonstrated wavefront sensing and control system to refine and quantify end-to-end high-contrast starlight suppression performance. This pathfinder testbed will be used as a tool to study and refine approaches to mitigating instabilities and complex diffraction expected from future large segmented aperture telescopes.

  15. Low cost automation

    International Nuclear Information System (INIS)

    1987-03-01

    This book describes how to build an automation plan and design automation facilities; the automation of cutting and chip-handling processes, including the basics of cutting, NC processing machines and chip handling; automation units such as drilling, tapping, boring, milling and slide units; the application of oil pressure (hydraulics), covering its characteristics and basic hydraulic circuits; the application of pneumatics; and kinds of automation and their application to processes, assembly, transportation, automatic machines and factory automation.

  16. Automated Budget System -

    Data.gov (United States)

    Department of Transportation — The Automated Budget System (ABS) automates management and planning of the Mike Monroney Aeronautical Center (MMAC) budget by providing enhanced capability to plan,...

  17. Automation 2017

    CERN Document Server

    Zieliński, Cezary; Kaliczyńska, Małgorzata

    2017-01-01

    This book consists of papers presented at Automation 2017, an international conference held in Warsaw from March 15 to 17, 2017. It discusses research findings associated with the concepts behind INDUSTRY 4.0, with a focus on offering a better understanding of and promoting participation in the Fourth Industrial Revolution. Each chapter presents a detailed analysis of a specific technical problem, in most cases followed by a numerical analysis, simulation and description of the results of implementing the solution in a real-world context. The theoretical results, practical solutions and guidelines presented are valuable for both researchers working in the area of engineering sciences and practitioners looking for solutions to industrial problems.

  18. Marketing automation

    Directory of Open Access Journals (Sweden)

    TODOR Raluca Dania

    2017-01-01

    Full Text Available The automation of the marketing process seems to be, nowadays, the only solution to face the major changes brought by the fast evolution of technology and the continuous increase in supply and demand. In order to achieve the desired marketing results, businesses have to employ digital marketing and communication services. These services are efficient and measurable thanks to the marketing technology used to track, score and implement each campaign. Due to technical progress, marketing fragmentation, and the demand for customized products and services on one side, and the need to achieve constructive dialogue with customers, immediate and flexible response, and the necessity to measure investments and results on the other side, the classical marketing approach has changed and continues to improve substantially.

  19. Diffraction-based analysis of tunnel size for a scaled external occulter testbed

    Science.gov (United States)

    Sirbu, Dan; Kasdin, N. Jeremy; Vanderbei, Robert J.

    2016-07-01

    For performance verification of an external occulter mask (also called a starshade), scaled testbeds have been developed to measure the suppression of the occulter shadow in the pupil plane and contrast in the image plane. For occulter experiments the scaling is typically performed by maintaining an equivalent Fresnel number. The original Princeton occulter testbed was oversized with respect to both input beam and shadow propagation to limit any diffraction effects due to finite testbed enclosure edges; however, to operate at realistic space-mission equivalent Fresnel numbers an extended testbed is currently under construction. With the longer propagation distances involved, diffraction effects due to the edge of the tunnel must now be considered in the experiment design. Here, we present a diffraction-based model of two separate tunnel effects. First, we consider the effect of tunnel-edge induced diffraction ringing upstream from the occulter mask. Second, we consider the diffraction effect due to clipping of the output shadow by the tunnel downstream from the occulter mask. These calculations are performed for a representative point design relevant to the new Princeton occulter experiment, but we also present an analytical relation that can be used for other propagation distances.
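
    The Fresnel-number scaling referred to above is simple to state; the sketch below computes N = R^2 / (lambda * Z) and the lab propagation distance that preserves it, with the flight and lab dimensions chosen purely as hypothetical examples.

        def fresnel_number(radius_m, wavelength_m, distance_m):
            """Fresnel number N = R**2 / (lambda * Z) used to scale occulter experiments."""
            return radius_m ** 2 / (wavelength_m * distance_m)

        def scaled_distance(radius_m, wavelength_m, target_n):
            """Propagation distance that reproduces a target Fresnel number at lab scale."""
            return radius_m ** 2 / (wavelength_m * target_n)

        # Hypothetical flight case: 25 m occulter radius, 50,000 km separation, 500 nm light
        n_flight = fresnel_number(25.0, 500e-9, 5.0e7)            # = 25
        # A 25 mm lab mask at the same wavelength then needs ~50 m of propagation
        z_lab = scaled_distance(25e-3, 500e-9, n_flight)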

  20. Vacuum Nuller Testbed (VNT) Performance, Characterization and Null Control: Progress Report

    Science.gov (United States)

    Lyon, Richard G.; Clampin, Mark; Petrone, Peter; Mallik, Udayan; Madison, Timothy; Bolcar, Matthew R.; Noecker, M. Charley; Kendrick, Stephen; Helmbrecht, Michael

    2011-01-01

    Herein we report on the development, sensing and control and our first results with the Vacuum Nuller Testbed to realize a Visible Nulling Coronagraph (VNC) for exoplanet coronagraphy. The VNC is one of the few approaches that works with filled, segmented and sparse or diluted-aperture telescope systems. It thus spans a range of potential future NASA telescopes and could be flown as a separate instrument on such a future mission. NASA/Goddard Space Flight Center (GSFC) has a well-established effort to develop VNC technologies, and has developed an incremental sequence of VNC testbeds to advance this approach and the enabling technologies associated with it. We discuss the continued development of the vacuum Visible Nulling Coronagraph testbed (VNT). The VNT is an ultra-stable vibration isolated testbed that operates under closed-loop control within a vacuum chamber. It will be used to achieve an incremental sequence of three visible-light nulling milestones with sequentially higher contrasts of 10^8, 10^9 and ideally 10^10 at an inner working angle of 2λ/D. The VNT is based on a modified Mach-Zehnder nulling interferometer, with a "W" configuration to accommodate a hex-packed MEMS based deformable mirror, a coherent fiber bundle and achromatic phase shifters. We discuss the initial laboratory results, the optical configuration, critical technologies and the null sensing and control approach.

  1. Solar Resource Assessment with Sky Imagery and a Virtual Testbed for Sky Imager Solar Forecasting

    Science.gov (United States)

    Kurtz, Benjamin Bernard

    In recent years, ground-based sky imagers have emerged as a promising tool for forecasting solar energy on short time scales (0 to 30 minutes ahead). Following the development of sky imager hardware and algorithms at UC San Diego, we present three new or improved algorithms for sky imager forecasting and forecast evaluation. First, we present an algorithm for measuring irradiance with a sky imager. Sky imager forecasts are often used in conjunction with other instruments for measuring irradiance, so this has the potential to decrease instrumentation costs and logistical complexity. In particular, the forecast algorithm itself often relies on knowledge of the current irradiance which can now be provided directly from the sky images. Irradiance measurements are accurate to within about 10%. Second, we demonstrate a virtual sky imager testbed that can be used for validating and enhancing the forecast algorithm. The testbed uses high-quality (but slow) simulations to produce virtual clouds and sky images. Because virtual cloud locations are known, much more advanced validation procedures are possible with the virtual testbed than with measured data. In this way, we are able to determine that camera geometry and non-uniform evolution of the cloud field are the two largest sources of forecast error. Finally, with the assistance of the virtual sky imager testbed, we develop improvements to the cloud advection model used for forecasting. The new advection schemes are 10-20% better at short time horizons.
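
    A frozen-cloud advection step of the kind used in such forecasts can be written compactly; the sketch below shifts the current cloud map by the cloud-motion vector times the forecast horizon and evaluates the sun region. It is a simplified illustration, not the UC San Diego forecast algorithm.

        import numpy as np
        from scipy import ndimage

        def advect_cloud_map(cloud_map, wind_uv_px_per_s, horizon_s):
            """Shift a cloud map (1 = cloud, 0 = clear) by the motion vector times the horizon."""
            du, dv = np.asarray(wind_uv_px_per_s, dtype=float) * horizon_s
            return ndimage.shift(cloud_map, shift=(dv, du), order=1, cval=0.0)

        def forecast_clear_fraction(cloud_map, sun_mask, wind_uv_px_per_s, horizon_s):
            """Fraction of the sun region (boolean pixel mask) predicted to be cloud-free at the horizon."""
            future = advect_cloud_map(cloud_map, wind_uv_px_per_s, horizon_s)
            return 1.0 - future[sun_mask].mean()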

  2. Design of a low-power testbed for Wireless Sensor Networks and verification

    NARCIS (Netherlands)

    van Hoesel, L.F.W.; Dulman, S.O.; Havinga, Paul J.M.; Kip, Harry J.

    In this document the design considerations and component choices of a testbed prototype device for wireless sensor networks will be discussed. These devices must be able to monitor their physical environment, process data and assist other nodes in forwarding sensor readings. For these tasks, five

  3. Evaluation of Unmanned Aircraft Systems (UAS) for Weather and Climate using the Multi-testbed approach

    Science.gov (United States)

    Baker, B.; Lee, T.; Buban, M.; Dumas, E. J.

    2017-12-01

    The development of small Unmanned Aerial System (sUAS) testbeds that can be used to validate, integrate, calibrate and evaluate new technology and sensors for routine boundary layer research, validation of operational weather models, improvement of model parameterizations, and recording observations within high-impact storms is important for understanding the importance and impact of using sUASs routinely as a new observing platform. The goal of the multi-testbed approach is to build a robust set of protocols to assess the cost and operational feasibility of unmanned observations for routine applications using various combinations of sUAS aircraft and sensors in different locations and field experiments. All of these observational testbeds serve different community needs, but they also use a diverse suite of methodologies for calibration and evaluation of different sensors and platforms for severe weather and boundary layer research. The primary focus will be to evaluate meteorological sensor payloads to measure thermodynamic parameters and define surface characteristics with visible, IR, and multi-spectral cameras. This evaluation will lead to recommendations for sensor payloads for VTOL and fixed-wing sUAS.

  4. Creative thinking of design and redesign on SEAT aircraft cabin testbed: a case study

    NARCIS (Netherlands)

    Tan, C.F.; Chen, W.; Rauterberg, G.W.M.

    2009-01-01

    In this paper, the intuition approach to the design and redesign of the environmentally friendly, innovative aircraft cabin simulator is presented. The aircraft cabin simulator is a testbed used for the European project SEAT (Smart tEchnologies for Stress free Air Travel). The SEAT project aims to

  5. Vacuum nuller testbed (VNT) performance, characterization and null control: progress report

    Science.gov (United States)

    Lyon, Richard G.; Clampin, Mark; Petrone, Peter; Mallik, Udayan; Madison, Timothy; Bolcar, Matthew R.; Noecker, M. Charley; Kendrick, Stephen; Helmbrecht, Michael

    2011-10-01

    Herein we report on the development, sensing and control and our first results with the Vacuum Nuller Testbed to realize a Visible Nulling Coronagraph (VNC) for exoplanet coronagraphy. The VNC is one of the few approaches that works with filled, segmented and sparse or diluted-aperture telescope systems. It thus spans a range of potential future NASA telescopes and could be flown as a separate instrument on such a future mission. NASA/Goddard Space Flight Center (GSFC) has a well-established effort to develop VNC technologies, and has developed an incremental sequence of VNC testbeds to advance this approach and the enabling technologies associated with it. We discuss the continued development of the vacuum Visible Nulling Coronagraph testbed (VNT). The VNT is an ultra-stable vibration isolated testbed that operates under closed-loop control within a vacuum chamber. It will be used to achieve an incremental sequence of three visible-light nulling milestones with sequentially higher contrasts of 10^8, 10^9, and ideally 10^10 at an inner working angle of 2*λ/D. The VNT is based on a modified Mach-Zehnder nulling interferometer, with a "W" configuration to accommodate a hex-packed MEMS based deformable mirror, a coherent fiber bundle and achromatic phase shifters. We discuss the initial laboratory results, the optical configuration, critical technologies and the null sensing and control approach.

  6. An adaptable, low cost test-bed for unmanned vehicle systems research

    Science.gov (United States)

    Goppert, James M.

    2011-12-01

    An unmanned vehicle systems test-bed has been developed. The test-bed has been designed to accommodate hardware changes and various vehicle types and algorithms. The creation of this test-bed allows research teams to focus on algorithm development and employ a common well-tested experimental framework. The ArduPilotOne autopilot was developed to provide the necessary level of abstraction for multiple vehicle types. The autopilot was also designed to be highly integrated with the Mavlink protocol for Micro Air Vehicle (MAV) communication. Mavlink is the native protocol for QGroundControl, a MAV ground control program. Features were added to QGroundControl to accommodate outdoor usage. Next, the Mavsim toolbox was developed for Scicoslab to allow hardware-in-the-loop testing, control design and analysis, and estimation algorithm testing and verification. In order to obtain linear models of aircraft dynamics, the JSBSim flight dynamics engine was extended to use a probabilistic Nelder-Mead simplex method. The JSBSim aircraft dynamics were compared with wind-tunnel data collected. Finally, a structured methodology for successive loop closure control design is proposed. This methodology is demonstrated along with the rest of the test-bed tools on a quadrotor, a fixed wing RC plane, and a ground vehicle. Test results for the ground vehicle are presented.
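
    The successive loop closure idea mentioned above closes a fast inner loop first and then wraps a slower outer loop around it; the sketch below shows the pattern for a single roll axis with placeholder gains, not the gains or code from the thesis.

        class PID:
            """Minimal PID used to close one loop at a time."""
            def __init__(self, kp, ki=0.0, kd=0.0):
                self.kp, self.ki, self.kd = kp, ki, kd
                self.integral, self.prev_err = 0.0, 0.0

            def update(self, err, dt):
                self.integral += err * dt
                deriv = (err - self.prev_err) / dt
                self.prev_err = err
                return self.kp * err + self.ki * self.integral + self.kd * deriv

        rate_loop = PID(kp=0.08, ki=0.02)        # inner (roll-rate) loop, tuned first
        attitude_loop = PID(kp=4.0)              # outer (roll-angle) loop, closed around it

        def roll_controller(roll_cmd, roll, roll_rate, dt):
            rate_cmd = attitude_loop.update(roll_cmd - roll, dt)    # outer loop outputs a rate command
            return rate_loop.update(rate_cmd - roll_rate, dt)       # inner loop outputs aileron effort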

  7. Embedded Sensors and Controls to Improve Component Performance and Reliability -- Bench-scale Testbed Design Report

    Energy Technology Data Exchange (ETDEWEB)

    Melin, Alexander M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Kisner, Roger A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Drira, Anis [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Reed, Frederick K. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-09-01

    Embedded instrumentation and control systems that can operate in extreme environments are challenging due to restrictions on sensors and materials. As a part of the Department of Energy's Nuclear Energy Enabling Technology cross-cutting technology development program's Advanced Sensors and Instrumentation topic, this report details the design of a bench-scale embedded instrumentation and control testbed. The design goal of the bench-scale testbed is to build a re-configurable system that can rapidly deploy and test advanced control algorithms in a hardware-in-the-loop setup. The bench-scale testbed will be designed as a fluid pump analog that uses active magnetic bearings to support the shaft. The testbed represents an application that would improve the efficiency and performance of high temperature (700 °C) pumps for liquid salt reactors that operate in an extreme environment and provide many engineering challenges that can be overcome with embedded instrumentation and control. This report will give details of the mechanical design, electromagnetic design, geometry optimization, power electronics design, and initial control system design.

  8. Data Distribution Service-Based Interoperability Framework for Smart Grid Testbed Infrastructure

    Directory of Open Access Journals (Sweden)

    Tarek A. Youssef

    2016-03-01

    Full Text Available This paper presents the design and implementation of a communication and control infrastructure for smart grid operation. The proposed infrastructure enhances the reliability of the measurements and control network. The advantages of utilizing the data-centric over message-centric communication approach are discussed in the context of smart grid applications. The data distribution service (DDS) is used to implement a data-centric common data bus for the smart grid. This common data bus improves the communication reliability, enabling distributed control and smart load management. These enhancements are achieved by avoiding a single point of failure while enabling peer-to-peer communication and an automatic discovery feature for dynamic participating nodes. The infrastructure and ideas presented in this paper were implemented and tested on the smart grid testbed. A toolbox and application programming interface for the testbed infrastructure are developed in order to facilitate interoperability and remote access to the testbed. This interface allows experiments to be controlled, monitored, and performed remotely. Furthermore, it could be used to integrate multidisciplinary testbeds to study complex cyber-physical systems (CPS).
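
    The data-centric bus concept can be illustrated with a toy in-process publish/subscribe class; this is only a conceptual sketch of topic-based, many-to-many data sharing and deliberately does not use the real DDS API.

        from collections import defaultdict

        class DataBus:
            """Toy data-centric bus: writers publish topic samples, every subscriber to that topic reacts."""
            def __init__(self):
                self.subscribers = defaultdict(list)

            def subscribe(self, topic, callback):
                self.subscribers[topic].append(callback)

            def publish(self, topic, sample):
                for callback in self.subscribers[topic]:
                    callback(sample)

        bus = DataBus()
        bus.subscribe("feeder/voltage", lambda v: print("controller sees", v))
        bus.subscribe("feeder/voltage", lambda v: print("logger sees", v))
        bus.publish("feeder/voltage", 0.98)      # every interested node receives the same sample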

  9. A Matlab-Based Testbed for Integration, Evaluation and Comparison of Heterogeneous Stereo Vision Matching Algorithms

    Directory of Open Access Journals (Sweden)

    Raul Correal

    2016-11-01

    Full Text Available Stereo matching is a heavily researched area with a prolific published literature and a broad spectrum of heterogeneous algorithms available in diverse programming languages. This paper presents a Matlab-based testbed that aims to centralize and standardize this variety of both current and prospective stereo matching approaches. The proposed testbed aims to facilitate the application of stereo-based methods to real situations. It allows for configuring and executing algorithms, as well as comparing results, in a fast, easy and friendly setting. Algorithms can be combined so that a series of processes can be chained and executed consecutively, using the output of a process as input for the next; some additional filtering and image processing techniques have been included within the testbed for this purpose. A use case is included to illustrate how these processes are sequenced and its effect on the results for real applications. The testbed has been conceived as a collaborative and incremental open-source project, where its code is accessible and modifiable, with the objective of receiving contributions and releasing future versions to include new algorithms and features. It is currently available online for the research community.
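
    The chaining of processes described above amounts to a simple pipeline in which each stage consumes the previous stage's output; the sketch below (in Python rather than Matlab, with hypothetical stage names) shows the pattern.

        def run_pipeline(stereo_pair, stages):
            """Run stereo-processing stages in sequence; each stage consumes the previous output."""
            result = stereo_pair                     # e.g. a (left_image, right_image) tuple
            for stage in stages:
                result = stage(result)
            return result

        # Hypothetical stages: a matcher followed by two post-filters
        # disparity = run_pipeline((left_img, right_img),
        #                          [block_matcher, median_filter, left_right_consistency_check])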

  10. Human Exploration Spacecraft Testbed for Integration and Advancement (HESTIA)

    Science.gov (United States)

    Banker, Brian F.; Robinson, Travis

    2016-01-01

    The proposed paper will cover an ongoing effort named HESTIA (Human Exploration Spacecraft Testbed for Integration and Advancement), led at the National Aeronautics and Space Administration (NASA) Johnson Space Center (JSC) to promote a cross-subsystem approach to developing Mars-enabling technologies with the ultimate goal of integrated system optimization. HESTIA also aims to develop the infrastructure required to rapidly test these highly integrated systems at a low cost. The initial focus is on the common fluids architecture required to enable human exploration of Mars, specifically between life support and in-situ resource utilization (ISRU) subsystems. An overview of the advancements in integrated technologies, in infrastructure, in simulation, and in modeling capabilities will be presented, as well as the results and findings of integrated testing. Due to the enormous mass gear-ratio required for human exploration beyond low-Earth orbit (for every 1 kg of payload landed on Mars, 226 kg will be required on Earth), minimization of surface hardware and commodities is paramount. Hardware requirements can be minimized by reducing equipment that performs similar functions, albeit for different subsystems. If hardware could be developed which meets the requirements of both life support and ISRU, it could result in the reduction of primary hardware and/or reduction in spares. Minimization of commodities to the surface of Mars can be achieved through the creation of higher-efficiency systems producing little to no undesired waste, such as a closed-loop life support subsystem. Where complete efficiency is impossible or impractical, makeup commodities could be manufactured via ISRU. Although utilization of ISRU products (oxygen and water) for crew consumption holds great promise of reducing demands on life support hardware, there exist concerns as to the purity and transportation of commodities. To date, ISRU has been focused on production rates and purities for

  11. Both Automation and Paper.

    Science.gov (United States)

    Purcell, Royal

    1988-01-01

    Discusses the concept of a paperless society and the current situation in library automation. Various applications of automation and telecommunications are addressed, and future library automation is considered. Automation at the Monroe County Public Library in Bloomington, Indiana, is described as an example. (MES)

  12. A Method to Analyze Threats and Vulnerabilities by Using a Cyber Security Test-bed of an Operating NPP

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yong Sik; Son, Choul Woong; Lee, Soo Ill [KHNP CRI, Daejeon (Korea, Republic of)

    2016-10-15

    In order to implement cyber security controls for an Operating NPP, a security assessment should be conducted in advance, and it is essential to analyze threats and vulnerabilities for the cyber security risk assessment phase. It might be impossible to perform a penetration test or scanning for a vulnerability analysis because the test may cause adverse effects on the inherent functions of the systems. This is the reason why we develop and construct a cyber security test-bed instead of using real I and C systems in the operating NPP. In this paper, we propose a method to analyze threats and vulnerabilities of a specific target system by using a cyber security test-bed. The test-bed is being developed considering essential functions of the selected safety and non-safety systems. This paper shows the method to analyze threats and vulnerabilities of a specific target system by using a cyber security test-bed. In order to develop the cyber security test-bed with both safety and non-safety functions, a test-bed functions analysis and a preliminary identification of threats and vulnerabilities have been conducted. We will determine the attack scenarios and conduct the test-bed based vulnerability analysis.

  13. A Method to Analyze Threats and Vulnerabilities by Using a Cyber Security Test-bed of an Operating NPP

    International Nuclear Information System (INIS)

    Kim, Yong Sik; Son, Choul Woong; Lee, Soo Ill

    2016-01-01

    In order to implement cyber security controls for an Operating NPP, a security assessment should be conducted in advance, and it is essential to analyze threats and vulnerabilities for the cyber security risk assessment phase. It might be impossible to perform a penetration test or scanning for a vulnerability analysis because the test may cause adverse effects on the inherent functions of the systems. This is the reason why we develop and construct a cyber security test-bed instead of using real I and C systems in the operating NPP. In this paper, we propose a method to analyze threats and vulnerabilities of a specific target system by using a cyber security test-bed. The test-bed is being developed considering essential functions of the selected safety and non-safety systems. This paper shows the method to analyze threats and vulnerabilities of a specific target system by using a cyber security test-bed. In order to develop the cyber security test-bed with both safety and non-safety functions, a test-bed functions analysis and a preliminary identification of threats and vulnerabilities have been conducted. We will determine the attack scenarios and conduct the test-bed based vulnerability analysis.

  14. Embedded Sensors and Controls to Improve Component Performance and Reliability -- Loop-scale Testbed Design Report

    Energy Technology Data Exchange (ETDEWEB)

    Melin, Alexander M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Kisner, Roger A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-09-01

    Embedded instrumentation and control systems that can operate in extreme environments are challenging to design and operate. Extreme environments limit the options for sensors and actuators and degrade their performance. Because sensors and actuators are necessary for feedback control, these limitations mean that designing embedded instrumentation and control systems for the challenging environments of nuclear reactors requires advanced technical solutions that are not available commercially. This report details the development of a testbed that will be used for cross-cutting embedded instrumentation and control research for nuclear power applications. This research is funded by the Department of Energy's Nuclear Energy Enabling Technology program's Advanced Sensors and Instrumentation topic. The design goal of the loop-scale testbed is to build a low-temperature pump that utilizes magnetic bearings and that will be incorporated into a water loop to test control system performance and self-sensing techniques. Specifically, this testbed will be used to analyze control system performance in response to nonlinear and cross-coupling fluid effects between the shaft axes of motion, rotordynamics and gyroscopic effects, and impeller disturbances. This testbed will also be used to characterize the performance losses when using self-sensing position measurement techniques. Active magnetic bearings are a technology that can reduce failures and maintenance costs in nuclear power plants. They are particularly relevant to liquid salt reactors that operate at high temperatures (700 C). Pumps used in the extreme environment of liquid salt reactors provide many engineering challenges that can be overcome with magnetic bearings and their associated embedded instrumentation and control. This report will give details of the mechanical design and electromagnetic design of the loop-scale embedded instrumentation and control testbed.
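
    As a purely illustrative aside (not taken from the report), the sketch below shows the kind of feedback loop an active magnetic bearing depends on: a discrete PD position controller stabilizing a one-axis plant with negative stiffness, which is unstable without feedback. All parameter values are invented for the demonstration.

        # 1-DOF magnetic-bearing-like plant: m*x'' = ks*x + ki*u (open-loop unstable).
        # All coefficients below are made up, not from the ORNL testbed.
        m, ks, ki = 1.0, 4.0e4, 50.0      # rotor mass [kg], negative-stiffness coeff, force gain
        kp, kd = 2.0e3, 20.0              # PD feedback gains (chosen so the loop is stable)
        dt, x, v = 1.0e-4, 1.0e-4, 0.0    # time step [s], initial offset [m], velocity [m/s]

        for _ in range(5000):             # simulate 0.5 s
            u = -(kp * x + kd * v)        # control effort from position/velocity feedback
            a = (ks * x + ki * u) / m     # plant acceleration
            v += a * dt                   # semi-implicit Euler integration
            x += v * dt

        print(f"displacement after 0.5 s: {x:.2e} m")   # decays toward zero with these gains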

  15. Analysis, modeling, and simulation (AMS) testbed development and evaluation to support dynamic mobility applications (DMA) and active transportation and demand management (ATDM) programs - evaluation summary for the San Diego testbed

    Science.gov (United States)

    2017-08-01

    The primary objective of this project is to develop multiple simulation testbeds and transportation models to evaluate the impacts of Connected Vehicle Dynamic Mobility Applications (DMA) and Active Transportation and Demand Management (ATDM) strateg...

  16. Analysis, Modeling, and Simulation (AMS) Testbed Development and Evaluation to Support Dynamic Mobility Applications (DMA) and Active Transportation and Demand Management (ATDM) Programs : Evaluation Report for the San Diego Testbed : Draft Report.

    Science.gov (United States)

    2017-07-01

    The primary objective of this project is to develop multiple simulation testbeds and transportation models to evaluate the impacts of Connected Vehicle Dynamic Mobility Applications (DMA) and Active Transportation and Demand Management (ATDM) strateg...

  17. Analysis, Modeling, and Simulation (AMS) Testbed Development and Evaluation to Support Dynamic Mobility Applications (DMA) and Active Transportation and Demand Management (ATDM) Programs - Evaluation Report for the San Diego Testbed

    Science.gov (United States)

    2017-07-01

    The primary objective of this project is to develop multiple simulation testbeds and transportation models to evaluate the impacts of Connected Vehicle Dynamic Mobility Applications (DMA) and Active Transportation and Demand Management (ATDM) strateg...

  18. Development and verification testing of automation and robotics for assembly of space structures

    Science.gov (United States)

    Rhodes, Marvin D.; Will, Ralph W.; Quach, Cuong C.

    1993-01-01

    A program was initiated within the past several years to develop operational procedures for automated assembly of truss structures suitable for large-aperture antennas. The assembly operations require the use of a robotic manipulator and are based on the principle of supervised autonomy to minimize crew resources. A hardware testbed was established to support development and evaluation testing. A brute-force automation approach was used to develop the baseline assembly hardware and software techniques. As the system matured and an operation was proven, upgrades were incorporated and assessed against the baseline test results. This paper summarizes the developmental phases of the program, the results of several assembly tests, the current status, and a series of proposed developments for additional hardware and software control capability. No problems that would preclude automated in-space assembly of truss structures have been encountered. The current system was developed at a breadboard level and continued development at an enhanced level is warranted.

  19. Coral-based Proxy Records of Ocean Acidification: A Pilot Study at the Puerto Rico Test-bed Site

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Coral cores collected nearby the Atlantic Ocean Acidification Test-bed (AOAT) at La Parguera, Puerto Rico were used to characterize the relationship between...

  20. Genetic Algorithm Phase Retrieval for the Systematic Image-Based Optical Alignment Testbed

    Science.gov (United States)

    Taylor, Jaime; Rakoczy, John; Steincamp, James

    2003-01-01

    Phase retrieval requires calculation of the real-valued phase of the pupil function from the image intensity distribution and characteristics of an optical system. Genetic algorithms (GAs) were used to solve two one-dimensional phase retrieval problems. A GA successfully estimated the coefficients of a polynomial expansion of the phase when the number of coefficients was correctly specified. A GA also successfully estimated the multiple phases of a segmented optical system analogous to the seven-mirror Systematic Image-Based Optical Alignment (SIBOA) testbed located at NASA's Marshall Space Flight Center. The SIBOA testbed was developed to investigate phase retrieval techniques. Tip/tilt and piston motions of the mirrors accomplish phase corrections. A constant phase over each mirror can be achieved by an independent tip/tilt correction: the phase correction term can then be factored out of the Discrete Fourier Transform (DFT), greatly reducing computations.
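
    To make the approach concrete, here is a hedged, toy sketch (not the SIBOA code) of a genetic algorithm recovering polynomial phase coefficients from a one-dimensional image intensity; the pupil model, population settings, and coefficients are invented, and the usual phase-retrieval ambiguities still apply.

        # Toy GA phase-retrieval demo; pupil model and all parameters are invented.
        import numpy as np

        rng = np.random.default_rng(0)
        n = 64
        x = np.linspace(-1, 1, n)
        pupil = np.ones(n)                               # idealized 1-D pupil
        true_coeffs = np.array([0.5, -0.3, 0.2])         # phase = c1*x + c2*x^2 + c3*x^3

        def intensity(coeffs):
            phase = sum(c * x**(k + 1) for k, c in enumerate(coeffs))
            field = pupil * np.exp(1j * 2 * np.pi * phase)
            return np.abs(np.fft.fft(field, 4 * n))**2   # zero-padded image intensity

        target = intensity(true_coeffs)                  # "measured" image

        def fitness(coeffs):
            return -np.sum((intensity(coeffs) - target)**2)

        pop = rng.uniform(-1, 1, size=(60, 3))           # random initial population
        for _ in range(200):
            scores = np.array([fitness(ind) for ind in pop])
            parents = pop[np.argsort(scores)[-20:]]      # selection: keep the fittest
            pop = parents[rng.integers(0, 20, 60)] + rng.normal(0, 0.05, (60, 3))  # mutation
            pop[:5] = parents[-5:]                       # elitism: preserve the best unchanged

        best = pop[np.argmax([fitness(ind) for ind in pop])]
        print("estimated coefficients:", np.round(best, 2))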

  1. Phased Array Antenna Testbed Development at the NASA Glenn Research Center

    Science.gov (United States)

    Lambert, Kevin M.; Kubat, Gregory; Johnson, Sandra K.; Anzic, Godfrey

    2003-01-01

    Ideal phased array antennas offer advantages for communication systems, such as wide-angle scanning and multibeam operation, which can be utilized in certain NASA applications. However, physically realizable, electronically steered, phased array antennas introduce additional system performance parameters, which must be included in the evaluation of the system. The NASA Glenn Research Center (GRC) is currently conducting research to identify these parameters and to develop the tools necessary to measure them. One of these tools is a testbed where phased array antennas may be operated in an environment that simulates their use. This paper describes the development of the testbed and its use in characterizing a particular K-Band, phased array antenna.

  2. EPIC: A Testbed for Scientifically Rigorous Cyber-Physical Security Experimentation

    OpenAIRE

    SIATERLIS CHRISTOS; GENGE BELA; HOHENADEL MARC

    2013-01-01

    Recent malware, like Stuxnet and Flame, constitute a major threat to Networked Critical Infrastructures (NCIs), e.g., power plants. They revealed several vulnerabilities in today's NCIs, but most importantly they highlighted the lack of an efficient scientific approach to conduct experiments that measure the impact of cyber threats on both the physical and the cyber parts of NCIs. In this paper we present EPIC, a novel cyber-physical testbed and a modern scientific instrument that can pr...

  3. Cooperating expert systems for Space Station - Power/thermal subsystem testbeds

    Science.gov (United States)

    Wong, Carla M.; Weeks, David J.; Sundberg, Gale R.; Healey, Kathleen L.; Dominick, Jeffrey S.

    1988-01-01

    The Systems Autonomy Demonstration Project (SADP) is a NASA-sponsored series of increasingly complex demonstrations to show the benefits of integrating knowledge-based systems with conventional process control in real-time, real-world problem domains that can facilitate the operations and availability of major Space Station distributed systems. This paper describes the system design, objectives, approaches, and status of each of the testbed knowledge-based systems. Simplified schematics of the systems are shown.

  4. The end-to-end testbed of the optical metrology system on-board LISA Pathfinder

    Energy Technology Data Exchange (ETDEWEB)

    Steier, F; Cervantes, F Guzman; Marin, A F GarcIa; Heinzel, G; Danzmann, K [Max-Planck-Institut fuer Gravitationsphysik (Albert-Einstein-Institut) and Universitaet Hannover (Germany); Gerardi, D, E-mail: frank.steier@aei.mpg.d [EADS Astrium Satellites GmbH, Friedrichshafen (Germany)

    2009-05-07

    LISA Pathfinder is a technology demonstration mission for the Laser Interferometer Space Antenna (LISA). The main experiment on-board LISA Pathfinder is the so-called LISA Technology Package (LTP) which has the aim to measure the differential acceleration between two free-falling test masses with an accuracy of 3 x 10^-14 m s^-2 Hz^-1/2 between 1 mHz and 30 mHz. This measurement is performed interferometrically by the optical metrology system (OMS) on-board LISA Pathfinder. In this paper, we present the development of an experimental end-to-end testbed of the entire OMS. It includes the interferometer and its sub-units, the interferometer backend which is a phasemeter and the processing of the phasemeter output data. Furthermore, three-axes piezo-actuated mirrors are used instead of the free-falling test masses for the characterization of the dynamic behaviour of the system and some parts of the drag-free and attitude control system (DFACS) which controls the test masses and the satellite. The end-to-end testbed includes all parts of the LTP that can reasonably be tested on earth without free-falling test masses. At its present status it consists mainly of breadboard components. Some of those have already been replaced by engineering models of the LTP experiment. In the next steps, further engineering and flight models will also be inserted in this testbed and tested against well-characterized breadboard components. The presented testbed is an important reference for the unit tests and can also be used for validation of the on-board experiment during the mission.

  5. Development of an Experimental Testbed for Research in Lithium-Ion Battery Management Systems

    Directory of Open Access Journals (Sweden)

    Mehdi Ferdowsi

    2013-10-01

    Full Text Available Advanced electrochemical batteries are becoming an integral part of a wide range of applications from household and commercial to smart grid, transportation, and aerospace applications. Among different battery technologies, lithium-ion (Li-ion) batteries are growing more and more popular due to their high energy density, high galvanic potential, low self-discharge, low weight, and the fact that they have almost no memory effect. However, one of the main obstacles facing the widespread commercialization of Li-ion batteries is the design of reliable battery management systems (BMSs). An efficient BMS ensures electrical safety during operation, while increasing battery lifetime, capacity and thermal stability. Despite the need for extensive research in this field, the majority of research conducted on Li-ion battery packs and BMSs is proprietary work conducted by manufacturers. The available literature, however, provides either general descriptions or detailed analysis of individual components of the battery system, and neglects the details of overall system development. This paper addresses the development of an experimental research testbed for studying Li-ion batteries and their BMS design. The testbed can be configured in a variety of cell and pack architectures, allowing for a wide range of BMS monitoring, diagnostics, and control technologies to be tested and analyzed. General considerations that should be taken into account while designing Li-ion battery systems are reviewed and different technologies and challenges commonly encountered in Li-ion battery systems are investigated. This testbed facilitates future development of more practical and improved BMS technologies with the aim of increasing the safety, reliability, and efficiency of existing Li-ion battery systems. Experimental results of initial tests performed on the system are used to demonstrate some of the capabilities of the developed research testbed. To the authors
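
    As a loose illustration of the kind of monitoring and diagnostic functions such a testbed exercises (not the authors' design), the sketch below combines per-cell voltage limit checks with coulomb-counting state-of-charge estimation; the thresholds, capacity, and cell data are invented.

        # Invented cell data and limits, for illustration only.
        from dataclasses import dataclass

        @dataclass
        class Cell:
            voltage: float        # terminal voltage [V]
            soc: float            # state of charge, 0..1

        def update_soc(cell, current_a, dt_s, capacity_ah=2.5):
            """Coulomb counting: integrate current over time (discharge positive)."""
            cell.soc -= (current_a * dt_s / 3600.0) / capacity_ah
            cell.soc = min(max(cell.soc, 0.0), 1.0)

        def check_limits(cells, v_min=3.0, v_max=4.2):
            faults = []
            for i, c in enumerate(cells):
                if c.voltage < v_min:
                    faults.append(f"cell {i}: undervoltage {c.voltage:.2f} V")
                if c.voltage > v_max:
                    faults.append(f"cell {i}: overvoltage {c.voltage:.2f} V")
            return faults

        pack = [Cell(3.7, 0.80), Cell(3.6, 0.78), Cell(4.25, 0.95)]
        for cell in pack:
            update_soc(cell, current_a=1.0, dt_s=1.0)
        print(check_limits(pack))   # flags the 4.25 V cell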

  6. Static and dynamic optimization of CAPE problems using a Model Testbed

    DEFF Research Database (Denmark)

    This paper presents a new computer aided tool for setting up and solving CAPE related static and dynamic optimisation problems. The Model Testbed (MOT) offers an integrated environment for setting up and solving a very large range of CAPE problems, including complex optimisation problems...... and dynamic optimisation, and how interfacing of solvers and seamless information flow can lead to more efficient solution of process design problems....

  7. Implementation of a RPS Cyber Security Test-bed with Two PLCs

    International Nuclear Information System (INIS)

    Shin, Jinsoo; Heo, Gyunyoung; Son, Hanseong; An, Yongkyu; Rizwan, Uddin

    2015-01-01

    Our research team previously proposed a methodology to evaluate cyber security with a Bayesian network (BN) as a cyber security evaluation model and to help an operator, licensee, licensor or regulator in assigning evaluation priorities. The methodology allows for an overall evaluation of cyber security by considering the architectural aspect of the facility and the management aspect of cyber security at the same time. In order to strengthen the realism of this model by inserting true data, it is necessary to conduct a penetration test that simulates an actual cyber-attack. Through the collaboration with the University of Illinois at Urbana-Champaign, which possesses the Tricon, a safety programmable logic controller (PLC) used at nuclear power plants, and develops a test-bed for nuclear power plants, a test-bed for the reactor protection system (RPS) is being developed with these PLCs. Two PLCs are used to construct a simple test-bed for the RPS: a bi-stable processor (BP) and a coincidence processor (CP). By using two PLCs, it is possible to examine cyber-attacks against devices such as a PLC, cyber-attacks against communication between devices, and the effects of one PLC on the other PLC. Two PLCs were used to construct a test-bed for the penetration test in this study. The advantages of using two or more PLCs instead of a single PLC are as follows. 1) Results of cyber-attacks reflecting the characteristics of interactions among PLCs can be obtained. 2) Cyber-attacks can be attempted using a method of attacking the communication between PLCs. The true data obtained can be applied to the existing cyber security evaluation model to strengthen the realism of the model.
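
    Purely as a numerical illustration of how penetration-test data can feed such an evaluation model (this is not the authors' Bayesian network, and all probabilities below are invented), the sketch compares an overall breach probability computed from assumed attack-success rates before and after they are replaced by test-bed observations.

        # Toy probabilistic combination over independent attack paths, not a full BN.
        priors = {"network_attack": 0.3, "device_attack": 0.2}                    # attempt likelihoods (invented)
        success_before_test = {"network_attack": 0.10, "device_attack": 0.05}     # assumed success rates
        success_from_testbed = {"network_attack": 0.25, "device_attack": 0.02}    # rates observed on the test-bed

        def breach_probability(success):
            p_no_breach = 1.0
            for path, p_attempt in priors.items():
                p_no_breach *= 1.0 - p_attempt * success[path]   # assume independent paths
            return 1.0 - p_no_breach

        print("before test-bed data:", round(breach_probability(success_before_test), 4))
        print("after test-bed data: ", round(breach_probability(success_from_testbed), 4))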

  8. Implementation of a RPS Cyber Security Test-bed with Two PLCs

    Energy Technology Data Exchange (ETDEWEB)

    Shin, Jinsoo; Heo, Gyunyoung [Kyung Hee Univ., Yongin (Korea, Republic of); Son, Hanseong [Joongbu Univ., Geumsan (Korea, Republic of); An, Yongkyu; Rizwan, Uddin [University of Illinois at Urbana-Champaign, Urbana (United States)

    2015-10-15

    Our research team previously proposed a methodology to evaluate cyber security with a Bayesian network (BN) as a cyber security evaluation model and to help an operator, licensee, licensor or regulator in assigning evaluation priorities. The methodology allows for an overall evaluation of cyber security by considering the architectural aspect of the facility and the management aspect of cyber security at the same time. In order to strengthen the realism of this model by inserting true data, it is necessary to conduct a penetration test that simulates an actual cyber-attack. Through the collaboration with the University of Illinois at Urbana-Champaign, which possesses the Tricon, a safety programmable logic controller (PLC) used at nuclear power plants, and develops a test-bed for nuclear power plants, a test-bed for the reactor protection system (RPS) is being developed with these PLCs. Two PLCs are used to construct a simple test-bed for the RPS: a bi-stable processor (BP) and a coincidence processor (CP). By using two PLCs, it is possible to examine cyber-attacks against devices such as a PLC, cyber-attacks against communication between devices, and the effects of one PLC on the other PLC. Two PLCs were used to construct a test-bed for the penetration test in this study. The advantages of using two or more PLCs instead of a single PLC are as follows. 1) Results of cyber-attacks reflecting the characteristics of interactions among PLCs can be obtained. 2) Cyber-attacks can be attempted using a method of attacking the communication between PLCs. The true data obtained can be applied to the existing cyber security evaluation model to strengthen the realism of the model.

  9. Implementation of Real-Time Feedback Flow Control Algorithms on a Canonical Testbed

    Science.gov (United States)

    Tian, Ye; Song, Qi; Cattafesta, Louis

    2005-01-01

    This report summarizes the activities on "Implementation of Real-Time Feedback Flow Control Algorithms on a Canonical Testbed." The work summarized consists primarily of two parts. The first part summarizes our previous work and the extensions to adaptive ID and control algorithms. The second part concentrates on the validation of adaptive algorithms by applying them to a vibration beam test bed. Extensions to flow control problems are discussed.
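
    The record does not spell out the adaptive algorithms used; as a hedged stand-in, the sketch below shows a classic LMS adaptive FIR filter of the general family referenced, cancelling a narrowband disturbance on a hypothetical beam from a measurable reference signal. Signals, tap count, and step size are invented.

        # LMS adaptive-filter sketch; all signals and tuning values are invented.
        import numpy as np

        rng = np.random.default_rng(1)
        n, taps, mu = 4000, 8, 0.01
        t = np.arange(n)
        reference = np.sin(2 * np.pi * 0.05 * t)                 # measurable disturbance source
        disturbance = 0.8 * np.sin(2 * np.pi * 0.05 * t + 0.6)   # what actually reaches the beam
        w = np.zeros(taps)
        residual = np.zeros(n)

        for k in range(taps, n):
            xvec = reference[k - taps:k][::-1]   # most recent reference samples
            y = w @ xvec                         # filter output (cancellation effort)
            e = disturbance[k] - y               # residual vibration
            w += 2 * mu * e * xvec               # LMS weight update
            residual[k] = e

        print("residual power, first vs last 500 samples:",
              float(np.mean(residual[taps:taps + 500]**2)),
              float(np.mean(residual[-500:]**2)))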

  10. TESTING THE APODIZED PUPIL LYOT CORONAGRAPH ON THE LABORATORY FOR ADAPTIVE OPTICS EXTREME ADAPTIVE OPTICS TESTBED

    International Nuclear Information System (INIS)

    Thomas, Sandrine J.; Dillon, Daren; Gavel, Donald; Soummer, Remi; Macintosh, Bruce; Sivaramakrishnan, Anand

    2011-01-01

    We present testbed results of the Apodized Pupil Lyot Coronagraph (APLC) at the Laboratory for Adaptive Optics (LAO). These results are part of the validation and tests of the coronagraph and of the Extreme Adaptive Optics (ExAO) for the Gemini Planet Imager (GPI). The apodizer component is manufactured with a halftone technique using black chrome microdots on glass. Testing this APLC (like any other coronagraph) requires extremely good wavefront correction, which is obtained to the 1 nm rms level using microelectromechanical systems (MEMS) technology, on the ExAO visible testbed of the LAO at the University of California, Santa Cruz. We used an APLC coronagraph without central obstruction, both with a reference super-polished flat mirror and with the MEMS, to obtain one of the first images of a dark zone in a coronagraphic image with classical adaptive optics using a MEMS deformable mirror (without involving dark hole algorithms). This was done as a complementary test to the GPI coronagraph testbed at the American Museum of Natural History, which studied the coronagraph itself without wavefront correction. Because we needed a full aperture, the coronagraph design is very different from the GPI design. We also tested a coronagraph with a central obstruction similar to that of GPI. We investigated the performance of the APLC coronagraph and more particularly the effect of the apodizer profile accuracy on the contrast. Finally, we compared the resulting contrast to predictions made with a wavefront propagation model of the testbed to understand the effects of phase and amplitude errors on the final contrast.

  11. Testbed diversity as a fundamental principle for effective ICS security research

    OpenAIRE

    Green, Benjamin; Frey, Sylvain Andre Francis; Rashid, Awais; Hutchison, David

    2016-01-01

    The implementation of diversity in testbeds is essential to understanding and improving the security and resilience of Industrial Control Systems (ICS). Employing a wide spectrum of equipment, diverse networks, and business processes, as deployed in real-life infrastructures, is particularly difficult in experimental conditions. However, this level of diversity is key from a security perspective, as attackers can exploit system particularities and process intricacies to their advantage....

  12. An automated swimming respirometer

    DEFF Research Database (Denmark)

    STEFFENSEN, JF; JOHANSEN, K; BUSHNELL, PG

    1984-01-01

    An automated respirometer is described that can be used for computerized respirometry of trout and sharks.

  13. Autonomy and Automation

    Science.gov (United States)

    Shively, Jay

    2017-01-01

    A significant level of debate and confusion has surrounded the meaning of the terms autonomy and automation. Automation is a multi-dimensional concept, and we propose that Remotely Piloted Aircraft Systems (RPAS) automation should be described with reference to the specific system and task that has been automated, the context in which the automation functions, and other relevant dimensions. In this paper, we present definitions of automation, pilot in the loop, pilot on the loop and pilot out of the loop. We further propose that in future, the International Civil Aviation Organization (ICAO) RPAS Panel avoids the use of the terms autonomy and autonomous when referring to automated systems on board RPA. Work Group 7 proposes to develop, in consultation with other workgroups, a taxonomy of Levels of Automation for RPAS.

  14. Configuration Management Automation (CMA) -

    Data.gov (United States)

    Department of Transportation — Configuration Management Automation (CMA) will provide an automated, integrated enterprise solution to support CM of FAA NAS and Non-NAS assets and investments. CMA...

  15. High Contrast Vacuum Nuller Testbed (VNT) Contrast, Performance and Null Control

    Science.gov (United States)

    Lyon, Richard G.; Clampin, Mark; Petrone, Peter; Mallik, Udayan; Madison, Timothy; Bolcar, Matthew R.

    2012-01-01

    Herein we report on our Visible Nulling Coronagraph high-contrast result of 10^9 contrast averaged over a focal plane region extending from 1-4 λ/D with the Vacuum Nuller Testbed (VNT) in a vibration isolated vacuum chamber. The VNC is a hybrid interferometric/coronagraphic approach for exoplanet science. It operates with high Lyot stop efficiency for filled, segmented and sparse or diluted-aperture telescopes, thereby spanning the range of potential future NASA flight telescopes. NASA/Goddard Space Flight Center (GSFC) has a well-established effort to develop the VNC and its technologies, and has developed an incremental sequence of VNC testbeds to advance this approach and its enabling technologies. These testbeds have enabled advancement of high-contrast, visible light, nulling interferometry to unprecedented levels. The VNC is based on a modified Mach-Zehnder nulling interferometer, with a "W" configuration to accommodate a hex-packed MEMS based deformable mirror, a coherent fiber bundle and achromatic phase shifters. We give an overview of the VNT and discuss the high-contrast laboratory results, the optical configuration, critical technologies and null sensing and control.

  16. Large Scale Data Mining to Improve Usability of Data: An Intelligent Archive Testbed

    Science.gov (United States)

    Ramapriyan, Hampapuram; Isaac, David; Yang, Wenli; Morse, Steve

    2005-01-01

    Research in certain scientific disciplines - including Earth science, particle physics, and astrophysics - continually faces the challenge that the volume of data needed to perform valid scientific research can at times overwhelm even a sizable research community. The desire to improve utilization of this data gave rise to the Intelligent Archives project, which seeks to make data archives active participants in a knowledge building system capable of discovering events or patterns that represent new information or knowledge. Data mining can automatically discover patterns and events, but it is generally viewed as unsuited for large-scale use in disciplines like Earth science that routinely involve very high data volumes. Dozens of research projects have shown promising uses of data mining in Earth science, but all of these are based on experiments with data subsets of a few gigabytes or less, rather than the terabytes or petabytes typically encountered in operational systems. To bridge this gap, the Intelligent Archives project is establishing a testbed with the goal of demonstrating the use of data mining techniques in an operationally-relevant environment. This paper discusses the goals of the testbed and the design choices surrounding critical issues that arose during testbed implementation.

  17. Variable Coding and Modulation Experiment Using NASA's Space Communication and Navigation Testbed

    Science.gov (United States)

    Downey, Joseph A.; Mortensen, Dale J.; Evans, Michael A.; Tollis, Nicholas S.

    2016-01-01

    National Aeronautics and Space Administration (NASA)'s Space Communication and Navigation Testbed on the International Space Station provides a unique opportunity to evaluate advanced communication techniques in an operational system. The experimental nature of the Testbed allows for rapid demonstrations while using flight hardware in a deployed system within NASA's networks. One example is variable coding and modulation, which is a method to increase data-throughput in a communication link. This paper describes recent flight testing with variable coding and modulation over S-band using a direct-to-earth link between the SCaN Testbed and the Glenn Research Center. The testing leverages the established Digital Video Broadcasting Second Generation (DVB-S2) standard to provide various modulation and coding options. The experiment was conducted in a challenging environment due to the multipath and shadowing caused by the International Space Station structure. Performance of the variable coding and modulation system is evaluated and compared to the capacity of the link, as well as standard NASA waveforms.
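
    As a schematic illustration of the variable coding and modulation concept (not the experiment's actual tables; the thresholds and efficiencies below are placeholders rather than DVB-S2 specification values), a link controller can pick the densest usable MODCOD for the current Es/N0 and fall back to the most robust one otherwise.

        # Placeholder MODCOD table; values are invented for illustration.
        modcods = [
            # (name, required Es/N0 [dB], spectral efficiency [bit/s/Hz])
            ("QPSK 1/2",    1.0, 0.99),
            ("QPSK 3/4",    4.0, 1.49),
            ("8PSK 2/3",    6.6, 1.98),
            ("8PSK 5/6",    9.4, 2.48),
            ("16APSK 3/4", 10.2, 2.97),
        ]

        def select_modcod(esn0_db, margin_db=1.0):
            """Return the most efficient MODCOD whose threshold (plus margin) is met."""
            usable = [m for m in modcods if m[1] + margin_db <= esn0_db]
            return max(usable, key=lambda m: m[2]) if usable else modcods[0]

        for esn0 in (2.0, 7.5, 12.0):   # e.g. link improving as multipath/shadowing clears
            name, threshold, efficiency = select_modcod(esn0)
            print(f"Es/N0 {esn0:4.1f} dB -> {name}, {efficiency:.2f} bit/s/Hz")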

  18. James Webb Space Telescope Optical Simulation Testbed: Segmented Mirror Phase Retrieval Testing

    Science.gov (United States)

    Laginja, Iva; Egron, Sylvain; Brady, Greg; Soummer, Remi; Lajoie, Charles-Philippe; Bonnefois, Aurélie; Long, Joseph; Michau, Vincent; Choquet, Elodie; Ferrari, Marc; Leboulleux, Lucie; Mazoyer, Johan; N’Diaye, Mamadou; Perrin, Marshall; Petrone, Peter; Pueyo, Laurent; Sivaramakrishnan, Anand

    2018-01-01

    The James Webb Space Telescope (JWST) Optical Simulation Testbed (JOST) is a hardware simulator designed to produce JWST-like images. A model of the JWST three-mirror anastigmat is realized with three lenses in the form of a Cooke triplet, which provides JWST-like optical quality over a field equivalent to a NIRCam module, and an Iris AO segmented mirror with hexagonal elements stands in for the JWST segmented primary. This setup successfully produces images extremely similar to NIRCam images from cryotesting in terms of the PSF morphology and sampling relative to the diffraction limit. The testbed is used for staff training of the wavefront sensing and control (WFS&C) team and for independent analysis of WFS&C scenarios of the JWST. Algorithms like geometric phase retrieval (GPR) that may be used in flight and potential upgrades to JWST WFS&C will be explored. We report on the current status of the testbed after alignment, implementation of the segmented mirror, and testing of phase retrieval techniques. This optical bench complements other work at the Makidon laboratory at the Space Telescope Science Institute, including the investigation of coronagraphy for segmented aperture telescopes. Beyond JWST we intend to use JOST for WFS&C studies for future large segmented space telescopes such as LUVOIR.

  19. Autonomous power expert fault diagnostic system for Space Station Freedom electrical power system testbed

    Science.gov (United States)

    Truong, Long V.; Walters, Jerry L.; Roth, Mary Ellen; Quinn, Todd M.; Krawczonek, Walter M.

    1990-01-01

    The goal of the Autonomous Power System (APS) program is to develop and apply intelligent problem solving and control to the Space Station Freedom Electrical Power System (SSF/EPS) testbed being developed and demonstrated at NASA Lewis Research Center. The objectives of the program are to establish artificial intelligence technology paths, to craft knowledge-based tools with advanced human-operator interfaces for power systems, and to interface and integrate knowledge-based systems with conventional controllers. The Autonomous Power EXpert (APEX) portion of the APS program will integrate a knowledge-based fault diagnostic system and a power resource planner-scheduler. Then APEX will interface on-line with the SSF/EPS testbed and its Power Management Controller (PMC). The key tasks include establishing knowledge bases for system diagnostics, fault detection and isolation analysis, on-line information accessing through PMC, enhanced data management, and multiple-level, object-oriented operator displays. The first prototype of the diagnostic expert system for fault detection and isolation has been developed. The knowledge bases and the rule-based model that were developed for the Power Distribution Control Unit subsystem of the SSF/EPS testbed are described. A corresponding troubleshooting technique is also described.
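
    As a minimal, invented sketch of what a rule-based fault detection and isolation pass over power-system telemetry can look like (not the APEX knowledge base), the snippet below evaluates a few hand-written rules against a snapshot of hypothetical sensor values.

        # Hypothetical telemetry snapshot and rules, for illustration only.
        telemetry = {"bus_voltage": 118.0, "load_current": 0.1,
                     "switch_cmd": "closed", "switch_status": "open"}

        rules = [
            ("undervoltage on bus",
             lambda t: t["bus_voltage"] < 110.0),
            ("switch failed to close",
             lambda t: t["switch_cmd"] == "closed" and t["switch_status"] == "open"),
            ("load disconnected or tripped",
             lambda t: t["switch_status"] == "closed" and t["load_current"] < 0.05),
        ]

        faults = [name for name, condition in rules if condition(telemetry)]
        print("diagnosed faults:", faults or "none")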

  20. Real-Time Simulation and Hardware-in-the-Loop Testbed for Distribution Synchrophasor Applications

    Directory of Open Access Journals (Sweden)

    Matthias Stifter

    2018-04-01

    Full Text Available With the advent of Distribution Phasor Measurement Units (D-PMUs) and Micro-Synchrophasors (Micro-PMUs), the situational awareness in power distribution systems is going to the next level using time-synchronization. However, designing, analyzing, and testing such accurate measurement devices is still challenging. Due to the lack of available knowledge and sufficient history for synchrophasor applications at the power distribution level, realistic simulation and validation environments are essential for D-PMU development and deployment. This paper presents a vendor-agnostic PMU real-time simulation and hardware-in-the-loop (PMU-RTS-HIL) testbed, which helps in multiple-PMU validation and studies. The network of real and virtual PMUs was built in a fully time-synchronized environment for PMU applications' validation. The proposed testbed also includes an emulated communication network (CNS) layer to replicate the bandwidth, packet loss and collision conditions inherent to PMU data streams. Experimental results demonstrate the flexibility and scalability of the developed PMU-RTS-HIL testbed by producing large amounts of measurements under typical normal and abnormal distribution grid operation conditions.
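
    For intuition about the core synchrophasor computation a (micro-)PMU performs, here is a toy, invented example: a single-bin DFT over one time-stamped cycle of samples recovers the magnitude and phase of a nominal 60 Hz waveform. The sampling setup and signal are made up for the demonstration.

        # Invented sampling setup; not a standards-compliant PMU estimator.
        import numpy as np

        f0, fs = 60.0, 1920.0                       # nominal frequency, sample rate (32 samples/cycle)
        n = int(fs / f0)
        t = np.arange(n) / fs
        samples = 100.0 * np.cos(2 * np.pi * f0 * t + np.deg2rad(30.0))   # "measured" waveform

        # Single-bin DFT at the nominal frequency yields the phasor (peak magnitude, phase)
        phasor = (2.0 / n) * np.sum(samples * np.exp(-1j * 2 * np.pi * f0 * t))
        print(f"magnitude ~ {abs(phasor):.1f}, angle ~ {np.degrees(np.angle(phasor)):.1f} deg")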

  1. High contrast vacuum nuller testbed (VNT) contrast, performance, and null control

    Science.gov (United States)

    Lyon, Richard G.; Clampin, Mark; Petrone, Peter; Mallik, Udayan; Madison, Timothy; Bolcar, Matthew R.

    2012-09-01

    Herein we report on our Visible Nulling Coronagraph high-contrast result of 10^9 contrast averaged over a focal plane region extending from 1 - 4 λ/D with the Vacuum Nuller Testbed (VNT) in a vibration isolated vacuum chamber. The VNC is a hybrid interferometric/coronagraphic approach for exoplanet science. It operates with high Lyot stop efficiency for filled, segmented and sparse or diluted-aperture telescopes, thereby spanning the range of potential future NASA flight telescopes. NASA/Goddard Space Flight Center (GSFC) has a well-established effort to develop the VNC and its technologies, and has developed an incremental sequence of VNC testbeds to advance this approach and its enabling technologies. These testbeds have enabled advancement of high-contrast, visible light, nulling interferometry to unprecedented levels. The VNC is based on a modified Mach-Zehnder nulling interferometer, with a “W” configuration to accommodate a hex-packed MEMS based deformable mirror, a coherent fiber bundle and achromatic phase shifters. We give an overview of the VNT and discuss the high-contrast laboratory results, the optical configuration, critical technologies and null sensing and control.

  2. A Multi-Vehicles, Wireless Testbed for Networked Control, Communications and Computing

    Science.gov (United States)

    Murray, Richard; Doyle, John; Effros, Michelle; Hickey, Jason; Low, Steven

    2002-03-01

    We have constructed a testbed consisting of 4 mobile vehicles (with 4 additional vehicles being completed), each with embedded computing and communications capability, for use in testing new approaches for command and control across dynamic networks. The system is being used or is planned to be used for testing of a variety of communications-related technologies, including distributed command and control algorithms, dynamically reconfigurable network topologies, source coding for real-time transmission of data in lossy environments, and multi-network communications. A unique feature of the testbed is the use of vehicles that have second-order dynamics, requiring real-time feedback algorithms to stabilize the system while performing cooperative tasks. The testbed was constructed in the Caltech Vehicles Laboratory and consists of individual vehicles with PC-based computation and controls, and multiple communications devices (802.11 wireless Ethernet, Bluetooth, and infrared). The vehicles are freely moving, wheeled platforms propelled by high-performance ducted fans. The room contains access points for an 802.11 network, overhead visual sensing (to allow emulation of GPS signal processing), a centralized computer for emulating certain distributed computations, and network gateways to control and manipulate communications traffic.
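
    To illustrate why second-order vehicle dynamics demand continuous real-time feedback during cooperative tasks, here is a small, invented example (not the testbed's controllers): two double-integrator "vehicles" rendezvous by feeding back each other's broadcast position and velocity. Gains, dynamics, and initial conditions are made up.

        # Invented double-integrator rendezvous demo.
        import numpy as np

        dt, steps = 0.02, 1500
        kp, kd = 1.0, 1.6                # feedback gains on relative position/velocity
        pos = np.array([0.0, 5.0])       # initial positions of vehicle 0 and 1
        vel = np.array([0.0, 0.0])

        for _ in range(steps):
            # each vehicle accelerates toward its neighbor and damps relative velocity
            u0 = kp * (pos[1] - pos[0]) + kd * (vel[1] - vel[0])
            u1 = kp * (pos[0] - pos[1]) + kd * (vel[0] - vel[1])
            vel += np.array([u0, u1]) * dt
            pos += vel * dt

        print("final separation:", round(abs(pos[0] - pos[1]), 4))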

  3. User interface design principles for the SSM/PMAD automated power system

    International Nuclear Information System (INIS)

    Jakstas, L.M.; Myers, C.J.

    1991-01-01

    Computer-human interfaces are an integral part of developing software for spacecraft power systems. A well designed and efficient user interface enables an engineer to effectively operate the system, while it concurrently prevents the user from entering data which is beyond boundary conditions or performing operations which are out of context. A user interface should also be designed to ensure that the engineer easily obtains all useful and critical data for operating the system and is aware of all faults and states in the system. Martin Marietta, under contract to NASA George C. Marshall Space Flight Center, has developed a user interface for the Space Station Module Power Management and Distribution (SSM/PMAD) automated power system testbed which provides human access to the functionality of the power system, as well as exemplifying current techniques in user interface design. The testbed user interface was designed to enable an engineer to operate the system easily without having significant knowledge of computer systems, as well as provide an environment in which the engineer can monitor and interact with the SSM/PMAD system hardware. The design of the interface supports a global view of the most important data from the various hardware and software components, as well as enabling the user to obtain additional or more detailed data when needed. The components and representations of the SSM/PMAD testbed user interface are examined in this paper. An engineer's interactions with the system are also described

  4. Automated Predictive Big Data Analytics Using Ontology Based Semantics.

    Science.gov (United States)

    Nural, Mustafa V; Cotterell, Michael E; Peng, Hao; Xie, Rui; Ma, Ping; Miller, John A

    2015-10-01

    Predictive analytics in the big data era is taking on an ever increasingly important role. Issues related to choice on modeling technique, estimation procedure (or algorithm) and efficient execution can present significant challenges. For example, selection of appropriate and optimal models for big data analytics often requires careful investigation and considerable expertise which might not always be readily available. In this paper, we propose to use semantic technology to assist data analysts and data scientists in selecting appropriate modeling techniques and building specific models as well as the rationale for the techniques and models selected. To formally describe the modeling techniques, models and results, we developed the Analytics Ontology that supports inferencing for semi-automated model selection. The SCALATION framework, which currently supports over thirty modeling techniques for predictive big data analytics, is used as a testbed for evaluating the use of semantic technology.

  5. Automation in Clinical Microbiology

    Science.gov (United States)

    Ledeboer, Nathan A.

    2013-01-01

    Historically, the trend toward automation in clinical pathology laboratories has largely bypassed the clinical microbiology laboratory. In this article, we review the historical impediments to automation in the microbiology laboratory and offer insight into the reasons why we believe that we are on the cusp of a dramatic change that will sweep a wave of automation into clinical microbiology laboratories. We review the currently available specimen-processing instruments as well as the total laboratory automation solutions. Lastly, we outline the types of studies that will need to be performed to fully assess the benefits of automation in microbiology laboratories. PMID:23515547

  6. Demonstration of automated proximity and docking technologies

    Science.gov (United States)

    Anderson, Robert L.; Tsugawa, Roy K.; Bryan, Thomas C.

    An autodock was demonstrated using straightforward techniques and real sensor hardware. A simulation testbed was established and validated. The sensor design was refined with improved optical performance and image processing noise mitigation techniques, and the sensor is ready for production from off-the-shelf components. The autonomous spacecraft architecture is defined. The areas of sensors, docking hardware, propulsion, and avionics are included in the design. The Guidance, Navigation and Control architecture and requirements are developed. Modular structures suitable for automated control are used. The spacecraft system manager functions, including configuration, resource, and redundancy management, are defined. The requirements for an autonomous spacecraft executive are defined. High-level decision-making, mission planning, and mission contingency recovery are a part of this. The next step is to do flight demonstrations. After the presentation the following question was asked: How do you define validation? There are two components to the validation definition: software simulation with formal and rigorous validation, and hardware and facility performance validated with respect to software already validated against an analytical profile.

  7. Social media analytics and research testbed (SMART): Exploring spatiotemporal patterns of human dynamics with geo-targeted social media messages

    Directory of Open Access Journals (Sweden)

    Jiue-An Yang

    2016-06-01

    Full Text Available The multilevel model of meme diffusion conceptualizes how mediated messages diffuse over time and space. As a pilot application of implementing the meme diffusion, we developed the social media analytics and research testbed to monitor Twitter messages and track the diffusion of information in and across different cities and geographic regions. Social media analytics and research testbed is an online geo-targeted search and analytics tool, including an automatic data processing procedure at the backend and an interactive frontend user interface. Social media analytics and research testbed is initially designed to facilitate (1) searching and geo-locating tweet topics and terms in different cities and geographic regions; (2) filtering noise from raw data (such as removing redundant retweets and using machine learning methods to improve precision); (3) analyzing social media data from a spatiotemporal perspective; and (4) visualizing social media data in diagnostic ways (such as weekly and monthly trends, trend maps, top media, top retweets, top mentions, or top hashtags). Social media analytics and research testbed provides researchers and domain experts with a tool that can efficiently facilitate the refinement, formalization, and testing of research hypotheses or questions. Three case studies (flu outbreaks, Ebola epidemic, and marijuana legalization) are introduced to illustrate how the predictions of meme diffusion can be examined and to demonstrate the potentials and key functions of social media analytics and research testbed.
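
    As a tiny, made-up illustration of the geo-targeted search and noise-filtering steps listed above (not SMART's actual backend), the snippet below keeps keyword-matching, non-retweet messages that fall inside a city bounding box. The message records and coordinates are invented.

        # Invented message records and bounding box, for illustration only.
        messages = [
            {"text": "Flu shots available downtown", "lat": 32.72, "lon": -117.16, "retweet": False},
            {"text": "RT @user: flu season is here",  "lat": 32.73, "lon": -117.15, "retweet": True},
            {"text": "Great weather today",           "lat": 32.71, "lon": -117.17, "retweet": False},
        ]
        san_diego_bbox = (32.5, 33.1, -117.3, -116.9)   # (lat_min, lat_max, lon_min, lon_max)

        def geo_keyword_filter(msgs, keyword, bbox):
            lat_min, lat_max, lon_min, lon_max = bbox
            return [m for m in msgs
                    if not m["retweet"]                          # drop redundant retweets
                    and keyword.lower() in m["text"].lower()     # topic match
                    and lat_min <= m["lat"] <= lat_max
                    and lon_min <= m["lon"] <= lon_max]          # geo-targeting

        print(geo_keyword_filter(messages, "flu", san_diego_bbox))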

  8. The OGC Innovation Program Testbeds - Advancing Architectures for Earth and Systems

    Science.gov (United States)

    Bermudez, L. E.; Percivall, G.; Simonis, I.; Serich, S.

    2017-12-01

    The OGC Innovation Program provides a collaborative agile process for solving challenging science problems and advancing new technologies. Since 1999, 100 initiatives have taken place, from multi-million dollar testbeds to small interoperability experiments. During these initiatives, sponsors and technology implementers (including academia and the private sector) come together to solve problems, produce prototypes, develop demonstrations, provide best practices, and advance the future of standards. This presentation will provide the latest system architectures that can be used for Earth and space systems as a result of the OGC Testbed 13, including the following components: Elastic cloud autoscaler for Earth Observations (EO) using a WPS in an ESGF hybrid climate data research platform. Accessibility of climate data for scientist and non-scientist users via on-demand models wrapped in a WPS. Standards descriptions for containerized applications to discover processes on the cloud, including using linked data, a WPS extension for hybrid clouds and linking to hybrid big data stores. OpenID and OAuth to secure OGC services with built-in Attribute Based Access Control (ABAC) infrastructures leveraging GeoDRM patterns. Publishing and access of vector tiles, including use of compression and attribute options reusing patterns from WMS, WMTS and WFS. Servers providing 3D Tiles and streaming of data, including Indexed 3D Scene Layer (I3S), CityGML and Common DataBase (CDB). Asynchronous services with advanced push notification strategies, with a filter language instead of simple topic subscriptions, that can be used across OGC services. Testbed 14 will continue advancing topics like Big Data, security, and streaming, as well as making OGC services easier to use (e.g. RESTful APIs). The Call for Participation will be issued in December and responses are due in mid-January 2018.

  9. Automation systems for radioimmunoassay

    International Nuclear Information System (INIS)

    Yamasaki, Paul

    1974-01-01

    The application of automation systems for radioimmunoassay (RIA) was discussed. Automated systems could be useful in the second step of the four basic processes in the course of RIA, i.e., preparation of the sample for reaction. There were two types of instrumentation: a semi-automatic pipette, and a fully automated pipette station, both providing for fast and accurate dispensing of the reagent or for the diluting of the sample with reagent. Illustrations of the instruments were shown. (Mukohata, S.)

  10. Design of a nickel-hydrogen battery simulator for the NASA EOS testbed

    Science.gov (United States)

    Gur, Zvi; Mang, Xuesi; Patil, Ashok R.; Sable, Dan M.; Cho, Bo H.; Lee, Fred C.

    1992-01-01

    The hardware and software design of a nickel-hydrogen (Ni-H2) battery simulator (BS) with application to the NASA Earth Observation System (EOS) satellite is presented. The battery simulator is developed as a part of a complete testbed for the EOS satellite power system. The battery simulator involves both hardware and software components. The hardware component includes the capability of sourcing and sinking current at a constant programmable voltage. The software component includes the capability of monitoring the battery's ampere-hours (Ah) and programming the battery voltage according to an empirical model of the nickel-hydrogen battery stored in a computer.
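
    As a rough, invented sketch of the software behavior described (not the actual EOS testbed code), the loop below integrates ampere-hours and programs a terminal voltage from a placeholder empirical state-of-charge curve; the cell count, capacity, and curve are made up.

        # Placeholder empirical model and numbers, for illustration only.
        def empirical_voltage(soc):
            """Invented Ni-H2-like open-circuit curve, volts per cell."""
            return 1.15 + 0.25 * soc

        capacity_ah, soc = 50.0, 0.9
        dt_h = 1.0 / 3600.0                              # 1-second step in hours

        for _ in range(3600):                            # one hour of simulated discharge
            current_a = 10.0                             # positive = discharge
            soc -= current_a * dt_h / capacity_ah        # ampere-hour bookkeeping
            programmed_v = 22 * empirical_voltage(soc)   # hypothetical 22-cell battery stack

        print(f"SOC after 1 h: {soc:.3f}, programmed voltage: {programmed_v:.2f} V")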

  11. An ODMG-compatible testbed architecture for scalable management and analysis of physics data

    International Nuclear Information System (INIS)

    Malon, D.M.; May, E.N.

    1997-01-01

    This paper describes a testbed architecture for the investigation and development of scalable approaches to the management and analysis of massive amounts of high energy physics data. The architecture has two components: an interface layer that is compliant with a substantial subset of the ODMG-93 Version 1.2 specification, and a lightweight object persistence manager that provides flexible storage and retrieval services on a variety of single- and multi-level storage architectures, and on a range of parallel and distributed computing platforms

  12. Deployment of a Testbed in a Brazilian Research Network using IPv6 and Optical Access Technologies

    Science.gov (United States)

    Martins, Luciano; Ferramola Pozzuto, João; Olimpio Tognolli, João; Chaves, Niudomar Siqueira De A.; Reggiani, Atilio Eduardo; Hortêncio, Claudio Antonio

    2012-04-01

    This article presents the implementation of a testbed and the experimental results obtained with it on the Brazilian Experimental Network of the government-sponsored "GIGA Project." The use of IPv6 integrated with current and emerging optical architectures and technologies, such as dense wavelength division multiplexing and 10-gigabit Ethernet on the core and gigabit-capable passive optical network and optical distribution network on access, was tested. These protocols, architectures, and optical technologies are promising and part of a brand new worldwide technological scenario that is being widely adopted in enterprise and provider networks around the world.

  13. Test-bed Assessment of Communication Technologies for a Power-Balancing Controller

    DEFF Research Database (Denmark)

    Findrik, Mislav; Pedersen, Rasmus; Hasenleithner, Eduard

    2016-01-01

    Due to the growing need for sustainable energy, an increasing number of different renewable energy resources are being connected into distribution grids. In order to efficiently manage decentralized power generation units, the smart grid will rely on communication networks for information exchange and control. In this paper, we present a Smart Grid test-bed that integrates various communication technologies and deploys a power balancing controller for LV grids. Control performance of the introduced power balancing controller is subsequently investigated and its robustness to communication network cross...

  14. Laboratory Automation and Middleware.

    Science.gov (United States)

    Riben, Michael

    2015-06-01

    The practice of surgical pathology is under constant pressure to deliver the highest quality of service, reduce errors, increase throughput, and decrease turnaround time while at the same time dealing with an aging workforce, increasing financial constraints, and economic uncertainty. Although not able to implement total laboratory automation, great progress continues to be made in workstation automation in all areas of the pathology laboratory. This report highlights the benefits and challenges of pathology automation, reviews middleware and its use to facilitate automation, and reviews the progress so far in the anatomic pathology laboratory. Copyright © 2015 Elsevier Inc. All rights reserved.

  15. Automated cloning methods

    International Nuclear Information System (INIS)

    Collart, F.

    2001-01-01

    Argonne has developed a series of automated protocols to generate bacterial expression clones by using a robotic system designed to be used in procedures associated with molecular biology. The system provides plate storage, temperature control from 4 to 37 C at various locations, and Biomek and Multimek pipetting stations. The automated system consists of a robot that transports sources from the active station on the automation system. Protocols for the automated generation of bacterial expression clones can be grouped into three categories (Figure 1). Fragment generation protocols are initiated on day one of the expression cloning procedure and encompass those protocols involved in generating purified coding region (PCR)

  16. Complacency and Automation Bias in the Use of Imperfect Automation.

    Science.gov (United States)

    Wickens, Christopher D; Clegg, Benjamin A; Vieane, Alex Z; Sebok, Angelia L

    2015-08-01

    We examine the effects of two different kinds of decision-aiding automation errors on human-automation interaction (HAI), occurring at the first failure following repeated exposure to correctly functioning automation. The two errors are incorrect advice, triggering the automation bias, and missing advice, reflecting complacency. Contrasts between analogous automation errors in alerting systems, rather than decision aiding, have revealed that alerting false alarms are more problematic to HAI than alerting misses are. Prior research in decision aiding, although contrasting the two aiding errors (incorrect vs. missing), has confounded error expectancy. Participants performed an environmental process control simulation with and without decision aiding. For those with the aid, automation dependence was created through several trials of perfect aiding performance, and an unexpected automation error was then imposed in which automation was either gone (one group) or wrong (a second group). A control group received no automation support. The correct aid supported faster and more accurate diagnosis and lower workload. The aid failure degraded all three variables, but "automation wrong" had a much greater effect on accuracy, reflecting the automation bias, than did "automation gone," reflecting the impact of complacency. Some complacency was manifested for automation gone, by a longer latency and more modest reduction in accuracy. Automation wrong, creating the automation bias, appears to be a more problematic form of automation error than automation gone, reflecting complacency. Decision-aiding automation should indicate its lower degree of confidence in uncertain environments to avoid the automation bias. © 2015, Human Factors and Ergonomics Society.

  17. A methodology for automation and robotics evaluation applied to the space station telerobotic servicer

    Science.gov (United States)

    Smith, Jeffrey H.; Gyanfi, Max; Volkmer, Kent; Zimmerman, Wayne

    1988-01-01

    The efforts of a recent study aimed at identifying key issues and trade-offs associated with using a Flight Telerobotic Servicer (FTS) to aid in Space Station assembly-phase tasks is described. The use of automation and robotic (A and R) technologies for large space systems would involve a substitution of automation capabilities for human extravehicular or intravehicular activities (EVA, IVA). A methodology is presented that incorporates assessment of candidate assembly-phase tasks, telerobotic performance capabilities, development costs, and effect of operational constraints (space transportation system (STS), attached payload, and proximity operations). Changes in the region of cost-effectiveness are examined under a variety of systems design assumptions. A discussion of issues is presented with focus on three roles the FTS might serve: (1) as a research-oriented testbed to learn more about space usage of telerobotics; (2) as a research based testbed having an experimental demonstration orientation with limited assembly and servicing applications; or (3) as an operational system to augment EVA and to aid the construction of the Space Station and to reduce the programmatic (schedule) risk by increasing the flexibility of mission operations.

  18. On-wire lithography-generated molecule-based transport junctions: a new testbed for molecular electronics.

    Science.gov (United States)

    Chen, Xiaodong; Jeon, You-Moon; Jang, Jae-Won; Qin, Lidong; Huo, Fengwei; Wei, Wei; Mirkin, Chad A

    2008-07-02

    On-wire lithography (OWL) fabricated nanogaps are used as a new testbed to construct molecular transport junctions (MTJs) through the assembly of thiolated molecular wires across a nanogap formed between two Au electrodes. In addition, we show that one can use OWL to rapidly characterize a MTJ and optimize gap size for two molecular wires of different dimensions. Finally, we have used this new testbed to identify unusual temperature-dependent transport mechanisms for alpha,omega-dithiol terminated oligo(phenylene ethynylene).

  19. Design and construction of a 76m long-travel laser enclosure for a space occulter testbed

    Science.gov (United States)

    Galvin, Michael; Kim, Yunjong; Kasdin, N. Jeremy; Sirbu, Dan; Vanderbei, Robert; Echeverri, Dan; Sagolla, Giuseppe; Rousing, Andreas; Balasubramanian, Kunjithapatham; Ryan, Daniel; Shaklan, Stuart; Lisman, Doug

    2016-07-01

    Princeton University is upgrading its space occulter testbed. In particular, we are lengthening it to 76 m to achieve flight-like Fresnel numbers. This much longer testbed required an all-new enclosure design. In this design, we prioritized modularity and the use of commercial off-the-shelf (COTS) and semi-COTS components. Several of the technical challenges encountered included an unexpected slow beam drift and black paint selection. Herein we describe the design and construction of this long-travel laser enclosure.

  20. Development of Research Reactor Simulator and Its Application to Dynamic Test-bed

    International Nuclear Information System (INIS)

    Kwon, Kee Choon; Park, Jae Chang; Lee, Seung Wook; Bang, Dane; Bae, Sung Won

    2014-01-01

    We developed HANARO and the Jordan Research and Training Reactor (JRTR) real-time simulator for operating staff training. The main purpose of this simulator is operator training, but we modified this simulator as a dynamic test-bed to test the reactor regulating system in HANARO or JRTR before installation. The simulator configuration is divided into hardware and software. The simulator hardware consists of a host computer, 6 operator stations, a network switch, and a large display panel. The simulator software is divided into three major parts: a mathematical modeling module, which executes the plant dynamic modeling program in real-time, an instructor station module that manages user instructions, and a human machine interface (HMI) module. The developed research reactor simulators are installed in the Korea Atomic Energy Research Institute nuclear training center for reactor operator training. To use the simulator as a dynamic test-bed, the reactor regulating system modeling software of the simulator was replaced by a hardware controller, and the simulator and target controller were interfaced with a hard-wired and network-based interface.

  1. Development of research reactor simulator and its application to dynamic test-bed

    International Nuclear Information System (INIS)

    Kwon, Kee-Choon; Baang, Dane; Park, Jae-Chang; Lee, Seung-Wook; Bae, Sung Won

    2014-01-01

    We developed a real-time simulator for the High-flux Advanced Neutron Application ReactOr (HANARO) and the Jordan Research and Training Reactor (JRTR). The main purpose of this simulator is operator training, but we modified this simulator into a dynamic test-bed (DTB) to test the functions and dynamic control performance of the reactor regulating system (RRS) in HANARO or JRTR before installation. The simulator hardware consists of a host computer, 6 operator stations, a network switch, and a large display panel. The software includes a mathematical model that implements plant dynamics in real-time, an instructor station module that manages user instructions, and a human machine interface module. The developed research reactor simulators are installed in the Korea Atomic Energy Research Institute nuclear training center for reactor operator training. To use the simulator as a dynamic test-bed, the reactor regulating system modeling software of the simulator was replaced by the actual RRS cabinet, which was interfaced using a hard-wired and network-based interface. The RRS cabinet generates control signals for reactor power control based on the various feedback signals from the DTB, and the DTB runs the plant dynamics based on the RRS control signals. Thus, hardware-in-the-loop simulation between the RRS and the emulated plant (DTB) has been implemented and tested in this configuration. The test results show that the developed DTB and the actual RRS cabinet work together in real time, resulting in good dynamic control performance. (author)
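
    The exchange described above (control signals out of the RRS cabinet, feedback signals back from the DTB over a network-based interface) can be pictured with the minimal sketch below. This is not the KAERI implementation: the message format, port numbers, time step, and toy plant model are all assumptions made for illustration.

    ```python
    # Minimal hardware-in-the-loop sketch: the simulated plant (DTB) advances its
    # dynamics from a received control signal and returns feedback each cycle.
    # Message layout, ports, and the first-order plant model are hypothetical.
    import json
    import socket

    DT = 0.1  # assumed simulation time step [s]

    def step_plant(power, rod_demand):
        """Toy first-order plant: reactor power follows the rod demand."""
        tau = 5.0  # assumed time constant [s]
        return power + DT * (rod_demand - power) / tau

    def run_dtb(listen_port=50000, rrs_addr=("127.0.0.1", 50001)):
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.bind(("", listen_port))
        power = 0.5  # normalized initial power, assumed
        while True:
            msg, _ = sock.recvfrom(1024)               # control signal from the RRS side
            rod_demand = json.loads(msg)["rod_demand"]
            power = step_plant(power, rod_demand)      # advance plant dynamics one step
            feedback = json.dumps({"power": power, "dt": DT}).encode()
            sock.sendto(feedback, rrs_addr)            # feedback signals back to the RRS
    ```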

  2. Development of a hardware-in-the-loop testbed to demonstrate multiple spacecraft operations in proximity

    Science.gov (United States)

    Eun, Youngho; Park, Sang-Young; Kim, Geuk-Nam

    2018-06-01

    This paper presents a new state-of-the-art ground-based hardware-in-the-loop test facility, which was developed to verify and demonstrate autonomous guidance, navigation, and control algorithms for space proximity operations and formation flying maneuvers. The test facility consists of two complete spaceflight simulators, an aluminum-based operational arena, and a set of infrared motion tracking cameras; thus, the testbed is capable of representing space activities under circumstances prevailing on the ground. The spaceflight simulators provide up to five degrees of freedom in a quasi-momentum-free environment, which is produced by a set of linear/hemispherical air-bearings and a horizontally leveled operational arena. The tracking system measures the real-time three-dimensional position and attitude to provide state variables to the agents. The design of the testbed is illustrated in detail for every element throughout the paper. The practical hardware characteristics of the active/passive measurement units and internal actuators are identified in detail from various perspectives. These experimental results support the successful development of the entire facility and enable us to implement and verify the spacecraft proximity operation strategy in the near future.
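
    To make concrete how externally tracked states can feed a control algorithm on such a simulator, the sketch below shows a simple planar proportional-derivative law that turns tracked position and velocity into a force command. The gains, the 2D simplification, and the interface are assumptions for illustration, not the facility's actual software.

    ```python
    # Minimal sketch of a planar PD control law driven by tracked states.
    # Gains and the state source are assumed; this is not the testbed's code.
    import numpy as np

    KP, KD = 0.8, 2.5  # assumed proportional/derivative gains

    def pd_command(pos, vel, pos_ref, vel_ref=np.zeros(2)):
        """Return a 2D force command driving the simulator toward pos_ref."""
        return KP * (np.asarray(pos_ref) - np.asarray(pos)) + KD * (vel_ref - np.asarray(vel))

    # Example: hold a 1 m offset along x relative to the target simulator.
    force = pd_command(pos=[0.2, 0.0], vel=[0.05, 0.0], pos_ref=[1.0, 0.0])
    ```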

  3. Design and Development of a 200-kW Turbo-Electric Distributed Propulsion Testbed

    Science.gov (United States)

    Papathakis, Kurt V.; Kloesel, Kurt J.; Lin, Yohan; Clarke, Sean; Ediger, Jacob J.; Ginn, Starr

    2016-01-01

    The National Aeronautics and Space Administration (NASA) Armstrong Flight Research Center (AFRC) (Edwards, California) is developing the Hybrid-Electric Integrated Systems Testbed (HEIST) as part of the HEIST Project, to study power management and transition complexities, modular architectures, and flight control laws for turbo-electric distributed propulsion technologies using representative hardware and piloted simulations. Capabilities are being developed to assess the flight readiness of hybrid electric and distributed electric vehicle architectures. Additionally, NASA will leverage experience gained and assets developed from HEIST to assist in flight-test proposal development, flight-test vehicle design, and evaluation of hybrid electric and distributed electric concept vehicles for flight safety. The HEIST test equipment will include three trailers supporting a distributed electric propulsion wing, a battery system and turbogenerator, dynamometers, and supporting power and communication infrastructure, all connected to the AFRC Core simulation. Plans call for 18 high performance electric motors that will be powered by batteries and the turbogenerator, and commanded by a piloted simulation. Flight control algorithms will be developed on the turbo-electric distributed propulsion system.

  4. Test-bed for the remote health monitoring system for bridge structures using FBG sensors

    Science.gov (United States)

    Lee, Chin-Hyung; Park, Ki-Tae; Joo, Bong-Chul; Hwang, Yoon-Koog

    2009-05-01

    This paper reports on a test-bed for a long-term health monitoring system for bridge structures employing fiber Bragg grating (FBG) sensors, which is remotely accessible via the web, to provide real-time quantitative information on a bridge's response to live loading and environmental changes, and fast prediction of the structure's integrity. The sensors are attached at several locations on the structure and connected to a data acquisition system permanently installed onsite. The system can be accessed through remote communication using an optical cable network, through which the bridge behavior under live loading can be evaluated at locations far from the field. Live structural data are transmitted continuously to the server computer at the central office. The server computer is connected securely to the internet, where data can be retrieved, processed and stored for remote web-based health monitoring. The test-bed showed that the remote health monitoring technology will enable practical, cost-effective, and reliable condition assessment and maintenance of bridge structures.

  5. Adaptive Coding and Modulation Experiment With NASA's Space Communication and Navigation Testbed

    Science.gov (United States)

    Downey, Joseph; Mortensen, Dale; Evans, Michael; Briones, Janette; Tollis, Nicholas

    2016-01-01

    National Aeronautics and Space Administration (NASA)'s Space Communication and Navigation Testbed is an advanced integrated communication payload on the International Space Station. This paper presents results from an adaptive coding and modulation (ACM) experiment over S-band using a direct-to-earth link between the SCaN Testbed and the Glenn Research Center. The testing leverages the established Digital Video Broadcasting Second Generation (DVB-S2) standard to provide various modulation and coding options, and uses the Space Data Link Protocol (Consultative Committee for Space Data Systems (CCSDS) standard) for the uplink and downlink data framing. The experiment was conducted in a challenging environment due to the multipath and shadowing caused by the International Space Station structure. Several approaches for improving the ACM system are presented, including predictive and learning techniques to accommodate signal fades. Performance of the system is evaluated as a function of end-to-end system latency (round-trip delay), and compared to the capacity of the link. Finally, improvements over standard NASA waveforms are presented.
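
    The core of an adaptive coding and modulation scheme like the one described above is the selection of a modulation and coding pair (MODCOD) from an estimated link SNR, with margin held back to absorb fading over the round-trip delay. The sketch below illustrates that idea only; the MODCOD names, thresholds, and margin are illustrative assumptions, not the DVB-S2 specification values or the experiment's actual logic.

    ```python
    # Threshold-based ACM sketch: pick the highest-rate MODCOD whose required SNR
    # is met with margin. Table values and margin are illustrative assumptions.
    MODCODS = [  # (name, required SNR [dB], spectral efficiency [bit/s/Hz]) - assumed
        ("QPSK 1/2",   1.0, 1.0),
        ("QPSK 3/4",   4.0, 1.5),
        ("8PSK 2/3",   6.6, 2.0),
        ("16APSK 3/4", 10.2, 3.0),
    ]

    def select_modcod(snr_est_db, margin_db=1.5):
        """Return the most efficient MODCOD supported at the estimated SNR."""
        usable = [m for m in MODCODS if m[1] + margin_db <= snr_est_db]
        return max(usable, key=lambda m: m[2]) if usable else MODCODS[0]

    print(select_modcod(snr_est_db=8.3))  # -> ('8PSK 2/3', 6.6, 2.0)
    ```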

  6. FloorNet: Deployment and Evaluation of a Multihop Wireless 802.11 Testbed

    Directory of Open Access Journals (Sweden)

    Zink Michael

    2010-01-01

    Full Text Available A lot of attention has been given to multihop wireless networks lately, but further research—in particular, through experimentation—is needed. This attention has motivated an increase in the number of 802.11-based deployments, both indoor and outdoor. These testbeds, which require a significant amount of resources during both deployment and maintenance, are used to run measurements in order to analyze and understand the limitations and differences between analytical or simulation-based figures and the results from real-life experimentation. This paper makes two major contributions: (i) first, we describe a novel wireless multihop testbed, which we name FloorNet, that is deployed and operated under the false floor of a lab in our Computer Science building. This false floor provides strong physical protection that prevents disconnections or misplacements, as well as radio shielding (to some extent) thanks to the false floor panels—this latter feature is assessed through experimentation; (ii) second, by running exhaustive and controlled experiments we are able to analyze the performance limits of commercial off-the-shelf hardware, as well as to derive practical design criteria for the deployment and configuration of mesh networks. These results both provide valuable insights into wireless multihop performance and prove that FloorNet constitutes a valuable asset for research on wireless mesh networks.

  7. BEATBOX v1.0: Background Error Analysis Testbed with Box Models

    Science.gov (United States)

    Knote, Christoph; Barré, Jérôme; Eckl, Max

    2018-02-01

    The Background Error Analysis Testbed (BEATBOX) is a new data assimilation framework for box models. Based on the BOX Model eXtension (BOXMOX) to the Kinetic Pre-Processor (KPP), this framework allows users to conduct performance evaluations of data assimilation experiments, sensitivity analyses, and detailed chemical scheme diagnostics from an observation simulation system experiment (OSSE) point of view. The BEATBOX framework incorporates an observation simulator and a data assimilation system with the possibility of choosing ensemble, adjoint, or combined sensitivities. A user-friendly, Python-based interface allows for the tuning of many parameters for atmospheric chemistry and data assimilation research as well as for educational purposes, for example observation error, model covariances, ensemble size, perturbation distribution in the initial conditions, and so on. In this work, the testbed is described and two case studies are presented to illustrate the design of a typical OSSE experiment, data assimilation experiments, a sensitivity analysis, and a method for diagnosing model errors. BEATBOX is released as an open source tool for the atmospheric chemistry and data assimilation communities.
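
    The OSSE workflow described above (a truth run, synthetic observations with a prescribed observation error, and an ensemble-based assimilation step) can be illustrated with the toy sketch below. It does not use BOXMOX, KPP, or the BEATBOX interface; the single-species decay model, error magnitudes, and ensemble size are all made-up illustrations of the general idea.

    ```python
    # Toy OSSE sketch: truth run of one decaying species, one synthetic
    # observation with observation error, and a simple ensemble update of the
    # initial condition. All numbers are illustrative.
    import numpy as np

    rng = np.random.default_rng(0)
    k, dt, nsteps = 0.1, 1.0, 10                     # assumed loss rate, step, steps

    def run_box(c0):
        return c0 * np.exp(-k * dt * np.arange(nsteps + 1))

    truth = run_box(1.0)
    obs = truth[-1] + rng.normal(0.0, 0.02)          # observation at the final time
    ens_c0 = rng.normal(1.2, 0.1, size=50)           # perturbed initial conditions
    ens_fc = np.array([run_box(c0)[-1] for c0 in ens_c0])

    # Ensemble Kalman-style scalar update of the initial condition
    cov_xy = np.cov(ens_c0, ens_fc)[0, 1]
    gain = cov_xy / (np.var(ens_fc) + 0.02 ** 2)
    ens_c0_analysis = ens_c0 + gain * (obs - ens_fc)
    print(ens_c0.mean(), ens_c0_analysis.mean())     # analysis mean moves toward 1.0
    ```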

  8. BEATBOX v1.0: Background Error Analysis Testbed with Box Models

    Directory of Open Access Journals (Sweden)

    C. Knote

    2018-02-01

    Full Text Available The Background Error Analysis Testbed (BEATBOX) is a new data assimilation framework for box models. Based on the BOX Model eXtension (BOXMOX) to the Kinetic Pre-Processor (KPP), this framework allows users to conduct performance evaluations of data assimilation experiments, sensitivity analyses, and detailed chemical scheme diagnostics from an observation simulation system experiment (OSSE) point of view. The BEATBOX framework incorporates an observation simulator and a data assimilation system with the possibility of choosing ensemble, adjoint, or combined sensitivities. A user-friendly, Python-based interface allows for the tuning of many parameters for atmospheric chemistry and data assimilation research as well as for educational purposes, for example observation error, model covariances, ensemble size, perturbation distribution in the initial conditions, and so on. In this work, the testbed is described and two case studies are presented to illustrate the design of a typical OSSE experiment, data assimilation experiments, a sensitivity analysis, and a method for diagnosing model errors. BEATBOX is released as an open source tool for the atmospheric chemistry and data assimilation communities.

  9. Open-Source Based Testbed for Multioperator 4G/5G Infrastructure Sharing in Virtual Environments

    Directory of Open Access Journals (Sweden)

    Ricardo Marco Alaez

    2017-01-01

    Full Text Available Fourth-Generation (4G) mobile networks are based on Long-Term Evolution (LTE) technologies and are being deployed worldwide, while research on further evolution towards the Fifth Generation (5G) has been recently initiated. 5G will be featured with advanced network infrastructure sharing capabilities among different operators. Therefore, an open-source implementation of 4G/5G networks with this capability is crucial to enable early research in this area. The main contribution of this paper is the design and implementation of such a 4G/5G open-source testbed to investigate multioperator infrastructure sharing capabilities executed in virtual architectures. The proposed design and implementation enable the virtualization and sharing of some of the components of the LTE architecture. A testbed has been implemented and validated with intensive empirical experiments conducted to validate the suitability of virtualizing LTE components in virtual infrastructures (i.e., infrastructures with multitenancy sharing capabilities). The impact of the proposed technologies can lead to significant savings of both capital and operational costs for mobile telecommunication operators.

  10. A low-cost test-bed for real-time landmark tracking

    Science.gov (United States)

    Csaszar, Ambrus; Hanan, Jay C.; Moreels, Pierre; Assad, Christopher

    2007-04-01

    A low-cost vehicle test-bed system was developed to iteratively test, refine and demonstrate navigation algorithms before attempting to transfer the algorithms to more advanced rover prototypes. The platform used here was a modified radio controlled (RC) car. A microcontroller board and onboard laptop computer allow for either autonomous or remote operation via a computer workstation. The sensors onboard the vehicle represent the types currently used on NASA-JPL rover prototypes. For dead-reckoning navigation, optical wheel encoders, a single-axis gyroscope, and a 2-axis accelerometer were used. An ultrasound ranger is available to calculate distance as a substitute for the stereo vision systems presently used on rovers. The prototype also carries a small laptop computer with a USB camera and wireless transmitter to send real-time video to an off-board computer. A real-time user interface was implemented that combines an automatic image feature selector, tracking parameter controls, a streaming video viewer, and user-generated or autonomous driving commands. Using the test-bed, real-time landmark tracking was demonstrated by autonomously driving the vehicle through the JPL Mars yard. The algorithms tracked rocks as waypoints, generating coordinates for calculating relative motion and visually servoing to science targets. A limitation of the current system is serial computing: each additional landmark is tracked in order. However, since each landmark is tracked independently, adding targets would not significantly diminish system speed if the tracking were transferred to appropriate parallel hardware.
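
    A minimal sketch of the dead-reckoning step implied by the sensor suite above (wheel encoders for distance, a single-axis gyro for heading) is shown below. The encoder scale factor and update rate are assumptions for illustration, not values taken from the test-bed.

    ```python
    # Planar dead-reckoning sketch from encoder ticks and a gyro yaw rate.
    # Scale factor and update rate are assumed.
    import math

    TICKS_PER_METER = 512.0   # assumed encoder scale factor
    DT = 0.02                 # assumed 50 Hz update

    def dead_reckon(state, encoder_ticks, yaw_rate_rad_s):
        """Propagate (x, y, heading) one step from encoder and gyro readings."""
        x, y, th = state
        d = encoder_ticks / TICKS_PER_METER
        th_new = th + yaw_rate_rad_s * DT
        th_mid = th + 0.5 * yaw_rate_rad_s * DT      # midpoint heading over the step
        return (x + d * math.cos(th_mid), y + d * math.sin(th_mid), th_new)

    state = (0.0, 0.0, 0.0)
    state = dead_reckon(state, encoder_ticks=10, yaw_rate_rad_s=0.1)
    ```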

  11. A Functional Neuroimaging Analysis of the Trail Making Test-B: Implications for Clinical Application

    Directory of Open Access Journals (Sweden)

    Mark D. Allen

    2011-01-01

    Full Text Available Recent progress has been made using fMRI as a clinical assessment tool, often employing analogues of traditional “paper and pencil” tests. The Trail Making Test (TMT, popular for years as a neuropsychological exam, has been largely ignored in the realm of neuroimaging, most likely because its physical format and administration does not lend itself to straightforward adaptation as an fMRI paradigm. Likewise, there is relatively more ambiguity about the neural systems associated with this test than many other tests of comparable clinical use. In this study, we describe an fMRI version of Trail Making Test-B (TMTB that maintains the core functionality of the TMT while optimizing its use for both research and clinical settings. Subjects (N = 32 were administered the Functional Trail Making Test-B (f-TMTB. Brain region activations elicited by the f-TMTB were consistent with expectations given by prior TMT neurophysiological studies, including significant activations in the ventral and dorsal visual pathways and the medial pre-supplementary motor area. The f-TMTB was further evaluated for concurrent validity with the traditional TMTB using an additional sample of control subjects (N = 100. Together, these results support the f-TMTB as a viable neuroimaging adaptation of the TMT that is optimized to evoke maximally robust fMRI activation with minimal time and equipment requirements.

  12. PlanetLab Europe as Geographically-Distributed Testbed for Software Development and Evaluation

    Directory of Open Access Journals (Sweden)

    Dan Komosny

    2015-01-01

    Full Text Available In this paper, we analyse the use of PlanetLab Europe for the development and evaluation of geographically-oriented Internet services. PlanetLab is a global research network whose main purpose is to support the development of new Internet services and protocols. PlanetLab is divided into several branches; one of them is PlanetLab Europe. PlanetLab Europe consists of about 350 nodes at 150 geographically different sites. The nodes are accessible by remote login, and users can run their software on the nodes. In the paper, we study the properties of PlanetLab that are significant for its use as a geographically distributed testbed. These include node position accuracy, service availability, and stability. We find a considerable number of location inaccuracies and a number of services that cannot be considered reliable. Based on the results, we propose a simple approach to node selection in testbeds for the development and evaluation of geographically-oriented Internet services.
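
    The flavor of node selection described above (discard nodes with inaccurate positions or unreliable services, then pick a geographically dispersed subset) is sketched below. The record fields, thresholds, and greedy dispersion rule are hypothetical and do not reproduce the authors' exact procedure.

    ```python
    # Hedged sketch of node selection for a geographically distributed testbed:
    # filter unreliable or poorly located nodes, then greedily pick far-apart ones.
    import math

    def km(a, b):
        """Approximate great-circle distance between (lat, lon) pairs in km."""
        lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
        h = (math.sin((lat2 - lat1) / 2) ** 2
             + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
        return 2 * 6371 * math.asin(math.sqrt(h))

    def select_nodes(nodes, n, max_pos_error_km=50, min_uptime=0.95):
        good = [x for x in nodes
                if x["pos_error_km"] <= max_pos_error_km and x["uptime"] >= min_uptime]
        if not good:
            return []
        chosen = [good.pop(0)]
        while good and len(chosen) < n:
            # add the candidate farthest from every node already chosen
            best = max(good, key=lambda x: min(km(x["loc"], c["loc"]) for c in chosen))
            good.remove(best)
            chosen.append(best)
        return chosen

    nodes = [{"loc": (48.1, 11.6), "pos_error_km": 2, "uptime": 0.99},
             {"loc": (52.5, 13.4), "pos_error_km": 5, "uptime": 0.97},
             {"loc": (40.4, -3.7), "pos_error_km": 120, "uptime": 0.99}]
    print(select_nodes(nodes, n=2))   # third node is dropped by the position filter
    ```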

  13. Establishment of a sensor testbed at NIST for plant productivity monitoring

    Science.gov (United States)

    Allen, D. W.; Hutyra, L.; Reinmann, A.; Trlica, A.; Marrs, J.; Jones, T.; Whetstone, J. R.; Logan, B.; Reblin, J.

    2017-12-01

    Accurate assessment of biogenic carbon fluxes is challenging. Correlating optical signatures to plant activity allows for monitoring large regions. New methods, including solar-induced fluorescence (SIF), promise to provide more timely and accurate estimates of plant activity, but we are still developing a full understanding of the mechanistic linkage between plant assimilation of carbon and SIF. We have initiated a testbed to facilitate the evaluation of sensors and methods for remote monitoring of plant activity at the NIST headquarters. The testbed utilizes a forested area of mature trees in a mixed urban environment. A 1 hectare plot within the 26 hectare forest has been instrumented for ecophysiological measurements, with an edge (100 m long) that is persistently monitored with multimodal optical sensors (SIF spectrometers, hyperspectral imagers, thermal infrared imaging, and lidar). This biological testbed has the advantage of direct access to the national measurement scales maintained by NIST for both the physical and optical quantities of interest. We offer a description of the test site, the sensors, and preliminary results from the first season of observations for ecological, physiological, and remote sensing based estimates of ecosystem productivity.

  14. The Objectives of NASA's Living with a Star Space Environment Testbed

    Science.gov (United States)

    Barth, Janet L.; LaBel, Kenneth A.; Brewer, Dana; Kauffman, Billy; Howard, Regan; Griffin, Geoff; Day, John H. (Technical Monitor)

    2001-01-01

    NASA is planning to fly a series of Space Environment Testbeds (SET) as part of the Living With A Star (LWS) Program. The goal of the testbeds is to improve and develop capabilities to mitigate and/or accommodate the effects of solar variability in spacecraft and avionics design and operation. This will be accomplished by performing technology validation in space to enable routine operations, characterize technology performance in space, and improve and develop models, guidelines and databases. The anticipated result of the LWS/SET program is improved spacecraft performance, design, and operation for survival of the radiation, spacecraft charging, meteoroid, orbital debris and thermosphere/ionosphere environments. The program calls for a series of NASA Research Announcements (NRAs) to be issued to solicit flight validation experiments, improvement in environment effects models and guidelines, and collateral environment measurements. The selected flight experiments may fly on the SET experiment carriers and flights of opportunity on other commercial and technology missions. This paper presents the status of the project so far, including a description of the types of experiments that are intended to fly on SET-1 and a description of the SET-1 carrier parameters.

  15. Carrier Plus: A sensor payload for Living With a Star Space Environment Testbed (LWS/SET)

    Science.gov (United States)

    Marshall, Cheryl J.; Moss, Steven; Howard, Regan; LaBel, Kenneth A.; Grycewicz, Tom; Barth, Janet L.; Brewer, Dana

    2003-01-01

    The Defense Threat Reduction Agency (DTRA) and National Aeronautics and Space Administration (NASA) Goddard Space Flight Center are collaborating to develop the Carrier Plus sensor experiment platform as a capability of the Space Environments Testbed (SET). The Space Environment Testbed (SET) provides flight opportunities for technology experiments as part of NASA's Living With a Star (LWS) program. The Carrier Plus will provide new capability to characterize sensor technologies such as state-of-the-art visible focal plane arrays (FPAs) in a natural space radiation environment. The technical objectives include on-orbit validation of recently developed FPA technologies and performance prediction methodologies, as well as characterization of the FPA radiation response to total ionizing dose damage, displacement damage and transients. It is expected that the sensor experiment will carry 4-6 FPAs and associated radiation correlative environment monitors (CEMs) for a 2006-2007 launch. Sensor technology candidates may include n- and p-charge coupled devices (CCDs), active pixel sensors (APS), and hybrid CMOS arrays. The presentation will describe the Carrier Plus goals and objectives, as well as provide details about the architecture and design. More information on the LWS program can be found at http://lws.gsfc.nasa.gov/. Business announcements for LWS/SET and program briefings are posted at http://lws-set.gsfc.nasa.gov.

  16. Development of Research Reactor Simulator and Its Application to Dynamic Test-bed

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, Kee Choon; Park, Jae Chang; Lee, Seung Wook; Bang, Dane; Bae, Sung Won [KAERI, Daejeon (Korea, Republic of)

    2014-08-15

    We developed HANARO and the Jordan Research and Training Reactor (JRTR) real-time simulator for operating staff training. The main purpose of this simulator is operator training, but we modified this simulator as a dynamic test-bed to test the reactor regulating system in HANARO or JRTR before installation. The simulator configuration is divided into hardware and software. The simulator hardware consists of a host computer, 6 operator stations, a network switch, and a large display panel. The simulator software is divided into three major parts: a mathematical modeling module, which executes the plant dynamic modeling program in real-time, an instructor station module that manages user instructions, and a human machine interface (HMI) module. The developed research reactor simulators are installed in the Korea Atomic Energy Research Institute nuclear training center for reactor operator training. To use the simulator as a dynamic test-bed, the reactor regulating system modeling software of the simulator was replaced by a hardware controller, and the simulator and target controller were interfaced with a hard-wired and network-based interface.

  17. The Orlando TDWR testbed and airborne wind shear data comparison results

    Science.gov (United States)

    Campbell, Steven; Berke, Anthony; Matthews, Michael

    1992-01-01

    The focus of this talk is on comparing terminal Doppler Weather Radar (TDWR) and airborne wind shear data in computing a microburst hazard index called the F factor. The TDWR is a ground-based system for detecting wind shear hazards to aviation in the terminal area. The Federal Aviation Administration will begin deploying TDWR units near 45 airports in late 1992. As part of this development effort, M.I.T. Lincoln Laboratory operates under F.A.A. support a TDWR testbed radar in Orlando, FL. During the past two years, a series of flight tests has been conducted with instrumented aircraft penetrating microburst events while under testbed radar surveillance. These tests were carried out with a Cessna Citation 2 aircraft operated by the University of North Dakota (UND) Center for Aerospace Sciences in 1990, and a Boeing 737 operated by NASA Langley Research Center in 1991. A large data base of approximately 60 instrumented microburst penetrations has been obtained from these flights.
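
    As a worked illustration of the hazard index mentioned above: the F factor is commonly written in the wind-shear literature as F = (dWx/dt)/g - w/V, where dWx/dt is the rate of change of the horizontal wind along the flight path, w is the vertical wind (positive up), V is the airspeed, and g is the gravitational acceleration. The sketch below uses this generic textbook form with illustrative numbers; it is not necessarily the exact formulation used in the Orlando comparisons.

    ```python
    # Generic F-factor sketch: positive values indicate performance loss, and
    # values around 0.1 and above are usually treated as hazardous.
    G = 9.81  # m/s^2

    def f_factor(dwx_dt, w, airspeed):
        """Microburst performance-loss index F = (dWx/dt)/g - w/V."""
        return dwx_dt / G - w / airspeed

    # Example: 3 (m/s)/s horizontal shear plus a 5 m/s downdraft at 75 m/s airspeed
    print(f_factor(dwx_dt=3.0, w=-5.0, airspeed=75.0))  # about 0.37, a strong hazard
    ```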

  18. Automated System Marketplace 1994.

    Science.gov (United States)

    Griffiths, Jose-Marie; Kertis, Kimberly

    1994-01-01

    Reports results of the 1994 Automated System Marketplace survey based on responses from 60 vendors. Highlights include changes in the library automation marketplace; estimated library systems revenues; minicomputer and microcomputer-based systems; marketplace trends; global markets and mergers; research needs; new purchase processes; and profiles…

  19. Automation in Warehouse Development

    NARCIS (Netherlands)

    Hamberg, R.; Verriet, J.

    2012-01-01

    The warehouses of the future will come in a variety of forms, but with a few common ingredients. Firstly, human operational handling of items in warehouses is increasingly being replaced by automated item handling. Extended warehouse automation counteracts the scarcity of human operators and

  20. Order Division Automated System.

    Science.gov (United States)

    Kniemeyer, Justin M.; And Others

    This publication was prepared by the Order Division Automation Project staff to fulfill the Library of Congress' requirement to document all automation efforts. The report was originally intended for internal use only and not for distribution outside the Library. It is now felt that the library community at-large may have an interest in the…

  1. Automate functional testing

    Directory of Open Access Journals (Sweden)

    Ramesh Kalindri

    2014-06-01

    Full Text Available Currently, software engineers are increasingly turning to the option of automating functional tests, but they are not always successful in this endeavor. Reasons range from poor planning to cost overruns in the process. Some principles that can guide teams in automating these tests are described in this article.

  2. Automation and robotics

    Science.gov (United States)

    Montemerlo, Melvin

    1988-01-01

    The Autonomous Systems focus on the automation of control systems for the Space Station and mission operations. Telerobotics focuses on automation for in-space servicing, assembly, and repair. The Autonomous Systems and Telerobotics each have a planned sequence of integrated demonstrations showing the evolutionary advance of the state-of-the-art. Progress is briefly described for each area of concern.

  3. Automating the Small Library.

    Science.gov (United States)

    Skapura, Robert

    1987-01-01

    Discusses the use of microcomputers for automating school libraries, both for entire systems and for specific library tasks. Highlights include available library management software, newsletters that evaluate software, constructing an evaluation matrix, steps to consider in library automation, and a brief discussion of computerized card catalogs.…

  4. Automated model building

    CERN Document Server

    Caferra, Ricardo; Peltier, Nicholas

    2004-01-01

    This is the first book on automated model building, a discipline of automated deduction that is of growing importance. Although models and their construction are important per se, automated model building has appeared as a natural enrichment of automated deduction, especially in the attempt to capture the human way of reasoning. The book provides an historical overview of the field of automated deduction, and presents the foundations of different existing approaches to model construction, in particular those developed by the authors. Finite and infinite model building techniques are presented. The main emphasis is on calculi-based methods, and relevant practical results are provided. The book is of interest to researchers and graduate students in computer science, computational logic and artificial intelligence. It can also be used as a textbook in advanced undergraduate courses.

  5. Automation in Immunohematology

    Directory of Open Access Journals (Sweden)

    Meenu Bajpai

    2012-01-01

    Full Text Available There have been rapid technological advances in blood banking in the South Asian region over the past decade, with an increasing emphasis on the quality and safety of blood products. The conventional test tube technique has given way to newer techniques such as the column agglutination technique, solid phase red cell adherence assay, and erythrocyte-magnetized technique. These new technologies are adaptable to automation, and major manufacturers in this field have come up with semi- and fully automated equipment for immunohematology tests in the blood bank. Automation improves the objectivity and reproducibility of tests. It reduces human errors in patient identification and transcription errors. Documentation and traceability of tests, reagents and processes and archiving of results are other major advantages of automation. Shifting from manual methods to automation is a major undertaking for any transfusion service to provide quality patient care with a shorter turnaround time for their ever-increasing workload. This article discusses the various issues involved in the process.

  6. Automation in Warehouse Development

    CERN Document Server

    Verriet, Jacques

    2012-01-01

    The warehouses of the future will come in a variety of forms, but with a few common ingredients. Firstly, human operational handling of items in warehouses is increasingly being replaced by automated item handling. Extended warehouse automation counteracts the scarcity of human operators and supports the quality of picking processes. Secondly, the development of models to simulate and analyse warehouse designs and their components facilitates the challenging task of developing warehouses that take into account each customer’s individual requirements and logistic processes. Automation in Warehouse Development addresses both types of automation from the innovative perspective of applied science. In particular, it describes the outcomes of the Falcon project, a joint endeavour by a consortium of industrial and academic partners. The results include a model-based approach to automate warehouse control design, analysis models for warehouse design, concepts for robotic item handling and computer vision, and auton...

  7. Analysis, modeling, and simulation (AMS) testbed development and evaluation to support dynamic mobility applications (DMA) and active transportation and demand management (ATDM) programs — evaluation report for ATDM program. [supporting datasets - Pasadena Testbed

    Science.gov (United States)

    2017-07-26

    This zip file contains POSTDATA.ATT (.ATT); Print to File (.PRN); Portable Document Format (.PDF); and document (.DOCX) files of data to support FHWA-JPO-16-385, Analysis, modeling, and simulation (AMS) testbed development and evaluation to support d...

  8. Systematic review automation technologies

    Science.gov (United States)

    2014-01-01

    Systematic reviews, a cornerstone of evidence-based medicine, are not produced quickly enough to support clinical practice. The cost of production, availability of the requisite expertise and timeliness are often quoted as major contributors to the delay. This detailed survey of the state of the art of information systems designed to support or automate individual tasks in the systematic review, and in particular systematic reviews of randomized controlled clinical trials, reveals trends that see the convergence of several parallel research projects. We surveyed literature describing informatics systems that support or automate the processes of systematic review or each of the tasks of the systematic review. Several projects focus on automating, simplifying and/or streamlining specific tasks of the systematic review. Some tasks are already fully automated while others are still largely manual. In this review, we describe each task and the effect that its automation would have on the entire systematic review process, summarize the existing information system support for each task, and highlight where further research is needed for realizing automation for the task. Integration of the systems that automate systematic review tasks may lead to a revised systematic review workflow. We envisage the optimized workflow will lead to a system in which each systematic review is described as a computer program that automatically retrieves relevant trials, appraises them, extracts and synthesizes data, evaluates the risk of bias, performs meta-analysis calculations, and produces a report in real time. PMID:25005128
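
    One of the workflow steps listed above, the meta-analysis calculation, is readily automatable; the sketch below shows textbook fixed-effect inverse-variance pooling with made-up trial numbers. It is a generic illustration, not a tool described in the survey.

    ```python
    # Fixed-effect inverse-variance pooling of per-trial effect estimates.
    # The trial effects and variances below are hypothetical examples.
    def pool_fixed_effect(effects, variances):
        """Return the inverse-variance weighted mean and its variance."""
        weights = [1.0 / v for v in variances]
        pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
        return pooled, 1.0 / sum(weights)

    # Three hypothetical trials reporting log odds ratios and their variances
    est, var = pool_fixed_effect([-0.35, -0.10, -0.22], [0.04, 0.09, 0.06])
    print(est, var ** 0.5)   # pooled log odds ratio and its standard error
    ```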

  9. A Real-Time GPP Software-Defined Radio Testbed for the Physical Layer of Wireless Standards

    NARCIS (Netherlands)

    Schiphorst, Roelof; Hoeksema, F.W.; Slump, Cornelis H.

    2005-01-01

    We present our contribution to the general-purpose-processor-(GPP)-based radio. We describe a baseband software-defined radio testbed for the physical layer of wireless LAN standards. All physical layer functions have been successfully mapped on a Pentium 4 processor that performs these functions in

  10. Interactive aircraft cabin testbed for stress-free air travel system experiment: an innovative concurrent design approach

    NARCIS (Netherlands)

    Tan, C.F.; Chen, W.; Rauterberg, G.W.M.

    2009-01-01

    In this paper, a study of the concurrent engineering design of an environmentally friendly, low-cost aircraft cabin simulator is presented. The study describes the use of concurrent design techniques in the design activity. The simulator is a testbed that was designed and built for research on

  11. Photovoltaic Engineering Testbed: A Facility for Space Calibration and Measurement of Solar Cells on the International Space Station

    Science.gov (United States)

    Landis, Geoffrey A.; Bailey, Sheila G.; Jenkins, Phillip; Sexton, J. Andrew; Scheiman, David; Christie, Robert; Charpie, James; Gerber, Scott S.; Johnson, D. Bruce

    2001-01-01

    The Photovoltaic Engineering Testbed ("PET") is a facility to be flown on the International Space Station to perform calibration, measurement, and qualification of solar cells in the space environment and then return the cells to Earth for laboratory use. PET will allow rapid turnaround testing of new photovoltaic technology under AM0 conditions.

  12. Operational proof of automation

    International Nuclear Information System (INIS)

    Jaerschky, R.; Reifenhaeuser, R.; Schlicht, K.

    1976-01-01

    Automation of the power plant process may imply quite a number of problems. The automation of dynamic operations requires complicated programmes, often interfering in several branched areas. This reduces clarity for the operating and maintenance staff, whilst increasing the possibilities of errors. The synthesis and the organization of standardized equipment have proved very successful. The possibilities offered by this kind of automation for improving the operation of power plants will only sufficiently and correctly be turned to profit, however, if the application of these equipment techniques is further improved and if it stands in a certain ratio to a definite efficiency. (orig.) [de

  13. Automation of radioimmunoassay

    International Nuclear Information System (INIS)

    Yamaguchi, Chisato; Yamada, Hideo; Iio, Masahiro

    1974-01-01

    Automation systems for measuring Australian antigen by radioimmunoassay under development were discussed. Samples were processed as follows: blood serum being dispensed by automated sampler to the test tube, and then incubated under controlled time and temperature; first counting being omitted; labelled antibody being dispensed to the serum after washing; samples being incubated and then centrifuged; radioactivities in the precipitate being counted by auto-well counter; measurements being tabulated by automated typewriter. Not only well-type counter but also position counter was studied. (Kanao, N.)

  14. Automated electron microprobe

    International Nuclear Information System (INIS)

    Thompson, K.A.; Walker, L.R.

    1986-01-01

    The Plant Laboratory at the Oak Ridge Y-12 Plant has recently obtained a Cameca MBX electron microprobe with a Tracor Northern TN5500 automation system. This allows full stage and spectrometer automation and digital beam control. The capabilities of the system include qualitative and quantitative elemental microanalysis for all elements above and including boron in atomic number, high- and low-magnification imaging and processing, elemental mapping and enhancement, and particle size, shape, and composition analyses. Very low magnification, quantitative elemental mapping using stage control (which is of particular interest) has been accomplished along with automated size, shape, and composition analysis over a large relative area

  15. Chef infrastructure automation cookbook

    CERN Document Server

    Marschall, Matthias

    2013-01-01

    Chef Infrastructure Automation Cookbook contains practical recipes on everything you will need to automate your infrastructure using Chef. The book is packed with illustrated code examples to automate your server and cloud infrastructure. The book first shows you the simplest way to achieve a certain task. Then it explains every step in detail, so that you can build your knowledge about how things work. Eventually, the book shows you additional things to consider for each approach. That way, you can learn step-by-step and build profound knowledge on how to go about your configuration management.

  16. Managing laboratory automation.

    Science.gov (United States)

    Saboe, T J

    1995-01-01

    This paper discusses the process of managing automated systems through their life cycles within the quality-control (QC) laboratory environment. The focus is on the process of directing and managing the evolving automation of a laboratory; system examples are given. The author shows how both task and data systems have evolved, and how they interrelate. A BIG picture, or continuum view, is presented and some of the reasons for success or failure of the various examples cited are explored. Finally, some comments on future automation needs are discussed.

  17. Automated PCB Inspection System

    Directory of Open Access Journals (Sweden)

    Syed Usama BUKHARI

    2017-05-01

    Full Text Available Development of an automated PCB inspection system as per the needs of industry is a challenging task. In this paper a case study is presented to exhibit a proposed system for the migration from a manual PCB inspection system to an automated PCB inspection system, with minimal intervention in the existing production flow, for a leading automotive manufacturing company. A detailed design of the system, based on computer vision and followed by testing and analysis, was proposed in order to aid the manufacturer in the process of automation.
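
    A common computer-vision approach to automated PCB inspection is reference (golden-board) subtraction; the sketch below shows that generic technique with OpenCV and is not necessarily the system proposed in the paper. The image paths are placeholders, and the images are assumed to be registered and of equal size.

    ```python
    # Golden-board comparison sketch: difference the test image against a
    # reference, threshold it, and return bounding boxes of candidate defects.
    import cv2

    def find_defects(reference_path, test_path, min_area=30):
        ref = cv2.imread(reference_path, cv2.IMREAD_GRAYSCALE)
        test = cv2.imread(test_path, cv2.IMREAD_GRAYSCALE)
        diff = cv2.absdiff(ref, test)                        # pixel-wise difference
        _, mask = cv2.threshold(diff, 40, 255, cv2.THRESH_BINARY)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        # keep only blobs large enough to be plausible defects
        return [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) >= min_area]

    print(find_defects("golden_board.png", "inspected_board.png"))
    ```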

  18. Operational proof of automation

    International Nuclear Information System (INIS)

    Jaerschky, R.; Schlicht, K.

    1977-01-01

    Automation of the power plant process may imply quite a number of problems. The automation of dynamic operations requires complicated programmes often interfering in several branched areas. This reduces clarity for the operating and maintenance staff, whilst increasing the possibilities of errors. The synthesis and the organization of standardized equipment have proved very successful. The possibilities offered by this kind of automation for improving the operation of power plants will only sufficiently and correctly be turned to profit, however, if the application of these equipment techniques is further improved and if it stands in a certain ratio with a definite efficiency. (orig.) [de

  19. First light of an external occulter testbed at flight Fresnel numbers

    Science.gov (United States)

    Kim, Yunjong; Sirbu, Dan; Hu, Mia; Kasdin, Jeremy; Vanderbei, Robert J.; Harness, Anthony; Shaklan, Stuart

    2017-01-01

    Many approaches have been suggested over the last couple of decades for imaging Earth-like planets. One of the main candidates for creating high contrast for future Earth-like planet detection is an external occulter. The external occulter is a spacecraft flown along the line-of-sight of a space telescope to suppress starlight and enable high-contrast direct imaging of exoplanets. The occulter is typically tens of meters in diameter and the separation from the telescope is of the order of tens of thousands of kilometers. Optical testing of a full-scale external occulter on the ground is impossible because of the long separations. Therefore, laboratory verification of occulter designs is necessary to validate the optical models used to design and predict occulter performance. At Princeton, we have designed and built a testbed that allows verification of scaled occulter designs whose suppressed shadow is mathematically identical to that of space occulters. The goal of this experiment is to demonstrate a pupil plane suppression of better than 1e-9 with a corresponding image plane contrast of better than 1e-11. The occulter testbed uses a 77.2 m optical propagation distance to realize the flight Fresnel number of 14.5. The scaled mask is placed at 27.2 m from the artificial source and the camera is located 50.0 m from the scaled mask. We will use an etched silicon mask, manufactured by the Microdevices Lab (MDL) of the Jet Propulsion Laboratory (JPL), as the occulter. Based on conversations with MDL, we expect that 0.5 μm feature size is an achievable resolution in the mask manufacturing process and is therefore likely the indicator of the best possible performance. The occulter is illuminated by a diverging laser beam to reduce the aberrations from the optics before the occulter. Here, we present first light results of a sample design operating at a flight Fresnel number and the experimental setup of the testbed. We compare the experimental results with simulations.
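
    The scaling such testbeds rely on is the Fresnel number, commonly taken as N = a^2 / (lambda * z_eff), with the diverging-beam geometry folded into an effective distance z_eff = z1*z2/(z1+z2) for the source-to-mask and mask-to-camera distances. The sketch below uses the two distances quoted in the abstract; the wavelength and mask radius are illustrative assumptions, not the testbed's actual values.

    ```python
    # Hedged Fresnel-number sketch: N = a^2 / (lambda * z_eff) with
    # z_eff = z1*z2/(z1+z2) for diverging-beam illumination.
    def fresnel_number(mask_radius_m, wavelength_m, z1_m, z2_m):
        z_eff = z1_m * z2_m / (z1_m + z2_m)   # source-to-mask and mask-to-camera
        return mask_radius_m ** 2 / (wavelength_m * z_eff)

    # Distances from the abstract (27.2 m and 50.0 m); a 633 nm wavelength and a
    # ~12.7 mm mask radius are assumed and give a Fresnel number near 14.5.
    print(fresnel_number(12.7e-3, 633e-9, 27.2, 50.0))
    ```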

  20. Recent Successes and Future Plans for NASA's Space Communications and Navigation Testbed on the International Space Station

    Science.gov (United States)

    Reinhart, Richard C.; Sankovic, John M.; Johnson, Sandra K.; Lux, James P.; Chelmins, David T.

    2014-01-01

    Flexible and extensible space communications architectures and technology are essential to enable future space exploration and science activities. NASA has championed the development of the Space Telecommunications Radio System (STRS) software defined radio (SDR) standard and the application of SDR technology to reduce the costs and risks of using SDRs for space missions, and has developed an on-orbit testbed to validate these capabilities. The Space Communications and Navigation (SCaN) Testbed (previously known as the Communications, Navigation, and Networking reConfigurable Testbed (CoNNeCT)) is advancing SDR, on-board networking, and navigation technologies by conducting space experiments aboard the International Space Station. During its first year(s) on-orbit, the SCaN Testbed has achieved considerable accomplishments to better understand SDRs and their applications. The SDR platforms and software waveforms on each SDR have over 1500 hours of operation and are performing as designed. The Ka-band SDR on the SCaN Testbed is NASA's first space Ka-band transceiver and is NASA's first Ka-band mission using the Space Network. This has provided exciting opportunities to operate at Ka-band and assist with on-orbit tests of NASA's newest Tracking and Data Relay Satellites (TDRS). During its first year, SCaN Testbed completed its first on-orbit SDR reconfigurations. SDR reconfigurations occur when implementing new waveforms on an SDR. SDR reconfigurations allow a radio to change minor parameters, such as data rate, or complete functionality. New waveforms which provide new capability and are reusable across different missions provide long term value for reconfigurable platforms such as SDRs. The STRS Standard provides guidelines for new waveform development by third parties. Waveform development by organizations other than the platform provider offers NASA the ability to develop waveforms itself and reduce its dependence and costs on the platform developer. Each of these

  1. Design, Development, and Testing of a UAV Hardware-in-the-Loop Testbed for Aviation and Airspace Prognostics Research

    Science.gov (United States)

    Kulkarni, Chetan; Teubert, Chris; Gorospe, George; Burgett, Drew; Quach, Cuong C.; Hogge, Edward

    2016-01-01

    The airspace is becoming more and more complicated, and will continue to do so in the future with the integration of Unmanned Aerial Vehicles (UAVs), autonomy, spacecraft, and other forms of aviation technology into the airspace. The new technology and complexity increase the importance and difficulty of safety assurance. Additionally, testing new technologies on complex aviation systems and systems of systems can be very difficult, expensive, and sometimes unsafe in real life scenarios. Prognostic methodology provides an estimate of the health and risks of a component, vehicle, or airspace and knowledge of how that will change over time. That measure is especially useful in safety determination, mission planning, and maintenance scheduling. The developed testbed will be used to validate prediction algorithms for the real-time safety monitoring of the National Airspace System (NAS) and the prediction of unsafe events. The framework injects flight-related anomalies related to ground systems, routing, airport congestion, etc. to test and verify algorithms for NAS safety. In our research work, we develop a live, distributed, hardware-in-the-loop testbed for aviation and airspace prognostics along with exploring further research possibilities to verify and validate future algorithms for NAS safety. The testbed integrates virtual aircraft using the X-Plane simulator and X-PlaneConnect toolbox, UAVs using onboard sensors and cellular communications, and hardware in the loop components. In addition, the testbed includes an additional research framework to support and simplify future research activities. It enables safe, accurate, and inexpensive experimentation and research into airspace and vehicle prognosis that would not have been possible otherwise. This paper describes the design, development, and testing of this system. Software reliability, safety and latency are some of the critical design considerations in development of the testbed. Integration of HITL elements in

  2. Sensing across large-scale cognitive radio networks: Data processing, algorithms, and testbed for wireless tomography and moving target tracking

    Science.gov (United States)

    Bonior, Jason David

    As the use of wireless devices has become more widespread so has the potential for utilizing wireless networks for remote sensing applications. Regular wireless communication devices are not typically designed for remote sensing. Remote sensing techniques must be carefully tailored to the capabilities of these networks before they can be applied. Experimental verification of these techniques and algorithms requires robust yet flexible testbeds. In this dissertation, two experimental testbeds for the advancement of research into sensing across large-scale cognitive radio networks are presented. System architectures, implementations, capabilities, experimental verification, and performance are discussed. One testbed is designed for the collection of scattering data to be used in RF and wireless tomography research. This system is used to collect full complex scattering data using a vector network analyzer (VNA) and amplitude-only data using non-synchronous software-defined radios (SDRs). Collected data is used to experimentally validate a technique for phase reconstruction using semidefinite relaxation and demonstrate the feasibility of wireless tomography. The second testbed is a SDR network for the collection of experimental data. The development of tools for network maintenance and data collection is presented and discussed. A novel recursive weighted centroid algorithm for device-free target localization using the variance of received signal strength for wireless links is proposed. The signal variance resulting from a moving target is modeled as having contours related to Cassini ovals. This model is used to formulate recursive weights which reduce the influence of wireless links that are farther from the target location estimate. The algorithm and its implementation on this testbed are presented and experimental results discussed.
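
    One way to picture the recursive weighted-centroid idea described above: start from a centroid of link midpoints weighted by the RSS variance of each link, then repeatedly down-weight links whose midpoints lie far from the current estimate. The sketch below illustrates that flavor only; the weighting function and parameters are assumptions, not the dissertation's exact formulation.

    ```python
    # Hedged sketch of a recursive weighted-centroid estimate for device-free
    # localization from per-link RSS variance.
    import numpy as np

    def weighted_centroid(midpoints, rss_var, iters=5, decay=5.0):
        """midpoints: (N, 2) link midpoints; rss_var: (N,) RSS variance per link."""
        midpoints = np.asarray(midpoints, dtype=float)
        w = np.asarray(rss_var, dtype=float)
        est = (w[:, None] * midpoints).sum(axis=0) / w.sum()
        for _ in range(iters):
            d = np.linalg.norm(midpoints - est, axis=1)
            w_r = w * np.exp(-d / decay)          # reduce influence of distant links
            est = (w_r[:, None] * midpoints).sum(axis=0) / w_r.sum()
        return est

    est = weighted_centroid([[0, 0], [4, 0], [2, 3]], rss_var=[0.2, 0.9, 0.6])
    print(est)
    ```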

  3. Automated Vehicles Symposium 2015

    CERN Document Server

    Beiker, Sven

    2016-01-01

    This edited book comprises papers about the impacts, benefits and challenges of connected and automated cars. It is the third volume of the LNMOB series dealing with Road Vehicle Automation. The book comprises contributions from researchers, industry practitioners and policy makers, covering perspectives from the U.S., Europe and Japan. It is based on the Automated Vehicles Symposium 2015 which was jointly organized by the Association of Unmanned Vehicle Systems International (AUVSI) and the Transportation Research Board (TRB) in Ann Arbor, Michigan, in July 2015. The topical spectrum includes, but is not limited to, public sector activities, human factors, ethical and business aspects, energy and technological perspectives, vehicle systems and transportation infrastructure. This book is an indispensable source of information for academic researchers, industrial engineers and policy makers interested in the topic of road vehicle automation.

  4. Hydrometeorological Automated Data System

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Office of Hydrologic Development of the National Weather Service operates HADS, the Hydrometeorological Automated Data System. This data set contains the last 48...

  5. Automated External Defibrillator

    Science.gov (United States)

    ... leads to a 10 percent reduction in survival. Training to use an automated external defibrillator: learning how to use an AED and taking a CPR (cardiopulmonary resuscitation) course are helpful. However, if trained ...

  6. Planning for Office Automation.

    Science.gov (United States)

    Mick, Colin K.

    1983-01-01

    Outlines a practical approach to planning for office automation termed the "Focused Process Approach" (the "what" phase, "how" phase, "doing" phase) which is a synthesis of the problem-solving and participatory planning approaches. Thirteen references are provided. (EJS)

  7. Fixed automated spray technology.

    Science.gov (United States)

    2011-04-19

    This research project evaluated the construction and performance of Boschung's Fixed Automated Spray Technology (FAST) system. The FAST system automatically sprays de-icing material on the bridge when icing conditions are about to occur. The FA...

  8. Automated Vehicles Symposium 2014

    CERN Document Server

    Beiker, Sven; Road Vehicle Automation 2

    2015-01-01

    This paper collection is the second volume of the LNMOB series on Road Vehicle Automation. The book contains a comprehensive review of current technical, socio-economic, and legal perspectives written by experts coming from public authorities, companies and universities in the U.S., Europe and Japan. It originates from the Automated Vehicle Symposium 2014, which was jointly organized by the Association for Unmanned Vehicle Systems International (AUVSI) and the Transportation Research Board (TRB) in Burlingame, CA, in July 2014. The contributions discuss the challenges arising from the integration of highly automated and self-driving vehicles into the transportation system, with a focus on human factors and different deployment scenarios. This book is an indispensable source of information for academic researchers, industrial engineers, and policy makers interested in the topic of road vehicle automation.

  9. Automation Interface Design Development

    Data.gov (United States)

    National Aeronautics and Space Administration — Our research makes its contributions at two levels. At one level, we addressed the problems of interaction between humans and computers/automation in a particular...

  10. I-94 Automation FAQs

    Data.gov (United States)

    Department of Homeland Security — In order to increase efficiency, reduce operating costs and streamline the admissions process, U.S. Customs and Border Protection has automated Form I-94 at air and...

  11. Automation synthesis modules review

    International Nuclear Information System (INIS)

    Boschi, S.; Lodi, F.; Malizia, C.; Cicoria, G.; Marengo, M.

    2013-01-01

    The introduction of 68Ga-labelled tracers has changed the diagnostic approach to neuroendocrine tumours, and the availability of a reliable, long-lived 68Ge/68Ga generator has been at the basis of the development of 68Ga radiopharmacy. The huge increase in clinical demand, the impact of regulatory issues and careful radioprotection of the operators have pushed for extensive automation of the production process. The development of automated systems for 68Ga radiochemistry, different engineering and software strategies and post-processing of the eluate are discussed, along with the impact of regulations on automation. - Highlights: ► Generator availability and robust chemistry have driven the wide diffusion of 68Ga radiopharmaceuticals. ► Different technological approaches for 68Ga radiopharmaceuticals are discussed. ► Generator eluate post-processing and the evolution to cassette-based systems were the major issues in automation. ► The impact of regulations on the technological development is also considered.

  12. The development of the human exploration demonstration project (HEDP), a planetary systems testbed

    Science.gov (United States)

    Chevers, Edward S.; Korsmeyer, David J.

    1993-01-01

    The Human Exploration Demonstration Project (HEDP) is an ongoing task at the National Aeronautics and Space Administration's Ames Research Center to address the advanced technology requirements necessary to implement an integrated working and living environment for a planetary surface habitat. The integrated environment will consist of life support systems, physiological monitoring of project crew, a virtual environment workstation, and centralized data acquisition and habitat systems health monitoring. There will be several robotic systems on a simulated planetary landscape external to the habitat environment to provide representative work loads for the crew. This paper describes the status of the HEDP after one year, the major facilities composing the HEDP, the project's role as an Ames Research Center testbed, and the types of demonstration scenarios that will be run to showcase the technologies.

  13. High-Resolution Adaptive Optics Test-Bed for Vision Science

    International Nuclear Information System (INIS)

    Wilks, S.C.; Thomspon, C.A.; Olivier, S.S.; Bauman, B.J.; Barnes, T.; Werner, J.S.

    2001-01-01

    We discuss the design and implementation of a low-cost, high-resolution adaptive optics test-bed for vision research. It is well known that high-order aberrations in the human eye reduce optical resolution and limit visual acuity. However, the effects of aberration-free eyesight on vision are only now beginning to be studied using adaptive optics to sense and correct the aberrations in the eye. We are developing a high-resolution adaptive optics system for this purpose using a Hamamatsu Parallel Aligned Nematic Liquid Crystal Spatial Light Modulator. Phase-wrapping is used to extend the effective stroke of the device, and the wavefront sensing and wavefront correction are done at different wavelengths. Issues associated with these techniques will be discussed
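
    The phase-wrapping trick mentioned above is easy to illustrate. The following is a hedged sketch, not the testbed's actual control code, assuming a monochromatic correction and a device stroke of one wave: a commanded phase map can be folded back into the modulator's range because a phase of phi + 2*pi*k is optically equivalent to phi.

```python
import numpy as np

def wrap_phase(phase_map, waves_of_stroke=1.0):
    """Fold a commanded phase map (radians) into the available stroke.

    For monochromatic light, a phase of (phi + 2*pi*k) produces the same
    field as phi, so corrections larger than the modulator's stroke can be
    wrapped back into range instead of being clipped.
    """
    stroke = 2.0 * np.pi * waves_of_stroke
    return np.mod(phase_map, stroke)

# Example: a tilt of roughly three waves wrapped into a one-wave stroke.
ramp = np.linspace(0.0, 6.0 * np.pi, 7)
print(wrap_phase(ramp))   # every value now lies in [0, 2*pi)
```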

  14. Thermal and Fluid Modeling of the CRYogenic Orbital TEstbed (CRYOTE) Ground Test Article (GTA)

    Science.gov (United States)

    Piryk, David; Schallhorn, Paul; Walls, Laurie; Stopnitzky, Benny; Rhys, Noah; Wollen, Mark

    2012-01-01

    The purpose of this study was to anchor thermal and fluid system models to data acquired from a ground test article (GTA) for the CRYogenic Orbital TEstbed - CRYOTE. To accomplish this analysis, it was broken into four primary tasks. These included model development, pre-test predictions, testing support at Marshall Space Flight Center (MSFC), and post-test correlations. Information from MSFC facilitated the task of refining and correlating the initial models. The primary goal of the modeling/testing/correlating efforts was to characterize heat loads throughout the ground test article. Significant factors impacting the heat loads included radiative environments, multi-layer insulation (MLI) performance, tank fill levels, tank pressures, and even contact conductance coefficients. This paper demonstrates how analytical thermal/fluid networks were established, and it includes supporting rationale for specific thermal responses seen during testing.

  15. Living with a Star (LWS) Space Environment Testbeds (SET), Mission Carrier Overview and Capabilities

    Science.gov (United States)

    Patschke, Robert; Barth, Janet; Label, Ken; Mariano, Carolyn; Pham, Karen; Brewer, Dana; Cuviello, Michael; Kobe, David; Wu, Carl; Jarosz, Donald

    2004-01-01

    NASA has initiated the Living With a Star (LWS) Program to develop the scientific understanding to address the aspects of the Connected Sun-Earth system that affect life and society. A goal of the program is to bridge the gap between science, engineering, and user application communities. This will enable future science, operational, and commercial objectives in space and atmospheric environments by improving engineering approaches to the accommodation and/or mitigation of the effects of solar variability on technological systems. The three program elements of the LWS Program are Science Missions; Targeted Research and Technology; and Space Environment Testbeds (SET). SET is an ideal platform for small experiments performing research on space environment effects on technologies and on the mitigation of space weather effects. A short description of the LWS Program will be given, and the SET will be described in detail, giving the mission objectives, available carrier services, and upcoming flight opportunities.

  16. OPNET/Simulink Based Testbed for Disturbance Detection in the Smart Grid

    Energy Technology Data Exchange (ETDEWEB)

    Sadi, Mohammad A. H. [University of Memphis; Dasgupta, Dipankar [ORNL; Ali, Mohammad Hassan [University of Memphis; Abercrombie, Robert K [ORNL

    2015-01-01

    The cyber/information infrastructure, which is primarily used to communicate with different grid components, is the backbone of the smart grid. A smart grid is a complex cyber-physical system containing numerous and varied sources, devices, controllers, and loads, and it is therefore vulnerable to grid-related disturbances. For such a dynamic system, disturbance and intrusion detection is a paramount issue. This paper presents a Simulink- and OPNET-based co-simulation platform for carrying out cyber intrusions on the communication network of modern power systems and the smart grid. The IEEE 30-bus power system model is used to demonstrate the effectiveness of the simulated testbed. The experiments were performed by disturbing the circuit breakers' reclosing time through a cyber-attack. Several disturbance situations in the test system are examined, and the results indicate the effectiveness of the proposed co-simulation scheme.

  17. MODELING CIRCUMSTELLAR DISKS OF B-TYPE STARS WITH OBSERVATIONS FROM THE PALOMAR TESTBED INTERFEROMETER

    International Nuclear Information System (INIS)

    Grzenia, B. J.; Tycner, C.; Jones, C. E.; Sigut, T. A. A.; Rinehart, S. A.; Van Belle, G. T.

    2013-01-01

    Geometrical (uniform disk) and numerical models were calculated for a set of B-emission (Be) stars observed with the Palomar Testbed Interferometer (PTI). Physical extents have been estimated for the disks of a total of 15 stars via uniform disk models. Our numerical non-LTE models used parameters for the B0, B2, B5, and B8 spectral classes and following the framework laid by previous studies, we have compared them to infrared K-band interferometric observations taken at PTI. This is the first time such an extensive set of Be stars observed with long-baseline interferometry has been analyzed with self-consistent non-LTE numerical disk models.
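
    For context, the uniform-disk fits referred to above use the standard single-parameter visibility model of optical/infrared interferometry; the relation below is the generic textbook form, not a formula quoted from this paper:

```latex
V^2(B) \;=\; \left|\frac{2\,J_1(x)}{x}\right|^2,
\qquad x \;=\; \frac{\pi\,B\,\theta_{\mathrm{UD}}}{\lambda},
```

    where B is the projected baseline length, lambda the observing wavelength (the K band for PTI), theta_UD the uniform-disk angular diameter, and J_1 the first-order Bessel function of the first kind.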

  18. Disassembly automation automated systems with cognitive abilities

    CERN Document Server

    Vongbunyong, Supachai

    2015-01-01

    This book presents a number of aspects to be considered in the development of disassembly automation, including the mechanical system, vision system and intelligent planner. The implementation of cognitive robotics increases the flexibility and degree of autonomy of the disassembly system. Disassembly, as a step in the treatment of end-of-life products, can allow the recovery of embodied value left within disposed products, as well as the appropriate separation of potentially-hazardous components. In the end-of-life treatment industry, disassembly has largely been limited to manual labor, which is expensive in developed countries. Automation is one possible solution for economic feasibility. The target audience primarily comprises researchers and experts in the field, but the book may also be beneficial for graduate students.

  19. Highway Electrification And Automation

    OpenAIRE

    Shladover, Steven E.

    1992-01-01

    This report addresses how the California Department of Transportation and the California PATH Program have made efforts to evaluate the feasibility and applicability of highway electrification and automation technologies. In addition to describing how the work was conducted, the report also describes the findings on highway electrification and highway automation, with experimental results, design study results, and a region-wide application impacts study for Los Angeles.

  20. Automated lattice data generation

    Directory of Open Access Journals (Sweden)

    Ayyar Venkitesh

    2018-01-01

    Full Text Available The process of generating ensembles of gauge configurations (and measuring various observables over them) can be tedious and error-prone when done “by hand”. In practice, most of this procedure can be automated with the use of a workflow manager. We discuss how this automation can be accomplished using Taxi, a minimal Python-based workflow manager built for generating lattice data. We present a case study demonstrating this technology.

  1. Automated lattice data generation

    Science.gov (United States)

    Ayyar, Venkitesh; Hackett, Daniel C.; Jay, William I.; Neil, Ethan T.

    2018-03-01

    The process of generating ensembles of gauge configurations (and measuring various observables over them) can be tedious and error-prone when done "by hand". In practice, most of this procedure can be automated with the use of a workflow manager. We discuss how this automation can be accomplished using Taxi, a minimal Python-based workflow manager built for generating lattice data. We present a case study demonstrating this technology.
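
    To make the workflow-manager idea concrete, here is a deliberately generic sketch of dependency-ordered task dispatch. It is hypothetical and does not reproduce Taxi's actual API; the task names and the toy "generate then measure" chain are illustrative only.

```python
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Task:
    name: str
    run: Callable[[], None]
    requires: List[str] = field(default_factory=list)  # prerequisite task names

def run_workflow(tasks):
    """Dispatch tasks once all of their prerequisites have completed,
    a toy stand-in for a lattice-workflow manager's scheduling loop."""
    done, pending = set(), {t.name: t for t in tasks}
    while pending:
        ready = [t for t in pending.values() if all(r in done for r in t.requires)]
        if not ready:
            raise RuntimeError("circular or unsatisfiable dependencies")
        for t in ready:
            t.run()
            done.add(t.name)
            del pending[t.name]

# Toy chain: generate a gauge configuration, then measure an observable on it.
steps = [
    Task("generate_cfg_100", lambda: print("HMC update -> cfg 100")),
    Task("measure_cfg_100", lambda: print("measure plaquette on cfg 100"),
         requires=["generate_cfg_100"]),
]
run_workflow(steps)
```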

  2. Automated security management

    CERN Document Server

    Al-Shaer, Ehab; Xie, Geoffrey

    2013-01-01

    In this contributed volume, leading international researchers explore configuration modeling and checking, vulnerability and risk assessment, configuration analysis, and diagnostics and discovery. The authors equip readers to understand automated security management systems and techniques that increase overall network assurability and usability. These constantly changing networks defend against cyber attacks by integrating hundreds of security devices such as firewalls, IPSec gateways, IDS/IPS, authentication servers, authorization/RBAC servers, and crypto systems. Automated Security Managemen

  3. Marketing automation supporting sales

    OpenAIRE

    Sandell, Niko

    2016-01-01

    The past couple of decades has been a time of major changes in marketing. Digitalization has become a permanent part of marketing and at the same time enabled efficient collection of data. Personalization and customization of content are playing a crucial role in marketing when new customers are acquired. This has also created a need for automation to facilitate the distribution of targeted content. As a result of successful marketing automation more information of the customers is gathered ...

  4. Instant Sikuli test automation

    CERN Document Server

    Lau, Ben

    2013-01-01

    Get to grips with a new technology, understand what it is and what it can do for you, and then get to work with the most important features and tasks. A concise guide written in an easy-to-follow style using the Starter guide approach. This book is aimed at automation and testing professionals who want to use Sikuli to automate GUIs. Some Python programming experience is assumed.

  5. Managing laboratory automation

    OpenAIRE

    Saboe, Thomas J.

    1995-01-01

    This paper discusses the process of managing automated systems through their life cycles within the quality-control (QC) laboratory environment. The focus is on the process of directing and managing the evolving automation of a laboratory; system examples are given. The author shows how both task and data systems have evolved, and how they interrelate. A BIG picture, or continuum view, is presented and some of the reasons for success or failure of the various examples cited are explored. Fina...

  6. Shielded cells transfer automation

    International Nuclear Information System (INIS)

    Fisher, J.J.

    1984-01-01

    Nuclear waste from shielded cells is removed, packaged, and transferred manually in many nuclear facilities. Radiation exposure is absorbed by operators during these operations and limited only through procedural controls. Technological advances in automation using robotics have allowed a production waste removal operation to be automated to reduce radiation exposure. The robotic system bags waste containers out of a glove box and transfers them to a shielded container. Operators control the system from outside the work area via television cameras. 9 figures

  7. Automated Status Notification System

    Science.gov (United States)

    2005-01-01

    NASA Lewis Research Center's Automated Status Notification System (ASNS) was born out of need. To prevent "hacker attacks," Lewis' telephone system needed to monitor communications activities 24 hr a day, 7 days a week. With decreasing staff resources, this continuous monitoring had to be automated. By utilizing existing communications hardware, a UNIX workstation, and NAWK (a pattern scanning and processing language), we implemented a continuous monitoring system.
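
    The pattern-scanning idea behind such a monitor can be sketched in a few lines. The example below is a hedged illustration, not the original NAWK scripts; the suspicious patterns and the log path are hypothetical.

```python
import re
import time

SUSPICIOUS = re.compile(r"LOGIN FAILED|UNAUTHORIZED|TRUNK BUSY")  # hypothetical patterns

def follow(path):
    """Yield lines appended to a log file, roughly like 'tail -f'."""
    with open(path) as log:
        log.seek(0, 2)                    # start at the end of the file
        while True:
            line = log.readline()
            if line:
                yield line.rstrip("\n")
            else:
                time.sleep(1.0)

def monitor(path, notify=print):
    """Scan each new log line and raise a notification on a pattern match."""
    for line in follow(path):
        if SUSPICIOUS.search(line):
            notify(f"ALERT: {line}")      # could instead send mail or page an operator

# monitor("/var/log/switch/activity.log")   # hypothetical log location
```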

  8. Real-time remote diagnostic monitoring test-bed in JET

    International Nuclear Information System (INIS)

    Castro, R.; Kneupner, K.; Vega, J.; De Arcas, G.; Lopez, J.M.; Purahoo, K.; Murari, A.; Fonseca, A.; Pereira, A.; Portas, A.

    2010-01-01

    Based on the remote experimentation concept oriented to long pulse shots, a test-bed system has been implemented in JET. Its main functionality is the remote real-time monitoring of a reflectometer diagnostic, to visualize different data outputs and status information. The architecture of the system is formed by the data generator components, the data distribution system, an access control service, and the client applications. In the test-bed there is one data generator, which is the acquisition equipment associated with the reflectometer diagnostic that generates data and status information. The data distribution system has been implemented using a publish-subscribe technology that receives data from the data generators and redistributes them to client applications. Finally, for monitoring, a client application based on Java Web Start technology has been used. There are three interesting results from this project. The first one is the analysis of different aspects (data formats, data frame rate, data resolution, etc.) related to remote real-time diagnostic monitoring oriented to long pulse experiments. The second one is the definition and implementation of an architecture flexible enough to be applied to different types of data generated by other diagnostics and that fits remote access requirements. Finally, the third result is a secure system, which takes into account the internal network and firewall aspects of JET and secures access from remote users. For this last issue, PAPI technology has been used, enabling access control based on user attributes, enabling mobile users to monitor diagnostics in real time, and enabling the integration of this service into the EFDA Federation (Castro et al., 2008).
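
    The data distribution layer described above follows the classic publish-subscribe pattern. A minimal in-process sketch of that pattern is shown below; it is illustrative only, since the JET system is a networked service, and the topic name and frame contents are hypothetical.

```python
from collections import defaultdict

class Broker:
    """Toy publish-subscribe hub: data generators publish frames on a topic,
    and every client subscribed to that topic receives each frame."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, frame):
        for callback in self._subscribers[topic]:
            callback(frame)

broker = Broker()
broker.subscribe("reflectometer/status", lambda frame: print("client saw:", frame))
broker.publish("reflectometer/status", {"shot": 79000, "state": "ACQUIRING"})
```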

  9. Real-time remote diagnostic monitoring test-bed in JET

    Energy Technology Data Exchange (ETDEWEB)

    Castro, R., E-mail: rodrigo.castro@ciemat.e [Asociacion EURATOM/CIEMAT para Fusion, Madrid (Spain); Kneupner, K. [EURATOM/UKAEA Fusion Association, Culham Science Centre, Abingdon, OX14 3DB (United Kingdom); Vega, J. [Asociacion EURATOM/CIEMAT para Fusion, Madrid (Spain); De Arcas, G.; Lopez, J.M. [Universidad Politecnica de Madrid, Grupo I2A2, Madrid (Spain); Purahoo, K. [EURATOM/UKAEA Fusion Association, Culham Science Centre, Abingdon, OX14 3DB (United Kingdom); Murari, A. [Associazione EURATOM-ENEA per la Fusione, Consorzio RFX, 4-35127 Padova (Italy); Fonseca, A. [Associacao EURATOM/IST, Lisbon (Portugal); Pereira, A.; Portas, A. [Asociacion EURATOM/CIEMAT para Fusion, Madrid (Spain)

    2010-07-15

    Based on the remote experimentation concept oriented to long pulse shots, a test-bed system has been implemented in JET. Its main functionality is the remote real-time monitoring of a reflectometer diagnostic, to visualize different data outputs and status information. The architecture of the system is formed by the data generator components, the data distribution system, an access control service, and the client applications. In the test-bed there is one data generator, which is the acquisition equipment associated with the reflectometer diagnostic that generates data and status information. The data distribution system has been implemented using a publish-subscribe technology that receives data from the data generators and redistributes them to client applications. Finally, for monitoring, a client application based on Java Web Start technology has been used. There are three interesting results from this project. The first one is the analysis of different aspects (data formats, data frame rate, data resolution, etc.) related to remote real-time diagnostic monitoring oriented to long pulse experiments. The second one is the definition and implementation of an architecture flexible enough to be applied to different types of data generated by other diagnostics and that fits remote access requirements. Finally, the third result is a secure system, which takes into account the internal network and firewall aspects of JET and secures access from remote users. For this last issue, PAPI technology has been used, enabling access control based on user attributes, enabling mobile users to monitor diagnostics in real time, and enabling the integration of this service into the EFDA Federation (Castro et al., 2008).

  10. Automated Groundwater Screening

    International Nuclear Information System (INIS)

    Taylor, Glenn A.; Collard, Leonard B.

    2005-01-01

    The Automated Intruder Analysis has been extended to include an Automated Groundwater Screening option. This option screens 825 radionuclides while rigorously applying the National Council on Radiation Protection (NCRP) methodology. An extension to that methodology is presented to give a more realistic screening factor for those radionuclides which have significant daughters. The extension has the promise of reducing the number of radionuclides which must be tracked by the customer. By combining the Automated Intruder Analysis with the Automated Groundwater Screening, a consistent set of assumptions and databases is used. A method is proposed to eliminate trigger values by performing a rigorous calculation of the screening factor, thereby reducing the number of radionuclides sent to further analysis. Using the same problem definitions as in previous groundwater screenings, the automated groundwater screening found one additional nuclide, Ge-68, which failed the screening. It also found that 18 of the 57 radionuclides contained in NCRP Table 3.1 failed the screening. This report describes the automated groundwater screening computer application

  11. ACES-Based Testbed and Bayesian Game-Theoretic Framework for Dynamic Airspace Configuration, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The key innovation in this effort is the development of algorithms and a framework for automated Dynamic Airspace Configuration (DAC) using a cooperative Bayesian...

  12. The Soil Moisture Active Passive Mission (SMAP) Science Data Products: Results of Testing with Field Experiment and Algorithm Testbed Simulation Environment Data

    Science.gov (United States)

    Entekhabi, Dara; Njoku, Eni E.; O'Neill, Peggy E.; Kellogg, Kent H.; Entin, Jared K.

    2010-01-01

    Talk outline: (1) derivation of SMAP basic and applied science requirements from the NRC Earth Science Decadal Survey applications; (2) data products and latencies; (3) algorithm highlights; (4) the SMAP Algorithm Testbed; (5) SMAP Working Groups and community engagement.

  13. Automation of Taxiing

    Directory of Open Access Journals (Sweden)

    Jaroslav Bursík

    2017-01-01

    Full Text Available The article focuses on the possibility of automating taxiing, the part of a flight that, under adverse weather conditions, greatly reduces the operational usability of an airport, and the only part of a flight that has not yet been affected by automation. Taxiing is currently handled manually by the pilot, who controls the airplane based on information from visual perception. The article primarily deals with possible ways of obtaining navigational information and transferring it automatically to the controls. Currently available technologies such as computer vision, Light Detection and Ranging, and Global Navigation Satellite Systems, which are useful for navigation, were analyzed and assessed, and their general implementation into an airplane was designed. Obstacles to the implementation were identified, too. The result is a proposed combination of systems, along with their installation into the airplane’s systems, so that automated taxiing becomes possible.

  14. Oceanic Platform of the Canary Islands: an ocean testbed for ocean energy converters

    Science.gov (United States)

    González, Javier; Hernández-Brito, Joaquín.; Llinás, Octavio

    2010-05-01

    The Oceanic Platform of the Canary Islands (PLOCAN) is a governmental consortium that aims to build and operate an off-shore infrastructure to facilitate deep-sea research and speed up the associated technology. The Consortium is overseen by the Spanish Ministry of Science and Innovation and the Canarian Agency for Research and Innovation. The infrastructure consists of an oceanic platform located in an area with depths between 50 and 100 meters, close to the continental slope and four kilometers off the coast of Gran Canaria, in the archipelago of the Canary Islands. Construction will start during the first months of 2010 and is expected to be finished by mid-2011. PLOCAN serves five strategic lines: an integral observatory able to explore from the deep ocean to the atmosphere, an ocean technology testbed, a base for underwater vehicles, an innovation platform, and a highly specialized training centre. Ocean energy is a suitable source to complement the limited energy mix of the archipelago of the Canary Islands, which has a total population of around 2 million people unevenly distributed across seven islands. The islands of Gran Canaria and Tenerife support 80% of the total population, with 800,000 people each. PLOCAN will contribute to developing the ocean energy sector by establishing a marine testbed that allows prototype testing at sea under a meticulous monitoring network provided by the integral observatory, generating valuable information for developers. Reducing costs through integral project management, providing services such as transportation, customs, and administrative permits, is an essential objective. The ocean surface available for testing activities is around 8 km2, with depths from 50 to 100 meters, 4 km off the coast. Selected areas for testing have off-shore wind power conditions of around 500-600 W/m2 and wave power conditions of around 6 kW/m on the east coast and 10 kW/m on the north coast. Marine currents in the Canary Islands are

  15. Implementation of Motion Simulation Software and Visual-Auditory Electronics for Use in a Low Gravity Robotic Testbed

    Science.gov (United States)

    Martin, William Campbell

    2011-01-01

    The Jet Propulsion Laboratory (JPL) is developing the All-Terrain Hex-Limbed Extra-Terrestrial Explorer (ATHLETE) to assist in manned space missions. One of the proposed targets for this robotic vehicle is a near-Earth asteroid (NEA), which typically exhibits a surface gravity of only a few micro-g. In order to properly test ATHLETE in such an environment, the development team has constructed an inverted Stewart platform testbed that acts as a robotic motion simulator. This project focused on creating physical simulation software that is able to predict how ATHLETE will function on and around a NEA. The corresponding platform configurations are calculated and then passed to the testbed to control ATHLETE's motion. In addition, imitation attitude control thrusters were designed and fabricated for use on ATHLETE. These utilize a combination of high-power LEDs and audio amplifiers to provide visual and auditory cues that correspond to the physics simulation.

  16. Evaluasi Kinerja Layanan IPTV pada Jaringan Testbed WiMAX Berbasis Standar IEEE 802.16-2004

    Directory of Open Access Journals (Sweden)

    Prasetiyono Hari Mukti

    2015-09-01

    Full Text Available In this paper, a performance evaluation of IPTV services over a WiMAX testbed based on IEEE Standard 802.16-2004 is described. The performance of the proposed system is evaluated in terms of delay, jitter, throughput, and packet loss. Service performance evaluations are conducted on a point-to-point network topology under varying background traffic with different scheduling types. Background traffic is injected into the system so that the proposed system experiences a varying traffic load. The scheduling types used in this paper are Best Effort (BE), Non-Real-Time Polling Service (nrtPS), Real-Time Polling Service (rtPS), and Unsolicited Grant Service (UGS). The experimental results of IPTV service performance over the testbed network show that the maximum averages of delay, jitter, throughput, and packet loss are 16.581 ms, 58.515 ms, 0.67 Mbps, and 10.96%, respectively.
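
    As a hedged illustration of how the four reported metrics can be computed from per-packet records (the packet format and numbers below are invented, not taken from the paper):

```python
def qos_metrics(packets, interval_s):
    """packets: list of (seq, sent_s, recv_s_or_None, size_bytes) tuples.
    Returns mean one-way delay (s), mean jitter (s), throughput (bit/s),
    and packet-loss ratio over the measurement interval."""
    received = [p for p in packets if p[2] is not None]
    delays = [recv - sent for _, sent, recv, _ in received]
    mean_delay = sum(delays) / len(delays)
    jitter = (sum(abs(b - a) for a, b in zip(delays, delays[1:])) / (len(delays) - 1)
              if len(delays) > 1 else 0.0)
    throughput = 8 * sum(size for *_, size in received) / interval_s
    loss = 1.0 - len(received) / len(packets)
    return mean_delay, jitter, throughput, loss

pkts = [(1, 0.00, 0.016, 1400), (2, 0.02, 0.041, 1400), (3, 0.04, None, 1400)]
print(qos_metrics(pkts, interval_s=0.06))
```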

  17. The Fourier-Kelvin Stellar Interferometer (FKSI) Nulling Testbed II: Closed-loop Path Length Metrology And Control Subsystem

    Science.gov (United States)

    Frey, B. J.; Barry, R. K.; Danchi, W. C.; Hyde, T. T.; Lee, K. Y.; Martino, A. J.; Zuray, M. S.

    2006-01-01

    The Fourier-Kelvin Stellar Interferometer (FKSI) is a mission concept for an imaging and nulling interferometer in the near to mid-infrared spectral region (3-8 microns), and will be a scientific and technological pathfinder for upcoming missions including TPF-I/DARWIN, SPECS, and SPIRIT. At NASA's Goddard Space Flight Center, we have constructed a symmetric Mach-Zehnder nulling testbed to demonstrate techniques and algorithms that can be used to establish and maintain the 10^4 null depth that will be required for such a mission. Among the challenges inherent in such a system is the ability to acquire and track the null fringe to the desired depth for timescales on the order of hours in a laboratory environment. In addition, it is desirable to achieve this stability without using conventional dithering techniques. We describe recent testbed metrology and control system developments necessary to achieve these goals and present our preliminary results.

  18. Optimizing Electric Vehicle Coordination Over a Heterogeneous Mesh Network in a Scaled-Down Smart Grid Testbed

    DEFF Research Database (Denmark)

    Bhattarai, Bishnu Prasad; Lévesque, Martin; Maier, Martin

    2015-01-01

    High penetration of renewable energy sources and electric vehicles (EVs) creates power imbalance and congestion in the existing power network, and hence causes significant problems in control and operation. Despite the huge efforts invested by electric utilities, governments, and researchers, the smart grid (SG) is still at the developmental stage in addressing those issues. In this regard, a smart grid testbed (SGT) is desirable to develop, analyze, and demonstrate various novel SG solutions, namely demand response, real-time pricing, and congestion management. In this paper, a novel SGT is developed in a laboratory by scaling a 250 kVA, 0.4 kV real low-voltage distribution feeder down to 1 kVA, 0.22 kV. Information and communication technology is integrated into the scaled-down network to establish real-time monitoring and control. The novelty of the developed testbed is demonstrated
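
    A hedged numerical aside on what such down-scaling implies, assuming a three-phase feeder and standard base definitions (the authors' actual design choices are not reproduced here):

```python
import math

# Base quantities of the full-scale feeder and the laboratory model.
S_full, V_full = 250e3, 400.0     # VA, V (line-to-line)
S_lab,  V_lab  = 1e3,   220.0

# Base current I = S / (sqrt(3) * V) and base impedance Z = V^2 / S
# for a three-phase system (assumption).
I_full = S_full / (math.sqrt(3) * V_full)   # ~361 A
I_lab  = S_lab  / (math.sqrt(3) * V_lab)    # ~2.6 A
Z_full = V_full ** 2 / S_full               # 0.64 ohm
Z_lab  = V_lab ** 2 / S_lab                 # 48.4 ohm

# Keeping per-unit impedances equal means laboratory components must be
# roughly 75 times larger in ohms than their full-scale counterparts.
print(round(Z_lab / Z_full, 1))             # ~75.6
```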

  19. A Monocular Vision Measurement System of Three-Degree-of-Freedom Air-Bearing Test-Bed Based on FCCSP

    Science.gov (United States)

    Gao, Zhanyu; Gu, Yingying; Lv, Yaoyu; Xu, Zhenbang; Wu, Qingwen

    2018-06-01

    A monocular vision-based pose measurement system is provided for real-time measurement of a three-degree-of-freedom (3-DOF) air-bearing test-bed. Firstly, a circular plane cooperative target is designed. An image of a target fixed on the test-bed is then acquired. Blob analysis-based image processing is used to detect the object circles on the target. A fast algorithm (FCCSP) based on pixel statistics is proposed to extract the centers of object circles. Finally, pose measurements are obtained by combining the extracted centers with the coordinate transformation relation. Experiments show that the proposed method is fast, accurate, and robust enough to satisfy the requirements of the pose measurement.
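
    The center-extraction step lends itself to a tiny illustration. The sketch below shows only the underlying pixel-statistics idea (the centroid of a segmented blob), not the published FCCSP algorithm.

```python
import numpy as np

def blob_center(mask):
    """Return the (row, col) centroid of a binary blob mask, i.e. the mean of
    the coordinates of the pixels belonging to the blob - a simple
    pixel-statistics estimate of a circle's center."""
    rows, cols = np.nonzero(mask)
    return rows.mean(), cols.mean()

# Toy 7x7 image containing a filled disc of radius 2 centered at (3, 3).
y, x = np.mgrid[0:7, 0:7]
disc = (y - 3) ** 2 + (x - 3) ** 2 <= 4
print(blob_center(disc))   # -> (3.0, 3.0)
```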

  20. Control and automation systems

    International Nuclear Information System (INIS)

    Schmidt, R.; Zillich, H.

    1986-01-01

    A survey is given of the development of control and automation systems for energy uses. General remarks about control and automation schemes are followed by a description of modern process control systems along with process control processes as such. After discussing the particular process control requirements of nuclear power plants the paper deals with the reliability and availability of process control systems and refers to computerized simulation processes. The subsequent paragraphs are dedicated to descriptions of the operating floor, ergonomic conditions, existing systems, flue gas desulfurization systems, the electromagnetic influences on digital circuits as well as of light wave uses. (HAG) [de

  1. Automated nuclear materials accounting

    International Nuclear Information System (INIS)

    Pacak, P.; Moravec, J.

    1982-01-01

    An automated state system of accounting for nuclear materials data was established in Czechoslovakia in 1979. A file was compiled of 12 programs in the PL/1 language. The file is divided into four groups according to logical associations, namely programs for data input and checking, programs for handling the basic data file, programs for report outputs in the form of worksheets and magnetic tape records, and programs for book inventory listing, document inventory handling and materials balance listing. A similar automated system of nuclear fuel inventory for a light water reactor was introduced for internal purposes in the Institute of Nuclear Research (UJV). (H.S.)

  2. Automating the CMS DAQ

    International Nuclear Information System (INIS)

    Bauer, G; Darlea, G-L; Gomez-Ceballos, G; Bawej, T; Chaze, O; Coarasa, J A; Deldicque, C; Dobson, M; Dupont, A; Gigi, D; Glege, F; Gomez-Reino, R; Hartl, C; Hegeman, J; Masetti, L; Behrens, U; Branson, J; Cittolin, S; Holzner, A; Erhan, S

    2014-01-01

    We present the automation mechanisms that have been added to the Data Acquisition and Run Control systems of the Compact Muon Solenoid (CMS) experiment during Run 1 of the LHC, ranging from the automation of routine tasks to automatic error recovery and context-sensitive guidance to the operator. These mechanisms helped CMS to maintain a data taking efficiency above 90% and to even improve it to 95% towards the end of Run 1, despite an increase in the occurrence of single-event upsets in sub-detector electronics at high LHC luminosity.

  3. Altering user acceptance of automation through prior automation exposure.

    Science.gov (United States)

    Bekier, Marek; Molesworth, Brett R C

    2017-06-01

    Air navigation service providers worldwide see increased use of automation as one solution to overcome the capacity constraints imbedded in the present air traffic management (ATM) system. However, increased use of automation within any system is dependent on user acceptance. The present research sought to determine if the point at which an individual is no longer willing to accept or cooperate with automation can be manipulated. Forty participants underwent training on a computer-based air traffic control programme, followed by two ATM exercises (order counterbalanced), one with and one without the aid of automation. Results revealed after exposure to a task with automation assistance, user acceptance of high(er) levels of automation ('tipping point') decreased; suggesting it is indeed possible to alter automation acceptance. Practitioner Summary: This paper investigates whether the point at which a user of automation rejects automation (i.e. 'tipping point') is constant or can be manipulated. The results revealed after exposure to a task with automation assistance, user acceptance of high(er) levels of automation decreased; suggesting it is possible to alter automation acceptance.

  4. A Method to Derive Monitoring Variables for a Cyber Security Test-bed of I and C System

    International Nuclear Information System (INIS)

    Han, Kyung Soo; Song, Jae Gu; Lee, Joung Woon; Lee, Cheol Kwon

    2013-01-01

    In the IT field, monitoring techniques have been developed to protect systems connected by networks from cyber attacks and incidents. For the development of monitoring systems for I and C cyber security, it is necessary to review the monitoring systems in the IT field and derive cyber security-related monitoring variables from the proprietary operating information about the I and C systems. Tests for the development and application of these monitoring systems may cause adverse effects on the I and C systems. To analyze influences on the system and safely identify the intended variables, an I and C system test-bed should be constructed first. This article proposes a method of deriving variables that should be monitored through a monitoring system for cyber security as a part of an I and C test-bed. The surveillance features and the monitored variables of an NMS (Network Management System), a monitoring technique in the IT field, are reviewed in Section 2. In Section 3, the monitoring variables for I and C cyber security are derived from the review of the NMS and an investigation of the information used in hacking techniques that can be practiced against I and C systems. The monitoring variables of an NMS in the IT field and the information about the malicious behaviors used for hacking were derived as the variables expected to be monitored for I and C cyber security research. The derived monitoring variables were classified into the five functions of an NMS for efficient management. For the cyber security of I and C systems, the vulnerabilities should be understood through penetration tests and similar means, and an assessment of influences on the actual system should be carried out. Thus, constructing a test-bed of I and C systems is necessary for the safety system in operation. In the future, it will be necessary to develop a logging and monitoring system for studies on the vulnerabilities of I and C systems with test-beds.

  5. Experimental aerodynamic and acoustic model testing of the Variable Cycle Engine (VCE) testbed coannular exhaust nozzle system: Comprehensive data report

    Science.gov (United States)

    Nelson, D. P.; Morris, P. M.

    1980-01-01

    The component detail design drawings of the one-sixth scale model of the variable cycle engine testbed demonstrator exhaust system tested are presented. Also provided are the basic acoustic and aerodynamic data acquired during the experimental model tests. The model drawings, an index to the acoustic data, an index to the aerodynamic data, tabulated and graphical acoustic data, and the tabulated aerodynamic data and graphs are discussed.

  6. A Method to Derive Monitoring Variables for a Cyber Security Test-bed of I and C System

    Energy Technology Data Exchange (ETDEWEB)

    Han, Kyung Soo; Song, Jae Gu; Lee, Joung Woon; Lee, Cheol Kwon [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2013-10-15

    In the IT field, monitoring techniques have been developed to protect systems connected by networks from cyber attacks and incidents. For the development of monitoring systems for I and C cyber security, it is necessary to review the monitoring systems in the IT field and derive cyber security-related monitoring variables from the proprietary operating information about the I and C systems. Tests for the development and application of these monitoring systems may cause adverse effects on the I and C systems. To analyze influences on the system and safely identify the intended variables, an I and C system test-bed should be constructed first. This article proposes a method of deriving variables that should be monitored through a monitoring system for cyber security as a part of an I and C test-bed. The surveillance features and the monitored variables of an NMS (Network Management System), a monitoring technique in the IT field, are reviewed in Section 2. In Section 3, the monitoring variables for I and C cyber security are derived from the review of the NMS and an investigation of the information used in hacking techniques that can be practiced against I and C systems. The monitoring variables of an NMS in the IT field and the information about the malicious behaviors used for hacking were derived as the variables expected to be monitored for I and C cyber security research. The derived monitoring variables were classified into the five functions of an NMS for efficient management. For the cyber security of I and C systems, the vulnerabilities should be understood through penetration tests and similar means, and an assessment of influences on the actual system should be carried out. Thus, constructing a test-bed of I and C systems is necessary for the safety system in operation. In the future, it will be necessary to develop a logging and monitoring system for studies on the vulnerabilities of I and C systems with test-beds.

  7. Dynamic Testing of the NASA Hypersonic Project Combined Cycle Engine Testbed for Mode Transition Experiments

    Science.gov (United States)

    2011-01-01

    NASA is interested in developing technology that leads to more routine, safe, and affordable access to space. Access to space using airbreathing propulsion systems has potential to meet these objectives based on Airbreathing Access to Space (AAS) system studies. To this end, the NASA Fundamental Aeronautics Program (FAP) Hypersonic Project is conducting fundamental research on a Turbine Based Combined Cycle (TBCC) propulsion system. The TBCC being studied considers a dual flow-path inlet system. One flow-path includes variable geometry to regulate airflow to a turbine engine cycle. The turbine cycle provides propulsion from take-off to supersonic flight. The second flow-path supports a dual-mode scramjet (DMSJ) cycle which would be initiated at supersonic speed to further accelerate the vehicle to hypersonic speed. For a TBCC propulsion system to accelerate a vehicle from supersonic to hypersonic speed, a critical enabling technology is the ability to safely and effectively transition from the turbine to the DMSJ, referred to as mode transition. To experimentally test methods of mode transition, a Combined Cycle Engine (CCE) Large-scale Inlet testbed was designed with two flow paths: a low-speed flow-path sized for a turbine cycle and a high-speed flow-path designed for a DMSJ. This testbed system is identified as the CCE Large-Scale Inlet for Mode Transition studies (CCE-LIMX). The test plan for the CCE-LIMX in the NASA Glenn Research Center (GRC) 10- by 10-ft Supersonic Wind Tunnel (10x10 SWT) is segmented into multiple phases. The first phase is a matrix of inlet characterization (IC) tests to evaluate the inlet performance and establish the mode transition schedule. The second phase is a matrix of dynamic system identification (SysID) experiments designed to support closed-loop control development at mode transition schedule operating points for the CCE-LIMX. The third phase includes a direct demonstration of controlled mode transition using a closed-loop control

  8. Real-Time Remote Diagnostic Monitoring Test-bed in JET

    Energy Technology Data Exchange (ETDEWEB)

    Castro, R. [Asociation Euratom/CIEMAT para Fusion, Madrid (Spain); Kneupner, K.; Purahoo, K. [EURATOM/UKAEA Fusion Association, Abingdon (United Kingdom); Vega, J.; Pereira, A.; Portas, A. [Association EuratomCIEMAT para Fusion, Madrid (Spain); De Arcas, G.; Lopez, J.M. [Universidad Politecnica de Madrid (Spain); Murari, A. [Consorzio RFX, Padova (Italy); Fonseca, A. [Associacao URATOM/IST, Lisboa (Portugal); Contributors, J.E. [JET-EFDA, Abingdon (United Kingdom)

    2009-07-01

    Based on the remote experimentation concept oriented to long pulse shots, a test-bed system has been implemented in JET. It integrates two functionalities. The first one is the remote real-time monitoring of a reflectometer diagnostic, to visualize different data outputs and status information. The second one is the integration of dotJET (Diagnostic Overview Tool for JET), which internally provides at JET an overview of the current state of the diagnostic systems, in order to remotely monitor the status of JET diagnostics. The architecture of the system is formed by the data generator components, the data distribution system, an access control service, and the client applications. In the test-bed there are two data generators: the acquisition equipment associated with the reflectometer diagnostic, which generates data and status information, and the dotJET server, which centralizes access to the status information of JET diagnostics. The data distribution system has been implemented using a publish-subscribe technology that receives data from the data generators and redistributes them to client applications. Finally, for monitoring, a client application based on Java Web Start technology and a dotJET client application have been used. There are three interesting results from this project. The first one is the analysis of different aspects (data formats, data frame rate, data resolution, etc.) related to remote real-time diagnostic monitoring oriented to long pulse experiments. The second one is the definition and implementation of an architecture flexible enough to be applied to different types of data generated by other diagnostics and that fits remote access requirements; and the third one is a secure system, which takes into account the internal network and firewall aspects of JET and secures access from remote users. For this last issue, PAPI technology has been used, enabling access control based on user attributes, enabling mobile users to

  9. LIBRARY AUTOMATION IN NIGERIAN UNIVERSITIES

    African Journals Online (AJOL)

    facilitate services and access to information in libraries is widely acceptable. ... Moreover, Ugah (2001) reports that the automation process at the Abubakar ... blueprint in 1987 and a turn-key system of automation was suggested for the library.

  10. A Testbed to Evaluate the FIWARE-Based IoT Platform in the Domain of Precision Agriculture

    Science.gov (United States)

    Martínez, Ramón; Pastor, Juan Ángel; Álvarez, Bárbara; Iborra, Andrés

    2016-01-01

    Wireless sensor networks (WSNs) represent one of the most promising technologies for precision farming. Over the next few years, a significant increase in the use of such systems on commercial farms is expected. WSNs present a number of problems regarding scalability, interoperability, communications, connectivity with databases, and data processing. Different Internet of Things middleware platforms are appearing to overcome these challenges. This paper checks whether one of these middleware platforms, FIWARE, is suitable for the development of agricultural applications. To the authors’ knowledge, there are no works that show how to use FIWARE in precision agriculture and study its appropriateness, scalability, and efficiency for this kind of application. To do this, a testbed has been designed and implemented to simulate different deployments and load conditions. The testbed is a typical FIWARE application, complete, yet simple and comprehensible enough to show the main features and components of FIWARE, as well as the complexity of using this technology. Although the testbed has been deployed in a laboratory environment, its design is based on the analysis of an Internet of Things use case scenario in the domain of precision agriculture. PMID:27886091

  11. Real-Time Emulation of Heterogeneous Wireless Networks with End-to-Edge Quality of Service Guarantees: The AROMA Testbed

    Directory of Open Access Journals (Sweden)

    Anna Umbert

    2010-01-01

    Full Text Available This work presents and describes the real-time testbed for all-IP Beyond 3G (B3G) heterogeneous wireless networks that has been developed in the framework of the European IST AROMA project. The main objective of the AROMA testbed is to provide a highly accurate and realistic framework where the performance of algorithms, policies, protocols, services, and applications for a complete heterogeneous wireless network can be fully assessed and evaluated before bringing them to a real system. The complexity of the interaction between all-IP B3G systems and user applications, while dealing with the Quality of Service (QoS) concept, motivates the development of this kind of emulation platform where different solutions can be tested in realistic conditions that could not be achieved by means of simple offline simulations. This work provides an in-depth description of the AROMA testbed, emphasizing many interesting implementation details and lessons learned during the development of the tool that may prove helpful to other researchers and system engineers in the development of similar emulation platforms. Several case studies are also presented in order to illustrate the full potential and capabilities of the presented emulation platform.

  12. A testbed to explore the optimal electrical stimulation parameters for suppressing inter-ictal spikes in human hippocampal slices.

    Science.gov (United States)

    Min-Chi Hsiao; Pen-Ning Yu; Dong Song; Liu, Charles Y; Heck, Christi N; Millett, David; Berger, Theodore W

    2014-01-01

    New interventions using neuromodulatory devices such as vagus nerve stimulation, deep brain stimulation and responsive neurostimulation are available or under study for the treatment of refractory epilepsy. Since the actual mechanisms of the onset and termination of the seizure are still unclear, most researchers or clinicians determine the optimal stimulation parameters through trial-and-error procedures. It is necessary to further explore what types of electrical stimulation parameters (these may include stimulation frequency, amplitude, duration, interval pattern, and location) constitute a set of optimal stimulation paradigms to suppress seizures. In a previous study, we developed an in vitro epilepsy model using hippocampal slices from patients suffering from mesial temporal lobe epilepsy. Using a planar multi-electrode array system, inter-ictal activity from human hippocampal slices was consistently recorded. In this study, we have further transferred this in vitro seizure model to a testbed for exploring the possible neurostimulation paradigms to inhibit inter-ictal spikes. The methodology used to collect the electrophysiological data and the approach used to apply different electrical stimulation parameters to the slices are provided in this paper. The results show that this experimental testbed will provide a platform for testing the optimal stimulation parameters of seizure cessation. We expect this testbed will expedite the process for identifying the most effective parameters, and may ultimately be used to guide programming of new stimulating paradigms for neuromodulatory devices.

  13. Developing the science product algorithm testbed for Chinese next-generation geostationary meteorological satellites: Fengyun-4 series

    Science.gov (United States)

    Min, Min; Wu, Chunqiang; Li, Chuan; Liu, Hui; Xu, Na; Wu, Xiao; Chen, Lin; Wang, Fu; Sun, Fenglin; Qin, Danyu; Wang, Xi; Li, Bo; Zheng, Zhaojun; Cao, Guangzhen; Dong, Lixin

    2017-08-01

    Fengyun-4A (FY-4A), the first of the Chinese next-generation geostationary meteorological satellites, launched in 2016, offers several advances over the FY-2: more spectral bands, faster imaging, and infrared hyperspectral measurements. To support the major objective of developing the prototypes of FY-4 science algorithms, two science product algorithm testbeds for imagers and sounders have been developed by the scientists in the FY-4 Algorithm Working Group (AWG). Both testbeds, written in FORTRAN and C programming languages for Linux or UNIX systems, have been tested successfully by using Intel/g compilers. Some important FY-4 science products, including cloud mask, cloud properties, and temperature profiles, have been retrieved successfully by using data from a proxy imager, the Himawari-8 Advanced Himawari Imager (AHI), and sounder data obtained from the Atmospheric InfraRed Sounder, thus demonstrating their robustness. In addition, in early 2016 the FY-4 AWG developed, based on the imager testbed, a near real-time processing system for Himawari-8/AHI data for use by Chinese weather forecasters. Consequently, robust and flexible science product algorithm testbeds have provided essential and productive tools for popularizing FY-4 data and developing substantial improvements in FY-4 products.

  14. Future Trends in Process Automation

    OpenAIRE

    Jämsä-Jounela, Sirkka-Liisa

    2007-01-01

    The importance of automation in the process industries has increased dramatically in recent years. In the highly industrialized countries, process automation serves to enhance product quality, master the whole range of products, improve process safety and plant availability, efficiently utilize resources and lower emissions. In the rapidly developing countries, mass production is the main motivation for applying process automation. The greatest demand for process automation is in the chemical...

  15. Adaptive Automation Design and Implementation

    Science.gov (United States)

    2015-09-17

    with an automated system to a real-world adaptive automation system implementation. There have been plenty of adaptive automation ... of systems without increasing manpower requirements by allocating routine tasks to automated aids, improving safety through the use of automated ... between intermediate levels of automation, explicitly defining which human task a given level automates. Each model aids the creation and classification

  16. Automated HAZOP revisited

    DEFF Research Database (Denmark)

    Taylor, J. R.

    2017-01-01

    Hazard and operability analysis (HAZOP) has developed from a tentative approach to hazard identification for process plants in the early 1970s to an almost universally accepted approach today, and a central technique of safety engineering. Techniques for automated HAZOP analysis were developed...

  17. Automated Student Model Improvement

    Science.gov (United States)

    Koedinger, Kenneth R.; McLaughlin, Elizabeth A.; Stamper, John C.

    2012-01-01

    Student modeling plays a critical role in developing and improving instruction and instructional technologies. We present a technique for automated improvement of student models that leverages the DataShop repository, crowd sourcing, and a version of the Learning Factors Analysis algorithm. We demonstrate this method on eleven educational…

  18. Automated Vehicle Monitoring System

    OpenAIRE

    Wibowo, Agustinus Deddy Arief; Heriansyah, Rudi

    2014-01-01

    An automated vehicle monitoring system is proposed in this paper. The surveillance system is based on image processing techniques such as background subtraction, colour balancing, chain-code-based shape detection, and blob analysis. The proposed system will detect any human head that appears at the side mirrors. The detected head will be tracked and recorded for further action.

  19. Mechatronic Design Automation

    DEFF Research Database (Denmark)

    Fan, Zhun

    successfully design analogue filters, vibration absorbers, micro-electro-mechanical systems, and vehicle suspension systems, all in an automatic or semi-automatic way. It also investigates the very important issue of co-designing plant-structures and dynamic controllers in automated design of Mechatronic...

  20. Automated Accounting. Instructor Guide.

    Science.gov (United States)

    Moses, Duane R.

    This curriculum guide was developed to assist business instructors using Dac Easy Accounting College Edition Version 2.0 software in their accounting programs. The module consists of four units containing assignment sheets and job sheets designed to enable students to master competencies identified in the area of automated accounting. The first…

  1. Automated conflict resolution issues

    Science.gov (United States)

    Wike, Jeffrey S.

    1991-01-01

    A discussion is presented of how conflicts for Space Network resources should be resolved in the ATDRSS era. The following topics are presented: a description of how resource conflicts are currently resolved; a description of issues associated with automated conflict resolution; present conflict resolution strategies; and topics for further discussion.

  2. Automated gamma counters

    International Nuclear Information System (INIS)

    Regener, M.

    1977-01-01

    This is a report on the most recent developments in the full automation of gamma counting in RIA, in particular by Messrs. Kontron. The development targets were flexibility in sample capacity and shape of test tubes, the possibility of using different radioisotopes for labelling due to an optimisation of the detector system, and the use of microprocessors to substitute software for hardware. (ORU) [de

  3. Myths in test automation

    Directory of Open Access Journals (Sweden)

    Jazmine Francis

    2014-12-01

    Full Text Available Myths in the automation of software testing are an issue of discussion that echoes throughout the software validation services industry. The first thought that appears to a knowledgeable reader would probably be: why this old topic again, and what is new to discuss about the matter? But everyone agrees that automation testing today is not what it used to be ten or fifteen years ago, because it has evolved in scope and magnitude. What began as simple linear scripts for web applications today has a complex architecture and a hybrid framework to facilitate the implementation of testing for applications developed with various platforms and technologies. Undoubtedly automation has advanced, but so have the myths associated with it. The change in people's perspective and knowledge of automation has altered the terrain. This article reflects the author's points of view and experience regarding the transformation of the original myths into new versions and how they are derived; it also provides his thoughts on the new generation of myths.

  4. Myths in test automation

    Directory of Open Access Journals (Sweden)

    Jazmine Francis

    2015-01-01

    Full Text Available Myths in the automation of software testing are an issue of discussion that echoes throughout the software validation services industry. The first thought that appears to a knowledgeable reader would probably be: why this old topic again, and what is new to discuss about the matter? But everyone agrees that automation testing today is not what it used to be ten or fifteen years ago, because it has evolved in scope and magnitude. What began as simple linear scripts for web applications today has a complex architecture and a hybrid framework to facilitate the implementation of testing for applications developed with various platforms and technologies. Undoubtedly automation has advanced, but so have the myths associated with it. The change in people's perspective and knowledge of automation has altered the terrain. This article reflects the author's points of view and experience regarding the transformation of the original myths into new versions and how they are derived; it also provides his thoughts on the new generation of myths.

  5. Building Automation Systems.

    Science.gov (United States)

    Honeywell, Inc., Minneapolis, Minn.

    A number of different automation systems for use in monitoring and controlling building equipment are described in this brochure. The system functions include--(1) collection of information, (2) processing and display of data at a central panel, and (3) taking corrective action by sounding alarms, making adjustments, or automatically starting and…

  6. Automation of activation analysis

    International Nuclear Information System (INIS)

    Ivanov, I.N.; Ivanets, V.N.; Filippov, V.V.

    1985-01-01

    The basic data on the methods and equipment of activation analysis are presented. Recommendations are given on the selection of activation analysis techniques, especially the technique envisaging the use of short-lived isotopes. The possibilities of the equipment for increasing data-path throughput, using modern computers to automate the analysis and data-processing procedure, are shown.

  7. Protokoller til Home Automation

    DEFF Research Database (Denmark)

    Kjær, Kristian Ellebæk

    2008-01-01

    computer that can switch between predefined settings. Sometimes the computer can be controlled remotely over the Internet, so that the status of the home can be viewed from a computer or perhaps even from a mobile phone. While the applications mentioned are classics within home automation, additional functionality has emerged...

  8. Automation of radioimmunoassays

    International Nuclear Information System (INIS)

    Goldie, D.J.; West, P.M.; Ismail, A.A.A.

    1979-01-01

    A short account is given of recent developments in automation of the RIA technique. Difficulties encountered in the incubation, separation and quantitation steps are summarized. Published references are given to a number of systems, both discrete and continuous flow, and details are given of a system developed by the present authors. (U.K.)

  9. Microcontroller for automation application

    Science.gov (United States)

    Cooper, H. W.

    1975-01-01

    The description of a microcontroller currently being developed for automation application was given. It is basically an 8-bit microcomputer with a 40K byte random access memory/read only memory, and can control a maximum of 12 devices through standard 15-line interface ports.

  10. Driver Psychology during Automated Platooning

    NARCIS (Netherlands)

    Heikoop, D.D.

    2017-01-01

    With the rapid increase in vehicle automation technology, the call for understanding how humans behave while driving in an automated vehicle becomes more urgent. Vehicles that have automated systems such as Lane Keeping Assist (LKA) or Adaptive Cruise Control (ACC) not only support drivers in their

  11. Automated Inadvertent Intruder Application

    International Nuclear Information System (INIS)

    Koffman, Larry D.; Lee, Patricia L.; Cook, James R.; Wilhite, Elmer L.

    2008-01-01

    The Environmental Analysis and Performance Modeling group of Savannah River National Laboratory (SRNL) conducts performance assessments of the Savannah River Site (SRS) low-level waste facilities to meet the requirements of DOE Order 435.1. These performance assessments, which result in limits on the amounts of radiological substances that can be placed in the waste disposal facilities, consider numerous potential exposure pathways that could occur in the future. One set of exposure scenarios, known as inadvertent intruder analysis, considers the impact on hypothetical individuals who are assumed to inadvertently intrude onto the waste disposal site. Inadvertent intruder analysis considers three distinct scenarios for exposure referred to as the agriculture scenario, the resident scenario, and the post-drilling scenario. Each of these scenarios has specific exposure pathways that contribute to the overall dose for the scenario. For the inadvertent intruder analysis, the calculation of dose for the exposure pathways is a relatively straightforward algebraic calculation that utilizes dose conversion factors. Prior to 2004, these calculations were performed using an Excel spreadsheet. However, design checks of the spreadsheet calculations revealed that errors could be introduced inadvertently when copying spreadsheet formulas cell by cell and finding these errors was tedious and time consuming. This weakness led to the specification of functional requirements to create a software application that would automate the calculations for inadvertent intruder analysis using a controlled source of input parameters. This software application, named the Automated Inadvertent Intruder Application, has undergone rigorous testing of the internal calculations and meets software QA requirements. The Automated Inadvertent Intruder Application was intended to replace the previous spreadsheet analyses with an automated application that was verified to produce the same calculations and

  12. Automating spectral measurements

    Science.gov (United States)

    Goldstein, Fred T.

    2008-09-01

    This paper discusses the architecture of software utilized in spectroscopic measurements. As optical coatings become more sophisticated, there is mounting need to automate data acquisition (DAQ) from spectrophotometers. Such need is exacerbated when 100% inspection is required, ancillary devices are utilized, cost reduction is crucial, or security is vital. While instrument manufacturers normally provide point-and-click DAQ software, an application programming interface (API) may be missing. In such cases automation is impossible or expensive. An API is typically provided in libraries (*.dll, *.ocx) which may be embedded in user-developed applications. Users can thereby implement DAQ automation in several Windows languages. Another possibility, developed by FTG as an alternative to instrument manufacturers' software, is the ActiveX application (*.exe). ActiveX, a component of many Windows applications, provides means for programming and interoperability. This architecture permits a point-and-click program to act as automation client and server. Excel, for example, can control and be controlled by DAQ applications. Most importantly, ActiveX permits ancillary devices such as barcode readers and XY-stages to be easily and economically integrated into scanning procedures. Since an ActiveX application has its own user-interface, it can be independently tested. The ActiveX application then runs (visibly or invisibly) under DAQ software control. Automation capabilities are accessed via a built-in spectro-BASIC language with industry-standard (VBA-compatible) syntax. Supplementing ActiveX, spectro-BASIC also includes auxiliary serial port commands for interfacing programmable logic controllers (PLC). A typical application is automatic filter handling.
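
    The abstract above notes that a point-and-click program such as Excel can act as an automation client or server for DAQ software. As an illustration of that general ActiveX/COM pattern only (not of FTG's spectro-BASIC product, whose commands are not reproduced here), the following minimal Python sketch uses the pywin32 COM bindings to drive Excel as an automation server and log a scan; the wavelength and transmittance values are placeholders.

        # Minimal sketch of the ActiveX/COM automation-client pattern described above.
        # Assumes the pywin32 package and a local Excel installation.
        import win32com.client

        def log_scan_to_excel(scan, path):
            """Write (wavelength, transmittance) pairs from a DAQ scan into a new workbook."""
            excel = win32com.client.Dispatch("Excel.Application")  # attach to Excel via COM
            excel.Visible = False
            wb = excel.Workbooks.Add()
            ws = wb.Worksheets(1)
            ws.Cells(1, 1).Value = "Wavelength (nm)"
            ws.Cells(1, 2).Value = "Transmittance (%)"
            for row, (wavelength, transmittance) in enumerate(scan, start=2):
                ws.Cells(row, 1).Value = wavelength
                ws.Cells(row, 2).Value = transmittance
            wb.SaveAs(path)
            wb.Close()
            excel.Quit()

        # Placeholder data standing in for an instrument scan.
        log_scan_to_excel([(400, 91.2), (500, 92.5), (600, 93.1)], r"C:\temp\scan.xlsx")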

  13. A Rural Next Generation Network (R-NGN) and Its Testbed

    Directory of Open Access Journals (Sweden)

    Armein Z. R. Langi

    2007-05-01

    Full Text Available Rural Next Generation Network (R-NGN) technology allows Internet protocol (IP) based systems to be used in rural areas. This paper reports on an R-NGN testbed that uses low-cost Ethernet radio links combined with media gateways and a softswitch. The network consists of a point-to-point IP Ethernet 2.4 GHz wireless link, IP switches and gateways in each community, and standard copper wires and telephone sets for users. It has low power consumption and is suitable for low-density user populations. This combination allows low-cost systems as well as multiple services (voice, data, and multimedia) for rural communications. An infrastructure has been deployed in two communities in Cipicung Girang, a village 10 km outside Bandung city, Indonesia. Two towers link the communities with the network of the Institut Teknologi Bandung (ITB) campus. In addition, local wirelines connect community houses to the network. Currently there are four houses connected to each community node (for a total of eight houses), upon which we can perform various tests and measurements.

  14. A Rural Next Generation Network (R-NGN) and Its Testbed

    Directory of Open Access Journals (Sweden)

    Armein Z. R. Langi

    2013-09-01

    Full Text Available Rural Next Generation Network (R-NGN) technology allows Internet protocol (IP) based systems to be used in rural areas. This paper reports on an R-NGN testbed that uses low-cost Ethernet radio links combined with media gateways and a softswitch. The network consists of a point-to-point IP Ethernet 2.4 GHz wireless link, IP switches and gateways in each community, and standard copper wires and telephone sets for users. It has low power consumption and is suitable for low-density user populations. This combination allows low-cost systems as well as multiple services (voice, data, and multimedia) for rural communications. An infrastructure has been deployed in two communities in Cipicung Girang, a village 10 km outside Bandung city, Indonesia. Two towers link the communities with the network of the Institut Teknologi Bandung (ITB) campus. In addition, local wirelines connect community houses to the network. Currently there are four houses connected to each community node (for a total of eight houses), upon which we can perform various tests and measurements.

  15. The CCPP-ARM Parameterization Testbed (CAPT): Where Climate Simulation Meets Weather Prediction

    Energy Technology Data Exchange (ETDEWEB)

    Phillips, T J; Potter, G L; Williamson, D L; Cederwall, R T; Boyle, J S; Fiorino, M; Hnilo, J J; Olson, J G; Xie, S; Yio, J J

    2003-11-21

    To significantly improve the simulation of climate by general circulation models (GCMs), systematic errors in representations of relevant processes must first be identified, and then reduced. This endeavor demands, in particular, that the GCM parameterizations of unresolved processes be tested over a wide range of time scales, not just in climate simulations. Thus, a numerical weather prediction (NWP) methodology for evaluating model parameterizations and gaining insights into their behavior may prove useful, provided that suitable adaptations are made for implementation in climate GCMs. This method entails the generation of short-range weather forecasts by a realistically initialized climate GCM, and the application of six-hourly NWP analyses and observations of parameterized variables to evaluate these forecasts. The behavior of the parameterizations in such a weather-forecasting framework can provide insights on how these schemes might be improved, and modified parameterizations can then be similarly tested. In order to further this method for evaluating and analyzing parameterizations in climate GCMs, the USDOE is funding a joint venture of its Climate Change Prediction Program (CCPP) and Atmospheric Radiation Measurement (ARM) Program: the CCPP-ARM Parameterization Testbed (CAPT). This article elaborates the scientific rationale for CAPT, discusses technical aspects of its methodology, and presents examples of its implementation in a representative climate GCM. Numerical weather prediction methods show promise for improving parameterizations in climate GCMs.

  16. GEO light imaging national testbed (GLINT) heliostat design and testing status

    Science.gov (United States)

    Thornton, Marcia A.; Oldenettel, Jerry R.; Hult, Dane W.; Koski, Katrina; Depue, Tracy; Cuellar, Edward L.; Balfour, Jim; Roof, Morey; Yarger, Fred W.; Newlin, Greg; Ramzel, Lee; Buchanan, Peter; Mariam, Fesseha G.; Scotese, Lee

    2002-01-01

    The GEO Light Imaging National Testbed (GLINT) will use three laser beams producing simultaneous interference fringes to illuminate satellites in geosynchronous earth orbit (GEO). The reflected returns will be recorded using a large 4,000 m2 'light bucket' receiver. This imaging methodology is termed Fourier Telescopy. A major component of the 'light bucket' will be an array of 40 - 80 heliostats. Each heliostat will have a mirrored surface area of 100 m2 mounted on a rigid truss structure which is supported by an A-frame. The truss structure attaches to the torque tube elevation drive, and the A-frame structure rests on an azimuth ring that could provide nearly full coverage of the sky. The heliostat is designed to operate in 15 mph winds with jitter of less than 500 microradians peak-to-peak. One objective of the design was to minimize receiver cost to the maximum extent possible while maintaining GLINT system performance specifications. The mechanical structure weighs approximately seven tons and is a simple fabricated steel framework. A prototype heliostat has been assembled at Stallion Range Center, White Sands Missile Range, New Mexico and is being tested under a variety of weather and operational conditions. The preliminary results of that testing will be presented, as well as some finite element model analyses that were performed to predict the performance of the structure.

  17. PAPI based federation as a test-bed for a common security infrastructure in EFDA sites

    International Nuclear Information System (INIS)

    Castro, R.; Vega, J.; Portas, A.; Lopez, D.R.; Balme, S.; Theis, J.M.; Lebourg, P.; Fernandes, H.; Neto, A.; Duarte, A.; Oliveira, F.; Reis, F.; Purahoo, K.; Thomsen, K.; Schiller, W.; Kadlecsik, J.

    2008-01-01

    Federated authentication and authorization systems provide several advantages to collaborative environments, for example, easy authentication integration, simpler user management, easier security policy implementation and quicker implementation of access control elements for new types of resources. A federation integrates different aspects that have to be coordinated by all the organizations involved. The most relevant are: definition of common schemas and attributes, definition of common policies and procedures, management of keys and certificates, management of common repositories and implementation of a home location service. A federation enabling collaboration of European sites has been put into operation. Four laboratories have been integrated and two more organizations (EFDA and KFKI/HAS) are finishing their integration. The federation infrastructure is based on Point of Access to Providers of Information (PAPI), a distributed authentication and authorization system. PAPI technology provides some important features, such as single sign-on for access to different resources, mobility for users, and compatibility with open and standard technologies: Java, the JNLP protocol, XML-RPC and web technologies, among others. In this article, the test-bed of the EFDA federation is presented. Some examples of resources securely shared inside the federation are shown. Specific issues and experience gained in deploying federated collaboration systems will be addressed as well.

  18. A boundary-layer cloud study using Southern Great Plains Cloud and radiation testbed (CART) data

    Energy Technology Data Exchange (ETDEWEB)

    Albrecht, B.; Mace, G.; Dong, X.; Syrett, W. [Pennsylvania State Univ., University Park, PA (United States)] [and others

    1996-04-01

    Boundary layer clouds (stratus and fair-weather cumulus) are closely coupled to the surface. This coupling involves the radiative impact of the clouds on the surface energy budget and the strong dependence of cloud formation and maintenance on the turbulent fluxes of heat and moisture in the boundary layer. The continuous data collection at the Southern Great Plains (SGP) Cloud and Radiation Testbed (CART) site provides a unique opportunity to study components of the coupling processes associated with boundary layer clouds and to provide descriptions of cloud and boundary layer structure that can be used to test parameterizations used in climate models. But before the CART data can be used for process studies and parameterization testing, it is necessary to evaluate and validate the data and to develop techniques for effectively combining the data to provide meaningful descriptions of cloud and boundary layer characteristics. In this study we use measurements made during an intensive observing period and consider a case where low-level stratus was observed at the site for about 18 hours. This case is being used to examine the temporal evolution of cloud base, cloud top, cloud liquid water content, surface radiative fluxes, and boundary layer structure. A method for inferring cloud microphysics from these parameters is currently being evaluated.

  19. Adaptive Signal Processing Testbed: VME-based DSP board market survey

    Science.gov (United States)

    Ingram, Rick E.

    1992-04-01

    The Adaptive Signal Processing Testbed (ASPT) is a real-time multiprocessor system utilizing digital signal processor technology on VMEbus-based printed circuit boards installed on a Sun workstation. The ASPT has specific requirements, particularly with regard to the signal excision application, concerning interfacing with current and planned data generation equipment, processing of the data, storage of final and intermediate results to disk, and the development tools for application development and integration into the overall EW/COM computing environment. A prototype ASPT was implemented using three VME-C-30 boards from Applied Silicon. Experience gained during the prototype development led to the conclusion that interprocessor communications capability is the most significant contributor to overall ASPT performance. In addition, host involvement should be minimized. Boards using different processors were evaluated with respect to the ASPT system requirements, pricing, and availability. Specific recommendations based on various priorities are made, as well as recommendations concerning the integration and interaction of various tools developed during the prototype implementation.

  20. The GridEcon Platform: A Business Scenario Testbed for Commercial Cloud Services

    Science.gov (United States)

    Risch, Marcel; Altmann, Jörn; Guo, Li; Fleming, Alan; Courcoubetis, Costas

    Within this paper, we present the GridEcon Platform, a testbed for designing and evaluating economics-aware services in a commercial Cloud computing setting. The Platform is based on the idea that the exact working of such services is difficult to predict in the context of a market and that, therefore, an environment for evaluating their behavior in an emulated market is needed. To identify the components of the GridEcon Platform, a number of economics-aware services and their interactions have been envisioned. The two most important components of the platform are the Marketplace and the Workflow Engine. The Workflow Engine allows the simple composition of a market environment by describing the service interactions between economics-aware services. The Marketplace allows trading goods using different market mechanisms. The capabilities of these components of the GridEcon Platform in conjunction with the economics-aware services are described in this paper in detail. The validation of an implemented market mechanism and a capacity planning service using the GridEcon Platform also demonstrated the platform's usefulness.

  1. Phase Retrieval Using a Genetic Algorithm on the Systematic Image-Based Optical Alignment Testbed

    Science.gov (United States)

    Taylor, Jaime R.

    2003-01-01

    NASA's Marshall Space Flight Center's Systematic Image-Based Optical Alignment (SIBOA) Testbed was developed to test phase retrieval algorithms and hardware techniques. Individuals working with the facility developed the idea of implementing phase retrieval by separating the determination of the tip/tilt of each mirror from the piston motion (or translation) of each mirror. Presented in this report is an algorithm that determines the optimal phase correction associated only with the piston motion of the mirrors. A description of the phase retrieval problem is first presented. The Systematic Image-Based Optical Alignment (SIBOA) Testbed is then described. A Discrete Fourier Transform (DFT) is necessary to transfer the incoming wavefront (or estimate of phase error) into the spatial frequency domain to compare it with the image. A method for reducing the DFT to seven scalar/matrix multiplications is presented. A genetic algorithm is then used to search for the phase error. The results of this new algorithm on a test problem are presented.
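
    The report's algorithm couples a reduced DFT image model with a genetic-algorithm search over the mirror piston values. As a hedged, self-contained sketch of that search step only, the snippet below runs a simple genetic algorithm (elitist selection, uniform crossover, Gaussian mutation) over piston vectors; the cost function is a toy stand-in for the DFT-based comparison between modeled and measured images, and all parameter values are illustrative.

        # Toy genetic-algorithm search over mirror piston values (illustrative only).
        import numpy as np

        rng = np.random.default_rng(0)

        N_MIRRORS = 7                                      # illustrative segment count
        TRUE_PISTON = rng.uniform(-0.5, 0.5, N_MIRRORS)    # hidden "answer" for this toy problem

        def image_mismatch(pistons):
            # Stand-in for the DFT-based comparison between modeled and measured images;
            # here the cost is simply the squared error against the hidden piston vector.
            return float(np.sum((pistons - TRUE_PISTON) ** 2))

        def genetic_search(pop_size=60, generations=200, mutation_sigma=0.05):
            pop = rng.uniform(-1.0, 1.0, (pop_size, N_MIRRORS))
            for _ in range(generations):
                cost = np.array([image_mismatch(p) for p in pop])
                elite = pop[np.argsort(cost)][: pop_size // 4]           # keep the best quarter
                children = []
                while len(children) < pop_size - len(elite):
                    a, b = elite[rng.integers(len(elite), size=2)]       # pick two elite parents
                    mask = rng.random(N_MIRRORS) < 0.5                   # uniform crossover
                    child = np.where(mask, a, b)
                    child = child + rng.normal(0.0, mutation_sigma, N_MIRRORS)  # Gaussian mutation
                    children.append(child)
                pop = np.vstack([elite, np.array(children)])
            return pop[np.argmin([image_mismatch(p) for p in pop])]

        print(genetic_search())    # should converge toward TRUE_PISTON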

  2. Overview of the Transport Rotorcraft Airframe Crash Testbed (TRACT) Full Scale Crash Tests

    Science.gov (United States)

    Annett, Martin; Littell, Justin

    2015-01-01

    The Transport Rotorcraft Airframe Crash Testbed (TRACT) full-scale tests were performed at NASA Langley Research Center's Landing and Impact Research Facility in 2013 and 2014. Two CH-46E airframes were impacted at 33-ft/s forward and 25-ft/s vertical combined velocities onto soft soil, which represents a severe, but potentially survivable impact scenario. TRACT 1 provided a baseline set of responses, while TRACT 2 included retrofits with composite subfloors and other crash system improvements based on TRACT 1. For TRACT 2, a total of 18 unique experiments were conducted to evaluate Anthropomorphic Test Devices (ATD) responses, seat and restraint performance, cargo restraint effectiveness, patient litter behavior, and activation of emergency locator transmitters and crash sensors. Combinations of Hybrid II, Hybrid III, and ES-2 ATDs were placed in forward and side facing seats and occupant results were compared against injury criteria. The structural response of the airframe was assessed based on accelerometers located throughout the airframe and using three-dimensional photogrammetric techniques. Analysis of the photogrammetric data indicated regions of maximum deflection and permanent deformation. The response of TRACT 2 was noticeably different in the horizontal direction due to changes in the cabin configuration and soil surface, with higher acceleration and damage occurring in the cabin. Loads from ATDs in energy absorbing seats and restraints were within injury limits. Severe injury was likely for ATDs in forward facing passenger seats.

  3. Design of a Loose Part Monitoring System Test-bed using CompactRIO

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Min-seok; Lee, Kwang-Dae; Lee, Eui-Jong [KHNP CRI, Daejeon (Korea, Republic of)

    2016-10-15

    A loose part monitoring system (LPMS) is included in the NSSS integrity monitoring system (NIMS), which serves to detect loose parts in reactor coolant systems (RCS). LPMSs at Nuclear Power Plants (NPPs) in Korea follow the ASME OM standard and acquire data from 18 sensors simultaneously. Data acquisition requires a sampling rate of more than 50 kHz along with a 12-bit A/D converter. Existing LPMS equipment is composed of several different platforms, such as a digital signal processor (DSP), a field-programmable gate array (FPGA), a micro control unit (MCU), and electric circuit cards. These systems have vulnerabilities, such as discontinuance due to aging and incompatibility issues between different pieces of equipment. This paper suggests CompactRIO as a new platform. We devised a test-bed using CompactRIO and demonstrate that the proposed method meets the criteria required by the standard. The LPMS provides an alert when an impact event occurs and provides information with which to analyze the location, energy, and mass of the loose parts. LPMSs in NPPs in Korea operate on a variety of platforms. Thus, these systems are vulnerable to discontinuances due to aging and incompatibilities arising from the use of different types of equipment. In order to solve these problems, this paper suggests CompactRIO as a new platform. It is a rugged, reconfigurable, high-performance industrial embedded system. The results of performance tests meet the criteria set by the current standard.
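
    A quick back-of-the-envelope check, assuming (this is not stated in the record) that each 12-bit sample is stored in a 16-bit word, shows the sustained data rate the platform must handle for 18 channels at 50 kHz:

        # Back-of-the-envelope data-rate check for the LPMS figures quoted above.
        # Assumption (not from the record): each 12-bit sample is padded to a 16-bit word.
        channels = 18
        sample_rate_hz = 50_000
        bytes_per_sample = 2

        rate_bytes_per_s = channels * sample_rate_hz * bytes_per_sample
        print(f"Aggregate rate: {rate_bytes_per_s / 1e6:.1f} MB/s")   # about 1.8 MB/s sustained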

  4. Time Distribution Using SpaceWire in the SCaN Testbed on ISS

    Science.gov (United States)

    Lux, James P.

    2012-01-01

    A paper describes an approach for timekeeping and time transfer among the devices on the CoNNeCT project's SCaN Testbed. It also describes how the clocks may be synchronized with an external time reference, e.g., time tags from the International Space Station (ISS) or RF signals received by a radio (TDRSS time service or GPS). All the units have some sort of counter that is fed by an oscillator at some convenient frequency. The basic problem in timekeeping is relating the counter value to some external time standard such as UTC. With SpaceWire, there are two approaches possible: one is to just use SpaceWire to send a message, and use an external wire for the sync signal. This is much the same as with the RS-232 messages and 1 pps line from a GPS receiver. However, SpaceWire has an additional capability that was added to make this easier: it can insert and receive a special "timecode" word in the data stream.
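
    The core timekeeping relation described above (latching the free-running counter against an external time tag at a sync event, then converting later counter readings to UTC) can be sketched as follows; the oscillator frequency and values are illustrative and do not come from the SCaN Testbed hardware.

        # Minimal sketch of counter-to-UTC time transfer; all values are illustrative.
        from dataclasses import dataclass

        @dataclass
        class TimeSync:
            counter_at_sync: int     # counter value latched when the external time tag arrived
            utc_at_sync: float       # external reference time (seconds) for that event
            counter_hz: float        # nominal oscillator frequency driving the counter

            def counter_to_utc(self, counter_value: int) -> float:
                """Relate a later counter reading to UTC using the last sync point."""
                elapsed = (counter_value - self.counter_at_sync) / self.counter_hz
                return self.utc_at_sync + elapsed

        # Illustrative values only: a 1 MHz counter synced to a time tag at 1000.0 s.
        sync = TimeSync(counter_at_sync=123_456, utc_at_sync=1000.0, counter_hz=1_000_000.0)
        print(sync.counter_to_utc(123_456 + 2_500_000))   # 2.5 s after the sync event -> 1002.5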

  5. Project Morpheus: Lean Development of a Terrestrial Flight Testbed for Maturing NASA Lander Technologies

    Science.gov (United States)

    Devolites, Jennifer L.; Olansen, Jon B.

    2015-01-01

    NASA's Morpheus Project has developed and tested a prototype planetary lander capable of vertical takeoff and landing that is designed to serve as a testbed for advanced spacecraft technologies. The lander vehicle, propelled by a Liquid Oxygen (LOX)/Methane engine and sized to carry a 500 kg payload to the lunar surface, provides a platform for bringing technologies from the laboratory into an integrated flight system at relatively low cost. In 2012, Morpheus began integrating the Autonomous Landing and Hazard Avoidance Technology (ALHAT) sensors and software onto the vehicle in order to demonstrate safe, autonomous landing and hazard avoidance. From the beginning, one of the goals for the Morpheus Project was to streamline agency processes and practices. The Morpheus project accepted a challenge to tailor the traditional NASA systems engineering approach in a way that would be appropriate for a lower-cost, rapid prototype engineering effort, but retain the essence of the guiding principles. This paper describes the tailored project life cycle and systems engineering approach for the Morpheus project, including the processes, tools, and amount of rigor employed over the project's multiple lifecycles since the project began in fiscal year (FY) 2011.

  6. Cargo container inspection test program at ARPA's Nonintrusive Inspection Technology Testbed

    Science.gov (United States)

    Volberding, Roy W.; Khan, Siraj M.

    1994-10-01

    An x-ray-based cargo inspection system test program is being conducted at the Advanced Research Projects Agency (ARPA)-sponsored Nonintrusive Inspection Technology Testbed (NITT) located in the Port of Tacoma, Washington. The test program seeks to determine the performance that can be expected from a dual, high-energy x-ray cargo inspection system when inspecting ISO cargo containers. This paper describes an intensive, three-month system test involving two independent test groups, one representing the criminal smuggling element and the other representing the law enforcement community. The first group, the `Red Team', prepares ISO containers for inspection at an off-site facility. An algorithm randomly selects and indicates the positions and preparation of cargoes within a container. The prepared container is dispatched to the NITT for inspection by the `Blue Team'. After in-gate processing, it is queued for examination. The Blue Team inspects the container and decides whether or not to pass the container. The shipment undergoes out-gate processing and returns to the Red Team. The results of the inspection are recorded for subsequent analysis. The test process, including its governing protocol, the cargoes, container preparation, and the examination, is presented, along with results available at the time of submission.

  7. PAPI based federation as a test-bed for a common security infrastructure in EFDA sites

    Energy Technology Data Exchange (ETDEWEB)

    Castro, R. [Asociacion EURATOM/CIEMAT para Fusion, Madrid (Spain)], E-mail: rodrigo.castro@ciemat.es; Vega, J.; Portas, A. [Asociacion EURATOM/CIEMAT para Fusion, Madrid (Spain); Lopez, D.R. [Departamento RedIRIS, Entidad publica empresarial Red.es, Madrid (Spain); Balme, S.; Theis, J.M.; Lebourg, P. [Association EURATOM-CEA, CEA/DSM/Departement de Recherches sur la Fusion Controlee DRFC, CEA-Cadarache (France); Fernandes, H.; Neto, A.; Duarte, A.; Oliveira, F.; Reis, F. [Centro de Fusao Nuclear, Associacao EURATOM/IST, Lisboa (Portugal); Purahoo, K. [EURATOM/UKAEA Fusion Association, Culham Science Centre, Abingdon OX14 3DB (United Kingdom); Thomsen, K.; Schiller, W. [EFDA Close Support Unit Garching, Max Planck Institut fuer Plasmaphysik, Boltzmannstr. 2, D-85748 Garching (Germany); Kadlecsik, J. [KFKI R.I. for Particle and Nuclear Physics of the Hungarian Academy of Sciences, and the Association EURATOM/HAS, Budapest (Hungary)

    2008-04-15

    Federated authentication and authorization systems provide several advantages to collaborative environments, for example, easy authentication integration, simpler user management, easier security policy implementation and quicker implementation of access control elements for new types of resources. A federation integrates different aspects that have to be coordinated by all the organizations involved. The most relevant are: definition of common schemas and attributes, definition of common policies and procedures, management of keys and certificates, management of common repositories and implementation of a home location service. A federation enabling collaboration of European sites has been put into operation. Four laboratories have been integrated and two more organizations (EFDA and KFKI/HAS) are finishing their integration. The federation infrastructure is based on Point of Access to Providers of Information (PAPI), a distributed authentication and authorization system. PAPI technology provides some important features, such as single sign-on for access to different resources, mobility for users, and compatibility with open and standard technologies: Java, the JNLP protocol, XML-RPC and web technologies, among others. In this article, the test-bed of the EFDA federation is presented. Some examples of resources securely shared inside the federation are shown. Specific issues and experience gained in deploying federated collaboration systems will be addressed as well.

  8. New Security Development and Trends to Secure the SCADA Sensors Automated Transmission during Critical Sessions

    Directory of Open Access Journals (Sweden)

    Aamir Shahzad

    2015-10-01

    Full Text Available Modern technology enhancements have been used worldwide to fulfill the requirements of the industrial sector, especially in supervisory control and data acquisition (SCADA) systems as a part of industrial control systems (ICS). SCADA systems have gained popularity in industrial automation due to technology enhancements and connectivity with modern computer networks and/or protocols. The procurement of new technologies has made SCADA systems important and helpful to processing in oil lines, water treatment plants, and electricity generation and control stations. On the other hand, these systems have vulnerabilities like other traditional computer networks (or systems), especially when interconnected with open platforms. Many international organizations and researchers have proposed and deployed solutions for SCADA security enhancement, but most of these have been based on node-to-node security, without emphasizing critical sessions that are linked directly with industrial processing and automation. This study concerns SCADA security measures related to critical processing with specified sessions of automated polling, analyzing cryptography mechanisms and deploying the appropriate explicit inclusive security solution in a distributed network protocol version 3 (DNP3) stack, as part of a SCADA system. The bytes flow through the DNP3 stack together with security computation bytes within the specified critical intervals defined for polling. We took critical processing knowledge into account when designing a SCADA/DNP3 testbed and deploying a cryptography solution that did not affect communications.
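
    As a generic illustration of adding integrity bytes to application-layer poll traffic (not the authors' exact scheme and not the DNP3 Secure Authentication specification), the sketch below appends and verifies an HMAC-SHA256 tag over a poll payload; the key handling and payload bytes are placeholders.

        # Generic integrity-tag sketch for poll payloads; illustrative only.
        import hmac, hashlib, os

        KEY = os.urandom(32)   # placeholder pre-shared key; real deployments need key management

        def protect_poll(payload: bytes) -> bytes:
            """Append an HMAC-SHA256 tag to the application-layer bytes of a poll response."""
            tag = hmac.new(KEY, payload, hashlib.sha256).digest()
            return payload + tag

        def verify_poll(message: bytes) -> bytes:
            """Split off and check the tag; raise if the bytes were altered in transit."""
            payload, tag = message[:-32], message[-32:]
            expected = hmac.new(KEY, payload, hashlib.sha256).digest()
            if not hmac.compare_digest(tag, expected):
                raise ValueError("integrity check failed")
            return payload

        frame = protect_poll(b"analog input poll data")   # placeholder payload bytes
        assert verify_poll(frame) == b"analog input poll data"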

  9. A holistic approach to ZigBee performance enhancement for home automation networks.

    Science.gov (United States)

    Betzler, August; Gomez, Carles; Demirkol, Ilker; Paradells, Josep

    2014-08-14

    Wireless home automation networks are gaining importance for smart homes. In this ambit, ZigBee networks play an important role. The ZigBee specification defines a default set of protocol stack parameters and mechanisms that is further refined by the ZigBee Home Automation application profile. In a holistic approach, we analyze how the network performance is affected with the tuning of parameters and mechanisms across multiple layers of the ZigBee protocol stack and investigate possible performance gains by implementing and testing alternative settings. The evaluations are carried out in a testbed of 57 TelosB motes. The results show that considerable performance improvements can be achieved by using alternative protocol stack configurations. From these results, we derive two improved protocol stack configurations for ZigBee wireless home automation networks that are validated in various network scenarios. In our experiments, these improved configurations yield a relative packet delivery ratio increase of up to 33.6%, a delay decrease of up to 66.6% and an improvement of the energy efficiency for battery powered devices of up to 48.7%, obtainable without incurring any overhead to the network.

  10. A Holistic Approach to ZigBee Performance Enhancement for Home Automation Networks

    Science.gov (United States)

    Betzler, August; Gomez, Carles; Demirkol, Ilker; Paradells, Josep

    2014-01-01

    Wireless home automation networks are gaining importance for smart homes. In this ambit, ZigBee networks play an important role. The ZigBee specification defines a default set of protocol stack parameters and mechanisms that is further refined by the ZigBee Home Automation application profile. In a holistic approach, we analyze how the network performance is affected with the tuning of parameters and mechanisms across multiple layers of the ZigBee protocol stack and investigate possible performance gains by implementing and testing alternative settings. The evaluations are carried out in a testbed of 57 TelosB motes. The results show that considerable performance improvements can be achieved by using alternative protocol stack configurations. From these results, we derive two improved protocol stack configurations for ZigBee wireless home automation networks that are validated in various network scenarios. In our experiments, these improved configurations yield a relative packet delivery ratio increase of up to 33.6%, a delay decrease of up to 66.6% and an improvement of the energy efficiency for battery powered devices of up to 48.7%, obtainable without incurring any overhead to the network. PMID:25196004

  11. A Holistic Approach to ZigBee Performance Enhancement for Home Automation Networks

    Directory of Open Access Journals (Sweden)

    August Betzler

    2014-08-01

    Full Text Available Wireless home automation networks are gaining importance for smart homes. In this ambit, ZigBee networks play an important role. The ZigBee specification defines a default set of protocol stack parameters and mechanisms that is further refined by the ZigBee Home Automation application profile. In a holistic approach, we analyze how the network performance is affected with the tuning of parameters and mechanisms across multiple layers of the ZigBee protocol stack and investigate possible performance gains by implementing and testing alternative settings. The evaluations are carried out in a testbed of 57 TelosB motes. The results show that considerable performance improvements can be achieved by using alternative protocol stack configurations. From these results, we derive two improved protocol stack configurations for ZigBee wireless home automation networks that are validated in various network scenarios. In our experiments, these improved configurations yield a relative packet delivery ratio increase of up to 33.6%, a delay decrease of up to 66.6% and an improvement of the energy efficiency for battery powered devices of up to 48.7%, obtainable without incurring any overhead to the network.

  12. Rapid automated nuclear chemistry

    International Nuclear Information System (INIS)

    Meyer, R.A.

    1979-01-01

    Rapid Automated Nuclear Chemistry (RANC) can be thought of as the Z-separation of Neutron-rich Isotopes by Automated Methods. The range of RANC studies of fission and its products is large. In a sense, the studies can be categorized into various energy ranges from the highest where the fission process and particle emission are considered, to low energies where nuclear dynamics are being explored. This paper presents a table which gives examples of current research using RANC on fission and fission products. The remainder of this text is divided into three parts. The first contains a discussion of the chemical methods available for the fission product elements, the second describes the major techniques, and in the last section, examples of recent results are discussed as illustrations of the use of RANC

  13. Automated optical assembly

    Science.gov (United States)

    Bala, John L.

    1995-08-01

    Automation and polymer science represent fundamental new technologies which can be directed toward realizing the goal of establishing a domestic, world-class, commercial optics business. Use of innovative optical designs using precision polymer optics will enable the US to play a vital role in the next generation of commercial optical products. The increased cost savings inherent in the utilization of optical-grade polymers outweigh almost every advantage of using glass for high-volume situations. Optical designers must gain experience with combined refractive/diffractive designs and broaden their knowledge base regarding polymer technology beyond a cursory intellectual exercise. Implementation of a fully automated assembly system, combined with utilization of polymer optics, constitutes the type of integrated manufacturing process which will enable the US to successfully compete with the low-cost labor employed in the Far East, as well as to produce an equivalent product.

  14. Automated breeder fuel fabrication

    International Nuclear Information System (INIS)

    Goldmann, L.H.; Frederickson, J.R.

    1983-01-01

    The objective of the Secure Automated Fabrication (SAF) Project is to develop remotely operated equipment for the processing and manufacturing of breeder reactor fuel pins. The SAF line will be installed in the Fuels and Materials Examination Facility (FMEF). The FMEF is presently under construction at the Department of Energy's (DOE) Hanford site near Richland, Washington, and is operated by the Westinghouse Hanford Company (WHC). The fabrication and support systems of the SAF line are designed for computer-controlled operation from a centralized control room. Remote and automated fuel fabrication operations will result in: reduced radiation exposure to workers; enhanced safeguards; improved product quality; near real-time accountability; and increased productivity. The present schedule calls for installation of SAF line equipment in the FMEF beginning in 1984, with qualifying runs starting in 1986 and production commencing in 1987. 5 figures

  15. Automated multiple failure FMEA

    International Nuclear Information System (INIS)

    Price, C.J.; Taylor, N.S.

    2002-01-01

    Failure mode and effects analysis (FMEA) is typically performed by a team of engineers working together. In general, they will only consider single point failures in a system. Consideration of all possible combinations of failures is impractical for all but the simplest example systems. Even if the task of producing the FMEA report for the full multiple failure scenario were automated, it would still be impractical for the engineers to read, understand and act on all of the results. This paper shows how approximate failure rates for components can be used to select the most likely combinations of failures for automated investigation using simulation. The important information can be automatically identified from the resulting report, making it practical for engineers to study and act on the results. The strategy described in the paper has been applied to a range of electrical subsystems, and the results have confirmed that the strategy described here works well for realistically complex systems
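
    A minimal sketch of the selection idea described above, assuming independent failures and illustrative per-component probabilities: rank candidate failure combinations by their approximate joint probability and keep only the most likely ones for automated simulation.

        # Rank failure combinations by approximate joint probability (illustrative only).
        from itertools import combinations
        from math import prod
        import heapq

        # Illustrative per-component failure probabilities over the analysis interval.
        failure_prob = {"relay": 1e-3, "sensor": 5e-4, "pump": 2e-4, "valve": 1e-4, "fuse": 5e-5}

        def most_likely_combinations(probs, max_order=3, keep=10):
            """Return the `keep` most probable combinations of up to `max_order` simultaneous faults."""
            candidates = []
            for order in range(1, max_order + 1):
                for combo in combinations(probs, order):
                    p = prod(probs[c] for c in combo)   # independence assumed for ranking purposes
                    candidates.append((p, combo))
            return heapq.nlargest(keep, candidates)     # ranked list to hand to the simulator

        for p, combo in most_likely_combinations(failure_prob):
            print(f"{combo}: {p:.2e}")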

  16. Automated drawing generation system

    International Nuclear Information System (INIS)

    Yoshinaga, Toshiaki; Kawahata, Junichi; Yoshida, Naoto; Ono, Satoru

    1991-01-01

    Since automated CAD drawing generation systems still require human intervention, improvements were focussed on an interactive processing section (data input and correcting operation) which necessitates a vast amount of work. As a result, human intervention was eliminated, the original objective of a computerized system. This is the first step taken towards complete automation. The effects of development and commercialization of the system are as described below. (1) The interactive processing time required for generating drawings was improved. It was determined that introduction of the CAD system has reduced the time required for generating drawings. (2) The difference in skills between workers preparing drawings has been eliminated and the quality of drawings has been made uniform. (3) The extent of knowledge and experience demanded of workers has been reduced. (author)

  17. ATLAS Distributed Computing Automation

    CERN Document Server

    Schovancova, J; The ATLAS collaboration; Borrego, C; Campana, S; Di Girolamo, A; Elmsheuser, J; Hejbal, J; Kouba, T; Legger, F; Magradze, E; Medrano Llamas, R; Negri, G; Rinaldi, L; Sciacca, G; Serfon, C; Van Der Ster, D C

    2012-01-01

    The ATLAS Experiment benefits from computing resources distributed worldwide at more than 100 WLCG sites. The ATLAS Grid sites provide over 100k CPU job slots, over 100 PB of storage space on disk or tape. Monitoring of status of such a complex infrastructure is essential. The ATLAS Grid infrastructure is monitored 24/7 by two teams of shifters distributed world-wide, by the ATLAS Distributed Computing experts, and by site administrators. In this paper we summarize automation efforts performed within the ATLAS Distributed Computing team in order to reduce manpower costs and improve the reliability of the system. Different aspects of the automation process are described: from the ATLAS Grid site topology provided by the ATLAS Grid Information System, via automatic site testing by the HammerCloud, to automatic exclusion from production or analysis activities.

  18. Automated Analysis of Accountability

    DEFF Research Database (Denmark)

    Bruni, Alessandro; Giustolisi, Rosario; Schürmann, Carsten

    2017-01-01

    that the system can detect the misbehaving parties who caused that failure. Accountability is an intuitively stronger property than verifiability as the latter only rests on the possibility of detecting the failure of a goal. A plethora of accountability and verifiability definitions have been proposed...... in the literature. Those definitions are either very specific to the protocols in question, hence not applicable in other scenarios, or too general and widely applicable but requiring complicated and hard to follow manual proofs. In this paper, we advance formal definitions of verifiability and accountability...... that are amenable to automated verification. Our definitions are general enough to be applied to different classes of protocols and different automated security verification tools. Furthermore, we point out formally the relation between verifiability and accountability. We validate our definitions...

  19. Automated assessment of cognitive health using smart home technologies.

    Science.gov (United States)

    Dawadi, Prafulla N; Cook, Diane J; Schmitter-Edgecombe, Maureen; Parsey, Carolyn

    2013-01-01

    The goal of this work is to develop intelligent systems to monitor the wellbeing of individuals in their home environments. This paper introduces a machine learning-based method to automatically predict activity quality in smart homes and automatically assess cognitive health based on activity quality. The paper describes an automated framework to extract a set of features from smart home sensor data that reflects the activity performance, or the ability of an individual to complete an activity, which can be input to machine learning algorithms. Outputs from learning algorithms, including principal component analysis, support vector machine, and logistic regression algorithms, are used to quantify activity quality for a complex set of smart home activities and to predict the cognitive health of participants. Smart home activity data was gathered from volunteer participants (n=263) who performed a complex set of activities in our smart home testbed. We compare our automated activity quality prediction and cognitive health prediction with direct observation scores and health assessments obtained from neuropsychologists. With all samples included, we obtained a statistically significant correlation (r=0.54) between direct observation scores and predicted activity quality. Similarly, using a support vector machine classifier, we obtained reasonable classification accuracy (area under the ROC curve=0.80, g-mean=0.73) in classifying participants into two different cognitive classes, dementia and cognitively healthy. The results suggest that it is possible to automatically quantify the task quality of smart home activities and perform a limited assessment of the cognitive health of an individual if smart home activities are properly chosen and learning algorithms are appropriately trained.
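
    A hedged sketch of the classification step described above, using scikit-learn with synthetic features standing in for the smart-home activity-quality features (the real feature set and labels come from the testbed data and neuropsychologist assessments, which are not reproduced here):

        # Illustrative SVM classification of activity-quality features; data are synthetic.
        import numpy as np
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        rng = np.random.default_rng(0)

        # Synthetic stand-in for per-participant activity-quality features
        # (e.g. task duration, sensor-event counts, number of interruptions).
        n_participants, n_features = 263, 12
        X = rng.normal(size=(n_participants, n_features))
        y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=1.0, size=n_participants) > 0).astype(int)

        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
        auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
        print(f"Cross-validated AUC: {auc.mean():.2f}")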

  20. Automation and Mankind

    Science.gov (United States)

    1960-08-07

    limited by the capabilities of the human organism in the matter of control of its processes. In our time, the speeds of technological processes are ... in many cases limited by conditions of control. The speed of human reaction is limited and therefore, at present, only processes of a relatively ... forward. It can be foreseen that automation will completely free man from work under conditions of high temperatures, pressures, and pollution ...

  1. Automated Cooperative Trajectories

    Science.gov (United States)

    Hanson, Curt; Pahle, Joseph; Brown, Nelson

    2015-01-01

    This presentation is an overview of the Automated Cooperative Trajectories project. An introduction to the phenomena of wake vortices is given, along with a summary of past research into the possibility of extracting energy from the wake by flying close parallel trajectories. Challenges and barriers to adoption of civilian automatic wake surfing technology are identified. A hardware-in-the-loop simulation is described that will support future research. Finally, a roadmap for future research and technology transition is proposed.

  2. Automating ASW fusion

    OpenAIRE

    Pabelico, James C.

    2011-01-01

    Approved for public release; distribution is unlimited. This thesis examines ASW eFusion, an anti-submarine warfare (ASW) tactical decision aid (TDA) that utilizes Kalman filtering to improve battlespace awareness by simplifying and automating the track management process involved in anti-submarine warfare (ASW) watchstanding operations. While this program can currently help the ASW commander manage uncertainty and make better tactical decisions, the program has several limitations. Comman...

  3. Autonomy, Automation, and Systems

    Science.gov (United States)

    Turner, Philip R.

    1987-02-01

    Aerospace industry interest in autonomy and automation, given fresh impetus by the national goal of establishing a Space Station, is becoming a major item of research and technology development. The promise of new technology arising from research in Artificial Intelligence (AI) has focused much attention on its potential in autonomy and automation. These technologies can improve performance in autonomous control functions that involve planning, scheduling, and fault diagnosis of complex systems. There are, however, many aspects of system and subsystem design in an autonomous system that impact AI applications, but do not directly involve AI technology. Development of a system control architecture, establishment of an operating system within the design, providing command and sensory data collection features appropriate to automated operation, and the use of design analysis tools to support system engineering are specific examples of major design issues. Aspects such as these must also receive attention and technology development support if we are to implement complex autonomous systems within the realistic limitations of mass, power, cost, and available flight-qualified technology that are all-important to a flight project.

  4. Longwall automation 2

    Energy Technology Data Exchange (ETDEWEB)

    David Hainsworth; David Reid; Con Caris; J.C. Ralston; C.O. Hargrave; Ron McPhee; I.N. Hutchinson; A. Strange; C. Wesner [CSIRO (Australia)

    2008-05-15

    This report covers a nominal two-year extension to the Major Longwall Automation Project (C10100). Production standard implementation of Longwall Automation Steering Committee (LASC) automation systems has been achieved at Beltana and Broadmeadow mines. The systems are now used on a 24/7 basis and have provided production benefits to the mines. The LASC Information System (LIS) has been updated and has been implemented successfully in the IT environment of major coal mining houses. This enables 3D visualisation of the longwall environment and equipment to be accessed on line. A simulator has been specified and a prototype system is now ready for implementation. The Shearer Position Measurement System (SPMS) has been upgraded to a modular commercial production standard hardware solution. A compact hardware solution for visual face monitoring has been developed, an approved enclosure for a thermal infrared camera has been produced and software for providing horizon control through faulted conditions has been delivered. The incorporation of the LASC Cut Model information into OEM horizon control algorithms has been bench and underground tested. A prototype system for shield convergence monitoring has been produced and studies to identify techniques for coal flow optimisation and void monitoring have been carried out. Liaison with equipment manufacturers has been maintained and technology delivery mechanisms for LASC hardware and software have been established.

  5. Automation in biological crystallization.

    Science.gov (United States)

    Stewart, Patrick Shaw; Mueller-Dieckmann, Jochen

    2014-06-01

    Crystallization remains the bottleneck in the crystallographic process leading from a gene to a three-dimensional model of the encoded protein or RNA. Automation of the individual steps of a crystallization experiment, from the preparation of crystallization cocktails for initial or optimization screens to the imaging of the experiments, has been the response to address this issue. Today, large high-throughput crystallization facilities, many of them open to the general user community, are capable of setting up thousands of crystallization trials per day. It is thus possible to test multiple constructs of each target for their ability to form crystals on a production-line basis. This has improved success rates and made crystallization much more convenient. High-throughput crystallization, however, cannot relieve users of the task of producing samples of high quality. Moreover, the time gained from eliminating manual preparations must now be invested in the careful evaluation of the increased number of experiments. The latter requires a sophisticated data and laboratory information-management system. A review of the current state of automation at the individual steps of crystallization with specific attention to the automation of optimization is given.

  6. Test-beds for molecular electronics: metal-molecules-metal junctions based on Hg electrodes.

    Science.gov (United States)

    Simeone, Felice Carlo; Rampi, Maria Anita

    2010-01-01

    produced results, are convenient test-beds for molecular electronics and represent a useful complement to physics-based experimental methods.

  7. Photovoltaic Shading Testbed for Module-Level Power Electronics: 2016 Performance Data Update

    Energy Technology Data Exchange (ETDEWEB)

    Deline, Chris [National Renewable Energy Lab. (NREL), Golden, CO (United States); Meydbray, Jenya [PV Evolution Labs (PVEL), Davis, CA (United States); Donovan, Matt [PV Evolution Labs (PVEL), Davis, CA (United States)

    2016-09-01

    The 2012 NREL report 'Photovoltaic Shading Testbed for Module-Level Power Electronics' provides a standard methodology for estimating the performance benefit of distributed power electronics under partial shading conditions. Since the release of the report, experiments have been conducted for a number of products and for different system configurations. Drawing from these experiences, updates to the test and analysis methods are recommended. Proposed changes in data processing have the benefit of reducing the sensitivity to measurement errors and weather variability, as well as bringing the updated performance score in line with measured and simulated values of the shade recovery benefit of distributed PV power electronics. Also, due to the emergence of new technologies including sub-module embedded power electronics, the shading method has been extended to include power electronics that operate at a finer granularity than the module level. An update to the method is proposed to account for these emerging technologies that respond to shading differently than module-level devices. The partial shading test remains a repeatable test procedure that attempts to simulate shading situations as would be experienced by typical residential or commercial rooftop photovoltaic (PV) systems. Performance data for multiple products tested using this method are discussed, based on equipment from Enphase, Solar Edge, Maxim Integrated and SMA. In general, the annual recovery of shading losses from the module-level electronics evaluated is 25-35%, with the major difference between different trials being related to the number of parallel strings in the test installation rather than differences between the equipment tested. Appendix D data has been added in this update.

  8. Empowering Geoscience with Improved Data Assimilation Using the Data Assimilation Research Testbed "Manhattan" Release.

    Science.gov (United States)

    Raeder, K.; Hoar, T. J.; Anderson, J. L.; Collins, N.; Hendricks, J.; Kershaw, H.; Ha, S.; Snyder, C.; Skamarock, W. C.; Mizzi, A. P.; Liu, H.; Liu, J.; Pedatella, N. M.; Karspeck, A. R.; Karol, S. I.; Bitz, C. M.; Zhang, Y.

    2017-12-01

    The capabilities of the Data Assimilation Research Testbed (DART) at NCAR have been significantly expanded with the recent "Manhattan" release. DART is an ensemble Kalman filter based suite of tools, which enables researchers to use data assimilation (DA) without first becoming DA experts. Highlights: significant improvement in efficient ensemble DA for very large models on thousands of processors, direct read and write of model state files in parallel, more control of the DA output for finer-grained analysis, new model interfaces which are useful to a variety of geophysical researchers, new observation forward operators and the ability to use precomputed forward operators from the forecast model. The new model interfaces and example applications include the following: MPAS-A; Model for Prediction Across Scales - Atmosphere is a global, nonhydrostatic, variable-resolution mesh atmospheric model, which facilitates multi-scale analysis and forecasting. The absence of distinct subdomains eliminates problems associated with subdomain boundaries. It demonstrates the ability to consistently produce higher-quality analyses than coarse, uniform meshes do. WRF-Chem; Weather Research and Forecasting + (MOZART) Chemistry model assimilates observations from FRAPPÉ (Front Range Air Pollution and Photochemistry Experiment). WACCM-X; Whole Atmosphere Community Climate Model with thermosphere and ionosphere eXtension assimilates observations of electron density to investigate sudden stratospheric warming. CESM (weakly) coupled assimilation; NCAR's Community Earth System Model is used for assimilation of atmospheric and oceanic observations into their respective components using coupled atmosphere+land+ocean+sea+ice forecasts. CESM2.0; Assimilation in the atmospheric component (CAM, WACCM) of the newly released version is supported. This version contains new and extensively updated components and software environment. CICE; Los Alamos sea ice model (in CESM) is used to assimilate
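
    DART is built around ensemble Kalman filter algorithms; as a minimal, illustrative sketch of that family (a perturbed-observation EnKF analysis step, not DART's actual implementation), the snippet below updates a small forecast ensemble with two observations. All dimensions and values are illustrative.

        # Perturbed-observation ensemble Kalman filter analysis step (toy example).
        import numpy as np

        rng = np.random.default_rng(0)

        def enkf_analysis(X, y, H, R):
            """X: (n_state, n_ens) forecast ensemble; y: (n_obs,) observations;
            H: (n_obs, n_state) observation operator; R: (n_obs, n_obs) obs-error covariance."""
            n_ens = X.shape[1]
            A = X - X.mean(axis=1, keepdims=True)          # ensemble anomalies
            HA = H @ A
            Pf_Ht = A @ HA.T / (n_ens - 1)                  # cross covariance  P_f H^T
            S = HA @ HA.T / (n_ens - 1) + R                 # innovation covariance  H P_f H^T + R
            K = Pf_Ht @ np.linalg.inv(S)                    # Kalman gain
            Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, size=n_ens).T
            return X + K @ (Y - H @ X)                      # analysis ensemble

        # Tiny example: 3-variable state, 2 observations, 20 ensemble members.
        X = rng.normal(size=(3, 20))
        H = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
        R = 0.1 * np.eye(2)
        y = np.array([0.5, -0.2])
        print(enkf_analysis(X, y, H, R).mean(axis=1))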

  9. High energy nuclear database: a test-bed for nuclear data information technology

    International Nuclear Information System (INIS)

    Brown, D.A.; Vogt, R.; Beck, B.; Pruet, J.

    2008-01-01

    We describe the development of an on-line high-energy heavy-ion experimental database. When completed, the database will be searchable and cross-indexed with relevant publications, including published detector descriptions. While this effort is relatively new, it will eventually contain all published data from older heavy-ion programs as well as published data from current and future facilities. These data include all measured observables in proton-proton, proton-nucleus and nucleus-nucleus collisions. Once in general use, this database will have tremendous scientific payoff as it makes systematic studies easier and allows simpler benchmarking of theoretical models for a broad range of experiments. Furthermore, there is a growing need for compilations of high-energy nuclear data for applications including stockpile stewardship, technology development for inertial confinement fusion, target and source development for upcoming facilities such as the International Linear Collider and homeland security. This database is part of a larger proposal that includes the production of periodic data evaluations and topical reviews. These reviews would provide an alternative and impartial mechanism to resolve discrepancies between published data from rival experiments and between theory and experiment. Since this database will be a community resource, it requires the high-energy nuclear physics community's financial and manpower support. This project serves as a test-bed for the further development of an object-oriented nuclear data format and database system. By using 'off-the-shelf' software tools and techniques, the system is simple, robust, and extensible. Eventually we envision a 'Grand Unified Nuclear Format' encapsulating data types used in the ENSDF, Endf/B, EXFOR, NSR and other formats, including processed data formats. (authors)

  10. High Energy Nuclear Database: A Testbed for Nuclear Data Information Technology

    International Nuclear Information System (INIS)

    Brown, D A; Vogt, R; Beck, B; Pruet, J

    2007-01-01

    We describe the development of an on-line high-energy heavy-ion experimental database. When completed, the database will be searchable and cross-indexed with relevant publications, including published detector descriptions. While this effort is relatively new, it will eventually contain all published data from older heavy-ion programs as well as published data from current and future facilities. These data include all measured observables in proton-proton, proton-nucleus and nucleus-nucleus collisions. Once in general use, this database will have tremendous scientific payoff as it makes systematic studies easier and allows simpler benchmarking of theoretical models for a broad range of experiments. Furthermore, there is a growing need for compilations of high-energy nuclear data for applications including stockpile stewardship, technology development for inertial confinement fusion, target and source development for upcoming facilities such as the International Linear Collider and homeland security. This database is part of a larger proposal that includes the production of periodic data evaluations and topical reviews. These reviews would provide an alternative and impartial mechanism to resolve discrepancies between published data from rival experiments and between theory and experiment. Since this database will be a community resource, it requires the high-energy nuclear physics community's financial and manpower support. This project serves as a testbed for the further development of an object-oriented nuclear data format and database system. By using ''off-the-shelf'' software tools and techniques, the system is simple, robust, and extensible. Eventually we envision a ''Grand Unified Nuclear Format'' encapsulating data types used in the ENSDF, ENDF/B, EXFOR, NSR and other formats, including processed data formats

  11. CubeSub - A CubeSat Based Submersible Testbed for Space Technology

    Science.gov (United States)

    Slettebo, Christian

    2016-01-01

    This report is a Master's Thesis in Aerospace Engineering, performed at the NASA Ames Research Center. It describes the development of the CubeSub, a submersible testbed compatible with the CubeSat form factor. The CubeSub will be used to mature technology and operational procedures to be used in space exploration, and possibly also as a tool for exploration of Earthly environments. CubeSats are carried as payloads, either containing technology to be tested or experiments and sensors for scientific use. The CubeSub is designed to be built up by modules, which can be assembled in different configurations to fulfill different needs. Each module is powered individually and intermodular communication is wireless, reducing the need for wiring. The inside of the hull is flooded with ambient water to simplify the interaction between payloads and surrounding environment. The overall shape is similar to that of a conventional AUV, slender and smooth. This is to make for a low drag, reduce the risk of snagging on surrounding objects and make it possible to deploy through an ice sheet via a narrow borehole. Rapid prototyping is utilized to a large extent, with full-scale prototypes being constructed through 3D-printing and with COTS (Commercial Off-The-Shelf) components. Arduino boards are used for control and internal communication. Modules required for basic operation have been designed, manufactured and tested. Each module is described with regards to its function, design and manufacturability. By performing tests in a pool it was found that the basic concept is sound and that future improvements include better controllability, course stability and waterproofing of electrical components. Further development is needed to make the CubeSub usable for its intended purposes. The largest gains are expected to be found by developing the software and improving controllability.

  12. “Modular Biospheres” New testbed platforms for public environmental education and research

    Science.gov (United States)

    Nelson, M.; Dempster, W. F.; Allen, J. P.

    This paper will review the potential of a relatively new type of testbed platform for environmental education and research because of the unique advantages resulting from their material closure and separation from the outside environment. These facilities which we term "modular biospheres", have emerged from research centered on space life support research but offer a wider range of application. Examples of this type of facility include the Bios-3 facility in Russia, the Japanese CEEF (Closed Ecological Experiment Facility), the NASA Kennedy Space Center Breadboard facility, the Biosphere 2 Test Module and the Laboratory Biosphere. Modular biosphere facilities offer unique research and public real-time science education opportunities. Ecosystem behavior can be studied since initial state conditions can be precisely specified and tracked over different ranges of time. With material closure (apart from very small air exchange rate which can be determined), biogeochemical cycles between soil and soil microorganisms, water, plants, and atmosphere can be studied in detail. Such studies offer a major advance from studies conducted with phytotrons which because of their small size, limit the number of organisms to a very small number, and which crucially do not have a high degree of atmospheric, water and overall material closure. Modular biospheres take advantage of the unique properties of closure, as representing a distinct system "metabolism" and therefore are essentially a "mini-world". Though relatively large in comparison with most phytotrons and ecological microcosms, which are now standard research and educational tools, modular biospheres are small enough that they can be economically reconfigured to reflect a changing research agenda. Some design elements include lighting via electric lights and/or sunlight, hydroponic or soil substrate for plants, opaque or glazed structures, and variable volume chambers or other methods to handle atmospheric pressure

  13. AUTOMATED INADVERTENT INTRUDER APPLICATION

    International Nuclear Information System (INIS)

    Koffman, L.; Lee, P.; Cook, J.; Wilhite, E.

    2007-01-01

    The Environmental Analysis and Performance Modeling group of Savannah River National Laboratory (SRNL) conducts performance assessments of the Savannah River Site (SRS) low-level waste facilities to meet the requirements of DOE Order 435.1. These performance assessments, which result in limits on the amounts of radiological substances that can be placed in the waste disposal facilities, consider numerous potential exposure pathways that could occur in the future. One set of exposure scenarios, known as inadvertent intruder analysis, considers the impact on hypothetical individuals who are assumed to inadvertently intrude onto the waste disposal site. Inadvertent intruder analysis considers three distinct scenarios for exposure referred to as the agriculture scenario, the resident scenario, and the post-drilling scenario. Each of these scenarios has specific exposure pathways that contribute to the overall dose for the scenario. For the inadvertent intruder analysis, the calculation of dose for the exposure pathways is a relatively straightforward algebraic calculation that utilizes dose conversion factors. Prior to 2004, these calculations were performed using an Excel spreadsheet. However, design checks of the spreadsheet calculations revealed that errors could be introduced inadvertently when copying spreadsheet formulas cell by cell and finding these errors was tedious and time consuming. This weakness led to the specification of functional requirements to create a software application that would automate the calculations for inadvertent intruder analysis using a controlled source of input parameters. This software application, named the Automated Inadvertent Intruder Application, has undergone rigorous testing of the internal calculations and meets software QA requirements. The Automated Inadvertent Intruder Application was intended to replace the previous spreadsheet analyses with an automated application that was verified to produce the same calculations and
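
    The pathway arithmetic that the application automates can be pictured as a nested sum over radionuclides and exposure pathways of concentrations, intake factors, and dose conversion factors. The sketch below is illustrative only; the nuclides, pathways, and numerical values are placeholders rather than SRS performance-assessment inputs.

        # Illustrative sketch of a pathway dose sum of the kind the application automates.
        concentrations = {"Cs-137": 2.0, "Sr-90": 1.5}       # pCi per gram of soil (assumed)
        pathways = {
            # pathway: (annual intake of soil-derived material in g, DCF in mrem per pCi)
            "soil_ingestion": (100.0, 5.0e-5),
            "crop_ingestion": (250.0, 3.0e-5),
        }

        total_dose = 0.0
        for nuclide, conc in concentrations.items():
            for pathway, (intake, dcf) in pathways.items():
                total_dose += conc * intake * dcf             # mrem/yr contribution
        print(f"Total intruder dose: {total_dose:.3f} mrem/yr")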

  14. M1 mirror print-through investigation and performance on the thermo-opto-mechanical testbed for the Space Interferometry Mission

    Science.gov (United States)

    Feria, V. Alfonso; Lam, Jonathan; Van Buren, Dave

    2006-06-01

    SIM PlanetQuest (SIM) is a large (9-meter baseline) space-borne optical interferometer that will determine the position and distance of stars to high accuracy. With microarcsecond measurements SIM will probe nearby stars for Earth-sized planets. To achieve this precision, SIM requires very tight manufacturing tolerances and high stability of optical components. To reduce technical risks, the SIM project developed an integrated thermal, mechanical and optical testbed (TOM3) to allow predictions of the system performance at the required high precision. The TOM3 testbed used full-scale brassboard optical components and picometer-class metrology to reach the SIM target performance levels. During the testbed integration and after one of the testbed mirrors, M1, was bonded into its mount, some surface distortion dimples that exceeded the optical specification were discovered. A detailed finite element model was used to analyze different load cases to try to determine the source of the M1 surface deformations. The same model was also used to compare with actual deformations due to varied thermal conditions on the TOM3 testbed. This paper presents the studies carried out to determine the source of the surface distortions on the M1 mirror as well as comparison and model validation during testing. This research was carried out at the Jet Propulsion Laboratory, California Institute of Technology, under a contract with the National Aeronautics and Space Administration.

  15. Automation, Performance and International Competition

    DEFF Research Database (Denmark)

    Kromann, Lene; Sørensen, Anders

    This paper presents new evidence on trade‐induced automation in manufacturing firms using unique data combining a retrospective survey that we have assembled with register data for 2005‐2010. In particular, we establish a causal effect where firms that have specialized in product types for which...... the Chinese exports to the world market has risen sharply invest more in automated capital compared to firms that have specialized in other product types. We also study the relationship between automation and firm performance and find that firms with high increases in scale and scope of automation have faster...... productivity growth than other firms. Moreover, automation improves the efficiency of all stages of the production process by reducing setup time, run time, and inspection time and increasing uptime and quantity produced per worker. The efficiency improvement varies by type of automation....

  16. Automation in organizations: Eternal conflict

    Science.gov (United States)

    Dieterly, D. L.

    1981-01-01

    Some ideas on and insights into the problems associated with automation in organizations are presented with emphasis on the concept of automation, its relationship to the individual, and its impact on system performance. An analogy is drawn, based on an American folk hero, to emphasize the extent of the problems encountered when dealing with automation within an organization. A model is proposed to focus attention on a set of appropriate dimensions. The function allocation process becomes a prominent aspect of the model. The current state of automation research is mentioned in relation to the ideas introduced. Proposed directions for an improved understanding of automation's effect on the individual's efficiency are discussed. The importance of understanding the individual's perception of the system in terms of the degree of automation is highlighted.

  17. Full-Scaled Advanced Systems Testbed: Ensuring Success of Adaptive Control Research Through Project Lifecycle Risk Mitigation

    Science.gov (United States)

    Pavlock, Kate M.

    2011-01-01

    The National Aeronautics and Space Administration's Dryden Flight Research Center completed flight testing of adaptive controls research on the Full-Scale Advanced Systems Testbed (FAST) in January of 2011. The research addressed technical challenges involved with reducing risk in an increasingly complex and dynamic national airspace. Specific challenges lie with the development of validated, multidisciplinary, integrated aircraft control design tools and techniques to enable safe flight in the presence of adverse conditions such as structural damage, control surface failures, or aerodynamic upsets. The testbed is an F-18 aircraft serving as a full-scale vehicle to test and validate adaptive flight control research and lends significant confidence to the development, maturation, and acceptance process of incorporating adaptive control laws into follow-on research and the operational environment. The experimental systems integrated into FAST were designed to allow for flexible yet safe flight test evaluation and validation of modern adaptive control technologies and revolve around two major hardware upgrades: the modification of Production Support Flight Control Computers (PSFCC) and integration of two, fourth-generation Airborne Research Test Systems (ARTS). Post-hardware integration verification and validation provided the foundation for safe flight test of Nonlinear Dynamic Inversion and Model Reference Adaptive Control adaptive control law experiments. To ensure the success of flight test in terms of cost, schedule, and test results, emphasis on risk management was incorporated into early stages of design and flight test planning and continued through the execution of each flight test mission. Specific consideration was made to incorporate safety features within the hardware and software to alleviate user demands as well as into test processes and training to reduce human factor impacts to safe and successful flight test. This paper describes the research configuration

  18. Development of Open Test-bed for Autonomous Operation in Nuclear Power Plants

    International Nuclear Information System (INIS)

    Kim, Seungmin; Heo, Gyunyoung

    2017-01-01

    The nuclear power industry also recognizes the need for automation, but it is a technology with significant consequences for human society. In addition, because legal responsibility for autonomous operation remains uncertain, automation technology for nuclear energy is likely to be adopted and developed much more slowly than in other industries. It is also argued that AI and automation should not be applied to power plants prematurely, or without following the principle of using only proven technology, since nuclear power plants are operated at the highest levels of safety and security. The overall structure of the test bed is an autonomous operation algorithm (a rule-based algorithm, a learning-based algorithm, and a semi-autonomous operation algorithm) that judges procedure entry conditions through condition monitoring and then enters the appropriate operating procedure. Building the test bed requires investigating the heuristic parts of the existing procedures, as well as the heuristics operators apply in circumstances that the procedures do not specify. For the learning-based and semi-autonomous operation algorithms, MARS is used to generate operating data and operational logs against which the various diagnostic algorithms described above can be tried out. Once these tasks are completed, a test bed that can be compared against actual operators will be available, and its effectiveness can be checked through competitive benchmarking with other research teams, enabled by the shared-platform approach.
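
    A minimal sketch of the rule-based procedure-entry check described above is shown below; the monitored parameters, thresholds, and procedure names are hypothetical and are not taken from the paper.

        # Minimal sketch of a rule-based procedure-entry check (all values hypothetical).
        def select_procedure(plant_state):
            rules = [
                (lambda s: s["reactor_trip"], "E-0: Reactor Trip / Safety Injection"),
                (lambda s: s["sg_level_pct"] < 25.0, "AOP: Low Steam Generator Level"),
                (lambda s: s["rcs_pressure_mpa"] > 16.5, "AOP: High RCS Pressure"),
            ]
            for condition, procedure in rules:
                if condition(plant_state):          # first matching entry condition wins
                    return procedure
            return "GOP: Normal Operation"

        state = {"reactor_trip": False, "sg_level_pct": 22.0, "rcs_pressure_mpa": 15.2}
        print(select_procedure(state))   # -> "AOP: Low Steam Generator Level"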

  19. Automation System Products and Research

    OpenAIRE

    Rintala, Mikko; Sormunen, Jussi; Kuisma, Petri; Rahkala, Matti

    2014-01-01

    Automation systems are used in most buildings nowadays. In the past they were mainly used in industry to control and monitor critical systems. During the past few decades the automation systems have become more common and are used today from big industrial solutions to homes of private customers. With the growing need for ecologic and cost-efficient management systems, home and building automation systems are becoming a standard way of controlling lighting, ventilation, heating etc. Auto...

  20. Guidelines for Automation Project Execution

    OpenAIRE

    Takkinen, Heidi

    2011-01-01

    The purpose of this Master’s thesis was to create instructions for executing an automation project. Sarlin Oy Ab needed directions on how to execute an automation project. Sarlin is starting up a new business area offering total project solutions for customers. Sarlin focuses on small and minor automation projects on domestic markets. The thesis represents issues related to project execution starting from the theory of the project to its kick-off and termination. Site work is one importan...

  1. 78 FR 53466 - Modification of Two National Customs Automation Program (NCAP) Tests Concerning Automated...

    Science.gov (United States)

    2013-08-29

    ... Customs Automation Program (NCAP) Tests Concerning Automated Commercial Environment (ACE) Document Image... National Customs Automation Program (NCAP) tests concerning document imaging, known as the Document Image... the National Customs Automation Program (NCAP) tests concerning document imaging, known as the...

  2. NASA Langley's AirSTAR Testbed: A Subscale Flight Test Capability for Flight Dynamics and Control System Experiments

    Science.gov (United States)

    Jordan, Thomas L.; Bailey, Roger M.

    2008-01-01

    As part of the Airborne Subscale Transport Aircraft Research (AirSTAR) project, NASA Langley Research Center (LaRC) has developed a subscaled flying testbed in order to conduct research experiments in support of the goals of NASA s Aviation Safety Program. This research capability consists of three distinct components. The first of these is the research aircraft, of which there are several in the AirSTAR stable. These aircraft range from a dynamically-scaled, twin turbine vehicle to a propeller driven, off-the-shelf airframe. Each of these airframes carves out its own niche in the research test program. All of the airplanes have sophisticated on-board data acquisition and actuation systems, recording, telemetering, processing, and/or receiving data from research control systems. The second piece of the testbed is the ground facilities, which encompass the hardware and software infrastructure necessary to provide comprehensive support services for conducting flight research using the subscale aircraft, including: subsystem development, integrated testing, remote piloting of the subscale aircraft, telemetry processing, experimental flight control law implementation and evaluation, flight simulation, data recording/archiving, and communications. The ground facilities are comprised of two major components: (1) The Base Research Station (BRS), a LaRC laboratory facility for system development, testing and data analysis, and (2) The Mobile Operations Station (MOS), a self-contained, motorized vehicle serving as a mobile research command/operations center, functionally equivalent to the BRS, capable of deployment to remote sites for supporting flight tests. The third piece of the testbed is the test facility itself. Research flights carried out by the AirSTAR team are conducted at NASA Wallops Flight Facility (WFF) on the Eastern Shore of Virginia. The UAV Island runway is a 50 x 1500 paved runway that lies within restricted airspace at Wallops Flight Facility. The

  3. Use of Bioregenerative Technologies for Advanced Life Support: Some Considerations for BIO-Plex and Related Testbeds

    Science.gov (United States)

    Wheeler, Raymond M.; Strayer, Richard F.

    1997-01-01

    A review of bioregenerative life support concepts is provided as a guide for developing ground-based testbeds for NASA's Advanced Life Support Program. Key among these concepts are the use of controlled environment plant culture for the production of food, oxygen, and clean water, and the use of bacterial bioreactors for degrading wastes and recycling nutrients. Candidate crops and specific bioreactor approaches are discussed based on experiences from the Kennedy Space Center Advanced Life Support Breadboard Project, and a review of related literature is provided.

  4. World-wide distribution automation systems

    International Nuclear Information System (INIS)

    Devaney, T.M.

    1994-01-01

    A worldwide power distribution automation system is outlined. Distribution automation is defined and the status of utility automation is discussed. Other topics discussed include a distribution management system, substation feeder, and customer functions, potential benefits, automation costs, planning and engineering considerations, automation trends, databases, system operation, computer modeling of system, and distribution management systems

  5. Contaminant analysis automation, an overview

    International Nuclear Information System (INIS)

    Hollen, R.; Ramos, O. Jr.

    1996-01-01

    To meet the environmental restoration and waste minimization goals of government and industry, several government laboratories, universities, and private companies have formed the Contaminant Analysis Automation (CAA) team. The goal of this consortium is to design and fabricate robotics systems that standardize and automate the hardware and software of the most common environmental chemical methods. In essence, the CAA team takes conventional, regulatory-approved (EPA Methods) chemical analysis processes and automates them. The automation consists of standard laboratory modules (SLMs) that perform the work in a much more efficient, accurate, and cost-effective manner

  6. Development and application of an actively controlled hybrid proton exchange membrane fuel cell - Lithium-ion battery laboratory test-bed based on off-the-shelf components

    Energy Technology Data Exchange (ETDEWEB)

    Yufit, V.; Brandon, N.P. [Dept. Earth Science and Engineering, Imperial College, London SW7 2AZ (United Kingdom)

    2011-01-15

    The use of commercially available components enables rapid prototyping and assembling of laboratory scale hybrid test-bed systems, which can be used to evaluate new hybrid configurations. The development of such a test-bed using an off-the-shelf PEM fuel cell, lithium-ion battery and DC/DC converter is presented here, and its application to a hybrid configuration appropriate for an unmanned underwater vehicle is explored. A control algorithm was implemented to regulate the power share between the fuel cell and the battery with a graphical interface to control, record and analyze the electrochemical and thermal parameters of the system. The results demonstrate the applicability of the test-bed and control algorithm for this application, and provide data on the dynamic electrical and thermal behaviour of the hybrid system. (author)
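
    A minimal sketch of the kind of power-share rule such a test-bed can exercise is shown below. It is not the authors' control algorithm; the power limit, ramp rate, and state-of-charge bias are illustrative assumptions.

        # Hedged sketch of a fuel cell / battery power-share rule (all limits illustrative).
        FC_MAX_W = 300.0        # assumed fuel cell power limit
        FC_RAMP_W_PER_S = 20.0  # assumed fuel cell ramp-rate limit

        def split_power(demand_w, fc_prev_w, soc, dt=1.0):
            # Bias the fuel cell set point upward when battery state of charge is low.
            target = demand_w + (0.5 - soc) * 100.0
            target = min(max(target, 0.0), FC_MAX_W)
            # Respect the fuel cell ramp-rate limit; the battery absorbs transients.
            step = max(min(target - fc_prev_w, FC_RAMP_W_PER_S * dt), -FC_RAMP_W_PER_S * dt)
            fc_w = fc_prev_w + step
            battery_w = demand_w - fc_w      # positive = discharge, negative = charge
            return fc_w, battery_w

        print(split_power(demand_w=350.0, fc_prev_w=250.0, soc=0.6))   # -> (270.0, 80.0)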

  7. Automating CPM-GOMS

    Science.gov (United States)

    John, Bonnie; Vera, Alonso; Matessa, Michael; Freed, Michael; Remington, Roger

    2002-01-01

    CPM-GOMS is a modeling method that combines the task decomposition of a GOMS analysis with a model of human resource usage at the level of cognitive, perceptual, and motor operations. CPM-GOMS models have made accurate predictions about skilled user behavior in routine tasks, but developing such models is tedious and error-prone. We describe a process for automatically generating CPM-GOMS models from a hierarchical task decomposition expressed in a cognitive modeling tool called Apex. Resource scheduling in Apex automates the difficult task of interleaving the cognitive, perceptual, and motor resources underlying common task operators (e.g. mouse move-and-click). Apex's UI automatically generates PERT charts, which allow modelers to visualize a model's complex parallel behavior. Because interleaving and visualization are now automated, it is feasible to construct arbitrarily long sequences of behavior. To demonstrate the process, we present a model of automated teller interactions in Apex and discuss implications for user modeling. Of the methods available to model human users, the Goals, Operators, Methods, and Selection (GOMS) method [6, 21] has been the most widely used, providing accurate, often zero-parameter, predictions of the routine performance of skilled users in a wide range of procedural tasks [6, 13, 15, 27, 28]. GOMS is meant to model routine behavior. The user is assumed to have methods that apply sequences of operators to achieve a goal. Selection rules are applied when there is more than one method to achieve a goal. Many routine tasks lend themselves well to such decomposition. Decomposition produces a representation of the task as a set of nested goal states that include an initial state and a final state. The iterative decomposition into goals and nested subgoals can terminate in primitives of any desired granularity, the choice of level of detail dependent on the predictions required. Although GOMS has proven useful in HCI, tools to support the
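
    The prediction a CPM-GOMS model produces is essentially the critical-path length through a network of cognitive, perceptual, and motor operators. The sketch below computes that quantity for a toy operator network; the operator names, durations, and dependencies are illustrative and are not Apex output.

        # Toy critical-path (PERT-style) computation over a DAG of operators (durations in ms).
        operators = {
            "perceive-target":     (100, []),
            "cognitive-init-move": (50,  ["perceive-target"]),
            "move-mouse":          (300, ["cognitive-init-move"]),
            "cognitive-verify":    (50,  ["perceive-target"]),
            "click":               (100, ["move-mouse", "cognitive-verify"]),
        }

        finish = {}
        def earliest_finish(op):
            if op not in finish:
                duration, deps = operators[op]
                finish[op] = duration + max((earliest_finish(d) for d in deps), default=0)
            return finish[op]

        print(earliest_finish("click"))   # predicted task time along the critical path (550)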

  8. Automating dipole subtraction

    International Nuclear Information System (INIS)

    Hasegawa, K.; Moch, S.; Uwer, P.

    2008-07-01

    We report on automating the Catani-Seymour dipole subtraction which is a general procedure to treat infrared divergences in real emission processes at next-to-leading order in QCD. The automatization rests on three essential steps: the creation of the dipole terms, the calculation of the color linked squared Born matrix elements, and the evaluation of different helicity amplitudes. The routines have been tested for a number of complex processes, such as the real emission process gg→t anti tggg. (orig.)
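
    For context, the Catani-Seymour construction rearranges the NLO cross section schematically as (the standard form quoted from the general literature, not from this record):

        \sigma^{NLO} = \int_{m+1} \left[ d\sigma^{R} - d\sigma^{A} \right] + \int_{m} \left[ d\sigma^{V} + \int_{1} d\sigma^{A} \right]

    where the auxiliary cross section d\sigma^{A} is the sum of dipole terms that reproduces, point by point in phase space, the soft and collinear singularities of the real-emission contribution d\sigma^{R}, so that both integrals are separately finite.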

  9. Automating dipole subtraction

    Energy Technology Data Exchange (ETDEWEB)

    Hasegawa, K.; Moch, S. [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany); Uwer, P. [Karlsruhe Univ. (T.H.) (Germany). Inst. fuer Theoretische Teilchenphysik

    2008-07-15

    We report on automating the Catani-Seymour dipole subtraction which is a general procedure to treat infrared divergences in real emission processes at next-to-leading order in QCD. The automatization rests on three essential steps: the creation of the dipole terms, the calculation of the color linked squared Born matrix elements, and the evaluation of different helicity amplitudes. The routines have been tested for a number of complex processes, such as the real emission process gg → t anti-t ggg. (orig.)

  10. Fossil power plant automation

    International Nuclear Information System (INIS)

    Divakaruni, S.M.; Touchton, G.

    1991-01-01

    This paper elaborates on issues facing the utilities industry and seeks to address how new computer-based control and automation technologies resulting from recent microprocessor evolution, can improve fossil plant operations and maintenance. This in turn can assist utilities to emerge stronger from the challenges ahead. Many presentations at the first ISA/EPRI co-sponsored conference are targeted towards improving the use of computer and control systems in the fossil and nuclear power plants and we believe this to be the right forum to share our ideas

  11. Passive Thermal Design Approach for the Space Communications and Navigation (SCaN) Testbed Experiment on the International Space Station (ISS)

    Science.gov (United States)

    Siamidis, John; Yuko, Jim

    2014-01-01

    The Space Communications and Navigation (SCaN) Program Office at NASA Headquarters oversees all of NASA's space communications activities. SCaN manages and directs the ground-based facilities and services provided by the Deep Space Network (DSN), Near Earth Network (NEN), and the Space Network (SN). Through the SCaN Program Office, NASA GRC developed a Software Defined Radio (SDR) testbed experiment (SCaN testbed experiment) for use on the International Space Station (ISS). It is comprised of three different SDR radios: the Jet Propulsion Laboratory (JPL) radio, the Harris Corporation radio, and the General Dynamics Corporation radio. The SCaN testbed experiment provides an on-orbit, adaptable, SDR Space Telecommunications Radio System (STRS)-based facility to conduct a suite of experiments to advance Software Defined Radio and Space Telecommunications Radio System (STRS) standards, reduce risk (Technology Readiness Level (TRL) advancement) for candidate Constellation future space flight hardware and software, and demonstrate space communication links critical to future NASA exploration missions. The SCaN testbed project provides NASA, industry, other Government agencies, and academic partners the opportunity to develop and field communications, navigation, and networking technologies in the laboratory and space environment based on reconfigurable, software defined radio platforms and the STRS Architecture. The SCaN testbed is resident on the P3 Express Logistics Carrier (ELC) on the exterior truss of the International Space Station (ISS). The SCaN testbed payload launched on the Japanese Aerospace Exploration Agency (JAXA) H-II Transfer Vehicle (HTV) and was installed on the ISS P3 ELC located on the inboard RAM P3 site. The daily operations and testing are managed out of NASA GRC in the Telescience Support Center (TSC).

  12. The Climate-G testbed: towards a large scale data sharing environment for climate change

    Science.gov (United States)

    Aloisio, G.; Fiore, S.; Denvil, S.; Petitdidier, M.; Fox, P.; Schwichtenberg, H.; Blower, J.; Barbera, R.

    2009-04-01

    The Climate-G testbed provides an experimental large scale data environment for climate change addressing challenging data and metadata management issues. The main scope of Climate-G is to allow scientists to carry out geographical and cross-institutional climate data discovery, access, visualization and sharing. Climate-G is a multidisciplinary collaboration involving both climate and computer scientists and it currently involves several partners such as: Centro Euro-Mediterraneo per i Cambiamenti Climatici (CMCC), Institut Pierre-Simon Laplace (IPSL), Fraunhofer Institut für Algorithmen und Wissenschaftliches Rechnen (SCAI), National Center for Atmospheric Research (NCAR), University of Reading, University of Catania and University of Salento. To perform distributed metadata search and discovery, we adopted a CMCC metadata solution (which provides a high level of scalability, transparency, fault tolerance and autonomy) leveraging both on P2P and grid technologies (GRelC Data Access and Integration Service). Moreover, data are available through OPeNDAP/THREDDS services, Live Access Server as well as the OGC compliant Web Map Service and they can be downloaded, visualized, accessed into the proposed environment through the Climate-G Data Distribution Centre (DDC), the web gateway to the Climate-G digital library. The DDC is a data-grid portal allowing users to easily, securely and transparently perform search/discovery, metadata management, data access, data visualization, etc. Godiva2 (integrated into the DDC) displays 2D maps (and animations) and also exports maps for display on the Google Earth virtual globe. Presently, Climate-G publishes (through the DDC) about 2TB of data related to the ENSEMBLES project (also including distributed replicas of data) as well as to the IPCC AR4. The main results of the proposed work are: wide data access/sharing environment for climate change; P2P/grid metadata approach; production-level Climate-G DDC; high quality tools for

  13. First results of the Test-Bed Telescopes (TBT) project: Cebreros telescope commissioning

    Science.gov (United States)

    Ocaña, Francisco; Ibarra, Aitor; Racero, Elena; Montero, Ángel; Doubek, Jirí; Ruiz, Vicente

    2016-07-01

    The TBT project is being developed under ESA's General Studies and Technology Programme (GSTP), and shall implement a test-bed for the validation of an autonomous optical observing system in a realistic scenario within the Space Situational Awareness (SSA) programme of the European Space Agency (ESA). The goal of the project is to provide two fully robotic telescopes, which will serve as prototypes for development of a future network. The system consists of two telescopes, one in Spain and the second one in the Southern Hemisphere. The telescope is a fast astrograph with a large Field of View (FoV) of 2.5 x 2.5 square-degrees and a plate scale of 2.2 arcsec/pixel. The tube is mounted on a fast direct-drive mount moving at speeds of up to 20 degrees per second. The focal plane hosts a 2-port 4K x 4K back-illuminated CCD with readout speeds up to 1MHz per port. All these characteristics ensure good survey performance for transients and fast moving objects. Detection software and hardware are optimised for the detection of NEOs and objects in high Earth orbits (objects moving from 0.1-40 arcsec/second). Nominal exposures are in the range from 2 to 30 seconds, depending on the observational strategy. Part of the validation scenario involves the scheduling concept integrated in the robotic operations for both sensors. Every night the scheduler takes all the input needed and prepares a schedule following predefined rules, allocating tasks for the telescopes. The telescopes are managed by the RTS2 control software, which performs the real-time scheduling of the observations and manages all the devices at the observatory. At the end of the night the observing systems report astrometric positions and photometry of the objects detected. The first telescope was installed at the Cebreros Satellite Tracking Station in mid-2015. It is currently in the commissioning phase and we present here the first results of the telescope. We evaluate the site characteristics and the performance of the TBT Cebreros

  14. A Future Accelerated Cognitive Distributed Hybrid Testbed for Big Data Science Analytics

    Science.gov (United States)

    Halem, M.; Prathapan, S.; Golpayegani, N.; Huang, Y.; Blattner, T.; Dorband, J. E.

    2016-12-01

    As increased sensor spectral data volumes from current and future Earth Observing satellites are assimilated into high-resolution climate models, intensive cognitive machine learning technologies are needed to data mine, extract and intercompare model outputs. It is clear today that the next generation of computers and storage, beyond petascale cluster architectures, will be data centric. They will manage data movement and process data in place. Future cluster nodes have been announced that integrate multiple CPUs with high-speed links to GPUs and MICS on their backplanes with massive non-volatile RAM and access to active flash RAM disk storage. Active Ethernet connected key value store disk storage drives with 10Ge or higher are now available through the Kinetic Open Storage Alliance. At the UMBC Center for Hybrid Multicore Productivity Research, a future state-of-the-art Accelerated Cognitive Computer System (ACCS) for Big Data science is being integrated into the current IBM iDataplex computational system `bluewave'. Based on the next gen IBM 200 PF Sierra processor, an interim two node IBM Power S822 testbed is being integrated with dual Power 8 processors with 10 cores, 1TB Ram, a PCIe to a K80 GPU and an FPGA Coherent Accelerated Processor Interface card to 20TB Flash Ram. This system is to be updated to the Power 8+, an NVlink 1.0 with the Pascal GPU late in 2016. Moreover, the Seagate 96TB Kinetic Disk system with 24 Ethernet connected active disks is integrated into the ACCS storage system. A Lightweight Virtual File System developed at the NASA GSFC is installed on bluewave. Since remote access to publicly available quantum annealing computers is available at several govt labs, the ACCS will offer an in-line Restricted Boltzmann Machine optimization capability to the D-Wave 2X quantum annealing processor over the campus high speed 100 Gb network to Internet 2 for large files. As an evaluation test of the cognitive functionality of the architecture, the

  15. Future space-based direct imaging platforms: high fidelity simulations and instrument testbed development

    Science.gov (United States)

    Hicks, Brian A.; Eberhardt, Andrew; SAINT, VNC, LUVOIR

    2017-06-01

    The direct detection and characterization of habitable zone (HZ) Earth-like exoplanets is predicated on the light-gathering power of a large telescope operating with tens of milliarcsecond angular resolution, and at contrast scales on the order of 0.1 ppb. Accessing a statistically significant sample of planets to search for habitable worlds will likely build on the knowledge and infrastructure gained through JWST, later advancing to assembly in space or formation flying approaches that may eventually be used to achieve even greater photometric sensitivity or resolution. In order to address contrast, a means of starlight suppression is needed that contends with complex aperture diffraction. The Visible Nulling Coronagraph (VNC) is one such approach that destructively interferes starlight to enable detection and characterization of extrasolar objects. The VNC is being incorporated into an end-to-end telescope-coronagraph system demonstrator called the Segmented Aperture Interferometric Nulling Testbed (SAINT). Development of the VNC has a rich legacy, and successfully demonstrating its capability with SAINT will mark milestones towards meeting the high-contrast direct imaging needs of future large space telescopes. SAINT merges the VNC with an actively-controlled segmented aperture telescope via a fine pointing system and aims to demonstrate 1e-8 contrast nulling of a segmented aperture at an inner working angle of four diffraction radii over a 20 nm visible bandpass. The system comprises four detectors for wavefront sensing, one of which is the high-contrast focal plane. The detectors provide feedback to control the segmented telescope primary mirror, a fast steering mirror, a segmented deformable mirror, and a delay stage. All of these components must work in concert with passive optical elements that are designed, fabricated, and aligned pairwise to achieve the requisite wavefront symmetry needed to push the state of the art in broadband destructive interferometric

  16. Automated Test Case Generation

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    I would like to present the concept of automated test case generation. I work on it as part of my PhD and I think it would be interesting also for other people. It is also the topic of a workshop paper that I am introducing in Paris. (abstract below) Please note that the talk itself would be more general and not about the specifics of my PhD, but about the broad field of Automated Test Case Generation. I would introduce the main approaches (combinatorial testing, symbolic execution, adaptive random testing) and their advantages and problems. (oracle problem, combinatorial explosion, ...) Abstract of the paper: Over the last decade code-based test case generation techniques such as combinatorial testing or dynamic symbolic execution have seen growing research popularity. Most algorithms and tool implementations are based on finding assignments for input parameter values in order to maximise the execution branch coverage. Only few of them consider dependencies from outside the Code Under Test’s scope such...
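
    Of the approaches named above, adaptive random testing is the easiest to illustrate: candidates are generated at random and the one farthest (by some distance metric) from the already-executed tests is kept. The Python sketch below shows a toy one-dimensional version; the domain bounds and candidate-pool size are arbitrary.

        import random

        def adaptive_random_tests(n_tests, candidates_per_step=10, lo=0.0, hi=100.0):
            """Toy adaptive random testing over a 1-D numeric input domain.

            At each step, generate several random candidates and keep the one whose
            minimum distance to the already-selected tests is largest.
            """
            selected = [random.uniform(lo, hi)]
            while len(selected) < n_tests:
                candidates = [random.uniform(lo, hi) for _ in range(candidates_per_step)]
                best = max(candidates, key=lambda c: min(abs(c - s) for s in selected))
                selected.append(best)
            return selected

        print(adaptive_random_tests(5))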

  17. Automating quantum experiment control

    Science.gov (United States)

    Stevens, Kelly E.; Amini, Jason M.; Doret, S. Charles; Mohler, Greg; Volin, Curtis; Harter, Alexa W.

    2017-03-01

    The field of quantum information processing is rapidly advancing. As the control of quantum systems approaches the level needed for useful computation, the physical hardware underlying the quantum systems is becoming increasingly complex. It is already becoming impractical to manually code control for the larger hardware implementations. In this chapter, we will employ an approach to the problem of system control that parallels compiler design for a classical computer. We will start with a candidate quantum computing technology, the surface electrode ion trap, and build a system instruction language which can be generated from a simple machine-independent programming language via compilation. We incorporate compile time generation of ion routing that separates the algorithm description from the physical geometry of the hardware. Extending this approach to automatic routing at run time allows for automated initialization of qubit number and placement and additionally allows for automated recovery after catastrophic events such as qubit loss. To show that these systems can handle real hardware, we present a simple demonstration system that routes two ions around a multi-zone ion trap and handles ion loss and ion placement. While we will mainly use examples from transport-based ion trap quantum computing, many of the issues and solutions are applicable to other architectures.
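
    Run-time ion routing of the kind described can be pictured as shortest-path search over the trap's zone-connectivity graph, with occupied zones treated as blocked. The sketch below is a toy illustration; the zone layout is hypothetical and does not describe the trap used in the demonstration.

        from collections import deque

        # Hypothetical trap connectivity: zones as nodes, allowed shuttling moves as edges.
        trap_graph = {
            "load": ["junction"],
            "junction": ["load", "gate1", "gate2"],
            "gate1": ["junction"],
            "gate2": ["junction", "readout"],
            "readout": ["gate2"],
        }

        def route_ion(start, goal, blocked=()):
            """Breadth-first search for a shuttling path that avoids occupied zones."""
            queue = deque([[start]])
            seen = {start}
            while queue:
                path = queue.popleft()
                if path[-1] == goal:
                    return path
                for nxt in trap_graph[path[-1]]:
                    if nxt not in seen and nxt not in blocked:
                        seen.add(nxt)
                        queue.append(path + [nxt])
            return None   # no route available; caller must wait or re-plan

        print(route_ion("load", "readout", blocked=("gate1",)))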

  18. AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT ...

    Science.gov (United States)

    The Automated Geospatial Watershed Assessment tool (AGWA) is a GIS interface jointly developed by the USDA Agricultural Research Service, the U.S. Environmental Protection Agency, the University of Arizona, and the University of Wyoming to automate the parameterization and execution of the Soil Water Assessment Tool (SWAT) and KINEmatic Runoff and EROSion (KINEROS2) hydrologic models. The application of these two models allows AGWA to conduct hydrologic modeling and watershed assessments at multiple temporal and spatial scales. AGWA’s current outputs are runoff (volumes and peaks) and sediment yield, plus nitrogen and phosphorus with the SWAT model. AGWA uses commonly available GIS data layers to fully parameterize, execute, and visualize results from both models. Through an intuitive interface the user selects an outlet from which AGWA delineates and discretizes the watershed using a Digital Elevation Model (DEM) based on the individual model requirements. The watershed model elements are then intersected with soils and land cover data layers to derive the requisite model input parameters. The chosen model is then executed, and the results are imported back into AGWA for visualization. This allows managers to identify potential problem areas where additional monitoring can be undertaken or mitigation activities can be focused. AGWA also has tools to apply an array of best management practices. There are currently two versions of AGWA available; AGWA 1.5 for

  19. Maneuver Automation Software

    Science.gov (United States)

    Uffelman, Hal; Goodson, Troy; Pellegrin, Michael; Stavert, Lynn; Burk, Thomas; Beach, David; Signorelli, Joel; Jones, Jeremy; Hahn, Yungsun; Attiyah, Ahlam; hide

    2009-01-01

    The Maneuver Automation Software (MAS) automates the process of generating commands for maneuvers to keep the spacecraft of the Cassini-Huygens mission on a predetermined prime mission trajectory. Before MAS became available, a team of approximately 10 members had to work about two weeks to design, test, and implement each maneuver in a process that involved running many maneuver-related application programs and then serially handing off data products to other parts of the team. MAS enables a three-member team to design, test, and implement a maneuver in about one-half hour after Navigation has process-tracking data. MAS accepts more than 60 parameters and 22 files as input directly from users. MAS consists of Practical Extraction and Reporting Language (PERL) scripts that link, sequence, and execute the maneuver- related application programs: "Pushing a single button" on a graphical user interface causes MAS to run navigation programs that design a maneuver; programs that create sequences of commands to execute the maneuver on the spacecraft; and a program that generates predictions about maneuver performance and generates reports and other files that enable users to quickly review and verify the maneuver design. MAS can also generate presentation materials, initiate electronic command request forms, and archive all data products for future reference.

  20. Automated screening for retinopathy

    Directory of Open Access Journals (Sweden)

    A. S. Rodin

    2014-07-01

    Retinal pathology is a common cause of irreversible loss of central vision among the senior population. Detection of the earliest signs of retinal disease can be facilitated by reviewing retinal images available from telemedicine networks. To support the processing of these images, screening software applications based on image recognition technology are currently at various stages of development. Purpose: To develop and implement computerized image recognition software that can be used as a decision support technology for retinal image screening for various types of retinopathy. Methods: The image recognition software was developed in C++ and tested on a dataset of 70 images with various types of pathological features (age-related macular degeneration, chorioretinitis, central serous chorioretinopathy, and diabetic retinopathy). Results: The system achieved a sensitivity of 73% and a specificity of 72%. Conclusion: Automated detection of macular lesions using the proposed software can significantly reduce the manual grading workload. In addition, automated detection of retinal lesions can be implemented as a clinical decision support system for telemedicine screening. It is anticipated that further development of this technology could become part of a diagnostic image analysis system for electronic health records.
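
    The sensitivity and specificity figures quoted above follow directly from the classifier's confusion matrix. The counts below are hypothetical, chosen only to show how such percentages are computed, and are not the study's data:

        # Hypothetical confusion-matrix counts illustrating the sensitivity/specificity arithmetic.
        tp, fn = 73, 27    # diseased images: detected / missed
        tn, fp = 72, 28    # healthy images: correctly passed / false alarms

        sensitivity = tp / (tp + fn)
        specificity = tn / (tn + fp)
        print(f"sensitivity = {sensitivity:.0%}, specificity = {specificity:.0%}")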

  1. Automation from pictures

    International Nuclear Information System (INIS)

    Kozubal, A.J.

    1992-01-01

    The state transition diagram (STD) model has been helpful in the design of real time software, especially with the emergence of graphical computer aided software engineering (CASE) tools. Nevertheless, the translation of the STD to real time code has in the past been primarily a manual task. At Los Alamos we have automated this process. The designer constructs the STD using a CASE tool (Cadre Teamwork) using a special notation for events and actions. A translator converts the STD into an intermediate state notation language (SNL), and this SNL is compiled directly into C code (a state program). Execution of the state program is driven by external events, allowing multiple state programs to effectively share the resources of the host processor. Since the design and the code are tightly integrated through the CASE tool, the design and code never diverge, and we avoid design obsolescence. Furthermore, the CASE tool automates the production of formal technical documents from the graphic description encapsulated by the CASE tool. (author)
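
    The state program that the translator emits is, in essence, an event-driven state machine: each (state, event) pair maps to a next state and an action. The Python sketch below illustrates that pattern only; the states, events, and actions are hypothetical, and the Los Alamos system generates C via SNL rather than Python.

        # Illustrative event-driven state machine of the kind an STD translates into.
        transitions = {
            ("idle",    "start_cmd"):   ("ramping", lambda: print("ramp power supply")),
            ("ramping", "at_setpoint"): ("steady",  lambda: print("enable regulation")),
            ("ramping", "fault"):       ("idle",    lambda: print("abort ramp")),
            ("steady",  "stop_cmd"):    ("idle",    lambda: print("ramp down")),
        }

        def run(events, state="idle"):
            for event in events:          # execution is driven by external events
                next_state, action = transitions.get((state, event), (state, lambda: None))
                action()
                state = next_state
            return state

        print(run(["start_cmd", "at_setpoint", "stop_cmd"]))   # -> "idle"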

  2. Automated digital magnetofluidics

    Energy Technology Data Exchange (ETDEWEB)

    Schneider, J; Garcia, A A; Marquez, M [Harrington Department of Bioengineering Arizona State University, Tempe AZ 85287-9709 (United States)], E-mail: tony.garcia@asu.edu

    2008-08-15

    Drops can be moved in complex patterns on superhydrophobic surfaces using a reconfigured computer-controlled x-y metrology stage with a high degree of accuracy, flexibility, and reconfigurability. The stage employs a DMC-4030 controller which has a RISC-based, clock multiplying processor with DSP functions, accepting encoder inputs up to 22 MHz, provides servo update rates as high as 32 kHz, and processes commands at rates as fast as 40 milliseconds. A 6.35 mm diameter cylindrical NdFeB magnet is translated by the stage causing water drops to move by the action of induced magnetization of coated iron microspheres that remain in the drop and are attracted to the rare earth magnet through digital magnetofluidics. Water drops are easily moved in complex patterns in automated digital magnetofluidics at an average speed of 2.8 cm/s over a superhydrophobic polyethylene surface created by solvent casting. With additional components, some potential uses for this automated microfluidic system include characterization of superhydrophobic surfaces, water quality analysis, and medical diagnostics.

  3. Automated digital magnetofluidics

    Science.gov (United States)

    Schneider, J.; Garcia, A. A.; Marquez, M.

    2008-08-01

    Drops can be moved in complex patterns on superhydrophobic surfaces using a reconfigured computer-controlled x-y metrology stage with a high degree of accuracy, flexibility, and reconfigurability. The stage employs a DMC-4030 controller which has a RISC-based, clock multiplying processor with DSP functions, accepting encoder inputs up to 22 MHz, provides servo update rates as high as 32 kHz, and processes commands at rates as fast as 40 milliseconds. A 6.35 mm diameter cylindrical NdFeB magnet is translated by the stage causing water drops to move by the action of induced magnetization of coated iron microspheres that remain in the drop and are attracted to the rare earth magnet through digital magnetofluidics. Water drops are easily moved in complex patterns in automated digital magnetofluidics at an average speed of 2.8 cm/s over a superhydrophobic polyethylene surface created by solvent casting. With additional components, some potential uses for this automated microfluidic system include characterization of superhydrophobic surfaces, water quality analysis, and medical diagnostics.

  4. Work Planing Automation at Mechanical Subdivision

    OpenAIRE

    Dzindzelėta, Vytautas

    2005-01-01

    Work planning automation, installation possibilities, and future outlook at the mechanical subdivision. The aim is to study how work planning has changed before and after the automation process and to analyse the automation process methodology.

  5. Link Adaptation for Mitigating Earth-To-Space Propagation Effects on the NASA SCaN Testbed

    Science.gov (United States)

    Kilcoyne, Deirdre K.; Headley, William C.; Leffke, Zach J.; Rowe, Sonya A.; Mortensen, Dale J.; Reinhart, Richard C.; McGwier, Robert W.

    2016-01-01

    In Earth-to-Space communications, well-known propagation effects such as path loss and atmospheric loss can lead to fluctuations in the strength of the communications link between a satellite and its ground station. Additionally, the typically unconsidered effect of shadowing due to the geometry of the satellite and its solar panels can also lead to link degradation. As a result of these anticipated channel impairments, NASA's communication links have been traditionally designed to handle the worst-case impact of these effects through high link margins and static, lower rate, modulation formats. The work presented in this paper aims to relax these constraints by providing an improved trade-off between data rate and link margin through utilizing link adaptation. More specifically, this work provides a simulation study on the propagation effects impacting NASA's SCaN Testbed flight software-defined radio (SDR) as well as proposes a link adaptation algorithm that varies the modulation format of a communications link as its signal-to-noise ratio fluctuates. Ultimately, the models developed in this work will be utilized to conduct real-time flight experiments on-board the NASA SCaN Testbed.
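
    A minimal sketch of threshold-based link adaptation of the kind described is shown below; the modulation set, Es/N0 thresholds, and margin are illustrative assumptions rather than the values used in the SCaN Testbed flight experiments.

        # Hedged sketch of SNR-threshold link adaptation (all thresholds illustrative).
        MODE_TABLE = [        # (minimum Es/N0 in dB, modulation, bits per symbol)
            (13.0, "8-PSK", 3),
            (7.0,  "QPSK",  2),
            (2.0,  "BPSK",  1),
        ]

        def select_mode(snr_db, margin_db=1.0):
            for threshold, modulation, bps in MODE_TABLE:
                if snr_db - margin_db >= threshold:
                    return modulation, bps
            return "BPSK", 1          # fall back to the most robust format

        for snr in (20.0, 9.0, 3.0):
            print(snr, "dB ->", select_mode(snr))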

  6. An SDR-Based Real-Time Testbed for GNSS Adaptive Array Anti-Jamming Algorithms Accelerated by GPU

    Directory of Open Access Journals (Sweden)

    Hailong Xu

    2016-03-01

    Nowadays, software-defined radio (SDR) has become a common approach to evaluating new algorithms. However, in the field of Global Navigation Satellite System (GNSS) adaptive array anti-jamming, previous work has been limited by the high computational power demanded by adaptive algorithms, and often lacks flexibility and configurability. In this paper, the design and implementation of an SDR-based real-time testbed for GNSS adaptive array anti-jamming, accelerated by a Graphics Processing Unit (GPU), are documented. This testbed distinguishes itself as a feature-rich and extendible platform with great flexibility and configurability, as well as high computational performance. Both Space-Time Adaptive Processing (STAP) and Space-Frequency Adaptive Processing (SFAP) are implemented with a wide range of parameters. Raw data from as many as eight antenna elements can be processed in real time in either an adaptive nulling or beamforming mode. To take full advantage of the parallelism provided by the GPU, a batched programming method is proposed. Tests and experiments are conducted to evaluate both the computational and anti-jamming performance. This platform can be used for research and prototyping, as well as a real product in certain applications.
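
    One standard adaptive nulling rule that such a testbed can host is minimum-variance (Capon) beamforming, w = R^{-1}s / (s^H R^{-1} s), computed per block of array snapshots. The NumPy sketch below illustrates the math and the block-wise (batched) style of processing; it is not the authors' GPU implementation, and the array size and loading factor are arbitrary.

        import numpy as np

        def mvdr_weights(snapshots, steering):
            """Minimum-variance (Capon) weights for one block of array snapshots.

            snapshots : (n_elements, n_samples) complex array samples
            steering  : (n_elements,) steering vector toward the desired satellite
            """
            n_elem, n_samp = snapshots.shape
            R = snapshots @ snapshots.conj().T / n_samp               # sample covariance
            R += 1e-3 * np.trace(R).real / n_elem * np.eye(n_elem)    # diagonal loading
            Rinv_s = np.linalg.solve(R, steering)
            return Rinv_s / (steering.conj() @ Rinv_s)

        # Block-wise use: process short snapshot blocks, as a batched GPU pipeline would.
        block = np.random.randn(8, 4096) + 1j * np.random.randn(8, 4096)
        w = mvdr_weights(block, steering=np.ones(8, dtype=complex))
        print(np.abs(w))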

  7. Using the ISS as a testbed to prepare for the next generation of space-based telescopes

    Science.gov (United States)

    Postman, Marc; Sparks, William B.; Liu, Fengchuan; Ess, Kim; Green, Joseph; Carpenter, Kenneth G.; Thronson, Harley; Goullioud, Renaud

    2012-09-01

    The infrastructure available on the ISS provides a unique opportunity to develop the technologies necessary to assemble large space telescopes. Assembling telescopes in space is a game-changing approach to space astronomy. Using the ISS as a testbed enables a concentration of resources on reducing the technical risks associated with integrating the technologies, such as laser metrology and wavefront sensing and control (WFS&C), with the robotic assembly of major components including very light-weight primary and secondary mirrors and the alignment of the optical elements to a diffraction-limited optical system in space. The capability to assemble the optical system and remove and replace components via the existing ISS robotic systems such as the Special Purpose Dexterous Manipulator (SPDM), or by the ISS Flight Crew, allows for future experimentation as well as repair if necessary. In 2015, first light will be obtained by the Optical Testbed and Integration on ISS eXperiment (OpTIIX), a small 1.5-meter optical telescope assembled on the ISS. The primary objectives of OpTIIX include demonstrating telescope assembly technologies and end-to-end optical system technologies that will advance future large optical telescopes.

  8. Report of the Interagency Optical Network Testbeds Workshop 2, NASA Ames Research Center, September 12-14, 2005

    Science.gov (United States)

    2005-01-01

    The Optical Network Testbeds Workshop 2 (ONT2), held on September 12-14, 2005, was cosponsored by the Department of Energy Office of Science (DOE/SC) and the National Aeronautics and Space Administration (NASA), in cooperation with the Joint Engineering Team (JET) of the Federal Networking and Information Technology Research and Development (NITRD) Program's Large Scale Networking (LSN) Coordinating Group. The ONT2 workshop was a follow-on to an August 2004 Workshop on Optical Network Testbeds (ONT1). ONT1 recommended actions by the Federal agencies to assure timely development and implementation of optical networking technologies and infrastructure. Hosted by the NASA Ames Research Center in Mountain View, California, the ONT2 workshop brought together representatives of the U.S. advanced research and education (R&E) networks, regional optical networks (RONs), service providers, international networking organizations, and senior engineering and R&D managers from Federal agencies and national research laboratories. Its purpose was to develop a common vision of the optical network technologies, services, infrastructure, and organizations needed to enable widespread use of optical networks; recommend activities for transitioning the optical networking research community and its current infrastructure to leading-edge optical networks over the next three to five years; and present information enabling commercial network infrastructure providers to plan for and use leading-edge optical network services in that time frame.

  9. Conceptual Design for a Dual-Bell Rocket Nozzle System Using a NASA F-15 Airplane as the Flight Testbed

    Science.gov (United States)

    Jones, Daniel S.; Ruf, Joseph H.; Bui, Trong T.; Martinez, Martel; St. John, Clinton W.

    2014-01-01

    The dual-bell rocket nozzle was first proposed in 1949, offering a potential improvement in rocket nozzle performance over the conventional-bell nozzle. Despite the performance advantages that have been predicted, both analytically and through static test data, the dual-bell nozzle has still not been adequately tested in a relevant flight environment. In 2013 a proposal was constructed that offered a National Aeronautics and Space Administration (NASA) F-15 airplane as the flight testbed, with the plan to operate a dual-bell rocket nozzle during captive-carried flight. If implemented, this capability will permit nozzle operation into an external flow field similar to that of a launch vehicle, and facilitate an improved understanding of dual-bell nozzle plume sensitivity to external flow-field effects. More importantly, this flight testbed can be utilized to help quantify the performance benefit with the dual-bell nozzle, as well as to advance its technology readiness level. Toward this ultimate goal, this report provides plans for future flights to quantify the external flow field of the airplane near the nozzle experiment, as well as details on the conceptual design for the dual-bell nozzle cold-flow propellant feed system integration within the NASA F-15 Propulsion Flight Test Fixture. The current study shows that this concept of flight research is feasible, and could result in valuable flight data for the dual-bell nozzle.

  10. Automation for mineral resource development

    Energy Technology Data Exchange (ETDEWEB)

    Norrie, A.W.; Turner, D.R. (eds.)

    1986-01-01

    A total of 55 papers were presented at the symposium under the following headings: automation and the future of mining; modelling and control of mining processes; transportation for mining; automation and the future of metallurgical processes; modelling and control of metallurgical processes; and general aspects. Fifteen papers have been abstracted separately.

  11. Opening up Library Automation Software

    Science.gov (United States)

    Breeding, Marshall

    2009-01-01

    Throughout the history of library automation, the author has seen a steady advancement toward more open systems. In the early days of library automation, when proprietary systems dominated, the need for standards was paramount since other means of inter-operability and data exchange weren't possible. Today's focus on Application Programming…

  12. Resins production: batch plant automation

    International Nuclear Information System (INIS)

    Banti, M.; Mauri, G.

    1996-01-01

    Companies that seek to automate their plants without external resources have at their disposal flexible, custom, and easy-to-use DCSs that are open towards PLCs. This article explains why Hoechst has followed this approach in automating its new resin production plants.

  13. Automated Methods of Corrosion Measurements

    DEFF Research Database (Denmark)

    Andersen, Jens Enevold Thaulov

    1997-01-01

    Mechanical control, recording, and data processing must therefore be automated to a high level of precision and reliability. These general techniques and the apparatus involved have been described extensively. The automated methods of such high-resolution microscopy coordinated with computerized...

  14. Migration monitoring with automated technology

    Science.gov (United States)

    Rhonda L. Millikin

    2005-01-01

    Automated technology can supplement ground-based methods of migration monitoring by providing: (1) unbiased and automated sampling; (2) independent validation of current methods; (3) a larger sample area for landscape-level analysis of habitat selection for stopover, and (4) an opportunity to study flight behavior. In particular, radar-acoustic sensor fusion can...

  15. Automated methods of corrosion measurement

    DEFF Research Database (Denmark)

    Andersen, Jens Enevold Thaulov; Bech-Nielsen, Gregers; Reeve, John Ch

    1997-01-01

    to revise assumptions regarding the basis of the method, which sometimes leads to the discovery of as-yet unnoticed phenomena. The present selection of automated methods for corrosion measurements is not motivated simply by the fact that a certain measurement can be performed automatically. Automation...... is applied to nearly all types of measurements today....

  16. Classification of Automated Search Traffic

    Science.gov (United States)

    Buehrer, Greg; Stokes, Jack W.; Chellapilla, Kumar; Platt, John C.

    As web search providers seek to improve both relevance and response times, they are challenged by the ever-increasing tax of automated search query traffic. Third party systems interact with search engines for a variety of reasons, such as monitoring a web site’s rank, augmenting online games, or possibly to maliciously alter click-through rates. In this paper, we investigate automated traffic (sometimes referred to as bot traffic) in the query stream of a large search engine provider. We define automated traffic as any search query not generated by a human in real time. We first provide examples of different categories of query logs generated by automated means. We then develop many different features that distinguish between queries generated by people searching for information and those generated by automated processes. We categorize these features into two classes: interpretations of the physical model of human interactions, and behavioral patterns of automated interactions. Using these detection features, we next classify the query stream using multiple binary classifiers. A multiclass classifier is then developed to identify subclasses of both normal and automated traffic. An active learning algorithm is used to suggest which user sessions to label to improve the accuracy of the multiclass classifier, while also seeking to discover new classes of automated traffic. A performance analysis is then provided. Finally, the multiclass classifier is used to predict the subclass distribution for the search query stream.
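
    The sketch below illustrates the general shape of such a classification pipeline on a synthetic two-class example; the feature names are hypothetical stand-ins for the physical-model and behavioral features described in the paper, and the random-forest classifier is an arbitrary choice rather than the one used in the study.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)
n = 5000

# Hypothetical session-level features: query rate, click-through rate,
# query-term entropy, and fraction of night-time activity.
human = np.column_stack([rng.gamma(2.0, 0.5, n), rng.beta(5, 2, n),
                         rng.normal(3.5, 0.6, n), rng.beta(2, 6, n)])
bot = np.column_stack([rng.gamma(8.0, 1.5, n), rng.beta(1, 8, n),
                       rng.normal(1.5, 0.6, n), rng.beta(5, 2, n)])

X = np.vstack([human, bot])
y = np.concatenate([np.zeros(n), np.ones(n)])        # 0 = human, 1 = automated

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te),
                            target_names=["human", "automated"]))
```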

  17. Automated Test-Form Generation

    Science.gov (United States)

    van der Linden, Wim J.; Diao, Qi

    2011-01-01

    In automated test assembly (ATA), the methodology of mixed-integer programming is used to select test items from an item bank to meet the specifications for a desired test form and optimize its measurement accuracy. The same methodology can be used to automate the formatting of the set of selected items into the actual test form. Three different…
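
    As a toy illustration of the mixed-integer programming formulation mentioned above, the sketch below selects a fixed-length form that maximizes total item information subject to one hypothetical content constraint. The item bank, constraint bounds, and choice of solver (SciPy's milp) are assumptions for demonstration only.

```python
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

rng = np.random.default_rng(0)
n_items = 200
info = rng.uniform(0.1, 1.0, n_items)           # item information at the target ability
is_algebra = rng.random(n_items) < 0.3          # hypothetical content attribute

constraints = [
    LinearConstraint(np.ones((1, n_items)), lb=40, ub=40),                   # exactly 40 items
    LinearConstraint(is_algebra.astype(float).reshape(1, -1), lb=10, ub=15)  # 10-15 algebra items
]
res = milp(c=-info,                              # milp minimizes, so negate to maximize
           constraints=constraints,
           integrality=np.ones(n_items),         # binary decision variables x_i
           bounds=Bounds(0, 1))
selected = np.flatnonzero(res.x > 0.5)
print(f"{selected.size} items selected, total information = {info[selected].sum():.2f}")
```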

  18. Automated evaluation of ultrasonic indications

    International Nuclear Information System (INIS)

    Hansch, M.K.T.; Stegemann, D.

    1994-01-01

    Future requirements of reliability and reproducibility in quality assurance demand computer evaluation of defect indications. The ultrasonic method with its large field of applications and a high potential for automation provides all preconditions for fully automated inspection. The survey proposes several desirable hardware improvements, data acquisition requirements and software configurations. (orig.) [de

  19. Automated Methods Of Corrosion Measurements

    DEFF Research Database (Denmark)

    Bech-Nielsen, Gregers; Andersen, Jens Enevold Thaulov; Reeve, John Ch

    1997-01-01

    The chapter describes the following automated measurements: Corrosion Measurements by Titration, Imaging Corrosion by Scanning Probe Microscopy, Critical Pitting Temperature and Application of the Electrochemical Hydrogen Permeation Cell.

  20. Robotics/Automated Systems Technicians.

    Science.gov (United States)

    Doty, Charles R.

    Major resources exist that can be used to develop or upgrade programs in community colleges and technical institutes that educate robotics/automated systems technicians. The first category of resources is Economic, Social, and Education Issues. The Office of Technology Assessment (OTA) report, "Automation and the Workplace," presents analyses of…

  1. Translation: Aids, Robots, and Automation.

    Science.gov (United States)

    Andreyewsky, Alexander

    1981-01-01

    Examines electronic aids to translation both as ways to automate it and as an approach to solve problems resulting from shortage of qualified translators. Describes the limitations of robotic MT (Machine Translation) systems, viewing MAT (Machine-Aided Translation) as the only practical solution and the best vehicle for further automation. (MES)

  2. Report of the Interagency Optical Network Testbeds Workshop 2, September 12-14, 2005, NASA Ames Research Center

    Energy Technology Data Exchange (ETDEWEB)

    Joe Mambretti; Richard desJardins

    2006-05-01

    A new generation of optical networking services and technologies is rapidly changing the world of communications. National and international networks are implementing optical services to supplement traditional packet routed services. On September 12-14, 2005, the Optical Network Testbeds Workshop 2 (ONT2), an invitation-only forum hosted by the NASA Research and Engineering Network (NREN) and co-sponsored by the Department of Energy (DOE), was held at NASA Ames Research Center in Mountain View, California. The aim of ONT2 was to help the Federal Large Scale Networking Coordination Group (LSN) and its Joint Engineering Team (JET) to coordinate testbed and network roadmaps describing agency and partner organization views and activities for moving toward next generation communication services based on leading edge optical networks in the 3-5 year time frame. ONT2 was conceived and organized as a sequel to the first Optical Network Testbeds Workshop (ONT1, August 2004, www.nren.nasa.gov/workshop7). ONT1 resulted in a series of recommendations to LSN. ONT2 was designed to move beyond recommendations to agree on a series of “actionable objectives” that would proactively help federal and partner optical network testbeds and advanced research and education (R&E) networks to begin incorporating technologies and services representing the next generation of advanced optical networks in the next 1-3 years. Participants in ONT2 included representatives from innovative prototype networks (Panel A), basic optical network research testbeds (Panel B), and production R&D networks (Panels C and D), including “JETnets,” selected regional optical networks (RONs), international R&D networks, commercial network technology and service providers (Panel F), and senior engineering and R&D managers from LSN agencies and partner organizations. The overall goal of ONT2 was to identify and coordinate short and medium term activities and milestones for researching, developing, identifying

  3. Radiation beamline testbeds for the simulation of planetary and spacecraft environments for human and robotic mission risk assessment

    Science.gov (United States)

    Wilkins, Richard

    The Center for Radiation Engineering and Science for Space Exploration (CRESSE) at Prairie View A&M University, Prairie View, Texas, USA, is establishing an integrated, multi-disciplinary research program on the scientific and engineering challenges faced by NASA and the international space community caused by space radiation. CRESSE focuses on space radiation research directly applicable to astronaut health and safety during future long term, deep space missions, including Martian, lunar, and other planetary body missions beyond low earth orbit. The research approach will consist of experimental and theoretical radiation modeling studies utilizing particle accelerator facilities including: 1. NASA Space Radiation Laboratory (NSRL) at Brookhaven National Laboratory; 2. Proton Synchrotron at Loma Linda University Medical Center; and 3. Los Alamos Neutron Science Center (LANSCE) at Los Alamos National Laboratory. Specifically, CRESSE investigators are designing, developing, and building experimental testbeds that simulate the lunar and Martian radiation environments for experiments focused on risk assessment for astronauts and instrumentation. The testbeds have been designated the Bioastronautics Experimental Research Testbeds for Environmental Radiation Nostrum Investigations and Education (BERT and ERNIE). The designs of BERT and ERNIE will allow for a high degree of flexibility and adaptability to modify experimental configurations to simulate planetary surface environments, planetary habitats, and spacecraft interiors. In the nominal configuration, BERT and ERNIE will consist of a set of experimental zones that will simulate the planetary atmosphere (solid CO2 in the case of the Martian surface), the planetary surface, and sub-surface regions. These experimental zones can be used for dosimetry, shielding, biological, and electronic effects radiation studies in support of space exploration missions. BERT and ERNIE are designed to be compatible with the

  4. Robust automated knowledge capture.

    Energy Technology Data Exchange (ETDEWEB)

    Stevens-Adams, Susan Marie; Abbott, Robert G.; Forsythe, James Chris; Trumbo, Michael Christopher Stefan; Haass, Michael Joseph; Hendrickson, Stacey M. Langfitt

    2011-10-01

    This report summarizes research conducted through the Sandia National Laboratories Robust Automated Knowledge Capture Laboratory Directed Research and Development project. The objective of this project was to advance scientific understanding of the influence of individual cognitive attributes on decision making. The project developed a quantitative model known as RumRunner that has proven effective in predicting the propensity of an individual to shift strategies on the basis of task and experience related parameters. Three separate studies are described which have validated the basic RumRunner model. This work provides a basis for better understanding human decision making in high-consequence national security applications and, in particular, the individual characteristics that underlie adaptive thinking.

  5. Printing quality control automation

    Science.gov (United States)

    Trapeznikova, O. V.

    2018-04-01

    One of the most important problems in standardizing the offset printing process is the control of print quality and its automation. To address this problem, software has been developed that takes into account the specifics of the printing system components and their behavior during the printing process. To characterize the distribution of the ink layer on the printed substrate, the deviation of the ink layer thickness on the sheet from the nominal surface is proposed. The geometric construction of the surface projections of the color gamut bodies makes it possible to visualize the color reproduction gamut of printing systems across brightness ranges and specific color sectors, which provides a qualitative comparison of systems by their reproduction of individual colors over varying ranges of brightness.

  6. Automated electronic filter design

    CERN Document Server

    Banerjee, Amal

    2017-01-01

    This book describes a novel, efficient and powerful scheme for designing and evaluating the performance characteristics of any electronic filter designed to predefined specifications. The author explains techniques that enable readers to eliminate complicated manual, and thus error-prone and time-consuming, steps of traditional design techniques. The presentation includes a demonstration of efficient automation, using an ANSI C language program, which accepts any filter design specification (e.g. Chebyshev low-pass filter, cut-off frequency, pass-band ripple, etc.) as input and generates as output a SPICE (Simulation Program with Integrated Circuit Emphasis) format netlist. Readers can then use this netlist to run simulations with any version of the popular SPICE simulator, increasing the accuracy of the final results without violating any of the key principles of the traditional design scheme.
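
    The book's generator is written in ANSI C; the short Python sketch below merely illustrates the idea of turning a specification into a SPICE-format netlist, here for a trivial first-order RC low-pass with an assumed 1 kOhm series resistance. It is not the author's program.

```python
import math

def rc_lowpass_netlist(cutoff_hz, r_ohms=1e3):
    """Emit a SPICE netlist for a first-order RC low-pass with the given cutoff."""
    c_farads = 1.0 / (2 * math.pi * r_ohms * cutoff_hz)   # f_c = 1 / (2*pi*R*C)
    return "\n".join([
        f"* First-order RC low-pass, fc = {cutoff_hz:g} Hz",
        "VIN in 0 AC 1",
        f"R1 in out {r_ohms:g}",
        f"C1 out 0 {c_farads:.3e}",
        f".AC DEC 20 {cutoff_hz / 100:g} {cutoff_hz * 100:g}",
        ".PRINT AC VDB(out)",
        ".END",
    ])

print(rc_lowpass_netlist(cutoff_hz=3.4e3))
```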

  7. Berkeley automated supernova search

    Energy Technology Data Exchange (ETDEWEB)

    Kare, J.T.; Pennypacker, C.R.; Muller, R.A.; Mast, T.S.; Crawford, F.S.; Burns, M.S.

    1981-01-01

    The Berkeley automated supernova search employs a computer controlled 36-inch telescope and charge coupled device (CCD) detector to image 2500 galaxies per night. A dedicated minicomputer compares each galaxy image with stored reference data to identify supernovae in real time. The threshold for detection is m_v = 18.8. We plan to monitor roughly 500 galaxies in Virgo and closer every night, and an additional 6000 galaxies out to 70 Mpc on a three night cycle. This should yield very early detection of several supernovae per year for detailed study, and reliable premaximum detection of roughly 100 supernovae per year for statistical studies. The search should be operational in mid-1982.
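
    A caricature of the real-time comparison step might look like the sketch below: subtract a registered reference image from the new CCD frame and flag pixels exceeding a noise-based threshold. Image registration, PSF matching, and flux scaling, which a real search must handle, are ignored, and all names and numbers are illustrative.

```python
import numpy as np

def find_candidates(new_frame, reference, nsigma=5.0):
    """Return pixel coordinates where the difference image exceeds nsigma * noise."""
    diff = new_frame.astype(float) - reference.astype(float)
    noise = 1.4826 * np.median(np.abs(diff - np.median(diff)))   # robust sigma (MAD)
    return np.argwhere(diff > nsigma * noise)

# Synthetic example: a flat galaxy image with one new point source.
rng = np.random.default_rng(0)
reference = 100 + rng.normal(0, 3, (64, 64))
new_frame = reference + rng.normal(0, 3, (64, 64))
new_frame[40, 22] += 60                     # the injected "supernova"
print(find_candidates(new_frame, reference))
```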

  8. Automated asteroseismic peak detections

    DEFF Research Database (Denmark)

    de Montellano, Andres Garcia Saravia Ortiz; Hekker, S.; Themessl, N.

    2018-01-01

    Space observatories such as Kepler have provided data that can potentially revolutionize our understanding of stars. Through detailed asteroseismic analyses we are capable of determining fundamental stellar parameters and reveal the stellar internal structure with unprecedented accuracy. However......, such detailed analyses, known as peak bagging, have so far been obtained for only a small percentage of the observed stars while most of the scientific potential of the available data remains unexplored. One of the major challenges in peak bagging is identifying how many solar-like oscillation modes are visible...... of detected oscillation modes. The algorithm presented here opens the possibility for detailed and automated peak bagging of the thousands of solar-like oscillators observed by Kepler....

  9. Automated Motivic Analysis

    DEFF Research Database (Denmark)

    Lartillot, Olivier

    2016-01-01

    Motivic analysis provides very detailed understanding of musical composi- tions, but is also particularly difficult to formalize and systematize. A computational automation of the discovery of motivic patterns cannot be reduced to a mere extraction of all possible sequences of descriptions...... for lossless compression. The structural complexity resulting from successive repetitions of patterns can be controlled through a simple modelling of cycles. Generally, motivic patterns cannot always be defined solely as sequences of descriptions in a fixed set of dimensions: throughout the descriptions...... of the successive notes and intervals, various sets of musical parameters may be invoked. In this chapter, a method is presented that allows for these heterogeneous patterns to be discovered. Motivic repetition with local ornamentation is detected by reconstructing, on top of “surface-level” monodic voices, longer...

  10. Berkeley automated supernova search

    International Nuclear Information System (INIS)

    Kare, J.T.; Pennypacker, C.R.; Muller, R.A.; Mast, T.S.

    1981-01-01

    The Berkeley automated supernova search employs a computer controlled 36-inch telescope and charge coupled device (CCD) detector to image 2500 galaxies per night. A dedicated minicomputer compares each galaxy image with stored reference data to identify supernovae in real time. The threshold for detection is m_v = 18.8. We plan to monitor roughly 500 galaxies in Virgo and closer every night, and an additional 6000 galaxies out to 70 Mpc on a three night cycle. This should yield very early detection of several supernovae per year for detailed study, and reliable premaximum detection of roughly 100 supernovae per year for statistical studies. The search should be operational in mid-1982

  11. (No) Security in Automation!?

    CERN Document Server

    Lüders, S

    2008-01-01

    Modern Information Technologies like Ethernet, TCP/IP, web server or FTP are nowadays increasingly used in distributed control and automation systems. Thus, information from the factory floor is now directly available at the management level (From Shop-Floor to Top-Floor) and can be manipulated from there. Despite the benefits coming with this (r)evolution, new vulnerabilities are inherited, too: worms and viruses spread within seconds via Ethernet and attackers are becoming interested in control systems. Unfortunately, control systems lack the standard security features that usual office PCs have. This contribution will elaborate on these problems, discuss the vulnerabilities of modern control systems and present international initiatives for mitigation.

  12. [Automated anesthesia record systems].

    Science.gov (United States)

    Heinrichs, W; Mönk, S; Eberle, B

    1997-07-01

    The introduction of electronic anaesthesia documentation systems was attempted as early as in 1979, although their efficient application has become reality only in the past few years. The advantages of the electronic protocol are apparent: Continuous high quality documentation, comparability of data due to the availability of a data bank, reduction in the workload of the anaesthetist and availability of additional data. Disadvantages of the electronic protocol have also been discussed in the literature. By going through the process of entering data on the course of the anaesthetic procedure on the protocol sheet, the information is mentally absorbed and evaluated by the anaesthetist. This information may, however, be lost when the data are recorded fully automatically-without active involvement on the part of the anaesthetist. Recent publications state that by using intelligent alarms and/or integrated displays manual record keeping is no longer necessary for anaesthesia vigilance. The technical design of automated anaesthesia records depends on an integration of network technology into the hospital. It will be appropriate to connect the systems to the internet, but safety requirements have to be followed strictly. Concerning the database, client server architecture as well as language standards like SQL should be used. Object oriented databases will be available in the near future. Another future goal of automated anaesthesia record systems will be using knowledge based technologies within these systems. Drug interactions, disease related anaesthetic techniques and other information sources can be integrated. At this time, almost none of the commercially available systems has matured to a point where their purchase can be recommended without reservation. There is still a lack of standards for the subsequent exchange of data and a solution to a number of ergonomic problems still remains to be found. Nevertheless, electronic anaesthesia protocols will be required in

  13. Automating the radiographic NDT process

    International Nuclear Information System (INIS)

    Aman, J.K.

    1986-01-01

    Automation, the removal of the human element in inspection, has not been generally applied to film radiographic NDT. The justification for automating is not only productivity but also reliability of results. Film remains in the automated system of the future because of its extremely high image content, approximately 8 x 10^9 bits per 14 x 17 (inch) film, the equivalent of 2200 computer floppy discs. Parts handling systems and robotics, applied in manufacturing and some NDT modalities, should now be applied to film radiographic NDT systems. Automatic film handling can be achieved with the daylight NDT film handling system. Automatic film processing is becoming the standard in industry and can be coupled to the daylight system. Robots offer the opportunity to fully automate the exposure step. Finally, computer aided interpretation appears on the horizon. A unit which laser scans a 14 x 17 (inch) film in 6 - 8 seconds can digitize film information for further manipulation and possible automatic interrogations (computer aided interpretation). The system, called FDRS (Film Digital Radiography System), is moving toward 50 micron (approximately 16 lines/mm) resolution. This is believed to meet the majority of image content needs. We expect the automated system to appear first in parts (modules) as certain operations are automated. The future will see it all come together in an automated film radiographic NDT system (author) [pt

  14. Development of small-bore, high-current-density railgun as testbed for study of plasma-materials interaction. Progress report for October 16, 2000 - May 13, 2003

    International Nuclear Information System (INIS)

    Kyekyoon, Kim-Kevin

    2003-01-01

    The present document is a final technical report summarizing the progress made during 10/16/2000 - 05/13/2003 toward the development of a small-bore railgun with transaugmentation as a testbed for investigating plasma-materials interaction

  15. Full Scale Advanced Systems Testbed (FAST): Capabilities and Recent Flight Research

    Science.gov (United States)

    Miller, Christopher

    2014-01-01

    At the NASA Armstrong Flight Research Center research is being conducted into flight control technologies that will enable the next generation of air and space vehicles. The Full Scale Advanced Systems Testbed (FAST) aircraft provides a laboratory for flight exploration of these technologies. In recent years novel but simple adaptive architectures for aircraft and rockets have been researched along with control technologies for improving aircraft fuel efficiency and control structural interaction. This presentation outlines the FAST capabilities and provides a snapshot of the research accomplishments to date. Flight experimentation allows a researcher to substantiate or invalidate their assumptions and intuition about a new technology or innovative approach. Data early in a development cycle is invaluable for determining which technology barriers are real and which ones are imagined. Data for a technology at a low TRL can be used to steer and focus the exploration and fuel rapid advances based on real world lessons learned. It is important to identify technologies that are mature enough to benefit from flight research data and not be tempted to wait until we have solved all the potential issues prior to getting some data. Sometimes a stagnated technology just needs a little real world data to get it going. One trick to getting data for low TRL technologies is finding an environment where it is okay to take risks, where occasional failure is an expected outcome. Learning how things fail is often as valuable as showing that they work. FAST has been architected to facilitate this type of testing for control system technologies, specifically novel algorithms and sensors. Rapid prototyping with a quick turnaround in a fly-fix-fly paradigm. Sometimes it's easier and cheaper to just go fly it than to analyze the problem to death. The goal is to find and test control technologies that would benefit from flight data and find solutions to the real barriers to innovation. The FAST

  16. A numerical testbed for remote sensing of aerosols, and its demonstration for evaluating retrieval synergy from a geostationary satellite constellation of GEO-CAPE and GOES-R

    International Nuclear Information System (INIS)

    Wang, Jun; Xu, Xiaoguang; Ding, Shouguo; Zeng, Jing; Spurr, Robert; Liu, Xiong; Chance, Kelly; Mishchenko, Michael

    2014-01-01

    We present a numerical testbed for remote sensing of aerosols, together with a demonstration for evaluating retrieval synergy from a geostationary satellite constellation. The testbed combines inverse (optimal-estimation) software with a forward model containing linearized code for computing particle scattering (for both spherical and non-spherical particles), a kernel-based (land and ocean) surface bi-directional reflectance facility, and a linearized radiative transfer model for polarized radiance. Calculation of gas absorption spectra uses the HITRAN (HIgh-resolution TRANsmission molecular absorption) database of spectroscopic line parameters and other trace species cross-sections. The outputs of the testbed include not only the Stokes 4-vector elements and their sensitivities (Jacobians) with respect to the aerosol single scattering and physical parameters (such as size and shape parameters, refractive index, and plume height), but also DFS (Degrees of Freedom for Signal) values for retrieval of these parameters. This testbed can be used as a tool to provide an objective assessment of the aerosol information content that can be retrieved for any constellation of (planned or real) satellite sensors and for any combination of algorithm design factors (in terms of wavelengths, viewing angles, radiance and/or polarization to be measured or used). We summarize the components of the testbed, including the derivation and validation of analytical formulae for Jacobian calculations. Benchmark calculations from the forward model are documented. In the context of NASA's Decadal Survey Mission GEO-CAPE (GEOstationary Coastal and Air Pollution Events), we demonstrate the use of the testbed to conduct a feasibility study of using polarization measurements in and around the O2 A band for the retrieval of aerosol height information from space, as well as to assess potential improvement in the retrieval of fine- and coarse-mode aerosol optical depth (AOD) through the
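
    For readers unfamiliar with the DFS metric, the sketch below evaluates it with the standard optimal-estimation expression DFS = trace(A), where the averaging kernel is A = (K^T Se^-1 K + Sa^-1)^-1 K^T Se^-1 K. The Jacobian and covariance values are random placeholders, not testbed output, and the expression is assumed (not confirmed) to match the paper's definition.

```python
import numpy as np

def degrees_of_freedom(K, S_a, S_e):
    """DFS = trace of the averaging kernel for a linear(ized) retrieval."""
    Se_inv = np.linalg.inv(S_e)
    Sa_inv = np.linalg.inv(S_a)
    KtSeK = K.T @ Se_inv @ K
    A = np.linalg.solve(KtSeK + Sa_inv, KtSeK)    # averaging kernel
    return np.trace(A)

rng = np.random.default_rng(0)
n_obs, n_state = 50, 6                      # e.g. radiances vs. aerosol parameters
K = rng.normal(size=(n_obs, n_state))       # placeholder Jacobian dI/dx
S_a = np.diag(np.full(n_state, 0.5**2))     # prior covariance (assumed)
S_e = np.diag(np.full(n_obs, 0.05**2))      # measurement-noise covariance (assumed)
print(f"DFS = {degrees_of_freedom(K, S_a, S_e):.2f} out of {n_state}")
```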

  17. [Automated analyzer of enzyme immunoassay].

    Science.gov (United States)

    Osawa, S

    1995-09-01

    Automated analyzers for enzyme immunoassay can be classified from several points of view: the kind of labeled antibodies or enzymes, the detection method, the number of tests per unit time, and the analytical time and speed per run. In practice, it is important to consider several points such as detection limits, the number of tests per unit time, analytical range, and precision. Most of the automated analyzers on the market can randomly access and measure samples. I describe recent advances in automated analyzers, reviewing their labeling antibodies and enzymes, detection methods, the number of tests per unit time, and the analytical time and speed per test.

  18. Programmable Automated Welding System (PAWS)

    Science.gov (United States)

    Kline, Martin D.

    1994-01-01

    An ambitious project to develop an advanced, automated welding system is being funded as part of the Navy Joining Center with Babcock & Wilcox as the prime integrator. This program, the Programmable Automated Welding System (PAWS), involves the integration of both planning and real-time control activities. Planning functions include the development of a graphical decision support system within a standard, portable environment. Real-time control functions include the development of a modular, intelligent, real-time control system and the integration of a number of welding process sensors. This paper presents each of these components of the PAWS and discusses how they can be utilized to automate the welding operation.

  19. An Automation Survival Guide for Media Centers.

    Science.gov (United States)

    Whaley, Roger E.

    1989-01-01

    Reviews factors that should affect the decision to automate a school media center and offers suggestions for the automation process. Topics discussed include getting the library collection ready for automation, deciding what automated functions are needed, evaluating software vendors, selecting software, and budgeting. (CLB)

  20. Environmental assessment for the Atmospheric Radiation Measurement (ARM) Program: Southern Great Plains Cloud and Radiation Testbed (CART) site

    International Nuclear Information System (INIS)

    Policastro, A.J.; Pfingston, J.M.; Maloney, D.M.; Wasmer, F.; Pentecost, E.D.

    1992-03-01

    The Atmospheric Radiation Measurement (ARM) Program is aimed at supplying improved predictive capability of climate change, particularly the prediction of cloud-climate feedback. The objective will be achieved by measuring the atmospheric radiation and physical and meteorological quantities that control solar radiation in the earth's atmosphere and using this information to test global climate and related models. The proposed action is to construct and operate a Cloud and Radiation Testbed (CART) research site in the southern Great Plains as part of the Department of Energy's Atmospheric Radiation Measurement Program whose objective is to develop an improved predictive capability of global climate change. The purpose of this CART research site in southern Kansas and northern Oklahoma would be to collect meteorological and other scientific information to better characterize the processes controlling radiation transfer on a global scale. Impacts which could result from this facility are described

  1. Results from a multi aperture Fizeau interferometer ground testbed: demonstrator for a future space-based interferometer

    Science.gov (United States)

    Baccichet, Nicola; Caillat, Amandine; Rakotonimbahy, Eddy; Dohlen, Kjetil; Savini, Giorgio; Marcos, Michel

    2016-08-01

    In the framework of the European FP7-FISICA (Far Infrared Space Interferometer Critical Assessment) program, we developed a miniaturized version of the hyper-telescope to demonstrate multi-aperture interferometry on ground. This setup would be ultimately integrated into a CubeSat platform, therefore providing the first real demonstrator of a multi aperture Fizeau interferometer in space. In this paper, we describe the optical design of the ground testbed and the data processing pipeline implemented to reconstruct the object image from interferometric data. As a scientific application, we measured the Sun diameter by fitting a limb-darkening model to our data. Finally, we present the design of a CubeSat platform carrying this miniature Fizeau interferometer, which could be used to monitor the Sun diameter over a long in-orbit period.

  2. The CELSS Antarctic Analog Project: An Advanced Life Support Testbed at the Amundsen-Scott South Pole Station, Antarctica

    Science.gov (United States)

    Straight, Christian L.; Bubenheim, David L.; Bates, Maynard E.; Flynn, Michael T.

    1994-01-01

    CELSS Antarctic Analog Project (CAAP) represents a logical solution to the multiple objectives of both the NASA and the National Science Foundation (NSF). CAAP will result in direct transfer of proven technologies and systems, proven under the most rigorous of conditions, to the NSF and to society at large. This project goes beyond, as it must, the generally accepted scope of CELSS and life support systems including the issues of power generation, human dynamics, community systems, and training. CAAP provides a vivid and starkly realistic testbed of Controlled Ecological Life Support System (CELSS) and life support systems and methods. CAAP will also be critical in the development and validation of performance parameters for future advanced life support systems.

  3. Environmental assessment for the Atmospheric Radiation Measurement (ARM) Program: Southern Great Plains Cloud and Radiation Testbed (CART) site

    Energy Technology Data Exchange (ETDEWEB)

    Policastro, A.J.; Pfingston, J.M.; Maloney, D.M.; Wasmer, F.; Pentecost, E.D.

    1992-03-01

    The Atmospheric Radiation Measurement (ARM) Program is aimed at supplying improved predictive capability of climate change, particularly the prediction of cloud-climate feedback. The objective will be achieved by measuring the atmospheric radiation and physical and meteorological quantities that control solar radiation in the earth's atmosphere and using this information to test global climate and related models. The proposed action is to construct and operate a Cloud and Radiation Testbed (CART) research site in the southern Great Plains as part of the Department of Energy's Atmospheric Radiation Measurement Program whose objective is to develop an improved predictive capability of global climate change. The purpose of this CART research site in southern Kansas and northern Oklahoma would be to collect meteorological and other scientific information to better characterize the processes controlling radiation transfer on a global scale. Impacts which could result from this facility are described.

  4. Optimal reliability design for over-actuated systems based on the MIT rule: Application to an octocopter helicopter testbed

    International Nuclear Information System (INIS)

    Chamseddine, Abbas; Theilliol, Didier; Sadeghzadeh, Iman; Zhang, Youmin; Weber, Philippe

    2014-01-01

    This paper addresses the problem of optimal reliability in over-actuated systems. Overloading an actuator decreases its overall lifetime and reduces its average performance over a long time. Therefore, performance and reliability are two conflicting requirements. While appropriate reliability is related to average loads, good performance is related to fast response and sufficient loads generated by actuators. Actuator redundancy allows us to address both performance and reliability at the same time by properly allocating desired loads among redundant actuators. The main contribution of this paper is the on-line optimization of the overall plant reliability according to a performance objective using an MIT (Massachusetts Institute of Technology) rule-based method. The effectiveness of the proposed method is illustrated through an experimental application to an octocopter helicopter testbed.
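
    For context, the MIT rule referenced in the title is the classic gradient adaptation law dtheta/dt = -gamma * e * (de/dtheta). The sketch below applies it to the textbook problem of adapting a feedforward gain so that a first-order plant tracks a reference model; it illustrates the rule itself, not the paper's reliability-aware control allocation for the octocopter.

```python
import numpy as np

# Plant: dy/dt = -a*y + k*u, reference model: dym/dt = -a*ym + k0*r.
# Control u = theta * r; the MIT rule adapts theta so that k*theta -> k0.
a, k, k0 = 1.0, 2.0, 1.0          # plant pole, unknown plant gain, desired gain
gamma, dt, T = 0.5, 1e-3, 30.0
t = np.arange(0.0, T, dt)
r = np.sign(np.sin(0.5 * t))      # square-wave command

y = ym = theta = 0.0
for ri in r:
    u = theta * ri
    e = y - ym                                # model-following error
    theta += dt * (-gamma * e * ym)           # MIT rule (ym used as sensitivity proxy)
    y += dt * (-a * y + k * u)
    ym += dt * (-a * ym + k0 * ri)

print(f"adapted theta = {theta:.3f} (ideal k0/k = {k0 / k:.3f})")
```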

  5. Home automation with Intel Galileo

    CERN Document Server

    Dundar, Onur

    2015-01-01

    This book is for anyone who wants to learn Intel Galileo for home automation and cross-platform software development. No knowledge of programming with Intel Galileo is assumed, but knowledge of the C programming language is essential.

  6. Strategic Transit Automation Research Plan

    Science.gov (United States)

    2018-01-01

    Transit bus automation could deliver many potential benefits, but transit agencies need additional research and policy guidance to make informed deployment decisions. Although funding and policy constraints may play a role, there is also a reasonable...

  7. The Evaluation of Automated Systems

    National Research Council Canada - National Science Library

    McDougall, Jeffrey

    2004-01-01

    .... The Army has recognized this change and is adapting to operate in this new environment. It has developed a number of automated tools to assist leaders in the command and control of their organizations...

  8. National Automated Conformity Inspection Process -

    Data.gov (United States)

    Department of Transportation — The National Automated Conformity Inspection Process (NACIP) Application is intended to expedite the workflow process as it pertains to the FAA Form 81 0-10 Request...

  9. Automation of the testing procedure

    International Nuclear Information System (INIS)

    Haas, H.; Fleischer, M.; Bachner, E.

    1979-01-01

    For the judgement of technologies applied and the testing of specific components of the HTR primary circuit, complex test procedures and data evaluations are required. Extensive automation of these test procedures is indispensable. (orig.) [de

  10. Automation of coal mining equipment

    Energy Technology Data Exchange (ETDEWEB)

    Yamada, Ryuji

    1986-12-25

    Major machines used in the working face include the shearer and the self-advancing frame. The shearer has been changed from a radio-controlled model to a microcomputer-operated machine, with various functions automated. In addition, a system is being developed for comprehensively examining operating conditions and natural conditions in the working face for further automation. The self-advancing frame has been modified from a sequence-controlled model to a microcomputer-aided electrohydraulic control system. In order to proceed further with automation and introduce robotics, detectors, control units and valves must be made smaller and more reliable. In the future the system will be controlled from above ground, provided that the machines in the working face are remote controlled at the gate and relevant data are transmitted above ground. Thus, an automated working face will be realized. (2 figs, 1 photo)

  11. Synthesis of Automated Vehicle Legislation

    Science.gov (United States)

    2017-10-01

    This report provides a synthesis of issues addressed by state legislation regarding automated vehicles (AV); AV technologies are rapidly evolving and many states have developed legislation to govern AV testing and deployment and to assure safety on p...

  12. Fully automated parallel oligonucleotide synthesizer

    Czech Academy of Sciences Publication Activity Database

    Lebl, M.; Burger, Ch.; Ellman, B.; Heiner, D.; Ibrahim, G.; Jones, A.; Nibbe, M.; Thompson, J.; Mudra, Petr; Pokorný, Vít; Poncar, Pavel; Ženíšek, Karel

    2001-01-01

    Vol. 66, No. 8 (2001), pp. 1299-1314 ISSN 0010-0765 Institutional research plan: CEZ:AV0Z4055905 Keywords: automated oligonucleotide synthesizer Subject RIV: CC - Organic Chemistry Impact factor: 0.778, year: 2001

  13. Automation and Human Resource Management.

    Science.gov (United States)

    Taft, Michael

    1988-01-01

    Discussion of the automation of personnel administration in libraries covers (1) new developments in human resource management systems; (2) system requirements; (3) software evaluation; (4) vendor evaluation; (5) selection of a system; (6) training and support; and (7) benefits. (MES)

  14. Tower-Based Greenhouse Gas Measurement Network Design---The National Institute of Standards and Technology North East Corridor Testbed.

    Science.gov (United States)

    Lopez-Coto, Israel; Ghosh, Subhomoy; Prasad, Kuldeep; Whetstone, James

    2017-09-01

    The North-East Corridor (NEC) Testbed project is the third of three NIST (National Institute of Standards and Technology) greenhouse gas emissions testbeds designed to advance greenhouse gas measurement capabilities. A design approach for a dense observing network combined with atmospheric inversion methodologies is described. The Advanced Research Weather Research and Forecasting Model and the Stochastic Time-Inverted Lagrangian Transport model were used to derive the sensitivity of hypothetical observations to surface greenhouse gas emissions (footprints). Unlike other network design algorithms, an iterative selection algorithm based on a k-means clustering method was applied to minimize the similarities between the temporal responses of the sites and maximize sensitivity to the urban emissions contribution. Once a network was selected, a synthetic inversion Bayesian Kalman filter was used to evaluate observing system performance. We present the performance of various measurement network configurations consisting of differing numbers of towers and tower locations. Results show that an overly compact network has decreased spatial coverage, since the spatial information added per site is then suboptimal for covering the largest possible area, whilst networks dispersed too broadly lose the capability to constrain flux uncertainties. In addition, we explore the possibility of using a very high density network of lower cost and performance sensors characterized by larger uncertainties and temporal drift. Analysis convergence is faster with a large number of observing locations, reducing the response time of the filter. Larger uncertainties in the observations imply lower values of uncertainty reduction. On the other hand, the drift is a bias in nature, which is added to the observations and therefore biases the retrieved fluxes.
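
    A highly simplified version of the clustering step might look like the sketch below: cluster candidate towers by the similarity of their footprint time series and keep one representative per cluster, so that the selected sites respond to emissions as differently as possible. The footprint matrix is random, and the plain k-means selection here stands in for, but does not reproduce, the iterative algorithm described in the paper.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
n_candidates, n_hours = 120, 500

# Placeholder footprints: each row is one candidate tower's sensitivity
# time series to emissions from the urban domain.
footprints = np.abs(rng.normal(size=(n_candidates, n_hours)))

n_sites = 12
km = KMeans(n_clusters=n_sites, n_init=10, random_state=0).fit(footprints)

# Keep, for each cluster, the candidate closest to the cluster centre.
selected = []
for c in range(n_sites):
    members = np.flatnonzero(km.labels_ == c)
    d = np.linalg.norm(footprints[members] - km.cluster_centers_[c], axis=1)
    selected.append(int(members[np.argmin(d)]))
print("selected candidate towers:", sorted(selected))
```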

  15. Anesthesiology, automation, and artificial intelligence.

    Science.gov (United States)

    Alexander, John C; Joshi, Girish P

    2018-01-01

    There have been many attempts to incorporate automation into the practice of anesthesiology, though none have been successful. Fundamentally, these failures are due to the underlying complexity of anesthesia practice and the inability of rule-based feedback loops to fully master it. Recent innovations in artificial intelligence, especially machine learning, may usher in a new era of automation across many industries, including anesthesiology. It would be wise to consider the implications of such potential changes before they have been fully realized.

  16. Virtual Machine in Automation Projects

    OpenAIRE

    Xing, Xiaoyuan

    2010-01-01

    Virtual machines, as an engineering tool, have recently been introduced into automation projects at Tetra Pak Processing System AB. The goal of this paper is to examine how to better utilize virtual machines for automation projects. This paper designs different project scenarios using virtual machines. It analyzes the installability, performance and stability of virtual machines from the test results. Technical solutions concerning virtual machines are discussed, such as the conversion with physical...

  17. Evolution of Home Automation Technology

    OpenAIRE

    Mohd. Rihan; M. Salim Beg

    2009-01-01

    In modern society home and office automation has become increasingly important, providing ways to interconnect various home appliances. This interconnection results in faster transfer of information within homes/offices, leading to better home management and improved user experience. Home Automation, in essence, is a technology that integrates various electrical systems of a home to provide enhanced comfort and security. Users are granted convenient and complete control over all the electrical home appl...

  18. Automated measuring systems. Automatisierte Messsysteme

    Energy Technology Data Exchange (ETDEWEB)

    1985-01-01

    Microprocessors have become a regular component of automated measuring systems. Experts offer their experience and basic information in 24 lectures and 10 poster presentations. The focus is on the following: automated measurement, computer and microprocessor use, sensor technology, actuator technology, communication, interfaces, man-system interaction, disturbance tolerance and availability, as well as applications. A discussion meeting is dedicated to the theme complex of sensor digital signals, sensor interfaces and sensor buses.

  19. Aprendizaje automático

    OpenAIRE

    Moreno, Antonio

    1994-01-01

    This book introduces the basic concepts of one of the most actively studied branches of artificial intelligence: machine learning. Topics covered include inductive learning, analogical reasoning, explanation-based learning, neural networks, genetic algorithms, case-based reasoning and theoretical approaches to machine learning.

  20. Safeguards through secure automated fabrication

    International Nuclear Information System (INIS)

    DeMerschman, A.W.; Carlson, R.L.

    1982-01-01

    Westinghouse Hanford Company, a prime contractor for the U.S. Department of Energy, is constructing the Secure Automated Fabrication (SAF) line for fabrication of mixed oxide breeder fuel pins. Fuel processing by automation, which provides a separation of personnel from fuel handling, will provide a means whereby advanced safeguards concepts will be introduced. Remote operations and the inter-tie between the process computer and the safeguards computer are discussed

  1. Automated sample analysis and remediation

    International Nuclear Information System (INIS)

    Hollen, R.; Settle, F.

    1995-01-01

    The Contaminant Analysis Automation Project is developing an automated chemical analysis system to address the current needs of the US Department of Energy (DOE). These needs focus on the remediation of large amounts of radioactive and chemically hazardous wastes stored, buried and still being processed at numerous DOE sites. This paper outlines the advantages of the system under development, and details the hardware and software design. A prototype system for characterizing polychlorinated biphenyls in soils is also described

  2. Manned spacecraft automation and robotics

    Science.gov (United States)

    Erickson, Jon D.

    1987-01-01

    The Space Station holds promise of being a showcase user and driver of advanced automation and robotics technology. The author addresses the advances in automation and robotics from the Space Shuttle - with its high-reliability redundancy management and fault tolerance design and its remote manipulator system - to the projected knowledge-based systems for monitoring, control, fault diagnosis, planning, and scheduling, and the telerobotic systems of the future Space Station.

  3. Home Automation and Security System

    OpenAIRE

    Surinder Kaur; Rashmi Singh; Neha Khairwal; Pratyk Jain

    2016-01-01

    Easy Home, or home automation, plays a very important role in the modern era because of its flexibility: it can be used in different places with high precision, saving money and time by reducing human effort. The prime focus of this technology is to control household equipment such as lights, fans, doors and air conditioning automatically. This research paper provides detailed information on a home automation and security system using Arduino and GSM, and on how home appliances can be controlled using an Android application....

  4. 2015 Chinese Intelligent Automation Conference

    CERN Document Server

    Li, Hongbo

    2015-01-01

    Proceedings of the 2015 Chinese Intelligent Automation Conference presents selected research papers from the CIAC’15, held in Fuzhou, China. The topics include adaptive control, fuzzy control, neural network based control, knowledge based control, hybrid intelligent control, learning control, evolutionary mechanism based control, multi-sensor integration, failure diagnosis, reconfigurable control, etc. Engineers and researchers from academia, industry and the government can gain valuable insights into interdisciplinary solutions in the field of intelligent automation.

  5. BARD: Better Automated Redistricting

    Directory of Open Access Journals (Sweden)

    Micah Altman

    2011-08-01

    BARD is the first (and, at the time of writing, only) open source software package for general redistricting and redistricting analysis. BARD provides methods to create, display, compare, edit, automatically refine, evaluate, and profile political districting plans. BARD aims to provide a framework for scientific analysis of redistricting plans and to facilitate wider public participation in the creation of new plans. BARD facilitates map creation and refinement through command-line, graphical user interface, and automatic methods. Since redistricting is a computationally complex partitioning problem not amenable to an exact optimization solution, BARD implements a variety of selectable metaheuristics that can be used to refine existing or randomly-generated redistricting plans based on user-determined criteria. Furthermore, BARD supports automated generation of redistricting plans and profiling of plans by assigning different weights to various criteria, such as district compactness or equality of population. This functionality permits exploration of trade-offs among criteria. The intent of a redistricting authority may be explored by examining these trade-offs and inferring which reasonably observable plans were not adopted. Redistricting is a computationally-intensive problem for even modest-sized states. Performance is thus an important consideration in BARD's design and implementation. The program implements performance enhancements such as evaluation caching, explicit memory management, and distributed computing across snow clusters.

  6. Automated uranium titration system

    International Nuclear Information System (INIS)

    Takahashi, M.; Kato, Y.

    1983-01-01

    An automated titration system based on the Davies-Gray method has been developed for accurate determination of uranium. The system consists of a potentiometric titrator with precise burettes, a sample changer, an electronic balance and a desk-top computer with a printer. Fifty-five titration vessels are loaded in the sample changer. The first three contain the standard solution for standardizing potassium dichromate titrant, and the next two and the last two contain the control samples for data quality assurance. The other forty-eight measurements are carried out for sixteen unknown samples. Sample solution containing about 100 mg uranium is taken in a titration vessel. At the pretreatment position, uranium (VI) is reduced to uranium (IV) by iron (II). After the valency adjustment, the vessel is transferred to the titration position. The rate of titrant addition is automatically controlled to be slower near the end-point. The last figure (0.01 mL) of the equivalent titrant volume for uranium is calculated from the potential change. The results obtained with this system on 100 mg uranium gave a precision of 0.2% (RSD,n=3) and an accuracy of better than 0.1%. Fifty-five titrations are accomplished in 10 hours. (author)
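
    The equivalence point in a potentiometric titration of this kind is commonly located where the potential changes fastest with added titrant. The sketch below finds that point on a synthetic titration curve from the maximum of the first derivative dE/dV; it is a generic illustration, not the analyzer's actual control logic.

```python
import numpy as np

# Synthetic titration curve: sigmoidal potential jump near 10.25 mL.
v = np.arange(8.0, 12.0, 0.01)                        # titrant volume, mL
e = 450 + 250 * np.tanh((v - 10.25) / 0.05)           # electrode potential, mV
e += np.random.default_rng(0).normal(0, 0.5, v.size)  # measurement noise

dEdV = np.gradient(e, v)                 # first derivative of the curve
endpoint = v[np.argmax(dEdV)]            # steepest point = equivalence point
print(f"equivalence volume ~ {endpoint:.2f} mL")
```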

  7. Automated asteroseismic peak detections

    Science.gov (United States)

    García Saravia Ortiz de Montellano, Andrés; Hekker, S.; Themeßl, N.

    2018-05-01

    Space observatories such as Kepler have provided data that can potentially revolutionize our understanding of stars. Through detailed asteroseismic analyses we are capable of determining fundamental stellar parameters and reveal the stellar internal structure with unprecedented accuracy. However, such detailed analyses, known as peak bagging, have so far been obtained for only a small percentage of the observed stars while most of the scientific potential of the available data remains unexplored. One of the major challenges in peak bagging is identifying how many solar-like oscillation modes are visible in a power density spectrum. Identification of oscillation modes is usually done by visual inspection that is time-consuming and has a degree of subjectivity. Here, we present a peak-detection algorithm especially suited for the detection of solar-like oscillations. It reliably characterizes the solar-like oscillations in a power density spectrum and estimates their parameters without human intervention. Furthermore, we provide a metric to characterize the false positive and false negative rates to provide further information about the reliability of a detected oscillation mode or the significance of a lack of detected oscillation modes. The algorithm presented here opens the possibility for detailed and automated peak bagging of the thousands of solar-like oscillators observed by Kepler.
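
    A bare-bones version of flagging candidate oscillation peaks in a power density spectrum, using a smoothed spectrum and a prominence threshold, could look like the sketch below. The synthetic spectrum and the thresholds are placeholders; the paper's algorithm additionally quantifies false positive and false negative rates, which are not reproduced here.

```python
import numpy as np
from scipy.signal import find_peaks
from scipy.ndimage import uniform_filter1d

rng = np.random.default_rng(0)
freq = np.linspace(50.0, 200.0, 6000)                # microHz
background = 10.0 / (1.0 + (freq / 80.0) ** 2) + 1.0

# Inject a few Lorentzian "oscillation modes" into the background.
spectrum = background.copy()
for nu0, height in [(110.0, 20.0), (122.0, 16.0), (134.0, 18.0)]:
    spectrum += height / (1.0 + ((freq - nu0) / 0.2) ** 2)
spectrum *= rng.exponential(1.0, freq.size)          # periodogram-like multiplicative noise

smoothed = uniform_filter1d(spectrum, size=25)       # crude boxcar smoothing
peaks, _ = find_peaks(smoothed / background, prominence=2.0)
print("candidate mode frequencies (microHz):", np.round(freq[peaks], 1))
```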

  8. Particle Accelerator Focus Automation

    Science.gov (United States)

    Lopes, José; Rocha, Jorge; Redondo, Luís; Cruz, João

    2017-08-01

    The Laboratório de Aceleradores e Tecnologias de Radiação (LATR) at the Campus Tecnológico e Nuclear of Instituto Superior Técnico (IST) has a horizontal electrostatic particle accelerator, based on the Van de Graaff machine, which is used for research in the area of material characterization. This machine produces alpha (He+) and proton (H+) beams with currents of a few μA and energies up to 2 MeV/q. Beam focusing is obtained using a cylindrical lens of the Einzel type, assembled near the high-voltage terminal. This paper describes the developed system that automatically focuses the ion beam, using a personal computer running the LabVIEW software, a multifunction input/output board and signal conditioning circuits. The focusing procedure consists of a scanning method that finds the lens bias voltage which maximizes the beam current measured on a beam-stopper target, which is used as feedback for the scanning cycle. This system, as part of a wider start-up and shut-down automation system built for this particle accelerator, brings great advantages to the operation of the accelerator by making it faster and easier to operate, requiring less human presence, and adding the possibility of full remote control under safe conditions.
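
    The scanning procedure (step the Einzel lens bias, read back the beam-stopper current, and keep the voltage that maximizes it) can be sketched as follows. This is not the LabVIEW implementation; set_lens_voltage and read_beam_current are hypothetical stand-ins for the DAC output and the current readback, and the coarse/fine scan strategy and synthetic beam response are assumptions.

```python
# Sketch of the focus-scan idea only; the hardware calls are placeholders.
import numpy as np

_state = {"v": 0.0}                       # simulated lens bias (volts)

def set_lens_voltage(volts):
    """Stand-in for the DAC output that biases the Einzel lens."""
    _state["v"] = volts

def read_beam_current():
    """Stand-in for the beam-stopper current readback (synthetic response)."""
    v = _state["v"]
    return np.exp(-0.5 * ((v - 7200.0) / 400.0) ** 2) + 0.01 * np.random.rand()

def measure(volts):
    set_lens_voltage(volts)
    return read_beam_current()

def focus_scan(v_min, v_max, coarse_step, fine_step):
    """Coarse scan of the lens bias, then a finer scan around the best point."""
    best = max(np.arange(v_min, v_max, coarse_step), key=measure)
    best = max(np.arange(best - coarse_step, best + coarse_step, fine_step), key=measure)
    set_lens_voltage(best)                # leave the lens at the optimum
    return best

print(f"lens bias set to {focus_scan(5000.0, 9000.0, 200.0, 20.0):.0f} V")
```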

  9. Particle Accelerator Focus Automation

    Directory of Open Access Journals (Sweden)

    Lopes José

    2017-08-01

    Full Text Available The Laboratório de Aceleradores e Tecnologias de Radiação (LATR) at the Campus Tecnológico e Nuclear of Instituto Superior Técnico (IST) has a horizontal electrostatic particle accelerator, based on the Van de Graaff machine, which is used for research in the area of material characterization. This machine produces alpha (He+) and proton (H+) beams with currents of a few μA and energies up to 2 MeV/q. Beam focusing is obtained using a cylindrical lens of the Einzel type, assembled near the high-voltage terminal. This paper describes the developed system that automatically focuses the ion beam, using a personal computer running the LabVIEW software, a multifunction input/output board and signal conditioning circuits. The focusing procedure consists of a scanning method that finds the lens bias voltage which maximizes the beam current measured on a beam-stopper target, which is used as feedback for the scanning cycle. This system, as part of a wider start-up and shut-down automation system built for this particle accelerator, brings great advantages to the operation of the accelerator by making it faster and easier to operate, requiring less human presence, and adding the possibility of full remote control under safe conditions.

  10. Automated ISS Flight Utilities

    Science.gov (United States)

    Offermann, Jan Tuzlic

    2016-01-01

    During my internship at NASA Johnson Space Center, I worked in the Space Radiation Analysis Group (SRAG), where I was tasked with a number of projects focused on the automation of tasks and activities related to the operation of the International Space Station (ISS). Because I worked on several projects, I have written short sections below describing each, followed by more general remarks on the internship experience. My first project is titled "General Exposure Representation EVADOSE", also known as "GEnEVADOSE". This project involved the design and development of a C++/ROOT framework focused on radiation exposure for extravehicular activity (EVA) planning for the ISS. The utility helps mission managers plan EVAs by displaying information on the cumulative radiation doses that crew will receive during an EVA as a function of the egress time and duration of the activity. SRAG uses a utility called EVADOSE, employing a model of the space radiation environment in low Earth orbit to predict these doses, since while outside the ISS the astronauts have less shielding from charged particles such as electrons and protons. However, EVADOSE output is cumbersome to work with, and prior to GEnEVADOSE, querying data and producing graphs of ISS trajectories and cumulative doses versus egress time required manual work in Microsoft Excel. GEnEVADOSE automates all of this work, reading in EVADOSE output file(s) along with a user-supplied plaintext file of input parameters. GEnEVADOSE outputs a text file containing all the necessary dosimetry for each proposed EVA egress time, for each specified EVADOSE file. It also plots cumulative dose versus egress time and the ISS trajectory, and displays all of this information in an auto-generated presentation made in LaTeX. New features have also been added, such as best-case scenarios (egress times corresponding to the least dose), interpolated curves for trajectories, and the ability to query any time in the

  11. Automated Supernova Discovery (Abstract)

    Science.gov (United States)

    Post, R. S.

    2015-12-01

    (Abstract only) We are developing a system of robotic telescopes for automatic recognition of supernovae as well as other transient events, in collaboration with the Puckett Supernova Search Team. At the SAS2014 meeting, the discovery program, SNARE, was first described. Since then, it has been continuously improved to handle searches under a wide variety of atmospheric conditions. Currently, two telescopes are used to build a reference library while searching for PSNs with a partial library. Since data are taken every cloud-free night, we must deal with varying atmospheric conditions and high background illumination from the moon. The software is configured to identify a PSN, reshoot for verification, and optionally change the run plan to acquire photometric or spectrographic data. The telescopes are 24-inch CDK24s, with Alta U230 cameras, one in CA and one in NM. Images and run plans are sent between sites so the CA telescope can search while photometry is done in NM. Our goal is to find bright PSNs of magnitude 17.5 or brighter, which is the limit of our planned spectroscopy. We present results from our first automated PSN discoveries and plans for PSN data acquisition.

  12. Utilizing the ISS Mission as a Testbed to Develop Cognitive Communications Systems

    Science.gov (United States)

    Jackson, Dan

    2016-01-01

    The ISS provides an excellent opportunity for pioneering artificial intelligence software to meet the challenges of real-time communications (comm) link management. This opportunity empowers the ISS Program to forge a testbed for developing cognitive communications systems for the benefit of the ISS mission, manned Low Earth Orbit (LEO) science programs and future planetary exploration programs. In November 1998, the Flight Operations Directorate (FOD) started the ISS Antenna Manager (IAM) project to develop a single processor supporting multiple comm satellite tracking for two different antenna systems. Further, the processor was developed to be highly adaptable as it supported the ISS mission through all assembly stages. The ISS mission mandated communications specialists with complete knowledge of when the ISS was about to lose or gain comm link service; this specialty now demands awareness of the large sun-tracking solar arrays and thermal management panels in addition to the highly dynamic satellite service schedules and rise/set tables. This mission requirement makes the ISS the ideal communications management analogue for future LEO space station and long-duration planetary exploration missions. Future missions, with their precision-pointed, dynamic, laser-based comm links, require complete autonomy for managing high-data-rate communications systems. Development of cognitive communications management systems that permit any crew member or payload science specialist, regardless of experience level, to control communications is one of the greater benefits the ISS can offer new space exploration programs. The IAM project met a new mission requirement never previously levied against US space-borne communications systems management: process and display the orientation of large solar arrays and thermal control panels based on real-time joint angle telemetry. However, IAM leaves the actual communications availability assessment to human judgement, which introduces

  13. 77 FR 48527 - National Customs Automation Program (NCAP) Test Concerning Automated Commercial Environment (ACE...

    Science.gov (United States)

    2012-08-14

    ... National Customs Automation Program (NCAP) test concerning the simplified entry functionality in the... DEPARTMENT OF HOMELAND SECURITY U.S. Customs and Border Protection National Customs Automation Program (NCAP) Test Concerning Automated Commercial Environment (ACE) Simplified Entry: Modification of...

  14. SAIL: automating interlibrary loan.

    Science.gov (United States)

    Lacroix, E M

    1994-01-01

    The National Library of Medicine (NLM) initiated the System for Automated Interlibrary Loan (SAIL) pilot project to study the feasibility of using imaging technology linked to the DOCLINE system to deliver copies of journal articles. During the project, NLM converted a small number of print journal issues to electronic form, linking the captured articles to the MEDLINE citation unique identifier. DOCLINE requests for these journals that could not be filled by network libraries were routed to SAIL. Nearly 23,000 articles from sixty-four journals recently selected for indexing in Index Medicus were scanned to convert them to electronic images. During fiscal year 1992, 4,586 scanned articles were used to fill 10,444 interlibrary loan (ILL) requests, and more than half of these were used only once. Eighty percent of all the articles were not requested at all. The total cost per article delivered was $10.76, substantially more than it costs to process a photocopy request. Because conversion costs were the major component of the total SAIL cost, and most of the articles captured for the project were not requested, this model was not cost-effective. Data on SAIL journal article use was compared with all ILL requests filled by NLM for the same period. Eighty-eight percent of all articles requested from NLM were requested only once. The results of the SAIL project demonstrated that converting journal articles to electronic images and storing them in anticipation of repeated requests would not meet NLM's objective to improve interlibrary loan. PMID:8004020

  15. AUTOMATED ANALYSIS OF BREAKERS

    Directory of Open Access Journals (Sweden)

    E. M. Farhadzade

    2014-01-01

    Full Text Available Breakers are items of electric power system equipment whose reliability strongly influences the reliability of power plants. In particular, breakers determine the structural reliability of the switchgear circuits of power stations and network substations. Failure of a breaker to clear a short circuit, followed by failure of the backup unit or of the long-distance protection system, quite often leads to a system emergency. The problem of improving breaker reliability and reducing maintenance expenses is becoming ever more urgent as the cost of maintaining and repairing oil and air-blast circuit breakers rises systematically. The main way to address this problem is to improve diagnostic control methods and to organize condition-based maintenance. This, however, requires a large amount of statistical information about the nameplate data and operating conditions of breakers, their failures, testing and repairs, as well as advanced software based on modern computer technologies and a dedicated automated information system (AIS). A new AIS, named AISV, was developed at the "Reliability of power equipment" department of the AzRDSI of Energy. The main features of AISV are: to ensure database security and accuracy; to systematically check that breakers conform to their operating conditions; to estimate individual reliability values and how they change for a given combination of characteristics; and to provide the personnel responsible for the technical maintenance of breakers not only with information but also with methodological support, including recommendations for solving the given problem and advanced methods for implementing them.

  16. Automation of solar plants

    Energy Technology Data Exchange (ETDEWEB)

    Yebra, L.J.; Romero, M.; Martinez, D.; Valverde, A. [CIEMAT - Plataforma Solar de Almeria, Tabernas (Spain); Berenguel, M. [Almeria Univ. (Spain). Departamento de Lenguajes y Computacion

    2004-07-01

    This work overviews some of the main activities and research lines that are being carried out within the scope of the specific collaboration agreement between the Plataforma Solar de Almeria-CIEMAT (PSA-CIEMAT) and the Automatic Control, Electronics and Robotics research group of the Universidad de Almeria (TEP197), titled ''Development of control systems and tools for thermosolar plants'', and the projects financed by the MCYT DPI2001-2380-C02-02 and DPI2002-04375-C03. The research is driven by the need to improve the efficiency of processes in which the energy provided by the sun is totally or partially used as the energy source, as well as by the need to reduce the costs associated with the operation and maintenance of the installations that use this energy source. The final objective is to develop different automatic control systems and techniques aimed at improving the competitiveness of solar plants. The paper summarizes different objectives and automatic control approaches that are being implemented in different facilities at the PSA-CIEMAT: central receiver systems and the solar furnace. For each of these facilities, a systematic procedure is being followed, composed of several steps: (i) development of dynamic models using the newest modeling technologies (both for simulation and control purposes), (ii) development of fully automated data acquisition and control systems, including software tools facilitating the analysis of data and the application of knowledge to the controlled plants, and (iii) synthesis of advanced controllers using techniques successfully applied in the process industry, together with the development of new and optimized control algorithms for solar plants. These aspects are summarized in this work. (orig.)

  17. Development of an automated data acquisition and processing pipeline using multiple telescopes for observing transient phenomena

    Science.gov (United States)

    Savant, Vaibhav; Smith, Niall

    2016-07-01

    We report on the current status of the development of a pilot automated data acquisition and reduction pipeline based around the operation of two nodes of remotely operated robotic telescopes, based in California, USA and Cork, Ireland. The observatories are primarily used as a testbed for automation and instrumentation and as a tool to facilitate STEM (Science Technology Engineering Mathematics) promotion. The Ireland node is situated at Blackrock Castle Observatory (operated by Cork Institute of Technology) and consists of two optical telescopes, 6" and 16" OTAs housed in two separate domes, while the node in California is a replica of the 6" telescope. Together they form a pilot Telescope ARrAy known as TARA. QuickPhot is an automated data reduction pipeline designed primarily to shed more light on the microvariability of blazars, employing precision optical photometry and using data from the TARA telescopes as they constantly monitor predefined targets whenever observing conditions are favourable. After carrying out aperture photometry, if any variability above a given threshold is observed, the reporting telescope will communicate the source concerned and the other nodes will follow up with multi-band observations, taking advantage of the fact that they are located in strategically separated time zones. Ultimately we wish to investigate the applicability of Shock-in-Jet and Geometric models, which try to explain the processes at work in AGNs that result in the formation of jets, by looking for temporal and spectral variability in TARA multi-band observations. We are also experimenting with a Two-channel Optical PHotometric Imaging CAMera (TOΦCAM) that we have developed and optimised for simultaneous two-band photometry on our 16" OTA.
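
    A minimal sketch of the kind of trigger logic described above (differential photometry against comparison stars, with follow-up requested when the target departs from its recent baseline) is given below. This is not QuickPhot itself; the threshold, the flux values and the helper names are assumptions.

```python
# Hedged sketch of a variability trigger for differential photometry;
# numbers and function names are illustrative, not the pipeline's own.
import numpy as np

def differential_mag(target_flux, comparison_fluxes):
    """Target magnitude relative to the mean of the comparison-star ensemble."""
    return -2.5 * np.log10(target_flux / np.mean(comparison_fluxes))

def should_trigger(history, new_dmag, threshold_mag=0.1):
    """Flag variability if the new point deviates from the recent median."""
    if len(history) < 5:                      # need a baseline first
        return False
    return abs(new_dmag - np.median(history)) > threshold_mag

history = [0.52, 0.53, 0.51, 0.52, 0.54, 0.53]   # previous differential magnitudes
new = differential_mag(target_flux=8.2e3, comparison_fluxes=[1.1e4, 9.8e3, 1.05e4])
if should_trigger(history, new):
    print("variability above threshold -> request multi-band follow-up")
```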

  18. A Ground Testbed to Advance US Capability in Autonomous Rendezvous and Docking Project

    Science.gov (United States)

    D'Souza, Chris

    2014-01-01

    This project will advance the Autonomous Rendezvous and Docking (AR&D) GNC system by testing it on hardware, particularly a flight processor, with the goal of demonstrating it in IPAS with the Waypoint L2 AR&D scenario. The entire Agency supports development of a Commodity for Autonomous Rendezvous and Docking (CARD) as outlined in the Agency-wide Community of Practice whitepaper entitled "A Strategy for the U.S. to Develop and Maintain a Mainstream Capability for Automated/Autonomous Rendezvous and Docking in Low Earth Orbit and Beyond". The whitepaper establishes that 1) the US is in a continual state of AR&D point-designs and therefore there is no US "off-the-shelf" AR&D capability in existence today, 2) the US has fallen behind its foreign counterparts, particularly in the autonomy of AR&D systems, 3) development of an AR&D commodity is a national need that would benefit NASA, our commercial partners, and the DoD, and 4) an initial estimate indicates that the development of a standardized AR&D capability could save the US approximately $60M for each AR&D project and cut each project's AR&D flight system implementation time in half.

  19. A Numerical Testbed for Remote Sensing of Aerosols, and its Demonstration for Evaluating Retrieval Synergy from a Geostationary Satellite Constellation of GEO-CAPE and GOES-R

    Science.gov (United States)

    Wang, Jun; Xu, Xiaoguang; Ding, Shouguo; Zeng, Jing; Spurr, Robert; Liu, Xiong; Chance, Kelly; Mishchenko, Michael I.

    2014-01-01

    We present a numerical testbed for remote sensing of aerosols, together with a demonstration for evaluating retrieval synergy from a geostationary satellite constellation. The testbed combines inverse (optimal-estimation) software with a forward model containing linearized code for computing particle scattering (for both spherical and non-spherical particles), a kernel-based (land and ocean) surface bi-directional reflectance facility, and a linearized radiative transfer model for polarized radiance. Calculation of gas absorption spectra uses the HITRAN (HIgh-resolution TRANsmission molecular absorption) database of spectroscopic line parameters and other trace-species cross-sections. The outputs of the testbed include not only the Stokes 4-vector elements and their sensitivities (Jacobians) with respect to the aerosol single-scattering and physical parameters (such as size and shape parameters, refractive index, and plume height), but also DFS (Degrees of Freedom for Signal) values for the retrieval of these parameters. This testbed can be used as a tool to provide an objective assessment of the aerosol information content that can be retrieved for any constellation of (planned or real) satellite sensors and for any combination of algorithm design factors (in terms of wavelengths, viewing angles, and radiance and/or polarization to be measured or used). We summarize the components of the testbed, including the derivation and validation of analytical formulae for Jacobian calculations. Benchmark calculations from the forward model are documented. In the context of NASA's Decadal Survey mission GEO-CAPE (GEOstationary Coastal and Air Pollution Events), we demonstrate the use of the testbed to conduct a feasibility study of using polarization measurements in and around the O2 A band for the retrieval of aerosol height information from space, as well as to assess the potential improvement in the retrieval of fine- and coarse-mode aerosol optical depth (AOD) through the
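
    To make the DFS quantity mentioned above concrete, the sketch below evaluates the standard optimal-estimation expression for the averaging kernel, A = (K^T Se^-1 K + Sa^-1)^-1 K^T Se^-1 K, and takes its trace. This is generic textbook optimal estimation, not the testbed's own code; the toy Jacobian and covariance values are invented.

```python
# Standard optimal-estimation DFS calculation, shown only to illustrate the
# quantity reported by the testbed; the matrices below are toy numbers.
import numpy as np

def degrees_of_freedom_for_signal(K, Se, Sa):
    """DFS = trace of the averaging kernel A = (K^T Se^-1 K + Sa^-1)^-1 K^T Se^-1 K."""
    Se_inv = np.linalg.inv(Se)
    Sa_inv = np.linalg.inv(Sa)
    gain_core = K.T @ Se_inv @ K
    A = np.linalg.solve(gain_core + Sa_inv, gain_core)
    return np.trace(A)

# Toy case: 4 measurements (e.g. radiances/polarization), 3 aerosol parameters
# (e.g. AOD, fine-mode fraction, layer height).
K = np.array([[0.8, 0.1, 0.3],
              [0.2, 0.9, 0.1],
              [0.5, 0.4, 0.6],
              [0.1, 0.2, 0.7]])
Se = 0.01 * np.eye(4)        # measurement noise covariance
Sa = 1.0 * np.eye(3)         # a priori covariance
print(f"DFS = {degrees_of_freedom_for_signal(K, Se, Sa):.2f}")  # close to 3 here
```

    Adding or removing sensors simply changes the rows of K, which is how a constellation study can compare the information content of different instrument combinations.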

  20. Layered distributed architecture for plant automation

    International Nuclear Information System (INIS)

    Aravamuthan, G.; Verma, Yachika; Ranjan, Jyoti; Chachondia, Alka S.; Ganesh, G.

    2005-01-01

    The development of plant automation systems and associated software remains one of the greatest challenges to the widespread implementation of highly adaptive, re-configurable automation technology. This paper presents a layered distributed architecture for a plant automation system designed to support rapid reconfiguration and redeployment of automation components. The paper first traces the evolution of automation architectures and their associated environments over the past few decades, then presents the concept of a layered system architecture and the use of automation components to support the construction of a wide variety of automation systems. It also highlights the role of standards and technologies that can be used in the development of automation components. We have attempted to adhere to open standards and technologies in developing the automation components at the various layers. The paper also highlights the application of this concept in the development of an Operator Information System (OIS) for the Advanced Heavy Water Reactor (AHWR). (author)
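
    As a loose illustration of the layered idea (each layer exposing a narrow interface so components can be reconfigured or redeployed without touching the others), consider the sketch below. The layer names, interfaces and the simulated field I/O are assumptions for the example, not the architecture actually described in the paper.

```python
# Illustrative only: a layered arrangement in which each layer depends solely
# on the interface of the layer below, so components can be swapped out.
from abc import ABC, abstractmethod

class FieldLayer(ABC):
    """Lowest layer: talks to sensors and actuators."""
    @abstractmethod
    def read(self, tag: str) -> float: ...

class ControlLayer:
    """Middle layer: applies validation/control logic on top of field data."""
    def __init__(self, field: FieldLayer):
        self.field = field
    def validated_value(self, tag: str, low: float, high: float) -> float:
        value = self.field.read(tag)
        if not (low <= value <= high):
            raise ValueError(f"{tag} out of range: {value}")
        return value

class OperatorInformationLayer:
    """Top layer: presents plant data to operators (cf. an OIS)."""
    def __init__(self, control: ControlLayer):
        self.control = control
    def display(self, tag: str) -> str:
        return f"{tag}: {self.control.validated_value(tag, 0.0, 100.0):.1f}"

class SimulatedField(FieldLayer):
    """Stand-in for real I/O drivers, so the sketch runs without hardware."""
    def read(self, tag: str) -> float:
        return 42.0

print(OperatorInformationLayer(ControlLayer(SimulatedField())).display("coolant_temp"))
```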