WorldWideScience

Sample records for ground-based automatic generation

  1. System Identification and Automatic Mass Balancing of Ground-Based Three-Axis Spacecraft Simulator

    Science.gov (United States)

    2006-08-01

System Identification and Automatic Mass Balancing of Ground-Based Three-Axis Spacecraft Simulator, Jae-Jun Kim and Brij N. Agrawal, Department of… …and Dynamics, Vol. 20, No. 4, July-August 1997, pp. 625-632. Schwartz, J. L. and Hall, C. D., "System Identification of a Spherical Air-Bearing…

  2. Automatic Dance Lesson Generation

    Science.gov (United States)

    Yang, Yang; Leung, H.; Yue, Lihua; Deng, LiQun

    2012-01-01

    In this paper, an automatic lesson generation system is presented which is suitable in a learning-by-mimicking scenario where the learning objects can be represented as multiattribute time series data. The dance is used as an example in this paper to illustrate the idea. Given a dance motion sequence as the input, the proposed lesson generation…

  3. Ground-based Measurements of Next Generation Spectroradiometric Standard Stars

    Science.gov (United States)

    McGraw, John T.

    2013-01-01

Accurate radiometric standards are essential to the future of ground- and space-based astronomy and astrophysics. While astronomers tend to think of "standard stars" as available calibration sources, progress at NIST in accurately calibrating inexpensive, easy-to-use photodiode detectors as spectroradiometric standards from 200 nm to 1800 nm allows referencing astronomical measurements to these devices. Direction-, time-, and wavelength-dependent transmission of Earth's atmosphere is the single largest source of error for ground-based radiometric measurement of astronomical objects. Measurements and impacts of atmospheric extinction - scattering and absorption - on imaging radiometric and spectroradiometric measurements are described. The conclusion is that accurate real-time measurement of extinction in the column of atmosphere through which standard star observations are made, over the spectral region being observed and over the field of view of the telescope, is required. New techniques to directly and simultaneously measure extinction in the column of atmosphere through which observations are made are therefore needed. Our direct extinction measurement solution employs three small facility-class instruments working in parallel: a lidar to measure rapidly time-variable transmission at three wavelengths with an uncertainty of 0.25% per airmass, a spectrophotometer to measure rapidly wavelength-variable extinction with sub-1% precision per nanometer resolution element from 350 to 1050 nm, and a wide-field camera to measure angularly variable extinction over the field of view. These instruments and their operation will be described. We assert that application of atmospheric metadata provided by this instrument suite corrects for a significant fraction of the systematic errors currently limiting radiometric precision, and provides a major step towards measurements that are provably dominated by random noise.
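
The extinction correction this abstract calls for follows the standard Bouguer (Beer-Lambert) law; the sketch below is a generic illustration of that law, not the instrument suite's actual pipeline, and the coefficient and flux values are made up for the example.

```python
def correct_for_extinction(flux_obs, k, airmass):
    """Remove atmospheric extinction from an observed flux.

    Bouguer (Beer-Lambert) law in astronomical form: m_obs = m_top + k * X,
    where k is the extinction coefficient (mag per airmass) and X the airmass,
    so the flux above the atmosphere is flux_obs * 10**(0.4 * k * X).
    """
    return flux_obs * 10.0 ** (0.4 * k * airmass)

# Illustrative values: k = 0.15 mag/airmass, X = 1.2 airmasses.
flux_top = correct_for_extinction(1000.0, k=0.15, airmass=1.2)  # ~1180.3
```

For scale, the 0.25% per-airmass transmission uncertainty quoted for the lidar corresponds to about 2.5·log10(1.0025) ≈ 0.003 mag of photometric error at unit airmass.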

  4. First-generation Science Cases for Ground-based Terahertz Telescopes

    CERN Document Server

    Hirashita, Hiroyuki; Matsushita, Satoki; Takakuwa, Shigehisa; Nakamura, Masanori; Asada, Keiichi; Liu, Hauyu Baobab; Urata, Yuji; Wang, Ming-Jye; Wang, Wei-Hao; Takahashi, Satoko; Tang, Ya-Wen; Chang, Hsian-Hong; Huang, Kuiyun; Morata, Oscar; Otsuka, Masaaki; Lin, Kai-Yang; Tsai, An-Li; Lin, Yen-Ting; Srinivasan, Sundar; Martin-Cocher, Pierre; Pu, Hung-Yi; Kemper, Francisca; Patel, Nimesh; Grimes, Paul; Huang, Yau-De; Han, Chih-Chiang; Huang, Yen-Ru; Nishioka, Hiroaki; Lin, Lupin Chun-Che; Zhang, Qizhou; Keto, Eric; Burgos, Roberto; Chen, Ming-Tang; Inoue, Makoto; Ho, Paul T P

    2015-01-01

    Ground-based observations at terahertz (THz) frequencies are a newly explorable area of astronomy for the next ten years. We discuss science cases for a first-generation 10-m class THz telescope, focusing on the Greenland Telescope as an example of such a facility. We propose science cases and provide quantitative estimates for each case. The largest advantage of ground-based THz telescopes is their higher angular resolution (~ 4 arcsec for a 10-m dish), as compared to space or airborne THz telescopes. Thus, high-resolution mapping is an important scientific argument. In particular, we can isolate zones of interest for Galactic and extragalactic star-forming regions. The THz windows are suitable for observations of high-excitation CO lines and [N II] 205 um lines, which are scientifically relevant tracers of star formation and stellar feedback. Those lines are the brightest lines in the THz windows, so that they are suitable for the initiation of ground-based THz observations. THz polarization of star-forming...
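
The quoted ~4 arcsec resolution for a 10-m dish is consistent with the Rayleigh diffraction limit near the top of the THz windows; a quick check using the generic formula (not taken from the paper):

```python
C = 299_792_458.0          # speed of light, m/s
ARCSEC_PER_RAD = 206_265.0

def diffraction_limit_arcsec(freq_hz, dish_m):
    """Rayleigh-criterion angular resolution: theta ~ 1.22 * lambda / D."""
    wavelength = C / freq_hz
    return 1.22 * wavelength / dish_m * ARCSEC_PER_RAD

# A 10-m dish observing near 2 THz resolves roughly 3.8 arcsec,
# in line with the ~4 arcsec figure quoted above.
theta = diffraction_limit_arcsec(2.0e12, 10.0)
```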

  5. Ground-based acoustic parametric generator impact on the atmosphere and ionosphere in an active experiment

    Science.gov (United States)

    Rapoport, Yuriy G.; Cheremnykh, Oleg K.; Koshovy, Volodymyr V.; Melnik, Mykola O.; Ivantyshyn, Oleh L.; Nogach, Roman T.; Selivanov, Yuriy A.; Grimalsky, Vladimir V.; Mezentsev, Valentyn P.; Karataeva, Larysa M.; Ivchenko, Vasyl. M.; Milinevsky, Gennadi P.; Fedun, Viktor N.; Tkachenko, Eugen N.

    2017-01-01

We develop the theoretical basis of active experiments with two beams of acoustic waves radiated by a ground-based sound generator. These beams are transformed into atmospheric acoustic gravity waves (AGWs), whose parameters enable them to penetrate to the altitudes of the ionospheric E and F regions, where they influence the electron concentration of the ionosphere. Acoustic waves are generated by the ground-based parametric sound generator (PSG) at two close frequencies. The main idea of the experiment is to design the output parameters of the PSG to build a cascade scheme of nonlinear wave frequency downshift transformations, providing the necessary conditions for vertical propagation and penetration to ionospheric altitudes. The PSG generates sound waves (SWs) with frequencies f1 = 600 and f2 = 625 Hz and large amplitudes (100-420 m s-1). Each of these waves is modulated with a frequency of 0.016 Hz. The novelty of the proposed analytical-numerical model lies in its simultaneous accounting for nonlinearity, diffraction, losses, and dispersion, and its inclusion of the two-stage transformation (1) of the initial acoustic waves to the acoustic wave with the difference frequency Δf = f2 - f1 in the altitude range 0-0.1 km, in a strongly nonlinear regime, and (2) of the acoustic wave with the difference frequency to atmospheric acoustic gravity waves with the modulational frequency in the altitude range 0.1-20 km, which then reach the altitudes of the ionospheric E and F regions, in a practically linear regime. AGWs, nonlinearly transformed from the sound waves launched by the two-frequency ground-based sound generator, can increase the transparency of the ionosphere for electromagnetic waves in the HF (MHz) and VLF (kHz) ranges. The developed theoretical model can be used for interpreting an active experiment that includes the PSG impact on the atmosphere-ionosphere system, measurements of electromagnetic and acoustic fields, study of

  6. The DKIST Data Center: Meeting the Data Challenges for Next-Generation, Ground-Based Solar Physics

    Science.gov (United States)

    Davey, A. R.; Reardon, K.; Berukoff, S. J.; Hays, T.; Spiess, D.; Watson, F. T.; Wiant, S.

    2016-12-01

The Daniel K. Inouye Solar Telescope (DKIST) is under construction on the summit of Haleakalā in Maui, and scheduled to start science operations in 2020. The DKIST design includes a four-meter primary mirror coupled to an adaptive optics system, and a flexible instrumentation suite capable of delivering high-resolution optical and infrared observations of the solar chromosphere, photosphere, and corona. Through investigator-driven science proposals, the facility will generate an average of 8 TB of data daily, comprising millions of images and hundreds of millions of metadata elements. The DKIST Data Center is responsible for the long-term curation and calibration of data received from the DKIST, and for distributing it to the user community for scientific use. Two key elements necessary to meet the inherent big data challenge are the development of flexible public/private cloud computing and coupled relational and non-relational data storage mechanisms. We discuss how this infrastructure is being designed to meet the significant expectation of automatic and manual calibration of ground-based solar physics data, and the maximization of the data's utility through efficient, long-term data management practices implemented with prudent process definition and technology exploitation.

  7. Automatic generation of tourist brochures

    KAUST Repository

    Birsak, Michael

    2014-05-01

We present a novel framework for the automatic generation of tourist brochures that include routing instructions and additional information presented in the form of so-called detail lenses. The first contribution of this paper is the automatic creation of layouts for the brochures. Our approach is based on the minimization of an energy function that combines multiple goals: positioning of the lenses as close as possible to the corresponding region shown in an overview map, keeping the number of lenses low, and an efficient numbering of the lenses. The second contribution is a route-aware simplification of the graph of streets used for traveling between the points of interest (POIs). This is done by reducing the graph consisting of all shortest paths through the minimization of an energy function. The output is a subset of street segments that enable traveling between all the POIs without considerable detours, while at the same time guaranteeing a clutter-free visualization.
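
As a toy illustration of the kind of combined layout energy described above (lens-to-region distance plus a lens-count penalty), with made-up weights that are not taken from the paper:

```python
import math

def layout_energy(lenses, w_dist=1.0, w_count=5.0):
    """Score a brochure layout. `lenses` is a list of
    (lens_x, lens_y, region_x, region_y) tuples.

    Combines two of the goals named in the abstract: keep each lens
    close to the map region it magnifies, and keep the number of
    lenses low. Weights are illustrative only.
    """
    dist = sum(math.hypot(lx - rx, ly - ry) for lx, ly, rx, ry in lenses)
    return w_dist * dist + w_count * len(lenses)

# A layout whose lenses sit exactly on their regions pays only the count penalty.
assert layout_energy([(0, 0, 0, 0), (1, 1, 1, 1)]) == 10.0
```

A real optimizer would then search over candidate layouts for the one minimizing this score; the paper's actual energy also rewards an efficient numbering of the lenses.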

  8. NASA's Next Generation Sunphotometer for Ground-based Remote Sensing Applications

    Science.gov (United States)

    Shu, Peter K.; Miko, Laddawan; Bly, Vince T.; Chiao, Meng P.; Jones, Hollis H.; Kahle, Duncan M.

    2005-01-01

Atmospheric aerosol concentrations and their optical properties, which cause differential warming/cooling effects in the atmosphere and at the surface, constitute one of the largest sources of uncertainty in current assessments and predictions of global climatic change. This is especially true over regions of bright-reflecting surface, such as desert and urban areas. Deployed under the name AERONET since the 1990s, Cimel's sunphotometers have become the standard instrument for worldwide aerosol monitoring networks, developed to support NASA, CNES, and NASDA Earth satellite systems. PREDE's skyradiometers, as deployed in SKYNET, serve a similar role. One of the key ingredients for achieving accurate aerosol retrievals from satellite observations is a comprehensive understanding of surface spectral BRFs (Bidirectional Reflectance Factors), defined as the ratio of radiance measurements reflected from a targeted surface and from a spectrally and angularly featureless reference plate. Although these weather-resistant, automatic, sun/sky-scanning spectroradiometers enable frequent measurements of atmospheric aerosol optical properties and precipitable water at remote sites, they are too slow for surface BRF measurements (20-25 seconds per 360-degree scan, in addition to filter wheel rotation time). We have designed a next-generation sun photometer whose sensor head has no moving parts. A dedicated detector for each channel enables 12 simultaneous measurements ranging from the UV (380 nm) to shortwave-IR (2.13 micron) regions. The scan platform will be capable of traveling 360 degrees in about 6 seconds. This is sufficient to finish a BRDF scan every 30 degrees in azimuth and 15 degrees in elevation in less than 4 minutes. More details about this instrument will be presented, together with its applications to aerosol and trace gas studies.
The current plan for this instrument is to deploy during the EAST-AIRE (East Asian Study of Tropospheric Aerosols: an
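
The BRF definition quoted above reduces to a simple ratio; a minimal sketch, in which the optional panel-reflectance correction factor is a common calibration practice rather than something stated in the abstract:

```python
def bidirectional_reflectance_factor(l_target, l_panel, panel_reflectance=1.0):
    """BRF as defined above: ratio of the radiance reflected by the target
    surface to that reflected by a (near-)featureless reference plate,
    optionally corrected for the plate's own calibrated reflectance.
    """
    return (l_target / l_panel) * panel_reflectance

# Illustrative radiances in arbitrary (but identical) units.
brf = bidirectional_reflectance_factor(42.0, 120.0)  # -> 0.35
```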

  9. Automatic quiz generation for elderly people

    OpenAIRE

    Samuelsen, Jeanette

    2016-01-01

    Studies have indicated that games can be beneficial for the elderly, in areas such as cognitive functioning and well-being. Taking part in social activities, such as playing a game with others, could also be beneficial. One type of game is a computer-based quiz. One can create quiz questions manually; however, this can be time-consuming. Another approach is to generate quiz questions automatically. This project has examined how quizzes for Norwegian elderly can be automatically generated usin...

  10. Automatic Chinese Factual Question Generation

    Science.gov (United States)

    Liu, Ming; Rus, Vasile; Liu, Li

    2017-01-01

    Question generation is an emerging research area of artificial intelligence in education. Question authoring tools are important in educational technologies, e.g., intelligent tutoring systems, as well as in dialogue systems. Approaches to generate factual questions, i.e., questions that have concrete answers, mainly make use of the syntactical…

  11. Automatic code generation in practice

    DEFF Research Database (Denmark)

    Adam, Marian Sorin; Kuhrmann, Marco; Schultz, Ulrik Pagh

    2016-01-01

…specific language to specify those requirements and to allow for generating a safety-enforcing layer of code, which is deployed to the robot. The paper at hand reports experiences in practically applying code generation to mobile robots. For two cases, we discuss how we addressed challenges, e.g., regarding weaving … code generation into proprietary development environments and testing of manually written code. We find that a DSL based on the same conceptual model can be used across different kinds of hardware modules, but a significant adaptation effort is required in practical scenarios involving different kinds…

  12. Using Locally Generated Magnetic Indices to Characterize the Ionosphere From Magnetic Data Acquisition System (MAGDAS) Ground Based Observatories in Nigeria.

    Directory of Open Access Journals (Sweden)

    U.C. Rabiu

    2013-06-01

Full Text Available This work presents an attempt to establish a baseline for geomagnetic indices in Nigeria. This is particularly crucial since these indices give indications of the severity of magnetic fluctuations, and hence the level of disturbances in the ionosphere. The K index (which measures the magnetic perturbations of the planetary field) and the A index (a linear measure of the Earth's field that provides a daily average level for geomagnetic activity) were generated locally from geomagnetic data obtained using ground-based MAGDAS magnetometers located at Abuja (9°40′N, 7°29′E), Ilorin (8°30′N, 4°33′E) and Lagos (6°27′N, 3°23′E) in Nigeria, using computer-based derivation. The indices generated were used to characterize the ionosphere over the MAGDAS magnetometer Nigeria network stations. Results obtained showed average K values of 3.5 (ABU), 4.60 (LAG) and 4.13 (ILR); the ionosphere over the three stations was found to be relatively active (4.08), thus setting the baseline for characterizing the ionosphere over Nigeria from ground-based magnetometers.
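
For context, K is conventionally derived from the 3-hour range of the disturbed horizontal field via a quasi-logarithmic scale; the sketch below uses the standard Niemegk limits for a station whose K = 9 threshold is 500 nT, as a generic illustration rather than the derivation used in this work (local stations rescale these limits to their own K9 threshold).

```python
import bisect

# Lower bounds (nT) of the 3-hour field range for K = 0..9
# at a K9 = 500 nT station (standard Niemegk scale).
K_SCALE_500 = [0, 5, 10, 20, 40, 70, 120, 200, 330, 500]

def k_index(range_nt, scale=K_SCALE_500):
    """Quasi-logarithmic K index from the 3-hour range of the horizontal
    field component, after removal of the quiet-day variation."""
    return bisect.bisect_right(scale, range_nt) - 1

assert k_index(65.0) == 4    # 40 <= 65 < 70
assert k_index(600.0) == 9   # capped at the top band
```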

  13. Automatic generation of multilingual sports summaries

    OpenAIRE

    Hasan, Fahim Muhammad

    2011-01-01

    Natural Language Generation is a subfield of Natural Language Processing, which is concerned with automatically creating human readable text from non-linguistic forms of information. A template-based approach to Natural Language Generation utilizes base formats for different types of sentences, which are subsequently transformed to create the final readable forms of the output. In this thesis, we investigate the suitability of a template-based approach to multilingual Natural Language Generat...

  14. Automatic Thesaurus Generation for Chinese Documents.

    Science.gov (United States)

    Tseng, Yuen-Hsien

    2002-01-01

    Reports an approach to automatic thesaurus construction for Chinese documents. Presents an effective Chinese keyword extraction algorithm. Compared to previous studies, this method speeds up the thesaurus generation process drastically. It also achieves a similar percentage level of term relatedness. Includes three tables and four figures.…

  15. Ground-based gravitational wave interferometric detectors of the first and second generation: an overview

    Science.gov (United States)

    Losurdo, Giovanni

    2012-06-01

The era of first-generation gravitational wave interferometric detectors is ending. No signals have been detected so far. However, remarkable results have been achieved: the design sensitivity has been approached (and in some cases even exceeded), together with the achievement of robustness and reliability; a world-wide network of detectors has been established; the data collected so far have allowed upper limits to be put on several types of sources; some second-generation technologies have been tested on these detectors. The scenario for the next few years is very exciting. The projects to upgrade LIGO and Virgo to second-generation interferometers, capable of increasing the detection rate by a factor of ~1000, have been funded. The construction of Advanced LIGO and Advanced Virgo has started. GEO600 has started the upgrade to GEO HF, introducing light squeezing for the first time on a large detector. LCGT has been partly funded and the construction of the vacuum system is underway. There is a possibility that the third Advanced LIGO interferometer will be constructed in India. So, a powerful worldwide network could be in operation by the end of the decade. In this paper, we review the results achieved so far and the perspectives for the advanced detectors.

  16. Automatic approach for generating ETL operators

    OpenAIRE

    Bakari, Wided; Ali, Mouez; Ben-Abdallah, Hanene

    2012-01-01

This article addresses the generation of ETL (Extract-Transform-Load) operators for supplying a data warehouse from a relational data source. As a first step, we add new rules to those proposed by the authors of [1]; these rules deal with the combination of ETL operators. In a second step, we propose an automatic approach based on model transformations to generate the ETL operations needed for loading a data warehouse. This approach offers the possibility to set some designer requirements…

  17. Hail prevention by ground-based silver iodide generators: Results of historical and modern field projects

    Science.gov (United States)

    Dessens, J.; Sánchez, J. L.; Berthet, C.; Hermida, L.; Merino, A.

    2016-03-01

The science of hail suppression by silver iodide (AgI) cloud seeding was developed during the second half of the 20th century in the laboratory and tested in several research and operational projects using three delivery methods for the ice-forming particles: ground generators, aircraft, and rockets. Randomization of the seeding was often considered the imperative method for a better evaluation, but failed to give firm results, mostly because the projects did not last long enough given the erratic occurrence of severe hailfalls, and probably also because of the use of improper hail parameters. At the same time and until now, a continuous long-term research and operational field project (1952-2015) using ground generator networks has been conducted in France under the leadership of the Association Nationale d'Etude et de Lutte contre les Fléaux Atmosphériques (ANELFA), with a control initially based on annual insurance loss-to-risk ratios, then on hailpad data. More recently (2000-2009), a companion ground seeding project was developed in the north of Spain, with control mostly based on microphysical and hailpad data. The present paper, which focuses on hail suppression by ground seeding, reviews the production of the AgI nuclei, their dispersion and measurement in the atmosphere, as well as their observed or simulated effects in clouds. The paper summarizes the results of the main historical projects in Switzerland, Argentina, and North America, and finally concentrates on the current French and Spanish projects, with a review of already published results, complemented by new ones recently collected in Spain.
The conclusion, at least for France and Spain, is that if ground seeding is performed starting 3 hours before the hail falls at the ground with a 10-km mesh AgI generator network located in the developing hailstorm areas, each generator burning about 9 g of AgI per hour, the hailfall energy of the most severe hail days is decreased by about 50%.

  18. Automatic Test Pattern Generation for Digital Circuits

    Directory of Open Access Journals (Sweden)

    S. Hemalatha

    2014-04-01

Full Text Available Digital circuit complexity and density are increasing, and at the same time circuits must offer higher quality and reliability. This leads to high test costs and makes validation more complex. The main aim is to develop a complete behavioral fault simulation and automatic test pattern generation (ATPG) system for digital circuits modeled in Verilog and VHDL. An integrated Automatic Test Generation (ATG) and Automatic Test Executing/Equipment (ATE) system for complex boards is developed here, together with an approach to using memristors (resistors with memory) in programmable analog circuits. The main idea consists in a circuit design in which low voltages are applied to memristors during their operation as analog circuit elements and high voltages are used to program the memristors' states. This way, as demonstrated in recent experiments, the state of the memristors does not essentially change during analog-mode operation. As an example of our approach, we have built several programmable analog circuits demonstrating memristor-based programming of threshold, gain and frequency; in these circuits the role of the memristor is played by a memristor emulator developed by us. A multiplexer is developed to generate a class of minimum-transition sequences. The entire hardware is realized as digital logic circuits and the test results are simulated in the ModelSim software. The results of this research show that behavioral fault simulation will remain a highly attractive alternative for the future generation of VLSI circuits and systems-on-chip (SoC).
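
At its core, test pattern generation searches for an input vector on which the good and faulty circuits disagree; a minimal exhaustive sketch (toy netlist and fault, not the behavioral ATPG system described above):

```python
from itertools import product

def good_circuit(a, b, c):
    """Small reference netlist: y = (a AND b) OR c."""
    return (a & b) | c

def faulty_circuit(a, b, c):
    """Same netlist with the AND-gate output stuck-at-0."""
    return 0 | c

def generate_test(good, bad, n_inputs):
    """Exhaustive ATPG: return the first input vector whose good and
    faulty responses differ, or None if the fault is undetectable."""
    for vec in product((0, 1), repeat=n_inputs):
        if good(*vec) != bad(*vec):
            return vec
    return None

# (1, 1, 0) activates the fault (AND output should be 1) and
# propagates it to the output (c = 0 does not mask it).
assert generate_test(good_circuit, faulty_circuit, 3) == (1, 1, 0)
```

Practical ATPG replaces the exhaustive search with structural algorithms (D-algorithm, PODEM, etc.), but the detect-and-propagate contract is the same.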

  19. Automatic generation of tree level helicity amplitudes

    CERN Document Server

    Stelzer, T

    1994-01-01

The program MadGraph is presented, which automatically generates PostScript Feynman diagrams and Fortran code to calculate arbitrary tree-level helicity amplitudes by calling HELAS [1] subroutines. The program is written in Fortran and is available in Unix and VMS versions. MadGraph currently includes Standard Model interactions of QCD and QFD, but is easily modified to include additional models such as supersymmetry.

  20. Automatic Caption Generation for Electronics Textbooks

    Directory of Open Access Journals (Sweden)

    Veena Thakur

    2014-12-01

Full Text Available Automatic or semi-automatic approaches for developing Technology Supported Learning Systems (TSLS) are required to lighten their development cost. The main objective of this paper is to automate the generation of a caption module; it aims at reproducing the way teachers prepare their lessons and the learning material they will use throughout the course. Teachers tend to choose one or more textbooks that cover the contents of their subjects, determine the topics to be addressed, and identify the parts of the textbooks which may be helpful for the students. The caption model describes the entities, attributes, roles and their relationships, plus the constraints that govern the problem domain; it is created in order to represent the vocabulary and key concepts of the problem domain. The caption model also identifies the relationships among all the entities within the scope of the problem domain, and commonly identifies their attributes. It defines a vocabulary and is helpful as a communication tool. DOM-Sortze is a framework that enables the semi-automatic generation of the caption module for a technology supported learning system (TSLS) from electronic textbooks. The semi-automatic generation of the caption module entails the identification and elicitation of knowledge from the documents, to which end Natural Language Processing (NLP) techniques are combined with ontologies and heuristic reasoning.

  1. Automatic generation of combinatorial test data

    CERN Document Server

    Zhang, Jian; Ma, Feifei

    2014-01-01

This book reviews the state-of-the-art in combinatorial testing, with particular emphasis on the automatic generation of test data. It describes the most commonly used approaches in this area - including algebraic construction, greedy methods, evolutionary computation, constraint solving and optimization - and explains major algorithms with examples. In addition, the book lists a number of test generation tools, as well as benchmarks and applications. Addressing a multidisciplinary topic, it will be of particular interest to researchers and professionals in the areas of software testing, combi…
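
Among the approaches listed, greedy methods are the easiest to sketch; below is a toy greedy pairwise (2-way) covering-array generator that repeatedly picks the test covering the most still-uncovered value pairs. It enumerates the full cartesian product, so it is exponential in the number of parameters and meant only to illustrate the idea.

```python
from itertools import combinations, product

def greedy_pairwise(domains):
    """Greedy covering-array construction for 2-way (pairwise) coverage.

    domains: list of value lists, one per parameter. Returns a list of
    test vectors that together cover every pair of values from every
    pair of parameters.
    """
    # All (parameter, value) pairs that still need to appear together.
    uncovered = {((i, a), (j, b))
                 for (i, da), (j, db) in combinations(enumerate(domains), 2)
                 for a in da for b in db}
    candidates = list(product(*domains))
    tests = []
    while uncovered:
        # Pick the candidate test covering the most uncovered pairs.
        best = max(candidates, key=lambda t: sum(
            ((i, t[i]), (j, t[j])) in uncovered
            for i, j in combinations(range(len(t)), 2)))
        tests.append(best)
        uncovered -= {((i, best[i]), (j, best[j]))
                      for i, j in combinations(range(len(best)), 2)}
    return tests
```

For three boolean parameters this needs only a handful of tests instead of all 8 exhaustive combinations; the gap widens rapidly as parameters are added.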

  2. Automatic Metadata Generation using Associative Networks

    CERN Document Server

    Rodriguez, Marko A; Van de Sompel, Herbert

    2008-01-01

    In spite of its tremendous value, metadata is generally sparse and incomplete, thereby hampering the effectiveness of digital information services. Many of the existing mechanisms for the automated creation of metadata rely primarily on content analysis which can be costly and inefficient. The automatic metadata generation system proposed in this article leverages resource relationships generated from existing metadata as a medium for propagation from metadata-rich to metadata-poor resources. Because of its independence from content analysis, it can be applied to a wide variety of resource media types and is shown to be computationally inexpensive. The proposed method operates through two distinct phases. Occurrence and co-occurrence algorithms first generate an associative network of repository resources leveraging existing repository metadata. Second, using the associative network as a substrate, metadata associated with metadata-rich resources is propagated to metadata-poor resources by means of a discrete...
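
A schematic of the propagation phase, assuming an associative network already built from metadata co-occurrence as described above (the data structures and weighting here are illustrative, not the article's exact algorithm):

```python
from collections import Counter

def propagate_metadata(network, metadata):
    """One propagation step over an associative network.

    network  : dict mapping a resource id to {neighbour_id: weight} edges,
               e.g. derived from metadata co-occurrence.
    metadata : dict mapping metadata-rich resource ids to sets of keywords.
    Returns suggested keywords for resources with no metadata, ranked by
    the total edge weight of the neighbours carrying each keyword.
    """
    suggestions = {}
    for node, edges in network.items():
        if node in metadata:
            continue  # already metadata-rich
        scores = Counter()
        for neighbour, weight in edges.items():
            for keyword in metadata.get(neighbour, ()):
                scores[keyword] += weight
        suggestions[node] = [kw for kw, _ in scores.most_common()]
    return suggestions

# Hypothetical repository: "poor" has no metadata but strong ties to "a".
net = {"poor": {"a": 2.0, "b": 1.0}, "a": {}, "b": {}}
meta = {"a": {"physics"}, "b": {"physics", "optics"}}
assert propagate_metadata(net, meta)["poor"][0] == "physics"
```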

  3. Towards automatic planning for manufacturing generative processes

    Energy Technology Data Exchange (ETDEWEB)

    CALTON,TERRI L.

    2000-05-24

Generative process planning describes methods process engineers use to modify manufacturing/process plans after designs are complete. A completed design may be the result of the introduction of a new product based on an old design, an assembly upgrade, or modified product designs used for a family of similar products. An engineer designs an assembly and then creates plans capturing manufacturing processes, including assembly sequences, component joining methods, part costs, labor costs, etc. When new products originate as a result of an upgrade, component geometry may change, and/or additional components and subassemblies may be added to or omitted from the original design. As a result, process engineers are forced to create new plans. This is further complicated by the fact that the process engineer must manually generate these plans for each product upgrade. To generate new assembly plans for product upgrades, engineers must manually re-specify the manufacturing plan selection criteria and re-run the planners. To remedy this problem, special-purpose assembly planning algorithms have been developed to automatically recognize design modifications and automatically apply previously defined manufacturing plan selection criteria and constraints.

  4. Multiblock grid generation with automatic zoning

    Science.gov (United States)

    Eiseman, Peter R.

    1995-01-01

An overview will be given for multiblock grid generation with automatic zoning. We shall explore the many advantages and benefits of this exciting technology and will also see how to apply it to a number of interesting cases. The technology is available in the form of a commercial code, GridPro®/az3000. This code takes surface geometry definitions and patterns of points as its primary input and produces high quality grids as its output. Before we embark upon our exploration, we shall first give a brief background of the environment in which this technology fits.

  5. Linguistics Computation, Automatic Model Generation, and Intensions

    CERN Document Server

    Nourani, C F

    1994-01-01

    Techniques are presented for defining models of computational linguistics theories. The methods of generalized diagrams that were developed by this author for modeling artificial intelligence planning and reasoning are shown to be applicable to models of computation of linguistics theories. It is shown that for extensional and intensional interpretations, models can be generated automatically which assign meaning to computations of linguistics theories for natural languages. Keywords: Computational Linguistics, Reasoning Models, G-diagrams For Models, Dynamic Model Implementation, Linguistics and Logics For Artificial Intelligence

  6. Automatic Testcase Generation for Flight Software

    Science.gov (United States)

    Bushnell, David Henry; Pasareanu, Corina; Mackey, Ryan M.

    2008-01-01

The TacSat3 project is applying Integrated Systems Health Management (ISHM) technologies to an Air Force spacecraft for operational evaluation in space. The experiment will demonstrate the effectiveness and cost of ISHM and vehicle systems management (VSM) technologies through onboard operation for extended periods. We present two approaches to automatic testcase generation for ISHM: 1) a blackbox approach that views the system as a blackbox and uses a grammar-based specification of the system's inputs to automatically generate *all* inputs that satisfy the specifications (up to prespecified limits); these inputs are then used to exercise the system; 2) a whitebox approach that performs analysis and testcase generation directly on a representation of the internal behaviour of the system under test. The enabling technologies for both approaches are model checking and symbolic execution, as implemented in the Ames' Java PathFinder (JPF) tool suite. Model checking is an automated technique for software verification. Unlike simulation and testing, which check only some of the system executions and may therefore miss errors, model checking exhaustively explores all possible executions. Symbolic execution evaluates programs with symbolic rather than concrete values and represents variable values as symbolic expressions. We are applying the blackbox approach to generating input scripts for the Spacecraft Command Language (SCL) from Interface and Control Systems. SCL is an embedded interpreter for controlling spacecraft systems. TacSat3 will be using SCL as the controller for its ISHM systems. We translated the SCL grammar into a program that outputs scripts conforming to the grammar. Running JPF on this program generates all legal input scripts up to a prespecified size. Script generation can also be targeted to specific parts of the grammar of interest to the developers. These scripts are then fed to the SCL Executive.
ICS's in-house coverage tools will be run to
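
The blackbox idea of enumerating every input a grammar admits, up to a bound, can be sketched directly; the command grammar below is hypothetical, since the actual SCL grammar is not reproduced here:

```python
from itertools import product

def expand(grammar, symbol, depth):
    """All strings derivable from `symbol` within `depth` expansions.

    grammar maps each nonterminal to a list of alternatives; each
    alternative is a tuple of terminals (plain strings) and
    nonterminals (keys of grammar).
    """
    if symbol not in grammar:   # terminal symbol
        return {symbol}
    if depth == 0:
        return set()            # expansion budget exhausted
    results = set()
    for alternative in grammar[symbol]:
        parts = [expand(grammar, s, depth - 1) for s in alternative]
        if all(parts):          # every part derivable within the budget
            results |= {" ".join(p) for p in product(*parts)}
    return results

# Hypothetical command grammar, loosely in the spirit of an SCL-like language.
g = {
    "cmd": [("verb", "target")],
    "verb": [("set",), ("get",)],
    "target": [("heater",), ("valve",)],
}
assert expand(g, "cmd", 3) == {"set heater", "set valve",
                               "get heater", "get valve"}
```

Each generated string would then be fed to the interpreter under test, exactly as the abstract describes feeding generated scripts to the SCL Executive.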

  7. The Role of Item Models in Automatic Item Generation

    Science.gov (United States)

    Gierl, Mark J.; Lai, Hollis

    2012-01-01

    Automatic item generation represents a relatively new but rapidly evolving research area where cognitive and psychometric theories are used to produce tests that include items generated using computer technology. Automatic item generation requires two steps. First, test development specialists create item models, which are comparable to templates…

  8. Automatic Generation of Minimal Cut Sets

    Directory of Open Access Journals (Sweden)

    Sentot Kromodimoeljo

    2015-06-01

    A cut set is a collection of component failure modes that could lead to a system failure. Cut Set Analysis (CSA) is applied to critical systems to identify and rank system vulnerabilities at design time. Model checking tools have been used to automate the generation of minimal cut sets but are generally based on checking reachability of system failure states. This paper describes a new approach to CSA using a Linear Temporal Logic (LTL) model checker called BT Analyser that supports the generation of multiple counterexamples. The approach enables a broader class of system failures to be analysed, by generalising from failure state formulae to failure behaviours expressed in LTL. The traditional approach to CSA using model checking requires the model or system failure to be modified, usually by hand, to eliminate already-discovered cut sets, and the model checker to be rerun, at each step. By contrast, the new approach works incrementally and fully automatically, thereby removing the tedious and error-prone manual process and significantly reducing computation time. This in turn enables larger models to be checked. Two different strategies for using BT Analyser for CSA are presented. There is generally no single best strategy for model checking: their relative efficiency depends on the model and property being analysed. Comparative results are given for the A320 hydraulics case study in the Behavior Tree modelling language.
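
    The bottom-up essence of CSA can be shown with a brute-force sketch. The small hydraulic system and its failure predicate below are invented, not taken from the A320 case study; candidates are tested in order of size, and any superset of an already-found cut set is discarded, mirroring the incremental elimination that BT Analyser automates:

```python
from itertools import combinations

# Invented example system: pumps P1-P3 and a reservoir R. The system fails
# when both main pumps fail, or when the reservoir fails along with any pump.
COMPONENTS = ["P1", "P2", "P3", "R"]

def system_fails(failed):
    return {"P1", "P2"} <= failed or ("R" in failed and bool(failed & {"P1", "P2", "P3"}))

def minimal_cut_sets(components, fails):
    """Test candidate component sets in order of size; skip supersets of
    known cut sets, so everything that survives is minimal by construction."""
    found = []
    for size in range(1, len(components) + 1):
        for combo in combinations(components, size):
            candidate = set(combo)
            if any(prev <= candidate for prev in found):
                continue          # not minimal: contains a known cut set
            if fails(candidate):
                found.append(candidate)
    return found
```

    Brute force is exponential in the number of components; the point of the paper's model-checking formulation is to obtain the same result symbolically, and for failure *behaviours* rather than failure states.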

  9. Automatic Generation of Validated Specific Epitope Sets

    Directory of Open Access Journals (Sweden)

    Sebastian Carrasco Pro

    2015-01-01

    Accurate measurement of B and T cell responses is a valuable tool to study autoimmunity, allergies, immunity to pathogens, and host-pathogen interactions, and to assist in the design and evaluation of T cell vaccines and immunotherapies. In this context, it is desirable to establish a method for selecting validated reference sets of epitopes to allow detection of T and B cells. However, the ever-growing information contained in the Immune Epitope Database (IEDB) and the differences in quality and subjects studied between epitope assays make this task complicated. In this study, we develop a novel method to automatically select reference epitope sets according to a categorization system employed by the IEDB. From the sets generated, three epitope sets (EBV, mycobacteria and dengue) were experimentally validated by detection of T cell reactivity ex vivo in human donors. Furthermore, a web application that will potentially be implemented in the IEDB was created to allow users to generate customized epitope sets.

  10. System Supporting Automatic Generation of Finite Element Using Image Information

    Institute of Scientific and Technical Information of China (English)

    J; Fukuda

    2002-01-01

    A mesh generating system has been developed in order to prepare large amounts of input data which are needed for easy implementation of a finite element analysis. This system consists of a Pre-Mesh Generator, an Automatic Mesh Generator and a Mesh Modifier. Pre-Mesh Generator produces the shape and sub-block information as input data of Automatic Mesh Generator by carrying out various image processing with respect to the image information of the drawing input using a scanner. Automatic Mesh Generato...

  11. Generating IDS Attack Pattern Automatically Based on Attack Tree

    Institute of Scientific and Technical Information of China (English)

    向尕; 曹元大

    2003-01-01

    The automatic generation of attack patterns based on attack trees is studied. An extended definition of the attack tree is proposed, and an algorithm for generating the attack tree is presented. A method for automatically generating attack patterns from the attack tree is shown and tested on concrete attack instances. The results show that the algorithm is effective and efficient. In doing so, the efficiency of attack pattern generation is improved and the attack trees can be reused.
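
    The core expansion step is easy to sketch. The tree below is a made-up example, not from the paper: an attack tree with AND/OR nodes expands into the set of attack patterns, one pattern per way of satisfying the root.

```python
# Minimal attack-tree representation (hypothetical example):
# ("OR", children)  -- any one child suffices
# ("AND", children) -- all children are required
# a plain string is a leaf attack action.
TREE = ("OR", [
    ("AND", ["steal_password", "login_remotely"]),
    ("AND", ["exploit_cve", ("OR", ["open_port", "phish_admin"])]),
])

def attack_patterns(node):
    """Expand an attack tree into the list of attack patterns it encodes:
    OR contributes alternatives, AND cross-combines one pattern per child."""
    if isinstance(node, str):
        return [[node]]
    op, children = node
    if op == "OR":
        return [p for child in children for p in attack_patterns(child)]
    patterns = [[]]                       # AND: cross-combine child patterns
    for child in children:
        patterns = [p + q for p in patterns for q in attack_patterns(child)]
    return patterns
```

    Each resulting pattern is a concrete combination of leaf attacks that an intrusion-detection signature could be written against, which is the reuse the abstract refers to.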

  12. System for Automatic Generation of Examination Papers in Discrete Mathematics

    Science.gov (United States)

    Fridenfalk, Mikael

    2013-01-01

    A system was developed for automatic generation of problems and solutions for examinations in a university distance course in discrete mathematics and tested in a pilot experiment involving 200 students. Considering the success of such systems in the past, particularly including automatic assessment, it should not take long before such systems are…

  13. Automatic program generation: future of software engineering

    Energy Technology Data Exchange (ETDEWEB)

    Robinson, J.H.

    1979-01-01

    At this moment software development is still more of an art than an engineering discipline. Each piece of software is lovingly engineered, nurtured, and presented to the world as a tribute to the writer's skill. When will this change. When will the craftsmanship be removed and the programs be turned out like so many automobiles from an assembly line. Sooner or later it will happen: economic necessities will demand it. With the advent of cheap microcomputers and ever more powerful supercomputers doubling capacity, much more software must be produced. The choices are to double the number of programers, double the efficiency of each programer, or find a way to produce the needed software automatically. Producing software automatically is the only logical choice. How will automatic programing come about. Some of the preliminary actions which need to be done and are being done are to encourage programer plagiarism of existing software through public library mechanisms, produce well understood packages such as compiler automatically, develop languages capable of producing software as output, and learn enough about the whole process of programing to be able to automate it. Clearly, the emphasis must not be on efficiency or size, since ever larger and faster hardware is coming.

  14. Automatic Structure-Based Code Generation from Coloured Petri Nets

    DEFF Research Database (Denmark)

    Kristensen, Lars Michael; Westergaard, Michael

    2010-01-01

    Automatic code generation based on Coloured Petri Net (CPN) models is challenging because CPNs allow for the construction of abstract models that intermix control flow and data processing, making translation into conventional programming constructs difficult. We introduce Process-Partitioned CPNs....... The viability of our approach is demonstrated by applying it to automatically generate an Erlang implementation of the Dynamic MANET On-demand (DYMO) routing protocol specified by the Internet Engineering Task Force (IETF)....

  15. Image analysis techniques associated with automatic data base generation.

    Science.gov (United States)

    Bond, A. D.; Ramapriyan, H. K.; Atkinson, R. J.; Hodges, B. C.; Thomas, D. T.

    1973-01-01

    This paper considers some basic problems relating to automatic data base generation from imagery, the primary emphasis being on fast and efficient automatic extraction of relevant pictorial information. Among the techniques discussed are recursive implementations of some particular types of filters which are much faster than FFT implementations, a 'sequential similarity detection' technique of implementing matched filters, and sequential linear classification of multispectral imagery. Several applications of the above techniques are presented including enhancement of underwater, aerial and radiographic imagery, detection and reconstruction of particular types of features in images, automatic picture registration and classification of multiband aerial photographs to generate thematic land use maps.
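
    As a flavour of the speed argument, a box (moving-average) smoothing filter can be computed from a running prefix sum in O(n) for any window size, where naive convolution costs O(nk) and an FFT route O(n log n). The code is a generic sketch of that idea, not the authors' filters:

```python
def box_filter(signal, radius):
    """Recursive (running-sum) implementation of a box smoothing filter:
    the prefix sum is built incrementally, so each window average costs O(1)
    and the whole pass is O(n) regardless of window size. Edges use a
    clamped (shrunken) window."""
    n = len(signal)
    prefix = [0]
    for v in signal:                     # running sum: prefix[i+1] = prefix[i] + x[i]
        prefix.append(prefix[-1] + v)
    out = []
    for i in range(n):
        lo = max(0, i - radius)
        hi = min(n, i + radius + 1)
        out.append((prefix[hi] - prefix[lo]) / (hi - lo))
    return out
```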

  16. A New Approach to Fully Automatic Mesh Generation

    Institute of Scientific and Technical Information of China (English)

    闵卫东; 张征明; et al.

    1995-01-01

    Automatic mesh generation is one of the most important parts in CIMS (Computer Integrated Manufacturing System). A method based on mesh grading propagation which automatically produces a triangular mesh in a multiply connected planar region is presented in this paper. The method decomposes the planar region into convex subregions, using algorithms which run in linear time. For every subregion, an algorithm is used to generate shrinking polygons according to boundary gradings and form Delaunay triangulation between two adjacent shrinking polygons, both in linear time. It automatically propagates boundary gradings into the interior of the region and produces a satisfactory quasi-uniform mesh.

  17. Automatic Grasp Generation and Improvement for Industrial Bin-Picking

    DEFF Research Database (Denmark)

    Kraft, Dirk; Ellekilde, Lars-Peter; Rytz, Jimmy Alison

    2014-01-01

    This paper presents work on automatic grasp generation and grasp learning for reducing manual setup time and increasing grasp success rates within bin-picking applications. We propose an approach that is able to generate good grasps automatically using a dynamic grasp simulator, a newly developed...... and achieve comparable results and that our learning approach can improve system performance significantly. Automatic bin-picking is an important industrial process that can lead to significant savings and potentially keep production in countries with high labour cost rather than outsourcing it. The presented...... work allows minimizing cycle time as well as setup cost, which are essential factors in automatic bin-picking. It therefore leads to a wider applicability of bin-picking in industry....

  18. Automatic Test Case Generation of C Program Using CFG

    Directory of Open Access Journals (Sweden)

    Sangeeta Tanwer

    2010-07-01

    Software quality assurance in a software company is the only way to gain customer confidence, by removing all possible errors. This can be done through automatic test case generation. Taking C programs as test objects, this paper explores how to create the CFG of a C program and generate test cases automatically. It explores the feasibility and infeasibility of paths based on the number of iterations. First, the C code is converted to instrumented code; then test cases are generated using symbolic testing and random testing. The system is developed using C#.net in Visual Studio 2008. In addition, some future research directions are explored.
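
    The combination of path enumeration and random testing can be illustrated on a toy program. The two-branch function below is invented and stands in for the instrumented C code; random inputs are sampled and one witness input is kept per distinct path through the control-flow graph:

```python
import random

# Toy program under test, mirroring what instrumentation records:
#   if (x > 0) y = x; else y = -x;
#   if (y > 10) ... else ...
def trace_path(x):
    """Execute the toy program on input x and return its branch trace."""
    path = ["entry"]
    path.append("then" if x > 0 else "else")
    y = x if x > 0 else -x
    path.append("big" if y > 10 else "small")
    path.append("exit")
    return tuple(path)

def random_test_cases(trials=1000, seed=0):
    """Random testing: sample inputs and keep one witness input per distinct
    path; paths never hit after many trials are candidates for infeasibility
    (to be confirmed symbolically)."""
    rng = random.Random(seed)
    witnesses = {}
    for _ in range(trials):
        x = rng.randint(-100, 100)
        witnesses.setdefault(trace_path(x), x)
    return witnesses
```

    Each (path, witness input) pair is a generated test case; symbolic testing then decides whether the remaining, never-covered paths are genuinely infeasible.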

  19. Review of small-angle coronagraphic techniques in the wake of ground-based second-generation adaptive optics systems

    CERN Document Server

    Mawet, Dimitri; Lawson, Peter; Mugnier, Laurent; Traub, Wesley; Boccaletti, Anthony; Trauger, John; Gladysz, Szymon; Serabyn, Eugene; Milli, Julien; Belikov, Ruslan; Kasper, Markus; Baudoz, Pierre; Macintosh, Bruce; Marois, Christian; Oppenheimer, Ben; Barrett, Harrison; Beuzit, Jean-Luc; Devaney, Nicolas; Girard, Julien; Guyon, Olivier; Krist, John; Mennesson, Bertrand; Mouillet, David; Murakami, Naoshi; Poyneer, Lisa; Savransky, Dmitri; Vérinaud, Christophe; Wallace, James K

    2012-01-01

    Small-angle coronagraphy is technically and scientifically appealing because it enables the use of smaller telescopes, allows covering wider wavelength ranges, and potentially increases the yield and completeness of circumstellar environment - exoplanets and disks - detection and characterization campaigns. However, opening up this new parameter space is challenging. Here we will review the four posts of high contrast imaging and their intricate interactions at very small angles (within the first 4 resolution elements from the star). The four posts are: choice of coronagraph, optimized wavefront control, observing strategy, and post-processing methods. After detailing each of the four foundations, we will present the lessons learned from the 10+ years of operations of zeroth and first-generation adaptive optics systems. We will then tentatively show how informative the current integration of second-generation adaptive optics system is, and which lessons can already be drawn from this fresh experience. Then, w...

  20. Automatic generation of matter-of-opinion video documentaries

    NARCIS (Netherlands)

    S. Bocconi; F.-M. Nack (Frank); L. Hardman (Lynda)

    2008-01-01

    textabstractIn this paper we describe a model for automatically generating video documentaries. This allows viewers to specify the subject and the point of view of the documentary to be generated. The domain is matter-of-opinion documentaries based on interviews. The model combines rhetorical

  2. Investigating the Dominant Source for the Generation of Gravity Waves during Indian Summer Monsoon Using Ground-based Measurements

    Institute of Scientific and Technical Information of China (English)

    Debashis NATH; CHEN Wen

    2013-01-01

    Over the tropics, convection, wind shear (i.e., vertical and horizontal shear of wind and/or geostrophic adjustment comprising spontaneous imbalance in jet streams) and topography are the major sources for the generation of gravity waves. During the summer monsoon season (June-August) over the Indian subcontinent, convection and wind shear coexist. To determine the dominant source of gravity waves during the monsoon season, an experiment was conducted using mesosphere-stratosphere-troposphere (MST) radar situated at Gadanki (13.5°N, 79.2°E), a tropical observatory in the southern part of the Indian subcontinent. MST radar was operated continuously for 72 h to capture high-frequency gravity waves. During this time, a radiosonde was released every 6 h in addition to the regular launch (once daily, to study low-frequency gravity waves) throughout the season. These two data sets were utilized effectively to characterize the jet stream and the associated gravity waves. Data available from collocated instruments along with satellite-based brightness temperature (TBB) data were utilized to characterize the convection in and around Gadanki. Despite the presence of two major sources of gravity wave generation (i.e., convection and wind shear) during the monsoon season, wind shear (both vertical shear and geostrophic adjustment) contributed the most to the generation of gravity waves on various scales.

  3. Automatic Melody Generation System with Extraction Feature

    Science.gov (United States)

    Ida, Kenichi; Kozuki, Shinichi

    In this paper, we propose a melody generation system based on the analysis of existing melodies, together with a mechanism that takes the user's preferences into account. Melody generation is performed by arranging pitches optimally over a given rhythm. The optimality criterion is built from feature elements extracted from existing music by the proposed method, and the user's preferences are reflected in the criterion by letting users adjust some of those feature elements. A genetic algorithm (GA) then optimizes the pitch array against this criterion to realize the system.
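
    A toy version of the GA stage can be sketched as follows. The fitness feature used here, a preference for small melodic intervals, is invented for illustration; the paper derives its criteria from features extracted from existing music:

```python
import random

SCALE = [60, 62, 64, 65, 67, 69, 71, 72]   # C major, MIDI pitch numbers

def fitness(melody, max_leap=4):
    """Score a pitch array by one made-up feature element: penalize
    melodic leaps larger than max_leap semitones (0 is the best score)."""
    return -sum(max(0, abs(a - b) - max_leap) for a, b in zip(melody, melody[1:]))

def evolve(length=8, pop_size=30, generations=60, seed=1):
    """Plain GA: truncation selection, one-point crossover, point mutation."""
    rng = random.Random(seed)
    pop = [[rng.choice(SCALE) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, length)          # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.2:                  # point mutation
                child[rng.randrange(length)] = rng.choice(SCALE)
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)
```

    Reflecting a user's preference then amounts to reweighting or replacing terms in `fitness`, which is the "operating some of the feature elements" step described above.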

  4. Formal Specification Based Automatic Test Generation for Embedded Network Systems

    Directory of Open Access Journals (Sweden)

    Eun Hye Choi

    2014-01-01

    Embedded systems have become increasingly connected and communicate with each other, forming large-scale and complicated network systems. To make their design and testing more reliable and robust, this paper proposes a formal specification language called SENS and a SENS-based automatic test generation tool called TGSENS. Our approach is summarized as follows: (1) a user describes the requirements of target embedded network systems as logical property-based constraints using SENS; (2) given SENS specifications, test cases are automatically generated using a SAT-based solver, and filtering mechanisms to select efficient test cases are also available in our tool; (3) in addition, given a testing goal by the user, test sequences are automatically extracted from the exhaustive test cases. We implemented our approach and conducted several experiments on practical case studies. Through the experiments, we confirmed the efficiency of our approach in the design and test generation of real embedded air-conditioning network systems.
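
    The generation step can be mimicked in miniature. The air-conditioner constraints below are invented, and exhaustive enumeration stands in for the SAT solver; every satisfying assignment is one generated test case:

```python
from itertools import product

# Hypothetical requirements for one air-conditioner node, as boolean
# constraints over its state variables (standing in for SENS properties).
VARS = ["power", "cooling", "fan", "defrost"]

CONSTRAINTS = [
    lambda a: a["cooling"] <= a["power"],            # cooling implies power on
    lambda a: a["fan"] >= a["cooling"],              # cooling implies fan on
    lambda a: not (a["cooling"] and a["defrost"]),   # modes are exclusive
]

def generate_tests():
    """Enumerate every assignment satisfying all constraints. A SAT solver
    does this cleverly over thousands of variables; exhaustion suffices for
    a four-variable sketch."""
    tests = []
    for bits in product([False, True], repeat=len(VARS)):
        assignment = dict(zip(VARS, bits))
        if all(c(assignment) for c in CONSTRAINTS):
            tests.append(assignment)
    return tests
```

    Filtering and goal-directed extraction then correspond to selecting subsets of these assignments, or sequences of them, that hit a chosen coverage target.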

  5. Automatically generated code for relativistic inhomogeneous cosmologies

    Science.gov (United States)

    Bentivegna, Eloisa

    2017-02-01

    The applications of numerical relativity to cosmology are on the rise, contributing insight into such cosmological problems as structure formation, primordial phase transitions, gravitational-wave generation, and inflation. In this paper, I present the infrastructure for the computation of inhomogeneous dust cosmologies which was used recently to measure the effect of nonlinear inhomogeneity on the cosmic expansion rate. I illustrate the code's architecture, provide evidence for its correctness in a number of familiar cosmological settings, and evaluate its parallel performance for grids of up to several billion points. The code, which is available as free software, is based on the Einstein Toolkit infrastructure, and in particular leverages the automated code generation capabilities provided by its component Kranc.

  6. Automatic generation of Web mining environments

    Science.gov (United States)

    Cibelli, Maurizio; Costagliola, Gennaro

    1999-02-01

    The main problem related to the retrieval of information from the world wide web is the enormous number of unstructured documents and resources, i.e., the difficulty of locating and tracking appropriate sources. This paper presents a web mining environment (WME), which is capable of finding, extracting and structuring information related to a particular domain from web documents, using general purpose indices. The WME architecture includes a web engine filter (WEF), to sort and reduce the answer set returned by a web engine, a data source pre-processor (DSP), which processes html layout cues in order to collect and qualify page segments, and a heuristic-based information extraction system (HIES), to finally retrieve the required data. Furthermore, we present a web mining environment generator, WMEG, that allows naive users to generate a WME specific to a given domain by providing a set of specifications.

  7. Automatic Building Information Model Query Generation

    Energy Technology Data Exchange (ETDEWEB)

    Jiang, Yufei; Yu, Nan; Ming, Jiang; Lee, Sanghoon; DeGraw, Jason; Yen, John; Messner, John I.; Wu, Dinghao

    2015-12-01

    Energy efficient building design and construction calls for extensive collaboration between different subfields of the Architecture, Engineering and Construction (AEC) community. Performing building design and construction engineering raises challenges on data integration and software interoperability. Using a Building Information Modeling (BIM) data hub to host and integrate building models is a promising solution to address those challenges, which can ease building design information management. However, the partial model query mechanism of the current BIM data hub collaboration model has several limitations, which prevents designers and engineers from taking advantage of BIM. To address this problem, we propose a general and effective approach to generate query code based on a Model View Definition (MVD). This approach is demonstrated through a software prototype called QueryGenerator. By demonstrating a case study using multi-zone air flow analysis, we show how our approach and tool can help domain experts to use BIM to drive building design with less labour and lower overhead cost.

  8. Algorithm for Automatic Generation of Curved and Compound Twills

    Institute of Scientific and Technical Information of China (English)

    WANG Mei-zhen; WANG Fu-mei; WANG Shan-yuan

    2005-01-01

    A new algorithm using matrix left-shift functions for the quicker generation of curved and compound twills is introduced in this paper. A matrix model for the generation of regular, curved and compound twill structures is established and its computational simulation is elaborated. Examples of applying the algorithm to the simulation and automatic generation of curved and compound twills in fabric CAD are given.
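
    The left-shift construction is easy to sketch. The initial row and shift sequences below are illustrative, not the paper's examples: each weave row is a cyclic left shift of the previous one, and varying the shift amount bends the twill line:

```python
def generate_twill(first_row, shifts):
    """Generate a weave matrix by repeatedly left-shifting the previous row.
    A constant shift sequence gives a regular twill; a varying sequence
    gives a curved twill."""
    rows = [list(first_row)]
    for s in shifts:
        prev = rows[-1]
        rows.append(prev[s:] + prev[:s])   # cyclic left shift by s positions
    return rows

# 1 = warp over weft, 0 = weft over warp (illustrative 4-end repeat)
regular = generate_twill([1, 1, 0, 0], [1, 1, 1])   # regular 2/2 twill
curved  = generate_twill([1, 1, 0, 0], [1, 2, 1])   # varying shift: curved twill
```

    Because each row is produced in constant time from its predecessor, an n-row repeat is generated in O(n) row operations, which is the speed advantage the abstract claims for the matrix left-shift formulation.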

  9. An efficient method for parallel CRC automatic generation

    Institute of Scientific and Technical Information of China (English)

    陈红胜; 张继承; 王勇; 陈抗生

    2003-01-01

    The State Transition Equation (STE) based method to automatically generate the parallel CRC circuits for any generator polynomial or required amount of parallelism is presented. The parallel CRC circuit so generated is partially optimized before being fed to synthesis tools and works properly in our LAN transceiver. Compared with the cascading method, the proposed method gives better timing results and significantly reduces the synthesis time in particular.
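
    The state-transition idea can be reproduced in miniature: because the CRC update is linear over GF(2), w serial shifts collapse into one parallel step, state' = F·state XOR G·data, where the columns of F and G are obtained by pushing unit vectors through the serial update. CRC-8 with polynomial 0x07 is used here purely as an example:

```python
WIDTH, POLY = 8, 0x07        # CRC-8, x^8 + x^2 + x + 1 (illustrative choice)

def serial_step(state, bit):
    """One clock of the bit-serial CRC LFSR."""
    feedback = ((state >> (WIDTH - 1)) & 1) ^ bit
    state = (state << 1) & 0xFF
    return state ^ (POLY if feedback else 0)

def serial_steps(state, data, w):
    """w serial clocks, consuming the input bits MSB first."""
    for k in reversed(range(w)):
        state = serial_step(state, (data >> k) & 1)
    return state

def transition_tables(w=8):
    """Derive F and G of the state transition equation by linearity:
    column i of F is the effect of state bit i alone, and column j of G
    the effect of input bit j alone."""
    F = [serial_steps(1 << i, 0, w) for i in range(WIDTH)]
    G = [serial_steps(0, 1 << j, w) for j in range(w)]
    return F, G

def gf2_apply(columns, value):
    """Multiply a GF(2) matrix (given as columns) by a bit vector: XOR the
    columns selected by the set bits of `value`."""
    out = 0
    for i, col in enumerate(columns):
        if (value >> i) & 1:
            out ^= col
    return out

def parallel_step(state, byte, F, G):
    """Consume 8 input bits in a single step via the precomputed equations."""
    return gf2_apply(F, state) ^ gf2_apply(G, byte)
```

    In hardware, each output bit of F and G becomes one XOR tree, which is exactly the combinational circuit the STE method emits.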

  10. Procedure for the automatic mesh generation of innovative gear teeth

    Directory of Open Access Journals (Sweden)

    Radicella Andrea Chiaramonte

    2016-01-01

    After describing gear wheels with teeth whose two sides are constituted by different involutes, and their importance in engineering applications, we stress the need for an efficient procedure for the automatic mesh generation of innovative gear teeth. First, we describe the procedure for subdividing the tooth profile in the various possible cases; then we show the method for creating the subdivision mesh, defined by two series of curves called meridians and parallels. Finally, we describe how the above procedure for automatic mesh generation is able to solve specific cases that may arise when dealing with teeth whose two sides are constituted by different involutes.

  11. Automatic Generation of Network Protocol Gateways

    DEFF Research Database (Denmark)

    Bromberg, Yérom-David; Réveillère, Laurent; Lawall, Julia

    2009-01-01

    , however, requires an intimate knowledge of the relevant protocols and a substantial understanding of low-level network programming, which can be a challenge for many application programmers. This paper presents a generative approach to gateway construction, z2z, based on a domain-specific language...... for describing protocol behaviors, message structures, and the gateway logic.  Z2z includes a compiler that checks essential correctness properties and produces efficient code. We have used z2z to develop a number of gateways, including SIP to RTSP, SLP to UPnP, and SMTP to SMTP via HTTP, involving a range...... of issues common to protocols used in the home.  Our evaluation of these gateways shows that z2z enables communication between incompatible devices without increasing the overall resource usage or response time....

  12. MEMOPS: data modelling and automatic code generation.

    Science.gov (United States)

    Fogh, Rasmus H; Boucher, Wayne; Ionides, John M C; Vranken, Wim F; Stevens, Tim J; Laue, Ernest D

    2010-03-25

    In recent years the amount of biological data has exploded to the point where much useful information can only be extracted by complex computational analyses. Such analyses are greatly facilitated by metadata standards, both in terms of the ability to compare data originating from different sources, and in terms of exchanging data in standard forms, e.g. when running processes on a distributed computing infrastructure. However, standards thrive on stability whereas science tends to constantly move, with new methods being developed and old ones modified. Therefore maintaining both metadata standards, and all the code that is required to make them useful, is a non-trivial problem. Memops is a framework that uses an abstract definition of the metadata (described in UML) to generate internal data structures and subroutine libraries for data access (application programming interfaces--APIs--currently in Python, C and Java) and data storage (in XML files or databases). For the individual project these libraries obviate the need for writing code for input parsing, validity checking or output. Memops also ensures that the code is always internally consistent, massively reducing the need for code reorganisation. Across a scientific domain a Memops-supported data model makes it easier to support complex standards that can capture all the data produced in a scientific area, share them among all programs in a complex software pipeline, and carry them forward to deposition in an archive. The principles behind the Memops generation code will be presented, along with example applications in Nuclear Magnetic Resonance (NMR) spectroscopy and structural biology.
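
    A drastically reduced sketch of the approach: from an abstract model, emit accessor code with built-in validity checking, so that no input-parsing or checking code is written by hand. The model format and the generated class below are invented; Memops works from UML and generates full APIs in Python, C and Java:

```python
# Invented miniature metadata model (standing in for the UML input):
# a class name plus typed attributes.
MODEL = {"name": "Peak", "attributes": [("position", float), ("height", float)]}

def generate_class(model):
    """Emit Python source for a class with type-checked set/get accessors,
    one pair per modelled attribute."""
    lines = [f"class {model['name']}:"]
    for attr, typ in model["attributes"]:
        lines += [
            f"    def set_{attr}(self, value):",
            f"        if not isinstance(value, {typ.__name__}):",
            f"            raise TypeError('{attr} must be {typ.__name__}')",
            f"        self._{attr} = value",
            f"    def get_{attr}(self):",
            f"        return self._{attr}",
        ]
    return "\n".join(lines)

# Compile the generated source and pull out the class, as a generated
# subroutine library would be imported.
namespace = {}
exec(generate_class(MODEL), namespace)
Peak = namespace["Peak"]
```

    Regenerating from a changed model keeps every accessor consistent with the metadata automatically, which is the maintenance benefit the abstract describes.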

  13. Automatic Generation and Ranking of Questions for Critical Review

    Science.gov (United States)

    Liu, Ming; Calvo, Rafael A.; Rus, Vasile

    2014-01-01

    Critical review skill is one important aspect of academic writing. Generic trigger questions have been widely used to support this activity. When students have a concrete topic in mind, trigger questions are less effective if they are too general. This article presents a learning-to-rank based system which automatically generates specific trigger…

  14. Automatic Generation of Tests from Domain and Multimedia Ontologies

    Science.gov (United States)

    Papasalouros, Andreas; Kotis, Konstantinos; Kanaris, Konstantinos

    2011-01-01

    The aim of this article is to present an approach for generating tests in an automatic way. Although other methods have been already reported in the literature, the proposed approach is based on ontologies, representing both domain and multimedia knowledge. The article also reports on a prototype implementation of this approach, which…

  16. A quick scan on possibilities for automatic metadata generation

    NARCIS (Netherlands)

    Benneker, Frank

    2006-01-01

    The Quick Scan is a report on research into useable solutions for automatic generation of metadata or parts of metadata. The aim of this study is to explore possibilities for facilitating the process of attaching metadata to learning objects. This document is aimed at developers of digital learning

  17. Automatic generation of gene finders for eukaryotic species

    DEFF Research Database (Denmark)

    Terkelsen, Kasper Munch; Krogh, A.

    2006-01-01

    Background The number of sequenced eukaryotic genomes is rapidly increasing. This means that over time it will be hard to keep supplying customised gene finders for each genome. This calls for procedures to automatically generate species-specific gene finders and to re-train them as the quantity...

  18. Mppsocgen: A framework for automatic generation of mppsoc architecture

    CERN Document Server

    Kallel, Emna; Baklouti, Mouna; Abid, Mohamed

    2012-01-01

    Automatic code generation is a standard method in software engineering since it improves code consistency and reduces the overall development time. In this context, this paper presents a design flow for automatic VHDL code generation of mppSoC (massively parallel processing System-on-Chip) configurations. Depending on the application requirements, a framework built on the NetBeans Platform, named MppSoCGEN, was developed in order to accelerate the design process of complex mppSoC. Starting from the architecture parameters, VHDL code is automatically generated using a parsing method. Configuration rules are proposed to ensure a correct and valid VHDL configuration. Finally, processor element and network topology models of the mppSoC architecture are automatically generated for the Stratix II device family. The framework runs on NetBeans 5.5 on a 2 GHz Centrino Duo Core, with 22 Kbytes memory and 3 seconds average runtime. Experimental results for reduction al...

  20. Automatic generation of matter-of-opinion video documentaries

    OpenAIRE

    Bocconi, S.; Nack, Frank; Hardman, Hazel Lynda

    2008-01-01

    In this paper we describe a model for automatically generating video documentaries. This allows viewers to specify the subject and the point of view of the documentary to be generated. The domain is matter-of-opinion documentaries based on interviews. The model combines rhetorical presentation patterns used by documentary makers with a data-driven approach. Rhetorical presentation patterns provide the viewer with an engaging viewing experience, while a data-driven approach can be applied to g...

  1. Automatic control system generation for robot design validation

    Science.gov (United States)

    Bacon, James A. (Inventor); English, James D. (Inventor)

    2012-01-01

    The specification and drawings present a new method, system, software product, and apparatus for generating a robotic validation system for a robot design. The robotic validation system for the robot design of a robotic system is automatically generated by converting the robot design into a generic robotic description using a predetermined format, then generating a control system from the generic robotic description, and finally updating the robot design parameters of the robotic system with an analysis tool, using both the generic robot description and the control system.

  2. Automatically generating procedure code and database maintenance scripts

    Energy Technology Data Exchange (ETDEWEB)

    Hatley, J.W. [Sandia National Labs., Albuquerque, NM (United States). Information Technologies and Methodologies Dept.

    1994-10-01

    Over the past couple of years the Information Technology Department at Sandia Laboratories has developed software to automatically generate database/4GL procedure code and database maintenance scripts based on database table information. With this software, developers simply enter table and referential integrity information and the software generates code and scripts as required. The generated procedure code includes simple insert/delete/update procedures, transaction logging procedures, and referential integrity procedures. The generated database maintenance scripts include scripts to modify structures, update remote databases, create views, and create indexes. Additionally, the software can generate EPSI representations of Binder diagrams for the tables. This paper discusses the software and its use in real-world applications. The automated generation of procedure code and maintenance scripts allows developers to concentrate on the development of user interface code. The technique involves generating database/4GL procedure code and maintenance scripts automatically from the database table information. The procedure code provides standard insert/update/delete interfaces for upper-level code and enforces the data constraints defined in the information model. The generated scripts cover both routine database maintenance and migration. This has resulted in fully updated database applications, with complete rules enforcement and database maintenance scripts, within days of a database modification.
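
    The flavour of the technique, table metadata in and boilerplate out, can be shown in a reduced sketch. The real system emitted database/4GL procedures; this invented example emits plain parameterized SQL from column metadata:

```python
# Invented table metadata (the "table and referential integrity information"
# a developer would enter).
TABLE = {
    "name": "employee",
    "columns": [("id", "INTEGER"), ("name", "VARCHAR(80)"), ("dept_id", "INTEGER")],
    "primary_key": "id",
}

def gen_insert(table):
    """Emit a parameterized INSERT covering every column."""
    cols = [c for c, _ in table["columns"]]
    placeholders = ", ".join(f":{c}" for c in cols)
    return f"INSERT INTO {table['name']} ({', '.join(cols)}) VALUES ({placeholders});"

def gen_delete(table):
    """Emit a DELETE keyed on the primary key."""
    pk = table["primary_key"]
    return f"DELETE FROM {table['name']} WHERE {pk} = :{pk};"

def gen_update(table):
    """Emit an UPDATE of every non-key column, keyed on the primary key."""
    pk = table["primary_key"]
    sets = ", ".join(f"{c} = :{c}" for c, _ in table["columns"] if c != pk)
    return f"UPDATE {table['name']} SET {sets} WHERE {pk} = :{pk};"
```

    Because every generated statement is derived from the same metadata, a table change only requires regeneration, which is what kept the applications fully updated within days of a schema modification.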

  3. Towards Automatic Personalized Content Generation for Platform Games

    DEFF Research Database (Denmark)

    Shaker, Noor; Yannakakis, Georgios N.; Togelius, Julian

    2010-01-01

    In this paper, we show that personalized levels can be automatically generated for platform games. We build on previous work, where models were derived that predicted player experience based on features of level design and on playing styles. These models are constructed using preference learning ... The adaptation mechanism is evaluated using both algorithmic and human players. The results indicate that the adaptation mechanism effectively optimizes level design parameters for particular players.

  4. Progressive Concept Evaluation Method for Automatically Generated Concept Variants

    Directory of Open Access Journals (Sweden)

    Woldemichael Dereje Engida

    2014-07-01

    Full Text Available Conceptual design is one of the most critical and important phases of the design process, yet it has the least computer support. The Conceptual Design Support Tool (CDST) is a system developed to automatically generate concepts for each subfunction in a functional structure. The automated concept generation process results in a large number of concept variants, which require a thorough evaluation process to select the best design. To address this, a progressive concept evaluation technique consisting of absolute comparison, concept screening and a weighted decision matrix using the analytic hierarchy process (AHP) is proposed to eliminate infeasible concepts at each stage. A software implementation of the proposed method is demonstrated.

  5. [Automatic analysis pipeline of next-generation sequencing data].

    Science.gov (United States)

    Wenke, Li; Fengyu, Li; Siyao, Zhang; Bin, Cai; Na, Zheng; Yu, Nie; Dao, Zhou; Qian, Zhao

    2014-06-01

    The development of next-generation sequencing has generated high demand for data processing and analysis. Although many software tools are available for analyzing next-generation sequencing data, most are designed for one specific function (e.g., alignment, variant calling or annotation). It is therefore necessary to combine them for data analysis and to generate interpretable results for biologists. This study designed a pipeline to process Illumina sequencing data, based on the Perl programming language and the SGE system. The pipeline takes original sequence data (fastq format) as input, calls standard data processing software (e.g., BWA, Samtools, GATK, and Annovar), and finally outputs a list of annotated variants that researchers can analyze further. The pipeline simplifies manual operation and improves efficiency through automation and parallel computation. Users can easily run the pipeline by editing the configuration file or clicking the graphical interface. Our work will facilitate research projects using sequencing technology.
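The chaining idea can be sketched as a config-driven runner. A minimal Python sketch (the original pipeline is written in Perl and submits jobs to SGE; the command templates below are illustrative placeholders, not the exact BWA/Samtools/GATK invocations):

```python
def build_pipeline(config):
    """Turn an ordered step configuration into a list of shell commands."""
    steps = {
        "align": "bwa mem {ref} {fastq} > {sample}.sam",
        "sort":  "samtools sort {sample}.sam -o {sample}.bam",
        "call":  "gatk HaplotypeCaller -I {sample}.bam -O {sample}.vcf",
    }
    # each command template is filled from the shared configuration
    return [steps[name].format(**config) for name in config["order"]]

cmds = build_pipeline({
    "order": ["align", "sort", "call"],
    "ref": "hg19.fa", "fastq": "reads.fq", "sample": "s1",
})
print(cmds[0])  # bwa mem hg19.fa reads.fq > s1.sam
```

A real runner would submit each command to the scheduler and wait for completion before starting the next step.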

  6. Automatic code generation from the OMT-based dynamic model

    Energy Technology Data Exchange (ETDEWEB)

    Ali, J.; Tanaka, J.

    1996-12-31

    The OMT object-oriented software development methodology suggests creating three models of the system, i.e., object model, dynamic model and functional model. We have developed a system that automatically generates implementation code from the dynamic model. The system first represents the dynamic model as a table and then generates executable Java language code from it. We used inheritance for super-substate relationships. We considered that transitions relate to states in a state diagram exactly as operations relate to classes in an object diagram. In the generated code, each state in the state diagram becomes a class and each event on a state becomes an operation on the corresponding class. The system is implemented and can generate executable code for any state diagram. This makes the role of the dynamic model more significant and the job of designers even simpler.
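The state-to-class mapping can be sketched directly. Here is a minimal Python illustration of the idea (the actual system emits Java source; the transition table is a toy example): each state becomes a generated class and each event becomes a method returning the next state.

```python
def generate_state_classes(transitions):
    """transitions: {state: {event: next_state}} -> dict of generated classes."""
    classes = {}
    for state, events in transitions.items():
        # each event becomes a method on the generated class
        methods = {
            event: (lambda self, nxt=nxt: nxt)
            for event, nxt in events.items()
        }
        classes[state] = type(state, (object,), methods)
    return classes

classes = generate_state_classes({
    "Idle":    {"start": "Running"},
    "Running": {"stop": "Idle"},
})
idle = classes["Idle"]()
print(idle.start())  # Running
```

In the paper's scheme, super-substate relationships would additionally map to inheritance between the generated classes.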

  7. Automatic Item Generation via Frame Semantics: Natural Language Generation of Math Word Problems.

    Science.gov (United States)

    Deane, Paul; Sheehan, Kathleen

    This paper is an exploration of the conceptual issues that have arisen in the course of building a natural language generation (NLG) system for automatic test item generation. While natural language processing techniques are applicable to general verbal items, mathematics word problems are particularly tractable targets for natural language…

  8. Automatic generation of executable communication specifications from parallel applications

    Energy Technology Data Exchange (ETDEWEB)

    Pakin, Scott [Los Alamos National Laboratory]; Wu, Xing [NCSU]; Mueller, Frank [NCSU]

    2011-01-19

    Portable parallel benchmarks are widely used and highly effective for (a) the evaluation, analysis and procurement of high-performance computing (HPC) systems and (b) quantifying the potential benefits of porting applications to new hardware platforms. Yet past techniques for synthetically parameterizing hand-coded HPC benchmarks prove insufficient for today's rapidly evolving scientific codes, particularly when subject to multi-scale science modeling or when utilizing domain-specific libraries. To address these problems, this work contributes novel methods to automatically generate highly portable and customizable communication benchmarks from HPC applications. We utilize ScalaTrace, a lossless yet scalable parallel application tracing framework, to collect selected aspects of the run-time behavior of HPC applications, including communication operations and execution time, while abstracting away the details of the computation proper. We subsequently generate benchmarks with identical run-time behavior from the collected traces. A unique feature of our approach is that we generate benchmarks in CONCEPTUAL, a domain-specific language that enables the expression of sophisticated communication patterns using a rich and easily understandable grammar, yet compiles to ordinary C + MPI. Experimental results demonstrate that the generated benchmarks preserve the run-time behavior, including both the communication pattern and the execution time, of the original applications. Such automated benchmark generation is particularly valuable for proprietary, export-controlled, or classified application codes: when supplied to a third party, our auto-generated benchmarks ensure performance fidelity without the risks associated with releasing the original code. This ability to automatically generate performance-accurate benchmarks from parallel applications is novel and, to our knowledge, without precedent.

  9. Visual definition of procedures for automatic virtual scene generation

    CERN Document Server

    Lucanin, Drazen

    2012-01-01

    With more and more digital media, especially in the field of virtual reality where detailed and convincing scenes are much in demand, procedural scene generation is a great help to artists. A problem is that defining scene descriptions through these procedures usually requires knowledge of formal language grammars and programming theory, and manually editing textual files using a strict syntax, making the approach unintuitive to use. Graphical user interfaces have made many computing tasks easier to perform, and out of the belief that creating computer programs can be one of them, visual programming languages (VPLs) have emerged. The goal of VPLs is to shift more work from the programmer to the integrated development environment (IDE), making programming a more user-friendly task. In this thesis, an approach to using a VPL for defining procedures that automatically generate virtual scenes is presented. The methods required to build a VPL are presented, including a novel method of generating read...

  10. Automatic Generation of 3D Building Models with Multiple Roofs

    Institute of Scientific and Technical Information of China (English)

    Kenichi Sugihara; Yoshitugu Hayashi

    2008-01-01

    Based on building footprints (building polygons) on digital maps, we propose a GIS- and CG-integrated system that automatically generates 3D building models with multiple roofs. Most building polygons' edges meet at right angles (orthogonal polygons). The integrated system partitions orthogonal building polygons into a set of rectangles and places rectangular roofs and box-shaped building bodies on these rectangles. In order to partition an orthogonal polygon, we propose a useful polygon expression for deciding from which vertex a dividing line is drawn. In this paper, we propose a new scheme for partitioning building polygons and show the process of creating 3D roof models.
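A greedy version of such a rectangle partition can be sketched on a boolean footprint grid. This is a simplification for illustration only; the paper's scheme works on the polygon's vertex expression rather than a raster grid:

```python
def partition_into_rectangles(grid):
    """Greedy partition of an orthogonal footprint (boolean grid) into rectangles."""
    rects = []
    cells = {(r, c) for r, row in enumerate(grid) for c, v in enumerate(row) if v}
    while cells:
        r0, c0 = min(cells)          # top-left remaining cell
        c1 = c0                      # grow the rectangle to the right
        while (r0, c1 + 1) in cells:
            c1 += 1
        r1 = r0                      # grow downward while the full span is present
        while all((r1 + 1, c) in cells for c in range(c0, c1 + 1)):
            r1 += 1
        rects.append((r0, c0, r1, c1))
        for r in range(r0, r1 + 1):
            for c in range(c0, c1 + 1):
                cells.remove((r, c))
    return rects

# an L-shaped footprint
rects = partition_into_rectangles([
    [1, 1, 0],
    [1, 1, 0],
    [1, 1, 1],
])
print(rects)  # [(0, 0, 2, 1), (2, 2, 2, 2)]
```

Each rectangle would then receive a box-shaped body and a rectangular roof.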

  11. Automatic structures and growth functions for finitely generated abelian groups

    CERN Document Server

    Kamei, Satoshi

    2011-01-01

    In this paper, we consider the formal power series whose n-th coefficient is the number of copies of a given finite graph in the ball of radius n centred at the identity element in the Cayley graph of a finitely generated group and call it the growth function. Epstein, Iano-Fletcher and Uri Zwick proved that the growth function is a rational function if the group has a geodesic automatic structure. We compute the growth function in the case where the group is abelian and see that the denominator of the rational function is determined from the rank of the group.
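As a concrete instance, the ball sizes of Z^2 with the standard generating set are |B(n)| = 2n^2 + 2n + 1, whose growth series sums to the rational function (1+x)^2/(1-x)^3. The sketch below computes these coefficients by direct counting; it is a worked example of the rationality claim, not the paper's general construction:

```python
def ball_size(n, rank=2):
    """Count points of Z^rank within word-metric (L1) distance n of the identity."""
    if rank == 1:
        return 2 * n + 1
    # split off the first coordinate x and recurse on the remaining rank-1 coords
    return sum(ball_size(n - abs(x), rank - 1) for x in range(-n, n + 1))

sizes = [ball_size(n) for n in range(5)]
print(sizes)  # [1, 5, 13, 25, 41] -- matches 2n^2 + 2n + 1
```

A quadratic closed form in n is exactly what a rational generating function with denominator (1-x)^3 predicts.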

  12. Automatic generation of gene finders for eukaryotic species

    DEFF Research Database (Denmark)

    Terkelsen, Kasper Munch; Krogh, A.

    2006-01-01

    Background: The number of sequenced eukaryotic genomes is rapidly increasing. This means that over time it will be hard to keep supplying customised gene finders for each genome. This calls for procedures to automatically generate species-specific gene finders and to re-train them as the quantity ... structure blocks using acyclic discrete phase type distributions. The state structure of each HMM is generated dynamically from an array of sub-models to include only gene features represented in the training set ... length distributions. The performance of each individual gene predictor on each individual genome is comparable to the best of the manually optimised species-specific gene finders. It is shown that species-specific gene finders are superior to gene finders trained on other species. Conclusion: Acyclic discrete phase type distributions are well suited to model sequence ...

  13. Automatic Tamil lyric generation based on ontological interpretation for semantics

    Indian Academy of Sciences (India)

    Rajeswari Sridhar; D Jalin Gladis; Kameswaran Ganga; G Dhivya Prabha

    2014-02-01

    This paper proposes an n-gram-based approach to automatic Tamil lyric generation through ontological semantic interpretation of the input scene. The approach is based on identifying the semantics conveyed in the scenario, thereby making the system understand the situation and generate lyrics accordingly. The heart of the system is the ontological interpretation of the scenario and the selection of the appropriate tri-grams for generating the lyrics. To this end, we designed a new ontology with weighted edges, where the edges correspond to a set of sentences which indicate a relationship and are represented as tri-grams. Once the appropriate tri-grams are selected, the root words from these tri-grams are sent to the morphological generator to form words in their packed form. These words are then assembled to form the final lyrics. Poetic parameters such as rhyme, alliteration, simile and vocative words are also handled by the system. Using this approach, we achieved an average accuracy of 77.3% with respect to the exact semantic details conveyed in the generated lyrics.
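The tri-gram selection step can be illustrated with a generic trigram text generator. This toy Python sketch ignores the ontology weighting, Tamil morphology and rhyme handling described above; the corpus is a made-up example:

```python
import random

def build_trigrams(tokens):
    """Map each word pair to the words that follow it in the corpus."""
    model = {}
    for a, b, c in zip(tokens, tokens[1:], tokens[2:]):
        model.setdefault((a, b), []).append(c)
    return model

def generate(model, seed, length, rng):
    """Extend the seed pair by sampling successors until no trigram matches."""
    out = list(seed)
    for _ in range(length):
        nxt = model.get((out[-2], out[-1]))
        if not nxt:
            break
        out.append(rng.choice(nxt))
    return out

corpus = "the river flows to the sea and the river sings".split()
model = build_trigrams(corpus)
print(generate(model, ("the", "river"), 3, random.Random(0)))
```

In the paper, the choice among candidate tri-grams would be driven by the weighted ontology edges rather than uniform sampling.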

  14. Automatic Model Generation Framework for Computational Simulation of Cochlear Implantation

    DEFF Research Database (Denmark)

    Mangado Lopez, Nerea; Ceresa, Mario; Duchateau, Nicolas

    2016-01-01

    Recent developments in computational modeling of cochlear implantation promise to enable in silico study of implant performance before surgery. However, creating a complete computational model of the patient's anatomy that includes the external device geometry remains challenging. To address this challenge, we propose an automatic framework for the generation of patient-specific meshes for finite element modeling of the implanted cochlea. First, a statistical shape model is constructed from high-resolution anatomical μCT images. Then, by fitting the statistical model to a patient's CT image, an accurate model of the patient-specific cochlea anatomy is obtained. An algorithm based on the parallel transport frame is employed to perform the virtual insertion of the cochlear implant. Our automatic framework also incorporates the surrounding bone and nerve fibers and assigns...

  15. Automatic Mesh Generation on a Regular Background Grid

    Institute of Scientific and Technical Information of China (English)

    LO S.H; 刘剑飞

    2002-01-01

    This paper presents an automatic mesh generation procedure on a 2D domain based on a regular background grid. The idea is to devise a robust mesh generation scheme with equal emphasis on quality and efficiency. Instead of a traditional regular rectangular grid, a mesh of equilateral triangles is employed to ensure that triangular elements of the best quality are preserved in the interior of the domain. The boundary is generated by a node/segment insertion process. Nodes are inserted into the background mesh one by one, following the sequence of the domain boundary. The local structure of the mesh is modified based on the Delaunay criterion with the introduction of each node. Boundary segments that are not produced in the node insertion phase are recovered through a systematic element swap process. Two theorems are presented and proved to establish the theoretical basis of the boundary recovery part. Examples are presented to demonstrate the robustness and the quality of the mesh generated by the proposed technique.
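The Delaunay criterion used during node insertion reduces to a sign test: a point d lies inside the circumcircle of a counter-clockwise triangle (a, b, c) exactly when a 3x3 determinant is positive. A minimal sketch of that standard predicate:

```python
def in_circumcircle(a, b, c, d):
    """True if d is strictly inside the circumcircle of CCW triangle (a, b, c)."""
    ax, ay = a[0] - d[0], a[1] - d[1]
    bx, by = b[0] - d[0], b[1] - d[1]
    cx, cy = c[0] - d[0], c[1] - d[1]
    det = (
        (ax * ax + ay * ay) * (bx * cy - cx * by)
        - (bx * bx + by * by) * (ax * cy - cx * ay)
        + (cx * cx + cy * cy) * (ax * by - bx * ay)
    )
    return det > 0

tri = ((0, 0), (1, 0), (0, 1))            # counter-clockwise
print(in_circumcircle(*tri, (0.5, 0.5)))  # True  -- circumcentre itself
print(in_circumcircle(*tri, (2, 2)))      # False -- well outside
```

When the predicate fires for a newly inserted node, the shared edge of the offending triangle pair is swapped, which is the local modification step the abstract describes.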

  16. Spline-based automatic path generation of welding robot

    Institute of Scientific and Technical Information of China (English)

    Niu Xuejuan; Li Liangyu

    2007-01-01

    This paper presents a flexible method for the representation of weld seams based on spline interpolation. With this method, the tool path of a welding robot can be generated automatically from a 3D CAD model. The technique has been implemented and demonstrated in the FANUC Arc Welding Robot Workstation. Based on the method, a software system was developed using the VBA of SolidWorks 2006. It offers an interface between SolidWorks and ROBOGUIDE, the off-line programming software for FANUC robots, combining the strong modeling capability of the former with the simulation capability of the latter. It is also able to communicate with the on-line robot. Experimental results have shown high accuracy and strong reliability. This method will improve the intelligence and flexibility of the welding robot workstation.

  17. Automatic Generation of OWL Ontology from XML Data Source

    CERN Document Server

    Yahia, Nora; Ahmed, AbdelWahab

    2012-01-01

    The eXtensible Markup Language (XML) can be used as data exchange format in different domains. It allows different parties to exchange data by providing common understanding of the basic concepts in the domain. XML covers the syntactic level, but lacks support for reasoning. Ontology can provide a semantic representation of domain knowledge which supports efficient reasoning and expressive power. One of the most popular ontology languages is the Web Ontology Language (OWL). It can represent domain knowledge using classes, properties, axioms and instances for the use in a distributed environment such as the World Wide Web. This paper presents a new method for automatic generation of OWL ontology from XML data sources.

  18. Spike Detection Based on Normalized Correlation with Automatic Template Generation

    Directory of Open Access Journals (Sweden)

    Wen-Jyi Hwang

    2014-06-01

    A novel feedback-based spike detection algorithm for noisy spike trains is presented in this paper. It uses information extracted from spike classification results to enhance spike detection. The algorithm performs template matching for spike detection with a normalized correlator. The detected spikes are then sorted by the OSort algorithm. The mean of the spikes in each cluster produced by OSort is used as the template of the normalized correlator for subsequent detection. The automatic generation and updating of templates makes spike detection robust to input trains with various spike waveforms and noise levels. Experimental results show that the proposed algorithm, operating in conjunction with OSort, is an efficient design for attaining high detection and classification accuracy in spike sorting.
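The normalized-correlation detector can be sketched as follows; the OSort clustering and template-update loop are omitted, and the template and signal are toy data:

```python
import math

def normalized_correlation(signal, template):
    """Correlate the template against every window; scores lie in [-1, 1]."""
    m = len(template)
    t_mean = sum(template) / m
    t = [x - t_mean for x in template]
    t_norm = math.sqrt(sum(x * x for x in t))
    scores = []
    for i in range(len(signal) - m + 1):
        w = signal[i:i + m]
        w_mean = sum(w) / m
        w = [x - w_mean for x in w]
        w_norm = math.sqrt(sum(x * x for x in w)) or 1.0  # guard flat windows
        scores.append(sum(a * b for a, b in zip(t, w)) / (t_norm * w_norm))
    return scores

template = [0.0, 1.0, 3.0, 1.0, 0.0]
signal = [0.1, 0.0, 0.2, 1.1, 3.2, 0.9, 0.1, 0.0]
scores = normalized_correlation(signal, template)
peak = max(range(len(scores)), key=scores.__getitem__)
print(peak)  # 2 -- the window starting at index 2 aligns with the spike
```

In the full algorithm, windows whose score exceeds a threshold are declared spikes, and the per-cluster means returned by OSort replace the template over time.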

  19. Automatic Generation of Symbolic Model for Parameterized Synchronous Systems

    Institute of Scientific and Technical Information of China (English)

    Wei-Wen Xu

    2004-01-01

    With the purpose of making the verification of parameterized systems more general and easier, in this paper a new and intuitive language, PSL (Parameterized-system Specification Language), is proposed to specify a class of parameterized synchronous systems. From a PSL script, an automatic method is proposed to generate a constraint-based symbolic model. The model can concisely and symbolically represent collections of global states by counting the number of processes in a given state. Moreover, a theorem has been proved that there is a simulation relation between the original system and its symbolic model. Since abstract and symbolic techniques are exploited in the symbolic model, the state-explosion problem of traditional verification methods is efficiently avoided. Based on the proposed symbolic model, a reachability analysis procedure is implemented using ANSI C++ on a UNIX platform. Thus, a complete tool for verifying parameterized synchronous systems is obtained and tested on several cases. The experimental results show that the method is satisfactory.

  20. Hybrid Generative/Discriminative Learning for Automatic Image Annotation

    CERN Document Server

    Yang, Shuang Hong; Zha, Hongyuan

    2012-01-01

    Automatic image annotation (AIA) raises tremendous challenges to machine learning as it requires modeling of data that are ambiguous in both input and output, e.g., images containing multiple objects and labeled with multiple semantic tags. Even more challenging is that the number of candidate tags is usually huge (as large as the vocabulary size) yet each image is related to only a few of them. This paper presents a hybrid generative-discriminative classifier to simultaneously address the extreme data-ambiguity and overfitting-vulnerability issues in tasks such as AIA. In particular: (1) an Exponential-Multinomial Mixture (EMM) model is established to capture both the input and output ambiguity while encouraging prediction sparsity; and (2) the prediction ability of the EMM model is explicitly maximized through discriminative learning that integrates variational inference of graphical models and the pairwise formulation of ordinal regression. Experiments show that our approach achieves both su...

  1. Automatic generation of matrix element derivatives for tight binding models

    Science.gov (United States)

    Elena, Alin M.; Meister, Matthias

    2005-10-01

    Tight binding (TB) models are one approach to the quantum mechanical many-particle problem. An important role in TB models is played by hopping and overlap matrix elements between the orbitals on two atoms, which of course depend on the relative positions of the atoms involved. This dependence can be expressed with the help of Slater-Koster parameters, which are usually taken from tables. Recently, a way to generate these tables automatically was published. If TB approaches are applied to simulations of the dynamics of a system, also derivatives of matrix elements can appear. In this work we give general expressions for first and second derivatives of such matrix elements. Implemented in a tight binding computer program, like, for instance, DINAMO, they obviate the need to type all the required derivatives of all occurring matrix elements by hand.
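One generic way to obviate hand-typed derivatives is forward-mode automatic differentiation with dual numbers. The sketch below is an alternative illustration of the idea, not the analytic Slater-Koster derivative expressions the paper derives, and the hopping function is a made-up toy:

```python
class Dual:
    """Number a + b*eps with eps^2 = 0; the b component carries the derivative."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        # product rule encoded in the dual component
        return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
    __rmul__ = __mul__

def hopping(r):
    """Toy distance-dependent matrix element t(r) = r^2 + 3r (illustrative)."""
    return r * r + 3 * r

x = Dual(2.0, 1.0)   # seed derivative dr/dr = 1
y = hopping(x)
print(y.val, y.der)  # 10.0 7.0  -- t(2) = 10, t'(2) = 2*2 + 3 = 7
```

Evaluating the matrix element once with a dual argument yields its value and first derivative together, with no hand-derived formulas.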

  2. Intelligent control schemes applied to Automatic Generation Control

    Directory of Open Access Journals (Sweden)

    Dingguo Chen

    2016-04-01

    Integrating an ever-increasing amount of renewable generating resources into interconnected power systems has created new challenges to the safety and reliability of today's power grids and posed new questions in power system modeling, analysis and control. Automatic Generation Control (AGC) must be extended to accommodate the control of renewable generating assets. In addition, AGC is mandated to operate in accordance with the NERC Control Performance Standard (CPS) criteria, which allow greater flexibility in relaxing the control of generating resources while assuring the stability and reliability of interconnected power systems when each balancing authority operates in full compliance. Traditional AGC must be enhanced in several respects to meet these challenges. This paper provides a systematic, mathematical formulation of AGC, as a first attempt in the context of meeting the NERC CPS requirements and integrating renewable generating assets, which to the best of the authors' knowledge has not been reported in the literature. Furthermore, this paper proposes neural-network-based predictive control schemes for AGC. The proposed controller can handle complicated nonlinear dynamics, in contrast to the conventional Proportional Integral (PI) controller, which is typically most effective for linear dynamics. The neural controller is designed to control system generation in a relaxed manner, so the ACE is driven to a desired range instead of to zero, which would otherwise increase control effort and cost; most importantly, the resulting system control performance meets the NERC CPS requirements and/or the NERC Balancing Authority ACE Limit (BAAL) compliance requirements, whichever are applicable.
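The relaxed control idea, driving the ACE back into a permitted band rather than to zero, can be sketched in a few lines. The band width and gain below are illustrative values, not NERC parameters, and a real AGC cycle involves much more than this:

```python
def relaxed_agc_step(ace, band=50.0, gain=0.1):
    """Return the regulation adjustment (MW) for one AGC cycle.

    ACE inside [-band, band] triggers no action; outside the band, only the
    excess beyond the band edge is pushed back, reducing control effort.
    """
    if abs(ace) <= band:
        return 0.0
    excess = ace - band if ace > 0 else ace + band
    return -gain * excess

print(relaxed_agc_step(30.0))   # 0.0  -- within band, no control effort
print(relaxed_agc_step(150.0))  # negative -- reduce generation
```

A conventional PI controller would instead act on the full ACE every cycle, driving it toward zero.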

  3. Provenance-Powered Automatic Workflow Generation and Composition

    Science.gov (United States)

    Zhang, J.; Lee, S.; Pan, L.; Lee, T. J.

    2015-12-01

    In recent years, scientists have learned how to codify tools into reusable software modules that can be chained into multi-step executable workflows. Existing scientific workflow tools, created by computer scientists, require domain scientists to meticulously design their multi-step experiments before analyzing data. However, this is often contrary to a domain scientist's daily routine of conducting research and exploration. We hope to resolve this dispute. Imagine this: an Earth scientist starts her day applying NASA Jet Propulsion Laboratory (JPL) published climate data processing algorithms over ARGO deep ocean temperature and AMSRE sea surface temperature datasets. Throughout the day, she tunes the algorithm parameters to study various aspects of the data. Suddenly, she notices some interesting results. She then turns to a computer scientist and asks, "can you reproduce my results?" By tracking and reverse engineering her activities, the computer scientist creates a workflow. The Earth scientist can now rerun the workflow to validate her findings, modify the workflow to discover further variations, or publish the workflow to share the knowledge. In this way, we aim to revolutionize computer-supported Earth science. We have developed a prototype system to realize this vision in the context of service-oriented science. We have studied how Earth scientists conduct service-oriented data analytics research in their daily work, developed a provenance model to record their activities, and developed a technology to automatically generate workflows from user behavior, supporting the adaptation and reuse of these workflows for replicating and improving scientific studies. A data-centric repository infrastructure is established to capture richer provenance and further facilitate collaboration in the science community. We have also established a Petri-net-based verification instrument for provenance-based automatic workflow generation and recommendation.

  4. Reaction Mechanism Generator: Automatic construction of chemical kinetic mechanisms

    Science.gov (United States)

    Gao, Connie W.; Allen, Joshua W.; Green, William H.; West, Richard H.

    2016-06-01

    Reaction Mechanism Generator (RMG) constructs kinetic models composed of elementary chemical reaction steps using a general understanding of how molecules react. Species thermochemistry is estimated through Benson group additivity and reaction rate coefficients are estimated using a database of known rate rules and reaction templates. At its core, RMG relies on two fundamental data structures: graphs and trees. Graphs are used to represent chemical structures, and trees are used to represent thermodynamic and kinetic data. Models are generated using a rate-based algorithm which excludes species from the model based on reaction fluxes. RMG can generate reaction mechanisms for species involving carbon, hydrogen, oxygen, sulfur, and nitrogen. It also has capabilities for estimating transport and solvation properties, and it automatically computes pressure-dependent rate coefficients and identifies chemically-activated reaction paths. RMG is an object-oriented program written in Python, which provides a stable, robust programming architecture for developing an extensible and modular code base with a large suite of unit tests. Computationally intensive functions are cythonized for speed improvements.
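The Benson group additivity estimate mentioned above amounts to summing group contributions over a molecule's group decomposition. A minimal sketch (the group labels follow Benson-style notation, but the values are placeholders, not RMG's database numbers):

```python
GROUP_VALUES = {           # enthalpy contributions, kcal/mol, illustrative only
    "C-(C)(H)3": -10.2,    # methyl group
    "C-(C)2(H)2": -4.9,    # methylene group
}

def estimate_enthalpy(groups):
    """Sum group contributions for a molecule's group decomposition."""
    return sum(GROUP_VALUES[g] * n for g, n in groups.items())

# propane decomposes into 2 methyl groups + 1 methylene group
h = estimate_enthalpy({"C-(C)(H)3": 2, "C-(C)2(H)2": 1})
print(round(h, 1))  # -25.3
```

RMG performs this lookup by matching molecular subgraphs against its tree-structured thermochemistry database rather than using literal string keys.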

  5. Reinforcement-Based Fuzzy Neural Network Control with Automatic Rule Generation

    Institute of Scientific and Technical Information of China (English)

    1999-01-01

    A reinforcement-based fuzzy neural network controller with automatic rule generation (RBFNNC) is proposed. A set of optimized fuzzy control rules can be automatically generated through reinforcement learning based on the state variables of the object system. RBFNNC was applied to a cart-pole balancing system, and simulation results show significant improvements in rule generation.

  6. [Development of a Software for Automatically Generated Contours in Eclipse TPS].

    Science.gov (United States)

    Xie, Zhao; Hu, Jinyou; Zou, Lian; Zhang, Weisha; Zou, Yuxin; Luo, Kelin; Liu, Xiangxiang; Yu, Luxin

    2015-03-01

    The automatic generation of planning targets and auxiliary contours has been achieved in Eclipse TPS 11.0. The scripting language AutoHotkey was used to develop a software tool for automatically generating contours in Eclipse TPS. The software, named Contour Auto Margin (CAM), is composed of contour operation functions, script generation visualization and script file operations. Ten cases of different cancers were separately selected; in Eclipse TPS 11.0, the scripts generated by the software could not only automatically generate contours but also perform contour post-processing. For the different cancers, there was no difference between the automatically generated contours and manually created contours. CAM is a user-friendly and powerful piece of software that can generate contours quickly and automatically in Eclipse TPS 11.0. With the help of CAM, plan preparation time is greatly reduced and the working efficiency of radiation therapy physicists is improved.

  7. An Algorithm to Automatically Generate the Combinatorial Orbit Counting Equations.

    Science.gov (United States)

    Melckenbeeck, Ine; Audenaert, Pieter; Michoel, Tom; Colle, Didier; Pickavet, Mario

    2016-01-01

    Graphlets are small subgraphs, usually containing up to five vertices, that can be found in a larger graph. Identifying the graphlets that a vertex in an explored graph touches can provide useful information about the local structure of the graph around that vertex. Actually finding all graphlets in a large graph can be time-consuming, however. As the graphlets grow in size, more distinct graphlets emerge and the time needed to find each graphlet also scales up. If it is not necessary to find each instance of each graphlet, and knowing the number of graphlets touching each node of the graph suffices, the problem is less hard. Previous research shows a way to simplify counting the graphlets: instead of looking for the graphlets needed, smaller graphlets are searched, as well as the number of common neighbors of vertices. Solving a system of equations then gives the number of times a vertex is part of each graphlet of the desired size. However, until now, equations only existed to count graphlets with 4 or 5 nodes. In this paper, two new techniques are presented. The first generates the needed equations automatically, eliminating the tedious manual work otherwise required each time an extra node is added to the graphlets. The technique is independent of the number of nodes in the graphlets and can thus be used to count larger graphlets than previously possible. The second technique gives all graphlets a unique ordering which is easily extended to name graphlets of any size. Both techniques were used to generate equations to count graphlets with 4, 5 and 6 vertices, which extends all previous results. Code can be found at https://github.com/IneMelckenbeeck/equation-generator and https://github.com/IneMelckenbeeck/graphlet-naming.
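The core counting idea, recovering graphlet counts from common-neighbor counts rather than explicit enumeration, can be illustrated for triangles (a size-3 graphlet): the number of common neighbors of an adjacent pair equals the number of triangles on that edge. A toy sketch on an undirected graph stored as adjacency sets:

```python
def triangles_per_vertex(adj):
    """Count triangles touching each vertex via common-neighbor counts."""
    counts = {v: 0 for v in adj}
    for u in adj:
        for v in adj[u]:
            if u < v:                          # visit each edge once
                common = len(adj[u] & adj[v])  # triangles on edge (u, v)
                counts[u] += common
                counts[v] += common
    # each triangle at a vertex is seen via two of that vertex's edges
    return {v: c // 2 for v, c in counts.items()}

# a square with one diagonal: triangles {0,1,2} and {0,2,3}
adj = {0: {1, 2, 3}, 1: {0, 2}, 2: {0, 1, 3}, 3: {0, 2}}
print(triangles_per_vertex(adj))  # {0: 2, 1: 1, 2: 2, 3: 1}
```

The paper's contribution is generating the analogous, much larger equation systems for graphlets of 4, 5 and 6 nodes automatically.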

  8. On A Semi-Automatic Method for Generating Composition Tables

    CERN Document Server

    Liu, Weiming

    2011-01-01

    Originating from Allen's Interval Algebra, composition-based reasoning has been widely acknowledged as the most popular reasoning technique in qualitative spatial and temporal reasoning. Given a qualitative calculus (i.e., a relation model), the first thing we should do is establish its composition table (CT). In the past three decades, such work has usually been done manually, which is undesirable and error-prone given that a calculus may contain tens or hundreds of basic relations. Computing the correct CT was identified by Tony Cohn as a challenge for computer scientists in 1995. This paper addresses the problem and introduces a semi-automatic method to compute the CT by randomly generating triples of elements. For several important qualitative calculi, our method can establish the correct CT in a reasonably short time. This is illustrated by applications to the Interval Algebra, the Region Connection Calculus RCC-8, the INDU calculus, and the Oriented Point Relation Algebras. Our method can also be us...

  9. Learning Techniques for Automatic Test Pattern Generation using Boolean Satisfiability

    Directory of Open Access Journals (Sweden)

    Liu Xin

    2013-07-01

    Automatic Test Pattern Generation (ATPG) is one of the core problems in the testing of digital circuits. ATPG algorithms based on Boolean Satisfiability (SAT) have turned out to be very powerful, thanks to great advances in the performance of satisfiability solvers for propositional logic over the last two decades. SAT-based ATPG clearly outperforms classical approaches, especially for hard-to-detect faults. However, because a generic SAT algorithm has no access to structural information or don't-care conditions, SAT-based ATPG suffers from over-specification of input patterns. In this paper we present techniques that add a layer exposing structural properties of the circuit and value justification relations to a generic SAT algorithm. The approach joins binary decision diagrams (BDDs) and SAT techniques to improve the efficiency of ATPG. It performs an inexpensive reconvergent-fanout analysis of the circuit to gather information on local signal correlation by using BDD learning, then uses the learned information to restrict and focus the overall search space of SAT-based ATPG. The learning technique is effective and lightweight. Experimental results show the effectiveness of the approach.

  10. Automatic summary generating technology of vegetable traceability for information sharing

    Science.gov (United States)

    Zhenxuan, Zhang; Minjing, Peng

    2017-06-01

    In order to solve the problems of excessive data entries and the consequent high costs of data collection faced by farmers in vegetable traceability applications, an automatic summary generating technology of vegetable traceability for information sharing is proposed. The proposed technology offers farmers an effective way to share real-time vegetable planting information on social networking platforms, enhancing their brands and attracting more customers. In this research, the factors influencing vegetable traceability for customers were analyzed to establish the sub-indicators and target indicators, and a computing model was proposed based on the collected parameter values of the planted vegetables and the legal standards on food safety. The proposed standard parameter model involves five steps: accessing the database, establishing target indicators, establishing sub-indicators, establishing a standard reference model, and computing the scores of the indicators. With the standards of food safety and traceability systems established and optimized, this technology could be accepted by more and more farmers and customers.
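
The five-step scoring model can be sketched as sub-indicators checked against standard reference ranges and aggregated into a weighted target score; the parameters, ranges, and weights below are invented for illustration:

```python
def indicator_score(value, lo, hi):
    """1.0 when a measured parameter lies inside the standard/legal range
    [lo, hi], decaying linearly to 0 outside it."""
    if lo <= value <= hi:
        return 1.0
    span = hi - lo
    dist = (lo - value) if value < lo else (value - hi)
    return max(0.0, 1.0 - dist / span)

def traceability_score(measured, standard, weights):
    """Weighted average of sub-indicator scores -> target indicator."""
    total = sum(weights.values())
    return sum(weights[k] * indicator_score(measured[k], *standard[k])
               for k in standard) / total

# Hypothetical plot parameters vs. standard reference ranges
standard = {"pesticide_mg_kg": (0.0, 0.5), "nitrate_mg_kg": (0, 400), "ph": (6.0, 7.5)}
measured = {"pesticide_mg_kg": 0.2, "nitrate_mg_kg": 450, "ph": 6.8}
weights = {"pesticide_mg_kg": 3.0, "nitrate_mg_kg": 2.0, "ph": 1.0}
print(round(traceability_score(measured, standard, weights), 3))
```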

  11. Audio watermarking technologies for automatic cue sheet generation systems

    Science.gov (United States)

    Caccia, Giuseppe; Lancini, Rosa C.; Pascarella, Annalisa; Tubaro, Stefano; Vicario, Elena

    2001-08-01

    Usually a watermark is used as a way of hiding information on digital media. The watermarked information may be used for copyright protection or for user and media identification. In this paper we propose a watermarking scheme for digital audio signals that allows automatic identification of musical pieces transmitted in TV broadcasting programs. In our application the watermark must, of course, be imperceptible to the users, should be robust to standard TV and radio editing, and must have very low complexity. This last requirement is essential to allow a software real-time implementation of watermark insertion and detection using only a minimal amount of the computation power of a modern PC. In the proposed method the input audio sequence is subdivided into frames. For each frame a watermark spread-spectrum sequence is added to the original data. A two-step filtering procedure is used to generate the watermark from a Pseudo-Noise (PN) sequence. The filters approximate, respectively, the threshold and frequency masking of the Human Auditory System (HAS). In the paper we discuss first the watermark embedding system and then the detection approach. The results of a large set of subjective tests are also presented to demonstrate the quality and robustness of the proposed approach.
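
A minimal spread-spectrum sketch, without the two HAS-shaping filters described in the abstract: a ±1 pseudo-noise sequence derived from a key is added to each frame at low amplitude and later detected by correlation. The signal, key, and amplitude are invented:

```python
import random, math

def pn_sequence(length, key):
    rng = random.Random(key)
    return [rng.choice((-1.0, 1.0)) for _ in range(length)]

def embed(frame, key, alpha=0.05):
    """Add a scaled PN sequence to one audio frame."""
    pn = pn_sequence(len(frame), key)
    return [s + alpha * w for s, w in zip(frame, pn)]

def correlate(frame, key):
    """Normalized correlation with the PN sequence (detection statistic)."""
    pn = pn_sequence(len(frame), key)
    return sum(s * w for s, w in zip(frame, pn)) / len(frame)

random.seed(0)
host = [math.sin(0.01 * i) + random.gauss(0, 0.1) for i in range(4096)]
marked = embed(host, key=42)
print(correlate(marked, 42), correlate(host, 42))
```

Because the PN entries are ±1, embedding shifts the correlation statistic by exactly alpha, while an unmarked frame correlates near zero; a threshold between the two decides presence of the watermark.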

  12. Development of tools for automatic generation of PLC code

    CERN Document Server

    Koutli, Maria; Rochez, Jacques

    This Master thesis was performed at CERN, in the EN-ICE-PLC section. The thesis describes the integration of two PLC platforms, both based on the CODESYS development tool, into UNICOS, the industrial framework defined at CERN. CODESYS is a development tool for PLC programming based on the IEC 61131-3 standard and is adopted by many PLC manufacturers. The two PLC development environments are SoMachine from Schneider and TwinCAT from Beckhoff. The two CODESYS-compatible PLCs are to be controlled by WinCC OA, the SCADA system from Siemens. The framework includes a library of Function Blocks (objects) for the PLC programs and a software for automatic generation of the PLC code based on this library, called UAB. The integration aimed at a solution shared by both PLC platforms and was based on the PLCOpen XML scheme. The developed tools were demonstrated by creating a control application for both PLC environments and testing the behavior of the library code.

  13. Automatic User Interface Generation for Visualizing Big Geoscience Data

    Science.gov (United States)

    Yu, H.; Wu, J.; Zhou, Y.; Tang, Z.; Kuo, K. S.

    2016-12-01

    Along with advanced computing and observation technologies, geoscience and its related fields have been generating a large amount of data at an unprecedented growth rate. Visualization becomes an increasingly attractive and feasible means for researchers to effectively and efficiently access and explore data to gain new understandings and discoveries. However, visualization has been challenging due to a lack of effective data models and visual representations to tackle the heterogeneity of geoscience data. We propose a new geoscience data visualization framework that leverages interface automata theory to automatically generate user interfaces (UI). Our study makes three main contributions. First, geoscience data has a unique hierarchical structure and complex formats, and therefore it is relatively easy for users to get lost or confused during their exploration of the data. By applying the interface automata model to the UI design, users can be clearly guided to find the exact visualization and analysis that they want. In addition, from a development perspective, an interface automaton is also easier to understand than conditional statements, which can simplify the development process. Second, it is common that geoscience data has discontinuities in its hierarchical structure. The application of interface automata can prevent users from suffering automation surprises and enhance the user experience. Third, to support a variety of different data visualizations and analyses, our design with interface automata also makes applications extensible, in that a new visualization function or a new data group can easily be added to an existing application, which reduces the maintenance overhead significantly. We demonstrate the effectiveness of our framework using real-world applications.
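
The guidance mechanism can be sketched as a minimal interface automaton in which only the actions enabled in the current state are offered to the user. The states and actions below are invented for illustration, not taken from the paper's framework:

```python
class InterfaceAutomaton:
    def __init__(self, transitions, state):
        self.transitions = transitions   # {state: {action: next_state}}
        self.state = state

    def available_actions(self):
        """Only these actions are offered in the UI, so users cannot reach
        an invalid exploration step (no 'automation surprise')."""
        return sorted(self.transitions.get(self.state, {}))

    def perform(self, action):
        if action not in self.transitions.get(self.state, {}):
            raise ValueError(f"{action!r} not allowed in state {self.state!r}")
        self.state = self.transitions[self.state][action]
        return self.state

# Hypothetical exploration flow for a hierarchical geoscience dataset
ui = InterfaceAutomaton({
    "start":    {"open_dataset": "dataset"},
    "dataset":  {"pick_variable": "variable"},
    "variable": {"render_map": "view", "render_histogram": "view"},
    "view":     {"back": "variable"},
}, state="start")

print(ui.available_actions())        # ['open_dataset']
ui.perform("open_dataset")
print(ui.available_actions())        # ['pick_variable']
```

Adding a new visualization is just one more transition out of "variable", which is the extensibility point the abstract emphasizes.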

  14. Automatic generation of digital anthropomorphic phantoms from simulated MRI acquisitions

    Science.gov (United States)

    Lindsay, C.; Gennert, M. A.; Könik, A.; Dasari, P. K.; King, M. A.

    2013-03-01

    In SPECT imaging, motion from patient respiration and body motion can introduce image artifacts that may reduce the diagnostic quality of the images. Simulation studies using numerical phantoms with precisely known motion can help to develop and evaluate motion correction algorithms. Previous methods for evaluating motion correction algorithms used either manual or semi-automated segmentation of MRI studies to produce patient models in the form of XCAT Phantoms, from which one calculates the transformation and deformation between MRI study and patient model. Both manual and semi-automated methods of XCAT Phantom generation require expertise in human anatomy, with the semiautomated method requiring up to 30 minutes and the manual method requiring up to eight hours. Although faster than manual segmentation, the semi-automated method still requires a significant amount of time, is not replicable, and is subject to errors due to the difficulty of aligning and deforming anatomical shapes in 3D. We propose a new method for matching patient models to MRI that extends the previous semi-automated method by eliminating the manual non-rigid transformation. Our method requires no user supervision and therefore does not require expert knowledge of human anatomy to align the NURBs to anatomical structures in the MR image. Our contribution is employing the SIMRI MRI simulator to convert the XCAT NURBs to a voxel-based representation that is amenable to automatic non-rigid registration. Then registration is used to transform and deform the NURBs to match the anatomy in the MR image. We show that our automated method generates XCAT Phantoms more robustly and significantly faster than the previous semi-automated method.

  15. User evaluation of a communication system that automatically generates captions to improve telephone communication

    NARCIS (Netherlands)

    Zekveld, A.A.; Kramer, S.E.; Kessens, J.M.; Vlaming, M.S.M.G.; Houtgast, T.

    2009-01-01

    This study examined the subjective benefit obtained from automatically generated captions during telephone-speech comprehension in the presence of babble noise. Short stories were presented by telephone either with or without captions that were generated offline by an automatic speech recognition

  16. PUS Services Software Building Block Automatic Generation for Space Missions

    Science.gov (United States)

    Candia, S.; Sgaramella, F.; Mele, G.

    2008-08-01

    The Packet Utilization Standard (PUS) has been specified by the European Committee for Space Standardization (ECSS) and issued as ECSS-E-70-41A to define the application-level interface between Ground Segments and Space Segments. The ECSS-E-70-41A complements the ECSS-E-50 and the Consultative Committee for Space Data Systems (CCSDS) recommendations for packet telemetry and telecommand. The ECSS-E-70-41A characterizes the identified PUS Services from a functional point of view, and the ECSS-E-70-31 standard specifies the rules for their mission-specific tailoring. Current on-board software design for a space mission implies the production of several PUS terminals, each providing a specific tailoring of the PUS services. The associated on-board software building blocks are developed independently, leading to very different design choices and implementations even when the mission tailoring requires very similar services (from the Ground operations perspective). In this scenario, automatic production of the PUS services building blocks for a mission would be a way to optimize the overall mission economy and improve the robustness and reliability of the on-board software and of the Ground-Space interactions. This paper presents the Space Software Italia (SSI) activities for the development of an integrated environment supporting: the PUS services tailoring activity for a specific mission; the mission-specific PUS services configuration; and the generation of the UML model of the software building block implementing the mission-specific PUS services and the related source code, support documentation (software requirements, software architecture, test plans/procedures, operational manuals), and the TM/TC database. The paper deals with: (a) the project objectives, (b) the tailoring, configuration, and generation process, (c) the description of the environments supporting the process phases, (d) the characterization of the meta-model used for the generation, (e) the

  17. Automatic HDL firmware generation for FPGA-based reconfigurable measurement and control systems with mezzanines in FMC standard

    Science.gov (United States)

    Wojenski, Andrzej; Kasprowicz, Grzegorz; Pozniak, Krzysztof T.; Romaniuk, Ryszard

    2013-10-01

    The paper describes a concept of automatic firmware generation for reconfigurable measurement systems that use FPGA devices and measurement cards in the FMC standard. The following topics are described in detail: automatic HDL code generation for FPGA devices, automatic implementation of communication interfaces, HDL drivers for measurement cards, automatic serial connection between multiple measurement backplane boards, automatic build of the memory map (address space), and management of the automatically generated firmware. The presented solutions are required in many advanced measurement systems, like Beam Position Monitors or GEM detectors. This work is part of a wider project for automatic firmware generation and management of reconfigurable systems. The solutions presented in this paper build on a previous SPIE publication.
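
The memory-map step can be illustrated with a minimal sketch: assign each measurement card's register block a base address in the FPGA address space. The card names, register counts, base address, and alignment below are invented, not taken from the paper:

```python
def build_memory_map(cards, base=0x4000_0000, align=0x100):
    """Assign each card a base address, rounding every block up to 'align'."""
    memory_map, addr = {}, base
    for name, nregs in cards:
        memory_map[name] = addr
        size = nregs * 4                       # 32-bit registers
        addr += -(-size // align) * align      # ceiling-divide to alignment
    return memory_map

# Hypothetical FMC measurement cards: (name, number of registers)
cards = [("adc_250msps", 64), ("dac_dio", 16), ("gem_frontend", 300)]
mmap = build_memory_map(cards)
for name, addr in mmap.items():
    print(f"{name:14s} 0x{addr:08X}")
```

A real generator would then emit the corresponding HDL address-decoder and a software header from the same table, so firmware and drivers never disagree about the map.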

  18. System and Component Software Specification, Run-time Verification and Automatic Test Generation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The following background technology is described in Part 5: Run-time Verification (RV), White Box Automatic Test Generation (WBATG). Part 5 also describes how WBATG...

  19. Computer program for automatic generation of BWR control rod patterns

    Energy Technology Data Exchange (ETDEWEB)

    Taner, M.S.; Levine, S.H.; Hsia, M.Y. (Pennsylvania State Univ., University Park (United States))

    1990-01-01

    A computer program named OCTOPUS has been developed to automatically determine a control rod pattern that approximates some desired target power distribution as closely as possible without violating any thermal safety or reactor criticality constraints. The program OCTOPUS performs a semi-optimization task based on the method of approximation programming (MAP) to develop control rod patterns. The SIMULATE-E code is used to determine the nucleonic characteristics of the reactor core state.

  20. Technical Note: Automatic river network generation for a physically-based river catchment model

    OpenAIRE

    2010-01-01

    SHETRAN is a physically-based distributed modelling system that gives detailed simulations in time and space of water flow and sediment and solute transport in river catchments. Standard algorithms for the automatic generation of river channel networks from digital elevation data are impossible to apply in SHETRAN and other similar models because the river channels are assumed to run along the edges of grid cells. In this work a new algorithm for the automatic generation of a river cha...
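
The standard channel-generation algorithms the note refers to are based on steepest-descent flow routing over a digital elevation model; a minimal sketch with toy elevations and D4 routing (ignoring SHETRAN's edge-based constraint, which is the note's actual subject):

```python
def flow_directions(dem):
    """For each cell, point flow to its lowest 4-neighbour (D4 routing).
    Cells with no lower neighbour are local sinks/outlets (None)."""
    rows, cols = len(dem), len(dem[0])
    dirs = {}
    for r in range(rows):
        for c in range(cols):
            best, target = dem[r][c], None
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                rr, cc = r + dr, c + dc
                if 0 <= rr < rows and 0 <= cc < cols and dem[rr][cc] < best:
                    best, target = dem[rr][cc], (rr, cc)
            dirs[(r, c)] = target
    return dirs

# Toy 3x3 elevation grid sloping toward the bottom-right corner
dem = [[9, 8, 7],
       [8, 6, 4],
       [7, 5, 2]]
dirs = flow_directions(dem)
print(dirs[(0, 0)])   # flows to a lower neighbour
print(dirs[(2, 2)])   # None: this cell is the outlet
```

Channels are then declared wherever enough upstream cells drain through a point; SHETRAN's twist is that these paths must be re-expressed along cell edges rather than through cell centres.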

  2. Research on Community Competition and Adaptive Genetic Algorithm for Automatic Generation of Tang Poetry

    OpenAIRE

    Wujian Yang; Yining Cheng; Jie He; Wenqiong Hu; Xiaojia Lin

    2016-01-01

    Research on traditional Tang poetry is extensive, and the automatic generation of Tang poems has attracted great attention in recent years. This study presents a community-based competition and adaptive genetic algorithm for automatically generating Tang poetry. The community-based competition added to the improved algorithm aims to maintain the diversity of genes during evolution; meanwhile, the adaptation means that the probabilities of crossover and mutation are varie...
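
The adaptive part of such an algorithm can be sketched with a toy fitness function (maximizing the number of ones, standing in for a poem-quality score): offspring below the average fitness receive a higher mutation probability, which preserves diversity, while fitter ones are disturbed less. Population sizes and rates are invented:

```python
import random

def adaptive_ga(fitness, length=20, pop_size=40, generations=60, seed=3):
    """Toy adaptive GA: tournament selection, one-point crossover, and a
    mutation rate that adapts to the child's fitness relative to the
    population average (poem encoding and scoring are omitted)."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        scores = [fitness(ind) for ind in pop]
        avg = sum(scores) / len(scores)
        def pick():
            i, j = rng.randrange(pop_size), rng.randrange(pop_size)
            return pop[i] if scores[i] >= scores[j] else pop[j]
        nxt = []
        for _ in range(pop_size):
            a, b = pick(), pick()
            cut = rng.randrange(1, length)
            child = a[:cut] + b[cut:]
            # adaptive mutation: explore more when below the average
            pm = 0.02 if fitness(child) > avg else 0.2
            nxt.append([g ^ 1 if rng.random() < pm else g for g in child])
        pop = nxt
    return max(pop, key=fitness)

best = adaptive_ga(fitness=sum)
print(sum(best))   # close to the maximum of 20
```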

  3. A complete discrimination system for polynomials with complex coefficients and its automatic generation

    Institute of Scientific and Technical Information of China (English)

    梁松新; 张景中

    1999-01-01

    By establishing a complete discrimination system for polynomials, the problem of complete root classification for polynomials with complex coefficients is completely solved; furthermore, the algorithm obtained is implemented as a general program in Maple, which enables the complete discrimination system and complete root classification of a polynomial to be generated automatically by computer, without any human intervention. Moreover, using the automatic generation of root classifications, a method to determine the positive definiteness of a polynomial in one or two indeterminates is derived automatically.
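
For the cubic case, the kind of sign-based criterion such a discrimination system produces can be written out by hand. The sketch below classifies the roots of a real cubic by its discriminant (a hand-coded instance for one degree, not the generated Maple program):

```python
def cubic_root_classification(a, b, c, d):
    """Classify the roots of a*x^3 + b*x^2 + c*x + d (real coefficients)
    by the sign of the discriminant."""
    if a == 0:
        raise ValueError("not a cubic")
    disc = 18*a*b*c*d - 4*b**3*d + b**2*c**2 - 4*a*c**3 - 27*a**2*d**2
    if disc > 0:
        return "three distinct real roots"
    if disc < 0:
        return "one real root and two complex conjugate roots"
    return "repeated real roots"

print(cubic_root_classification(1, 0, -1, 0))  # x^3 - x = x(x-1)(x+1)
print(cubic_root_classification(1, 0, 1, 0))   # x^3 + x
```

A complete discrimination system generalizes this: for higher degrees it emits a whole tree of sign conditions on subresultant-based discriminants, which is exactly what is tedious to derive manually.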

  4. A Model-Based Method for Content Validation of Automatically Generated Test Items

    Science.gov (United States)

    Zhang, Xinxin; Gierl, Mark

    2016-01-01

    The purpose of this study is to describe a methodology to recover the item model used to generate multiple-choice test items with a novel graph theory approach. Beginning with the generated test items and working backward to recover the original item model provides a model-based method for validating the content used to automatically generate test…

  5. Ground based materials science experiments

    Science.gov (United States)

    Meyer, M. B.; Johnston, J. C.; Glasgow, T. K.

    1988-01-01

    The facilities at the Microgravity Materials Science Laboratory (MMSL) at the Lewis Research Center, created to offer immediate and low-cost access to ground-based testing facilities for industrial, academic, and government researchers, are described. The equipment in the MMSL falls into three categories: (1) devices which emulate some aspect of low gravitational forces, (2) specialized capabilities for 1-g development and refinement of microgravity experiments, and (3) functional duplicates of flight hardware. Equipment diagrams are included.

  7. Automatic Generation Control Strategy Based on Balance of Daily Electric Energy

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    An automatic generation control strategy based on the balance of daily total electric energy is put forward. It balances the actual total energy generated under automatic generation control against the planned total energy on the basis of the area control error, making the actual 24-hour active power load curve approach the planned load curve. The generated energy is corrected by a velocity weighting factor so that the regulation is dynamic and achieves the required speed of response. A corresponding strategy is applied to the real-time data during automatic generation control operation. Simulation results are very good, and power energy compensation control with the desired effect can be achieved over the chosen duration.

  8. Improving Statistical Language Model Performance with Automatically Generated Word Hierarchies

    CERN Document Server

    McMahon, J; Mahon, John Mc

    1995-01-01

    An automatic word classification system has been designed which processes word unigram and bigram frequency statistics extracted from a corpus of natural language utterances. The system implements a binary top-down form of word clustering which employs an average class mutual information metric. Resulting classifications are hierarchical, allowing variable class granularity. Words are represented as structural tags --- unique $n$-bit numbers the most significant bit-patterns of which incorporate class information. Access to a structural tag immediately provides access to all classification levels for the corresponding word. The classification system has successfully revealed some of the structure of English, from the phonemic to the semantic level. The system has been compared --- directly and indirectly --- with other recent word classification systems. Class based interpolated language models have been constructed to exploit the extra information supplied by the classifications and some experiments have sho...
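
The clustering criterion can be sketched directly: given a candidate assignment of words to classes, compute the average class mutual information from bigram counts. The corpus and class assignment below are toy examples, and the top-down splitting search is omitted:

```python
import math
from collections import Counter

def class_mutual_information(bigrams, word2class):
    """Average mutual information between adjacent word classes:
    I = sum_{c1,c2} p(c1,c2) * log2( p(c1,c2) / (p(c1)*p(c2)) )."""
    pair_counts = Counter((word2class[w1], word2class[w2]) for w1, w2 in bigrams)
    total = sum(pair_counts.values())
    left, right = Counter(), Counter()
    for (c1, c2), n in pair_counts.items():
        left[c1] += n
        right[c2] += n
    mi = 0.0
    for (c1, c2), n in pair_counts.items():
        p12 = n / total
        mi += p12 * math.log2(p12 / ((left[c1] / total) * (right[c2] / total)))
    return mi

corpus = "the cat sat on the mat the dog sat on the rug".split()
bigrams = list(zip(corpus, corpus[1:]))
word2class = {"the": "DET", "cat": "N", "mat": "N", "dog": "N", "rug": "N",
              "sat": "V", "on": "P"}
print(round(class_mutual_information(bigrams, word2class), 3))
```

The clustering system searches over assignments to maximize this quantity; collapsing all words into a single class drives it to zero, which is why uninformative classings lose.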

  9. AROMA: Automatic Generation of Radio Maps for Localization Systems

    CERN Document Server

    Eleryan, Ahmed; Youssef, Moustafa

    2010-01-01

    WLAN localization has become an active research field recently. Due to the wide WLAN deployment, WLAN localization provides ubiquitous coverage and adds to the value of the wireless network by providing the location of its users without using any additional hardware. However, WLAN localization systems usually require constructing a radio map, which is a major barrier of WLAN localization systems' deployment. The radio map stores information about the signal strength from different signal strength streams at selected locations in the site of interest. Typical construction of a radio map involves measurements and calibrations making it a tedious and time-consuming operation. In this paper, we present the AROMA system that automatically constructs accurate active and passive radio maps for both device-based and device-free WLAN localization systems. AROMA has three main goals: high accuracy, low computational requirements, and minimum user overhead. To achieve high accuracy, AROMA uses 3D ray tracing enhanced wi...
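
The essence of radio-map construction can be sketched with a simple log-distance path-loss model standing in for AROMA's 3D ray tracing; the AP position, reference power, and path-loss exponent below are invented:

```python
import math

def radio_map(ap, grid, p0=-30.0, n=3.0, d0=1.0):
    """Predicted RSS (dBm) at each grid point from a log-distance
    path-loss model: RSS(d) = p0 - 10*n*log10(d/d0)."""
    ax, ay = ap
    rss = {}
    for (x, y) in grid:
        d = max(math.hypot(x - ax, y - ay), d0)   # clamp inside reference d0
        rss[(x, y)] = p0 - 10.0 * n * math.log10(d / d0)
    return rss

# 5 m grid over a 20 m x 20 m site with one access point in the centre
grid = [(x, y) for x in range(0, 21, 5) for y in range(0, 21, 5)]
rmap = radio_map(ap=(10, 10), grid=grid)
print(rmap[(10, 10)], rmap[(0, 0)])
```

Replacing this formula with ray tracing is what lets AROMA capture walls and multipath while keeping the same map structure: a signal-strength estimate per location per access point.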

  10. Why discourse structures in medical reports matter for the validity of automatically generated text knowledge bases.

    Science.gov (United States)

    Hahn, U; Romacker, M; Schulz, S

    1998-01-01

    The automatic analysis of medical full-texts currently suffers from neglecting text coherence phenomena such as reference relations between discourse units. This has unwarranted effects on the description adequacy of medical knowledge bases automatically generated from texts. The resulting representation bias can be characterized in terms of artificially fragmented, incomplete and invalid knowledge structures. We discuss three types of textual phenomena (pronominal and nominal anaphora, as well as textual ellipsis) and outline basic methodologies how to deal with them.

  11. An approach of optimal sensitivity applied in the tertiary loop of the automatic generation control

    Energy Technology Data Exchange (ETDEWEB)

    Belati, Edmarcio A. [CIMATEC - SENAI, Salvador, BA (Brazil); Alves, Dilson A. [Electrical Engineering Department, FEIS, UNESP - Sao Paulo State University (Brazil); da Costa, Geraldo R.M. [Electrical Engineering Department, EESC, USP - Sao Paulo University (Brazil)

    2008-09-15

    This paper proposes an approach of optimal sensitivity applied in the tertiary loop of the automatic generation control. The approach is based on the theorem of non-linear perturbation. From an optimal operation point obtained by an optimal power flow a new optimal operation point is directly determined after a perturbation, i.e., without the necessity of an iterative process. This new optimal operation point satisfies the constraints of the problem for small perturbation in the loads. The participation factors and the voltage set point of the automatic voltage regulators (AVR) of the generators are determined by the technique of optimal sensitivity, considering the effects of the active power losses minimization and the network constraints. The participation factors and voltage set point of the generators are supplied directly to a computational program of dynamic simulation of the automatic generation control, named by power sensitivity mode. Test results are presented to show the good performance of this approach. (author)

  12. Validating EHR documents: automatic schematron generation using archetypes.

    Science.gov (United States)

    Pfeiffer, Klaus; Duftschmid, Georg; Rinner, Christoph

    2014-01-01

    The goal of this study was to examine whether Schematron schemas can be generated from archetypes. The openEHR Java reference API was used to transform an archetype into an object model, which was then extended with context elements. The model was processed and the constraints were transformed into corresponding Schematron assertions. A prototype of the generator for the reference model HL7 v3 CDA R2 was developed and successfully tested. Preconditions for its reusability with other reference models were set. Our results indicate that an automated generation of Schematron schemas is possible with some limitations.
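
The generation step can be sketched: each constraint extracted from an archetype becomes a Schematron rule/assert pair. The constraints and XPaths below are invented, and real archetype parsing is omitted:

```python
# Each constraint: (XPath context, test expression, human-readable message)
def to_schematron(constraints):
    lines = ['<schema xmlns="http://purl.oclc.org/dsdl/schematron">',
             '  <pattern>']
    for context, test, message in constraints:
        lines.append(f'    <rule context="{context}">')
        lines.append(f'      <assert test="{test}">{message}</assert>')
        lines.append('    </rule>')
    lines += ['  </pattern>', '</schema>']
    return "\n".join(lines)

# Hypothetical constraints derived from a blood-pressure archetype
constraints = [
    ("/observation/systolic", "number(.) >= 0 and number(.) <= 1000",
     "Systolic pressure out of range"),
    ("/observation", "count(systolic) = 1",
     "Exactly one systolic value is required"),
]
print(to_schematron(constraints))
```

The hard part the paper addresses is upstream of this template: mapping archetype occurrence, cardinality, and value constraints onto correct context/test pairs for a given reference model.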

  13. Automatic Model-Based Generation of Parameterized Test Cases Using Data Abstraction

    NARCIS (Netherlands)

    Calamé, Jens R.; Ioustinova, Natalia; Pol, van de Jaco; Romijn, J.M.T.; Smith, G.; Pol, van de J.C.

    2007-01-01

    Developing test suites is a costly and error-prone process. Model-based test generation tools facilitate this process by automatically generating test cases from system models. The applicability of these tools, however, depends on the size of the target systems. Here, we propose an approach to gener

  14. Unidirectional high fiber content composites: Automatic 3D FE model generation and damage simulation

    DEFF Research Database (Denmark)

    Qing, Hai; Mishnaevsky, Leon

    2009-01-01

    A new method and a software code for the automatic generation of 3D micromechanical FE models of unidirectional long-fiber-reinforced composite (LFRC) with high fiber volume fraction with random fiber arrangement are presented. The fiber arrangement in the cross-section is generated through random...
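
The random fiber arrangement step can be sketched as random sequential placement of non-overlapping circular cross-sections in the cell; the radii, counts, and cell size below are arbitrary, and the FE meshing itself is omitted:

```python
import random, math

def place_fibers(n, radius, width, height, min_gap=0.0, seed=7, max_tries=20000):
    """Random sequential placement of non-overlapping circular fiber
    cross-sections inside a width x height cell (2D sketch)."""
    rng = random.Random(seed)
    fibers, tries = [], 0
    while len(fibers) < n and tries < max_tries:
        tries += 1
        x = rng.uniform(radius, width - radius)
        y = rng.uniform(radius, height - radius)
        # accept only if the new fiber keeps min_gap to all placed fibers
        if all(math.hypot(x - fx, y - fy) >= 2 * radius + min_gap
               for fx, fy in fibers):
            fibers.append((x, y))
    return fibers

fibers = place_fibers(n=40, radius=0.5, width=10.0, height=10.0, min_gap=0.05)
print(len(fibers))
```

Plain sequential placement saturates well below the fiber fractions of real composites, which is why dedicated algorithms (like the one in the paper) are needed for high volume fractions.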

  15. AUTOMATIC MESH GENERATION OF 3-D GEOMETRIC MODELS

    Institute of Scientific and Technical Information of China (English)

    刘剑飞

    2003-01-01

    In this paper the ball-packing method is reviewed, and a scheme to generate meshes for complex 3-D geometric models is given, which consists of four steps: (1) create nodes in the 3-D model by the ball-packing method, (2) connect the nodes to generate the mesh by 3-D Delaunay triangulation, (3) retrieve the boundary of the model after the Delaunay triangulation, and (4) improve the mesh.

  16. Automatic Security Assessment for Next Generation Wireless Mobile Networks

    Directory of Open Access Journals (Sweden)

    Francesco Palmieri

    2011-01-01

    Full Text Available Wireless networks are increasingly popular, but their growing pervasiveness and widespread coverage raise serious security concerns. Mobile client devices potentially migrate, usually passing through very light access control policies, between numerous and heterogeneous wireless environments, bringing with them software vulnerabilities and possibly malicious code. To cope with these new security threats, the paper proposes a new active third-party authentication, authorization and security assessment strategy in which, once a device enters a new Wi-Fi environment, it is subjected to analysis by the infrastructure; if it is found to be dangerously insecure, it is immediately taken off the network and denied further access until its vulnerabilities have been fixed. The security assessment module, the fundamental component of this strategy, takes advantage of a reliable knowledge base containing semantically rich information about the mobile node under examination, dynamically provided by network mapping and configuration assessment facilities. It implements a fully automatic security analysis framework, based on AHP, conceived to be flexible and customizable, that provides automated support for real-time execution of complex security/risk evaluation tasks which depend on the results obtained from different kinds of analysis tools and methodologies. Encouraging results have been achieved with a proof-of-concept model based on current technology and standard open-source networking tools.
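
The AHP core of the assessment module can be sketched: priority weights are the principal eigenvector of a pairwise comparison matrix, here obtained by power iteration. The criteria and judgment values below are invented:

```python
def ahp_weights(matrix, iterations=100):
    """Principal-eigenvector priority weights of a pairwise comparison
    matrix (power iteration, normalized to sum to 1)."""
    n = len(matrix)
    w = [1.0 / n] * n
    for _ in range(iterations):
        w = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        w = [x / s for x in w]
    return w

# Hypothetical judgments over three risk criteria:
# missing patches >> open ports > weak configuration (Saaty 1-9 scale)
pairwise = [
    [1.0,   3.0, 5.0],
    [1 / 3, 1.0, 3.0],
    [1 / 5, 1 / 3, 1.0],
]
weights = ahp_weights(pairwise)
print([round(w, 3) for w in weights])
```

The per-tool findings are then scored against each criterion and combined with these weights into the overall risk figure that drives the admit/quarantine decision.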

  17. Impact of automatic threshold capture on pulse generator longevity

    Institute of Scientific and Technical Information of China (English)

    CHEN Ruo-han; CHEN Ke-ping; WANG Fang-zheng; HUA Wei; ZHANG Shu

    2006-01-01

    Background The automatic threshold-tracking pacing algorithm developed by St. Jude Medical verifies ventricular capture beat by beat by recognizing the evoked response following each pacemaker stimulus. This function is assumed to be not only energy-saving but also safe. This study estimated the extension in longevity obtained with AutoCapture (AC) compared with pacemakers programmed to manually optimized or nominal output. Methods Thirty-four patients who received the St. Jude Affinity series pacemaker were included in the study. The following measurements were taken: stimulation and sensing threshold, lead impedance, evoked response and polarization signals by the 3501 programmer during follow-up, and battery current and battery impedance under different conditions. For the longevity comparison, ventricular output was programmed under three different conditions: (1) AC on; (2) AC off with nominal output; and (3) AC off with the pacing output set at twice the pacing threshold with a minimum of 2.0 V. Patients were divided into two groups according to whether the chronic threshold was higher or lower than 1 V. The efficacy of AC was evaluated. Results Current drain in the AC on group, AC off with optimized programming, and AC off with nominal output was (14.33±2.84) mA, (16.74±2.75) mA and (18.4±2.44) mA, respectively (AC on or AC off with optimized programming vs. nominal output, P < 0.01). Estimated longevity was significantly extended by AC on when compared with the nominal setting [(103±27) months vs. (80±24) months, P < 0.01]. Furthermore, compared with optimized programming, AC extends longevity when the pacing threshold is higher than 1 V. Conclusion AC can significantly prolong pacemaker longevity, especially in patients with a high pacing threshold.

  18. AN ALGORITHM FOR AUTOMATICALLY GENERATING BLACK-BOX TEST CASES

    Institute of Scientific and Technical Information of China (English)

    Xu Baowen; Nie Changhai; Shi Qunfeng; Lu Hong

    2003-01-01

    Selection of test cases plays a key role in improving testing efficiency. Black-box testing is an important way of testing, and its validity lies, in some sense, in the selection of test cases. A reasonable and effective method for the selection and generation of test cases is urgently needed. This letter first introduces some common methods of black-box test case generation, then proposes a new algorithm based on interface parameters and discusses its properties, and finally shows the effectiveness of the algorithm.
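
The interface-parameter idea is closely related to combinatorial (pairwise) testing. Below is a greedy sketch that selects test cases until every pair of parameter values is covered; this is a standard stand-in for parameter-based generation, not the letter's exact algorithm, and the parameters are invented:

```python
from itertools import combinations, product

def pairwise_cases(parameters):
    """Greedily pick test cases until every value pair of every two
    parameters is covered at least once."""
    names = sorted(parameters)
    uncovered = set()
    for p1, p2 in combinations(names, 2):
        for v1, v2 in product(parameters[p1], parameters[p2]):
            uncovered.add((p1, v1, p2, v2))
    cases = []
    while uncovered:
        best, best_gain = None, -1
        for values in product(*(parameters[n] for n in names)):
            case = dict(zip(names, values))
            gain = sum((p1, case[p1], p2, case[p2]) in uncovered
                       for p1, p2 in combinations(names, 2))
            if gain > best_gain:
                best, best_gain = case, gain
        cases.append(best)
        for p1, p2 in combinations(names, 2):
            uncovered.discard((p1, best[p1], p2, best[p2]))
    return cases

params = {"os": ["linux", "win"], "browser": ["ff", "chrome"], "net": ["lan", "wifi"]}
cases = pairwise_cases(params)
print(len(cases))   # fewer than the 8 exhaustive combinations
```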

  20. An automatic grid generation approach over free-form surface for architectural design

    Institute of Scientific and Technical Information of China (English)

    苏亮; 祝顺来; 肖南; 高博青

    2014-01-01

    An essential step in the realization of free-form surface structures is to create an efficient structural grid that satisfies not only the architectural aesthetics but also the structural performance. Employing the main stress trajectories as the representation of force flows on a free-form surface, an automatic grid generation approach is proposed for architectural design. The algorithm automatically plots the main stress trajectories on a 3D free-form surface and adopts a modified advancing-front meshing technique to generate the structural grid. Based on the proposed algorithm, an automatic grid generator named “St-Surmesh” is developed for the practical architectural design of free-form surface structures. The surface geometry of one of the Sun Valleys of the Expo Axis for Expo Shanghai 2010 is selected as a numerical example to validate the proposed approach. Comparative studies are performed to demonstrate how different structural grids affect the design of a free-form surface structure.
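
The trajectory-plotting core can be sketched as streamline integration of a principal-stress direction field by fixed-step Euler stepping; the direction field below is an invented 2D example, not a real surface stress solution:

```python
import math

def stress_direction(x, y):
    """Hypothetical principal-stress direction field (unit vector)."""
    vx, vy = 1.0, 0.4 * math.cos(x)
    norm = math.hypot(vx, vy)
    return vx / norm, vy / norm

def trace_trajectory(start, step=0.05, n_steps=200):
    """Euler integration of the direction field from a seed point."""
    pts = [start]
    x, y = start
    for _ in range(n_steps):
        dx, dy = stress_direction(x, y)
        x, y = x + step * dx, y + step * dy
        pts.append((x, y))
    return pts

traj = trace_trajectory((0.0, 0.0))
print(traj[-1])
```

A family of such trajectories, seeded across the surface, supplies the guide curves along which the advancing-front mesher lays out the structural grid.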

  1. Designing a story database for use in automatic story generation

    NARCIS (Netherlands)

    Oinonen, K.M.; Theune, M.; Nijholt, A.; Uijlings, J.R.R.; Harper, R.; Rauterberg, M.; Combetto, M.

    2006-01-01

    In this paper we propose a model for the representation of stories in a story database. The use of such a database will enable computational story generation systems to learn from previous stories and associated user feedback, in order to create believable stories with dramatic plots that invoke an

  2. Template Authoring Environment for the Automatic Generation of Narrative Content

    Science.gov (United States)

    Caropreso, Maria Fernanda; Inkpen, Diana; Keshtkar, Fazel; Khan, Shahzad

    2012-01-01

    Natural Language Generation (NLG) systems can make data accessible in an easily digestible textual form; but using such systems requires sophisticated linguistic and sometimes even programming knowledge. We have designed and implemented an environment for creating and modifying NLG templates that requires no programming knowledge, and can operate…

  3. Automatic generation of computable implementation guides from clinical information models.

    Science.gov (United States)

    Boscá, Diego; Maldonado, José Alberto; Moner, David; Robles, Montserrat

    2015-06-01

    Clinical information models are increasingly used to describe the contents of Electronic Health Records. Implementation guides are a common specification mechanism used to define such models. They contain, among other reference materials, all the constraints and rules that clinical information must obey. However, these implementation guides are typically oriented toward human readability, and thus cannot be processed by computers. As a consequence, they must be reinterpreted and transformed manually into an executable language such as Schematron or Object Constraint Language (OCL). This task can be difficult and error-prone due to the large gap between the two representations. The challenge is to develop a methodology for the specification of implementation guides in such a way that humans can read and understand them easily while computers can process them at the same time. In this paper, we propose and describe a novel methodology that uses archetypes as the basis for generating implementation guides. We use archetypes to generate formal rules expressed in Natural Rule Language (NRL) and other reference materials usually included in implementation guides, such as sample XML instances. We also generate Schematron rules from the NRL rules to be used for the validation of data instances. We have implemented these methods in LinkEHR, an archetype editing platform, and exemplify our approach by generating NRL rules and implementation guides from EN ISO 13606, openEHR, and HL7 CDA archetypes.

  4. Automatic generation of indoor navigation instructions for blind users using a user-centric graph.

    Science.gov (United States)

    Dong, Hao; Ganz, Aura

    2014-01-01

    The complexity and diversity of indoor environments bring significant challenges to the automatic generation of navigation instructions for blind and visually impaired users. Unlike the generation of navigation instructions for robots, we need to take into account the blind user's wayfinding ability. In this paper we introduce a user-centric, graph-based solution for cane users that takes into account the blind user's cognitive ability as well as the user's mobility patterns. We introduce the principles of generating the graph and the algorithm used to automatically generate the navigation instructions using this graph. We successfully tested the efficiency of the instruction generation algorithm, the correctness of the generated paths, and the quality of the navigation instructions. Blindfolded sighted users were successful in navigating through a three-story building.
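The paper's graph and instruction algorithm are not detailed in this abstract, but the general pattern of turning a labeled graph into turn-by-turn guidance can be sketched as a shortest-path search whose edges carry wayfinding actions. The graph contents and phrasing below are invented placeholders, not the authors' user-centric model.

```python
from collections import deque

# Hypothetical graph: nodes are decision points, each directed edge carries
# the wayfinding action a cane user can follow to traverse it.
graph = {
    "entrance": {"hallway": "walk forward 10 steps, trailing the right wall"},
    "hallway":  {"elevator": "turn left at the water fountain landmark",
                 "stairs": "continue straight to the end of the corridor"},
    "elevator": {"floor2": "take the elevator to the second floor"},
    "floor2":   {"office": "exit right; the office is the third door"},
}

def navigation_instructions(graph, start, goal):
    """BFS for the shortest path, then read the action off each edge."""
    parent = {start: None}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        if node == goal:
            break
        for nxt in graph.get(node, {}):
            if nxt not in parent:
                parent[nxt] = node
                queue.append(nxt)
    if goal not in parent:
        return None  # goal unreachable from start
    path = []
    while goal is not None:
        path.append(goal)
        goal = parent[goal]
    path.reverse()
    return [graph[a][b] for a, b in zip(path, path[1:])]
```

Encoding user-facing actions on the edges, rather than deriving them from geometry afterwards, is one simple way to make the instructions reflect how a person (not a robot) navigates.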

  5. An automatically generated code for relativistic inhomogeneous cosmologies

    CERN Document Server

    Bentivegna, Eloisa

    2016-01-01

    The applications of numerical relativity to cosmology are on the rise, contributing insight into such cosmological problems as structure formation, primordial phase transitions, gravitational-wave generation, and inflation. In this paper, I present the infrastructure for the computation of inhomogeneous dust cosmologies which was used recently to measure the effect of nonlinear inhomogeneity on the cosmic expansion rate. I illustrate the code's architecture, provide evidence for its correctness in a number of familiar cosmological settings, and evaluate its parallel performance for grids of up to several billion points. The code, which is available as free software, is based on the Einstein Toolkit infrastructure, and in particular leverages the automated-code-generation capabilities provided by its component Kranc.

  6. Facilitate generation connections on Orkney by automatic distribution network management

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2004-07-01

    This report summarises the results of a study assessing the capability and limitations of the Orkney Network under a variety of conditions of demand, generation connections, network configuration, and reactive compensation. A conceptual active management scheme (AMS) suitable for the conditions on Orkney is developed and evaluated. Details are given of a proposed framework for the design and evaluation of future active management schemes, logic control sequences for managed generation units, and a proposed evaluation method for the active management scheme. Implications of introducing the proposed AMS are examined, and the commercial aspects of an AMS and system security are considered. The existing Orkney network is described; and an overview of the SHEPDL (Scottish Hydro Electric Power Distribution Ltd.) SCADA system is presented with a discussion of AMS identification, selection, and development.

  7. Semantic annotation of requirements for automatic UML class diagram generation

    CERN Document Server

    Amdouni, Soumaya; Bouabid, Sondes

    2011-01-01

    The increasing complexity of software engineering requires effective methods and tools to support requirements analysts' activities. While much of a company's knowledge can be found in text repositories, current content management systems have limited capabilities for structuring and interpreting documents. In this context, we propose a tool for transforming text documents describing users' requirements into a UML model. The presented tool uses Natural Language Processing (NLP) and semantic rules to generate a UML class diagram. The main contribution of our tool is to provide assistance to designers, facilitating the transition from a textual description of user requirements to their UML diagrams based on GATE (General Architecture for Text Engineering) by formulating the necessary rules that generate new semantic annotations.

  8. Facilitate generation connections on Orkney by automatic distribution network management

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2004-07-01

    This report summarises the results of a study assessing the capability and limitations of the Orkney Network under a variety of conditions of demand, generation connections, network configuration, and reactive compensation. A conceptual active management scheme (AMS) suitable for the conditions on Orkney is developed and evaluated. Details are given of a proposed framework for the design and evaluation of future active management schemes, logic control sequences for managed generation units, and a proposed evaluation method for the active management scheme. Implications of introducing the proposed AMS are examined, and the commercial aspects of an AMS and system security are considered. The existing Orkney network is described; and an overview of the SHEPDL (Scottish Hydro Electric Power Distribution Ltd.) SCADA system is presented with a discussion of AMS identification, selection, and development.

  9. An Aid to Independent Study through Automatic Question Generation (AUTOQUEST)

    Science.gov (United States)

    1975-10-01

    natural language analysis and generation tackles a difficult class of anaphoric inference problems (finding the correct referent for an English pronoun...overall principle of 'semantic preference' used to set up the original meaning representation, of which these anaphoric inference procedures are a...templates has been considered in detail elsewhere, so this report concentrates on the second phase of analysis, which binds templates together into a

  10. SABATPG-A Structural Analysis Based Automatic Test Generation System

    Institute of Scientific and Technical Information of China (English)

    李忠诚; 潘榆奇; 闵应骅

    1994-01-01

    A TPG system, SABATPG, is presented based on a generic structural model of large circuits. Three techniques (partial implication, aftereffect of identified undetectable faults, and shared sensitization with the new concepts of localization and aftereffect) are employed in the system to improve the FAN algorithm. Experiments on the 10 ISCAS benchmark circuits show that the computing time of SABATPG for test generation is 19.42% less than that of the FAN algorithm.

  11. Technical Note: Automatic river network generation for a physically-based river catchment model

    Directory of Open Access Journals (Sweden)

    S. J. Birkinshaw

    2010-09-01

    Full Text Available SHETRAN is a physically-based distributed modelling system that gives detailed simulations in time and space of water flow and sediment and solute transport in river catchments. Standard algorithms for the automatic generation of river channel networks from digital elevation data are impossible to apply in SHETRAN and other similar models because the river channels are assumed to run along the edges of grid cells. In this work a new algorithm for the automatic generation of a river channel network in SHETRAN is described and its use in an example catchment demonstrated.
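For context, the "standard algorithms" the note refers to typically start from D8 flow directions and flow accumulation on the elevation grid, with channels placed where accumulation exceeds a threshold. The sketch below is a minimal, generic version of that cell-centred baseline, not SHETRAN's edge-based algorithm.

```python
def flow_accumulation(dem):
    """Standard D8: each cell drains to its steepest lower neighbour;
    accumulated drainage (in cell counts) marks the channel network."""
    rows, cols = len(dem), len(dem[0])

    def neighbours(r, c):
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                if (dr, dc) != (0, 0) and 0 <= r + dr < rows and 0 <= c + dc < cols:
                    yield r + dr, c + dc

    downhill = {}
    for r in range(rows):
        for c in range(cols):
            lower = [(dem[nr][nc], (nr, nc)) for nr, nc in neighbours(r, c)
                     if dem[nr][nc] < dem[r][c]]
            if lower:  # cells with no lower neighbour are sinks
                downhill[(r, c)] = min(lower)[1]

    # visit cells from highest to lowest so upstream area is complete first
    acc = {(r, c): 1 for r in range(rows) for c in range(cols)}
    for cell in sorted(acc, key=lambda p: -dem[p[0]][p[1]]):
        if cell in downhill:
            acc[downhill[cell]] += acc[cell]
    return acc
```

Because this assigns channels through cell centres, its output cannot be used directly by a model whose channels run along cell edges, which is exactly the gap the note's new algorithm addresses.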

  12. Technical Note: Automatic river network generation for a physically-based river catchment model

    Science.gov (United States)

    Birkinshaw, S. J.

    2010-09-01

    SHETRAN is a physically-based distributed modelling system that gives detailed simulations in time and space of water flow and sediment and solute transport in river catchments. Standard algorithms for the automatic generation of river channel networks from digital elevation data are impossible to apply in SHETRAN and other similar models because the river channels are assumed to run along the edges of grid cells. In this work a new algorithm for the automatic generation of a river channel network in SHETRAN is described and its use in an example catchment demonstrated.

  13. Technical Note: Automatic river network generation for a physically-based river catchment model

    Directory of Open Access Journals (Sweden)

    S. J. Birkinshaw

    2010-05-01

    Full Text Available SHETRAN is a physically-based distributed modelling system that gives detailed simulations in time and space of water flow and sediment and solute transport in river catchments. Standard algorithms for the automatic generation of river channel networks from digital elevation data are impossible to apply in SHETRAN and other similar models because the river channels are assumed to run along the edges of grid cells. In this work a new algorithm for the automatic generation of a river channel network in SHETRAN is described and its use in an example catchment demonstrated.

  14. Semi-automatic simulation model generation of virtual dynamic networks for production flow planning

    Science.gov (United States)

    Krenczyk, D.; Skolud, B.; Olender, M.

    2016-08-01

    Computer modelling, simulation and visualization of production flow make it possible to increase the efficiency of the production planning process in dynamic manufacturing networks. The use of a semi-automatic model generation concept based on a parametric approach to support production planning processes is presented. The presented approach allows the use of simulation and visualization for the verification of production plans and of alternative topologies of manufacturing network configurations, together with the automatic generation of a series of production flow scenarios. Computational examples using the Enterprise Dynamics simulation software, comprising the steps of production planning and control for a manufacturing network, are also presented.

  15. Using automatic item generation to create multiple-choice test items.

    Science.gov (United States)

    Gierl, Mark J; Lai, Hollis; Turner, Simon R

    2012-08-01

    Many tests of medical knowledge, from the undergraduate level to the level of certification and licensure, contain multiple-choice items. Although these are efficient in measuring examinees' knowledge and skills across diverse content areas, multiple-choice items are time-consuming and expensive to create. Changes in student assessment brought about by new forms of computer-based testing have created the demand for large numbers of multiple-choice items. Our current approaches to item development cannot meet this demand. We present a methodology for developing multiple-choice items based on automatic item generation (AIG) concepts and procedures. We describe a three-stage approach to AIG and we illustrate this approach by generating multiple-choice items for a medical licensure test in the content area of surgery. To generate multiple-choice items, our method requires a three-stage process. Firstly, a cognitive model is created by content specialists. Secondly, item models are developed using the content from the cognitive model. Thirdly, items are generated from the item models using computer software. Using this methodology, we generated 1248 multiple-choice items from one item model. Automatic item generation is a process that involves using models to generate items using computer technology. With our method, content specialists identify and structure the content for the test items, and computer technology systematically combines the content to generate new test items. By combining these outcomes, items can be generated automatically. © Blackwell Publishing Ltd 2012.
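The third stage, generating items from an item model, amounts to systematically substituting content variables into a template. The item model and variable values below are invented placeholders to show the mechanism, not the study's surgery content or its software.

```python
from itertools import product

def generate_items(item_model, variables):
    """Substitute every combination of variable values into the
    item-model template (stage three of the AIG workflow)."""
    names = list(variables)
    return [item_model.format(**dict(zip(names, combo)))
            for combo in product(*variables.values())]

# Hypothetical item model with two content variables.
item_model = ("A patient presents with {finding} following {event}. "
              "What is the most appropriate next step?")
variables = {
    "finding": ["abdominal pain", "fever", "jaundice"],
    "event": ["a fall", "surgery"],
}
items = generate_items(item_model, variables)
```

The combinatorial growth is the point: a single model with a handful of variables yields dozens or, as in the study's 1248-item example, thousands of candidate items, which content specialists then review.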

  16. Advanced Automatic Hexahedral Mesh Generation from Surface Quad Meshes

    OpenAIRE

    Kremer, Michael; Bommes, David; Lim, Isaak; Kobbelt, Leif

    2013-01-01

    International audience; A purely topological approach for the generation of hexahedral meshes from quadrilateral surface meshes of genus zero has been proposed by M. Müller-Hannemann: in a first stage, the input surface mesh is reduced to a single hexahedron by successively eliminating loops from the dual graph of the quad mesh; in the second stage, the hexahedral mesh is constructed by extruding a layer of hexahedra for each dual loop from the first stage in reverse elimination order. In th...

  17. Automatic generation of fuzzy rules for the sensor-based navigation of a mobile robot

    Energy Technology Data Exchange (ETDEWEB)

    Pin, F.G.; Watanabe, Y.

    1994-10-01

    A system for automatic generation of fuzzy rules is proposed which is based on a new approach, called "Fuzzy Behaviorist," and on its associated formalism for rule base development in behavior-based robot control systems. The automated generator of fuzzy rules automatically constructs the set of rules and the associated membership functions that implement reasoning schemes that have been expressed in qualitative terms. The system also checks for completeness of the rule base and independence and/or redundancy of the rules to ensure that the requirements of the formalism are satisfied. Examples of the automatic generation of fuzzy rules for cases involving suppression and/or inhibition of fuzzy behaviors are given and discussed. Experimental results obtained with the automated fuzzy rule generator applied to the domain of sensor-based navigation in a priori unknown environments using one of our autonomous test-bed robots are then presented and discussed to illustrate the feasibility of large-scale automatic fuzzy rule generation using our proposed "Fuzzy Behaviorist" approach.
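The kind of qualitative rule such a generator produces can be illustrated with a tiny hand-written fuzzy controller for obstacle avoidance. The membership shapes, thresholds, and steering angles below are arbitrary illustrative choices, not the paper's Fuzzy Behaviorist formalism or its generated rule base.

```python
def falling(x, a, b):
    """Shoulder membership: 1 below a, linearly falling to 0 at b."""
    if x <= a:
        return 1.0
    if x >= b:
        return 0.0
    return (b - x) / (b - a)

def steering_command(dist, bearing):
    """Two illustrative rules combined by a weighted average:
       IF obstacle NEAR AND LEFT  THEN steer right (+30 deg)
       IF obstacle NEAR AND RIGHT THEN steer left  (-30 deg)
    dist in metres; bearing in degrees, negative to the left."""
    near = falling(dist, 0.5, 2.0)
    left = falling(bearing, -45.0, 0.0)
    right = falling(-bearing, -45.0, 0.0)
    w_right_turn = min(near, left)   # rule strengths via min (AND)
    w_left_turn = min(near, right)
    if w_right_turn + w_left_turn == 0:
        return 0.0                   # no rule fires: hold course
    return (w_right_turn * 30.0 + w_left_turn * -30.0) / (w_right_turn + w_left_turn)
```

An automatic generator replaces the hand-written rule table with rules derived from qualitative behavior specifications, and can additionally verify completeness and non-redundancy, which hand-written tables rarely get.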

  18. Automatic Tool Path Generation for Robot Integrated Surface Sculpturing System

    Science.gov (United States)

    Zhu, Jiang; Suzuki, Ryo; Tanaka, Tomohisa; Saito, Yoshio

    In this paper, a surface sculpturing system based on an 8-axis robot is proposed, and the CAD/CAM software and tool path generation algorithm for this sculpturing system are presented. The 8-axis robot is composed of a 6-axis manipulator and a 2-axis worktable; it carves blocks of polystyrene foam with heated cutting tools. A multi-DOF (Degree of Freedom) robot offers faster fabrication than traditional RP (Rapid Prototyping) methods and more flexibility than CNC machining. With the flexibility of its 8-axis configuration, as well as efficient custom-developed software for rough cutting and finish cutting, this surface sculpturing system can carve sculptured surfaces accurately and efficiently.

  19. Contribution of supraspinal systems to generation of automatic postural responses

    Directory of Open Access Journals (Sweden)

    Tatiana G Deliagina

    2014-10-01

    Full Text Available Different species maintain a particular body orientation in space due to activity of the closed-loop postural control system. In this review we discuss the role of neurons of descending pathways in the operation of this system as revealed in animal models of differing complexity: a lower vertebrate (lamprey) and higher vertebrates (rabbit and cat). In the lamprey and quadruped mammals, the role of spinal and supraspinal mechanisms in the control of posture is different. In the lamprey, the system contains one closed-loop mechanism consisting of supraspino-spinal networks. Reticulospinal (RS) neurons play a key role in the generation of postural corrections. Due to vestibular input, any deviation from the stabilized body orientation leads to activation of a specific population of RS neurons. Each of the neurons activates a specific motor synergy. Collectively, these neurons evoke the motor output necessary for the postural correction. In contrast to lampreys, postural corrections in quadrupeds are primarily based not on vestibular input but on somatosensory input from limb mechanoreceptors. The system contains two closed-loop mechanisms, spinal and spino-supraspinal networks, which supplement each other. Spinal networks receive somatosensory input from the limb signaling postural perturbations, and generate spinal postural limb reflexes. These reflexes are relatively weak, but in intact animals they are enhanced due to both tonic supraspinal drive and phasic supraspinal commands. Recent studies of these supraspinal influences are considered in this review. A hypothesis suggesting common principles of operation of the postural systems stabilizing body orientation in a particular plane in the lamprey and quadrupeds, namely the interaction of antagonistic postural reflexes, is discussed.

  20. Contribution of supraspinal systems to generation of automatic postural responses.

    Science.gov (United States)

    Deliagina, Tatiana G; Beloozerova, Irina N; Orlovsky, Grigori N; Zelenin, Pavel V

    2014-01-01

    Different species maintain a particular body orientation in space due to activity of the closed-loop postural control system. In this review we discuss the role of neurons of descending pathways in operation of this system as revealed in animal models of differing complexity: lower vertebrate (lamprey) and higher vertebrates (rabbit and cat). In the lamprey and quadruped mammals, the role of spinal and supraspinal mechanisms in the control of posture is different. In the lamprey, the system contains one closed-loop mechanism consisting of supraspino-spinal networks. Reticulospinal (RS) neurons play a key role in generation of postural corrections. Due to vestibular input, any deviation from the stabilized body orientation leads to activation of a specific population of RS neurons. Each of the neurons activates a specific motor synergy. Collectively, these neurons evoke the motor output necessary for the postural correction. In contrast to lampreys, postural corrections in quadrupeds are primarily based not on the vestibular input but on the somatosensory input from limb mechanoreceptors. The system contains two closed-loop mechanisms - spinal and spino-supraspinal networks, which supplement each other. Spinal networks receive somatosensory input from the limb signaling postural perturbations, and generate spinal postural limb reflexes. These reflexes are relatively weak, but in intact animals they are enhanced due to both tonic supraspinal drive and phasic supraspinal commands. Recent studies of these supraspinal influences are considered in this review. A hypothesis suggesting common principles of operation of the postural systems stabilizing body orientation in a particular plane in the lamprey and quadrupeds, that is interaction of antagonistic postural reflexes, is discussed.

  1. Automatic, context-specific generation of Gene Ontology slims

    Directory of Open Access Journals (Sweden)

    Sehgal Muhammad

    2010-10-01

    Full Text Available Abstract Background The use of ontologies to control vocabulary and structure annotation has added value to genome-scale data, and contributed to the capture and re-use of knowledge across research domains. Gene Ontology (GO) is widely used to capture detailed expert knowledge in genomic-scale datasets and as a consequence has grown to contain many terms, making it unwieldy for many applications. To increase its ease of manipulation and efficiency of use, subsets called GO slims are often created by collapsing terms upward into more general, high-level terms relevant to a particular context. Creation of a GO slim currently requires manipulation and editing of GO by an expert (or community) familiar with both the ontology and the biological context. Decisions about which terms to include are necessarily subjective, and the creation process itself and subsequent curation are time-consuming and largely manual. Results Here we present an objective framework for generating customised ontology slims for specific annotated datasets, exploiting information latent in the structure of the ontology graph and in the annotation data. This framework combines ontology engineering approaches, and a data-driven algorithm that draws on graph and information theory. We illustrate this method by application to GO, generating GO slims at different information thresholds, characterising their depth of semantics and demonstrating the resulting gains in statistical power. Conclusions Our GO slim creation pipeline is available for use in conjunction with any GO-annotated dataset, and creates dataset-specific, objectively defined slims. This method is fast and scalable for application to other biomedical ontologies.
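The idea of collapsing terms upward at an information threshold can be sketched with a toy information-content cutoff: rare, specific terms (high -log2 p) are mapped to more general ancestors until they become common enough. The real pipeline's algorithm draws on graph and information theory in more detail, so this is only a simplified illustration with invented term names.

```python
import math

def make_slim(counts, parent, total, max_ic):
    """Map each term upward until its information content -log2(p)
    falls to max_ic or below; the surviving targets form the slim."""
    def ic(term):
        return -math.log2(counts[term] / total)

    mapping = {}
    for term in counts:
        t = term
        while ic(t) > max_ic and parent.get(t):
            t = parent[t]  # collapse into the more general ancestor
        mapping[term] = t
    return mapping
```

Lowering the threshold yields a smaller, more general slim; raising it preserves more specific terms, which is the trade-off between semantic depth and statistical power the article characterises.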

  2. Automatic Generation of Setup for CNC Spring Coiler Based on Case-based Reasoning

    Institute of Scientific and Technical Information of China (English)

    KU Xiangchen; WANG Runxiao; LI Jishun; WANG Dongbo

    2006-01-01

    When producing special-shape springs on a CNC spring coiler, the setup of the coiler is often a manual task using a trial-and-error method. As a result, the setup of the coiler consumes much time and becomes the bottleneck of the spring production process. In order to cope with this situation, this paper proposes an automatic setup generation system for CNC spring coilers using case-based reasoning (CBR). The core of the study contains: (1) an integrated reasoning model of the CBR system; (2) a feature-based spatial shape description of special-shape springs; (3) coiling case representation using a shape feature matrix; and (4) a case similarity measure algorithm. The automatic generation system has been implemented with C++ Builder 6.0 and is helpful in improving the automation and efficiency of spring coilers.

  3. Automatic Generation of Proof Tactics for Finite-Valued Logics

    Directory of Open Access Journals (Sweden)

    João Marcos

    2010-03-01

    Full Text Available A number of flexible tactic-based logical frameworks are nowadays available that can implement a wide range of mathematical theories using a common higher-order metalanguage. Used as proof assistants, one of the advantages of such powerful systems resides in their responsiveness to extensibility of their reasoning capabilities, being designed over rule-based programming languages that allow the user to build her own `programs to construct proofs' - the so-called proof tactics. The present contribution discusses the implementation of an algorithm that generates sound and complete tableau systems for a very inclusive class of sufficiently expressive finite-valued propositional logics, and then illustrates some of the challenges and difficulties related to the algorithmic formation of automated theorem proving tactics for such logics. The procedure on whose implementation we will report is based on a generalized notion of analyticity of proof systems that is intended to guarantee termination of the corresponding automated tactics on what concerns theoremhood in our targeted logics.

  4. Automatic Generation of Printed Catalogs: An Initial Attempt

    Directory of Open Access Journals (Sweden)

    Jared Camins-Esakov

    2010-06-01

    Full Text Available Printed catalogs are useful in a variety of contexts. In special collections, they are often used as reference tools and to commemorate exhibits. They are useful in settings, such as in developing countries, where reliable access to the Internet—or even electricity—is not available. In addition, many private collectors like to have printed catalogs of their collections. All the information needed for creating printed catalogs is readily available in the MARC bibliographic records used by most libraries, but there are no turnkey solutions available for the conversion from MARC to printed catalog. This article describes the development of a system, available on github, that uses XSLT, Perl, and LaTeX to produce press-ready PDFs from MARCXML files. The article particularly focuses on the two XSLT stylesheets which comprise the core of the system, and do the "heavy lifting" of sorting and indexing the entries in the catalog. The author also highlights points where the data stored in MARC bibliographic records requires particular "massaging," and suggests improvements for future attempts at automated printed catalog generation.

  5. Students' Feedback Preferences: How Do Students React to Timely and Automatically Generated Assessment Feedback?

    Science.gov (United States)

    Bayerlein, Leopold

    2014-01-01

    This study assesses whether or not undergraduate and postgraduate accounting students at an Australian university differentiate between timely feedback and extremely timely feedback, and whether or not the replacement of manually written formal assessment feedback with automatically generated feedback influences students' perception of feedback…

  6. Using Automatic Code Generation in the Attitude Control Flight Software Engineering Process

    Science.gov (United States)

    McComas, David; O'Donnell, James R., Jr.; Andrews, Stephen F.

    1999-01-01

    This paper presents an overview of the attitude control subsystem flight software development process, identifies how the process has changed due to automatic code generation, analyzes each software development phase in detail, and concludes with a summary of our lessons learned.

  7. A GA-fuzzy automatic generation controller for interconnected power system

    CSIR Research Space (South Africa)

    Boesack, CD

    2011-10-01

    Full Text Available This paper presents a GA-Fuzzy Automatic Generation Controller for large interconnected power systems. The design of Fuzzy Logic Controllers by means of expert knowledge have typically been the traditional design norm, however, this may not yield...

  8. Using Automatic Code Generation in the Attitude Control Flight Software Engineering Process

    Science.gov (United States)

    McComas, David; O'Donnell, James R., Jr.; Andrews, Stephen F.

    1999-01-01

    This paper presents an overview of the attitude control subsystem flight software development process, identifies how the process has changed due to automatic code generation, analyzes each software development phase in detail, and concludes with a summary of our lessons learned.

  9. Students' Feedback Preferences: How Do Students React to Timely and Automatically Generated Assessment Feedback?

    Science.gov (United States)

    Bayerlein, Leopold

    2014-01-01

    This study assesses whether or not undergraduate and postgraduate accounting students at an Australian university differentiate between timely feedback and extremely timely feedback, and whether or not the replacement of manually written formal assessment feedback with automatically generated feedback influences students' perception of…

  10. Accuracy assessment of building point clouds automatically generated from iphone images

    Science.gov (United States)

    Sirmacek, B.; Lindenbergh, R.

    2014-06-01

    Low-cost sensor-generated 3D models can be useful for quick 3D urban model updating, yet the quality of the models is questionable. In this article, we evaluate the reliability of an automatic point cloud generation method using multi-view iPhone images or an iPhone video file as an input. We register such an automatically generated point cloud on a TLS point cloud of the same object to discuss the accuracy, advantages and limitations of the iPhone-generated point clouds. For the chosen example showcase, we have classified 1.23% of the iPhone point cloud points as outliers, and calculated the mean of the point-to-point distances to the TLS point cloud as 0.11 m. Since a TLS point cloud might also include measurement errors and noise, we computed local noise values for the point clouds from both sources. The mean (μ) and standard deviation (σ) of the roughness histograms are (μ1 = 0.44 m, σ1 = 0.071 m) and (μ2 = 0.025 m, σ2 = 0.037 m) for the iPhone and TLS point clouds respectively. Our experimental results indicate the possible use of the proposed automatic 3D model generation framework for 3D urban map updating, fusion and detail enhancement, and for quick, real-time change detection. However, further insight should first be gained into the circumstances needed to guarantee successful point cloud generation from smartphone images.
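The reported 0.11 m figure is a mean point-to-point distance between the registered clouds. Such a metric reduces to a nearest-neighbour average; the brute-force sketch below shows the computation (a k-d tree would be used for clouds of realistic size), and is a generic illustration rather than the authors' evaluation code.

```python
import math

def mean_point_to_point(cloud_a, cloud_b):
    """Mean distance from each point in cloud_a to its nearest
    neighbour in cloud_b (brute force; O(n*m))."""
    return sum(min(math.dist(p, q) for q in cloud_b)
               for p in cloud_a) / len(cloud_a)
```

Note the metric is asymmetric (a → b is not b → a), so it is sensitive to which cloud is treated as the reference; comparing against the denser TLS cloud, as here, is the usual choice.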

  11. Development of a system of automatic gap-adjusted electrodes for shock wave generators

    Science.gov (United States)

    Manousakas, Ioannis; Liang, Shen-Min; Wan, Long-Ray; Wang, Chia-Hui

    2004-11-01

    In this study, a system of automatic gap-adjusted electrodes is developed for electrohydraulic shock wave generators that can be used both for extracorporeal shock wave lithotripsy (treatment of renal calculi) and for the extracorporeal shock wave therapy for musculo-skeletal disorders. This system is composed of three main components: (1) two electrodes and their bases; (2) servo motors and control software; (3) a high sensitivity CCD camera and image processing program. To verify system performance, in vitro fragmentation tests have been conducted using kidney stone phantoms. Results indicate that the efficiency of stone fragmentation using automatic gap adjustment can be increased up to 55.2%, which is twice more than without automatic gap adjustment (26.7%). This system can be applied to any commercial electrohydraulic extracorporeal shock wave lithotriptor or orthotriptor without difficulty.

  12. The mesh-matching algorithm: an automatic 3D mesh generator for Finite element structures

    CERN Document Server

    Couteau, B; Payan, Yohan; Lavallée, Stéphane

    2000-01-01

    Several authors have employed Finite Element Analysis (FEA) for stress and strain analysis in orthopaedic biomechanics. Unfortunately, building three-dimensional models is time consuming, so the number of analyses that can be performed is limited. The authors have investigated a new method allowing automatic 3D mesh generation for structures as complex as bone. This method, called the Mesh-Matching (M-M) algorithm, automatically generates customized 3D meshes of bones from an already existing model. The M-M algorithm was used to generate FE models of ten proximal human femora from an initial model that had been experimentally validated. The new meshes yielded satisfactory results.

  13. Research on Community Competition and Adaptive Genetic Algorithm for Automatic Generation of Tang Poetry

    Directory of Open Access Journals (Sweden)

    Wujian Yang

    2016-01-01

    Full Text Available Research on traditional Tang poetry is extensive, and the automatic generation of Tang poetry has drawn great interest in recent years. This study presents a genetic algorithm with community-based competition and adaptive operators for automatically generating Tang poetry. Community-based competition is added to maintain the diversity of genes during evolution; adaptation means that the crossover and mutation probabilities vary with the fitness values of the generated poems, preventing premature convergence and producing better poems more quickly. Analysis of the experimental results shows that the improved algorithm is superior to the conventional method.
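
    The adaptive part, in which crossover and mutation probabilities vary with fitness, can be sketched in the classic Srinivas-Patnaik style; the rate ceilings below are illustrative assumptions, not values from the paper:

```python
def adaptive_rates(f, f_avg, f_max, pc_max=0.9, pm_max=0.1):
    """Adaptive crossover/mutation probabilities in the classic
    Srinivas-Patnaik style: above-average individuals get scaled-down
    rates (they are preserved), below-average ones get the maximum
    rates (they are explored).  `f` is the fitness of the individual
    (for crossover, conventionally the fitter parent)."""
    if f_max == f_avg:                 # degenerate population, all identical
        return pc_max, pm_max
    if f >= f_avg:
        scale = (f_max - f) / (f_max - f_avg)
        return pc_max * scale, pm_max * scale
    return pc_max, pm_max

pc, pm = adaptive_rates(f=0.8, f_avg=0.5, f_max=1.0)   # a good poem: low rates
```

    A high-fitness poem is thus carried over nearly intact, while weak poems are recombined and mutated aggressively, which is what keeps the population diverse without stalling convergence.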

  14. Automatically-generated rectal dose constraints in intensity-modulated radiation therapy for prostate cancer

    Science.gov (United States)

    Hwang, Taejin; Kim, Yong Nam; Kim, Soo Kon; Kang, Sei-Kwon; Cheong, Kwang-Ho; Park, Soah; Yoon, Jai-Woong; Han, Taejin; Kim, Haeyoung; Lee, Meyeon; Kim, Kyoung-Joo; Bae, Hoonsik; Suh, Tae-Suk

    2015-06-01

    The dose constraint during prostate intensity-modulated radiation therapy (IMRT) optimization should be patient-specific for better rectum sparing. The aims of this study are to suggest a novel method for automatically generating a patient-specific dose constraint by using an experience-based dose volume histogram (DVH) of the rectum and to evaluate the potential of such a dose constraint qualitatively. The normal tissue complication probabilities (NTCPs) of the rectum with respect to V%ratio in our study were divided into three groups, where V%ratio is defined as the percent ratio of the rectal volume overlapping the planning target volume (PTV) to the rectal volume: (1) the rectal NTCPs in the previous study (clinical data), (2) those statistically generated by using the standard normal distribution (calculated data), and (3) those generated by combining the calculated data and the clinical data (mixed data). For the calculated data, a random number whose mean value was on the fitted curve described by the clinical data and whose standard deviation was 1% was generated by using the `randn' function in MATLAB. For each group, we validated whether the probability density function (PDF) of the rectal NTCP could be automatically generated with the density estimation method by using a Gaussian kernel. The results revealed that the rectal NTCP increased in proportion to V%ratio, that the predictive rectal NTCP was patient-specific, and that the starting point of IMRT optimization might differ between patients. The PDF of the rectal NTCP was obtained automatically for each group, with the smoothness of the probability distribution increasing with the number of data and with the window width. We showed that during prostate IMRT optimization, patient-specific dose constraints could be automatically generated and that our method could reduce the IMRT optimization time as well as maintain the
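
    The Gaussian-kernel density estimation step can be sketched as follows; the NTCP values and bandwidth below are toy numbers, not clinical data:

```python
import math

def gaussian_kde(samples, h):
    """Estimate a probability density function from `samples` with a
    Gaussian kernel of bandwidth (window width) h; a wider h gives a
    smoother curve, as the abstract notes."""
    n = len(samples)
    norm = 1.0 / (n * h * math.sqrt(2.0 * math.pi))
    def pdf(x):
        return norm * sum(math.exp(-0.5 * ((x - s) / h) ** 2) for s in samples)
    return pdf

# Toy NTCP values (fractions) for patients with a similar V%ratio.
ntcp_samples = [0.10, 0.12, 0.11, 0.15, 0.13]
pdf = gaussian_kde(ntcp_samples, h=0.01)
```

    The resulting `pdf` integrates to one and peaks where the observed NTCP values cluster, which is what makes it usable as a starting constraint for a new patient with a similar V%ratio.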

  15. Automatic WSDL-guided Test Case Generation for PropEr Testing of Web Services

    Directory of Open Access Journals (Sweden)

    Konstantinos Sagonas

    2012-10-01

    Full Text Available With web services already being key ingredients of modern web systems, automatic and easy-to-use but at the same time powerful and expressive testing frameworks for web services are increasingly important. Our work aims at fully automatic testing of web services: ideally the user only specifies properties that the web service is expected to satisfy, in the form of input-output relations, and the system handles all the rest. In this paper we present in detail the component which lies at the heart of this system: how the WSDL specification of a web service is used to automatically create test case generators that can be fed to PropEr, a property-based testing tool, to create structurally valid random test cases for its operations and check its responses. Although the process is fully automatic, our tool optionally allows the user to easily modify its output to either add semantic information to the generators or write properties that test for more involved functionality of the web services.

  16. Automatic treatment planning facilitates fast generation of high-quality treatment plans for esophageal cancer

    DEFF Research Database (Denmark)

    Hansen, Christian Rønn; Nielsen, Morten; Bertelsen, Anders Smedegaard

    2017-01-01

    BACKGROUND: The quality of radiotherapy planning has improved substantially in the last decade with the introduction of intensity modulated radiotherapy. The purpose of this study was to analyze the plan quality and efficacy of automatically (AU) generated VMAT plans for inoperable esophageal cancer patients. MATERIAL AND METHODS: Thirty-two consecutive inoperable patients with esophageal cancer originally treated with manually (MA) generated volumetric modulated arc therapy (VMAT) plans were retrospectively replanned using an auto-planning engine. All plans were optimized with one full 6MV ... to the lungs. The automation of the planning process generated esophageal cancer treatment plans quickly and with high quality.

  17. Ground-based observations of exoplanet atmospheres

    NARCIS (Netherlands)

    Mooij, Ernst Johan Walter de

    2011-01-01

    This thesis focuses on the properties of exoplanet atmospheres. The results for ground-based near-infrared secondary eclipse observations of three different exoplanets, TrES-3b, HAT-P-1b and WASP-33b, are presented which have been obtained with ground-based telescopes as part of the GROUSE project.

  19. Towards automatically generating graphical user interfaces from openEHR archetypes.

    Science.gov (United States)

    Schuler, Thilo; Garde, Sebastian; Heard, Sam; Beale, Thomas

    2006-01-01

    One of the main challenges in the field of Electronic Health Records (EHRs) is semantic interoperability. To utilise the full potential of interoperable EHR systems they have to be accepted by their users, the health care providers. Good Graphical User Interfaces (GUIs) that support customisation and data validation play a decisive role for user acceptance and data quality. This study investigates the use of openEHR archetypes to automatically generate coherent, customizable, data-validating GUIs. Using the Mozilla XML User Interface Language (XUL) a series of prototypes has been developed. The results show that the automatic generation of GUIs from openEHR archetypes is feasible in principle. Although XUL revealed some problems, the advantages of XML-based GUI languages are evident.

  20. Automatic finite elements mesh generation from planar contours of the brain: an image driven 'blobby' approach

    CERN Document Server

    Bucki, Marek; Payan, Yohan

    2005-01-01

    In this paper, we address the problem of automatic mesh generation for finite elements modeling of anatomical organs for which a volumetric data set is available. In the first step a set of characteristic outlines of the organ is defined manually or automatically within the volume. The outlines define the "key frames" that will guide the procedure of surface reconstruction. Then, based on this information, and along with organ surface curvature information extracted from the volume data, a 3D scalar field is generated. This field allows a 3D reconstruction of the organ: as an iso-surface model, using a marching cubes algorithm; or as a 3D mesh, using a grid "immersion" technique, the field value being used as the outside/inside test. The final reconstruction respects the various topological changes that occur within the organ, such as holes and branching elements.
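
    A minimal sketch of the grid "immersion" idea, with the field value used as the outside/inside test; the spherical "blobby" field, bounds and resolution are illustrative assumptions:

```python
import numpy as np

def immerse_grid(field_fn, bounds, n, iso=0.0):
    """Mark the cells of a regular n*n*n lattice whose centers lie inside
    the iso-surface (field value > iso); the marked cells would become
    the hexahedral elements of the mesh."""
    lo, hi = bounds
    c = lo + (np.arange(n) + 0.5) * (hi - lo) / n       # cell-center coordinates
    x, y, z = np.meshgrid(c, c, c, indexing='ij')
    return field_fn(x, y, z) > iso

# 'Blobby' scalar field that is positive inside a unit sphere.
inside = immerse_grid(lambda x, y, z: 1.0 - (x**2 + y**2 + z**2),
                      bounds=(-1.5, 1.5), n=20)
fill_ratio = inside.mean()   # fraction of cells kept for the mesh
```

    The same boolean volume can instead be handed to a marching-cubes routine to extract the iso-surface, which is the first of the two reconstruction paths the abstract mentions.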

  1. OntoDiagram: Automatic Diagram Generation for Congenital Heart Defects in Pediatric Cardiology

    OpenAIRE

    Vishwanath, Kartik; Viswanath, Venkatesh; Drake, William; Lee, Yugyung

    2005-01-01

    In pediatric cardiology as well as many other medical specialties, the accurate portrayal of a large volume of patient information is crucial to providing good patient care. Our research aims at utilizing clinical and spatial ontologies representing the human heart, to automatically generate a Mullins-like diagram [6] based on a patient's information in the cardiology databases. Our ontology allows an intuitive way of modeling congenital defects with the structure of the hum...

  2. Automatic Generation of English-Japanese Translation Pattern Utilizing Genetic Programming Technique

    Science.gov (United States)

    Matsumura, Koki; Tamekuni, Yuji; Kimura, Shuhei

    English and Japanese phrase templates differ considerably in construction, which often makes translation difficult. Moreover, the phrase templates and sentences to be referred to are numerous and varied, and it is not easy to prepare a corpus that covers them all. Automatically generating translation patterns is therefore significant both for the translation success rate and for the capacity of the pattern dictionary. For this purpose, this paper proposes a new method for generating translation patterns using the genetic programming technique (GP). The technique tries to automatically generate translation patterns for sentences not registered in the phrase template dictionary by applying genetic operations to the parse trees of basic patterns. Each tree consists of a paired English-Japanese sentence generated as the first-stage population. Parse-tree databases with 50, 100, 150 and 200 pairs were prepared as the first-stage population, and the system was executed on an English input of 1,555 sentences. As a result, the number of parse trees increased from 200 to 517, and the accuracy rate of the translation patterns improved from 42.57% to 70.10%. Moreover, 86.71% of the generated translations were successful, with meanings that are acceptable and understandable. The proposed technique appears to be a promising way to raise the translation success rate and to reduce the size of the parse-tree database.

  3. Design of an Intelligent Interlocking System Based on Automatically Generated Interlocking Table

    Energy Technology Data Exchange (ETDEWEB)

    Ko, Y.S. [Namseoul University, Chonan (Korea)

    2002-03-01

    In this paper, we propose an expert system for electronic interlocking which enhances the safety, efficiency and expandability of the existing system by designing real-time interlocking control based on an interlocking table generated automatically with an artificial intelligence approach. The expert system consists of two parts: an interlocking table generation part and a real-time interlocking control part. The former automatically generates the interlocking relationships of all possible routes by dynamically searching the station topology obtained from the station database. The latter controls the status of station facilities in real time by applying the generated interlocking relationships to signal facilities such as signal devices, points and track circuits for a given route. The expert system is implemented in the C language, which is well suited to the interlocking table generation part thanks to dynamic memory allocation. Finally, the effectiveness of the expert system is demonstrated by simulation on a typical station model. (author). 11 refs., 9 figs., 2 tabs.

  4. Automatic Generation of Cycle-Approximate TLMs with Timed RTOS Model Support

    Science.gov (United States)

    Hwang, Yonghyun; Schirner, Gunar; Abdi, Samar

    This paper presents a technique for automatically generating cycle-approximate transaction level models (TLMs) for multi-process applications mapped to embedded platforms. It incorporates three key features: (a) basic block level timing annotation, (b) RTOS model integration, and (c) RTOS overhead delay modeling. The inputs to TLM generation are application C processes and their mapping to processors in the platform. A processor data model, including pipelined datapath, memory hierarchy and branch delay model is used to estimate basic block execution delays. The delays are annotated to the C code, which is then integrated with a generated SystemC RTOS model. Our abstract RTOS provides dynamic scheduling and inter-process communication (IPC) with processor- and RTOS-specific pre-characterized timing. Our experiments using a MP3 decoder and a JPEG encoder show that timed TLMs, with integrated RTOS models, can be automatically generated in less than a minute. Our generated TLMs simulated three times faster than real-time and showed less than 10% timing error compared to board measurements.

  5. A Method of Generating Indoor Map Spatial Data Automatically from Architectural Plans

    Directory of Open Access Journals (Sweden)

    SUN Weixin

    2016-06-01

    Full Text Available Taking architectural plans as the data source, we propose a method that automatically generates indoor map spatial data. First, referring to the spatial data demands of indoor maps, we analyze the basic characteristics of architectural plans and introduce the concepts of wall segment, adjoining node and adjoining wall segment, on which the basic workflow of automatic indoor map spatial data generation is established. Then, according to the adjoining relation between wall lines at intersections with columns, we construct a repair method for wall connectivity in relation to columns. Using gradual expansion and graphic reasoning to judge the local wall symbol feature type at both sides of a door or window, and by updating the enclosing rectangle of the door or window, we develop a repair method for wall connectivity in relation to doors and windows, together with a method for transforming a door or window into an indoor map point feature. Finally, on the basis of the geometric relation between adjoining wall segment median lines, a wall center-line extraction algorithm is presented. Taking an exhibition hall's architectural plan as an example, our experiments show that the proposed methods handle various complex situations well and extract indoor map spatial data effectively.
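
    The final step, extracting a wall center-line from two adjoining wall lines, can be sketched in drastically simplified form; the endpoint-matching rule and the 0.24 m wall thickness are illustrative assumptions, not details from the paper:

```python
def wall_centerline(seg_a, seg_b):
    """Center-line of a wall drawn as two roughly parallel segments.
    Each segment is ((x1, y1), (x2, y2)); the endpoints of seg_b are
    matched to the nearer endpoints of seg_a before averaging, so the
    drawing direction of the segments does not matter."""
    (a1, a2), (b1, b2) = seg_a, seg_b
    def d2(p, q):
        return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2
    if d2(a1, b1) + d2(a2, b2) > d2(a1, b2) + d2(a2, b1):
        b1, b2 = b2, b1                        # flip seg_b to match seg_a
    def mid(p, q):
        return ((p[0] + q[0]) / 2.0, (p[1] + q[1]) / 2.0)
    return mid(a1, b1), mid(a2, b2)

# Two wall lines 0.24 m apart, drawn in opposite directions.
center = wall_centerline(((0.0, 0.0), (4.0, 0.0)), ((4.0, 0.24), (0.0, 0.24)))
```

    A real implementation would first group wall lines into adjoining wall segments and repair the gaps at columns, doors and windows, as the abstract describes, before averaging.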

  6. A Genetic Algorithm Optimised Fuzzy Logic Controller for Automatic Generation Control for Single Area System

    Science.gov (United States)

    Saini, J. S.; Jain, V.

    2015-03-01

    This paper presents a genetic algorithm (GA)-based design and optimization of a fuzzy logic controller (FLC) for automatic generation control (AGC) of a single area. FLCs are characterized by a set of parameters, which are optimized using a GA to improve their performance. The design of the input and output membership functions (mfs) of the FLC is carried out by automatically tuning (off-line) the parameters of the membership functions. Tuning is based on maximization of a comprehensive fitness function constructed as the inverse of a weighted average of three performance indices: the integral square deviation (ISD, the integral of the square of the frequency deviation), peak overshoot (Mp), and settling time (ts). The GA-optimized FLC (GAFLC) shows better performance than a conventional proportional-integral (PI) controller and a hand-designed fuzzy logic controller, not only for the standard system (in terms of frequency deviations) but also under parametric and load disturbances.

  7. Evaluating the Potential of Imaging Rover for Automatic Point Cloud Generation

    Science.gov (United States)

    Cera, V.; Campi, M.

    2017-02-01

    The paper presents a phase of an on-going interdisciplinary research project concerning the medieval site of Casertavecchia (Italy). The project aims to develop a multi-technique approach for semantically enriched 3D modeling starting from the automatic acquisition of several data sources. In particular, the paper reports the results of the first stage, concerning the Cathedral square of the medieval village. The work is focused on evaluating the potential of an imaging rover for automatic point cloud generation. Each survey technique has its own advantages and disadvantages, so the ideal approach is an integrated methodology that maximizes the performance of each instrument. The experimentation was conducted on the Cathedral square of the ancient site of Casertavecchia, in Campania, Italy.

  8. Development of ANJOYMC Program for Automatic Generation of Monte Carlo Cross Section Libraries

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Kang Seog; Lee, Chung Chan

    2007-03-15

    The NJOY code, developed at Los Alamos National Laboratory, generates cross section libraries in ACE format for Monte Carlo codes such as MCNP and McCARD by processing evaluated nuclear data in ENDF/B format. It takes a long time to prepare all the NJOY input files for hundreds of nuclides at various temperatures, and the input files can contain errors. To solve these problems, the ANJOYMC program has been developed. From a simple user input deck, the program not only generates all the NJOY input files automatically, but also generates a batch file to perform all the NJOY calculations. ANJOYMC is written in Fortran 90 and runs under the Windows and Linux operating systems on a personal computer. Cross section libraries in ACE format can thus be generated quickly and without error from a simple user input deck.
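
    The batching idea can be sketched as follows; the deck format, nuclide list and file contents are placeholders (ANJOYMC itself is Fortran 90, and a real NJOY deck looks nothing like this):

```python
import os
import tempfile

# Simplified user deck: (nuclide, ENDF tape id, temperatures in K).
# The deck contents written below are placeholders, not a valid NJOY input.
cases = [
    ("U235", "tape20", [293.6, 600.0]),
    ("H1",   "tape21", [293.6]),
]

def write_inputs(cases, outdir):
    """Expand the deck into one input file per (nuclide, temperature)
    case plus a run_all.sh batch script that executes every job in order."""
    os.makedirs(outdir, exist_ok=True)
    batch_lines = []
    for nuclide, tape, temps in cases:
        for t in temps:
            name = f"{nuclide}_{t:.0f}K"
            with open(os.path.join(outdir, name + ".inp"), "w") as f:
                f.write(f"-- auto-generated deck for {nuclide} at {t} K\n")
                f.write(f"-- source evaluation: {tape}\n")
            batch_lines.append(f"njoy < {name}.inp > {name}.out")
    with open(os.path.join(outdir, "run_all.sh"), "w") as f:
        f.write("\n".join(batch_lines) + "\n")
    return batch_lines

jobs = write_inputs(cases, os.path.join(tempfile.mkdtemp(), "njoy_runs"))
```

    Generating every case from one deck is what removes the transcription errors the abstract mentions: the per-nuclide files are never edited by hand.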

  9. Automatic Generation Control Using PI Controller with Bacterial Foraging for both Thermal and Hydro Plants

    Directory of Open Access Journals (Sweden)

    Preeti Hooda,

    2014-06-01

    Full Text Available Load-frequency control (LFC) restores the balance between load and generation in each control area by means of speed control. In power systems, the main goal of LFC, or automatic generation control (AGC), is to maintain the frequency of each area and the tie-line power flows within specified tolerances by adjusting the MW outputs of the LFC generators to accommodate fluctuating load demands. In this paper, a scheme is developed for automatic generation control in a restructured environment, considering the effects of contracts between DISCOs and GENCOs to keep the power system network in the normal state, where the GENCOs comprise both hydro and thermal plants. A bacterial foraging optimization technique is developed and applied to AGC in an interconnected four-area system. The system performance is obtained with the MATLAB Simulink tool, and the results are shown as frequency and power responses for the four-area AGC system. Both thermal plants (modeled with a reheat transfer function) and hydro plants are used on the GENCO side.

  10. Automatic Generation of Data Types for Classification of Deep Web Sources

    Energy Technology Data Exchange (ETDEWEB)

    Ngu, A H; Buttler, D J; Critchlow, T J

    2005-02-14

    A Service Class Description (SCD) is an effective metadata-based approach for discovering Deep Web sources whose data exhibit regular patterns. However, creating an SCD description manually is tedious and error-prone, and a manually created SCD does not adapt to the frequent changes of Web sources: it requires its creator to identify all possible input and output types of a service a priori, and in many domains it is impossible to exhaustively list all the possible input and output data types of a source in advance. In this paper, we describe machine learning approaches for automatic generation of the data types of an SCD. We propose two different approaches for learning the data types of a class of Web sources. The Brute-Force Learner generates data types that achieve high recall but low precision; the Clustering-based Learner generates data types with a high precision rate but a lower recall rate. We demonstrate the feasibility of these two learning-based solutions for automatic generation of data types for citation Web sources and present a quantitative evaluation of both.

  11. Three-dimensional elliptic grid generation with fully automatic boundary constraints

    Science.gov (United States)

    Kaul, Upender K.

    2010-08-01

    A new procedure for generating smooth uniformly clustered three-dimensional structured elliptic grids is presented here which formulates three-dimensional boundary constraints by extending the two-dimensional counterpart presented by the author earlier. This fully automatic procedure obviates the need for manual specification of decay parameters over the six bounding surfaces of a given volume grid. The procedure has been demonstrated here for the Mars Science Laboratory (MSL) geometries such as aeroshell and canopy, as well as the Inflatable Aerodynamic Decelerator (IAD) geometry and a 3D analytically defined geometry. The new procedure also enables generation of single-block grids for such geometries because the automatic boundary constraints permit the decay parameters to evolve as part of the solution to the elliptic grid system of equations. These decay parameters are no longer just constants, as specified in the conventional approach, but functions of generalized coordinate variables over a given bounding surface. Since these decay functions vary over a given boundary, orthogonal grids around any arbitrary simply-connected boundary can be clustered automatically without having to break up the boundaries and the corresponding interior or exterior domains into various blocks for grid generation. The new boundary constraints are not limited to the simply-connected regions only, but can also be formulated around multiply-connected and isolated regions in the interior. The proposed method is superior to other methods of grid generation such as algebraic and hyperbolic techniques in that the grids obtained here are C2 continuous, whereas simple elliptic smoothing of algebraic or hyperbolic grids to enforce C2 continuity destroys the grid clustering near the boundaries. US patent 7231329.

  12. On the application of bezier surfaces for GA-Fuzzy controller design for use in automatic generation control

    CSIR Research Space (South Africa)

    Boesack, CD

    2012-03-01

    Full Text Available Automatic Generation Control (AGC) of large interconnected power systems are typically controlled by a PI or PID type control law. Recently intelligent control techniques such as GA-Fuzzy controllers have been widely applied within the power...

  13. Algorithms for the automatic generation of 2-D structured multi-block grids

    Science.gov (United States)

    Schoenfeld, Thilo; Weinerfelt, Per; Jenssen, Carl B.

    1995-01-01

    Two different approaches to the fully automatic generation of structured multi-block grids in two dimensions are presented. The work aims to simplify the user interactivity necessary for the definition of a multiple block grid topology. The first approach is based on an advancing front method commonly used for the generation of unstructured grids. The original algorithm has been modified toward the generation of large quadrilateral elements. The second method is based on the divide-and-conquer paradigm with the global domain recursively partitioned into sub-domains. For either method each of the resulting blocks is then meshed using transfinite interpolation and elliptic smoothing. The applicability of these methods to practical problems is demonstrated for typical geometries of fluid dynamics.
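
    The transfinite interpolation used to fill each block can be sketched for the 2-D case; on a unit square with straight boundary curves it reproduces a uniform grid:

```python
import numpy as np

def tfi(bottom, top, left, right):
    """2-D transfinite interpolation: fill the interior of a block from
    its four boundary curves, given as (n, 2) and (m, 2) point arrays
    whose corners agree.  Returns an (n, m, 2) grid of coordinates."""
    n, m = len(bottom), len(left)
    s = np.linspace(0.0, 1.0, n)[:, None, None]   # parameter along bottom/top
    t = np.linspace(0.0, 1.0, m)[None, :, None]   # parameter along left/right
    # Boolean sum of the two linear lofts, minus the bilinear corner term.
    return ((1 - t) * bottom[:, None, :] + t * top[:, None, :]
            + (1 - s) * left[None, :, :] + s * right[None, :, :]
            - (1 - s) * (1 - t) * bottom[0] - s * t * top[-1]
            - (1 - s) * t * top[0] - s * (1 - t) * bottom[-1])

# Unit square with straight boundaries: TFI yields a uniform grid.
n = 5
line = np.linspace(0.0, 1.0, n)
bottom = np.stack([line, np.zeros(n)], axis=1)
top    = np.stack([line, np.ones(n)],  axis=1)
left   = np.stack([np.zeros(n), line], axis=1)
right  = np.stack([np.ones(n),  line], axis=1)
grid = tfi(bottom, top, left, right)
```

    In the paper's pipeline this algebraic grid is only a starting point; elliptic smoothing is then applied to each block to improve orthogonality and spacing near the boundaries.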

  14. Slow Dynamics Model of Compressed Air Energy Storage and Battery Storage Technologies for Automatic Generation Control

    Energy Technology Data Exchange (ETDEWEB)

    Krishnan, Venkat; Das, Trishna

    2016-05-01

    Increasing variable generation penetration, and the consequent increase in short-term variability, makes energy storage technologies attractive, especially in the ancillary market for providing frequency regulation services. This paper presents slow-dynamics models for compressed air energy storage and battery storage technologies that can be used in automatic generation control studies to assess the system frequency response and quantify the benefits of storage technologies in providing regulation service. The paper also presents the slow-dynamics model of the power system integrated with storage technologies in complete state-space form. The storage technologies have been integrated into the single-area IEEE 24-bus system, and a comparative study of various solution strategies, including transmission enhancement and combustion turbines, has been performed in terms of generation cycling and frequency response performance metrics.

  15. Automatic test pattern generation for logic circuits using the Boolean tree

    Energy Technology Data Exchange (ETDEWEB)

    Jeong Taegwon.

    1991-01-01

    The goal of this study was to develop an algorithm that can automatically generate test patterns for combinational and sequential logic circuits. The proposed algorithm generates a test pattern by using a special tree called a modified Boolean tree. In this algorithm, the construction of the modified Boolean tree is the most time-consuming step. Once the tree is constructed, a test pattern can be found by simply assigning logic value 1 to the even primary inputs and logic value 0 to the odd primary inputs of the constructed modified Boolean tree. The algorithm was applied to several benchmark circuits. The results showed the following: (1) for combinational circuits, the algorithm generates test patterns 10-15% faster than the FAN algorithm, known as one of the most efficient algorithms to date; (2) for sequential circuits, the algorithm achieves higher fault coverage than the nine-valued algorithm.

  16. LHC-GCS a model-driven approach for automatic PLC and SCADA code generation

    CERN Document Server

    Thomas, Geraldine; Barillère, Renaud; Cabaret, Sebastien; Kulman, Nikolay; Pons, Xavier; Rochez, Jacques

    2005-01-01

    The LHC experiments’ Gas Control System (LHC GCS) project [1] aims to provide the four LHC experiments (ALICE, ATLAS, CMS and LHCb) with control for their 23 gas systems. To ease the production and maintenance of 23 control systems, a model-driven approach has been adopted to generate automatically the code for the Programmable Logic Controllers (PLCs) and for the Supervision Control And Data Acquisition (SCADA) systems. The first milestones of the project have been achieved. The LHC GCS framework [4] and the generation tools have been produced. A first control application has actually been generated and is in production, and a second is in preparation. This paper describes the principle and the architecture of the model-driven solution. It will in particular detail how the model-driven solution fits with the LHC GCS framework and with the UNICOS [5] data-driven tools.

  17. Z Specification Automatic Generator

    Institute of Scientific and Technical Information of China (English)

    赵正旭; 温晋杰

    2016-01-01

    The formal Z language uses rigorous mathematical theory to improve the reliability and robustness of software, but that same mathematical foundation means only a few people can understand it well enough to write Z specifications. At present, research on the Z language focuses mainly on theory, and no corresponding tool supports the automatic generation of Z specifications. The Z specification automatic generator presented in this article helps with the compilation of Z specifications and lowers the difficulty and cost of formal development, which is of great significance for the wider adoption of the Z language.

  18. Lightning Protection Performance Assessment of Transmission Line Based on ATP model Automatic Generation

    Directory of Open Access Journals (Sweden)

    Luo Hanwu

    2016-01-01

    Full Text Available This paper presents a novel method for determining the initial lightning breakdown current by effectively combining the ATP and MATLAB simulation tools, with the aim of evaluating the lightning protection performance of transmission lines. First, an executable ATP simulation model is generated automatically from the required information, such as power source parameters, tower parameters, overhead line parameters, grounding resistance and lightning current parameters, through an interface program coded in MATLAB. The data are then extracted from the LIS files produced by executing the ATP simulation model, and the occurrence of transmission line breakdown is determined from the relevant data in the LIS file. The lightning current amplitude is reduced when breakdown occurs, and increased otherwise. Thus the initial lightning breakdown current of a transmission line with given parameters can be determined accurately by continuously adjusting the lightning current amplitude, realized by a loop algorithm coded in MATLAB. The proposed method generates the ATP simulation program automatically and facilitates the lightning protection performance assessment of transmission lines.
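
    The amplitude-search loop can be sketched as follows; `breaks_down` stands in for executing the generated ATP model and inspecting the LIS file, and the bisection refinement is one simple way to realize the reduce-on-breakdown, raise-otherwise rule:

```python
def initial_breakdown_current(breaks_down, i_start=200.0, step=100.0, tol=0.1):
    """Smallest lightning-current amplitude (kA) that causes line
    breakdown.  `breaks_down(amplitude)` is a predicate that, in the
    paper's setup, would run the generated ATP model and check the
    LIS output.  Sweep upward to bracket the threshold, then bisect."""
    lo, hi = 0.0, i_start
    while not breaks_down(hi):        # grow the bracket until breakdown occurs
        lo, hi = hi, hi + step
    while hi - lo > tol:              # bisect: reduce on breakdown, raise otherwise
        mid = 0.5 * (lo + hi)
        if breaks_down(mid):
            hi = mid
        else:
            lo = mid
    return hi

# Stand-in for the ATP run: pretend breakdown occurs above 137.5 kA.
crit = initial_breakdown_current(lambda amp: amp >= 137.5)
```

    Because each predicate evaluation is a full ATP simulation, halving the search interval at every step keeps the number of simulation runs small.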

  19. Application of GA optimization for automatic generation control design in an interconnected power system

    Energy Technology Data Exchange (ETDEWEB)

    Golpira, H., E-mail: hemin.golpira@uok.ac.i [Department of Electrical and Computer Engineering, University of Kurdistan, Sanandaj, PO Box 416, Kurdistan (Iran, Islamic Republic of); Bevrani, H. [Department of Electrical and Computer Engineering, University of Kurdistan, Sanandaj, PO Box 416, Kurdistan (Iran, Islamic Republic of); Golpira, H. [Department of Industrial Engineering, Islamic Azad University, Sanandaj Branch, PO Box 618, Kurdistan (Iran, Islamic Republic of)

    2011-05-15

    Highlights: {yields} A realistic model for automatic generation control (AGC) design is proposed. {yields} The model considers GRC, speed governor dead band, filters and time delay. {yields} The model provides an accurate basis for digital simulations. -- Abstract: This paper addresses a realistic model for automatic generation control (AGC) design in an interconnected power system. The proposed scheme considers the generation rate constraint (GRC), dead band, and time delay imposed on the power system by the governor-turbine, filters, thermodynamic process, and communication channels. The simplicity of structure and acceptable response of the well-known integral controller make it attractive for the power system AGC design problem. A genetic algorithm (GA) is used to compute the decentralized control parameters to achieve an optimum operating point. A 3-control-area power system is considered as a test system, and the closed-loop performance is examined in the presence of various constraint scenarios. It is shown that neglecting the above physical constraints, simultaneously or in part, leads to impractical and invalid results and may affect system security, reliability and integrity. Taking into account the advantages of the GA, besides considering a more complete dynamic model, provides a flexible and more realistic AGC system in comparison with existing conventional schemes.
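
    A minimal real-coded GA of the kind used to tune decentralized integral gains can be sketched as follows; the quadratic cost is a toy surrogate for the actual AGC performance index, and all names are illustrative:

    ```python
    import random

    def ga_minimize(cost, bounds, pop_size=30, gens=60, seed=1):
        """Minimal real-coded GA: tournament selection, blend crossover,
        Gaussian mutation. A sketch of the tuning loop, not the paper's code."""
        rng = random.Random(seed)

        def tournament(P):
            return min(rng.sample(P, 3), key=cost)

        P = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
        best = min(P, key=cost)
        for _ in range(gens):
            Q = []
            for _ in range(pop_size):
                a, b = tournament(P), tournament(P)
                # Blend crossover plus small Gaussian mutation, clipped to bounds.
                child = [
                    min(max(x + rng.uniform(-0.5, 1.5) * (y - x)
                            + rng.gauss(0.0, 0.02), lo), hi)
                    for x, y, (lo, hi) in zip(a, b, bounds)
                ]
                Q.append(child)
            P = Q
            best = min(P + [best], key=cost)   # keep the best-so-far (elitism)
        return best

    # Toy surrogate cost with a known optimum at Ki = 0.3 for each of three areas.
    cost = lambda k: sum((ki - 0.3) ** 2 for ki in k)
    gains = ga_minimize(cost, [(0.0, 1.0)] * 3)
    ```

    In the paper's setting, `cost` would instead run the constrained AGC simulation and return a time-domain performance index.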

  20. An Automatic Code Generator Expert System Using Proprietary Language for Wider Business Application

    Directory of Open Access Journals (Sweden)

    Aurangzeb Khan

    2014-05-01

    Full Text Available The proposed system is an automatic front-end Code Generator Expert System (CGES) for ensuring wider business application through the generation of a GUI with source code for databases. Safekeeping of data for smooth business transactions has always been a matter of concern. With the help of the proposed CGES, a customizable database application may be produced with a simple wizard, with economy of effort and time. The CGES requires a database as a prerequisite input. Once the normalized database is featured with a diagram, the CGES applies techniques according to a predefined algorithm, and the complete application, with source code organized in modules, is produced automatically. By selecting the CGES solutions, an N-tier application gives rise to a product comprising SQL Server queries, object-oriented features and modules. Test results confirm the working principles of the system, with the database written in MS SQL Server and the Visual Basic.NET source code generated by the CGES.

  1. Ontorat: automatic generation of new ontology terms, annotations, and axioms based on ontology design patterns.

    Science.gov (United States)

    Xiang, Zuoshuang; Zheng, Jie; Lin, Yu; He, Yongqun

    2015-01-01

    It is time-consuming to build an ontology with many terms and axioms. Thus it is desirable to automate the process of ontology development. Ontology Design Patterns (ODPs) provide a reusable solution to a recurrent modeling problem in the context of ontology engineering. Because ontology terms often follow specific ODPs, the Ontology for Biomedical Investigations (OBI) developers proposed a Quick Term Templates (QTTs) process targeted at generating new ontology classes following the same pattern, using term templates in a spreadsheet format. Inspired by the ODPs and QTTs, the Ontorat web application was developed to automatically generate new ontology terms, annotations of terms, and logical axioms based on specific ODP(s). The inputs of an Ontorat execution include axiom expression settings, an input data file, ID generation settings, and a target ontology (optional). The axiom expression settings can be saved as a predesigned Ontorat setting-format text file for reuse. The input data file is generated based on a template file created from a specific ODP (text or Excel format). Ontorat is an efficient tool for ontology expansion. Different use cases are described. For example, Ontorat was applied to automatically generate over 1,000 Japan RIKEN cell line cell terms, with both logical axioms and rich annotation axioms, in the Cell Line Ontology (CLO). Approximately 800 licensed animal vaccines were represented and annotated in the Vaccine Ontology (VO) by Ontorat. The OBI team used Ontorat to add assay and device terms required by the ENCODE project. Ontorat was also used to add missing annotations to all existing Biobank-specific terms in the Biobank Ontology. A collection of ODPs and templates with examples is provided on the Ontorat website and can be reused to facilitate ontology development. With ever increasing ontology development and applications, Ontorat provides a timely platform for generating and annotating a large number of ontology terms by following
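
    The template-expansion idea behind QTTs and Ontorat can be sketched in a few lines; the template string and row fields below are illustrative, not Ontorat's actual input format:

    ```python
    def expand_pattern(template, rows):
        """Instantiate an axiom template once per input row, QTT-style.

        `template` uses {column} placeholders; `rows` are dicts read from a
        spreadsheet-like input file. Names here are illustrative only.
        """
        return [template.format(**row) for row in rows]

    # Hypothetical Manchester-syntax-flavoured template for cell line cell terms.
    template = ("Class: {id}  SubClassOf: 'cell line cell' "
                "and (derives_from some '{species}')")
    rows = [{"id": "CLO:0000031", "species": "Homo sapiens"},
            {"id": "CLO:0000032", "species": "Mus musculus"}]
    axioms = expand_pattern(template, rows)
    ```

    Each spreadsheet row becomes one fully formed class definition, which is how a pattern plus a data file can yield a thousand terms at once.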

  2. GLAST and Ground-Based Gamma-Ray Astronomy

    Science.gov (United States)

    McEnery, Julie

    2008-01-01

    The launch of the Gamma-ray Large Area Space Telescope together with the advent of a new generation of ground-based gamma-ray detectors such as VERITAS, HESS, MAGIC and CANGAROO, will usher in a new era of high-energy gamma-ray astrophysics. GLAST and the ground based gamma-ray observatories will provide highly complementary capabilities for spectral, temporal and spatial studies of high energy gamma-ray sources. Joint observations will cover a huge energy range, from 20 MeV to over 20 TeV. The LAT will survey the entire sky every three hours, allowing it both to perform uniform, long-term monitoring of variable sources and to detect flaring sources promptly. Both functions complement the high-sensitivity pointed observations provided by ground-based detectors. Finally, the large field of view of GLAST will allow a study of gamma-ray emission on large angular scales and identify interesting regions of the sky for deeper studies at higher energies. In this poster, we will discuss the science returns that might result from joint GLAST/ground-based gamma-ray observations and illustrate them with detailed source simulations.

  4. Automatic Data Extraction from Websites for Generating Aquatic Product Market Information

    Institute of Scientific and Technical Information of China (English)

    YUAN Hong-chun; CHEN Ying; SUN Yue-fu

    2006-01-01

    The massive web-based information resources have led to an increasing demand for effective automatic retrieval of target information for web applications. This paper introduces a web-based data extraction tool that deploys various algorithms to locate, extract and filter tabular data from HTML pages and to transform them into new web-based representations. The tool has been applied in an aquaculture web application platform for extracting and generating aquatic product market information. The results prove that this tool is very effective in extracting the required data from web pages.
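
    A minimal version of the locate-and-extract step for tabular HTML data, using only the Python standard library (the paper's own locating and filtering algorithms are more elaborate):

    ```python
    from html.parser import HTMLParser

    class TableExtractor(HTMLParser):
        """Collect cell text from <table> rows: a bare-bones stand-in for a
        locate/extract/filter pipeline over HTML market-price pages."""

        def __init__(self):
            super().__init__()
            self.rows, self._row, self._in_cell = [], [], False

        def handle_starttag(self, tag, attrs):
            if tag == "tr":
                self._row = []
            elif tag in ("td", "th"):
                self._in_cell = True

        def handle_endtag(self, tag):
            if tag == "tr" and self._row:
                self.rows.append(self._row)
            elif tag in ("td", "th"):
                self._in_cell = False

        def handle_data(self, data):
            if self._in_cell and data.strip():
                self._row.append(data.strip())

    # Hypothetical fragment of a market page.
    html = ("<table><tr><th>Product</th><th>Price</th></tr>"
            "<tr><td>Shrimp</td><td>12.5</td></tr></table>")
    p = TableExtractor()
    p.feed(html)
    ```

    The extracted `p.rows` list can then be re-serialized into whatever target representation the application needs.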

  5. LanHEP - a package for automatic generation of Feynman rules in gauge models

    CERN Document Server

    Semenov, A Yu

    1996-01-01

    We consider the general problem of deriving the Feynman rules for matrix elements in momentum representation from a given Lagrangian in coordinate space that is invariant under the transformations of some gauge group. The LanHEP package presented in this paper allows one to define the gauge model Lagrangian in canonical form in a convenient way and then to generate automatically the Feynman rules, which can be used in subsequent calculation of physical processes by means of the CompHEP package. A detailed description of the LanHEP commands is given, and several examples of LanHEP applications (QED, QCD, the Standard Model in the 't Hooft-Feynman gauge) are presented.

  6. Experiences with the application of the ADIC automatic differentiation tool to the CSCMDO 3-D volume grid generation code

    Energy Technology Data Exchange (ETDEWEB)

    Bischof, C.H.; Mauer, A. [Argonne National Lab., IL (United States). Mathematics and Computer Science Division]; Jones, W.T. [Computer Sciences Corp., Hampton, VA (United States)] [and others]

    1995-12-31

    Automatic differentiation (AD) is a methodology for developing reliable sensitivity-enhanced versions of arbitrary computer programs with little human effort. It can vastly accelerate the use of advanced simulation codes in multidisciplinary design optimization, since the time for generating and verifying derivative codes is greatly reduced. In this paper, we report on the application of the recently developed ADIC automatic differentiation tool for ANSI C programs to the CSCMDO multiblock three-dimensional volume grid generator. The ADIC-generated code can easily be interfaced with Fortran derivative codes generated with the ADIFOR AD tool for FORTRAN 77 programs, thus providing efficient sensitivity-enhancement techniques for multilanguage, multidiscipline problems.
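
    ADIC and ADIFOR work by source transformation, but the derivative-propagation rules their generated code implements can be illustrated with a small forward-mode AD sketch based on dual numbers (an operator-overloading stand-in, not how those tools are built):

    ```python
    class Dual:
        """Forward-mode AD value: `val` carries the function value, `dot` its
        derivative. Each overloaded operation applies the chain rule."""

        def __init__(self, val, dot=0.0):
            self.val, self.dot = val, dot

        def __add__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            return Dual(self.val + other.val, self.dot + other.dot)
        __radd__ = __add__

        def __mul__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            # Product rule: (uv)' = u'v + uv'
            return Dual(self.val * other.val,
                        self.dot * other.val + self.val * other.dot)
        __rmul__ = __mul__

    def f(x):                 # f(x) = 3x^2 + 2x, so f'(x) = 6x + 2
        return 3 * x * x + 2 * x

    y = f(Dual(2.0, 1.0))     # seed dx/dx = 1 to get df/dx alongside f
    ```

    The appeal of AD tools is that this derivative bookkeeping is attached to an existing code mechanically, with no hand-derived formulas.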

  7. Automatic Seamline Network Generation for Urban Orthophoto Mosaicking with the Use of a Digital Surface Model

    Directory of Open Access Journals (Sweden)

    Qi Chen

    2014-12-01

    Full Text Available Intelligent seamline selection for image mosaicking is an area of active research in the fields of massive data processing, computer vision, photogrammetry and remote sensing. In mosaicking applications for digital orthophoto maps (DOMs), the visual transition in mosaics is mainly caused by differences in positioning accuracy, image tone and relief displacement of high ground objects between overlapping DOMs. Among these three factors, relief displacement, which prevents the seamless mosaicking of images, is relatively more difficult to address. To minimize visual discontinuities, many optimization algorithms have been studied for the automatic selection of seamlines that avoid high ground objects. Thus, a new automatic seamline selection algorithm using a digital surface model (DSM) is proposed. The main idea of this algorithm is to guide a seamline toward a low area on the basis of the elevation information in a DSM. Given that the elevation of a DSM is not completely synchronous with a DOM, a new model, called the orthoimage elevation synchronous model (OESM), is derived and introduced. The OESM can accurately reflect the elevation information for each DOM unit. Through morphological processing of the OESM data in the overlapping area, an initial path network is obtained for seamline selection. Subsequently, a cost function is defined on the basis of several measurements, and Dijkstra's algorithm is adopted to determine the least-cost path from the initial network. Finally, the proposed algorithm is employed for automatic seamline network construction; the effective mosaic polygon of each image is determined, and a seamless mosaic is generated. Experiments with three different datasets indicate that the proposed method meets the requirements for seamline network construction. In comparative trials, the generated seamlines pass through fewer ground objects with low time consumption.
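
    The least-cost path step can be sketched with a plain Dijkstra search over a cost grid, where the per-cell cost stands in for the OESM-derived elevation penalty that steers the seamline through low ground:

    ```python
    import heapq

    def least_cost_path(cost, start, goal):
        """Dijkstra over a 4-connected grid of cell costs; returns the
        cheapest path from start to goal as a list of (row, col) cells."""
        rows, cols = len(cost), len(cost[0])
        dist = {start: cost[start[0]][start[1]]}
        prev, pq = {}, [(dist[start], start)]
        while pq:
            d, (r, c) = heapq.heappop(pq)
            if (r, c) == goal:
                break
            if d > dist[(r, c)]:
                continue                      # stale queue entry
            for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                if 0 <= nr < rows and 0 <= nc < cols:
                    nd = d + cost[nr][nc]
                    if nd < dist.get((nr, nc), float("inf")):
                        dist[(nr, nc)] = nd
                        prev[(nr, nc)] = (r, c)
                        heapq.heappush(pq, (nd, (nr, nc)))
        path, node = [goal], goal
        while node != start:
            node = prev[node]
            path.append(node)
        return path[::-1]

    grid = [[1, 9, 1],
            [1, 9, 1],
            [1, 1, 1]]       # high values ~ tall buildings the seamline avoids
    path = least_cost_path(grid, (0, 0), (0, 2))
    ```

    With a realistic cost function, the same search naturally detours around high ground objects exactly as the paper describes.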

  8. Automatic Generation of the Axial Lines of Urban Environments to Capture What We Perceive

    CERN Document Server

    Jiang, Bin

    2008-01-01

    Based on the concepts of isovists and medial axes, we developed a set of algorithms that can automatically generate the axial lines representing the individual linearly stretched parts of the open space of an urban environment. Open space is the space between buildings, where people can freely move around. The generation of the axial lines has been a key aspect of space syntax research, which conventionally relies on hand-drawn axial lines of an urban environment, often called the axial map, for urban morphological analysis. Although various attempts have been made towards an automatic solution, few of them can produce an axial map identical to the hand-drawn one, and none of them really works for different urban environments. Our algorithms are shown to provide a better solution than existing ones. Throughout this paper, we also argue and demonstrate that the axial lines constitute a true skeleton, superior to the medial axes, in capturing what we perceive about the urban environment. Keywords: Visib...

  9. Automatic Generation Control in Multi Area Interconnected Power System by using HVDC Links

    Directory of Open Access Journals (Sweden)

    Yogendra Arya

    2012-01-01

    Full Text Available This paper investigates the effects of an HVDC link in parallel with an HVAC link on the automatic generation control (AGC) problem for a multi-area power system, taking into consideration system parameter variations. A fuzzy logic controller is proposed for a four-area power system interconnected via a parallel HVAC/HVDC transmission link, which is also referred to as an asynchronous tie-line. The linear model of the HVAC/HVDC link is developed and the system responses to a sudden load change are studied. The simulation studies are carried out for a four-area interconnected thermal power system. A suitable solution for the automatic generation control problem of the four-area electrical power system is obtained by improving the dynamic performance of the power system under study. Robustness of the controller is also checked by varying parameters. Simulation results indicate that the scheme works well. The dynamic analyses have been done with and without the HVDC link using a fuzzy logic controller in Matlab-Simulink. Further, a comparison between the two is presented, and it is shown that the performance of the proposed scheme is superior in terms of overshoot and settling time.

  10. The application of ANN technique to automatic generation control for multi-area power system

    Energy Technology Data Exchange (ETDEWEB)

    Zeynelgil, H.L.; Demiroren, A.; Sengor, N.S. [Istanbul Technical Univ., Maslak (Turkey). Electrical and Electronic Faculty

    2002-06-01

    This paper presents an application of a layered artificial neural network (ANN) controller to the automatic generation control (AGC) problem in a four-area interconnected power system in which three areas include steam turbines and the fourth includes a hydro turbine. Each steam-turbine area contains the reheat-effect non-linearity of the steam turbine, and the hydro-turbine area contains upper and lower constraints on the generation rate. Only one ANN controller, which controls the inputs of all areas in the power system together, is considered. In the study, the back-propagation-through-time algorithm is used as the ANN learning rule. Comparison of the results for both cases shows that the performance of the ANN controller is better than that of conventional controllers. (author)

  11. Semi-automatic identification photo generation with facial pose and illumination normalization

    Science.gov (United States)

    Jiang, Bo; Liu, Sijiang; Wu, Song

    2016-07-01

    Identification photo is a category of facial image that has strict requirements on image quality such as size, illumination, user expression, dressing, etc. Traditionally, these photos are taken in professional studios. With the rapid popularity of mobile devices, how to conveniently take identification photos at any time and anywhere with such devices is an interesting problem. In this paper, we propose a novel semi-automatic identification photo generation approach. Given a user image, facial pose and expression are first normalized to meet the basic requirements. To correct uneven lighting conditions in the photo, a facial illumination normalization approach is adopted to further improve the image quality. Finally, the foreground user is extracted and re-targeted to a specific photo size. Besides, the background can also be changed as required. Preliminary experimental results show that the proposed method is efficient and effective in identification photo generation compared to manual tuning based on commercial software.

  12. Automatic code generation in SPARK: Applications of computer algebra and compiler-compilers

    Energy Technology Data Exchange (ETDEWEB)

    Nataf, J.M.; Winkelmann, F.

    1992-09-01

    We show how computer algebra and compiler-compilers are used for automatic code generation in the Simulation Problem Analysis and Research Kernel (SPARK), an object oriented environment for modeling complex physical systems that can be described by differential-algebraic equations. After a brief overview of SPARK, we describe the use of computer algebra in SPARK's symbolic interface, which generates solution code for equations that are entered in symbolic form. We also describe how the Lex/Yacc compiler-compiler is used to achieve important extensions to the SPARK simulation language, including parametrized macro objects and steady-state resetting of a dynamic simulation. The application of these methods to solving the partial differential equations for two-dimensional heat flow is illustrated.
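
    The idea of generating solution code from an equation entered in symbolic form can be illustrated with a deliberately tiny sketch. SPARK's actual symbolic interface uses a computer-algebra system; the crude textual substitution below, and all names in it, are illustrative only:

    ```python
    def generate_solver(equation, unknown):
        """Toy symbolic interface: given a residual 'lhs - rhs' that is linear
        in `unknown`, emit and compile Python source that solves for it.

        The string replacement only works because no other variable name here
        contains the unknown's name; a real system manipulates expression trees.
        """
        src = (
            f"def solve({', '.join(equation['inputs'])}):\n"
            f"    c0 = {equation['residual'].replace(unknown, '0')}\n"
            f"    c1 = ({equation['residual'].replace(unknown, '1')}) - c0\n"
            f"    return -c0 / c1\n"
        )
        ns = {}
        exec(src, ns)              # compile the generated solution code
        return ns["solve"], src

    # Heat-balance-style toy equation: q - U*(T - amb) = 0, solved for T.
    solve_T, code = generate_solver(
        {"residual": "q - U*(T - amb)", "inputs": ["q", "U", "amb"]}, "T")
    ```

    Evaluating the residual at `unknown = 0` and `unknown = 1` recovers the two coefficients of a linear equation, which is enough to emit a closed-form solver.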

  14. Automatic generation control with thyristor controlled series compensator including superconducting magnetic energy storage units

    Directory of Open Access Journals (Sweden)

    Saroj Padhan

    2014-09-01

    Full Text Available In the present work, an attempt has been made to understand the dynamic performance of automatic generation control (AGC) of a multi-area, multi-unit thermal–thermal power system with the consideration of a reheat turbine, generation rate constraint (GRC) and time delay. Initially, the gains of the fuzzy PID controller are optimized using the Differential Evolution (DE) algorithm. The superiority of DE is demonstrated by comparing the results with the Genetic Algorithm (GA). After that, the performance of a Thyristor Controlled Series Compensator (TCSC) is investigated. Further, a TCSC is placed in the tie-line and Superconducting Magnetic Energy Storage (SMES) units are considered in both areas. Finally, sensitivity analysis is performed by varying the system parameters and operating load conditions from their nominal values. It is observed that the optimum gains of the proposed controller need not be reset even if the system is subjected to wide variations in loading conditions and system parameters.
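
    A minimal DE/rand/1/bin optimizer of the kind used to tune the fuzzy PID gains can be sketched as follows; the cost function is a toy surrogate with a known optimum, not the thermal power-system model:

    ```python
    import random

    def differential_evolution(cost, bounds, np_=20, gens=80, F=0.6, CR=0.9, seed=3):
        """Minimal DE/rand/1/bin: mutate with a scaled difference vector,
        binomial crossover, greedy selection. A sketch of the tuning loop."""
        rng = random.Random(seed)
        dim = len(bounds)
        pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(np_)]
        fit = [cost(x) for x in pop]
        for _ in range(gens):
            for i in range(np_):
                a, b, c = rng.sample([x for j, x in enumerate(pop) if j != i], 3)
                jrand = rng.randrange(dim)     # guarantee one mutated component
                trial = [
                    min(max(a[j] + F * (b[j] - c[j]), bounds[j][0]), bounds[j][1])
                    if (rng.random() < CR or j == jrand) else pop[i][j]
                    for j in range(dim)
                ]
                ft = cost(trial)
                if ft <= fit[i]:               # greedy one-to-one replacement
                    pop[i], fit[i] = trial, ft
        best = min(range(np_), key=lambda i: fit[i])
        return pop[best]

    target = (2.0, 0.5, 0.1)   # pretend optimal Kp, Ki, Kd for illustration
    cost = lambda k: sum((ki - t) ** 2 for ki, t in zip(k, target))
    kp, ki, kd = differential_evolution(cost, [(0, 5), (0, 2), (0, 1)])
    ```

    In the paper's study, `cost` would run the AGC simulation and return an error-integral criterion over the load-disturbance response.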

  15. Automatic generation of Chinese character using features fusion from calligraphy and font

    Science.gov (United States)

    Shi, Cao; Xiao, Jianguo; Xu, Canhui; Jia, Wenhua

    2014-02-01

    A spatial-statistics-based contour feature representation is proposed to extract local contour features from Chinese calligraphy characters, and a features fusion strategy is designed to automatically generate new hybrid characters, making good use of the contour features of calligraphy and the structural features of fonts. The features fusion strategy employs dilation and erosion operations iteratively to inject the extracted contour features from Chinese calligraphy into the font, similar to "pad" and "cut" operations in a sculpting process. Experimental results demonstrate that the generated hybrid characters hold both the contour features of calligraphy and the structural features of the font. In particular, two Chinese calligraphy techniques called "Fei Bai" and "Zhang Mo" are imitated in the hybrid characters. "Fei Bai" describes the phenomenon in which part of a stroke fades out due to the fast movement of the hair brush or a lack of ink, and "Zhang Mo" describes the condition in which the hair brush holds so much ink that strokes overlap.

  16. Automatic verification of SSD and generation of respiratory signal with lasers in radiotherapy: a preliminary study.

    Science.gov (United States)

    Prabhakar, Ramachandran

    2012-01-01

    Source to surface distance (SSD) plays a very important role in external beam radiotherapy treatment verification. In this study, a simple technique has been developed to verify the SSD automatically with lasers. The study also suggests a methodology for determining the respiratory signal with lasers. Two lasers, red and green, are mounted on the collimator head of a Clinac 2300 C/D linac along with a camera to determine the SSD. Software (SSDLas) was developed to estimate the SSD automatically from the images captured by a 12-megapixel camera. To determine the SSD to a patient surface, the external body contour of the central-axis transverse computed tomography (CT) slice is imported into the software. Another important aspect of radiotherapy is the generation of the respiratory signal. The changes in the laser separation as the patient breathes are converted to produce a respiratory signal. Multiple frames of laser images were acquired from the camera mounted on the collimator head, and each frame was analyzed with SSDLas to generate the respiratory signal. The SSD as observed with the ODI on the machine and the SSD measured by the SSDLas software were found to be within the tolerance limit. The methodology described for generating respiratory signals will be useful for the treatment of mobile tumors in the lung, liver, breast, pancreas, etc. The technique described for determining the SSD and generating respiratory signals using lasers is cost-effective and simple to implement. Copyright © 2011 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
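
    The laser-based respiratory signal amounts to tracking the spot separation frame by frame. A sketch with synthetic frame data follows; the pixel positions are invented and the camera/geometry calibration that maps pixels to distance is omitted:

    ```python
    import math

    def respiratory_signal(frames):
        """Turn per-frame (red, green) laser spot x-positions into a zero-mean
        breathing trace: the separation changes as the chest wall rises and
        falls. Positions are in pixels; calibration is assumed done elsewhere."""
        sep = [abs(g - r) for r, g in frames]
        base = sum(sep) / len(sep)          # baseline separation
        return [s - base for s in sep]      # excursion about the baseline

    # Synthetic frames: separation oscillates around 100 px with a 20-frame
    # breathing period and 5 px amplitude.
    frames = [(10.0, 10.0 + 100.0 + 5.0 * math.sin(2 * math.pi * t / 20.0))
              for t in range(40)]
    trace = respiratory_signal(frames)
    ```

    Peak detection or phase binning on such a trace is then straightforward, which is what makes the approach attractive for gated treatment of mobile tumors.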

  17. Program Code Generator for Cardiac Electrophysiology Simulation with Automatic PDE Boundary Condition Handling.

    Science.gov (United States)

    Punzalan, Florencio Rusty; Kunieda, Yoshitoshi; Amano, Akira

    2015-01-01

    Clinical and experimental studies involving human hearts can have certain limitations. Methods such as computer simulations can be an important alternative or supplemental tool. Physiological simulation at the tissue or organ level typically involves the handling of partial differential equations (PDEs). Boundary conditions and distributed parameters, such as those used in pharmacokinetics simulation, add to the complexity of the PDE solution. These factors can tailor PDE solutions and their corresponding program code to specific problems. Boundary condition and parameter changes in the customized code are usually prone to errors and time-consuming. We propose a general approach for handling PDEs and boundary conditions in computational models using a replacement scheme for discretization. This study is an extension of a program generator that we introduced in a previous publication. The program generator can generate code for multi-cell simulations of cardiac electrophysiology. Improvements to the system allow it to handle simultaneous equations in the biological function model as well as implicit PDE numerical schemes. The replacement scheme involves substituting all partial differential terms with numerical solution equations. Once the model and boundary equations are discretized with the numerical solution scheme, instances of the equations are generated to undergo dependency analysis. The result of the dependency analysis is then used to generate the program code. The resulting program code is in the Java or C programming language. To validate the automatic handling of boundary conditions in the program code generator, we generated simulation code using the FHN, Luo-Rudy 1, and Hund-Rudy cell models and ran cell-to-cell coupling and action potential propagation simulations. One of the simulations is based on a published experiment, and the simulation results are compared with the experimental data. We conclude that the proposed program code generator can be used to
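
    The replacement scheme, substituting each partial differential term with its numerical solution equation before code generation, can be illustrated in miniature. The equation string and stencil below are illustrative stand-ins, not the generator's actual intermediate representation:

    ```python
    def discretize(equation, var="V", dx="dx"):
        """Replace the second spatial derivative term in a model-equation
        string with its central finite-difference stencil, the first step
        before dependency analysis and code emission."""
        stencil = f"(({var}[i+1] - 2*{var}[i] + {var}[i-1]) / ({dx}*{dx}))"
        return equation.replace(f"d2{var}/dx2", stencil)

    # Cable-equation-style right-hand side (illustrative, not a full cell model):
    model = "dV/dt = D * d2V/dx2 - I_ion"
    discretized = discretize(model)
    ```

    After every differential term has been replaced this way, each equation instance refers only to indexed array values, so dependency analysis can order them and emit straight-line Java or C code.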

  19. Automatic Control Systems (ACS for Generation and Sale of Electric Power Under Conditions of Industry-Sector Liberalization

    Directory of Open Access Journals (Sweden)

    Yu. Petrusha

    2013-01-01

    Full Text Available Possible risks pertaining to the transition of the electric-power industry to market relations are considered in the paper. The paper presents an integrated ACS for the generation and sale of electric power as an improvement in the methodology of organizational and technical management. The system is based on integration of the operating Automatic Dispatch Control System (ADCS) and the developing Automatic Electricity Meter Reading System (AEMRS). The paper proposes to form an inter-branch sector of ACS PLC (Automatic Control System for Prolongation of Life Cycle) users, which is oriented toward provision of a development strategy.

  20. Automatic evaluation and data generation for analytical chemistry instrumental analysis exercises

    Directory of Open Access Journals (Sweden)

    Arsenio Muñoz de la Peña

    2014-01-01

    Full Text Available In general, laboratory activities are costly in terms of time, space, and money. As such, the ability to provide realistically simulated laboratory data that enables students to practice data analysis techniques as a complementary activity would be expected to reduce these costs while opening up very interesting possibilities. In the present work, a novel methodology is presented for the design of analytical chemistry instrumental analysis exercises that can be automatically personalized for each student and evaluated immediately. The proposed system provides each student with a different set of experimental data, generated randomly while satisfying a set of constraints, rather than using data obtained from actual laboratory work. This allows the instructor to provide students with a set of practical problems to complement their regular laboratory work, along with the corresponding feedback provided by the system's automatic evaluation process. To this end, the Goodle Grading Management System (GMS), an innovative web-based educational tool for automating the collection and assessment of practical exercises for engineering and scientific courses, was developed. The proposed methodology takes full advantage of the Goodle GMS fusion code architecture. The design of a particular exercise is provided ad hoc by the instructor and requires basic Matlab knowledge. The system has been employed with satisfactory results in several university courses. To demonstrate the automatic evaluation process, three exercises are presented in detail. The first exercise involves a linear regression analysis of data and the calculation of the quality parameters of an instrumental analysis method. The second and third exercises address two different comparison tests, a comparison test of the mean and a paired t-test.
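
    The generate-then-grade cycle can be sketched as follows; the exercise (a seeded calibration line with noise) and the grading tolerance are illustrative, not the Goodle GMS interface:

    ```python
    import random

    def make_exercise(seed):
        """Per-student data generation: a random calibration line plus small
        noise, constrained to a sensible slope/intercept range."""
        rng = random.Random(seed)
        slope = rng.uniform(0.8, 1.2)
        intercept = rng.uniform(0.0, 0.1)
        xs = [c * 2.0 for c in range(6)]          # analyte concentrations
        ys = [slope * x + intercept + rng.gauss(0, 0.01) for x in xs]
        return xs, ys, slope

    def grade(xs, ys, answer_slope, tol=0.05):
        """Automatic evaluation: compare the least-squares slope of the
        student's dataset against the submitted answer."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        fitted = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
                  / sum((x - mx) ** 2 for x in xs))
        return abs(fitted - answer_slope) <= tol

    xs, ys, true_slope = make_exercise(seed=42)   # seed = student identifier
    ok = grade(xs, ys, true_slope)
    ```

    Seeding the generator with a student identifier is what makes each dataset different yet reproducible, so the grader can recompute the expected answer on demand.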

  1. Evaluation of Semi-Automatic Metadata Generation Tools: A Survey of the Current State of the Art

    Directory of Open Access Journals (Sweden)

    Jung-ran Park

    2015-09-01

    Full Text Available Assessment of the current landscape of semi-automatic metadata generation tools is particularly important considering the rapid development of digital repositories and the recent explosion of big data. Utilization of (semi)automatic metadata generation is critical in addressing these environmental changes and may be unavoidable in the future, considering the costly and complex operation of manual metadata creation. To address such needs, this study examines the range of semi-automatic metadata generation tools (n=39) while providing an analysis of their techniques, features, and functions. The study focuses on open-source tools that can be readily utilized in libraries and other memory institutions. The challenges and current barriers to implementation of these tools were identified. The greatest area of difficulty lies in the fact that the piecemeal development of most semi-automatic generation tools only addresses part of the issue of semi-automatic metadata generation, providing solutions to one or a few metadata elements but not the full range of elements. This indicates that significant local efforts will be required to integrate the various tools into a coherent working whole. Suggestions toward such efforts are presented for future developments that may assist information professionals with the incorporation of semi-automatic tools within their daily workflows.

  2. Automatic generation of 3D motifs for classification of protein binding sites

    Directory of Open Access Journals (Sweden)

    Herzyk Pawel

    2007-08-01

    Full Text Available Abstract Background Since many of the new protein structures delivered by high-throughput processes do not have any known function, there is a need for structure-based prediction of protein function. Protein 3D structures can be clustered according to their fold or secondary structures to produce classes of some functional significance. A recent alternative has been to detect specific 3D motifs, which are often associated with active sites. Unfortunately, there are very few known 3D motifs, which are usually the result of a manual process, compared to the number of sequential motifs already known. In this paper, we report a method to automatically generate 3D motifs of protein structure binding sites based on consensus atom positions and evaluate it on a set of adenine-based ligands. Results Our new approach was validated by automatically generating 3D patterns for the main adenine-based ligands, i.e. AMP, ADP and ATP. Out of the 18 detected patterns, only one, the ADP4 pattern, is not associated with well-defined structural patterns. Moreover, most of the patterns could be classified as binding site 3D motifs. Literature research revealed that the ADP4 pattern actually corresponds to structural features which show complex evolutionary links between ligases and transferases. Therefore, all of the generated patterns prove to be meaningful. Each pattern was used to query all PDB proteins which bind either purine-based or guanine-based ligands, in order to evaluate the classification and annotation properties of the pattern. Overall, our 3D patterns matched 31% of proteins with adenine-based ligands and 95.5% of them were classified correctly. Conclusion A new metric has been introduced allowing the classification of proteins according to the similarity of atomic environment of binding sites, and a methodology has been developed to automatically produce 3D patterns from that classification.
A study of proteins binding adenine based ligands showed that
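
    The consensus-atom idea at the heart of such methods can be sketched as follows. The tolerance, data layout and matching rule here are illustrative assumptions, not the authors' actual algorithm: given several binding sites already superposed in a common frame, a position joins the consensus only if every site has an atom nearby, and the consensus point is the mean of the matched atoms.

    ```python
    def consensus_positions(sites, tol=1.0):
        """sites: list of lists of (x, y, z) atom coordinates, already superposed.
        An atom of the first site is kept in the consensus if every other site
        has an atom within `tol` of it; the consensus point is the mean position."""
        reference = sites[0]
        consensus = []
        for ax, ay, az in reference:
            matches = [(ax, ay, az)]
            for other in sites[1:]:
                near = [(x, y, z) for x, y, z in other
                        if (x - ax) ** 2 + (y - ay) ** 2 + (z - az) ** 2 <= tol ** 2]
                if not near:
                    break  # some site has no matching atom: drop this position
                matches.append(min(near, key=lambda p: (p[0] - ax) ** 2
                                   + (p[1] - ay) ** 2 + (p[2] - az) ** 2))
            else:
                n = len(matches)
                consensus.append(tuple(sum(c[d] for c in matches) / n
                                       for d in range(3)))
        return consensus
    ```

    The resulting consensus positions form the candidate 3D pattern that can then be matched against other binding sites.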

  3. Differential evolution algorithm based automatic generation control for interconnected power systems with

    Directory of Open Access Journals (Sweden)

    Banaja Mohanty

    2014-09-01

    Full Text Available This paper presents the design and performance analysis of Differential Evolution (DE) algorithm based Proportional–Integral (PI) and Proportional–Integral–Derivative (PID) controllers for Automatic Generation Control (AGC) of an interconnected power system. Initially, a two-area thermal system with governor dead-band nonlinearity is considered for the design and analysis purpose. In the proposed approach, the design problem is formulated as an optimization problem and DE is employed to search for optimal controller parameters. Three different objective functions are used for the design purpose. The superiority of the proposed approach has been shown by comparing the results with a recently published Craziness based Particle Swarm Optimization (CPSO) technique for the same interconnected power system. It is noticed that the dynamic performance of the DE optimized PI controller is better than that of the CPSO optimized PI controller. Additionally, controller parameters are tuned at different loading conditions so that an adaptive gain scheduling control strategy can be employed. The study is further extended to a more realistic network of a two-area six-unit system with different power generating units such as thermal, hydro, wind and diesel units, considering boiler dynamics for thermal plants, Generation Rate Constraint (GRC) and Governor Dead Band (GDB) nonlinearity.

  4. Automatic Generation of the Planing Tunnel High Speed Craft Hull Form

    Institute of Scientific and Technical Information of China (English)

    Morteza Ghassabzadeh; Hassan Ghassemi

    2012-01-01

    The creation of a geometric model of a ship to determine its hydrostatic and hydrodynamic characteristics, and also for structural design and equipment arrangement, is very important in the ship design process. The planing tunnel high-speed craft is one of the craft for which achieving top speed is most important. Through the use of a tunnel, these craft have aero-hydrodynamic properties that diminish resistance, give good sea-keeping behavior, reduce slamming and avoid porpoising. Because of the existence of the tunnel, the hull form generation of these craft is more complex and difficult. In this paper, we attempt to provide a method, based on geometry creation guidelines and with entry of the fewest control and hull form adjustment parameters, to automatically generate the hull form of a planing tunnel craft. First, the equations of the mathematical model are described; subsequently, three different models generated by the present method are compared and analyzed. The generated model clearly has most application in the early stages of design.

  5. Automatic Generation of combination of Values for Functional Testing Using Metaheuristics

    Directory of Open Access Journals (Sweden)

    Arloys Macias Rojas

    2016-11-01

    Full Text Available Several authors agree on the importance of testing as an element of software quality control, and on the impossibility of performing tests exhaustively. This view holds that the number of stages and test values necessary to achieve maximum coverage is too large, which turns test-case design, and in particular the generation of test values, into a combinatorial problem. That is why, faced with the impossibility of covering all the stages, testers often leave interesting values out of the design, values which could uncover inconsistencies with the specified requirements. This work presents a proposal for the automatic generation of values for functional test cases by means of metaheuristic algorithms, maximizing the coverage of the stages. Furthermore, the algorithms implemented for the generation of initial values and for the generation of combinations are detailed. Additionally, a set of good practices for using the component is described, along with a comparison of the obtained results with other existing solutions.
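
    The combinatorial nature of the problem is easy to see in miniature. The article's algorithms are metaheuristic; the sketch below instead uses a simple greedy pairwise-coverage generator, a common baseline (not the authors' implementation), which picks test cases so that every pair of parameter values appears in at least one case without enumerating the full Cartesian product as the suite.

    ```python
    from itertools import combinations, product

    def pairwise_suite(domains):
        """Greedy generation of test-value combinations covering all value pairs.
        domains: list of lists, one list of candidate values per input parameter."""
        uncovered = set()
        for (i, vi), (j, vj) in combinations(enumerate(domains), 2):
            for a in vi:
                for b in vj:
                    uncovered.add((i, a, j, b))
        suite = []
        while uncovered:
            # pick the full combination covering the most still-uncovered pairs
            best = max(product(*domains),
                       key=lambda case: sum((i, case[i], j, case[j]) in uncovered
                                            for i, j in combinations(range(len(case)), 2)))
            suite.append(best)
            for i, j in combinations(range(len(best)), 2):
                uncovered.discard((i, best[i], j, best[j]))
        return suite
    ```

    For three boolean parameters, the exhaustive suite has 8 cases while the pairwise suite needs only 4 to 6; the gap widens rapidly with more parameters and values, which is what motivates metaheuristic search.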

  6. Fresnel zones for ground-based antennas

    DEFF Research Database (Denmark)

    Andersen, J. Bach

    1964-01-01

    The ordinary Fresnel zone concept is modified to include the influence of finite ground conductivity. This is important for ground-based antennas because the influence on the radiation pattern of irregularities near the antenna is determined by the amplitude and phase of the groundwave. A new...
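
    For reference, the ordinary Fresnel zone that this record's modification starts from is defined by a simple radius formula. The sketch below computes the standard free-space expression only; it does not capture the finite-conductivity correction the paper introduces.

    ```python
    from math import sqrt

    def fresnel_radius(n, wavelength, d1, d2):
        """Radius of the n-th ordinary Fresnel zone at a point a distance d1
        from the transmitter and d2 from the receiver (valid for d1, d2 much
        larger than the wavelength)."""
        return sqrt(n * wavelength * d1 * d2 / (d1 + d2))
    ```

    For example, at the midpoint of a 1 km link at a 0.1 m wavelength, the first-zone radius is 5 m, and the zones grow as the square root of n.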

  7. Calibration of Ground-based Lidar instrument

    DEFF Research Database (Denmark)

    Villanueva, Héctor; Gómez Arranz, Paula

    This report presents the result of the lidar calibration performed for the given Ground-based Lidar at DTU’s test site for large wind turbines at Høvsøre, Denmark. Calibration is here understood as the establishment of a relation between the reference wind speed measurements with measurement unce...

  8. Calibration of Ground-based Lidar instrument

    DEFF Research Database (Denmark)

    Villanueva, Héctor; Yordanova, Ginka

    This report presents the result of the lidar calibration performed for the given Ground-based Lidar at DTU’s test site for large wind turbines at Høvsøre, Denmark. Calibration is here understood as the establishment of a relation between the reference wind speed measurements with measurement unce...

  9. A Development Process for Enterprise Information Systems Based on Automatic Generation of the Components

    Directory of Open Access Journals (Sweden)

    Adrian ALEXANDRESCU

    2008-01-01

    Full Text Available This paper contains some ideas concerning Enterprise Information Systems (EIS) development. It combines known elements from the software engineering domain with original elements which the author has conceived and experimented with. The author has followed two major objectives: to use a simple description for the concepts of an EIS, and to achieve a rapid and reliable EIS development process with minimal cost. The first goal was achieved by defining some models which describe the conceptual elements of the EIS domain: entities, events, actions, states and attribute-domains. The second goal is based on a predefined architectural model for the EIS, on predefined analysis and design models for the elements of the domain, and finally on the automatic generation of the system components. The proposed methods do not depend on a particular programming language or database management system; they are general and may be applied to any combination of such technologies.

  10. Design and analysis of differential evolution algorithm based automatic generation control for interconnected power system

    Directory of Open Access Journals (Sweden)

    Umesh Kumar Rout

    2013-09-01

    Full Text Available This paper presents the design and performance analysis of a Differential Evolution (DE) algorithm based Proportional-Integral (PI) controller for Automatic Generation Control (AGC) of an interconnected power system. A two-area non-reheat thermal system equipped with PI controllers, which is widely used in the literature, is considered for the design and analysis purpose. The design problem is formulated as an optimization problem and DE is employed to search for optimal controller parameters. Three different objective functions, using the Integral of Time multiplied Absolute Error (ITAE), the damping ratio of dominant eigenvalues, and the settling time with appropriate weight coefficients, are derived in order to increase the performance of the controller. The superiority of the proposed DE optimized PI controller has been shown by comparing the results with some recently published modern heuristic optimization techniques such as Bacteria Foraging Optimization Algorithm (BFOA) and Genetic Algorithm (GA) based PI controllers for the same interconnected power system.
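
    The optimization loop can be illustrated with a minimal sketch: a toy one-area frequency-deviation model (first-order dynamics with made-up constants, not the paper's two-area non-reheat system) whose PI gains are tuned by a basic DE/rand/1/bin loop minimizing the ITAE criterion.

    ```python
    import random

    def itae(gains, dt=0.02, t_end=10.0):
        """ITAE cost of a PI load-frequency controller on a toy one-area model."""
        kp, ki = gains
        f_dev = integ = cost = t = 0.0
        load_step = 0.1                      # sudden 10% load disturbance
        while t < t_end:
            u = -(kp * f_dev + ki * integ)   # PI control action
            f_dev += (u - load_step - 0.8 * f_dev) / 0.2 * dt  # inertia 0.2, damping 0.8
            integ += f_dev * dt
            t += dt
            cost += t * abs(f_dev) * dt      # Integral of Time-multiplied Absolute Error
        return cost

    def diff_evolution(fn, bounds, pop_size=16, f=0.8, cr=0.9, gens=40, seed=1):
        """Basic DE/rand/1/bin search for the gains minimizing fn."""
        rng = random.Random(seed)
        pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
        for _ in range(gens):
            for i in range(pop_size):
                a, b, c = rng.sample([p for j, p in enumerate(pop) if j != i], 3)
                jr = rng.randrange(len(bounds))
                trial = list(pop[i])
                for d, (lo, hi) in enumerate(bounds):
                    if d == jr or rng.random() < cr:
                        trial[d] = min(hi, max(lo, a[d] + f * (b[d] - c[d])))
                if fn(trial) <= fn(pop[i]):  # greedy one-to-one selection
                    pop[i] = trial
        return min(pop, key=fn)
    ```

    Running `diff_evolution(itae, [(0.0, 10.0), (0.0, 10.0)])` yields gains whose ITAE cost is far below that of the uncontrolled system `itae((0.0, 0.0))`; the real study applies the same scheme to the full two-area model with several weighted objectives.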

  11. A new PID controller design for automatic generation control of hydro power systems

    Energy Technology Data Exchange (ETDEWEB)

    Khodabakhshian, A.; Hooshmand, R. [Electrical Engineering Department, University of Isfahan (Iran)

    2010-06-15

    This paper presents a new robust PID controller for automatic generation control (AGC) of hydro turbine power systems. The method is mainly based on a maximum peak resonance specification that is graphically supported by the Nichols chart. The open-loop frequency response curve is made tangent to a specified ellipse, which makes the method efficient for controlling the overshoot, the stability and the dynamics of the system. Comparative results of this new load frequency controller with a conventional PI controller, and also with another PID controller design, tested on a multimachine power system show a remarkable improvement in system damping. The region of acceptable performance of the new PID controller covers a wide range of operating and system conditions. (author)

  12. Automatic Optimizer Generation Method Based on Location and Context Information to Improve Mobile Services

    Directory of Open Access Journals (Sweden)

    Yunsik Son

    2017-01-01

    Full Text Available Several location-based services (LBSs) have recently been developed for smartphones. Among these are proactive LBSs, which provide services to smartphone users by periodically collecting background logs. However, because they consume considerable battery power, they are not widely used for various LBS-based services. Battery consumption, in particular, is a significant issue on account of the characteristics of mobile systems. This problem imposes a greater service restriction when performing complex operations. Therefore, to successfully enable various services based on location, this problem must be solved. In this paper, we introduce a technique to automatically generate a customized service optimizer for each application, service type, and platform using location and situation information. By using the proposed technique, energy and computing resources can be employed more efficiently for each service. Thus, users should receive more effective LBSs on mobile devices, such as smartphones.

  13. HELAC-Onia: an automatic matrix element generator for heavy quarkonium physics

    CERN Document Server

    Shao, Hua-Sheng

    2013-01-01

    By virtue of the Dyson-Schwinger equations, we upgrade the published code HELAC to be capable of calculating heavy quarkonium helicity amplitudes in the framework of NRQCD factorization, which we dub HELAC-Onia. We rewrote the original HELAC to enable the new program to calculate helicity amplitudes of multi P-wave quarkonium state production at hadron colliders and electron-positron colliders by including new P-wave off-shell currents. Therefore, besides its high efficiency in the computation of multi-leg processes within the Standard Model, HELAC-Onia is also sufficiently numerically stable in dealing with P-wave quarkonia (e.g. $h_{c,b},\chi_{c,b}$) and P-wave color-octet intermediate states. To the best of our knowledge, it is the first general-purpose automatic quarkonium matrix element generator based on recursion relations on the market.

  14. Decentralized automatic generation control of interconnected power systems incorporating asynchronous tie-lines.

    Science.gov (United States)

    Ibraheem; Hasan, Naimul; Hussein, Arkan Ahmed

    2014-01-01

    This paper presents the design of decentralized automatic generation controllers for an interconnected power system using PID, Genetic Algorithm (GA) and Particle Swarm Optimization (PSO). The designed controllers are tested on identical two-area interconnected power systems consisting of thermal power plants. The area interconnections between the two areas are considered as (i) AC tie-line only and (ii) asynchronous tie-line. The dynamic response analysis is carried out for a 1% load perturbation. The performance of the intelligent controllers based on GA and PSO has been compared with the conventional PID controller. The investigations of the system dynamic responses reveal that PSO gives a better dynamic response than the PID and GA controllers for both types of area interconnection.

  15. Automatic Generation of Building Models with Levels of Detail 1-3

    Science.gov (United States)

    Nguatem, W.; Drauschke, M.; Mayer, H.

    2016-06-01

    We present a workflow for the automatic generation of building models with levels of detail (LOD) 1 to 3 according to the CityGML standard (Gröger et al., 2012). We start with orienting unsorted image sets employing (Mayer et al., 2012), we compute depth maps using semi-global matching (SGM) (Hirschmüller, 2008), and fuse these depth maps to reconstruct dense 3D point clouds (Kuhn et al., 2014). Based on planes segmented from these point clouds, we have developed a stochastic method for roof model selection (Nguatem et al., 2013) and window model selection (Nguatem et al., 2014). We demonstrate our workflow up to the export into CityGML.

  16. Automatic Generation of 3D Caricatures Based on Artistic Deformation Styles.

    Science.gov (United States)

    Clarke, Lyndsey; Chen, Min; Mora, Benjamin

    2011-06-01

    Caricatures are a form of humorous visual art, usually created by skilled artists for the intention of amusement and entertainment. In this paper, we present a novel approach for automatic generation of digital caricatures from facial photographs, which capture artistic deformation styles from hand-drawn caricatures. We introduced a pseudo stress-strain model to encode the parameters of an artistic deformation style using "virtual" physical and material properties. We have also developed a software system for performing the caricaturistic deformation in 3D which eliminates the undesirable artifacts in 2D caricaturization. We employed a Multilevel Free-Form Deformation (MFFD) technique to optimize a 3D head model reconstructed from an input facial photograph, and for controlling the caricaturistic deformation. Our results demonstrated the effectiveness and usability of the proposed approach, which allows ordinary users to apply the captured and stored deformation styles to a variety of facial photographs.

  17. Automatic Motion Generation for Robotic Milling Optimizing Stiffness with Sample-Based Planning

    Directory of Open Access Journals (Sweden)

    Julian Ricardo Diaz Posada

    2017-01-01

    Full Text Available Optimal and intuitive robotic machining is still a challenge. One of the main reasons for this is the lack of robot stiffness, which also depends on the robot's positioning in Cartesian space. To make up for this deficiency, and with the aim of increasing robot machining accuracy, this contribution describes a solution approach for optimizing the stiffness over a desired milling path using the free degree of freedom of the machining process. The optimal motion is computed based on the semantic and mathematical interpretation of the manufacturing process, modeled through its components: product, process and resource; and by automatically configuring a sample-based motion problem and the transition-based rapidly-exploring random tree (T-RRT) algorithm for computing an optimal motion. The approach is simulated in CAM software for a machining path, revealing its functionality and outlining future potential for optimal motion generation for robotic machining processes.

  18. Automatic Generation of OpenMP Directives and Its Application to Computational Fluid Dynamics Codes

    Science.gov (United States)

    Yan, Jerry; Jin, Haoqiang; Frumkin, Michael; Yan, Jerry (Technical Monitor)

    2000-01-01

    The shared-memory programming model is a very effective way to achieve parallelism on shared memory parallel computers. As great progress was made in hardware and software technologies, performance of parallel programs with compiler directives has demonstrated large improvement. The introduction of OpenMP directives, the industrial standard for shared-memory programming, has minimized the issue of portability. In this study, we have extended CAPTools, a computer-aided parallelization toolkit, to automatically generate OpenMP-based parallel programs with nominal user assistance. We outline techniques used in the implementation of the tool and discuss the application of this tool on the NAS Parallel Benchmarks and several computational fluid dynamics codes. This work demonstrates the great potential of using the tool to quickly port parallel programs and also achieve good performance that exceeds some of the commercial tools.

  19. A QUANTIFIER-ELIMINATION BASED HEURISTIC FOR AUTOMATICALLY GENERATING INDUCTIVE ASSERTIONS FOR PROGRAMS

    Institute of Scientific and Technical Information of China (English)

    Deepak KAPUR

    2006-01-01

    A method using quantifier-elimination is proposed for automatically generating program invariants/inductive assertions. Given a program, inductive assertions, hypothesized as parameterized formulas in a theory, are associated with program locations. Parameters in inductive assertions are discovered by generating constraints on parameters by ensuring that an inductive assertion is indeed preserved by all execution paths leading to the associated location of the program. The method can be used to discover loop invariants: properties of variables that remain invariant at the entry of a loop. The parameterized formula can be successively refined by considering execution paths one by one; heuristics can be developed for determining the order in which the paths are considered. Initialization of program variables as well as the precondition and postcondition, if available, can also be used to further refine the hypothesized invariant. The method does not depend on the availability of the precondition and postcondition of a program. Constraints on parameters generated in this way are solved for possible values of parameters. If no solution is possible, this means that an invariant of the hypothesized form is not likely to exist for the loop under the assumptions/approximations made to generate the associated verification condition. Otherwise, if the parametric constraints are solvable, then under certain conditions on methods for generating these constraints, the strongest possible invariant of the hypothesized form can be generated from most general solutions of the parametric constraints. The approach is illustrated using the logical languages of conjunction of polynomial equations as well as Presburger arithmetic for expressing assertions.
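
    The parameter-discovery step can be illustrated with a toy stand-in: hypothesize the parameterized invariant x = a*i + b for the loop "i = x = 0; repeat { x += 2; i += 1 }", determine a and b from constraints, and check the candidate. Here the constraints are extracted from concrete execution states and solved as linear equations, a simplification of the symbolic quantifier-elimination the paper performs; all names are illustrative.

    ```python
    def loop_states(n=6):
        """Record states of the loop: i = x = 0; repeat { x += 2; i += 1 }."""
        i = x = 0
        states = [(i, x)]
        for _ in range(n):
            x += 2
            i += 1
            states.append((i, x))
        return states

    def fit_invariant(states):
        """Hypothesize the parameterized invariant x = a*i + b, determine a and b
        from two states, then check the candidate against every recorded state."""
        (i0, x0), (i1, x1) = states[0], states[1]
        a = (x1 - x0) / (i1 - i0)
        b = x0 - a * i0
        if any(abs(x - (a * i + b)) > 1e-9 for i, x in states):
            return None                      # hypothesized form does not hold
        return a, b
    ```

    Here `fit_invariant(loop_states())` returns (2.0, 0.0), i.e. the loop invariant x = 2*i; a non-solvable constraint set would correspond to the "no invariant of this form" outcome described above.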

  20. Automatic generation of endocardial surface meshes with 1-to-1 correspondence from cine-MR images

    Science.gov (United States)

    Su, Yi; Teo, S.-K.; Lim, C. W.; Zhong, L.; Tan, R. S.

    2015-03-01

    In this work, we develop an automatic method to generate a set of 4D 1-to-1 corresponding surface meshes of the left ventricle (LV) endocardial surface which are motion-registered over the whole cardiac cycle. These 4D meshes have 1-to-1 point correspondence over the entire set and are suitable for advanced computational processing, such as shape analysis, motion analysis and finite element modelling. The inputs to the method are the set of 3D LV endocardial surface meshes of the different frames/phases of the cardiac cycle. Each of these meshes is reconstructed independently from border-delineated MR images and they have no correspondence in terms of number of vertices/points and mesh connectivity. To generate point correspondence, the first frame of the LV mesh model is used as a template to be matched to the shape of the meshes in the subsequent phases. There are two stages in the mesh correspondence process: (1) a coarse matching phase, and (2) a fine matching phase. In the coarse matching phase, an initial rough matching between the template and the target is achieved using a radial basis function (RBF) morphing process. The feature points on the template and target meshes are automatically identified using a 16-segment nomenclature of the LV. In the fine matching phase, a progressive mesh projection process is used to conform the rough estimate to fit the exact shape of the target. In addition, an optimization-based smoothing process is used to achieve superior mesh quality and continuous point motion.
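
    The RBF morphing used in the coarse matching phase can be sketched generically: solve for weights so that a smooth displacement field maps each template feature point exactly onto its target counterpart, then warp every mesh vertex through that field. The Gaussian kernel and tiny feature set below are illustrative assumptions; the paper's kernel choice and feature points follow its 16-segment LV nomenclature.

    ```python
    import math

    def gauss_solve(A, b):
        """Solve A x = b by Gaussian elimination with partial pivoting."""
        n = len(A)
        M = [row[:] + [bv] for row, bv in zip(A, b)]
        for k in range(n):
            p = max(range(k, n), key=lambda r: abs(M[r][k]))
            M[k], M[p] = M[p], M[k]
            for r in range(k + 1, n):
                fac = M[r][k] / M[k][k]
                for c in range(k, n + 1):
                    M[r][c] -= fac * M[k][c]
        x = [0.0] * n
        for k in reversed(range(n)):
            x[k] = (M[k][n] - sum(M[k][c] * x[c] for c in range(k + 1, n))) / M[k][k]
        return x

    def rbf_morph(sources, targets):
        """Radial basis function morph: a smooth 3D displacement field that maps
        each source feature point exactly onto its target feature point."""
        kernel = lambda r2: math.exp(-r2)    # Gaussian kernel (illustrative choice)
        n = len(sources)
        A = [[kernel(sum((sources[i][d] - sources[j][d]) ** 2 for d in range(3)))
              for j in range(n)] for i in range(n)]
        w = [gauss_solve(A, [targets[i][d] - sources[i][d] for i in range(n)])
             for d in range(3)]              # one weight vector per coordinate
        def warp(p):
            phi = [kernel(sum((p[d] - s[d]) ** 2 for d in range(3))) for s in sources]
            return tuple(p[d] + sum(wi * ph for wi, ph in zip(w[d], phi))
                         for d in range(3))
        return warp
    ```

    Vertices far from any feature point are displaced only slightly, which is why this stage gives a rough match that the fine projection phase must then refine.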

  1. A generative statistical approach to automatic 3D building roof reconstruction from laser scanning data

    Science.gov (United States)

    Huang, Hai; Brenner, Claus; Sester, Monika

    2013-05-01

    This paper presents a generative statistical approach to automatic 3D building roof reconstruction from airborne laser scanning point clouds. In previous works, bottom-up methods, e.g., points clustering, plane detection, and contour extraction, are widely used. Due to the data artefacts caused by tree clutter, reflection from windows, water features, etc., the bottom-up reconstruction in urban areas may suffer from a number of incomplete or irregular roof parts. Manually given geometric constraints are usually needed to ensure plausible results. In this work we propose an automatic process with emphasis on top-down approaches. The input point cloud is firstly pre-segmented into subzones containing a limited number of buildings to reduce the computational complexity for large urban scenes. For the building extraction and reconstruction in the subzones we propose a pure top-down statistical scheme, in which the bottom-up efforts or additional data like building footprints are no more required. Based on a predefined primitive library we conduct a generative modeling to reconstruct roof models that fit the data. Primitives are assembled into an entire roof with given rules of combination and merging. Overlaps of primitives are allowed in the assembly. The selection of roof primitives, as well as the sampling of their parameters, is driven by a variant of Markov Chain Monte Carlo technique with specified jump mechanism. Experiments are performed on data-sets of different building types (from simple houses, high-rise buildings to combined building groups) and resolutions. The results show robustness despite the data artefacts mentioned above and plausibility in reconstruction.
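
    The Markov Chain Monte Carlo driver can be illustrated in miniature: sample a single primitive parameter (say, a flat-roof height, a made-up stand-in for the paper's primitive library) against noisy observed point heights using random-walk Metropolis with a symmetric jump proposal. The real system samples primitive types and many parameters jointly with specialized jump mechanisms.

    ```python
    import math
    import random

    def metropolis_heights(z_points, noise_sd=0.1, step_sd=0.2, n_iter=4000, seed=2):
        """Random-walk Metropolis sampling of a roof-height parameter h whose
        Gaussian likelihood explains the observed point heights z_points."""
        rng = random.Random(seed)
        log_lik = lambda h: -sum((z - h) ** 2 for z in z_points) / (2 * noise_sd ** 2)
        h, samples = 0.0, []
        for _ in range(n_iter):
            cand = h + rng.gauss(0.0, step_sd)           # symmetric jump proposal
            if math.log(rng.random()) < log_lik(cand) - log_lik(h):
                h = cand                                 # accept the candidate
            samples.append(h)
        return samples
    ```

    After burn-in, the chain concentrates around the height best supported by the data, which is the mechanism that drives primitive selection and parameter sampling in the full reconstruction.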

  2. Tra-la-Lyrics 2.0: Automatic Generation of Song Lyrics on a Semantic Domain

    Science.gov (United States)

    Gonçalo Oliveira, Hugo

    2015-12-01

    Tra-la-Lyrics is a system that generates song lyrics automatically. In its original version, the main focus was to produce text where stresses matched the rhythm of given melodies. There were no concerns on whether the text made sense or if the selected words shared some kind of semantic association. In this article, we describe the development of a new version of Tra-la-Lyrics, where text is generated on a semantic domain, defined by one or more seed words. This effort involved the integration of the original rhythm module of Tra-la-Lyrics in PoeTryMe, a generic platform that generates poetry with semantically coherent sentences. To measure our progress, the rhythm, the rhymes, and the semantic coherence in lyrics produced by the original Tra-la-Lyrics were analysed and compared with lyrics produced by the new instantiation of this system, dubbed Tra-la-Lyrics 2.0. The analysis showed that, in the lyrics by the new system, words have higher semantic association among them and with the given seeds, while the rhythm is still matched and rhymes are present. The previous analysis was complemented with a crowdsourced evaluation, where contributors answered a survey about relevant features of lyrics produced by the previous and the current versions of Tra-la-Lyrics. Though tight, the survey results confirmed the improvements of the lyrics by Tra-la-Lyrics 2.0.

  3. Development of an Immersive Environment to Aid in Automatic Mesh Generation LDRD Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Pavlakos, Constantine J.

    1998-10-01

    The purpose of this work was to explore the use of immersive technologies, such as those used in synthetic environments (commonly referred to as virtual reality, or VR), in enhancing the mesh-generation process for 3-dimensional (3D) engineering models. This work was motivated by the fact that automatic mesh generation systems are still imperfect - meshing algorithms, particularly in 3D, are sometimes unable to construct a mesh to completion, or they may produce anomalies or undesirable complexities in the resulting mesh. It is important that analysts and meshing code developers be able to study their meshes effectively in order to understand the topology and quality of their meshes. We have implemented prototype capabilities that enable such exploration of meshes in a highly visual and intuitive manner. Since many applications are making use of increasingly large meshes, we have also investigated approaches to handle large meshes while maintaining interactive response. Ideally, it would also be possible to interact with the meshing process, allowing interactive feedback which corrects problems and/or somehow enables proper completion of the meshing process. We have implemented some functionality towards this end -- in doing so, we have explored software architectures that support such an interactive meshing process. This work has incorporated existing technologies developed at Sandia National Laboratories, including the CUBIT mesh generation system, and the EIGEN/VR (previously known as MUSE) and FLIGHT systems, which allow applications to make use of immersive technologies and advanced human computer interfaces.

  4. Automatic generation of large ensembles for air quality forecasting using the Polyphemus system

    Directory of Open Access Journals (Sweden)

    D. Garaud

    2009-07-01

    Full Text Available This paper describes a method to automatically generate a large ensemble of air quality simulations. This is achieved using the Polyphemus system, which is flexible enough to build various different models. The system offers a wide range of options in the construction of a model: many physical parameterizations, several numerical schemes and different input data can be combined. In addition, input data can be perturbed. In this paper, some 30 alternatives are available for the generation of a model. For each alternative, the options are given a probability, based on how reliable they are supposed to be. Each model of the ensemble is defined by randomly selecting one option per alternative. In order to decrease the computational load, as many computations as possible are shared by the models of the ensemble. As an example, an ensemble of 101 photochemical models is generated and run for the year 2001 over Europe. The models' performance is quickly reviewed, and the ensemble structure is analyzed. We found a strong diversity in the results of the models and a wide spread of the ensemble. It is noteworthy that many models turn out to be the best model in some regions and some dates.
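
    The ensemble construction amounts to a weighted random draw of one option per alternative. The sketch below illustrates the idea; the alternative names and probabilities are made up (the actual system draws from some 30 Polyphemus alternatives covering parameterizations, numerical schemes and input data).

    ```python
    import random

    # Hypothetical alternatives and option probabilities, standing in for the
    # ~30 alternatives of the Polyphemus system described in the paper.
    ALTERNATIVES = {
        "chemistry":  [("RACM", 0.5), ("RADM2", 0.5)],
        "deposition": [("Zhang", 0.4), ("Wesely", 0.6)],
        "advection":  [("first_order", 0.3), ("third_order", 0.7)],
    }

    def generate_ensemble(n_models, seed=0):
        """Each model picks one option per alternative, weighted by reliability."""
        rng = random.Random(seed)
        ensemble = []
        for _ in range(n_models):
            model = {alt: rng.choices([o for o, _ in opts],
                                      weights=[w for _, w in opts])[0]
                     for alt, opts in ALTERNATIVES.items()}
            ensemble.append(model)
        return ensemble
    ```

    With 30 binary-or-larger alternatives, the space of possible models is astronomically larger than the 101 drawn, which is why sharing computations between ensemble members matters.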

  6. Atlas-Based Automatic Generation of Subject-Specific Finite Element Tongue Meshes.

    Science.gov (United States)

    Bijar, Ahmad; Rohan, Pierre-Yves; Perrier, Pascal; Payan, Yohan

    2016-01-01

    Generation of subject-specific 3D finite element (FE) models requires the processing of numerous medical images in order to precisely extract geometrical information about subject-specific anatomy. This processing remains extremely challenging. To overcome this difficulty, we present an automatic atlas-based method that generates subject-specific FE meshes via a 3D registration guided by Magnetic Resonance images. The method extracts a 3D transformation by registering the atlas' volume image to the subject's one, and establishes a one-to-one correspondence between the two volumes. The 3D transformation field deforms the atlas' mesh to generate the subject-specific FE mesh. To preserve the quality of the subject-specific mesh, a diffeomorphic non-rigid registration based on B-spline free-form deformations is used, which guarantees a non-folding and one-to-one transformation. Two evaluations of the method are provided. First, a publicly available CT database is used to assess the method's capability to accurately capture the complexity of each subject's lung geometry. Second, FE tongue meshes are generated for two healthy volunteers and two patients suffering from tongue cancer using MR images. It is shown that the method generates an appropriate representation of the subject-specific geometry while preserving the quality of the FE meshes for subsequent FE analysis. To demonstrate the importance of our method in a clinical context, a subject-specific mesh is used to simulate the tongue's biomechanical response to the activation of an important tongue muscle, before and after cancer surgery.

  7. Space and Ground-Based Infrastructures

    Science.gov (United States)

    Weems, Jon; Zell, Martin

    This chapter deals first with the main characteristics of the space environment, outside and inside a spacecraft. Then the space and space-related (ground-based) infrastructures are described. The most important infrastructure is the International Space Station, which holds many European facilities (for instance the European Columbus Laboratory). Some of them, such as the Columbus External Payload Facility, are located outside the ISS to benefit from external space conditions. There is only one other example of orbital platforms, the Russian Foton/Bion Recoverable Orbital Capsule. In contrast, non-orbital weightless research platforms, although limited in experimental time, are more numerous: sounding rockets, parabolic flight aircraft, drop towers and high-altitude balloons. In addition to these facilities, there are a number of ground-based facilities and space simulators, for both life sciences (for instance: bed rest, clinostats) and physical sciences (for instance: magnetic compensation of gravity). Hypergravity can also be provided by human and non-human centrifuges.

  8. Automatic sequences

    CERN Document Server

    Haeseler, Friedrich

    2003-01-01

    Automatic sequences are sequences which are produced by a finite automaton. Although they are not random, they may appear random. They are complicated in the sense of not being ultimately periodic; it may not be easy to name the rule by which a sequence is generated, yet such a rule exists. The concept of automatic sequences has special applications in algebra, number theory, finite automata and formal languages, and combinatorics on words. The text deals with different aspects of automatic sequences, in particular: a general introduction to automatic sequences; the basic (combinatorial) properties of automatic sequences; the algebraic approach to automatic sequences; and geometric objects related to automatic sequences.

  9. Development of Ground-Based Plant Sentinels

    Science.gov (United States)

    2007-11-02

    Final Technical Report, April 2001 - April 2003: Developing Plants as Ground-based Sentinels. Plants emit volatile mixes characteristic of exposure to both plant and animal (insect) pathogens (bacteria and fungi).

  10. Illumination compensation in ground based hyperspectral imaging

    Science.gov (United States)

    Wendel, Alexander; Underwood, James

    2017-07-01

    Hyperspectral imaging has emerged as an important tool for analysing vegetation data in agricultural applications. Recently, low altitude and ground based hyperspectral imaging solutions have come to the fore, providing very high resolution data for mapping and studying large areas of crops in detail. However, these platforms introduce a unique set of challenges that need to be overcome to ensure consistent, accurate and timely acquisition of data. One particular problem is dealing with changes in environmental illumination while operating with natural light under cloud cover, which can have considerable effects on spectral shape. In the past this has been commonly achieved by imaging known reference targets at the time of data acquisition, direct measurement of irradiance, or atmospheric modelling. While capturing a reference panel continuously or very frequently allows accurate compensation for illumination changes, this is often not practical with ground based platforms, and impossible in aerial applications. This paper examines the use of an autonomous unmanned ground vehicle (UGV) to gather high resolution hyperspectral imaging data of crops under natural illumination. A process of illumination compensation is performed to extract the inherent reflectance properties of the crops, despite variable illumination. This work adapts a previously developed subspace model approach to reflectance and illumination recovery. Though tested on a ground vehicle in this paper, it is applicable to low altitude unmanned aerial hyperspectral imagery also. The method uses occasional observations of reference panel training data from within the same or other datasets, which enables a practical field protocol that minimises in-field manual labour. This paper tests the new approach, comparing it against traditional methods. Several illumination compensation protocols for high volume ground based data collection are presented based on the results. The findings in this paper are

  11. Automatic Generation of Indoor Navigable Space Using a Point Cloud and its Scanner Trajectory

    Science.gov (United States)

    Staats, B. R.; Diakité, A. A.; Voûte, R. L.; Zlatanova, S.

    2017-09-01

    Automatic generation of indoor navigable models is mostly based on 2D floor plans. However, in many cases the floor plans are out of date. Buildings are not always built according to their blueprints, interiors might change after a few years because of modified walls and doors, and furniture may be repositioned to the user's preferences. Therefore, new approaches for the quick recording of indoor environments should be investigated. This paper concentrates on laser scanning with a Mobile Laser Scanner (MLS) device. The MLS device stores a point cloud and its trajectory. If the MLS device is operated by a human, the trajectory contains information which can be used to distinguish different surfaces. In this paper a method is presented for the identification of walkable surfaces based on the analysis of the point cloud and the trajectory of the MLS scanner. This method consists of several steps. First, the point cloud is voxelized. Second, the trajectory is analysed and projected to acquire seed voxels. Third, these seed voxels are grown into floor regions by means of a region-growing process. By identifying dynamic objects, doors and furniture, these floor regions can be modified so that each region represents a specific navigable space inside a building as a free navigable voxel space. By combining the point cloud and its corresponding trajectory, the walkable space can be identified for any type of building, even if the interior is scanned during business hours.
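    The seed-and-grow step described above can be sketched as a breadth-first region growing over walkable voxel indices. This is a minimal illustration with hypothetical names, not the authors' implementation:

```python
from collections import deque

def grow_floor_region(walkable, seed):
    """BFS region growing over a set of walkable voxel indices (i, j, k),
    starting from a seed voxel derived from the scanner trajectory."""
    if seed not in walkable:
        return set()
    region = {seed}
    queue = deque([seed])
    # 6-connectivity: neighbouring voxels sharing a face
    offsets = [(1, 0, 0), (-1, 0, 0), (0, 1, 0),
               (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    while queue:
        i, j, k = queue.popleft()
        for di, dj, dk in offsets:
            n = (i + di, j + dj, k + dk)
            if n in walkable and n not in region:
                region.add(n)
                queue.append(n)
    return region
```

Each trajectory-derived seed yields one connected floor region; disconnected walkable voxels (e.g. a table top) are left out, mirroring the paper's separation of navigable spaces.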

  12. Automatic Generation of Optimized and Synthesizable Hardware Implementation from High-Level Dataflow Programs

    Directory of Open Access Journals (Sweden)

    Khaled Jerbi

    2012-01-01

    Full Text Available In this paper, we introduce the Reconfigurable Video Coding (RVC) standard, based on the idea that video processing algorithms can be defined as a library of components that can be updated and standardized separately. The MPEG RVC framework aims at providing a unified high-level specification of current MPEG coding technologies using a dataflow language called Cal Actor Language (CAL). CAL is associated with a set of tools to design dataflow applications and to generate hardware and software implementations. Before this work, the existing CAL hardware compilers did not support high-level features of CAL. After presenting the main notions of the RVC standard, this paper introduces an automatic transformation process that analyses the non-compliant features and makes the required changes in the intermediate representation of the compiler while keeping the same behavior. Finally, the implementation results of the transformation on video and still image decoders are summarized. We show that the obtained results can largely satisfy the real-time constraints for an embedded design on FPGA, as we obtain a throughput of 73 FPS for the MPEG-4 decoder and 34 FPS for the coding and decoding process of the LAR coder using a video of CIF image size. This work resolves the main limitation of hardware generation from CAL designs.

  13. A Simulink Library of cryogenic components to automatically generate control schemes for large Cryorefrigerators

    Science.gov (United States)

    Bonne, François; Alamir, Mazen; Hoa, Christine; Bonnay, Patrick; Bon-Mardion, Michel; Monteiro, Lionel

    2015-12-01

    In this article, we present a new Simulink library of cryogenic components (valves, phase separators, mixers, heat exchangers, etc.) that can be assembled to generate model-based control schemes. Every component is described by its algebraic or differential equations and can be assembled with others to build the dynamical model of a complete refrigerator or of a subpart of it. The obtained model can be used to automatically design advanced model-based control schemes. It can also be used to design a model-based PI controller. Advanced control schemes aim to replace classical designs based on user experience, usually built from many independent PI controllers. This is particularly useful in the case where cryoplants are subjected to large pulsed thermal loads, expected to take place in future fusion reactors such as the cryogenic cooling systems of the International Thermonuclear Experimental Reactor (ITER) or the Japan Torus-60 Super Advanced Fusion Experiment (JT-60SA). The paper gives the example of the generation of the dynamical model of the 400W@1.8K refrigerator and shows how to build a Constrained Model Predictive Control for it. Based on this scheme, experimental results will be given. This work is supported by the French national research agency (ANR) through the ANR-13-SEED-0005 CRYOGREEN program.
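    The component-assembly idea can be illustrated outside Simulink as well. The sketch below is a hypothetical Python analogue, not the actual library: each component carries its own differential equation, components are chained into a model, and the assembled model is integrated with explicit Euler:

```python
class FirstOrderComponent:
    """Toy stand-in for a library component (e.g. a heat exchanger lag):
    tau * dx/dt = gain * u - x."""
    def __init__(self, tau, gain):
        self.tau, self.gain, self.state = tau, gain, 0.0

    def derivative(self, u):
        return (self.gain * u - self.state) / self.tau

def simulate(chain, u, dt, steps):
    """Integrate a chain of components with explicit Euler; the output of
    each component feeds the next, as when blocks are assembled."""
    for _ in range(steps):
        signal = u
        for comp in chain:
            comp.state += dt * comp.derivative(signal)
            signal = comp.state
    return [c.state for c in chain]
```

For a single component with tau = 1 and gain = 2 driven by a unit input, the state settles near 2, matching the analytic steady state of the underlying equation.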

  14. A review of metaphase chromosome image selection techniques for automatic karyotype generation.

    Science.gov (United States)

    Arora, Tanvi; Dhir, Renu

    2016-08-01

    The karyotype is analyzed to detect the genetic abnormalities. It is generated by arranging the chromosomes after extracting them from the metaphase chromosome images. The chromosomes are non-rigid bodies that contain the genetic information of an individual. The metaphase chromosome image spread contains the chromosomes, but these chromosomes are not distinct bodies; they can either be individual chromosomes or be touching one another; they may be bent or even may be overlapping and thus forming a cluster of chromosomes. The extraction of chromosomes from these touching and overlapping chromosomes is a very tedious process. The segmentation of a random metaphase chromosome image may not give us correct and accurate results. Therefore, before taking up a metaphase chromosome image for analysis, it must be analyzed for the orientation of the chromosomes it contains. The various reported methods for metaphase chromosome image selection for automatic karyotype generation are compared in this paper. After analysis, it has been concluded that each metaphase chromosome image selection method has its advantages and disadvantages.

  15. Performance Evaluation of Antlion Optimizer Based Regulator in Automatic Generation Control of Interconnected Power System

    Directory of Open Access Journals (Sweden)

    Esha Gupta

    2016-01-01

    Full Text Available This paper presents an application of the recently introduced Antlion Optimizer (ALO) to find the parameters of the primary governor loop of thermal generators for successful Automatic Generation Control (AGC) of a two-area interconnected power system. Two standard objective functions, Integral Square Error (ISE) and Integral Time Absolute Error (ITAE), have been employed to carry out this parameter estimation process. The problem is transformed into an optimization problem to obtain integral gains, speed regulation, and frequency sensitivity coefficient for both areas. The regulator performance obtained from ALO is compared with that of Genetic Algorithm (GA), Particle Swarm Optimization (PSO), and Gravitational Search Algorithm (GSA) based regulators. Different types of perturbations and load changes are incorporated to establish the efficacy of the obtained design. It is observed that ALO outperforms all three optimization methods for this real problem. The optimization performance of ALO is compared with the other algorithms on the basis of standard deviations in the values of parameters and objective functions.
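    The two objective functions are standard: ISE integrates the squared error, while ITAE weights the absolute error by time so that late deviations are penalized more. Assuming a discretized error signal e(t) (e.g., a sampled frequency deviation), they can be sketched as trapezoidal integrals (illustrative helper names):

```python
def _trapezoid(y, t):
    """Trapezoidal integral of samples y over time grid t."""
    return sum((y[i] + y[i + 1]) * (t[i + 1] - t[i])
               for i in range(len(t) - 1)) / 2.0

def ise(t, e):
    """Integral Square Error: integral of e(t)^2 dt."""
    return _trapezoid([v * v for v in e], t)

def itae(t, e):
    """Integral Time Absolute Error: integral of t * |e(t)| dt."""
    return _trapezoid([ti * abs(v) for ti, v in zip(t, e)], t)
```

An optimizer such as ALO would evaluate one of these integrals on the simulated system response for each candidate parameter set and minimize it.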

  16. Solution to automatic generation control problem using firefly algorithm optimized I(λ)D(µ) controller.

    Science.gov (United States)

    Debbarma, Sanjoy; Saikia, Lalit Chandra; Sinha, Nidul

    2014-03-01

    The present work focuses on automatic generation control (AGC) of three unequal thermal areas considering reheat turbines and appropriate generation rate constraints (GRC). A fractional-order (FO) controller, named the I(λ)D(µ) controller and based on the CRONE approximation, is proposed for the first time as an appropriate technique to solve the multi-area AGC problem in power systems. A recently developed metaheuristic algorithm known as the firefly algorithm (FA) is used for the simultaneous optimization of the gains and other parameters such as the order of the integrator (λ) and differentiator (μ) of the I(λ)D(µ) controller and the governor speed regulation parameters (R). The dynamic responses corresponding to the optimized I(λ)D(µ) controller gains, λ, μ, and R are compared with those of classical integer-order (IO) controllers such as I, PI and PID controllers. Simulation results show that the proposed I(λ)D(µ) controller provides improved dynamic responses and outperforms the IO-based classical controllers. Further, sensitivity analysis confirms the robustness of the optimized I(λ)D(µ) controller to wide changes in system loading conditions and in the size and position of the step load perturbation (SLP). The proposed controller is also found to perform well compared to IO-based controllers when the SLP takes place simultaneously in any two areas or in all areas. Robustness of the proposed I(λ)D(µ) controller is also tested against system parameter variations.

  17. Automatic generation and verification of railway interlocking control tables using FSM and NuSMV

    Directory of Open Access Journals (Sweden)

    Mohammad B. YAZDI

    2009-01-01

    Full Text Available Due to their important role in providing safe conditions for train movements, railway interlocking systems are considered safety-critical systems. The reliability, safety and integrity of these systems rely on the reliability and integrity of all stages in their lifecycle, including design, verification, manufacture, test, operation and maintenance. In this paper, the automatic generation and verification of interlocking control tables, one of the most important stages in the interlocking design process, is addressed by the safety-critical research group in the School of Railway Engineering (SRE). Three subsystems are introduced: a graphical signalling layout planner, a control table generator and a control table verifier. Using the NuSMV model checker, the control table verifier analyses the contents of the control table against safe train movement conditions and checks for any conflicting settings in the table. This includes settings for conflicting routes, signals and points, as well as settings for route isolation and single and multiple overlap situations. The last two, route isolation and multiple overlap situations, are new outcomes of this work compared with recently published work on the subject.
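    One of the checks such a verifier performs, detecting conflicting routes, amounts to finding route pairs that lock a common track section. A minimal sketch with a hypothetical data layout (the actual tool performs this and further checks via NuSMV model checking, not via direct enumeration):

```python
def conflicting_routes(routes):
    """routes: dict mapping a route id to the set of track sections it locks.
    Returns the route pairs that share a section and therefore must be
    mutually excluded in the control table."""
    ids = sorted(routes)
    return [(a, b)
            for i, a in enumerate(ids)
            for b in ids[i + 1:]
            if routes[a] & routes[b]]  # non-empty intersection = conflict
```

Each returned pair corresponds to a mutual-exclusion entry that the generated control table must contain, and that the model checker would confirm against the safety conditions.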

  18. Integration of Variable Speed Pumped Hydro Storage in Automatic Generation Control Systems

    Science.gov (United States)

    Fulgêncio, N.; Moreira, C.; Silva, B.

    2017-04-01

    Pumped storage power (PSP) plants are expected to be an important player in modern electrical power systems when dealing with increasing shares of new renewable energies (NRE) such as solar or wind power. The massive penetration of NRE and consequent replacement of conventional synchronous units will significantly affect the controllability of the system. In order to evaluate the capability of variable speed PSP plants participation in the frequency restoration reserve (FRR) provision, taking into account the expected performance in terms of improved ramp response capability, a comparison with conventional hydro units is presented. In order to address this issue, a three area test network was considered, as well as the corresponding automatic generation control (AGC) systems, being responsible for re-dispatching the generation units to re-establish power interchange between areas as well as the system nominal frequency. The main issue under analysis in this paper is related to the benefits of the fast response of variable speed PSP with respect to its capability of providing fast power balancing in a control area.

  19. Ground based spectroscopy of hot Jupiters

    Science.gov (United States)

    Waldmann, Ingo

    2010-05-01

    It has been shown in recent years with great success that spectroscopy of exoplanetary atmospheres is feasible using space based observatories such as the HST and Spitzer. However, with the end of the Spitzer cold-phase, space based observations in the near to mid infra-red are limited, which will remain true until the onset of the JWST. The importance of developing methods of ground based spectroscopic analysis of known hot Jupiters is therefore apparent. In the past, various groups have attempted exoplanetary spectroscopy using ground based facilities and various techniques. Here I will present results using a novel spectral retrieval method for near to mid infra-red emission and transmission spectra of exoplanetary atmospheres taken from the ground and discuss the feasibility of future ground-based spectroscopy in a broader context. My recently commenced PhD project is under the supervision of Giovanna Tinetti (University College London) and in collaboration with J. P. Beaulieu (Institut d'Astrophysique de Paris), Mark Swain and Pieter Deroo (Jet Propulsion Laboratory, Caltech).

  20. Automatic generation of a JET 3D neutronics model from CAD geometry data for Monte Carlo calculations

    Energy Technology Data Exchange (ETDEWEB)

    Tsige-Tamirat, H. [Association FZK-Euratom, Forschungszentrum Karlsruhe, P.O. Box 3640, 76021 Karlsruhe (Germany)]. E-mail: tsige@irs.fzk.de; Fischer, U. [Association FZK-Euratom, Forschungszentrum Karlsruhe, P.O. Box 3640, 76021 Karlsruhe (Germany); Carman, P.P. [Euratom/UKAEA Fusion Association, Culham Science Center, Abingdon, Oxfordshire OX14 3DB (United Kingdom); Loughlin, M. [Euratom/UKAEA Fusion Association, Culham Science Center, Abingdon, Oxfordshire OX14 3DB (United Kingdom)

    2005-11-15

    The paper describes the automatic generation of a JET 3D neutronics model from data of computer aided design (CAD) system for Monte Carlo (MC) calculations. The applied method converts suitable CAD data into a representation appropriate for MC codes. The converted geometry is fully equivalent to the CAD geometry.

  1. Automatic treatment planning facilitates fast generation of high-quality treatment plans for esophageal cancer.

    Science.gov (United States)

    Hansen, Christian Rønn; Nielsen, Morten; Bertelsen, Anders Smedegaard; Hazell, Irene; Holtved, Eva; Zukauskaite, Ruta; Bjerregaard, Jon Kroll; Brink, Carsten; Bernchou, Uffe

    2017-08-25

    The quality of radiotherapy planning has improved substantially in the last decade with the introduction of intensity modulated radiotherapy. The purpose of this study was to analyze the plan quality and efficacy of automatically (AU) generated VMAT plans for inoperable esophageal cancer patients. Thirty-two consecutive inoperable patients with esophageal cancer originally treated with manually (MA) generated volumetric modulated arc therapy (VMAT) plans were retrospectively replanned using an auto-planning engine. All plans were optimized with one full 6MV VMAT arc giving 60 Gy to the primary target and 50 Gy to the elective target. The planning techniques were blinded before clinical evaluation by three specialized oncologists. To supplement the clinical evaluation, the optimization time for the AU plan was recorded along with DVH parameters for all plans. Upon clinical evaluation, the AU plan was preferred for 31/32 patients, and for one patient, there was no difference in the plans. In terms of DVH parameters, similar target coverage was obtained between the two planning methods. The mean dose for the spinal cord increased by 1.8 Gy using AU (p = .002), whereas the mean lung dose decreased by 1.9 Gy (p plans were more modulated as seen by the increase of 12% in mean MUs (p = .001). The median optimization time for AU plans was 117 min. The AU plans were in general preferred and showed a lower mean dose to the lungs. The automation of the planning process generated esophageal cancer treatment plans quickly and with high quality.

  2. ScholarLens: extracting competences from research publications for the automatic generation of semantic user profiles

    Directory of Open Access Journals (Sweden)

    Bahar Sateli

    2017-07-01

    Full Text Available Motivation: Scientists increasingly rely on intelligent information systems to help them in their daily tasks, in particular for managing research objects, like publications or datasets. The relatively young research field of Semantic Publishing has been addressing the question how scientific applications can be improved through semantically rich representations of research objects, in order to facilitate their discovery and re-use. To complement the efforts in this area, we propose an automatic workflow to construct semantic user profiles of scholars, so that scholarly applications, like digital libraries or data repositories, can better understand their users’ interests, tasks, and competences, by incorporating these user profiles in their design. To make the user profiles sharable across applications, we propose to build them based on standard semantic web technologies, in particular the Resource Description Framework (RDF) for representing user profiles and Linked Open Data (LOD) sources for representing competence topics. To avoid the cold start problem, we suggest to automatically populate these profiles by analyzing the publications (co-)authored by users, which we hypothesize reflect their research competences. Results: We developed a novel approach, ScholarLens, which can automatically generate semantic user profiles for authors of scholarly literature. For modeling the competences of scholarly users and groups, we surveyed a number of existing linked open data vocabularies. In accordance with the LOD best practices, we propose an RDF Schema (RDFS) based model for competence records that reuses existing vocabularies where appropriate. To automate the creation of semantic user profiles, we developed a complete, automated workflow that can generate semantic user profiles by analyzing full-text research articles through various natural language processing (NLP) techniques. In our method, we start by processing a set of research articles for a

  3. Automatic Generation of Algorithms for the Statistical Analysis of Planetary Nebulae Images

    Science.gov (United States)

    Fischer, Bernd

    2004-01-01

    Analyzing data sets collected in experiments or by observations is a core scientific activity. Typically, experimental and observational data are fraught with uncertainty, and the analysis is based on a statistical model of the conjectured underlying processes. The large data volumes collected by modern instruments make computer support indispensable for this. Consequently, scientists spend significant amounts of their time on the development and refinement of data analysis programs. AutoBayes [GF+02, FS03] is a fully automatic synthesis system for generating statistical data analysis programs. Externally, it looks like a compiler: it takes an abstract problem specification and translates it into executable code. Its input is a concise description of a data analysis problem in the form of a statistical model as shown in Figure 1; its output is optimized and fully documented C/C++ code which can be linked dynamically into the Matlab and Octave environments. Internally, however, it is quite different: AutoBayes derives a customized algorithm implementing the given model using a schema-based process, and then further refines and optimizes the algorithm into code. A schema is a parameterized code template with associated semantic constraints which define and restrict the template's applicability. The schema parameters are instantiated in a problem-specific way during synthesis as AutoBayes checks the constraints against the original model or, recursively, against emerging sub-problems. The AutoBayes schema library contains problem decomposition operators (which are justified by theorems in a formal logic in the domain of Bayesian networks) as well as machine learning algorithms (e.g., EM, k-Means) and numeric optimization methods (e.g., Nelder-Mead simplex, conjugate gradient). AutoBayes augments this schema-based approach with symbolic computation to derive closed-form solutions whenever possible. This is a major advantage over other statistical data analysis systems.

  4. SU-F-BRB-16: A Spreadsheet Based Automatic Trajectory GEnerator (SAGE): An Open Source Tool for Automatic Creation of TrueBeam Developer Mode Robotic Trajectories

    Energy Technology Data Exchange (ETDEWEB)

    Etmektzoglou, A; Mishra, P; Svatos, M [Varian Medical Systems, Palo Alto, CA (United States)

    2015-06-15

    Purpose: To automate creation and delivery of robotic linac trajectories with TrueBeam Developer Mode, an open source spreadsheet-based trajectory generation tool has been developed, tested and made freely available. The computing power inherent in a spreadsheet environment plus additional functions programmed into the tool insulate users from the underlying schema tedium and allow easy calculation, parameterization, graphical visualization, validation and finally automatic generation of Developer Mode XML scripts which are directly loadable on a TrueBeam linac. Methods: The robotic control system platform that allows total coordination of potentially all linac moving axes with beam (continuous, step-and-shoot, or combination thereof) becomes available in TrueBeam Developer Mode. Many complex trajectories are either geometric or can be described in analytical form, making the computational power, graphing and programmability available in a spreadsheet environment an easy and ideal vehicle for automatic trajectory generation. The spreadsheet environment allows also for parameterization of trajectories thus enabling the creation of entire families of trajectories using only a few variables. Standard spreadsheet functionality has been extended for powerful movie-like dynamic graphic visualization of the gantry, table, MLC, room, lasers, 3D observer placement and beam centerline all as a function of MU or time, for analysis of the motions before requiring actual linac time. Results: We used the tool to generate and deliver extended SAD “virtual isocenter” trajectories of various shapes such as parameterized circles and ellipses. We also demonstrated use of the tool in generating linac couch motions that simulate respiratory motion using analytical parameterized functions. Conclusion: The SAGE tool is a valuable resource to experiment with families of complex geometric trajectories for a TrueBeam Linac. It makes Developer Mode more accessible as a vehicle to quickly
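    A parameterized circular trajectory of the kind delivered in the results can be sketched as a sampled list of control points (illustrative parameter names; the real tool emits TrueBeam Developer Mode XML, which is not reproduced here):

```python
import math

def circle_trajectory(radius_deg, n_points, mu_total):
    """Sample a parameterized circular path as (MU, gantry_deg, couch_deg)
    control points, with MU advancing linearly over one full revolution."""
    pts = []
    for i in range(n_points):
        frac = i / (n_points - 1)      # 0 .. 1 along the trajectory
        theta = 2 * math.pi * frac     # angle around the circle
        pts.append((mu_total * frac,
                    radius_deg * math.cos(theta),
                    radius_deg * math.sin(theta)))
    return pts
```

Changing `radius_deg` or swapping the sine/cosine terms for other analytic functions yields whole families of trajectories from a few parameters, which is the spreadsheet parameterization idea in miniature.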

  5. An efficient algorithm for automatically generating multivariable fuzzy systems by Fourier series method.

    Science.gov (United States)

    Chen, Liang; Tokuda, N

    2002-01-01

    By exploiting the Fourier series expansion, we have developed a new constructive method of automatically generating a multivariable fuzzy inference system from any given sample set with the resulting multivariable function being constructed within any specified precision to the original sample set. The given sample sets are first decomposed into a cluster of simpler sample sets such that a single input fuzzy system is constructed readily for a sample set extracted directly from the cluster independent of the other variables. Once the relevant fuzzy rules and membership functions are constructed for each of the variables completely independent of the other variables, the resulting decomposed fuzzy rules and membership functions are integrated back into the fuzzy system appropriate for the original sample set requiring only a moderate cost of computation in the required decomposition and composition processes. After proving two basic theorems which we need to ensure the validity of the decomposition and composition processes of the system construction, we have demonstrated a constructive algorithm of a multivariable fuzzy system. Exploiting an implicit error bound analysis available at each of the construction steps, the present Fourier method is capable of implementing a more stable fuzzy system than the power series expansion method of ParNeuFuz and PolyNeuFuz, covering and implementing a wider range of more robust applications.
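    The first step, expanding a sampled single-variable signal into a truncated Fourier series, can be sketched with the discrete coefficient formulas (an illustration of the expansion step only, not of the fuzzy-rule construction; names are hypothetical):

```python
import math

def fourier_coefficients(y, n_harmonics):
    """Discrete Fourier-series coefficients of samples y taken uniformly
    on [0, 2*pi) with the endpoint excluded."""
    n = len(y)
    a0 = sum(y) / n
    coeffs = []
    for k in range(1, n_harmonics + 1):
        ak = 2.0 / n * sum(y[i] * math.cos(k * 2 * math.pi * i / n)
                           for i in range(n))
        bk = 2.0 / n * sum(y[i] * math.sin(k * 2 * math.pi * i / n)
                           for i in range(n))
        coeffs.append((ak, bk))
    return a0, coeffs

def fourier_eval(x, a0, coeffs):
    """Evaluate the truncated series a0 + sum_k (ak cos kx + bk sin kx)."""
    return a0 + sum(a * math.cos((k + 1) * x) + b * math.sin((k + 1) * x)
                    for k, (a, b) in enumerate(coeffs))
```

Truncating at `n_harmonics` gives the explicit error control the abstract mentions: adding harmonics shrinks the residual, so the expansion can be carried to any specified precision before fuzzy rules are built from it.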

  6. Ground-based Space Weather Monitoring with LOFAR

    Science.gov (United States)

    Wise, Michael; van Haarlem, Michiel; Lawrence, Gareth; Reid, Simon; Bos, Andre; Rawlings, Steve; Salvini, Stef; Mitchell, Cathryn; Soleimani, Manuch; Amado, Sergio; Teresa, Vital

    As one of the first of a new generation of radio instruments, the International LOFAR Telescope (ILT) will provide a number of unique and novel capabilities for the astronomical community. These include remote configuration and operation, dynamic real-time processing and system response, and the ability to provide multiple simultaneous streams of data to a community whose scientific interests run the gamut from lightning in the atmospheres of distant planets to the origins of the universe itself. The LOFAR (LOw Frequency ARray) system is optimized for a frequency range from 30-240 MHz and consists of multiple antenna fields spread across Europe. In the Netherlands, a total of 36 LOFAR stations are nearing completion with an initial 8 international stations currently being deployed in Germany, France, Sweden, and the UK. Digital beam-forming techniques make the LOFAR system agile and allow for rapid repointing of the telescope as well as the potential for multiple simultaneous observations. With its dense core array and long interferometric baselines, LOFAR has the potential to achieve unparalleled sensitivity and spatial resolution in the low frequency radio regime. LOFAR will also be one of the first radio observatories to feature automated processing pipelines to deliver fully calibrated science products to its user community. As we discuss in this presentation, the same capabilities that make LOFAR a powerful tool for radio astronomy also provide an excellent platform upon which to build a ground-based monitoring system for space weather events. For example, the ability to monitor Solar activity in near real-time is one of the key scientific capabilities being developed for LOFAR. With only a fraction of its total observing capacity, LOFAR will be able to provide continuous monitoring of the Solar spectrum over the entire 10-240 MHz band down to microsecond timescales. Autonomous routines will scan these incoming spectral data for evidence of Solar flares and be

  7. Calibration of Ground-based Lidar instrument

    DEFF Research Database (Denmark)

    Yordanova, Ginka; Gómez Arranz, Paula

    This report presents the result of the lidar calibration performed for the given Ground-based Lidar at DTU’s test site for large wind turbines at Høvsøre, Denmark. Calibration is here understood as the establishment of a relation between the reference wind speed measurements with measurement...... uncertainties provided by measurement standard and corresponding lidar wind speed indications with associated measurement uncertainties. The lidar calibration concerns the 10 minute mean wind speed measurements. The comparison of the lidar measurements of the wind direction with that from wind vanes...

  8. Calibration of Ground-based Lidar instrument

    DEFF Research Database (Denmark)

    Yordanova, Ginka; Gómez Arranz, Paula

    This report presents the result of a test of a ground-based lidar of another type. The test was performed at DTU’s test site for large wind turbines at Høvsøre, Denmark. The result is the establishment of a relation between the reference wind speed measurements with measurement uncertainties provided...... by measurement standard and corresponding lidar wind speed indications with associated measurement uncertainties. The comparison of the lidar measurements of the wind direction with that from the wind vanes is also given....

  9. Calibration of Ground-based Lidar instrument

    DEFF Research Database (Denmark)

    Villanueva, Héctor; Yordanova, Ginka

    This report presents the result of the lidar calibration performed for the given Ground-based Lidar at DTU’s test site for large wind turbines at Høvsøre, Denmark. Calibration is here understood as the establishment of a relation between the reference wind speed measurements with measurement...... uncertainties provided by measurement standard and corresponding lidar wind speed indications with associated measurement uncertainties. The lidar calibration concerns the 10 minute mean wind speed measurements. The comparison of the lidar measurements of the wind direction with that from wind vanes...

  10. Calibration of Ground-based Lidar instrument

    DEFF Research Database (Denmark)

    Yordanova, Ginka; Gómez Arranz, Paula

    This report presents the result of the lidar calibration performed for the given Ground-based Lidar at DTU’s test site for large wind turbines at Høvsøre, Denmark. Calibration is here understood as the establishment of a relation between the reference wind speed measurements with measurement...... uncertainties provided by measurement standard and corresponding lidar wind speed indications with associated measurement uncertainties. The lidar calibration concerns the 10 minute mean wind speed measurements. The comparison of the lidar measurements of the wind direction with that from wind vanes...

  11. Calibration of Ground-based Lidar instrument

    DEFF Research Database (Denmark)

    Villanueva, Héctor; Georgieva Yankova, Ginka

    This report presents the result of the lidar calibration performed for the given Ground-based Lidar at DTU’s test site for large wind turbines at Høvsøre, Denmark. Calibration is here understood as the establishment of a relation between the reference wind speed measurements with measurement...... uncertainties provided by measurement standard and corresponding lidar wind speed indications with associated measurement uncertainties. The lidar calibration concerns the 10 minute mean wind speed measurements. The comparison of the lidar measurements of the wind direction with that from wind vanes...

  12. Toolkits for Automatic Service Generation: WATT and Kill-A-WATT

    Science.gov (United States)

    Qu, Y.; Bollig, E. F.; Erlebacher, G.

    2007-12-01

    As part of the NSF funded VLab consortium [1], we have been involved in the automatic generation of visualization web services using the Web Automation and Translation Toolkit (WATT) compiler. The WATT compiler converts VTK Tcl input scripts into equivalent yet more efficient C++ web services by interpreting code structure, translating and then integrating bindings to the gSOAP library. WATT seeks to completely automate code distribution, integration of transport protocols and interface generation. Ideally, developers should concentrate on writing core applications, and let WATT transform them into web services in the background. Currently, the WATT compiler is limited to converting known Tcl commands and types to C++. For VTK a simple one to one mapping between Tcl and C++ is enforced, but Tcl commands without direct mappings slow the compilation process and require new mappings to be created. Loops and conditional statements are not yet implemented. In an effort to move forward with automation and not get caught up in the details of cross-language compilation, we developed a new application: Kill-A-WATT (KWATT). KWATT is a C++ application that utilizes the C++/Tcl library [2] to evaluate Tcl input scripts using the official Tcl interpreter. During evaluation of the input script, KWATT interprets code structure, integrating communication details via a Tcl-specific SOAP library [3]. Since KWATT drives the Tcl interpreter, the application has access to the full Tcl command base plus the ability to load new commands from other packages. KWATT is not a compiler; instead, it is a stand-alone application that is itself a web service. When KWATT consumes Tcl input, the generated web methods extend the list of previously available commands. This implies that C++ web methods statically defined in KWATT provide a set of standard methods available to every service. Also, since KWATT uses the Tcl interpreter, it has the potential to accept additional Tcl at any time while
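
    The KWATT idea of a service that extends itself by consuming interpreter input can be sketched in a few lines. The sketch below is illustrative only: Python's exec stands in for the embedded Tcl interpreter, and the class and method names are invented, not part of KWATT.

```python
# Toy sketch of the KWATT idea: the service itself embeds an interpreter,
# and every script it consumes extends the set of callable "web methods".
# Python's exec stands in for driving the Tcl interpreter; names are invented.

class ScriptDrivenService:
    def __init__(self):
        # statically defined methods available to every service instance
        self.methods = {"ping": lambda: "pong"}

    def consume(self, script: str):
        """Evaluate a script; any function it defines becomes a web method."""
        namespace = {}
        exec(script, namespace)           # stand-in for the embedded interpreter
        for name, obj in namespace.items():
            if callable(obj) and not name.startswith("__"):
                self.methods[name] = obj  # generated method extends the service

    def call(self, name, *args):
        return self.methods[name](*args)

service = ScriptDrivenService()
service.consume("def add(a, b):\n    return a + b")
print(service.call("ping"))       # statically defined method
print(service.call("add", 2, 3))  # method generated from consumed script
```

    As in KWATT, the statically defined methods form a standard set available to every service, while consumed input dynamically extends the command base.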

  13. BioASF: a framework for automatically generating executable pathway models specified in BioPAX.

    Science.gov (United States)

    Haydarlou, Reza; Jacobsen, Annika; Bonzanni, Nicola; Feenstra, K Anton; Abeln, Sanne; Heringa, Jaap

    2016-06-15

    Biological pathways play a key role in most cellular functions. To better understand these functions, diverse computational and cell biology researchers use biological pathway data for various analysis and modeling purposes. For specifying these biological pathways, a community of researchers has defined BioPAX and provided various tools for creating, validating and visualizing BioPAX models. However, a generic software framework for simulating BioPAX models is missing. Here, we attempt to fill this gap by introducing a generic simulation framework for BioPAX. The framework explicitly separates the execution model from the model structure as provided by BioPAX, with the advantage that the modelling process becomes more reproducible and intrinsically more modular; this ensures natural biological constraints are satisfied upon execution. The framework is based on the principles of discrete event systems and multi-agent systems, and is capable of automatically generating a hierarchical multi-agent system for a given BioPAX model. To demonstrate the applicability of the framework, we simulated two types of biological network models: a gene regulatory network modeling the haematopoietic stem cell regulators and a signal transduction network modeling the Wnt/β-catenin signaling pathway. We observed that the results of the simulations performed using our framework were entirely consistent with the simulation results reported by the researchers who developed the original models in a proprietary language. The framework, implemented in Java, is open source and its source code, documentation and tutorial are available at http://www.ibi.vu.nl/programs/BioASF CONTACT: j.heringa@vu.nl. © The Author 2016. Published by Oxford University Press.
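
    The execution principle described (a discrete-event system driving agents generated from the model) can be sketched minimally. This is not BioASF's actual architecture or API, only an illustrative toy: an event queue delivers signals to agents, and an agent propagates activation downstream, as a gene regulatory edge might.

```python
import heapq

# Minimal discrete-event sketch: Engine orders events by time, agents react to
# signals and schedule follow-up events. Class and signal names are invented.

class Engine:
    def __init__(self):
        self.queue = []   # entries: (time, sequence number, agent, signal)
        self.seq = 0
        self.now = 0.0

    def schedule(self, delay, agent, signal):
        heapq.heappush(self.queue, (self.now + delay, self.seq, agent, signal))
        self.seq += 1

    def run(self):
        while self.queue:
            self.now, _, agent, signal = heapq.heappop(self.queue)
            agent.receive(signal, self)

class GeneAgent:
    """Toy agent: becomes active on any signal and notifies its target."""
    def __init__(self, name, activates=None):
        self.name, self.activates, self.active = name, activates, False

    def receive(self, signal, engine):
        self.active = True
        if self.activates is not None:
            engine.schedule(1.0, self.activates, self.name)  # propagate downstream

target = GeneAgent("targetGene")
regulator = GeneAgent("regulator", activates=target)
engine = Engine()
engine.schedule(0.0, regulator, "externalStimulus")
engine.run()
print(target.active, engine.now)  # target activated at t = 1.0
```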

  14. Embedded Platform for Automatic Testing and Optimizing of FPGA Based Cryptographic True Random Number Generators

    Directory of Open Access Journals (Sweden)

    M. Varchola

    2009-12-01

    Full Text Available This paper deals with an evaluation platform for cryptographic True Random Number Generators (TRNGs) based on the hardware implementation of statistical tests for FPGAs. It was developed to provide an automatic tool that helps speed up the TRNG design process and can provide new insights into TRNG behavior, as will be shown on a particular example in the paper. It makes it possible to test the statistical properties of various TRNG designs under various working conditions on the fly. Moreover, the tests are suitable for embedding into cryptographic hardware products in order to recognize TRNG output of weak quality and thus increase robustness and reliability. The tests are fully compatible with the FIPS 140 standard and are implemented in the VHDL language as an IP core for vendor-independent FPGAs. A recent Flash-based Actel Fusion FPGA was chosen for preliminary experiments. The Actel version of the tests provides an interface to Actel’s CoreMP7 softcore processor, which is fully compatible with the industry-standard ARM7TDMI. Moreover, an identical test suite was implemented on the Xilinx Virtex 2 and 5 in order to compare the performance of the proposed solution with that of an already published one based on the same FPGAs. Clock frequencies 25% and 65% greater, respectively, were achieved while consuming almost equal resources on the Xilinx FPGAs. On top of that, the proposed FIPS 140 architecture is capable of processing one random bit per clock cycle, which results in a throughput of 311.5 Mbps on the Virtex 5 FPGA.
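
    As a software reference for the kind of statistical check the paper implements in VHDL, below is a sketch of the FIPS 140-2 monobit test; the thresholds (the count of ones in a 20,000-bit sample must lie strictly between 9725 and 10275) are from FIPS 140-2, and the poker, runs and long-run tests follow the same pattern.

```python
import random

def monobit_ok(bits):
    """FIPS 140-2 monobit test: on a 20,000-bit sample the number of
    ones must lie strictly between 9725 and 10275."""
    assert len(bits) == 20000
    ones = sum(bits)
    return 9725 < ones < 10275

rng = random.Random(42)
sample = [rng.randrange(2) for _ in range(20000)]
print(monobit_ok(sample))       # a healthy RNG should pass
print(monobit_ok([1] * 20000))  # a stuck-at-1 source fails
```

    A hardware implementation keeps a running counter over a 20,000-bit window and compares it against the same bounds, which is why such tests map naturally onto an FPGA.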

  15. Field Robotics in Sports: Automatic Generation of guidance Lines for Automatic Grass Cutting, Striping and Pitch Marking of Football Playing Fields

    Directory of Open Access Journals (Sweden)

    Ole Green

    2011-03-01

    Full Text Available Progress is constantly being made and new applications are constantly emerging in the area of field robotics. In this paper, a promising application of field robotics to football playing fields is introduced. An algorithmic approach is presented for generating the waypoints required to guide a GPS-based field robot through a football playing field to automatically carry out periodic tasks such as cutting the grass, pitch and line marking illustrations, and lawn striping. The manual operation of these tasks requires very skilful personnel able to work for long hours with very high concentration for the football pitch to be compatible with the standards of the Fédération Internationale de Football Association (FIFA). On the other hand, a GPS-guided vehicle or robot with three implements (grass mower, lawn striping roller and track marking illustrator) is capable of working 24 h a day, in most weather and in harsh soil conditions, without loss of quality. The proposed approach to the automatic operation of football playing fields requires no or very limited human intervention and therefore saves numerous working hours and frees workers to focus on other tasks. An economic feasibility study showed that the proposed method is economically superior to the current manual practices.
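
    One simple way to generate such guidance waypoints is a boustrophedon (back-and-forth) coverage pattern. The sketch below assumes a rectangular pitch and a fixed implement swath; the paper's algorithm works with real GPS coordinates and three implements, so all numbers here are illustrative.

```python
# Hedged sketch: waypoints for a mowing/striping pass over a rectangular field,
# with passes spaced one implement swath apart and alternating direction.

def boustrophedon_waypoints(length, width, swath):
    """Return (x, y) waypoints covering a length x width field."""
    waypoints = []
    y, direction = 0.0, 1
    while y <= width:
        xs = (0.0, length) if direction == 1 else (length, 0.0)
        waypoints.append((xs[0], y))  # start of the pass
        waypoints.append((xs[1], y))  # end of the pass
        y += swath
        direction = -direction        # turn around for the next pass
    return waypoints

# 105 m x 68 m pitch (FIFA standard dimensions), 2 m mower swath
path = boustrophedon_waypoints(105.0, 68.0, 2.0)
print(len(path), path[:4])
```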

  16. Automatic Generation of Analytic Equations for Vibrational and Rovibrational Constants from Fourth-Order Vibrational Perturbation Theory

    Science.gov (United States)

    Matthews, Devin A.; Gong, Justin Z.; Stanton, John F.

    2014-06-01

    The derivation of analytic expressions for vibrational and rovibrational constants, for example the anharmonicity constants χ_ij and the vibration-rotation interaction constants α_r^B, from second-order vibrational perturbation theory (VPT2) can be accomplished with pen and paper and some practice. However, the corresponding quantities from fourth-order perturbation theory (VPT4) are considerably more complex: the only known derivations by hand make extensive use of many layers of complicated intermediates, and for rotational quantities they require specialization to orthorhombic cases or to the form of Watson's reduced Hamiltonian. We present an automatic computer program for generating these expressions in full generality, based on adapting an existing numerical program built on the sum-over-states representation of the energy to a computer algebra context. The measures taken to produce well-simplified and factored expressions in an efficient manner are discussed, as well as the framework for automatically checking the correctness of the generated equations.

  17. DEVELOPMENT OF THE MODEL OF AN AUTOMATIC GENERATION OF TOTAL AMOUNTS OF COMMISSIONS IN INTERNATIONAL INTERBANK PAYMENTS

    Directory of Open Access Journals (Sweden)

    Dmitry N. Bolotov

    2013-01-01

    Full Text Available The article deals with the main form of international payment, the bank transfer, and with the correspondent fees that banks charge for transiting funds through their correspondent accounts. In order to optimize the cost of international money transfers, there is a need to develop a model and toolkit for the automatic generation of the total amount of commissions in international interbank settlements. Accordingly, an approach to the construction of such a model was developed based on graph theory.
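
    The graph-theoretic idea can be illustrated with a toy example: banks as nodes, correspondent relationships as edges weighted by the transit commission, and a shortest-path search giving the minimum total commission. The bank names, fees and the use of Dijkstra's algorithm below are illustrative assumptions, not the article's actual model.

```python
import heapq

def cheapest_route(fees, start, goal):
    """Dijkstra over a fee-weighted correspondent graph.
    fees: {bank: {neighbour: commission}}. Returns (total_fee, route)."""
    queue = [(0.0, start, [start])]
    seen = set()
    while queue:
        total, bank, route = heapq.heappop(queue)
        if bank == goal:
            return total, route
        if bank in seen:
            continue
        seen.add(bank)
        for nxt, fee in fees.get(bank, {}).items():
            if nxt not in seen:
                heapq.heappush(queue, (total + fee, nxt, route + [nxt]))
    return None

# invented correspondent network: edge weights are per-transfer commissions
fees = {
    "SenderBank": {"CorrA": 15.0, "CorrB": 5.0},
    "CorrA":      {"BeneficiaryBank": 5.0},
    "CorrB":      {"CorrA": 2.0, "BeneficiaryBank": 18.0},
}
print(cheapest_route(fees, "SenderBank", "BeneficiaryBank"))
# → (12.0, ['SenderBank', 'CorrB', 'CorrA', 'BeneficiaryBank'])
```

    Note that the cheapest total commission here uses two intermediaries (5 + 2 + 5 = 12), beating both shorter routes; this is exactly the kind of non-obvious total that an automatic model would surface.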

  18. An Automatic Generation System for Examination ID Cards

    Institute of Scientific and Technical Information of China (English)

    杨姝

    2001-01-01

    The paper describes the features of the automatic generation system for examination ID cards, the strategy of random scheduling, and the main algorithm for creating examination ID cards.

  19. AUTOMATIC GENERATION OF SQL TEST CASE SETS

    Institute of Scientific and Technical Information of China (English)

    丁祥武; 张钦; 韩朱忠

    2012-01-01

    Writing SQL statements is an important part of testing a database management system, and automatic generation of SQL statements can effectively reduce the tester's workload; at present there is almost no automated tool that generates SQL statements directly. By simulating the direct derivation process of grammar productions, we present a method that generates SQL statements conforming to a given SQL grammar, to be used as test cases. We study the automated process that goes from BNF files representing the grammar to generated SQL test case sets. The process has several stages: each non-terminal of the SQL grammar is converted to a corresponding parse function, and the set of all these parse functions forms the rule library; the productions of the grammar are traversed to generate SQL test cases automatically; weight arrays combined with random numbers increase the flexibility of test case generation; and a maximum call count for non-terminals is used to terminate the generation of SQL test cases. Through the tool prototype introduced, SQL test cases conforming to the SQL grammar can be derived.
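
    The derivation-based generation process can be sketched as follows. The toy grammar, the uniform random choice (standing in for the weight arrays) and the depth cap (standing in for the maximum call count of non-terminals) are all illustrative simplifications, not the paper's rule library.

```python
import random

# Toy BNF subset of SQL: each non-terminal maps to a list of productions.
GRAMMAR = {
    "<query>": [["SELECT", "<cols>", "FROM", "<table>", "<where>"]],
    "<cols>":  [["*"], ["id"], ["id", ",", "name"]],
    "<table>": [["t1"], ["t2"]],
    "<where>": [[], ["WHERE", "id", "=", "1"]],  # empty option terminates
}

def derive(symbol, rng, depth=0, max_depth=8):
    """Expand a symbol by direct derivation; cap depth to guarantee termination."""
    if symbol not in GRAMMAR:                 # terminal symbol
        return [symbol]
    options = GRAMMAR[symbol]
    if depth >= max_depth:                    # stand-in for the max-call limit
        options = [options[0]]
    production = rng.choice(options)          # a weight array could bias this
    out = []
    for sym in production:
        out.extend(derive(sym, rng, depth + 1, max_depth))
    return out

rng = random.Random(7)
for _ in range(3):
    print(" ".join(derive("<query>", rng)))   # grammar-valid SELECT statements
```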

  20. An automatic method to generate domain-specific investigator networks using PubMed abstracts

    Directory of Open Access Journals (Sweden)

    Gwinn Marta

    2007-06-01

    Full Text Available Abstract Background Collaboration among investigators has become critical to scientific research. This includes ad hoc collaboration established through personal contacts as well as formal consortia established by funding agencies. Continued growth in online resources for scientific research and communication has promoted the development of highly networked research communities. Extending these networks globally requires identifying additional investigators in a given domain, profiling their research interests, and collecting current contact information. We present a novel strategy for building investigator networks dynamically and producing detailed investigator profiles using data available in PubMed abstracts. Results We developed a novel strategy to obtain detailed investigator information by automatically parsing the affiliation string in PubMed records. We illustrated the results by using a published literature database in human genome epidemiology (HuGE Pub Lit) as a test case. Our parsing strategy extracted country information from 92.1% of the affiliation strings in a random sample of PubMed records and in 97.0% of HuGE records, with accuracies of 94.0% and 91.0%, respectively. Institution information was parsed from 91.3% of the general PubMed records (accuracy 86.8%) and from 94.2% of HuGE PubMed records (accuracy 87.0%). We demonstrated the application of our approach to dynamic creation of investigator networks by creating a prototype information system containing a large database of PubMed abstracts relevant to human genome epidemiology (HuGE Pub Lit), indexed using PubMed medical subject headings converted to Unified Medical Language System concepts. Our method was able to identify 70–90% of the investigators/collaborators in three different human genetics fields; it also successfully identified 9 of 10 genetics investigators within the PREBIC network, an existing preterm birth research network. Conclusion We successfully created a
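
    The core parsing heuristic can be sketched as below. PubMed affiliation strings are roughly comma-separated from department down to country, so the last field approximates the country; the regular expression, field positions and example string are assumptions for illustration, not the paper's actual parser.

```python
import re

def parse_affiliation(affiliation):
    """Crude affiliation parse: first comma field as institution/department
    guess, last field as country guess, after stripping embedded e-mails."""
    affiliation = re.sub(r"\S+@\S+", "", affiliation).strip(" .")
    parts = [p.strip() for p in affiliation.split(",") if p.strip()]
    if not parts:
        return None, None
    return parts[0], parts[-1]

inst, country = parse_affiliation(
    "Department of Epidemiology, Some University, Atlanta, GA, USA.")
print(inst, "|", country)  # → Department of Epidemiology | USA
```

    A production system would add gazetteer lookups and disambiguation on top of such field splitting, which is where the reported accuracy figures come from.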

  1. ARP: Automatic rapid processing for the generation of problem dependent SAS2H/ORIGEN-s cross section libraries

    Energy Technology Data Exchange (ETDEWEB)

    Leal, L.C.; Hermann, O.W.; Bowman, S.M.; Parks, C.V.

    1998-04-01

    In this report, a methodology is described which serves as an alternative to the SAS2H path of the SCALE system to generate cross sections for point-depletion calculations with the ORIGEN-S code. ARP, Automatic Rapid Processing, is an algorithm that allows the generation of cross-section libraries suitable for the ORIGEN-S code by interpolation over pregenerated SAS2H libraries. The interpolations are carried out on the following variables: burnup, enrichment, and water density. The adequacy of the methodology is evaluated by comparing measured and computed spent fuel isotopic compositions for PWR and BWR systems.
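
    The interpolation step can be illustrated in one dimension. ARP interpolates over burnup, enrichment and water density, but a piecewise-linear sketch over burnup alone conveys the idea; the library points below are invented for illustration.

```python
def interp_cross_section(burnups, sigmas, burnup):
    """Piecewise-linear interpolation of a cross section vs burnup,
    over pre-generated library points (one axis of ARP's three)."""
    if not burnups[0] <= burnup <= burnups[-1]:
        raise ValueError("requested burnup outside pre-generated library range")
    for (b0, s0), (b1, s1) in zip(zip(burnups, sigmas),
                                  zip(burnups[1:], sigmas[1:])):
        if b0 <= burnup <= b1:
            frac = (burnup - b0) / (b1 - b0)
            return s0 + frac * (s1 - s0)

# invented library points: burnup (GWd/MTU) vs an effective cross section (barns)
burnups = [0.0, 10.0, 20.0, 30.0]
sigmas  = [1.20, 1.10, 1.05, 1.02]
print(interp_cross_section(burnups, sigmas, 15.0))  # halfway between 1.10 and 1.05
```

    The full method repeats this along the enrichment and moderator-density axes, which is what lets a small set of pre-generated SAS2H libraries cover a continuous parameter space.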

  2. Automatic SAR/optical cross-matching for GCP monograph generation

    Science.gov (United States)

    Nutricato, Raffaele; Morea, Alberto; Nitti, Davide Oscar; La Mantia, Claudio; Agrimano, Luigi; Samarelli, Sergio; Chiaradia, Maria Teresa

    2016-10-01

    Ground Control Points (GCP), automatically extracted from Synthetic Aperture Radar (SAR) images through 3D stereo analysis, can be effectively exploited for an automatic orthorectification of optical imagery if they can be robustly located in the basic optical images. The present study outlines a SAR/Optical cross-matching procedure that allows a robust alignment of radar and optical images, and consequently to derive automatically the corresponding sub-pixel position of the GCPs in the optical image in input, expressed as fractional pixel/line image coordinates. The cross-matching is performed in two subsequent steps, in order to progressively achieve better precision. The first step is based on the Mutual Information (MI) maximization between optical and SAR chips while the last one uses the Normalized Cross-Correlation as similarity metric. This work outlines the designed algorithmic solution and discusses the results derived over the urban area of Pisa (Italy), where more than ten COSMO-SkyMed Enhanced Spotlight stereo images with different beams and passes are available. The experimental analysis involves different satellite images, in order to evaluate the performances of the algorithm w.r.t. the optical spatial resolution. An assessment of the performances of the algorithm has been carried out, and errors are computed by measuring the distance between the GCP pixel/line position in the optical image, automatically estimated by the tool, and the "true" position of the GCP, visually identified by an expert user in the optical images.
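
    The second matching step can be sketched with a toy normalized cross-correlation (NCC). Real SAR/optical matching operates on 2-D image chips and is preceded by the mutual-information stage, but 1-D profiles are enough to show the metric; the arrays below are invented.

```python
import math

def ncc(a, b):
    """Normalized cross-correlation of two equal-length sequences, in [-1, 1]."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a) *
                    sum((y - mb) ** 2 for y in b))
    return num / den

template = [1.0, 3.0, 2.0, 5.0]
window   = [0.0, 1.0, 3.0, 2.0, 5.0, 4.0]
# slide the template over the search window and keep the best-scoring offset
scores = [ncc(template, window[i:i + len(template)])
          for i in range(len(window) - len(template) + 1)]
best = max(range(len(scores)), key=scores.__getitem__)
print(best, round(scores[best], 3))  # → 1 1.0 (exact match at offset 1)
```

    Fitting a parabola through the scores around the best offset is one common way to reach the sub-pixel precision the procedure reports.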

  3. Development of an Automatic Gain Controller Card for Next Generation EDFAs

    Institute of Scientific and Technical Information of China (English)

    C. Y. Liaw; T. H. Cheng; C. Lu; M.Akiyama; T.Sakai; A.Wada

    2003-01-01

    This paper describes a low cost automatic gain controller card that provides fast transient gain control to maintain the power of the surviving channels when the number of input channels to an erbium-doped fiber amplifier (EDFA)changes rapidly.

  4. Model-based automatic 3d building model generation by integrating LiDAR and aerial images

    Science.gov (United States)

    Habib, A.; Kwak, E.; Al-Durgham, M.

    2011-12-01

    Accurate, detailed, and up-to-date 3D building models are important for several applications such as telecommunication network planning, urban planning, and military simulation. Existing building reconstruction approaches can be classified according to the data sources they use (i.e., single versus multi-sensor approaches), the processing strategy (i.e., data-driven, model-driven, or hybrid), or the amount of user interaction (i.e., manual, semiautomatic, or fully automated). While it is obvious that 3D building models are important components of many applications, economical and automatic techniques for their generation that take advantage of the available multi-sensor data and combine processing strategies are still lacking. In this research, an automatic methodology for building modelling by integrating multiple images and LiDAR data is proposed. The objective of this research work is to establish a framework for automatic building generation by integrating data-driven and model-driven approaches while combining the advantages of image and LiDAR datasets.

  5. The STACEE Ground-Based Gamma-ray Observatory

    Science.gov (United States)

    Ragan, Ken

    2002-04-01

    The Solar Tower Atmospheric Cherenkov Effect Experiment (STACEE) is a ground-based instrument designed to study astrophysical sources of gamma rays in the energy range from 50 to 500 GeV using an array of heliostat mirrors at the National Solar Thermal Test Facility in New Mexico. The mirrors collect Cherenkov light generated by gamma-ray air showers and concentrate it onto cameras composed of photomultiplier tubes. The STACEE instrument is now complete, and uses a total of 64 heliostats. Prototype instruments, using smaller numbers of heliostats, have previously detected gamma emission from both the Crab Nebula and the Active Galactic Nucleus Mrk 421. The complete instrument has a lower threshold -- approximately 50 GeV -- than those prototypes due to superior triggering and electronics, including flash ADCs for every channel. We will discuss the performance of the complete instrument in its first full season of operation, and present preliminary results of selected observations.

  6. MAGE (M-file/Mif Automatic GEnerator): A graphical interface tool for automatic generation of Object Oriented Micromagnetic Framework configuration files and Matlab scripts for results analysis

    Science.gov (United States)

    Chęciński, Jakub; Frankowski, Marek

    2016-10-01

    We present a tool for fully automated generation of both simulation configuration files (Mif) and Matlab scripts for automated data analysis, dedicated to the Object Oriented Micromagnetic Framework (OOMMF). We introduce an extended graphical user interface (GUI) that allows fast, error-proof and easy creation of Mifs, without the programming skills usually required for manual Mif writing. With MAGE we provide OOMMF extensions that complement it with magnetoresistance and spin-transfer-torque calculations, as well as selection of local magnetization data for output. Our software allows the creation of advanced simulation conditions such as simultaneous parameter sweeps and synchronous excitation application. Furthermore, since the output of such simulations can be long and complicated, we provide another GUI allowing automated creation of Matlab scripts suitable for analysis of such data with Fourier and wavelet transforms as well as user-defined operations.

  7. Operational optical turbulence forecast for the service mode of top-class ground based telescopes

    Science.gov (United States)

    Masciadri, Elena; Lascaux, Franck; Turchi, Alessio; Fini, Luca

    2016-07-01

    In this contribution we present the most relevant results obtained in the context of a feasibility study (MOSE) undertaken for ESO. The principal aim of the project was to quantify the performance of a non-hydrostatic mesoscale atmospheric model (Astro-Meso-NH code) in forecasting all the main atmospheric parameters relevant for ground-based astronomical observations as well as the optical turbulence (CN2 and associated integrated astroclimatic parameters) above Cerro Paranal (site of the VLT) and Cerro Armazones (site of the E-ELT). A detailed analysis of the success rate of the predictive capabilities of the system has been carried out for all the astroclimatic as well as the atmospheric parameters. Considering the excellent results obtained, this study demonstrated the feasibility of implementing on these two sites an automatic system, run nightly in an operational configuration, to support the scheduling of scientific programs as well as of astronomical facilities (particularly those supported by AO systems) at the VLT and the E-ELT. At the end of 2016 a new project will start for the implementation of a demonstrator of an operational system to be run on the two ESO sites. The fact that the system can be run simultaneously on the two sites is an appealing ancillary feature. Our team is also responsible for the implementation of a similar automatic system at Mt. Graham, site of the LBT (ALTA Project). Our system/method will therefore permit a step ahead in the framework of the Service Mode for new-generation telescopes. Among the most exciting results achieved, we cite the fact that we proved able to forecast CN2 profiles with a vertical resolution as high as 150 m. Such a feature is particularly crucial for all WFAO systems, which require such detailed information on the OT vertical stratification over the whole 20 km above the ground. This important achievement tells us that all the WFAO systems can rely on automatic

  8. Solution Approach to Automatic Generation Control Problem Using Hybridized Gravitational Search Algorithm Optimized PID and FOPID Controllers

    Directory of Open Access Journals (Sweden)

    DAHIYA, P.

    2015-05-01

    Full Text Available This paper presents the application of a hybrid opposition-based disruption operator in the gravitational search algorithm (DOGSA) to solve the automatic generation control (AGC) problem of a four-area hydro-thermal-gas interconnected power system. The proposed DOGSA approach combines the advantages of opposition-based learning, which enhances the speed of convergence, and the disruption operator, which has the ability to further explore and exploit the search space of the standard gravitational search algorithm (GSA). The addition of these two concepts to GSA increases its flexibility for solving complex optimization problems. This paper addresses the design and performance analysis of DOGSA-based proportional integral derivative (PID) and fractional order proportional integral derivative (FOPID) controllers for the automatic generation control problem. The proposed approaches are demonstrated by comparing the results with the standard GSA, opposition-learning-based GSA (OGSA) and disruption-based GSA (DGSA). A sensitivity analysis is also carried out to study the robustness of the DOGSA-tuned controllers in accommodating variations in operating load conditions, tie-line synchronizing coefficient, and time constants of the governor and turbine. Further, the approaches are extended to a more realistic power system model by considering physical constraints such as the thermal turbine generation rate constraint, speed governor dead band and time delay.
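
    A minimal sketch of the kind of PID loop being tuned, acting on a toy first-order plant, is given below. The gains, plant constants and the simple Euler integration are illustrative assumptions; the paper tunes such gains with DOGSA on a full multi-area model, and the FOPID variant generalizes the integral and derivative terms to fractional orders.

```python
# Toy AGC-flavoured loop: a PID controller drives a frequency deviation
# (first-order stand-in plant) back toward zero. All constants are invented.

def simulate_pid(kp, ki, kd, steps=200, dt=0.1):
    error_prev, integral = 0.0, 0.0
    freq_dev = 0.5                      # initial frequency deviation (toy value)
    for _ in range(steps):
        error = -freq_dev               # regulate deviation to zero
        integral += error * dt
        derivative = (error - error_prev) / dt
        u = kp * error + ki * integral + kd * derivative
        error_prev = error
        # toy plant: deviation decays slowly and responds to the control input
        freq_dev += dt * (-0.1 * freq_dev + 0.5 * u)
    return freq_dev

final = simulate_pid(kp=2.0, ki=1.0, kd=0.1)
print(abs(final))                       # magnitude should approach zero
```

    A metaheuristic such as DOGSA would search over (kp, ki, kd) to minimize a cost built from this kind of simulated response, e.g. the integral of absolute error.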

  9. A proposed metamodel for the implementation of object oriented software through the automatic generation of source code

    Directory of Open Access Journals (Sweden)

    CARVALHO, J. S. C.

    2008-12-01

    Full Text Available During the development of software, one of the most visible risks and perhaps the biggest implementation obstacle relates to time management. All delivery deadlines for software versions must be met, but this is not always possible, sometimes due to delays in coding. This paper presents a metamodel for software implementation, which will give rise to a development tool for automatic generation of source code, in order to make any development pattern transparent to the programmer, significantly reducing the time spent coding the artifacts that make up the software.
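
    The metamodel-to-code idea can be sketched as follows: a class described as data is turned into source text, which is then real, executable code, so the programmer specifies what the artifact is and the generator handles the boilerplate. The metamodel fields and template below are invented for illustration.

```python
# A "metamodel" as plain data describing the artifact to generate.
METAMODEL = {"name": "Customer", "fields": ["id", "name", "email"]}

def generate_class(model):
    """Turn the metamodel into Python source for a simple data class."""
    lines = [f"class {model['name']}:"]
    params = ", ".join(model["fields"])
    lines.append(f"    def __init__(self, {params}):")
    for field in model["fields"]:
        lines.append(f"        self.{field} = {field}")
    return "\n".join(lines)

source = generate_class(METAMODEL)
print(source)
namespace = {}
exec(source, namespace)            # the generated artifact is real code
customer = namespace["Customer"](1, "Ada", "ada@example.com")
print(customer.name)               # → Ada
```

    A real tool would emit files for a target language and development pattern (e.g. accessors, persistence hooks) from richer metamodels, but the generation step is the same in kind.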

  10. Algorithm of automatic generation of technology process and process relations of automotive wiring harnesses

    Institute of Scientific and Technical Information of China (English)

    XU Benzhu; ZHU Jiman; LIU Xiaoping

    2012-01-01

    Identifying each process and their constraint relations from complex wiring harness drawings quickly and accurately is the basis for formulating process routes. Using knowledge of automotive wiring harnesses and the characteristics of wiring harness components, we established a model of the wiring harness graph. We then investigate an algorithm for identifying technology processes automatically; finally, we describe the relationships between processes by introducing a constraint matrix, in order to lay a good foundation for harness process planning and production scheduling.
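
    The constraint-matrix idea can be sketched with a topological sort: entry C[i][j] = 1 meaning process i must precede process j, so any linear order consistent with the matrix is a feasible process route. The process names and matrix below are invented for illustration; the paper derives the matrix from the harness drawing automatically.

```python
def process_order(names, constraint):
    """Kahn-style topological sort of processes under a precedence matrix."""
    n = len(names)
    indegree = [sum(constraint[i][j] for i in range(n)) for j in range(n)]
    ready = [j for j in range(n) if indegree[j] == 0]
    order = []
    while ready:
        i = ready.pop(0)
        order.append(names[i])
        for j in range(n):
            if constraint[i][j]:
                indegree[j] -= 1
                if indegree[j] == 0:
                    ready.append(j)
    if len(order) != n:
        raise ValueError("constraint matrix contains a cycle")
    return order

names = ["cut wires", "crimp terminals", "insert into connector", "tape bundle"]
constraint = [
    [0, 1, 0, 0],   # cutting precedes crimping
    [0, 0, 1, 0],   # crimping precedes connector insertion
    [0, 0, 0, 1],   # insertion precedes taping
    [0, 0, 0, 0],
]
print(process_order(names, constraint))
```

    A cycle in the matrix signals an inconsistent set of constraints, which is exactly the kind of error automatic extraction from drawings needs to detect before scheduling.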

  11. Automatic Generation of Structural Building Descriptions from 3D Point Cloud Scans

    DEFF Research Database (Denmark)

    Ochmann, Sebastian; Vock, Richard; Wessel, Raoul

    2013-01-01

    We present a new method for automatic semantic structuring of 3D point clouds representing buildings. In contrast to existing approaches which either target the outside appearance like the facade structure or rather low-level geometric structures, we focus on the building’s interior, using indoor scans to derive high-level architectural entities like rooms and doors. Starting with a registered 3D point cloud, we probabilistically model the affiliation of each measured point to a certain room in the building. We solve the resulting clustering problem using an iterative algorithm that relies......

  12. Automatic generation of the index of productive syntax for child language transcripts.

    Science.gov (United States)

    Hassanali, Khairun-nisa; Liu, Yang; Iglesias, Aquiles; Solorio, Thamar; Dollaghan, Christine

    2014-03-01

    The index of productive syntax (IPSyn; Scarborough, Applied Psycholinguistics 11:1-22, 1990) is a measure of syntactic development in child language that has been used in research and clinical settings to investigate the grammatical development of various groups of children. However, IPSyn is mostly calculated manually, which is an extremely laborious process. In this article, we describe the AC-IPSyn system, which automatically calculates the IPSyn score for child language transcripts using natural language processing techniques. Our results show that the AC-IPSyn system performs at levels comparable to scores computed manually. The AC-IPSyn system can be downloaded from www.hlt.utdallas.edu/~nisa/ipsyn.html.

  14. Automatic Generation of Machine Emulators: Efficient Synthesis of Robust Virtual Machines for Legacy Software Migration

    DEFF Research Database (Denmark)

    Franz, Michael; Gal, Andreas; Probst, Christian

    2006-01-01

    As older mainframe architectures become obsolete, the corresponding legacy software is increasingly executed via platform emulators running on top of more modern commodity hardware. These emulators are virtual machines that often include a combination of interpreters and just-in-time compilers. Implementing interpreters and compilers for each combination of emulated and target platform independently of each other is a redundant and error-prone task. We describe an alternative approach that automatically synthesizes specialized virtual-machine interpreters and just-in-time compilers, which then execute on top of an existing software portability platform such as Java. The result is a considerably reduced implementation effort.

  15. A Prototype Expert System for Automatic Generation of Image Processing Programs

    Institute of Scientific and Technical Information of China (English)

    宋茂强; Felix Grimm; et al.

    1991-01-01

    A prototype expert system for generating image processing programs using the subroutine package SPIDER is described in this paper. Based on an interactive dialog, the system can generate a complete application program using SPIDER routines.

  16. Ground-based observations of Kepler asteroseismic targets

    DEFF Research Database (Denmark)

    Uytterhoeven, K.; Karoff, Christoffer

    2010-01-01

    We present the ground-based activities within the different working groups of the Kepler Asteroseismic Science Consortium (KASC). The activities aim at the systematic characterization of the 5000+ KASC targets, and at the collection of ground-based follow-up time-series data of selected promising...

  17. Automatic Extraction of Destinations, Origins and Route Parts from Human Generated Route Directions

    Science.gov (United States)

    Zhang, Xiao; Mitra, Prasenjit; Klippel, Alexander; Maceachren, Alan

    Researchers from the cognitive and spatial sciences are studying text descriptions of movement patterns in order to examine how humans communicate and understand spatial information. In particular, route directions offer a rich source of information on how cognitive systems conceptualize movement patterns by segmenting them into meaningful parts. Route directions are composed using a plethora of cognitive spatial organization principles: changing levels of granularity, hierarchical organization, incorporation of cognitively and perceptually salient elements, and so forth. Identifying such information in text documents automatically is crucial for enabling machine-understanding of human spatial language. The benefits are: a) creating opportunities for large-scale studies of human linguistic behavior; b) extracting and georeferencing salient entities (landmarks) that are used by human route direction providers; c) developing methods to translate route directions to sketches and maps; and d) enabling queries on large corpora of crawled/analyzed movement data. In this paper, we introduce our approach and implementations that bring us closer to the goal of automatically processing linguistic route directions. We report on research directed at one part of the larger problem, that is, extracting the three most critical parts of route directions and movement patterns in general: origin, destination, and route parts. We use machine-learning based algorithms to extract these parts of routes, including, for example, destination names and types. We prove the effectiveness of our approach in several experiments using hand-tagged corpora.
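    The extraction task described above targets three parts of a route: origin, destination, and route parts. The paper uses machine-learned extractors; as a minimal illustration only, a naive pattern-based stand-in for one simple sentence form ("from X to Y via Z") might look like this (the function name and sentence are hypothetical):

```python
# Naive pattern-based stand-in for extracting origin, destination and route
# parts from a single direction sentence. The actual system uses machine
# learning; this regex handles only one simple "from X to Y (via Z)" form.
import re

def extract_route_parts(sentence):
    m = re.search(r"from (.+?) to (.+?)(?: via (.+?))?\.", sentence)
    if not m:
        return None
    origin, destination, route = m.groups()
    return {"origin": origin, "destination": destination, "route": route}

parts = extract_route_parts("Drive from the library to the stadium via Main Street.")
```

    Real route directions are far more varied than this pattern, which is exactly why the paper resorts to machine-learning based extraction.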

  18. Automatic selection of informative sentences: The sentences that can generate multiple choice questions

    Directory of Open Access Journals (Sweden)

    Mukta Majumder

    2014-12-01

    Full Text Available Traditional education cannot meet the expectations and requirements of a Smart City; it requires more advanced forms such as active learning and ICT-based education. Multiple choice questions (MCQs) play an important role in educational assessment and active learning, which has a key role in Smart City education. MCQs are effective for assessing the understanding of well-defined concepts. Only a fraction of all the sentences of a text contain well-defined concepts or information that can be asked as an MCQ. These informative sentences must be identified first in order to prepare multiple choice questions manually or automatically. In this paper we propose a technique for automatic identification of such informative sentences that can act as the basis of MCQs. The technique is based on parse structure similarity. A reference set of parse structures is compiled with the help of existing MCQs. The parse structure of a new sentence is compared with the reference structures, and if similarity is found the sentence is considered a potential candidate. Next, a rule-based post-processing module works on these potential candidates to select the final set of informative sentences. The proposed approach is tested in the sports domain, where many MCQs are readily available for preparing the reference set of structures. The quality of the system-selected sentences is evaluated manually. The experimental results show that the proposed technique is quite promising.
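    The comparison step can be sketched as follows. As a simplifying assumption, full parse structures are replaced here by part-of-speech tag sequences, and similarity is measured with difflib; the tag patterns and the threshold are illustrative, not the paper's:

```python
# Minimal sketch of informative-sentence selection by structure similarity.
# Parse trees are stood in for by POS-tag sequences; a sentence qualifies
# if its tag pattern is sufficiently similar to any reference pattern
# compiled from sentences underlying existing MCQs.
from difflib import SequenceMatcher

REFERENCE_PATTERNS = [
    ["NNP", "VBD", "CD", "NNS", "IN", "NNP"],   # e.g. "X scored 3 goals against Y"
    ["NNP", "VBD", "DT", "NN", "IN", "CD"],     # e.g. "X won the cup in 1998"
]

def similarity(tags_a, tags_b):
    return SequenceMatcher(None, tags_a, tags_b).ratio()

def is_candidate(tags, threshold=0.8):
    """A sentence is a potential MCQ basis if its tag pattern is close to any reference."""
    return any(similarity(tags, ref) >= threshold for ref in REFERENCE_PATTERNS)

cand = is_candidate(["NNP", "VBD", "CD", "NNS", "IN", "NNP"])  # matches a reference
noncand = is_candidate(["PRP", "VBZ", "JJ"])                   # unrelated structure
```

    The rule-based post-processing module described in the abstract would then filter these candidates further.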

  19. Movable Ground Based Recovery System for Reusable Space Flight Hardware

    Science.gov (United States)

    Sarver, George L. (Inventor)

    2013-01-01

    A reusable space flight launch system is configured to eliminate complex descent and landing systems from the space flight hardware and move them to maneuverable ground based systems. Precision landing of the reusable space flight hardware is enabled using a simple, lightweight aerodynamic device on board the flight hardware, such as a parachute, and one or more translating ground based vehicles, such as a hovercraft, that include active speed, orientation and directional control. The ground based vehicle maneuvers itself into position beneath the descending flight hardware, matching its speed and direction, and captures the flight hardware. The ground based vehicle will contain propulsion, command and GN&C functionality as well as cushioning and retaining hardware for the space flight hardware landing. The ground based vehicle propulsion system enables longitudinal and transverse maneuverability independent of its physical heading.

  20. Wind power integration into the automatic generation control of power systems with large-scale wind power

    DEFF Research Database (Denmark)

    Basit, Abdul; Hansen, Anca Daniela; Altin, Müfit

    2014-01-01

    Transmission system operators have an increased interest in the active participation of wind power plants (WPPs) in the power balance control of power systems with large wind power penetration. The emphasis in this study is on the integration of WPPs into the automatic generation control (AGC) of the power system. The present paper proposes a coordinated control strategy for the AGC between combined heat and power plants (CHPs) and WPPs to enhance the security and the reliability of power system operation in the case of large wind power penetration. The proposed strategy, described and exemplified for the future Danish power system, takes the hour-ahead regulating power plan for generation and power exchange with neighbouring power systems into account. The performance of the proposed strategy for coordinated secondary control is assessed and discussed by means of simulations for different...

  1. A Solar Automatic Tracking System that Generates Power for Lighting Greenhouses

    Directory of Open Access Journals (Sweden)

    Qi-Xun Zhang

    2015-07-01

    Full Text Available In this study we design and test a novel solar tracking generation system. Moreover, we show that this system can be successfully used as an advanced solar power source to generate power in greenhouses. The system was developed after taking into consideration the geography, climate, and other environmental factors of northeast China. The experimental design of this study included the following steps: (i) the novel solar tracking generation system was measured, and its performance was analyzed; (ii) the system configuration and operation principles were evaluated; (iii) the performance of this power generation system and the solar irradiance were measured according to local time and conditions; (iv) the main factors affecting system performance were analyzed; and (v) the amount of power generated by the solar tracking system was compared with the power generated by fixed solar panels. The experimental results indicated that, compared to the power generated by fixed solar panels, the solar tracking system generated about 20% to 25% more power. In addition, the performance of this novel power generating system was found to be closely associated with solar irradiance. Therefore, the solar tracking system provides a new approach to power generation in greenhouses.
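    The geometric origin of the tracking gain can be illustrated with a toy clear-sky model: a tracker always faces the sun, while a fixed panel loses the cosine of the incidence angle. Everything below is an illustrative assumption, not a measurement from the study, and this geometry-only sketch ignores diffuse irradiance and air-mass effects, so it overstates the measured 20% to 25% gain:

```python
# Toy comparison of a tracking panel vs a fixed panel facing the noon sun.
# The sun sweeps a half-circle over a 12-hour day under constant direct
# normal irradiance; the fixed panel collects the projected component.
import math

def daily_energy(tracking, steps=1000):
    """Relative energy collected over the day (midpoint rule)."""
    total = 0.0
    for i in range(steps):
        hour_angle = math.pi * (i + 0.5) / steps  # 0..pi across the day
        if tracking:
            total += 1.0                  # tracker always normal to the sun
        else:
            total += math.sin(hour_angle) # fixed panel: cosine-of-incidence loss
    return total / steps

gain = daily_energy(True) / daily_energy(False) - 1.0  # fractional tracking gain
```

    The fixed panel's average here is 2/pi of the tracker's, so this idealized model predicts a gain near 57%; real-world diffuse light and atmospheric losses pull the measured advantage down toward the reported 20% to 25%.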

  2. Automatic generation of reaction energy databases from highly accurate atomization energy benchmark sets.

    Science.gov (United States)

    Margraf, Johannes T; Ranasinghe, Duminda S; Bartlett, Rodney J

    2017-03-31

    In this contribution, we discuss how reaction energy benchmark sets can automatically be created from arbitrary atomization energy databases. As an example, over 11 000 reaction energies derived from the W4-11 database, as well as some relevant subsets are reported. Importantly, there is only very modest computational overhead involved in computing >11 000 reaction energies compared to 140 atomization energies, since the rate-determining step for either benchmark is performing the same 140 quantum chemical calculations. The performance of commonly used electronic structure methods for the new database is analyzed. This allows investigating the relationship between the performances for atomization and reaction energy benchmarks based on an identical set of molecules. The atomization energy is found to be a weak predictor for the overall usefulness of a method. The performance of density functional approximations in light of the number of empirically optimized parameters used in their design is also discussed.
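    The bookkeeping that makes this automatic derivation cheap is that, for a balanced reaction, the free-atom reference energies cancel, so a reaction energy is just a signed sum of atomization energies. A minimal sketch (molecule names and energy values are illustrative, not taken from W4-11):

```python
# Reaction energy from atomization energies.
# With E(molecule) = sum(E of free atoms) - AE(molecule), a balanced
# reaction's atom sums cancel, leaving:
#   dE_rxn = sum(AE of reactants) - sum(AE of products)

def reaction_energy(atomization, reactants, products):
    """reactants/products: dicts mapping molecule name -> stoichiometric coefficient."""
    side_sum = lambda side: sum(coef * atomization[mol] for mol, coef in side.items())
    return side_sum(reactants) - side_sum(products)

# Illustrative atomization energies in kcal/mol (hypothetical values).
AE = {"H2": 104.2, "O2": 118.0, "H2O": 219.3}

# 2 H2 + O2 -> 2 H2O
dE = reaction_energy(AE, {"H2": 2, "O2": 1}, {"H2O": 2})
```

    This is why computing >11 000 reaction energies adds only modest overhead over the 140 atomization energies: the expensive quantum chemical calculations are the same in either case.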

  3. UAV Aerial Survey: Accuracy Estimation for Automatically Generated Dense Digital Surface Model and Orthophoto Plan

    Science.gov (United States)

    Altyntsev, M. A.; Arbuzov, S. A.; Popov, R. A.; Tsoi, G. V.; Gromov, M. O.

    2016-06-01

    A dense digital surface model is one of the products generated from UAV aerial survey data. Today more and more specialized software packages are supplied with modules for generating such models. The procedure for dense digital surface model generation can be completely or partly automated. Due to the lack of a reliable criterion for accuracy estimation, it is rather complicated to judge the validity of the generated models. One such criterion can be mobile laser scanning data, used as a source for detailed accuracy estimation of dense digital surface model generation. These data may also be used to estimate the accuracy of digital orthophoto plans created from UAV aerial survey data. The results of accuracy estimation for both kinds of products are presented in the paper.

  4. Inhibition of bradycardia pacing caused by far-field atrial sensing in a third-generation cardioverter defibrillator with an automatic gain feature.

    Science.gov (United States)

    Curwin, J H; Roelke, M; Ruskin, J N

    1996-01-01

    The diagnostic accuracy of implantable cardioverter defibrillators may be improved by automatically adjusting gain algorithms, which in general reduce the likelihood of oversensing while maintaining the ability to detect the low amplitude signals associated with ventricular fibrillation. We present a patient with a third-generation device who developed prolonged ventricular asystole arising as a complication of the automatic gain feature. During asystole the device automatically increased sensitivity in order to prevent undersensing of ventricular fibrillation, which in this case resulted in far-field sensing of atrial activity and inhibition of ventricular pacing.

  5. Automatic Mesh Generation of Hybrid Mesh on Valves in Multiple Positions in Feedline Systems

    Science.gov (United States)

    Ross, Douglass H.; Ito, Yasushi; Dorothy, Fredric W.; Shih, Alan M.; Peugeot, John

    2010-01-01

    Fluid flow simulations through a valve often require evaluation of the valve in multiple opening positions. A mesh has to be generated for the valve in each position, and compounding the problem is the fact that the valve is typically part of a larger feedline system. In this paper, we propose to develop a system to create meshes for feedline systems with parametrically controlled valve openings. Herein we outline two approaches to generate the meshes for a valve in a feedline system at multiple positions. Two issues must be addressed: the first is the creation of the mesh on the valve in multiple positions; the second is the generation of the mesh for the total feedline system including the valve. For generation of the mesh on the valve, we describe the use of topology matching and mesh generation parameter transfer. For generation of the total feedline system, we describe two solutions that we have implemented. In both cases the valve is treated as a component in the feedline system. In the first method, the geometry of the valve in the feedline system is replaced with a valve at a different opening position. Geometry is created to connect the valve to the feedline system. Then topology for the valve is created, and the portion of the topology for the valve is matched to the standard valve in a different position. The mesh generation parameters are transferred, and then the volume mesh for the whole feedline system is generated. The second method enables the user to generate the volume mesh on the valve in multiple open positions external to the feedline system and to insert it into the volume mesh of the feedline system, reducing the amount of computer time required for mesh generation because only two small volume meshes connecting the valve to the feedline mesh need to be updated.

  6. An automatic generation of non-uniform mesh for CFD analyses of image-based multiscale human airway models

    Science.gov (United States)

    Miyawaki, Shinjiro; Tawhai, Merryn H.; Hoffman, Eric A.; Lin, Ching-Long

    2014-11-01

    The authors have developed a method to automatically generate a non-uniform CFD mesh for image-based human airway models. The sizes of the generated tetrahedral elements vary in both the radial and longitudinal directions to account for the boundary layer and the multiscale nature of pulmonary airflow. The proposed method takes advantage of our previously developed centerline-based geometry reconstruction method. In order to generate the mesh branch by branch in parallel, we used the open-source programs Gmsh and TetGen for surface and volume meshes, respectively. Both programs can specify element sizes by means of a background mesh. The size of an arbitrary element in the domain is a function of the wall distance, the element size on the wall, and the element size at the center of the airway lumen. The element sizes on the wall are computed based on local flow rate and airway diameter. The total number of elements in the non-uniform mesh (10 M) was about half of that in the uniform mesh, although the computational time for the non-uniform mesh was about twice as long (170 min). The proposed method generates CFD meshes with fine elements near the wall and a smooth variation of element size in the longitudinal direction, which are required, e.g., for simulations with high flow rates. NIH Grants R01-HL094315, U01-HL114494, and S10-RR022421. Computer time provided by XSEDE.
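    The element-size rule described, interpolating between a fine wall size and a coarser lumen-center size as a function of wall distance, can be sketched as follows. The linear blend and the numeric sizes are assumptions for illustration; the abstract does not give the exact functional form:

```python
# Background-mesh size function for a non-uniform airway mesh (sketch).
# Element size grows from a fine value at the wall to a coarser value at
# the airway centerline; here a simple linear blend in wall distance.

def element_size(wall_distance, radius, size_wall, size_center):
    """wall_distance: distance from the airway wall; radius: local airway radius."""
    t = min(max(wall_distance / radius, 0.0), 1.0)  # clamp to [0, 1]
    return size_wall + t * (size_center - size_wall)

# Fine near the wall, coarse at the lumen center (illustrative sizes in mm).
near_wall = element_size(0.05, 5.0, 0.1, 1.0)
center = element_size(5.0, 5.0, 0.1, 1.0)
```

    In the actual method the wall size itself would additionally depend on local flow rate and airway diameter, as the abstract states.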

  7. Effective System for Automatic Bundle Block Adjustment and Ortho Image Generation from Multi Sensor Satellite Imagery

    Science.gov (United States)

    Akilan, A.; Nagasubramanian, V.; Chaudhry, A.; Reddy, D. Rajesh; Sudheer Reddy, D.; Usha Devi, R.; Tirupati, T.; Radhadevi, P. V.; Varadan, G.

    2014-11-01

    Block adjustment is a technique for large area mapping with images obtained from different remote sensing satellites. The challenge in this process is to handle a huge number of satellite images from different sources, with different resolutions and accuracies, at the system level. This paper explains a system with various tools and techniques to effectively handle the end-to-end chain in large area mapping and production with a good level of automation, and with provisions for intuitive analysis of final results in 3D and 2D environments. In addition, the interface for using open-source ortho and DEM references, viz. ETM, SRTM etc., and for displaying ESRI shapes for the image footprints is explained. Rigorous theory, mathematical modelling, workflow automation and sophisticated software engineering tools are included to ensure high photogrammetric accuracy and productivity. Major building blocks like the Georeferencing, Geo-capturing and Geo-Modelling tools included in the block adjustment solution are explained in this paper. To provide an optimal bundle block adjustment solution with high precision results, the system has been optimized in many stages to fully utilize hardware resources. The robustness of the system is ensured by handling failures in the automatic procedure and saving the process state at every stage for subsequent restoration from the point of interruption. The results obtained from various stages of the system are presented in the paper.

  8. Gene prediction using the Self-Organizing Map: automatic generation of multiple gene models

    Directory of Open Access Journals (Sweden)

    Smith Terry J

    2004-03-01

    Full Text Available Background: Many current gene prediction methods use only one model to represent protein-coding regions in a genome, and so are less likely to predict the location of genes that have an atypical sequence composition. It is likely that future improvements in gene finding will involve the development of methods that can adequately deal with intra-genomic compositional variation. Results: This work explores a new approach to gene prediction, based on the Self-Organizing Map, which has the ability to automatically identify multiple gene models within a genome. The current implementation, named RescueNet, uses relative synonymous codon usage as the indicator of protein-coding potential. Conclusions: While its raw accuracy rate can be lower than that of other methods, RescueNet consistently identifies some genes that other methods do not, and should therefore be of interest to gene-prediction software developers and genome annotation teams alike. RescueNet is recommended for use in conjunction with, or as a complement to, other gene prediction methods.
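    The coding-potential indicator RescueNet uses, relative synonymous codon usage (RSCU), is the observed count of a codon scaled by the number of synonymous codons for its amino acid. A minimal computation with a toy two-amino-acid codon table (a deliberate subset of the real genetic code) might look like:

```python
# Relative synonymous codon usage (RSCU):
#   RSCU(c) = n_syn * count(c) / sum(counts over synonymous codons)
# RSCU = 1 means the codon is used exactly as often as expected under
# uniform synonymous usage. Toy table: lysine plus a subset of leucine's
# six codons (illustrative, not the full genetic code).
from collections import Counter

SYNONYMS = {
    "K": ["AAA", "AAG"],                # lysine
    "L": ["TTA", "TTG", "CTT", "CTG"],  # leucine (subset)
}

def rscu(codon_counts):
    out = {}
    for codons in SYNONYMS.values():
        total = sum(codon_counts.get(c, 0) for c in codons)
        for c in codons:
            out[c] = len(codons) * codon_counts.get(c, 0) / total if total else 0.0
    return out

counts = Counter({"AAA": 6, "AAG": 2, "TTA": 1, "TTG": 1, "CTT": 1, "CTG": 1})
values = rscu(counts)
```

    A gene whose RSCU profile deviates from a genome's dominant model is exactly the kind of atypical gene that a single-model predictor tends to miss and that RescueNet's multiple models target.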

  9. Design of an optimal SMES for automatic generation control of two-area thermal power system using Cuckoo search algorithm

    Directory of Open Access Journals (Sweden)

    Sabita Chaine

    2015-05-01

    Full Text Available This work presents a methodology adopted to tune the controller parameters of a superconducting magnetic energy storage (SMES) system in the automatic generation control (AGC) of a two-area thermal power system. The gains of the integral controllers of the AGC loop, the proportional controller of the SMES loop, and the gains of the current feedback loop of the inductor in the SMES are optimized simultaneously in order to achieve a desired performance. The recently proposed intelligent technique known as the Cuckoo search algorithm (CSA) is applied for optimization. The sensitivity and robustness of the tuned gains, tested at different operating conditions, prove the effectiveness of fast-acting energy storage devices like SMES in damping out oscillations in a power system when their controllers are properly tuned.
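    Cuckoo search itself is a general metaheuristic: candidate "nests" hold parameter vectors, new candidates are produced by heavy-tailed (Lévy-flight-style) random steps, and worse nests are replaced or abandoned. A simplified sketch follows; the quadratic objective is a stand-in for the paper's AGC performance index, and the step-size, nest-count and abandonment scheme are illustrative simplifications, not the canonical algorithm:

```python
# Simplified cuckoo-search-style optimizer (sketch). Nests hold candidate
# gain vectors; new candidates come from heavy-tailed random steps and
# replace a randomly chosen worse nest; the worst nest is abandoned each
# iteration. Objective: toy quadratic with minimum at (1.0, 2.0).
import random

random.seed(1)

def cost(x):
    return (x[0] - 1.0) ** 2 + (x[1] - 2.0) ** 2

def levy_step(scale=0.5):
    # Crude heavy-tailed step: a ratio of normals approximates a Cauchy draw.
    return scale * random.gauss(0, 1) / max(abs(random.gauss(0, 1)), 1e-6)

def cuckoo_search(n_nests=15, iters=300):
    nests = [[random.uniform(-5, 5), random.uniform(-5, 5)] for _ in range(n_nests)]
    for _ in range(iters):
        i = random.randrange(n_nests)
        new = [v + levy_step() for v in nests[i]]      # lay a cuckoo egg
        j = random.randrange(n_nests)                   # compare with a random nest
        if cost(new) < cost(nests[j]):
            nests[j] = new
        nests.sort(key=cost)                            # abandon the worst nest
        nests[-1] = [random.uniform(-5, 5), random.uniform(-5, 5)]
    return min(nests, key=cost)

best = cuckoo_search()
```

    In the paper's setting the decision vector would hold the AGC integral gains, the SMES proportional gain and the inductor current-feedback gains, and the cost would be evaluated by simulating the two-area system response.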

  10. Automatic landmark generation for deformable image registration evaluation for 4D CT images of lung

    Science.gov (United States)

    Vickress, J.; Battista, J.; Barnett, R.; Morgan, J.; Yartsev, S.

    2016-10-01

    Deformable image registration (DIR) has become a common tool in medical imaging across both diagnostic and treatment specialties, but the methods used offer varying levels of accuracy. Evaluation of DIR is commonly performed using manually selected landmarks, which is subjective, tedious and time-consuming. We propose a semi-automated method that saves time and provides accuracy comparable to manual selection. Three landmarking methods including manual (with two independent observers), scale invariant feature transform (SIFT), and SIFT with manual editing (SIFT-M) were tested on 10 thoracic 4DCT image studies corresponding to the 0% and 50% phases of respiration. Results of each method were evaluated against a gold standard (GS) landmark set comparing both mean and proximal landmark displacements. The proximal method compares the local deformation magnitude between a test landmark pair and the closest GS pair. Statistical analysis was done using an intra-class correlation (ICC) between test and GS displacement values. The creation time per landmark pair was 22, 34, 2.3, and 4.3 s for observers 1 and 2, SIFT, and SIFT-M methods respectively. Across 20 lungs from the 10 CT studies, the ICC values between the GS and observer 1 and 2, SIFT, and SIFT-M methods were 0.85, 0.85, 0.84, and 0.82 for mean lung deformation, and 0.97, 0.98, 0.91, and 0.96 for proximal landmark deformation, respectively. SIFT and SIFT-M methods have an accuracy that is comparable to manual methods when tested against a GS landmark set while saving 90% of the time. The number and distribution of landmarks significantly affected the analysis as manifested by the different results for mean deformation and proximal landmark deformation methods. Automatic landmark methods offer a promising alternative to manual landmarking, if the quantity, quality and distribution of landmarks can be optimized for the intended application.
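    The "proximal" comparison described above, matching each test landmark pair to the closest gold-standard pair and comparing their local deformation magnitudes, can be sketched as follows (coordinates are illustrative, not from the study):

```python
# Sketch of the proximal evaluation: each test landmark pair (start, end)
# is matched to the gold-standard (GS) pair whose start point is nearest,
# and the absolute difference of their deformation magnitudes is recorded.
import math

def magnitude(pair):
    """Deformation magnitude of a landmark pair (start point, end point)."""
    return math.dist(pair[0], pair[1])

def proximal_errors(test_pairs, gs_pairs):
    errors = []
    for tp in test_pairs:
        nearest = min(gs_pairs, key=lambda gp: math.dist(gp[0], tp[0]))
        errors.append(abs(magnitude(tp) - magnitude(nearest)))
    return errors

# Illustrative 0% -> 50% phase landmark pairs.
gs = [((0, 0, 0), (1, 0, 0)), ((10, 0, 0), (10, 2, 0))]
test = [((0.1, 0, 0), (1.1, 0, 0)), ((9.9, 0, 0), (9.9, 1.5, 0))]
errs = proximal_errors(test, gs)
```

    Unlike the mean-displacement comparison, this local matching is sensitive to where each landmark sits, which is consistent with the abstract's observation that landmark distribution affects the two analyses differently.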

  11. On the Automatic Generation of Plans for Life Cycle Assembly Processes

    Energy Technology Data Exchange (ETDEWEB)

    CALTON,TERRI L.

    2000-01-01

    Designing products for easy assembly and disassembly during their entire life cycles, for purposes including product assembly, product upgrade, product servicing and repair, and product disposal, is a process that involves many disciplines. In addition, finding the best solution often involves considering the design as a whole and considering its intended life cycle. Different goals and manufacturing plan selection criteria, as compared to initial assembly, require re-visiting significant fundamental assumptions and methods that underlie current assembly planning techniques. Previous work in this area has been limited either to academic studies of issues in assembly planning or to applied studies of life cycle assembly processes that give no attention to automatic planning. It is believed that merging these two areas will result in a much greater ability to design for, optimize, and analyze life cycle assembly processes. The study of assembly planning is at the very heart of manufacturing research facilities and academic engineering institutions, and in recent years a number of significant advances in the field of assembly planning have been made. These advances have ranged from the development of automated assembly planning systems, such as Sandia's Automated Assembly Analysis System Archimedes 3.0©, to the startling revolution in microprocessors and computer-controlled production tools such as computer-aided design (CAD), computer-aided manufacturing (CAM), flexible manufacturing systems (FMS), and computer-integrated manufacturing (CIM). These results have kindled considerable interest in the study of algorithms for life cycle related assembly processes and have blossomed into a field of intense interest. The intent of this manuscript is to bring together the fundamental results in this area, so that the unifying principles and underlying concepts of algorithm design may more easily be implemented in practice.

  12. Experiments on a Ground-Based Tomographic Synthetic Aperture Radar

    Directory of Open Access Journals (Sweden)

    Hoonyol Lee

    2016-08-01

    Full Text Available This paper presents the development of a ground-based tomographic synthetic aperture radar (GB-TomoSAR) system and an experiment in three-dimensional image formation with it. GB-TomoSAR forms a two-dimensional synthetic aperture by the motion of antennae in both the azimuth and vertical directions. After range compression, three-dimensional image focusing is performed by applying Deramp-FFT (Fast Fourier Transform) algorithms in both the azimuth and vertical directions. Geometric and radiometric calibrations were applied to make an image cube, which is then projected into range-azimuth and range-vertical cross-sections for visualization. An experiment with a C-band GB-TomoSAR system with scan lengths of 2.49 m and 1.86 m in the azimuth and vertical directions, respectively, shows distinctive three-dimensional radar backscattering of stable buildings and roads, with resolutions similar to the theoretical values. Unstable objects such as trees and moving cars generate severe noise due to decorrelation during the eight-hour image-acquisition time.

  13. MISMATCH: A basis for semi-automatic functional mixed-signal test-pattern generation

    NARCIS (Netherlands)

    Kerkhoff, Hans G.; Tangelder, R.J.W.T.; Speek, Han; Engin, N.

    1996-01-01

    This paper describes a tool which assists the designer in the rapid generation of functional tests for mixed-signal circuits down to the actual test-signals for the tester. The tool is based on manipulating design data, making use of macro-based test libraries and tester resources provided by the

  14. Automatic Virtual Entity Simulation of Conceptual Design Results-Part I:Symbolic Scheme Generation and Identification

    Institute of Scientific and Technical Information of China (English)

    WANG Yu-xin; LI Yu-tong

    2014-01-01

    The development of new products of high quality, low unit cost, and short lead time to market is a key requirement for any enterprise seeking a competitive advantage. To shorten the lead time to market and improve the creativity and performance of the product, a rule-based conceptual design approach and a methodology for automatically simulating, in virtual entity form, the conceptual design results generated in the conceptual design process are presented in this paper. This part of the paper presents a rule-based conceptual design method for generating creative conceptual design schemes of mechanisms, based on Yan's kinematic chain regeneration creative design method. Design rules are adopted to describe the design requirements of the functional characteristics, the connection relationships and the topological characteristics among mechanisms. Through the graph-based reasoning process, the conceptual design space is expanded greatly, and potential creative conceptual design results are then dug out. By refining the design rules, the solution explosion problem is avoided, and the preferred conceptual design schemes are generated. Since mechanical, electrical and hydraulic subsystems can be transformed into general mechanisms, the conceptual design method presented in this paper can also be applied to the conceptual design of complex mechatronic systems. Finally, the method to identify conceptual design schemes is given.

  15. Automatic code generation enables nuclear gradient computations for fully internally contracted multireference theory

    CERN Document Server

    MacLeod, Matthew K

    2015-01-01

    Analytical nuclear gradients for fully internally contracted complete active space second-order perturbation theory (CASPT2) are reported. This implementation has been realized by an automated code generator that can handle spin-free formulas for the CASPT2 energy and its derivatives with respect to variations of molecular orbitals and reference coefficients. The underlying complete active space self-consistent field and the so-called Z-vector equations are solved using density fitting. With full internal contraction the size of first-order wave functions scales polynomially with the number of active orbitals. The CASPT2 gradient program and the code generator are both publicly available. This work enables the CASPT2 geometry optimization of molecules as complex as those investigated by respective single-point calculations.

  16. Extending a User Interface Prototyping Tool with Automatic MISRA C Code Generation

    Directory of Open Access Journals (Sweden)

    Gioacchino Mauro

    2017-01-01

    Full Text Available We are concerned with systems, particularly safety-critical systems, that involve interaction between users and devices, such as the user interface of medical devices. We therefore developed a MISRA C code generator for formal models expressed in the PVSio-web prototyping toolkit. PVSio-web allows developers to rapidly generate realistic interactive prototypes for verifying usability and safety requirements in human-machine interfaces. The visual appearance of the prototypes is based on a picture of a physical device, and the behaviour of the prototype is defined by an executable formal model. Our approach transforms the PVSio-web prototyping tool into a model-based engineering toolkit that, starting from a formally verified user interface design model, will produce MISRA C code that can be compiled and linked into a final product. An initial validation of our tool is presented for the data entry system of an actual medical device.

  17. A tool for automatic generation of RTL-level VHDL description of RNS FIR filters

    DEFF Research Database (Denmark)

    Re, Andrea Del; Nannarelli, Alberto; Re, Marco

    2004-01-01

    Although digital filters based on the Residue Number System (RNS) show high performance and low power dissipation, RNS filters are not widely used in DSP systems because of the complexity of the algorithms involved. We present a tool to design RNS FIR filters which hides the RNS algorithms from the designer and generates a synthesizable VHDL description of the filter, taking into account several design constraints such as delay, area and energy.
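    The arithmetic such a tool hides is standard residue arithmetic: numbers are represented by their residues modulo pairwise-coprime moduli, additions and multiplications run carry-free per channel, and the Chinese Remainder Theorem maps the result back. A sketch (the moduli set is illustrative; hardware designs choose moduli for efficient implementation):

```python
# Residue Number System (RNS) arithmetic sketch: per-channel operations
# modulo pairwise-coprime moduli, with CRT reconstruction. Requires
# Python 3.8+ for the modular inverse via three-argument pow().

MODULI = (7, 8, 9)      # pairwise coprime; dynamic range M = 7 * 8 * 9 = 504
M = 7 * 8 * 9

def to_rns(x):
    return tuple(x % m for m in MODULI)

def rns_add(a, b):
    return tuple((ra + rb) % m for ra, rb, m in zip(a, b, MODULI))

def rns_mul(a, b):
    return tuple((ra * rb) % m for ra, rb, m in zip(a, b, MODULI))

def from_rns(r):
    # CRT: x = sum(r_i * M_i * inv(M_i mod m_i)) mod M, where M_i = M / m_i.
    total = 0
    for ri, mi in zip(r, MODULI):
        Mi = M // mi
        total += ri * Mi * pow(Mi, -1, mi)
    return total % M

# One FIR tap, c * x, computed entirely in the residue channels.
y = from_rns(rns_mul(to_rns(13), to_rns(21)))   # 13 * 21 = 273 < 504
```

    Because each channel is narrow and carry-free, the multiply-accumulate datapaths of an FIR filter split into small independent units, which is the source of the speed and power advantages the abstract mentions.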

  18. GUDM: Automatic Generation of Unified Datasets for Learning and Reasoning in Healthcare.

    Science.gov (United States)

    Ali, Rahman; Siddiqi, Muhammad Hameed; Idris, Muhammad; Ali, Taqdir; Hussain, Shujaat; Huh, Eui-Nam; Kang, Byeong Ho; Lee, Sungyoung

    2015-07-02

    A wide array of biomedical data are generated and made available to healthcare experts. However, due to the diverse nature of the data, it is difficult to predict outcomes from them. It is therefore necessary to combine these diverse data sources into a single unified dataset. This paper proposes a global unified data model (GUDM) to provide a global unified data structure for all data sources and to generate a unified dataset using a "data modeler" tool. The proposed tool implements a user-centric, priority-based approach which can easily resolve the problems of unified data modeling and overlapping attributes across multiple datasets. The tool is illustrated using sample diabetes mellitus data. The diverse data sources used to generate the unified dataset for diabetes mellitus include clinical trial information, a social media interaction dataset and physical activity data collected using different sensors. To realize the significance of the unified dataset, we adopted a well-known rough set theory based rule creation process to create rules from the unified dataset. The evaluation of the tool on six different sets of locally created diverse datasets shows that the tool, on average, reduces the time effort of the experts and knowledge engineer by 94.1% while creating unified datasets.
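    The core of the user-centric, priority-based resolution of overlapping attributes can be illustrated as follows. The field names, values and source priorities are hypothetical; the paper's actual model and priority scheme are richer:

```python
# Sketch of priority-based conflict resolution when merging heterogeneous
# records into one unified dataset. When the same attribute appears in
# several sources, the value from the higher-priority source wins.
# Priorities are a user-supplied assumption (clinical > sensor > social).

PRIORITY = {"clinical": 3, "sensor": 2, "social": 1}

def unify(records):
    """records: list of (source_name, dict of attributes) tuples."""
    unified, winner = {}, {}
    for source, attrs in records:
        for key, value in attrs.items():
            if key not in unified or PRIORITY[source] > PRIORITY[winner[key]]:
                unified[key], winner[key] = value, source
    return unified

patient = unify([
    ("social",   {"activity_level": "low", "age": 50}),
    ("sensor",   {"activity_level": "moderate", "steps": 4200}),
    ("clinical", {"age": 52, "hba1c": 7.1}),
])
```

    Here the sensor's activity level overrides the social-media estimate and the clinical age overrides the self-reported one, while non-overlapping attributes pass through, mirroring how the "data modeler" resolves overlapping attributes across datasets.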

  20. A heads-up no-limit Texas Hold'em poker player: Discretized betting models and automatically generated equilibrium-finding programs

    DEFF Research Database (Denmark)

    Gilpin, Andrew G.; Sandholm, Tuomas; Sørensen, Troels Bjerre

    2008-01-01

    choices in the game. Second, we employ potential-aware automated abstraction algorithms for identifying strategically similar situations in order to decrease the size of the game tree. Third, we develop a new technique for automatically generating the source code of an equilibrium-finding algorithm from an XML-based description of a game. This automatically generated program is more efficient than what would be possible with a general-purpose equilibrium-finding program. Finally, we present results from the AAAI-07 Computer Poker Competition, in which Tartanian placed second out of ten entries.

  1. Office Operation Test Generation and Automatic Judgment: A C# Implementation

    Institute of Scientific and Technical Information of China (English)

    高上雄

    2014-01-01

    Through an analysis of Office operation test generation and automatic judgment in paperless examinations, this paper discusses the implementation of operation test generation and automatic judgment for Word and Excel using the C# language with a C/S architecture.

  2. Low Power Ground-Based Laser Illumination for Electric Propulsion Applications

    Science.gov (United States)

    Lapointe, Michael R.; Oleson, Steven R.

    1994-01-01

    A preliminary evaluation of low power, ground-based laser powered electric propulsion systems is presented. A review of available and near-term laser, photovoltaic, and adaptive optic systems indicates that approximately 5-kW of ground-based laser power can be delivered at an equivalent one-sun intensity to an orbit of approximately 2000 km. Laser illumination at the proper wavelength can double photovoltaic array conversion efficiencies compared to efficiencies obtained with solar illumination at the same intensity, allowing a reduction in array mass. The reduced array mass allows extra propellant to be carried with no penalty in total spacecraft mass. The extra propellant mass can extend the satellite life in orbit, allowing additional revenue to be generated. A trade study using realistic cost estimates and conservative ground station viewing capability was performed to estimate the number of communication satellites which must be illuminated to make a proliferated system of laser ground stations economically attractive. The required number of satellites is typically below that of proposed communication satellite constellations, indicating that low power ground-based laser beaming may be commercially viable. However, near-term advances in low specific mass solar arrays and high energy density batteries for LEO applications would render the ground-based laser system impracticable.

  3. Automatically Augmenting Lifelog Events Using Pervasively Generated Content from Millions of People

    Directory of Open Access Journals (Sweden)

    Alan F. Smeaton

    2010-02-01

    Full Text Available In sensor research we take advantage of additional contextual sensor information to disambiguate potentially erroneous sensor readings or to make better informed decisions on a single sensor’s output. This use of additional information reinforces, validates, semantically enriches, and augments sensed data. Lifelog data is challenging to augment, as it tracks one’s life with many images including the places they go, making it non-trivial to find associated sources of information. We investigate realising the goal of pervasive user-generated content based on sensors, by augmenting passive visual lifelogs with “Web 2.0” content collected by millions of other individuals.

  4. Communication: Automatic code generation enables nuclear gradient computations for fully internally contracted multireference theory

    Energy Technology Data Exchange (ETDEWEB)

    MacLeod, Matthew K.; Shiozaki, Toru [Department of Chemistry, Northwestern University, 2145 Sheridan Rd., Evanston, Illinois 60208 (United States)

    2015-02-07

    Analytical nuclear gradients for fully internally contracted complete active space second-order perturbation theory (CASPT2) are reported. This implementation has been realized by an automated code generator that can handle spin-free formulas for the CASPT2 energy and its derivatives with respect to variations of molecular orbitals and reference coefficients. The underlying complete active space self-consistent field and the so-called Z-vector equations are solved using density fitting. The implementation has been applied to the vertical and adiabatic ionization potentials of the porphin molecule to illustrate its capability.

  5. Intra-Hour Dispatch and Automatic Generator Control Demonstration with Solar Forecasting - Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Coimbra, Carlos F. M. [Univ. of California, San Diego, CA (United States)]

    2016-02-25

    In this project we address multiple resource integration challenges associated with increasing levels of solar penetration that arise from the variability and uncertainty in solar irradiance. We will model the SMUD service region as its own balancing region, and develop an integrated, real-time operational tool that takes solar-load forecast uncertainties into consideration and commits optimal energy resources and reserves for intra-hour and intra-day decisions. The primary objectives of this effort are to reduce power system operation cost by committing appropriate amount of energy resources and reserves, as well as to provide operators a prediction of the generation fleet’s behavior in real time for realistic PV penetration scenarios. The proposed methodology includes the following steps: clustering analysis on the expected solar variability per region for the SMUD system, Day-ahead (DA) and real-time (RT) load forecasts for the entire service areas, 1-year of intra-hour CPR forecasts for cluster centers, 1-year of smart re-forecasting CPR forecasts in real-time for determination of irreducible errors, and uncertainty quantification for integrated solar-load for both distributed and central stations (selected locations within service region) PV generation.

  6. Automatic Generation of High Quality DSM Based on IRS-P5 Cartosat-1 Stereo Data

    Science.gov (United States)

    d'Angelo, Pablo; Uttenthaler, Andreas; Carl, Sebastian; Barner, Frithjof; Reinartz, Peter

    2010-12-01

    IRS-P5 Cartosat-1 high resolution stereo satellite imagery is well suited for the creation of digital surface models (DSM). A system for highly automated and operational DSM and orthoimage generation based on IRS-P5 Cartosat-1 imagery is presented, with an emphasis on automated processing and product quality. The proposed system processes IRS-P5 level-1 stereo scenes using the rational polynomial coefficients (RPC) universal sensor model. The described method uses an RPC correction based on DSM alignment instead of using reference images with a lower lateral accuracy; this results in improved geolocation of the DSMs and orthoimages. Following RPC correction, highly detailed DSMs with 5 m grid spacing are derived using Semiglobal Matching. The proposed method is part of an operational Cartosat-1 processor for the generation of a high resolution DSM. Evaluation of 18 scenes against independent ground truth measurements indicates a mean lateral error (CE90) of 6.7 meters and a mean vertical accuracy (LE90) of 5.1 meters.
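
    For readers unfamiliar with the CE90/LE90 figures quoted above: they are 90th-percentile error statistics over checkpoints against ground truth. A minimal sketch of how such measures are computed (nearest-rank percentile and synthetic residuals; not the paper's evaluation code):

```python
# Sketch: CE90 / LE90 are the 90th percentiles of the horizontal
# (circular) and vertical (linear) error distributions of DSM
# checkpoints against ground truth. Residuals below are synthetic.
import math

def percentile90(values):
    """90th percentile by nearest-rank (simple, dependency-free)."""
    s = sorted(values)
    rank = math.ceil(0.9 * len(s)) - 1
    return s[rank]

# Synthetic (dx, dy, dz) residuals in metres for 10 checkpoints.
residuals = [(1.0, 2.0, 0.5), (3.0, 4.0, 1.0), (0.5, 0.5, 2.0),
             (2.0, 2.0, 0.2), (1.5, 0.0, 3.0), (0.0, 1.0, 0.8),
             (2.5, 1.5, 1.2), (0.8, 0.6, 0.4), (1.2, 3.5, 2.5),
             (6.0, 8.0, 4.0)]

ce90 = percentile90([math.hypot(dx, dy) for dx, dy, _ in residuals])
le90 = percentile90([abs(dz) for _, _, dz in residuals])
print(ce90, le90)  # horizontal and vertical 90th-percentile errors
```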

  7. BIOSMILE: A semantic role labeling system for biomedical verbs using a maximum-entropy model with automatically generated template features

    Directory of Open Access Journals (Sweden)

    Tsai Richard

    2007-09-01

    Full Text Available Abstract Background Bioinformatics tools for automatic processing of biomedical literature are invaluable for both the design and interpretation of large-scale experiments. Many information extraction (IE) systems that incorporate natural language processing (NLP) techniques have thus been developed for use in the biomedical field. A key IE task in this field is the extraction of biomedical relations, such as protein-protein and gene-disease interactions. However, most biomedical relation extraction systems usually ignore adverbial and prepositional phrases and words identifying location, manner, timing, and condition, which are essential for describing biomedical relations. Semantic role labeling (SRL) is a natural language processing technique that identifies the semantic roles of these words or phrases in sentences and expresses them as predicate-argument structures. We construct a biomedical SRL system called BIOSMILE that uses a maximum entropy (ME) machine-learning model to extract biomedical relations. BIOSMILE is trained on BioProp, our semi-automatic, annotated biomedical proposition bank. Currently, we are focusing on 30 biomedical verbs that are frequently used or considered important for describing molecular events. Results To evaluate the performance of BIOSMILE, we conducted two experiments to (1) compare the performance of SRL systems trained on newswire and biomedical corpora; and (2) examine the effects of using biomedical-specific features. The experimental results show that using BioProp improves the F-score of the SRL system by 21.45% over an SRL system that uses a newswire corpus. It is noteworthy that adding automatically generated template features improves the overall F-score by a further 0.52%. Specifically, ArgM-LOC, ArgM-MNR, and Arg2 achieve statistically significant performance improvements of 3.33%, 2.27%, and 1.44%, respectively.
Conclusion We demonstrate the necessity of using a biomedical proposition bank for training

  8. Wind power integration into the automatic generation control of power systems with large-scale wind power

    Directory of Open Access Journals (Sweden)

    Abdul Basit

    2014-10-01

    Full Text Available Transmission system operators have an increased interest in the active participation of wind power plants (WPP) in the power balance control of power systems with large wind power penetration. The emphasis in this study is on the integration of WPPs into the automatic generation control (AGC) of the power system. The present paper proposes a coordinated control strategy for the AGC between combined heat and power plants (CHPs) and WPPs to enhance the security and the reliability of a power system operation in the case of a large wind power penetration. The proposed strategy, described and exemplified for the future Danish power system, takes the hour-ahead regulating power plan for generation and power exchange with neighbouring power systems into account. The performance of the proposed strategy for coordinated secondary control is assessed and discussed by means of simulations for different possible future scenarios, when wind power production in the power system is high and conventional production from CHPs is at a minimum level. The investigation results of the proposed control strategy have shown that the WPPs can actively help the AGC, and reduce the real-time power imbalance in the power system, by down regulating their production when CHPs are unable to provide the required response.
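
    The secondary control loop at the heart of AGC can be stated compactly. A hedged sketch follows (the bias, gain, and deviation values are illustrative, not the Danish system's parameters):

```python
# Sketch of the classic AGC secondary loop: the Area Control Error
# (ACE) combines tie-line flow deviation with frequency deviation
# weighted by the area's frequency bias; a slow integral controller
# drives ACE to zero by redispatching units (here CHPs or, as the
# record proposes, down-regulating WPPs).
def area_control_error(dp_tie_mw, df_hz, bias_mw_per_hz):
    """ACE = dP_tie + B * df  (bias given as a positive number)."""
    return dp_tie_mw + bias_mw_per_hz * df_hz

def agc_step(setpoint_mw, ace_mw, ki, dt):
    """Integral secondary control: lower output while ACE is positive."""
    return setpoint_mw - ki * ace_mw * dt

ace = area_control_error(dp_tie_mw=50.0, df_hz=-0.1, bias_mw_per_hz=300.0)
print(ace)  # ~20 MW of over-generation in this area
print(agc_step(setpoint_mw=400.0, ace_mw=ace, ki=0.5, dt=4.0))  # ~360 MW
```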

  9. Automatic Multi-GPU Code Generation applied to Simulation of Electrical Machines

    CERN Document Server

    Rodrigues, Antonio Wendell De Oliveira; Dekeyser, Jean-Luc; Menach, Yvonnick Le

    2011-01-01

    Electrical and electronic engineering has used parallel programming to solve its large scale complex problems for performance reasons. However, as parallel programming requires a non-trivial distribution of tasks and data, developers find it hard to implement their applications effectively. Thus, in order to reduce design complexity, we propose an approach to generate code for hybrid architectures (e.g. CPU + GPU) using OpenCL, an open standard for parallel programming of heterogeneous systems. This approach is based on Model Driven Engineering (MDE) and the MARTE profile, a standard proposed by the Object Management Group (OMG). The aim is to provide resources to non-specialists in parallel programming to implement their applications. Moreover, thanks to model reuse capacity, we can add or change functionalities or the target architecture. Consequently, this approach helps industries to achieve their time-to-market constraints, and experimental tests confirm performance improvements in multi-GPU environments.

  10. Optimization of automatically generated multi-core code for the LTE RACH-PD algorithm

    CERN Document Server

    Pelcat, Maxime; Nezan, Jean François

    2008-01-01

    Embedded real-time applications in communication systems require high processing power. Manual scheduling developed for single-processor applications is not suited to multi-core architectures. The Algorithm Architecture Matching (AAM) methodology optimizes static application implementation on multi-core architectures. The Random Access Channel Preamble Detection (RACH-PD) is an algorithm for non-synchronized access of Long Term Evolution (LTE) wireless networks. LTE aims to improve the spectral efficiency of the next generation cellular system. This paper describes a complete methodology for implementing the RACH-PD. AAM prototyping is applied to the RACH-PD, which is modelled as a Synchronous DataFlow graph (SDF). An efficient implementation of the algorithm onto a multi-core DSP, the TI C6487, is then explained. Benchmarks for the solution are given.

  11. Design of a variable width pulse generator feasible for manual or automatic control

    Science.gov (United States)

    Vegas, I.; Antoranz, P.; Miranda, J. M.; Franco, F. J.

    2017-01-01

    A variable width pulse generator featuring more than 4-V peak amplitude and less than 10-ns FWHM is described. In this design the width of the pulses is controlled by means of the control signal slope. Thus, a variable transition time control circuit (TTCC) is also developed, based on the charge and discharge of a capacitor by means of two tunable current sources. Additionally, it is possible to activate/deactivate the pulses when required, therefore allowing the creation of any desired pulse pattern. Furthermore, the implementation presented here can be electronically controlled. In conclusion, due to its versatility, compactness and low cost it can be used in a wide variety of applications.
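
    The width control described above reduces to a simple ramp-timing relation. A hedged back-of-the-envelope sketch (component values are illustrative, not the published design's):

```python
# Sketch: a capacitor charged by a constant current ramps linearly,
# V(t) = (I / C) * t, so the time to cross a comparator threshold --
# and hence the pulse width set by the TTCC -- is t = C * V_th / I.
def ramp_time_s(c_farads, v_threshold, i_amps):
    """Time for a constant current to charge C up to V_threshold."""
    return c_farads * v_threshold / i_amps

# 10 pF charged to 2 V by 4 mA gives a ~5 ns transition, i.e. in
# the sub-10-ns regime the record reports; halving the current
# doubles the pulse width.
print(ramp_time_s(10e-12, 2.0, 4e-3))  # ~5e-09 s
```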

  12. Particle production during inflation and gravitational waves detectable by ground-based interferometers

    OpenAIRE

    Cook, Jessica L.; Sorbo, Lorenzo

    2011-01-01

    Inflation typically predicts a quasi scale-invariant spectrum of gravitational waves. In models of slow-roll inflation, the amplitude of such a background is too small to allow direct detection without a dedicated space-based experiment such as the proposed BBO or DECIGO. In this paper we note that particle production during inflation can generate a feature in the spectrum of primordial gravitational waves. We discuss the possibility that such a feature might be detected by ground-based laser interferometers.

  13. Designing and Implementation of Retina Image Drawing System and Automatic Report Generation from Retina Examinations

    Science.gov (United States)

    Safdari, Reza; Mokhtaran, Mehrshad; Tahmasebian, Shahram

    2016-01-01

    Introduction: Electronic medical records, as one of the major parts of electronic health records, are an important application of Medical Informatics. EMR includes different types of data, graphical items being one of these data types. To this end, a standard structure for storing, retrieving and exchanging this data type is required. In order to standardize information items in this research, the UMLS standard is used. In this research, graphical information from fundus drawing in retina surgery forms is used for the task of implementation. Implementation: Three-layer software architecture is used for implementation of this system, which includes user interface, database access and business logic. An XML database is used for storing and exchanging data. The user interface is designed by means of Adobe Flash. Also, in the user interface for eye examinations, appropriate icons compatible with current pathologies in retina examinations are considered, and UMLS codes are used for standardization purposes. Results: As this project is independently implemented in Adobe Flash, it can be run in most electronic patient record software. For evaluation purposes of this research, an EMR system for eye clinics is used. A tree structure is used for data entry, and finally a text report based on the entered data is generated. By storing graphical items in this software, editing and searching medical concepts and comparing features become available. Conclusion: One of the data items that we encounter in various medical records is graphical data. In order to cover the patient's complete electronic medical record, the electronic implementation of this information is important. For this purpose, graphical items in retina surgery forms were used, and a software application for drawing retina pictures was developed. Also, XML files were used for the purpose of storing valuable medical data from the pictures, and UMLS was applied for the standardization.

  14. Automatic Generation of CFD-Ready Surface Triangulations from CAD Geometry

    Science.gov (United States)

    Aftosmis, M. J.; Delanaye, M.; Haimes, R.; Nixon, David (Technical Monitor)

    1998-01-01

    This paper presents an approach for the generation of closed manifold surface triangulations from CAD geometry. CAD parts and assemblies are used in their native format, without translation, and a part's native geometry engine is accessed through a modeler-independent application programming interface (API). In seeking a robust and fully automated procedure, the algorithm is based on a new physical space manifold triangulation technique which was developed to avoid robustness issues associated with poorly conditioned mappings. In addition, this approach avoids the usual ambiguities associated with floating-point predicate evaluation on constructed coordinate geometry in a mapped space. The technique is incremental, so that each new site improves the triangulation by some well-defined quality measure. Sites are inserted using a variety of priority queues to ensure that new insertions will address the worst triangles first. As a result of this strategy, the algorithm will return its 'best' mesh for a given (prespecified) number of sites. Alternatively, the algorithm may be allowed to terminate naturally after achieving a prespecified measure of mesh quality. The resulting triangulations are 'CFD-ready' in that: (1) Edges match the underlying part model to within a specified tolerance. (2) Triangles on disjoint surfaces in close proximity have matching length-scales. (3) The algorithm produces a triangulation such that no angle is less than a given angle bound, alpha, or greater than Pi - 2*alpha. This result also sets bounds on the maximum vertex degree, triangle aspect-ratio and maximum stretching rate for the triangulation. In addition to the output triangulations for a variety of CAD parts, the discussion presents related theoretical results which assert the existence of such an angle bound, and demonstrate that maximum bounds of between 25 deg and 30 deg may be achieved in practice.
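
    The angle-bound guarantee described in the abstract is straightforward to verify on a given triangulation. A hedged sketch of such a quality check (standalone geometry, not the paper's code):

```python
# Sketch: verifying the mesh-quality criterion stated above --
# every interior angle of every triangle must lie within
# [alpha, pi - 2*alpha].
import math

def angles(a, b, c):
    """Interior angles (radians) of triangle abc via the law of cosines."""
    la, lb, lc = math.dist(b, c), math.dist(a, c), math.dist(a, b)
    A = math.acos((lb * lb + lc * lc - la * la) / (2 * lb * lc))
    B = math.acos((la * la + lc * lc - lb * lb) / (2 * la * lc))
    return A, B, math.pi - A - B

def satisfies_bound(tri, alpha):
    return all(alpha <= ang <= math.pi - 2 * alpha for ang in angles(*tri))

equilateral = ((0.0, 0.0), (1.0, 0.0), (0.5, math.sqrt(3) / 2))
sliver = ((0.0, 0.0), (1.0, 0.0), (0.5, 0.01))
print(satisfies_bound(equilateral, math.radians(25)))  # True
print(satisfies_bound(sliver, math.radians(25)))       # False
```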

  15. Automatic generation control of multi-area power systems with diverse energy sources using Teaching Learning Based Optimization algorithm

    Directory of Open Access Journals (Sweden)

    Rabindra Kumar Sahu

    2016-03-01

    Full Text Available This paper presents the design and analysis of a Proportional-Integral-Double Derivative (PIDD) controller for Automatic Generation Control (AGC) of multi-area power systems with diverse energy sources using the Teaching Learning Based Optimization (TLBO) algorithm. At first, a two-area reheat thermal power system with an appropriate Generation Rate Constraint (GRC) is considered. The design problem is formulated as an optimization problem and TLBO is employed to optimize the parameters of the PIDD controller. The superiority of the proposed TLBO based PIDD controller has been demonstrated by comparing the results with recently published optimization techniques such as hybrid Firefly Algorithm and Pattern Search (hFA-PS), Firefly Algorithm (FA), Bacteria Foraging Optimization Algorithm (BFOA), Genetic Algorithm (GA) and conventional Ziegler-Nichols (ZN) for the same interconnected power system. Also, the proposed approach has been extended to a two-area power system with diverse sources of generation like thermal, hydro, wind and diesel units. The system model includes boiler dynamics, GRC and Governor Dead Band (GDB) non-linearity. It is observed from simulation results that the proposed approach provides better dynamic responses than those recently published in the literature. Further, the study is extended to a three unequal-area thermal power system with different controllers in each area, and the results are compared with a published FA optimized PID controller for the same system under study. Finally, sensitivity analysis is performed by varying the system parameters and operating load conditions in the range of ±25% from their nominal values to test the robustness.
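
    TLBO itself is compact enough to sketch. The toy version below optimizes a sphere function; in the study above, the objective would instead be a simulated AGC response cost over the PIDD gains (this generic sketch is not the authors' implementation):

```python
# Minimal Teaching-Learning-Based Optimization sketch on a toy
# sphere function, showing the two TLBO phases: a teacher phase
# (move toward the best solution, away from the population mean)
# and a learner phase (pairwise interaction), both greedy-accept.
import random

def tlbo(objective, dim, bounds, pop_size=10, iters=50, seed=1):
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(iters):
        # Teacher phase: teaching factor TF is randomly 1 or 2.
        teacher = min(pop, key=objective)
        mean = [sum(x[d] for x in pop) / pop_size for d in range(dim)]
        for i, x in enumerate(pop):
            tf = rng.choice((1, 2))
            cand = [x[d] + rng.random() * (teacher[d] - tf * mean[d])
                    for d in range(dim)]
            if objective(cand) < objective(x):
                pop[i] = cand
        # Learner phase: learn from a randomly chosen peer.
        for i, x in enumerate(pop):
            j = rng.randrange(pop_size)
            if j == i:
                continue
            other = pop[j]
            sign = 1 if objective(x) < objective(other) else -1
            cand = [x[d] + rng.random() * sign * (x[d] - other[d])
                    for d in range(dim)]
            if objective(cand) < objective(x):
                pop[i] = cand
    return min(pop, key=objective)

best = tlbo(lambda v: sum(t * t for t in v), dim=3, bounds=(-5.0, 5.0))
print(sum(t * t for t in best))  # small value near 0
```

    Note TLBO's appeal for controller tuning, which the record exploits: unlike GA or BFOA it has no algorithm-specific parameters beyond population size and iteration count.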

  16. Automatic query generation using word embeddings for retrieving passages describing experimental methods.

    Science.gov (United States)

    Aydın, Ferhat; Hüsünbeyi, Zehra Melce; Özgür, Arzucan

    2017-01-01

    Information regarding the physical interactions among proteins is crucial, since protein-protein interactions (PPIs) are central for many biological processes. The experimental techniques used to verify PPIs are vital for characterizing and assessing the reliability of the identified PPIs. A lot of information about PPIs and the experimental methods are only available in the text of the scientific publications that report them. In this study, we approach the problem of identifying passages with experimental methods for physical interactions between proteins as an information retrieval search task. The baseline system is based on query matching, where the queries are generated by utilizing the names (including synonyms) of the experimental methods in the Proteomics Standard Initiative-Molecular Interactions (PSI-MI) ontology. We propose two methods, where the baseline queries are expanded by including additional relevant terms. The first method is a supervised approach, where the most salient terms for each experimental method are obtained by using the term frequency-relevance frequency (tf.rf) metric over 13 articles from our manually annotated data set of 30 full text articles, which is made publicly available. On the other hand, the second method is an unsupervised approach, where the queries for each experimental method are expanded by using the word embeddings of the names of the experimental methods in the PSI-MI ontology. The word embeddings are obtained by utilizing a large unlabeled full text corpus. The proposed methods are evaluated on the test set consisting of 17 articles. Both methods obtain higher recall scores compared with the baseline, with a loss in precision. Besides higher recall, the word embeddings based approach achieves higher F-measure than the baseline and the tf.rf based methods. 
We also show that incorporating gene name and interaction keyword identification leads to improved precision and F-measure scores for all three evaluated methods.
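
    The embedding-based query expansion described above can be sketched in a few lines. A hedged toy version (hand-made 3-dimensional vectors stand in for embeddings trained on a large full-text corpus; the vocabulary terms are illustrative):

```python
# Sketch of embedding-based query expansion as described above:
# each query (an experimental-method name) is expanded with the
# vocabulary terms closest to it by cosine similarity in the
# embedding space. The vectors here are tiny hand-made stand-ins.
import math

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def expand_query(query_term, embeddings, k=2):
    """Add the k nearest vocabulary terms to the query."""
    q = embeddings[query_term]
    ranked = sorted((t for t in embeddings if t != query_term),
                    key=lambda t: cosine(q, embeddings[t]),
                    reverse=True)
    return [query_term] + ranked[:k]

toy = {
    "coimmunoprecipitation": [0.9, 0.1, 0.0],
    "pull-down":             [0.8, 0.2, 0.1],
    "western":               [0.7, 0.3, 0.1],
    "sequencing":            [0.0, 0.1, 0.9],
}
print(expand_query("coimmunoprecipitation", toy))
```

    Expanding with neighbours in embedding space is what raises recall in the record's experiments; the precision loss comes from neighbours that are related but not method-specific.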

  18. The SBAS Sentinel-1 Surveillance service for automatic and systematic generation of Earth surface displacement within the GEP platform.

    Science.gov (United States)

    Casu, Francesco; De Luca, Claudio; Lanari, Riccardo; Manunta, Michele; Zinno, Ivana

    2017-04-01

    The Geohazards Exploitation Platform (GEP) is an ESA activity of the Earth Observation (EO) ground segment to demonstrate the benefit of new technologies for large scale processing of EO data. GEP aims at providing both on-demand processing services for scientific users of the geohazards community and an integration platform for new EO data analysis processors dedicated to scientists and other expert users. In the Remote Sensing scenario, a crucial role is played by the recently launched Sentinel-1 (S1) constellation that, with its global acquisition policy, has literally flooded the scientific community with a huge amount of data acquired over a large part of the Earth on a regular basis (down to 6 days with both Sentinel-1A and 1B passes). Moreover, the S1 data, as part of the European Copernicus program, are openly and freely accessible, thus fostering their use for the development of tools for Earth surface monitoring. In particular, due to their specific SAR Interferometry (InSAR) design, Sentinel-1 satellites can be exploited to build up operational services for the generation of advanced interferometric products that can be very useful within risk management and natural hazard monitoring scenarios. Accordingly, in this work we present the activities carried out for the development, integration, and deployment of the SBAS Sentinel-1 Surveillance service of CNR-IREA within the GEP platform. This service is based on a parallel implementation of the SBAS approach, referred to as P-SBAS, able to effectively run in large distributed computing infrastructures (grid and cloud) and to allow for an efficient computation of large SAR data sequences with advanced DInSAR approaches. In particular, the Surveillance service developed on the GEP platform consists of the systematic and automatic processing of Sentinel-1 data on selected Areas of Interest (AoI) to generate updated surface displacement time series via the SBAS-InSAR algorithm. We built up a system that is

  19. A Practical Automatic Generation Algorithm for Power Flow Diagrams

    Institute of Scientific and Technical Information of China (English)

    徐丽燕; 陈文静; 苏运光; 孙云枫

    2015-01-01

    Based on the general characteristics of transmission network power flow diagrams, this paper comprehensively studies a variety of automatic layout and routing methods, proposes a practical automatic generation algorithm for transmission network power flow diagrams, and verifies its effectiveness experimentally.

  20. Automatic generation of smart earthquake-resistant building system: Hybrid system of base-isolation and building-connection.

    Science.gov (United States)

    Kasagi, M; Fujita, K; Tsuji, M; Takewaki, I

    2016-02-01

    A base-isolated building may sometimes exhibit an undesirable large response to a long-duration, long-period earthquake ground motion and a connected building system without base-isolation may show a large response to a near-fault (rather high-frequency) earthquake ground motion. To overcome both deficiencies, a new hybrid control system of base-isolation and building-connection is proposed and investigated. In this new hybrid building system, a base-isolated building is connected to a stiffer free wall with oil dampers. It has been demonstrated in preliminary research that the proposed hybrid system is effective both for near-fault (rather high-frequency) and long-duration, long-period earthquake ground motions and has sufficient redundancy and robustness for a broad range of earthquake ground motions. An automatic generation algorithm for this kind of smart structure of base-isolation and building-connection hybrid systems is presented in this paper. It is shown that, while the proposed algorithm does not work well in a building without the connecting-damper system, it works well in the proposed smart hybrid system with the connecting damper system.

  1. Automatic generation of smart earthquake-resistant building system: Hybrid system of base-isolation and building-connection

    Directory of Open Access Journals (Sweden)

    M. Kasagi

    2016-02-01

    Full Text Available A base-isolated building may sometimes exhibit an undesirably large response to a long-duration, long-period earthquake ground motion, while a connected building system without base isolation may show a large response to a near-fault (rather high-frequency) ground motion. To overcome both deficiencies, a new hybrid control system of base isolation and building connection is proposed and investigated. In this new hybrid building system, a base-isolated building is connected to a stiffer free wall with oil dampers. It has been demonstrated in preliminary research that the proposed hybrid system is effective for both near-fault (rather high-frequency) and long-duration, long-period earthquake ground motions, and has sufficient redundancy and robustness for a broad range of ground motions. An automatic generation algorithm for this kind of smart base-isolation and building-connection hybrid structure is presented in this paper. It is shown that, while the proposed algorithm does not work well in a building without the connecting-damper system, it works well in the proposed smart hybrid system with the connecting dampers.

  2. CAV_KO: a Simple 1-D Lagrangian Hydrocode for MS EXCEL™ with Automatic Generation of X-T Diagrams

    Science.gov (United States)

    Tsembelis, K.; Ramsden, B.; Proud, W. G.; Borg, J.

    2007-12-01

    Hydrocodes are widely used to predict or simulate highly dynamic and transient events such as blast and impact. Codes such as GRIM, CTH or AUTODYN are well developed, involve complex numerical methods, and in many cases require a large computing infrastructure. In this paper we present CAV_KO, a simple 1-D Lagrangian hydrocode written in Visual Basic and developed at the University of Cambridge. The motivation was to produce a code which, while relatively simple, is useful for both experimental planning and teaching. The code has been adapted from the original KO code written in FORTRAN by J. Borg, which is in turn based on the algorithm developed by Wilkins [1]. The GUI developed within MS Excel™ and the automatic generation of x-t diagrams make CAV_KO a useful tool for quick calculations of plate impact events and for teaching. The VB code is licensed under the GNU General Public License, and an MS Excel™ spreadsheet containing the code can be downloaded from www.shockphysics.com together with a copy of the user guide.

  3. Techniques to extend the reach of ground based gravitational wave detectors

    Science.gov (United States)

    Dwyer, Sheila

    2016-03-01

    While the current generation of advanced ground-based detectors will open the gravitational wave universe to observation, ground-based interferometry has the potential to extend the reach of these observatories to high redshifts. Several techniques have the potential to improve the advanced detectors beyond design sensitivity, including the use of squeezed light, upgraded suspensions, and possibly new optical coatings, new test mass materials, and cryogenic suspensions. To improve the sensitivity by more than a factor of 10 compared to the advanced detectors, new, longer facilities will be needed. Future observatories capable of hosting interferometers tens of kilometers long have the potential to extend the reach of gravitational wave astronomy to cosmological distances, enabling detection of binary inspirals from throughout the history of star formation.

  4. An image-based automatic mesh generation and numerical simulation for a population-based analysis of aerosol delivery in the human lungs

    Science.gov (United States)

    Miyawaki, Shinjiro; Tawhai, Merryn H.; Hoffman, Eric A.; Lin, Ching-Long

    2013-11-01

    The authors propose a method to automatically generate three-dimensional subject-specific airway geometries and meshes for computational fluid dynamics (CFD) studies of aerosol delivery in the human lungs. The proposed method automatically expands a computed tomography (CT)-based airway skeleton to generate the centerline (CL)-based model, and then fits it to the CT-segmented geometry to generate the hybrid CL-CT-based model. To reproduce the turbulent laryngeal jet known to affect aerosol transport, a physiologically consistent laryngeal model was developed that can be attached to the trachea of the above models. Gmsh was used to automatically generate the meshes. To assess the quality of the models, the regional aerosol distributions in a human lung predicted by the hybrid model and by a manually generated CT-based model were compared; the aerosol distribution predicted by the hybrid model was consistent with the prediction of the CT-based model. The hybrid model was applied to 8 healthy and 16 severely asthmatic subjects, and the average geometric error was 3.8% of the branch radius. The proposed method can potentially be applied to branch-by-branch analyses of a large population of healthy and diseased lungs. NIH Grants R01-HL-094315 and S10-RR-022421; CT data provided by SARP; computer time provided by XSEDE.

  5. Android event code automatic generation method based on object relevance

    Institute of Scientific and Technical Information of China (English)

    李杨; 胡文

    2012-01-01

    To solve the problem of automatically generating Android event code, this paper combines object-association theory to discuss the association relationships among control (widget) objects, gives the definition of the control-object association relationship (COAR) and implements its construction process, and finally builds the control-object association relationship tree (COARTree). Applying the COARTree to the Android event code generation process solves the automatic generation problem and has shown good practical value. A simple phone book application is used as an example to verify the method.

  6. Modeling and simulation of the generation automatic control of electric power systems; Modelado y simulacion del control automatico de generacion de sistemas electricos de potencia

    Energy Technology Data Exchange (ETDEWEB)

    Caballero Ortiz, Ezequiel

    2002-12-01

    This work analyzes the Automatic Generation Control (AGC) of electric power systems, based on the information generated by the load-frequency control (LFC) loop and the automatic voltage regulator (AVR) loop. The analysis applies classical control theory and feedback control concepts, as well as concepts from modern control theory. The studies are carried out on a digital computer with the MATLAB program and the simulation facilities of the SIMULINK tool. This thesis establishes the theoretical and physical concepts of automatic generation control, dividing it into the load-frequency control and automatic voltage regulator loops, and establishes the mathematical models of the two control loops. The element models are then interconnected to form the load-frequency control loop, and a digital simulation of the system is carried out. First, the function of primary control is analyzed in one-machine, area multi-machine, and multi-area multi-machine power systems. Then the automatic generation control of single-area and multi-area power systems is studied. The economic dispatch concept is introduced, the multi-area power system is simulated under this plan, and the energy exchange among areas in steady state is studied thereafter. The mathematical models of the components of the automatic voltage regulator loop are interconnected; data consistent with the nature of each component are generated and the loop's behavior is simulated to analyze the system response. The two control loops are then interconnected and a simulation is carried out with the previously generated data, examining the performance of the automatic generation control and the interaction between the two loops. Finally, the pole-placement and optimal control techniques of modern control theory are applied to the automatic generation control of one area
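
    The load-frequency loop summarized above can be sketched numerically. The following is a minimal single-area model (droop governor, turbine, swing equation, integral AGC) integrated with forward Euler; the per-unit parameters are illustrative textbook values, not the models or data from the thesis.

```python
# Single-area load-frequency control sketch: primary droop control plus an
# integral AGC loop. All quantities in per unit; parameters are illustrative.
M, D = 10.0, 1.0      # 2H (inertia constant) and load damping
Tg, Tt = 0.2, 0.5     # governor and turbine time constants (s)
R, Ki = 0.05, 0.3     # speed droop and AGC integral gain
dPL = 0.01            # 1% step load increase at t = 0

dt, T = 0.01, 60.0
df = dpv = dpm = ace_int = 0.0   # frequency dev., valve, mech. power, ACE integral
freq = []
for _ in range(int(T / dt)):
    ace_int += Ki * df * dt                      # secondary (integral) control
    dpv += dt / Tg * (-df / R - ace_int - dpv)   # governor valve position
    dpm += dt / Tt * (dpv - dpm)                 # turbine mechanical power
    df += dt / M * (dpm - dPL - D * df)          # swing equation
    freq.append(df)

print(f"peak deviation {min(freq):.2e} pu, final {freq[-1]:.2e} pu")
```

    With primary control alone the frequency would settle at a nonzero offset of about -dPL/(D + 1/R); the integral term drives it back toward zero, which is the defining role of AGC.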

  7. Improved ground-based FTS measurement for column abundance CO2 retrievals(Conference Presentation)

    Science.gov (United States)

    Goo, Tae-Young

    2016-10-01

    The National Institute of Meteorological Sciences has operated a ground-based Fourier Transform Spectrometer (FTS) at Anmyeondo, Korea since December 2012. The Anmyeondo FTS site is a designated operational station of the Total Carbon Column Observing Network (TCCON) and belongs to a regional Global Atmosphere Watch observatory. A Bruker IFS-125HR, which has a high spectral resolution of 0.02 cm-1, is employed, and the instrument specification is almost the same as the TCCON configuration, such as a spectral range of 3,800-16,000 cm-1, a resolution of 1 cm-1, InGaAs and Si-diode detectors, and a CaF2 beam splitter. Measured spectra agree well with simulated spectra. To improve spectral accuracy and stability, the Operational Automatic System for Intensity of Sunray (OASIS) has been developed. OASIS provides consistent photon energy optimized to the detector range by controlling the diameter of the solar beam reflected from the suntracker mirror. As a result, the monthly modulation efficiency (ME), which indicates the spectral accuracy of the FTS measurement, has remained in the vicinity of 99.9% since February 2015; an ME of 98% corresponds to an error of 0.1% in ground-based in-situ CO2 measurement. Total column abundances of CO2 and CH4 during 2015 were estimated using GGG v14 and compared with ground-based in-situ CO2 and CH4 measurements at a height of 86 m above sea level. The seasonality of CO2 is well captured by both the FTS and the in-situ measurements, while there is a considerable difference in the amplitude of the CO2 seasonal variation due to the insensitivity of column CO2 to surface carbon-cycle dynamics, both natural and anthropogenic. Total column CO2 and CH4 vary approximately from 395 ppm to 405 ppm and from 1.82 ppm to 1.88 ppm, respectively. It should be noted that few measurements were obtained in July and August because of frequent cloud and fog. It is found that enhancement of CH4 from the FTS at Anmyeondo

  8. Ground-based observations of Kepler asteroseismic targets

    CERN Document Server

    Uytterhoeven, K; Southworth, J; Randall, S; Ostensen, R; Molenda-Zakowicz, J; Marconi, M; Kurtz, D W; Kiss, L; Gutierrez-Soto, J; Frandsen, S; De Cat, P; Bruntt, H; Briquet, M; Zhang, X B; Telting, J H; Steslicki, M; Ripepi, V; Pigulski, A; Paparo, M; Oreiro, R; Choong, Ngeow Chow; Niemczura, E; Nemec, J; Narwid, A; Mathias, P; Martin-Ruiz, S; Lehman, H; Kopacki, G; Karoff, C; Jackiewicz, J; Henden, A A; Handler, G; Grigachene, A; Green, E M; Garrido, R; Machado, L Fox; Debosscher, J; Creevey, O L; Catanzaro, G; Bognar, Z; Biazzo, K; Bernabei, S

    2010-01-01

    We present the ground-based activities within the different working groups of the Kepler Asteroseismic Science Consortium (KASC). The activities aim at the systematic characterization of the 5000+ KASC targets, and at the collection of ground-based follow-up time-series data of selected promising Kepler pulsators. So far, 35 different instruments at 30 telescopes on 22 different observatories in 12 countries are in use, and a total of more than 530 observing nights has been awarded. (Based on observations made with the Isaac Newton Telescope, William Herschel Telescope, Nordic Optical Telescope, Telescopio Nazionale Galileo, Mercator Telescope (La Palma, Spain), and IAC-80 (Tenerife, Spain). Also based on observations taken at the observatories of Sierra Nevada, San Pedro Martir, Vienna, Xinglong, Apache Point, Lulin, Tautenburg, Loiano, Serra la Nave, Asiago, McDonald, Skinakas, Pic du Midi, Mauna Kea, Steward Observatory, Bialkow Observatory of the Wroclaw University, Piszkesteto Mountain Station, Observato...

  9. Ground-based Nuclear Detonation Detection (GNDD) Technology Roadmap

    Energy Technology Data Exchange (ETDEWEB)

    Casey, Leslie A.

    2014-01-13

    This GNDD Technology Roadmap is intended to provide guidance to potential researchers and help management define research priorities to achieve technology advancements for ground-based nuclear explosion monitoring science being pursued by the Ground-based Nuclear Detonation Detection (GNDD) Team within the Office of Nuclear Detonation Detection in the National Nuclear Security Administration (NNSA) of the U.S. Department of Energy (DOE). Four science-based elements were selected to encompass the entire scope of nuclear monitoring research and development (R&D) necessary to facilitate breakthrough scientific results, as well as deliver impactful products. Promising future R&D is delineated including dual use associated with the Comprehensive Nuclear-Test-Ban Treaty (CTBT). Important research themes as well as associated metrics are identified along with a progression of accomplishments, represented by a selected bibliography, that are precursors to major improvements to nuclear explosion monitoring.

  10. Ground-Based Calibration Of A Microwave Landing System

    Science.gov (United States)

    Kiriazes, John J.; Scott, Marshall M., Jr.; Willis, Alfred D.; Erdogan, Temel; Reyes, Rolando

    1996-01-01

    System of microwave instrumentation and data-processing equipment developed to enable ground-based calibration of microwave scanning-beam landing system (MSBLS) at distances of about 500 to 1,000 ft from MSBLS transmitting antenna. Ensures accuracy of MSBLS near touchdown point without resorting to the expense and complex logistics of aircraft-based testing. Modified versions prove useful in calibrating aircraft instrument landing systems.

  11. Ground-based Observations of the Solar Sources of Space Weather

    Science.gov (United States)

    Veronig, A. M.; Pötzi, W.

    2016-04-01

    Monitoring of the Sun and its activity is a task of growing importance in the frame of space weather research and awareness. Major space weather disturbances at Earth have their origin in energetic outbursts from the Sun: solar flares, coronal mass ejections, and the associated solar energetic particles. In this review we discuss the importance and complementarity of ground-based and space-based observations for space weather studies. The main focus is on ground-based observations in the visible range of the spectrum, in particular in the diagnostically manifold Hα spectral line, which enables us to detect and study solar flares, filaments (prominences), filament (prominence) eruptions, and Moreton waves. Existing Hα networks such as GONG and the Global High-Resolution Hα Network are discussed. As an example of solar observations moving from space weather research to operations, we present the system of real-time detection of Hα flares and filaments established at Kanzelhöhe Observatory (KSO; Austria) in the frame of the space weather segment of the ESA Space Situational Awareness programme (swe.ssa.esa.int). An evaluation of the system, which has been running continuously since July 2013, is provided, covering an evaluation period of almost 2.5 years. During this period, KSO provided 3020 hours of real-time Hα observations at the ESA SWE portal. In total, 824 Hα flares were detected and classified by the real-time detection system, including 174 events of Hα importance class 1 and larger. For the total sample of events, 95% of the automatically determined flare peak times lie within ±5 min of the values given in the official optical flare reports (by NOAA and KSO), as do 76% of the start times. The heliographic positions determined are accurate to within ±5°. The probability of detection of flares of importance 1 or larger is 95%, with a false alarm rate of 16%. These numbers confirm the high potential of automatic flare detection and alerting from ground-based

  12. Ground-Based Lidar for Atmospheric Boundary Layer Ozone Measurements

    Science.gov (United States)

    Kuang, Shi; Newchurch, Michael J.; Burris, John; Liu, Xiong

    2013-01-01

    Ground-based lidars are suitable for long-term ozone monitoring as a complement to satellite and ozonesonde measurements. However, current ground-based lidars are unable to consistently measure ozone below 500 m above ground level (AGL) due to both engineering issues and high retrieval sensitivity to various measurement errors. In this paper, we present our instrument design, retrieval techniques, and preliminary results that focus on the high-temporal profiling of ozone within the atmospheric boundary layer (ABL) achieved by the addition of an inexpensive and compact mini-receiver to the previous system. For the first time, to the best of our knowledge, the lowest, consistently achievable observation height has been extended down to 125 m AGL for a ground-based ozone lidar system. Both the analysis and preliminary measurements demonstrate that this lidar measures ozone with a precision generally better than 10% at a temporal resolution of 10 min and a vertical resolution from 150 m at the bottom of the ABL to 550 m at the top. A measurement example from summertime shows that inhomogeneous ozone aloft was affected by both surface emissions and the evolution of ABL structures.

  13. Ground-based lidar for atmospheric boundary layer ozone measurements.

    Science.gov (United States)

    Kuang, Shi; Newchurch, Michael J; Burris, John; Liu, Xiong

    2013-05-20

    Ground-based lidars are suitable for long-term ozone monitoring as a complement to satellite and ozonesonde measurements. However, current ground-based lidars are unable to consistently measure ozone below 500 m above ground level (AGL) due to both engineering issues and high retrieval sensitivity to various measurement errors. In this paper, we present our instrument design, retrieval techniques, and preliminary results that focus on the high-temporal profiling of ozone within the atmospheric boundary layer (ABL) achieved by the addition of an inexpensive and compact mini-receiver to the previous system. For the first time, to the best of our knowledge, the lowest, consistently achievable observation height has been extended down to 125 m AGL for a ground-based ozone lidar system. Both the analysis and preliminary measurements demonstrate that this lidar measures ozone with a precision generally better than ±10% at a temporal resolution of 10 min and a vertical resolution from 150 m at the bottom of the ABL to 550 m at the top. A measurement example from summertime shows that inhomogeneous ozone aloft was affected by both surface emissions and the evolution of ABL structures.
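
    Ground-based ozone lidars of this kind are typically differential-absorption (DIAL) systems, retrieving number density from the ratio of an absorbed ("on") and a weakly absorbed ("off") return. Below is a minimal sketch of that core retrieval step on synthetic signals; the cross-section difference and the ozone profile are assumed illustrative values, not this instrument's actual parameters.

```python
import numpy as np

# DIAL retrieval sketch: N(z) = -1/(2*dsigma) * d/dz ln[P_on(z)/P_off(z)].
dsigma = 1.5e-22   # m^2, on/off ozone cross-section difference (assumed)
z = np.linspace(200.0, 3000.0, 281)    # altitude grid (m)
n_true = 1.2e18 * np.ones_like(z)      # uniform ozone number density (m^-3)

# Synthetic range-corrected signals: only differential ozone absorption differs.
tau = np.concatenate(([0.0],
                      np.cumsum(0.5 * (n_true[1:] + n_true[:-1]) * np.diff(z))))
p_on = np.exp(-2 * dsigma * tau)       # absorbed wavelength
p_off = np.ones_like(z)                # reference wavelength

# Retrieval: finite-difference derivative of the log ratio.
n_ret = -np.gradient(np.log(p_on / p_off), z) / (2 * dsigma)
print(f"retrieved mean density: {n_ret.mean():.3e} m^-3 (true 1.2e18)")
```

    In practice the derivative step is what makes the retrieval so sensitive to measurement errors near the surface, which is the difficulty the abstract refers to.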

  14. Automatic Weather Station (AWS) Lidar

    Science.gov (United States)

    Rall, Jonathan A.R.; Abshire, James B.; Spinhirne, James D.; Smith, David E. (Technical Monitor)

    2000-01-01

    An autonomous, low-power atmospheric lidar instrument is being developed at NASA Goddard Space Flight Center. This compact, portable lidar will operate continuously in a temperature-controlled enclosure, charge its own batteries through a combination of a small rugged wind generator and solar panels, and transmit its data from remote locations to ground stations via satellite. A network of these instruments will be established by co-locating them at remote Automatic Weather Station (AWS) sites in Antarctica under the auspices of the National Science Foundation (NSF). The NSF Office of Polar Programs provides support to place the weather stations in remote areas of Antarctica in support of meteorological research and operations. The AWS meteorological data will directly benefit the analysis of the lidar data, while a network of ground-based atmospheric lidars will provide knowledge regarding the temporal evolution and spatial extent of Type Ia polar stratospheric clouds (PSC). These clouds play a crucial role in the annual austral springtime destruction of stratospheric ozone over Antarctica, i.e. the ozone hole. In addition, the lidar will monitor and record the general atmospheric conditions (transmission and backscatter) of the overlying atmosphere, which will benefit the Geoscience Laser Altimeter System (GLAS). Prototype lidar instruments have been deployed to the Amundsen-Scott South Pole Station (1995-96, 2000) and to an Automated Geophysical Observatory site (AGO 1) in January 1999. We report on data acquired with these instruments, instrument performance, and the anticipated performance of the AWS Lidar.

  15. Evaluation of plan quality assurance models for prostate cancer patients based on fully automatically generated Pareto-optimal treatment plans

    Science.gov (United States)

    Wang, Yibing; Breedveld, Sebastiaan; Heijmen, Ben; Petit, Steven F.

    2016-06-01

    IMRT planning with commercial Treatment Planning Systems (TPSs) is a trial-and-error process. Consequently, the quality of treatment plans may not be consistent among patients, planners and institutions. Recently, different plan quality assurance (QA) models have been proposed that could flag and guide improvement of suboptimal treatment plans. However, the performance of these models was validated using plans that were created with the conventional trial-and-error treatment planning process, so it is challenging to assess and compare quantitatively the accuracy of different treatment planning QA models. We therefore created a golden-standard dataset of consistently planned Pareto-optimal IMRT plans for 115 prostate patients, and used it to assess the performance of a treatment planning QA model based on the overlap volume histogram (OVH). The 115 prostate IMRT plans were fully automatically planned using our in-house developed TPS Erasmus-iCycle. An existing OVH model was trained on the plans of 58 of the patients and then applied to predict DVHs of the rectum, bladder and anus of the remaining 57 patients. The predictions were compared with the achieved values of the golden-standard plans for the rectum Dmean, V65 and V75, and for the Dmean of the anus and the bladder. For the rectum, the prediction errors (predicted minus achieved) were only -0.2 ± 0.9 Gy (mean ± 1 SD) for Dmean, -1.0 ± 1.6% for V65, and -0.4 ± 1.1% for V75. For the Dmean of the anus and the bladder, the prediction errors were 0.1 ± 1.6 Gy and 4.8 ± 4.1 Gy, respectively. Increasing the training cohort to 114 patients led only to minor improvements. A dataset of consistently planned Pareto-optimal prostate IMRT plans was generated; it can be used to train new, and validate and compare existing, treatment planning QA models, and has been made publicly available. The OVH model was highly accurate
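
    The overlap volume histogram on which the QA model is built summarizes target-organ geometry: for each expansion distance r, the fraction of organ-at-risk (OAR) volume lying within r of the target. A toy sketch on a voxel grid follows; the masks, voxel spacing, and brute-force distance search are invented for illustration and are not the paper's implementation.

```python
import numpy as np

# Overlap volume histogram (OVH) sketch on a toy 3-D voxel grid.
def ovh(target_mask, oar_mask, spacing, radii):
    tgt = np.argwhere(target_mask) * spacing   # target voxel coordinates (mm)
    oar = np.argwhere(oar_mask) * spacing      # OAR voxel coordinates (mm)
    # Distance from every OAR voxel to its nearest target voxel (brute force).
    d = np.sqrt(((oar[:, None, :] - tgt[None, :, :]) ** 2).sum(-1)).min(axis=1)
    # OVH(r): fraction of OAR voxels within expansion distance r of the target.
    return np.array([(d <= r).mean() for r in radii])

# Toy geometry: a cubic "target" and an adjacent slab "OAR" on a 20^3 grid.
grid = np.zeros((20, 20, 20), bool)
target = grid.copy(); target[8:12, 8:12, 8:12] = True
oar = grid.copy();    oar[8:12, 8:12, 13:17] = True
curve = ovh(target, oar, spacing=np.array([2.0, 2.0, 2.0]), radii=[2, 4, 8, 16])
print(curve)
```

    The resulting curve is nondecreasing and reaches 1.0 once the expansion covers the whole OAR; QA models of this type relate such geometric curves to achievable DVH values.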

  16. A computer program to automatically generate state equations and macro-models. [for network analysis and design

    Science.gov (United States)

    Garrett, S. J.; Bowers, J. C.; Oreilly, J. E., Jr.

    1978-01-01

    A computer program, PROSE, that produces nonlinear state equations from a simple topological description of an electrical or mechanical network is described. Unnecessary states are also automatically eliminated, so that a simplified terminal circuit model is obtained. The program also prints out the eigenvalues of a linearized system and the sensitivities of the eigenvalue of largest magnitude.
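
    The eigenvalue and sensitivity output described above can be illustrated with a hand-written state model; PROSE derives such equations from the topology automatically, whereas here a series RLC circuit (states: inductor current, capacitor voltage) is written out directly, with arbitrary element values chosen to give two distinct real poles.

```python
import numpy as np

# State matrix of a series RLC circuit: x = [i_L, v_C].
def a_matrix(R, L, C):
    return np.array([[-R / L, -1.0 / L],
                     [1.0 / C, 0.0]])

R, L, C = 3.0, 1.0, 0.5
eig = np.linalg.eigvals(a_matrix(R, L, C))   # poles: -1 and -2
dominant = eig[np.argmax(np.abs(eig))]       # eigenvalue of largest magnitude

# Sensitivity of the largest-magnitude eigenvalue to R, by finite difference.
h = 1e-6
eig_h = np.linalg.eigvals(a_matrix(R + h, L, C))
sens = (eig_h[np.argmax(np.abs(eig_h))] - dominant) / h
print("eigenvalues:", np.sort(eig.real), " d(lambda_max)/dR ~", sens)
```

    For this circuit the characteristic equation is s^2 + (R/L)s + 1/(LC) = 0, so the finite-difference sensitivity can be checked against the implicit-function value -s/(2s + R/L) at s = -2.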

  17. Automatic generation of VC++ programs with a natural language interface

    Institute of Scientific and Technical Information of China (English)

    周玉龙; 辛运帏; 谷大勇; 陈有祺

    2001-01-01

    This paper reports research on automatic program generation with a natural language interface, combining the fields of natural language processing and software automation. Using object-oriented technology and methods, the system takes as input Chinese text describing the desired functionality of a VC++ program, applies an extended case grammar for syntactic and semantic analysis, and successively performs automatic word segmentation, syntactic processing, semantic analysis and understanding, and automatic generation of the target program. The final output is an executable Visual C++ program that satisfies the user's requirements and conforms to Visual C++ syntax.

  18. Augmenting WFIRST Microlensing with a Ground-Based Telescope Network

    Science.gov (United States)

    Zhu, Wei; Gould, Andrew

    2016-06-01

    Augmenting the Wide Field Infrared Survey Telescope (WFIRST) microlensing campaigns with intensive observations from a ground-based network of wide-field survey telescopes would have several major advantages. First, it would enable full two-dimensional (2-D) vector microlens parallax measurements for a substantial fraction of low-mass lenses as well as planetary and binary events that show caustic crossing features. For a significant fraction of the free-floating planet (FFP) events and all caustic-crossing planetary/binary events, these 2-D parallax measurements directly lead to complete solutions (mass, distance, transverse velocity) of the lens object (or lens system). For even more events, the complementary ground-based observations will yield 1-D parallax measurements. Together with the 1-D parallaxes from WFIRST alone, they can probe the entire mass range M > M_Earth. For luminous lenses, such 1-D parallax measurements can be promoted to complete solutions (mass, distance, transverse velocity) by high-resolution imaging. This would provide crucial information not only about the hosts of planets and other lenses, but also enable a much more precise Galactic model. Other benefits of such a survey include improved understanding of binaries (particularly with low mass primaries), and sensitivity to distant ice-giant and gas-giant companions of WFIRST lenses that cannot be detected by WFIRST itself due to its restricted observing windows. Existing ground-based microlensing surveys can be employed if WFIRST is pointed at lower-extinction fields than is currently envisaged. This would come at some cost to the event rate. Therefore the benefits of improved characterization of lenses must be weighed against these costs.

  19. Comparison of Thermal Structure Results from Venus Express and Ground Based Observations since Vira

    Science.gov (United States)

    Limaye, Sanjay

    2016-07-01

    from different experiments. The temperature structure in the lower thermosphere from disk-resolved ground-based observations (except for one ground-based investigation) is generally consistent with the Venus Express results. The differences between VIRA and the new results between 50-90 km can be attributed to different binning in averaging the data and to the presence of gravity waves. At higher altitudes the range of temperature values (especially in the 120-160 km range) is surprisingly large (> 50 K), and the corresponding range of atmospheric density variations is also large. Reasons for this large variability are not yet fully understood. The results of the comparison of the space- and ground-based observations among themselves and with the VIRA models are being submitted for publication. Analysis of the Venus observations is continuing, and refinements to the results from different experiments are anticipated in the near future. Considerable effort lies ahead to present an update to the VIRA model, both for the vertical structure above 90 km and for the two-dimensional structure, before generating tables. The team acknowledges support from ISSI for the intercomparison effort as well as from the respective host institutions and funding agencies that made this work possible. Several team members acknowledge funding from the European Union Seventh Framework Programme (FP7) under grant agreement n° 606798 (EuroVenus).

  20. The STACEE-32 Ground Based Gamma-ray Detector

    CERN Document Server

    Hanna, D S; Boone, L M; Chantell, M C; Conner, Z; Covault, C E; Dragovan, M; Fortin, P; Gregorich, D T; Hinton, J A; Mukherjee, R; Ong, R A; Oser, S; Ragan, K; Scalzo, R A; Schütte, D R; Theoret, C G; Tümer, T O; Williams, D A; Zweerink, J A

    2002-01-01

    We describe the design and performance of the Solar Tower Atmospheric Cherenkov Effect Experiment detector in its initial configuration (STACEE-32). STACEE is a new ground-based gamma ray detector using the atmospheric Cherenkov technique. In STACEE, the heliostats of a solar energy research array are used to collect and focus the Cherenkov photons produced in gamma-ray induced air showers. The large Cherenkov photon collection area of STACEE results in a gamma-ray energy threshold below that of previous detectors.

  1. The STACEE Ground-Based Gamma-Ray Detector

    CERN Document Server

    Gingrich, D M; Bramel, D; Carson, J; Covault, C E; Fortin, P; Hanna, D S; Hinton, J A; Jarvis, A; Kildea, J; Lindner, T; Müller, C; Mukherjee, R; Ong, R A; Ragan, K; Scalzo, R A; Theoret, C G; Williams, D A; Zweerink, J A

    2005-01-01

    We describe the design and performance of the Solar Tower Atmospheric Cherenkov Effect Experiment (STACEE) in its complete configuration. STACEE uses the heliostats of a solar energy research facility to collect and focus the Cherenkov photons produced in gamma-ray induced air showers. The light is concentrated onto an array of photomultiplier tubes located near the top of a tower. The large Cherenkov photon collection area of STACEE results in a gamma-ray energy threshold below that of previous ground-based detectors. STACEE is being used to observe pulsars, supernova remnants, active galactic nuclei, and gamma-ray bursts.

  2. Research on target accuracy for ground-based lidar

    Science.gov (United States)

    Zhu, Ling; Shi, Ruoming

    2009-05-01

    In ground-based lidar systems, targets are used in the registration and georeferencing of point clouds, and can also serve as check points. Generally, the accuracy of capturing a flat target's center is influenced by the scanning range and scanning angle. In this research, experiments are designed to extract accuracy indices of the target center at 0-90° scan angles and 100-195 m scan ranges using a Leica HDS3000 laser scanner. The experimental data are listed in detail and the related results are analyzed.

  3. Research on DSP automatic code generation technology with the Matlab platform

    Institute of Scientific and Technical Information of China (English)

    王巧明; 李中健; 姜达郁

    2012-01-01

    Since programming DSPs is difficult and time-consuming, a method is proposed that combines Matlab, Code Composer Studio (CCS), their embedded tools, and connection software to generate code automatically. The research focuses on automatic code generation for the DM642 EVM board, and an edge-detection experiment is used to verify the practicality and reliability of the method. The results show that the code-generation method is not only efficient but also flexible; the generated code runs smoothly on the DSP board with good processing results.

  4. Ground-Based Global Positioning System (GPS) Meteorology Integrated Precipitable Water Vapor (IPW)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Ground-Based Global Positioning System (GPS) Meteorology Integrated Precipitable Water Vapor (IPW) data set measures atmospheric water vapor using ground-based...

  5. Statistical Studies of Ground-Based Optical Lightning Signatures

    Science.gov (United States)

    Hunt, C. R.; Nemzek, R. J.; Suszcynsky, D. M.

    2005-12-01

    Most extensive optical studies of lightning have been conducted from orbit, and the statistics of events observed from the ground are relatively poorly documented. The time signatures of optical power measured in the presence of clouds are inevitably affected by scattering, which can distort the signatures by extending and delaying the amplitude profile in time. We have deployed two all-sky photodiode detectors, one in New Mexico and one in Oklahoma, which are gathering data alongside electric field change monitors as part of the LANL EDOTX Great Plains Array. Preliminary results show that the photodiode is sensitive to approximately 50% or more of RF events detected at ranges of up to 30 km, and still has some sensitivity at ranges in excess of 60 km (distances determined by the EDOTX field-change array). The shapes of events within this range were assessed, focusing on rise time, width, peak power, and their correlation with corresponding electric field signatures; these are being compared with published on-orbit and ground-based data. Initial findings suggest a mean characteristic width (ratio of total detected optical energy to peak power) of 291 +/- 12 microseconds and a mean delay between the RF signal peak and the optical peak of 121 +/- 17 microseconds. These values fall between prior ground-based measurements of direct return-stroke emissions and scattering-dominated on-orbit measurements. This work will promote better understanding of the correspondence between radio and optical measurements of lightning.
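    The "characteristic width" defined in the abstract (total detected optical energy divided by peak power) is straightforward to compute from a sampled waveform. The sketch below uses a synthetic Gaussian pulse with an invented 100 µs time constant, purely to illustrate the quantity:

    ```python
    import numpy as np

    # Synthetic optical-power waveform sampled at 1 microsecond (assumed).
    dt = 1e-6
    t = np.arange(0.0, 2000e-6, dt)
    tau = 100e-6
    pulse = np.exp(-((t - 500e-6) / tau) ** 2)   # Gaussian optical pulse

    energy = pulse.sum() * dt                    # integral of power over time
    width = energy / pulse.max()                 # characteristic width, seconds
    print(width * 1e6, "microseconds")
    ```

    For a Gaussian, this width equals tau·sqrt(pi), so scattering that broadens the pulse in time directly inflates the measured characteristic width.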

  6. Network operability of ground-based microwave radiometers: Calibration and standardization efforts

    Science.gov (United States)

    Pospichal, Bernhard; Löhnert, Ulrich; Küchler, Nils; Czekala, Harald

    2017-04-01

    Ground-based microwave radiometers (MWR) are already widely used by national weather services and research institutions all around the world. Most of the instruments operate continuously and are beginning to be implemented into data assimilation for atmospheric models. Especially their potential for continuously observing boundary-layer temperature profiles as well as integrated water vapor and cloud liquid water path makes them valuable for improving short-term weather forecasts. However, until now, most MWR have been operated as stand-alone instruments. In order to benefit from a network of these instruments, standardization of calibration, operation and data format is necessary. In the framework of TOPROF (COST Action ES1303) several efforts have been undertaken, such as uncertainty and bias assessment, or calibration intercomparison campaigns. The goal was to establish protocols for providing quality-controlled (QC) MWR data and their uncertainties. To this end, standardized calibration procedures for MWR have been developed and recommendations for radiometer users compiled. Based on the results of the TOPROF campaigns, a new, high-accuracy liquid-nitrogen calibration load has been introduced for MWR manufactured by Radiometer Physics GmbH (RPG). The new load improves the accuracy of the measurements considerably and will lead to even more reliable atmospheric observations. In addition to the recommendations for set-up, calibration and operation of ground-based MWR within a future network, we will present homogenized methods to determine the accuracy of a running calibration as well as means for automatic data quality control. This sets the stage for the planned microwave calibration center at JOYCE (Jülich Observatory for Cloud Evolution), which will be shortly introduced.

  7. LiDAR The Generation of Automatic Mapping for Buildings, Using High Spatial Resolution Digital Vertical Aerial Photography and LiDAR Point Clouds

    Directory of Open Access Journals (Sweden)

    William Barragán Zaque

    2015-06-01

    Full Text Available The aim of this paper is to generate photogrammetric products and to automatically map buildings in the area of interest in vector format. The research was conducted in Bogotá using high-resolution digital vertical aerial photographs and point clouds obtained with LiDAR technology. Image segmentation was also used, alongside radiometric and geometric digital processing. The process took into account aspects including building height, segmentation algorithms, and spectral band combinations. The results had an effectiveness of 97.2%, validated through ground-truthing.
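    One building-height step the abstract alludes to is commonly implemented by thresholding a normalized digital surface model (nDSM = DSM − DTM) derived from the LiDAR point cloud. The grid values and the 2.5 m threshold below are invented for illustration:

    ```python
    import numpy as np

    # Toy rasters: surface elevations (DSM) and bare-earth elevations (DTM), meters.
    dsm = np.array([[301.0, 301.2, 308.5, 308.6],
                    [301.1, 301.0, 308.4, 308.5],
                    [300.9, 301.0, 301.1, 301.2]])
    dtm = np.full_like(dsm, 301.0)

    # Height above ground; pixels taller than 2.5 m are building candidates.
    ndsm = dsm - dtm
    building_mask = ndsm > 2.5
    print(building_mask.sum())    # number of candidate building pixels
    ```

    In practice the mask would then be refined with segmentation and spectral criteria, as the abstract describes, before vectorizing the building footprints.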

  8. Multiband Gravitational-Wave Astronomy: Parameter Estimation and Tests of General Relativity with Space- and Ground-Based Detectors

    Science.gov (United States)

    Vitale, Salvatore

    2016-07-01

    With the discovery of the binary-black-hole (BBH) coalescence GW150914 the era of gravitational-wave (GW) astronomy has started. It has recently been shown that BBH with masses comparable to or higher than GW150914 would be visible in the Evolved Laser Interferometer Space Antenna (eLISA) band a few years before they finally merge in the band of ground-based detectors. This would allow for premerger electromagnetic alerts, dramatically increasing the chances of a joint detection, if BBHs are indeed luminous in the electromagnetic band. In this Letter we explore a quite different aspect of multiband GW astronomy, and verify if, and to what extent, measurement of masses and sky position with eLISA could improve parameter estimation and tests of general relativity with ground-based detectors. We generate a catalog of 200 BBHs and find that having prior information from eLISA can reduce the uncertainty in the measurement of source distance and primary black hole spin by up to factor of 2 in ground-based GW detectors. The component masses estimate from eLISA will not be refined by the ground based detectors, whereas joint analysis will yield precise characterization of the newly formed black hole and improve consistency tests of general relativity.

  9. Multiband Gravitational-Wave Astronomy: Parameter Estimation and Tests of General Relativity with Space- and Ground-Based Detectors.

    Science.gov (United States)

    Vitale, Salvatore

    2016-07-29

    With the discovery of the binary-black-hole (BBH) coalescence GW150914 the era of gravitational-wave (GW) astronomy has started. It has recently been shown that BBH with masses comparable to or higher than GW150914 would be visible in the Evolved Laser Interferometer Space Antenna (eLISA) band a few years before they finally merge in the band of ground-based detectors. This would allow for premerger electromagnetic alerts, dramatically increasing the chances of a joint detection, if BBHs are indeed luminous in the electromagnetic band. In this Letter we explore a quite different aspect of multiband GW astronomy, and verify if, and to what extent, measurement of masses and sky position with eLISA could improve parameter estimation and tests of general relativity with ground-based detectors. We generate a catalog of 200 BBHs and find that having prior information from eLISA can reduce the uncertainty in the measurement of source distance and primary black hole spin by up to factor of 2 in ground-based GW detectors. The component masses estimate from eLISA will not be refined by the ground based detectors, whereas joint analysis will yield precise characterization of the newly formed black hole and improve consistency tests of general relativity.

  10. Atmospheric contamination for CMB ground-based observations

    CERN Document Server

    Errard, J; Akiba, Y; Arnold, K; Atlas, M; Baccigalupi, C; Barron, D; Boettger, D; Borrill, J; Chapman, S; Chinone, Y; Cukierman, A; Delabrouille, J; Dobbs, M; Ducout, A; Elleflot, T; Fabbian, G; Feng, C; Feeney, S; Gilbert, A; Goeckner-Wald, N; Halverson, N W; Hasegawa, M; Hattori, K; Hazumi, M; Hill, C; Holzapfel, W L; Hori, Y; Inoue, Y; Jaehnig, G C; Jaffe, A H; Jeong, O; Katayama, N; Kaufman, J; Keating, B; Kermish, Z; Keskitalo, R; Kisner, T; Jeune, M Le; Lee, A T; Leitch, E M; Leon, D; Linder, E; Matsuda, F; Matsumura, T; Miller, N J; Myers, M J; Navaroli, M; Nishino, H; Okamura, T; Paar, H; Peloton, J; Poletti, D; Puglisi, G; Rebeiz, G; Reichardt, C L; Richards, P L; Ross, C; Rotermund, K M; Schenck, D E; Sherwin, B D; Siritanasak, P; Smecher, G; Stebor, N; Steinbach, B; Stompor, R; Suzuki, A; Tajima, O; Takakura, S; Tikhomirov, A; Tomaru, T; Whitehorn, N; Wilson, B; Yadav, A; Zahn, O

    2015-01-01

    Atmosphere is one of the most important noise sources for ground-based Cosmic Microwave Background (CMB) experiments. By increasing optical loading on the detectors, it amplifies their effective noise, while its fluctuations introduce spatial and temporal correlations between detected signals. We present a physically motivated 3D model of the atmosphere's total intensity emission at millimeter and sub-millimeter wavelengths. We derive an analytical estimate for the correlation between detectors' time-ordered data as a function of the instrument and survey design, as well as several atmospheric parameters such as wind, relative humidity, temperature and turbulence characteristics. Using numerical computation, we examine the effect of each physical parameter on the correlations in the time series of a given experiment. We then use a parametric-likelihood approach to validate the modeling and estimate atmosphere parameters from the POLARBEAR-I project first season data set. We compare our results to previous st...

  11. Observational Selection Effects with Ground-based Gravitational Wave Detectors

    CERN Document Server

    Chen, Hsin-Yu; Vitale, Salvatore; Holz, Daniel E; Katsavounidis, Erik

    2016-01-01

    Ground-based interferometers are not perfectly all-sky instruments, and it is important to account for their behavior when considering the distribution of detected events. In particular, the LIGO detectors are most sensitive to sources above North America and the Indian Ocean and, as the Earth rotates, the sensitive regions are swept across the sky. However, because the detectors do not acquire data uniformly over time, there is a net bias on detectable sources' right ascensions. Both LIGO detectors preferentially collect data during their local night; it is more than twice as likely to be local midnight than noon when both detectors are operating. We discuss these selection effects and how they impact LIGO's observations and electromagnetic follow-up. Beyond galactic foregrounds associated with seasonal variations, we find that equatorial observatories can access over 80% of the localization probability, while mid-latitudes will access closer to 70%. Facilities located near the two LIGO sites can obser...

  12. Progress in the ULTRA 1-m ground-based telescope

    Science.gov (United States)

    Romeo, Robert C.; Martin, Robert N.; Twarog, Bruce; Anthony-Twarog, Barbara; Taghavi, Ray; Hale, Rick; Etzel, Paul; Fesen, Rob; Shawl, Steve

    2006-06-01

    We present the technical status of the Ultra Lightweight Telescope for Research in Astronomy (ULTRA) program. The program is a 3-year Major Research Instrumentation (MRI) program funded by NSF. The MRI is a collaborative effort involving Composite Mirror Applications, Inc. (CMA), University of Kansas, San Diego State University and Dartmouth College. Objectives are to demonstrate the feasibility of carbon fiber reinforced plastic (CFRP) composite mirror technology for ground-based optical telescopes. CMA is spearheading the development of surface replication techniques to produce the optics, fabricating the 1m glass mandrel, and constructing the optical tube assembly (OTA). Presented will be an overview and status of the 1-m mandrel fabrication, optics development, telescope design and CFRP telescope fabrication by CMA for the ULTRA Telescope.

  13. Ground-based optical observation system for LEO objects

    Science.gov (United States)

    Yanagisawa, T.; Kurosaki, H.; Oda, H.; Tagawa, M.

    2015-08-01

    We propose a ground-based optical observation system for monitoring LEO objects, which uses numerous optical sensors to cover a vast region of the sky. Its potential in terms of detection and orbit determination was examined. LEO objects of about 30 cm at 1000 km altitude are detectable using an 18 cm telescope, a CCD camera, and the analysis software developed. Simulations and a test observation showed that two longitudinally separated observation sites with arrays of optical sensors can identify the same objects from numerous data sets and determine their orbits precisely. The proposed system may complement or replace the current radar observation system for monitoring LEO objects, for purposes such as space situational awareness, in the near future.
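    The core of combining two longitudinally separated sites is triangulation from simultaneous line-of-sight directions. The sketch below solves the toy geometric problem (least-squares closest point between two observation rays) in a simplified flat coordinate frame with invented positions; it is not the paper's full orbit-determination pipeline:

    ```python
    import numpy as np

    # Two sites and a true object position in a local Cartesian frame, km.
    site1 = np.array([0.0, 0.0, 0.0])
    site2 = np.array([1000.0, 0.0, 0.0])        # longitudinally separated
    target = np.array([400.0, 200.0, 1000.0])   # "unknown" LEO object

    # Each site only measures a unit line-of-sight direction.
    d1 = target - site1; d1 /= np.linalg.norm(d1)
    d2 = target - site2; d2 /= np.linalg.norm(d2)

    # Least-squares intersection: solve t1*d1 - t2*d2 = site2 - site1.
    A = np.array([d1, -d2]).T                   # 3x2 system
    b = site2 - site1
    t1, t2 = np.linalg.lstsq(A, b, rcond=None)[0]
    estimate = 0.5 * ((site1 + t1 * d1) + (site2 + t2 * d2))
    print(estimate)
    ```

    With noisy directions the two rays no longer intersect, and the midpoint of their closest approach is the natural position estimate; a sequence of such fixes then feeds orbit determination.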

  14. Identification of rainy periods from ground based microwave radiometry

    Directory of Open Access Journals (Sweden)

    Ada Vittoria Bosisio

    2012-03-01

    Full Text Available In this paper the authors present the results of a study aimed at detecting rainy data in measurements collected by a dual-band ground-based radiometer. The proposed criterion is based on the ratio of the brightness temperatures observed in the 20-30 GHz band, without the need for any ancillary information. A major result, obtained from the probability density of the ratio computed over one month of data, is the identification of threshold values separating clear sky, cloudy sky and rainy sky, respectively. A linear fit performed using radiometric data and concurrent rain gauge measurements shows a correlation coefficient of 0.56 between the temperature ratio and the observed precipitation.
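    The classification idea reduces to thresholding a brightness-temperature ratio. The sketch below uses invented threshold values and invented brightness temperatures; the paper derives its actual thresholds from the probability density of one month of data:

    ```python
    # Classify sky state from the ratio of brightness temperatures in the
    # 20-30 GHz band. Thresholds are illustrative assumptions, not the
    # values identified in the paper.
    def classify_sky(tb20_k, tb30_k, clear_max=1.10, cloudy_max=1.45):
        ratio = tb30_k / tb20_k
        if ratio < clear_max:
            return "clear"
        elif ratio < cloudy_max:
            return "cloudy"
        return "rainy"

    print(classify_sky(30.0, 31.0))   # low ratio: clear sky
    print(classify_sky(40.0, 62.0))   # high ratio: rain
    ```

    The appeal of a ratio criterion is that it needs no ancillary data: both channels see the same antenna and calibration, so common-mode effects largely cancel.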

  15. Optical vortex coronagraphs on ground-based telescopes

    CERN Document Server

    Jenkins, Charles

    2007-01-01

    The optical vortex coronagraph is potentially a remarkably effective device, at least for an ideal unobstructed telescope. Most ground-based telescopes, however, suffer from central obscuration and also have to operate through the aberrations of the turbulent atmosphere. This note analyzes the performance of the optical vortex in these circumstances and compares it with some other designs, showing that it performs similarly in this situation. There is a large class of coronagraphs of this general type, and choosing between them in particular applications depends on details of performance at small off-axis distances and uniformity of response in the focal plane. Issues of manufacturability to the necessary tolerances are also likely to be important.

  16. Observational Selection Effects with Ground-based Gravitational Wave Detectors

    Science.gov (United States)

    Chen, Hsin-Yu; Essick, Reed; Vitale, Salvatore; Holz, Daniel; Katsavounidis, Erik

    2017-01-01

    Ground-based interferometers are not perfectly all-sky instruments, and it is important to account for their behavior when considering the distribution of detected events. In particular, the LIGO detectors are most sensitive to sources above North America and the Indian Ocean and, as the Earth rotates, the sensitive regions are swept across the sky. However, because the detectors do not acquire data uniformly over time, there is a net bias on detectable sources' right ascensions. Both LIGO detectors preferentially collect data during their local night; it is more than twice as likely to be local midnight than noon when both detectors are operating. We discuss these selection effects and how they impact LIGO's observations and electromagnetic follow-up. These effects can inform electromagnetic follow-up activities and optimization, including the possibility of directing observations even before gravitational-wave events occur.

  17. Unique cell culture systems for ground based research

    Science.gov (United States)

    Lewis, Marian L.

    1990-01-01

    The horizontally rotating fluid-filled, membrane oxygenated bioreactors developed at NASA Johnson for spacecraft applications provide a powerful tool for ground-based research. Three-dimensional aggregates formed by cells cultured on microcarrier beads are useful for study of cell-cell interactions and tissue development. By comparing electron micrographs of plant seedlings germinated during Shuttle flight 61-C and in an earth-based rotating bioreactor it is shown that some effects of microgravity are mimicked. Bioreactors used in the UAH Bioreactor Laboratory will make it possible to determine some of the effects of altered gravity at the cellular level. Bioreactors can be valuable for performing critical, preliminary-to-spaceflight experiments as well as medical investigations such as in vitro tumor cell growth and chemotherapeutic drug response; the enrichment of stem cells from bone marrow; and the effect of altered gravity on bone and muscle cell growth and function and immune response depression.

  18. Spatial-angular modeling of ground-based biaxial lidar

    Science.gov (United States)

    Agishev, Ravil R.

    1997-10-01

    Results of spatial-angular lidar modeling based on an introduced efficiency criterion are presented. Their analysis shows that the low spatial-angular efficiency of traditional VIS and NIR systems is the main cause of a low signal-to-background ratio (S/BR) at the photodetector input. This leads to considerable measurement errors and, consequently, to low accuracy in the retrieval of atmospheric optical parameters. As we have shown, the most effective protection against intense sky background radiation for ground-based biaxial lidars consists in shaping their angular field according to the spatial-angular efficiency criterion G. Some effective approaches to achieving high values of the G parameter in optimizing the receiving system are discussed.

  19. A new tool for rapid and automatic estimation of earthquake source parameters and generation of seismic bulletins

    Science.gov (United States)

    Zollo, Aldo

    2016-04-01

    RISS S.r.l. is a spin-off company recently created by the research group constituting the Seismology Laboratory of the Department of Physics of the University of Naples Federico II. RISS is an innovative start-up, based on the decade-long experience of its members in earthquake monitoring systems and seismic data analysis, and has the major goal of transforming the most recent innovations of scientific research into technological products and prototypes. With this aim, RISS has recently started the development of new software, an elegant solution to manage and analyse seismic data and to create automatic earthquake bulletins. The software was initially developed to manage data recorded at the ISNet network (Irpinia Seismic Network), a network of seismic stations deployed in the Southern Apennines along the active fault system responsible for the 1980, November 23, MS 6.9 Irpinia earthquake. The software, however, is fully exportable and can be used to manage data from different networks, with any kind of station geometry or network configuration, and is able to provide reliable estimates of earthquake source parameters, whatever the background seismicity level of the area of interest. Here we present the real-time automated procedures and the analyses performed by the software package, which is essentially a chain of different modules, each aimed at the automatic computation of a specific source parameter. The P-wave arrival times are first detected on the real-time streaming of data, and then the software performs the phase association and earthquake binding. As soon as an event is automatically detected by the binder, the earthquake location coordinates and the origin time are rapidly estimated, using a probabilistic, non-linear exploration algorithm. Then, the software is able to automatically provide three different magnitude estimates.
First, the local magnitude (Ml) is computed, using the peak-to-peak amplitude
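    A local magnitude of the kind mentioned above is conventionally computed from the peak amplitude on a simulated Wood-Anderson seismogram plus a distance correction. The sketch below uses the widely cited Hutton and Boore (1987) Southern California attenuation terms as a stand-in; a regional network such as ISNet would calibrate its own coefficients:

    ```python
    import math

    # Generic local magnitude: Ml = log10(A) + distance correction, with A the
    # Wood-Anderson amplitude in mm and R the hypocentral distance in km.
    # Coefficients: Hutton & Boore (1987), used here only as an example.
    def local_magnitude(amp_mm, dist_km):
        return (math.log10(amp_mm)
                + 1.110 * math.log10(dist_km / 100.0)
                + 0.00189 * (dist_km - 100.0)
                + 3.0)

    # By construction, 1 mm at 100 km corresponds to Ml = 3.0.
    ml = local_magnitude(amp_mm=1.0, dist_km=100.0)
    print(round(ml, 2))
    ```

    Averaging such single-station estimates over the network, with outlier rejection, gives the event magnitude reported in an automatic bulletin.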

  20. MO-G-BRE-04: Automatic Verification of Daily Treatment Deliveries and Generation of Daily Treatment Reports for a MR Image-Guided Treatment Machine

    Energy Technology Data Exchange (ETDEWEB)

    Yang, D; Li, X; Li, H; Wooten, H; Green, O; Rodriguez, V; Mutic, S [Washington University School of Medicine, St. Louis, MO (United States)

    2014-06-15

    Purpose: Two aims of this work were to develop a method to automatically verify treatment delivery accuracy immediately after patient treatment and to develop a comprehensive daily treatment report to provide all required information for daily MR-IGRT review. Methods: After systematically analyzing the requirements for treatment delivery verification and understanding the available information from a novel MR-IGRT treatment machine, we designed a method to use 1) treatment plan files, 2) delivery log files, and 3) dosimetric calibration information to verify the accuracy and completeness of daily treatment deliveries. The method verifies the correctness of delivered treatment plans and beams, beam segments, and for each segment, the beam-on time and MLC leaf positions. Composite primary fluence maps are calculated from the MLC leaf positions and the beam-on time. Error statistics are calculated on the fluence difference maps between the plan and the delivery. We also designed the daily treatment delivery report by including all required information for MR-IGRT and physics weekly review - the plan and treatment fraction information, dose verification information, daily patient setup screen captures, and the treatment delivery verification results. Results: The parameters in the log files (e.g. MLC positions) were independently verified and deemed accurate and trustable. A computer program was developed to implement the automatic delivery verification and daily report generation. The program was tested and clinically commissioned with sufficient IMRT and 3D treatment delivery data. The final version has been integrated into a commercial MR-IGRT treatment delivery system. Conclusion: A method was developed to automatically verify MR-IGRT treatment deliveries and generate daily treatment reports. Already in clinical use since December 2013, the system is able to facilitate delivery error detection, and expedite physician daily IGRT review and physicist weekly chart
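    The verification step described above (composite fluence maps from MLC leaf positions and beam-on times, then error statistics on the plan-minus-delivery difference) can be sketched as follows. Aperture shapes, beam-on times, and the simple binary-aperture fluence model are all invented for illustration:

    ```python
    import numpy as np

    # Accumulate a composite fluence map from segment apertures weighted by
    # beam-on time (toy model: fluence proportional to open-aperture time).
    def fluence(apertures, times, shape=(10, 10)):
        f = np.zeros(shape)
        for mask, t in zip(apertures, times):
            f += mask * t
        return f

    planned = np.zeros((10, 10)); planned[2:8, 3:7] = 1.0
    plan = fluence([planned], [10.0])          # planned segment, 10 s beam-on

    delivered = np.zeros((10, 10)); delivered[2:8, 3:7] = 1.0
    delivery = fluence([delivered], [9.9])     # logged 1% beam-on-time error

    # Error statistics on the fluence difference map.
    diff = delivery - plan
    max_err = np.abs(diff).max()
    rms_err = np.sqrt((diff ** 2).mean())
    print(max_err, rms_err)
    ```

    Thresholds on such statistics are what would trigger a flag in the daily report; real systems also model leaf transmission and output calibration rather than a binary aperture.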

  1. Fully automatized renal parenchyma volumetry using a support vector machine based recognition system for subject-specific probability map generation in native MR volume data

    Science.gov (United States)

    Gloger, Oliver; Tönnies, Klaus; Mensel, Birger; Völzke, Henry

    2015-11-01

    In epidemiological studies, as well as in clinical practice, the amount of medical image data produced has increased strongly in the last decade. In this context, organ segmentation in MR volume data has gained increasing attention for medical applications. Especially in large-scale population-based studies, organ volumetry is highly relevant and requires exact organ segmentation. Since manual segmentation is time-consuming and prone to reader variability, large-scale studies need automated methods to perform organ segmentation. Fully automatic organ segmentation in native MR image data has proven to be a very challenging task. Imaging artifacts as well as inter- and intrasubject MR-intensity differences complicate the application of supervised learning strategies. Thus, we propose a modularized framework with a two-step probabilistic approach that generates subject-specific probability maps for renal parenchyma tissue, which are subsequently refined using several extended segmentation strategies. We present a three-class support vector machine recognition system that incorporates Fourier descriptors as shape features to recognize and segment characteristic parenchyma parts. Probabilistic methods use the segmented characteristic parenchyma parts to generate high-quality subject-specific parenchyma probability maps. Several refinement strategies, including a final shape-based 3D level-set segmentation technique, are used in subsequent processing modules to segment renal parenchyma. Furthermore, our framework recognizes and excludes renal cysts from the parenchymal volume, which is important for analyzing renal function. Volume errors and Dice coefficients show that our presented framework outperforms existing approaches.
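    Fourier descriptors, the shape features fed to the SVM in the abstract, are obtained by taking the discrete Fourier transform of a closed contour expressed as complex numbers, then normalizing away position and scale. The toy ellipse below is an invented example, not renal data:

    ```python
    import numpy as np

    # Closed contour as complex samples x + i*y (toy ellipse, 128 points).
    theta = np.linspace(0.0, 2.0 * np.pi, 128, endpoint=False)
    contour = 3.0 * np.cos(theta) + 1j * 1.5 * np.sin(theta)

    coeffs = np.fft.fft(contour)
    # Translation invariance: drop the DC term (coeffs[0]);
    # scale invariance: divide magnitudes by the first harmonic.
    descriptors = np.abs(coeffs[1:8]) / np.abs(coeffs[1])
    print(descriptors)
    ```

    Truncating to the first few normalized magnitudes gives a compact, smoothness-preserving shape signature, which is what makes these descriptors convenient SVM inputs.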

  2. Development of a Ground-Based Atmospheric Monitoring Network for the Global Mercury Observation System (GMOS)

    Directory of Open Access Journals (Sweden)

    Sprovieri F.

    2013-04-01

    Full Text Available Consistent, high-quality measurements of atmospheric mercury (Hg are necessary in order to better understand Hg emissions, transport, and deposition on a global scale. Although the number of atmospheric Hg monitoring stations has increased in recent years, the available measurement database is limited and there are many regions of the world where measurements have not been extensively performed. Long-term atmospheric Hg monitoring and additional ground-based monitoring sites are needed in order to generate datasets that will offer new insight and information about the global-scale trends of atmospheric Hg emissions and deposition. In the framework of the Global Mercury Observation System (GMOS) project, a coordinated global observational network for atmospheric Hg is being established. The overall research strategy of GMOS is to develop a state-of-the-art observation system able to provide information on the concentration of Hg species in ambient air and precipitation on the global scale. This network is being developed by integrating previously established ground-based atmospheric Hg monitoring stations with newly established GMOS sites that are located both at high-altitude and sea-level locations, as well as in climatically diverse regions. Through the collection of consistent, high-quality atmospheric Hg measurement data, we seek to create a comprehensive assessment of atmospheric Hg concentrations and their dependence on meteorology, long-range atmospheric transport and atmospheric emissions.

  3. Understanding the Laminar Distribution of Tropospheric Ozone from Ground-Based, Airborne, Spaceborne, and Modeling Perspectives

    Science.gov (United States)

    Newchurch, Mike; Johnson, Matthew S.; Huang, Guanyu; Kuang, Shi; Wang, Lihua; Chance, Kelly; Liu, Xiong

    2016-01-01

    Laminar ozone structure is a ubiquitous feature of tropospheric-ozone distributions, resulting from dynamic and chemical atmospheric processes. Understanding the characteristics of these ozone laminae and the mechanisms responsible for producing them is important for outlining the transport pathways of trace gases and for quantifying the impact of different sources on tropospheric background ozone. In this study, we present a new method to detect ozone laminae and to understand the climatology of their occurrence frequency in terms of thickness and altitude. We employ both ground-based and airborne ozone lidar measurements, together with other synergistic observations and modeling, to investigate the sources and mechanisms, such as biomass-burning transport, stratospheric intrusion, lightning-generated NOx, and nocturnal low-level jets, that are responsible for depleted or enhanced tropospheric ozone layers. Spaceborne measurements of these laminae (e.g., OMI (Ozone Monitoring Instrument), TROPOMI (Tropospheric Monitoring Instrument), TEMPO (Tropospheric Emissions: Monitoring of Pollution)) will cover a greater horizontal extent, but at lower vertical resolution, than balloon-borne or lidar measurements. Using integrated ground-based, airborne, and spaceborne observations in a modeling framework affords insight into both the vertical and horizontal evolution of these ubiquitous ozone laminae.
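    A minimal lamina detector flags contiguous altitude ranges where the ozone profile deviates from a smooth background by more than a threshold. The profile, background, and 10 ppbv threshold below are all invented to illustrate the idea, not the paper's method:

    ```python
    import numpy as np

    # Toy ozone profile: smooth background plus one injected enhanced lamina.
    z = np.arange(0.0, 12.0, 0.1)               # altitude, km
    background = 50.0 + 2.0 * z                 # ppbv, smooth climatology
    profile = background.copy()
    profile[(z > 4.0) & (z < 5.0)] += 30.0      # enhancement between 4 and 5 km

    # Flag points deviating by more than 10 ppbv, then find layer edges.
    anomaly = profile - background
    lamina = np.abs(anomaly) > 10.0
    edges = np.flatnonzero(np.diff(lamina.astype(int)))
    print(z[edges[0] + 1], z[edges[-1]])        # approximate bottom/top, km
    ```

    Recording the bottom, top, and altitude of each flagged layer over many profiles yields exactly the thickness-versus-altitude occurrence statistics the abstract describes.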

  4. Model-Driven Automatic Generation of Web Application Systems

    Institute of Scientific and Technical Information of China (English)

    王海林

    2012-01-01

    To improve the efficiency of Web application development, an approach for model-driven automatic generation of Web applications is proposed. The approach uses MetaEdit+ as the meta-modeling tool. The first step is to build Web application meta-models and to customize a DSL; the next is to build Web application domain models with the DSL. Then, using MERL, the generator definition language provided by MetaEdit+, software developers can conveniently design the JSP, Servlet, Javabeans, and database generators that a Web application system needs. These generators produce the whole Web application system directly from the graphical Web application models. Finally, the approach to model-driven generation of Web applications is introduced in detail through an instance named WebShopping. Testing indicates that the generated Web application runs correctly on a Web application server under the Windows operating system.
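    The essence of such a generator is walking a domain model and emitting code from templates. The toy below uses a plain dict in place of a MetaEdit+ model and emits a Javabean-style class, in the spirit of the abstract's Javabeans generator; every name in it is invented:

    ```python
    # Toy "domain model": an entity with typed fields (stands in for the
    # graphical model a tool like MetaEdit+ would hold).
    model = {"entity": "Product",
             "fields": [("name", "String"), ("price", "double")]}

    # Template-style generator emitting a Java bean from the model.
    def generate_bean(m):
        lines = [f"public class {m['entity']} {{"]
        for fname, ftype in m["fields"]:
            lines.append(f"    private {ftype} {fname};")
        for fname, ftype in m["fields"]:
            cap = fname.capitalize()
            lines.append(f"    public {ftype} get{cap}() {{ return {fname}; }}")
            lines.append(f"    public void set{cap}({ftype} v) {{ {fname} = v; }}")
        lines.append("}")
        return "\n".join(lines)

    print(generate_bean(model))
    ```

    A full system chains several such generators (JSP, Servlet, persistence layer) over one shared model, which is what keeps the generated artifacts mutually consistent.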

  5. Some Behavioral Considerations on the GPS4GEF Cloud-Based Generator of Evaluation Forms with Automatic Feedback and References to Interactive Support Content

    Directory of Open Access Journals (Sweden)

    Daniel HOMOCIANU

    2015-01-01

    Full Text Available The paper presents some considerations on a previously defined general-purpose system used to dynamically generate online evaluation forms that provide automatic feedback immediately after responses are submitted. The system works with a simple, well-known data-source format able to store questions, answers, and links to additional support materials, in order to increase the productivity of evaluation and assessment. Beyond a short description of the prototype's components and the advantages and limitations of its use for anyone involved in assessment and evaluation processes, the paper promotes combining such a system with a simple technique for generating and referencing interactive support content, cited within the paper and defined together with the LIVES4IT approach. This type of content consists of scenarios with ad hoc documentation and interactive simulation components, useful for emulating concrete examples of working with real-world objects, operating devices, or using software applications from any field of activity.
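    The pattern of a form generator with automatic feedback is simple: questions, correct answers, and support-material links live in a plain data source, and grading returns per-question feedback that references support content on error. Everything below (CSV layout, questions, example.org links) is invented for illustration:

    ```python
    import csv
    import io

    # Toy data source: question, correct answer, and support-material link.
    SOURCE = """question,correct,support
    What does GPS stand for?,Global Positioning System,http://example.org/gps
    2+2?,4,http://example.org/math
    """

    def grade(responses):
        """Return (question, verdict, support link or None) per response."""
        feedback = []
        rows = csv.DictReader(io.StringIO(SOURCE.replace("\n    ", "\n")))
        for row, given in zip(rows, responses):
            if given.strip().lower() == row["correct"].strip().lower():
                feedback.append((row["question"], "correct", None))
            else:
                feedback.append((row["question"], "incorrect", row["support"]))
        return feedback

    results = grade(["global positioning system", "5"])
    print(results)
    ```

    Pointing the learner at the support link only on wrong answers is what turns a plain quiz into the feedback-plus-support loop the abstract advocates.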

  6. Long-term ionospheric anomaly monitoring for ground based augmentation systems

    Science.gov (United States)

    Jung, Sungwook; Lee, Jiyun

    2012-08-01

    Extreme ionospheric anomalies can pose a potential integrity threat to ground-based augmentation of the Global Positioning System (GPS), and thus the development of ionospheric anomaly threat models for each region of operation is essential for system design and operation. This paper presents a methodology for automated long-term ionospheric anomaly monitoring, which will be used to build an ionospheric anomaly threat model, evaluate its validity over the life cycle of the system, continuously monitor ionospheric anomalies, and update the threat model if necessary. This procedure automatically processes GPS data collected from external networks and estimates ionospheric gradients at regular intervals. If ionospheric gradients large enough to be potentially hazardous to users are identified, manual data examination is triggered. This paper also develops a simplified truth processing method to create precise ionospheric delay estimates in near real-time, which is the key to automating the ionospheric monitoring procedure. The performance of the method is examined using data from the 20 November 2003 and 9 November 2004 ionospheric storms. These results demonstrate the effectiveness of simplified truth processing within long-term ionosphere monitoring. From the case studies, the automated procedure successfully identified extreme ionospheric anomalies, including the two worst ionospheric gradients observed and validated previously based on manual analysis. The automation of data processing enables us to analyze ionospheric data continuously going forward and to more accurately categorize ionospheric behavior under both nominal and anomalous conditions.
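    The central quantity monitored in the procedure above is the spatial ionospheric gradient: the difference between slant ionospheric delays seen by two nearby receivers to the same satellite, divided by their baseline, conventionally reported in mm/km. The delay values below are invented, not from the 2003/2004 storm data:

    ```python
    # Slant ionospheric delay estimates for the same satellite at two
    # nearby GPS stations (illustrative numbers, meters).
    delay_a_m = 22.40
    delay_b_m = 18.15
    baseline_km = 10.0           # station separation

    # Spatial gradient in the standard GBAS unit of mm/km.
    gradient_mm_per_km = abs(delay_a_m - delay_b_m) * 1000.0 / baseline_km
    print(gradient_mm_per_km)
    ```

    Nominal gradients are a few mm/km; values in the hundreds of mm/km, as in this synthetic example, are the kind of storm-scale anomaly that would trigger the manual data examination the paper describes.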

  7. Operational optical turbulence forecast for the Service Mode of top-class ground based telescopes

    CERN Document Server

    Masciadri, E; Turchi, A; Fini, L

    2016-01-01

    In this contribution we present the most relevant results obtained in the context of a feasibility study (MOSE) undertaken for ESO. The principal aim of the project was to quantify the performance of a mesoscale model (Astro-Meso-NH code) in forecasting all the main atmospheric parameters relevant to ground-based astronomical observations and the optical turbulence (CN2 and associated integrated astroclimatic parameters) above Cerro Paranal (site of the VLT) and Cerro Armazones (site of the E-ELT). A detailed analysis of the success scores of the system's predictive capacities has been carried out for all the astroclimatic as well as the atmospheric parameters. Considering the excellent results we obtained, this study demonstrated the feasibility of implementing on these two sites an automatic system, run nightly in an operational configuration, to support the scheduling of scientific programs as well as of astronomical facilities (particularly those supported by AO systems) of the VLT a...

  8. Atmospheric aerosol characterization with a ground-based SPEX spectropolarimetric instrument

    Directory of Open Access Journals (Sweden)

    G. van Harten

    2014-06-01

    Full Text Available Characterization of atmospheric aerosols is important for understanding their impact on health and climate. A wealth of aerosol parameters can be retrieved from multi-angle, multi-wavelength radiance and polarization measurements of the clear sky. We developed a ground-based SPEX instrument (groundSPEX for accurate spectropolarimetry, based on the passive, robust, athermal and snapshot spectral polarization modulation technique, and hence ideal for field deployment. It samples the scattering phase function in the principal plane in an automated fashion, using a motorized pan/tilt unit and automatic exposure time detection. Extensive radiometric and polarimetric calibrations were performed, yielding values for both random noise and systematic uncertainties. The absolute polarimetric accuracy at low degrees of polarization is established to be ~5 × 10^-3. About 70 measurement sequences have been performed throughout four clear-sky days at Cabauw, the Netherlands. Several aerosol parameters were retrieved: aerosol optical thickness, effective radius, and complex refractive index for fine and coarse mode. The results are in good agreement with the co-located AERONET products, with a correlation coefficient of ρ = 0.932 for the total aerosol optical thickness at 550 nm.

  9. Probing Pluto's Atmosphere Using Ground-Based Stellar Occultations

    Science.gov (United States)

    Sicardy, Bruno; Rio de Janeiro Occultation Team, Granada Team, International Occultation and Timing Association, Royal Astronomical Society New Zealand Occultation Section, Lucky Star associated Teams

    2016-10-01

    Over the last three decades, some twenty stellar occultations by Pluto have been monitored from Earth. They occur when the dwarf planet blocks the light from a star for a few minutes as it moves across the sky. Such an event provided the first hint of an atmosphere around Pluto in 1985, which was fully confirmed during another occultation in 1988, but it was only in 2002 that a new occultation could be recorded. From then on, the dwarf planet started to move in front of the galactic center, which amplified by a large factor the number of events observable per year. Pluto occultations are essentially refractive events during which the stellar rays are bent by the tenuous atmosphere, causing a gradual dimming of the star. This provides the density, pressure and temperature profiles of the atmosphere from a few kilometers above the surface up to about 250 km altitude, corresponding respectively to pressure levels of about 10 and 0.1 μbar. Moreover, the extremely fine spatial resolution (a few km) obtained through this technique allows the detection of atmospheric gravity waves, and permits in principle the detection of hazes, if present. Several aspects make Pluto stellar occultations quite special: first, they are the only way to probe Pluto's atmosphere in detail, as the dwarf planet is far too small on the sky and the atmosphere is far too tenuous to be directly imaged from Earth. Second, they are an excellent example of participative science, as many amateurs have been able to record those events worldwide with valuable scientific returns, in collaboration with professional astronomers. Third, they reveal Pluto's climatic changes on decadal timescales and constrain the various seasonal models currently explored. Finally, those observations are fully complementary to space exploration, in particular the New Horizons (NH) mission. I will show how ground-based occultations helped to better calibrate some NH profiles, and conversely, how NH results provide some key boundary conditions

  10. Independent Component Analyses of Ground-based Exoplanetary Transits

    Science.gov (United States)

    Silva Martins-Filho, Walter; Griffith, Caitlin Ann; Pearson, Kyle; Waldmann, Ingo; Biddle, Lauren; Zellem, Robert Thomas; Alvarez-Candal, Alvaro

    2016-10-01

    Most observations of exoplanetary atmospheres are conducted when a "Hot Jupiter" exoplanet transits in front of its host star. These Jovian-sized planets have small orbital periods, on the order of days, and therefore a short transit time, making them more amenable to observations. Measurements of Hot Jupiter transits must achieve a 10^-4 level of accuracy in the flux to determine the spectral modulations of the exoplanetary atmosphere. In order to accomplish this level of precision, we need to extract systematic errors, and, for ground-based measurements, the effects of Earth's atmosphere, from the signal due to the exoplanet, which is several orders of magnitude smaller. Currently, the effects of the terrestrial atmosphere and some of the time-dependent systematic errors are treated by dividing the host star by a reference star at each wavelength and time step of the transit. More recently, Independent Component Analyses (ICA) have been used to remove systematic effects from the raw data of space-based observations (Waldmann 2012, 2014; Morello et al., 2015, 2016). ICA is a statistical method born from the ideas of blind-source separation studies, which can be used to de-trend several independent source signals of a data set (Hyvarinen and Oja, 2000). One strength of this method is that it requires no additional prior knowledge of the system. Here, we present a study of the application of ICA to ground-based transit observations of extrasolar planets, which are affected by Earth's atmosphere. We analyze photometric data of two extrasolar planets, WASP-1b and GJ3470b, recorded by the 61" Kuiper Telescope of Steward Observatory using the Harris B and U filters. The presentation will compare the light curve depths and their dispersions as derived from the ICA analysis to those derived by analyses that ratio the host star to nearby reference stars. References: Waldmann, I. P. 2012 ApJ, 747, 12; Waldmann, I. P. 2014 ApJ, 780, 23; Morello G. 2015 ApJ, 806
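
    The de-trending idea described above can be illustrated with a toy example: several simultaneously observed time series share a common systematic trend plus an independent transit signal, and ICA separates the two without prior knowledge of either. This is a minimal sketch on synthetic data, not the authors' pipeline; the mixing coefficients and signal shapes are invented for the illustration.

    ```python
    # Minimal ICA de-trending sketch on synthetic transit photometry.
    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(0)
    t = np.linspace(0, 1, 500)
    transit = np.where((t > 0.4) & (t < 0.6), -1.0, 0.0)   # box-shaped dip
    airmass = 0.5 * np.sin(2 * np.pi * t)                  # shared systematic

    # Three "stars": different mixtures of the two source signals + noise
    X = np.column_stack([
        1.0 * transit + 0.8 * airmass,
        0.2 * transit + 1.0 * airmass,
        0.6 * transit + 0.5 * airmass,
    ]) + 0.01 * rng.standard_normal((500, 3))

    ica = FastICA(n_components=2, random_state=0)
    sources = ica.fit_transform(X)   # columns: recovered independent signals

    # The recovered component most correlated with the true transit shape
    corr = [abs(np.corrcoef(sources[:, i], transit)[0, 1]) for i in range(2)]
    print(f"best match correlation: {max(corr):.2f}")
    ```

    The sign and scale of ICA components are arbitrary, which is why the comparison uses the absolute correlation.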

  11. Research on Automatic Generation Technology of General Crystal Report

    Institute of Scientific and Technical Information of China (English)

    丛凤侠; 杨玉强

    2013-01-01

    Crystal reports have long production cycles and are difficult to maintain, making it hard to meet users' individualized and changing needs. This paper studies automatic generation technology for them. The design idea is to have a large database support the front-end program: the report's appearance, structure, and procedures are stored in the database, and reports are generated automatically at run time from this information. First, the interface is designed, including the report header, page header, details, page footer, and report footer sections; second, the database is designed, covering conceptual structure design and logical structure design; finally, the key programs are designed, including the main program, the field-setting routine, and the statistics-setting routine. The automatic generation technology improves the labor productivity of software development and changes the traditional mode of software development.
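
    The core of the approach is that the report's layout lives in data rather than in code. A minimal sketch of that idea, with a dict standing in for the database tables and all field names invented for the illustration:

    ```python
    # Database-driven report rendering: the layout is metadata, the
    # renderer is generic. Names and fields are illustrative only.
    report_def = {
        "title": "Monthly Sales",
        "header": ["Item", "Qty", "Total"],
        "fields": ["item", "qty", "total"],   # keys into each data row
    }

    rows = [{"item": "A", "qty": 3, "total": 30.0},
            {"item": "B", "qty": 1, "total": 12.5}]

    def render(report_def, rows):
        """Render a text report purely from stored layout metadata."""
        out = [report_def["title"], " | ".join(report_def["header"])]
        for r in rows:
            out.append(" | ".join(str(r[f]) for f in report_def["fields"]))
        return "\n".join(out)

    print(render(report_def, rows))
    ```

    Changing the report then means editing the stored definition, not the program, which is the productivity gain the abstract describes.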

  12. Semi-automatic ground truth generation using unsupervised clustering and limited manual labeling: Application to handwritten character recognition.

    Science.gov (United States)

    Vajda, Szilárd; Rangoni, Yves; Cecotti, Hubert

    2015-06-01

    For training supervised classifiers to recognize different patterns, large data collections with accurate labels are necessary. In this paper, we propose a generic, semi-automatic labeling technique for large handwritten character collections. In order to speed up the creation of a large-scale ground truth, the method combines unsupervised clustering and minimal expert knowledge. To exploit the potential discriminant complementarities across features, each character is projected into five different feature spaces. After clustering the images in each feature space, the human expert labels the cluster centers. Each data point inherits the label of its cluster's center. A majority (or unanimity) vote decides the label of each character image. The amount of human involvement (labeling) is strictly controlled by the number of clusters produced by the chosen clustering approach. To test the efficiency of the proposed approach, we compared and evaluated three state-of-the-art clustering methods (k-means, self-organizing maps, and growing neural gas) on the MNIST digit data set and a Lampung Indonesian character data set, respectively. Considering a k-nn classifier, we show that manually labeling only 1.3% (MNIST) and 3.2% (Lampung) of the training data provides the same range of performance as a completely labeled data set would.
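
    The labeling scheme above can be sketched end to end: cluster in each feature space, let the "expert" label only the cluster centers, propagate each center's label to the cluster's members, then take a majority vote across feature spaces. The sketch below uses k-means on synthetic 1-D features standing in for the five real feature sets, and simulates the expert by looking up the true label of the sample nearest each center; everything here is illustrative, not the paper's implementation.

    ```python
    # Semi-automatic labeling sketch: cluster -> label centers -> propagate
    # -> majority vote. Synthetic two-class data; the true_labels array
    # plays the role of the human expert's knowledge.
    import numpy as np
    from collections import Counter
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(1)
    true_labels = np.repeat([0, 1], 50)      # two "character classes"

    def label_via_clustering(X):
        km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
        labels = np.empty(len(X), dtype=int)
        for c in range(2):
            members = np.where(km.labels_ == c)[0]
            # "Expert" labels the sample nearest the center; members inherit it
            center_idx = members[np.argmin(
                np.linalg.norm(X[members] - km.cluster_centers_[c], axis=1))]
            labels[members] = true_labels[center_idx]
        return labels

    # Three feature spaces = three noisy views of the same data
    views = [np.column_stack([true_labels + 0.3 * rng.standard_normal(100)])
             for _ in range(3)]
    votes = np.stack([label_via_clustering(v) for v in views])
    final = np.array([Counter(col).most_common(1)[0][0] for col in votes.T])
    print(f"agreement with truth: {np.mean(final == true_labels):.2f}")
    ```

    Only two labeling actions per feature space are needed here (one per cluster), which is the source of the paper's 1-3% manual-labeling figure.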

  13. Observing Tsunamis in the Ionosphere Using Ground Based GPS Measurements

    Science.gov (United States)

    Galvan, D. A.; Komjathy, A.; Song, Y. Tony; Stephens, P.; Hickey, M. P.; Foster, J.

    2011-01-01

    Ground-based Global Positioning System (GPS) measurements of ionospheric Total Electron Content (TEC) show variations consistent with atmospheric internal gravity waves caused by ocean tsunamis following recent seismic events, including the Tohoku tsunami of March 11, 2011. We observe fluctuations correlated in time, space, and wave properties with this tsunami in TEC estimates processed using JPL's Global Ionospheric Mapping Software. These TEC estimates were band-pass filtered to remove ionospheric TEC variations with periods outside the typical range of internal gravity waves caused by tsunamis. Observable variations in TEC appear correlated with the Tohoku tsunami near the epicenter, at Hawaii, and near the west coast of North America. Disturbance magnitudes are 1-10% of the background TEC value. Observations near the epicenter are compared to estimates of expected tsunami-driven TEC variations produced by Embry Riddle Aeronautical University's Spectral Full Wave Model, an atmosphere-ionosphere coupling model, and found to be in good agreement. The potential exists to apply these detection techniques to real-time GPS TEC data, providing estimates of tsunami speed and amplitude that may be useful for future early warning systems.
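
    The band-pass step mentioned above can be sketched with a standard Butterworth filter: keep TEC fluctuations with periods typical of tsunami-driven internal gravity waves and suppress the slow background variation. The 10-30 minute pass band and all signal parameters below are illustrative choices on synthetic data, not the study's actual filter settings.

    ```python
    # Band-pass filtering a synthetic TEC series to isolate gravity-wave
    # period fluctuations. Pass band and amplitudes are assumptions.
    import numpy as np
    from scipy.signal import butter, filtfilt

    fs = 1 / 30.0                              # 30 s GPS sampling -> Hz
    low, high = 1 / (30 * 60), 1 / (10 * 60)   # 30 min .. 10 min periods

    b, a = butter(4, [low, high], btype="band", fs=fs)

    t = np.arange(0, 6 * 3600, 30.0)                 # 6 h of samples
    slow = 5.0 * np.sin(2 * np.pi * t / (4 * 3600))  # slow background TEC
    wave = 0.3 * np.sin(2 * np.pi * t / (15 * 60))   # 15 min "tsunami" wave
    tec = slow + wave

    filtered = filtfilt(b, a, tec)   # zero-phase filtering
    # The 15 min oscillation survives; the slow trend is strongly suppressed
    print(f"residual rms amplitude: {filtered.std():.3f} TECU")
    ```

    Zero-phase filtering (`filtfilt`) avoids shifting the arrival time of the disturbance, which matters when correlating TEC fluctuations with tsunami travel times.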

  14. Tissue Engineering of Cartilage on Ground-Based Facilities

    Science.gov (United States)

    Aleshcheva, Ganna; Bauer, Johann; Hemmersbach, Ruth; Egli, Marcel; Wehland, Markus; Grimm, Daniela

    2016-06-01

    Investigations under simulated microgravity offer the opportunity for a better understanding of the influence of altered gravity on cells and the scaffold-free three-dimensional (3D) tissue formation. To investigate the short-term influence, human chondrocytes were cultivated for 2 h, 4 h, 16 h, and 24 h on a 2D Fast-Rotating Clinostat (FRC) in DMEM/F-12 medium supplemented with 10 % FCS. We detected holes in the vimentin network, perinuclear accumulations of vimentin after 2 h, and changes in the chondrocytes' shape visualised by F-actin staining after 4 h of FRC-exposure. Scaffold-free cultivation of chondrocytes for 7 d on the Random Positioning Machine (RPM), the FRC and the Rotating Wall Vessel (RWV) resulted in spheroid formation, a phenomenon already known from spaceflight experiments with chondrocytes (MIR Space Station) and thyroid cancer cells (SimBox/Shenzhou-8 space mission). The experiments enabled by the ESA-CORA-GBF programme gave us an optimal opportunity to study gravity-related cellular processes, validate ground-based facilities for our chosen cell system, and prepare long-term experiments under real microgravity conditions in space.

  15. Theoretical validation of ground-based microwave ozone observations

    Directory of Open Access Journals (Sweden)

    P. Ricaud

    Full Text Available Ground-based microwave measurements of the diurnal and seasonal variations of ozone at 42±4.5 and 55±8 km are validated by comparing with results from a zero-dimensional photochemical model and a two-dimensional (2D) chemical/radiative/dynamical model, respectively. O3 diurnal amplitudes measured in Bordeaux are shown to be in agreement with theory to within 5%. For the seasonal analysis of O3 variation, at 42±4.5 km, the 2D model underestimates the yearly averaged ozone concentration compared with the measurements. A double maximum oscillation (~3.5%) is measured in Bordeaux, with an extended maximum in September and a maximum in February, whilst the 2D model predicts only a single large maximum (17%) in August and a pronounced minimum in January. Evidence suggests that dynamical transport causes the winter O3 maximum by propagation of planetary waves, phenomena which are not explicitly reproduced by the 2D model. At 55±8 km, the modeled yearly averaged O3 concentration is in very good agreement with the measured yearly average. A strong annual oscillation is both measured and modeled, with differences in the amplitude shown to be exclusively linked to temperature fields.

  16. Models of ionospheric VLF absorption of powerful ground based transmitters

    Science.gov (United States)

    Cohen, M. B.; Lehtinen, N. G.; Inan, U. S.

    2012-12-01

    Ground based Very Low Frequency (VLF, 3-30 kHz) radio transmitters play a role in precipitation of energetic Van Allen electrons. Initial analyses of the contribution of VLF transmitters to radiation belt losses were based on early models of trans-ionospheric propagation known as the Helliwell absorption curves, but some recent studies have found that the model overestimates (by 20-100 dB) the VLF energy reaching the magnetosphere. It was subsequently suggested that conversion of wave energy into electrostatic modes may be responsible for the error. We utilize a newly available extensive record of VLF transmitter energy reaching the magnetosphere, taken from the DEMETER satellite, and perform a direct comparison with a sophisticated full wave model of trans-ionospheric propagation. Although the model does not include the effect of ionospheric irregularities, it correctly predicts the average total power injected into the magnetosphere within several dB. The results, particularly at nighttime, appear to be robust against the variability of the ionospheric electron density. We conclude that the global effect of irregularity scattering on whistler mode conversion to quasi-electrostatic may be no larger than 6 dB.

  17. Atmospheric Refraction Path Integrals in Ground-Based Interferometry

    CERN Document Server

    Mathar, R J

    2004-01-01

    The basic effect of the earth's atmospheric refraction on telescope operation is the reduction of the true zenith angle to the apparent zenith angle, associated with prismatic aberrations due to the dispersion in air. If one attempts coherent superposition of star images in ground-based interferometry, one is in addition interested in the optical path length associated with the refracted rays. In a flat-earth model, the optical path difference between these rays is of no concern, since the translational symmetry of the setup means that no net effect remains. Here, I evaluate these interferometric integrals in the more realistic arrangement of two telescopes located on the surface of a common earth sphere, pointing to a star through an atmosphere which also possesses spherical symmetry. Some focus is put on working out series expansions in terms of the small ratio of the baseline over the earth radius, which allows one to bypass numerics that are otherwise challenged by strong cancellation effects in building the opti...

  18. A comparative study of satellite and ground-based phenology.

    Science.gov (United States)

    Studer, S; Stöckli, R; Appenzeller, C; Vidale, P L

    2007-05-01

    Long time series of ground-based plant phenology, as well as more than two decades of satellite-derived phenological metrics, are currently available to assess the impacts of climate variability and trends on terrestrial vegetation. Traditional plant phenology provides very accurate information on individual plant species, but with limited spatial coverage. Satellite phenology allows monitoring of terrestrial vegetation on a global scale and provides an integrative view at the landscape level. Linking the strengths of both methodologies has high potential value for climate impact studies. We compared a multispecies index from ground-observed spring phases with two types (maximum slope and threshold approach) of satellite-derived start-of-season (SOS) metrics. We focus on Switzerland from 1982 to 2001 and show that temporal and spatial variability of the multispecies index correspond well with the satellite-derived metrics. All phenological metrics correlate with temperature anomalies as expected. The slope approach proved to deviate strongly from the temporal development of the ground observations as well as from the threshold-defined SOS satellite measure. The slope spring indicator is considered to indicate a different stage in vegetation development and is therefore less suited as a SOS parameter for comparative studies in relation to ground-observed phenology. Satellite-derived metrics are, however, very susceptible to snow cover, and it is suggested that this snow cover should be better accounted for by the use of newer satellite sensors.
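
    The threshold-based start-of-season (SOS) metric discussed above has a simple core: SOS is the first day the vegetation index exceeds a fixed fraction of its seasonal amplitude. A minimal sketch on a synthetic NDVI curve; the sigmoid green-up shape and the 0.5 fraction are illustrative choices, not the study's parameters.

    ```python
    # Threshold-based SOS detection on a synthetic annual NDVI curve.
    import numpy as np

    days = np.arange(1, 366)
    # Synthetic spring green-up centered near day 120
    ndvi = 0.2 + 0.5 / (1 + np.exp(-(days - 120) / 10))

    def sos_threshold(days, ndvi, fraction=0.5):
        """First day NDVI crosses min + fraction * (max - min)."""
        level = ndvi.min() + fraction * (ndvi.max() - ndvi.min())
        return int(days[np.argmax(ndvi >= level)])

    print(sos_threshold(days, ndvi))  # near day 120 for this curve
    ```

    The maximum-slope alternative instead takes the day of steepest NDVI increase, which, as the abstract notes, marks a different stage of green-up and can diverge from threshold-based SOS.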

  19. Satellite Type Estimation from Ground-based Photometric Observation

    Science.gov (United States)

    Endo, T.; Ono, H.; Suzuki, J.; Ando, T.; Takanezawa, T.

    2016-09-01

    The optical photometric observation is potentially a powerful tool for understanding Geostationary Earth Orbit (GEO) objects. First, we measured in the laboratory the surface reflectance of common satellite materials, for example, Multi-layer Insulation (MLI), mono-crystalline silicon cells, and Carbon Fiber Reinforced Plastic (CFRP). Next, we calculated the visual magnitude of a satellite using a simplified shape and the measured albedos. In this calculation model, the solar panels have dimensions of 2 by 8 meters, and the bus is 2 meters square, with the measured optical properties described above. Under these conditions, it became clear that the brightness can change over a range of 3 to 4 magnitudes in one night, while the color index changes only by 1 to 2 magnitudes. Finally, we observed the color photometric data of several GEO satellites visible from Japan multiple times in August and September 2014. We obtained light curves of GEO satellites recorded in the B and V bands (using Johnson filters) with a ground-based optical telescope. As a result, the color index changed by approximately 0.5 to 1 magnitude in one night, and the order of magnitude did not change in any case. In this paper, we briefly discuss satellite type estimation using the relation between brightness and color index obtained from the photometric observation.

  20. Ground-based measurements of UV Index (UVI) at Helwan

    Directory of Open Access Journals (Sweden)

    H. Farouk

    2012-12-01

    Full Text Available In October 2010, ground-based UV Index (UVI) measurements were carried out by the weather station at the solar laboratory of NRIAG. The daily variation has its maximum values on spring and summer days and its minimum values on autumn and winter days. A low level of UVI, between 2.55 and 2.825, was found in December, January and February. A moderate level of UVI, between 3.075 and 5.6, was found in March, October and November. A high level of UVI, between 6.7 and 7.65, was found in April, May and September. A very high level of UVI, between 8 and 8.6, was found in June, July and August. According to the equation UVI = a·[SZA]^b, the UVI increases with decreasing SZA, by 82% on a daily scale and 88% on a monthly scale. Helwan is thus exposed to a high level of radiation over 6 months per year, including 3 months with a very high UVI level, so it is advisable to avoid direct exposure to the sun from 11 a.m. to 2:00 p.m.
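
    The power-law relation UVI = a·[SZA]^b quoted above can be fitted in a couple of lines with a nonlinear least-squares routine. The data and coefficients below are synthetic stand-ins for the illustration, not the paper's fitted values; a negative exponent b encodes the reported behavior that UVI rises as SZA decreases.

    ```python
    # Fitting the UVI = a * SZA^b power law to (synthetic) data.
    import numpy as np
    from scipy.optimize import curve_fit

    def uvi_model(sza_deg, a, b):
        return a * sza_deg ** b

    sza = np.linspace(15, 75, 20)      # solar zenith angles (degrees)
    true_a, true_b = 600.0, -1.5       # assumed coefficients for the demo
    uvi = uvi_model(sza, true_a, true_b)

    (a_fit, b_fit), _ = curve_fit(uvi_model, sza, uvi, p0=(100.0, -1.0))
    print(f"a = {a_fit:.1f}, b = {b_fit:.2f}")
    ```

    On real measurements the fit would be done per day (or per month), and comparing the fitted curves at two SZA values yields percentage changes like the 82% and 88% figures in the abstract.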

  1. Optical turbulence forecast: toward a new era of ground-based astronomy

    CERN Document Server

    Masciadri, E

    2009-01-01

    The simulation of the optical turbulence (OT) for astronomical applications with non-hydrostatic mesoscale atmospheric models presents some advantages with respect to measurements. The future of ground-based astronomy relies upon the potential and feasibility of the ELTs. Our ability to know, control, and manage the effects of turbulence on such new-generation telescopes and facilities is decisive in assuring their competitiveness with respect to space astronomy. In the past, several studies have been carried out proving the feasibility of simulating realistic Cn2 profiles above astronomical sites. The European Community (FP6 Program) recently decided to fund a project aiming, on one side, to prove the feasibility of OT forecasts and the ability of mesoscale models to discriminate among astronomical sites from the optical-turbulence point of view and, on the other side, to boost the development of this discipline at the borderline between astrophysics...

  2. Flight validation of ground-based assessment for control power requirements at high angles of attack

    Science.gov (United States)

    Ogburn, Marilyn E.; Ross, Holly M.; Foster, John V.; Pahle, Joseph W.; Sternberg, Charles A.; Traven, Ricardo; Lackey, James B.; Abbott, Troy D.

    1994-01-01

    A review is presented in viewgraph format of an ongoing NASA/U.S. Navy study to determine control power requirements at high angles of attack for the next generation high-performance aircraft. This paper focuses on recent flight test activities using the NASA High Alpha Research Vehicle (HARV), which are intended to validate results of previous ground-based simulation studies. The purpose of this study is discussed, and the overall program structure, approach, and objectives are described. Results from two areas of investigation are presented: (1) nose-down control power requirements and (2) lateral-directional control power requirements. Selected results which illustrate issues and challenges that are being addressed in the study are discussed including test methodology, comparisons between simulation and flight, and general lessons learned.

  3. Ultrahigh-resolution mapping of peatland microform using ground-based structure from motion with multiview stereo

    Science.gov (United States)

    Mercer, Jason J.; Westbrook, Cherie J.

    2016-11-01

    Microform is important in understanding wetland functions and processes, but collecting imagery of and mapping the physical structure of peatlands is often expensive and requires specialized equipment. We assessed the utility of coupling computer-vision-based structure from motion with multiview stereo photogrammetry (SfM-MVS) and ground-based photos to map peatland topography. The SfM-MVS technique was tested on an alpine peatland in Banff National Park, Canada, and guidance was provided on minimizing errors. We found that coupling SfM-MVS with ground-based photos taken with a point-and-shoot camera is a viable and competitive technique for generating ultrahigh-resolution elevations, and a useful addition to scientists' toolkit.

  4. Automatic generation of epicenter distribution maps with ArcIMS

    Institute of Scientific and Technical Information of China (English)

    董星宏; 贾宁

    2011-01-01

    Using the map-publishing function of ArcIMS, we implement automatic generation of static epicenter distribution maps and integrate this function into the portal website's management. This enriches the content of rapid earthquake information reports and saves the time otherwise spent manually drawing epicenter maps during emergency response.

  5. Study on automatic NC code generation based on data driving

    Institute of Scientific and Technical Information of China (English)

    李克天; 何汉武; 王志坚; 郑德涛; 陈统坚

    2001-01-01

    We propose replacing the conventional human-computer interactive approach to processing the manufacturing model with a data-driven approach, so that NC code can ultimately be generated automatically. The principle of the data-driving file, its expression rules, its mode of operation, and the process of generating NC code are discussed in turn.
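
    The data-driven idea can be reduced to a toy example: machining steps live in a data table (the "data-driving file") rather than in hand-written code, and the NC program is emitted by iterating over that table. The record fields, block numbering, and G-codes below are generic illustrations, not the paper's file format.

    ```python
    # Toy data-driven NC (G-code) generation: rows of data in, program out.
    RAPID, FEED = "G00", "G01"

    steps = [  # (mode, x, y, feed) rows as they might come from a database
        (RAPID, 0.0, 0.0, None),
        (FEED, 10.0, 0.0, 100),
        (FEED, 10.0, 5.0, 100),
    ]

    def emit_nc(steps):
        """Emit numbered NC blocks from the data records."""
        lines = []
        for i, (mode, x, y, feed) in enumerate(steps, start=1):
            line = f"N{i * 10} {mode} X{x:.3f} Y{y:.3f}"
            if feed is not None:
                line += f" F{feed}"
            lines.append(line)
        return "\n".join(lines)

    print(emit_nc(steps))
    ```

    Changing the part then means editing the data records rather than the generator, which is what removes the interactive, manual step the abstract criticizes.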

  6. Observational Selection Effects with Ground-based Gravitational Wave Detectors

    Science.gov (United States)

    Chen, Hsin-Yu; Essick, Reed; Vitale, Salvatore; Holz, Daniel E.; Katsavounidis, Erik

    2017-01-01

    Ground-based interferometers are not perfect all-sky instruments, and it is important to account for their behavior when considering the distribution of detected events. In particular, the LIGO detectors are most sensitive to sources above North America and the Indian Ocean, and as the Earth rotates, the sensitive regions are swept across the sky. However, because the detectors do not acquire data uniformly over time, there is a net bias on detectable sources’ right ascensions. Both LIGO detectors preferentially collect data during their local night; it is more than twice as likely to be local midnight than noon when both detectors are operating. We discuss these selection effects and how they impact LIGO’s observations and electromagnetic (EM) follow-up. Beyond galactic foregrounds associated with seasonal variations, we find that equatorial observatories can access over 80% of the localization probability, while mid-latitudes will access closer to 70%. Facilities located near the two LIGO sites can observe sources closer to their zenith than their analogs in the south, but the average observation will still be no closer than 44° from zenith. We also find that observatories in Africa or the South Atlantic will wait systematically longer before they can begin observing compared to the rest of the world, though there is a preference for longitudes near the LIGO sites. These effects, along with knowledge of the LIGO antenna pattern, can inform EM follow-up activities and optimization, including the possibility of directing observations even before gravitational-wave events occur.

  7. Ozone profiles above Kiruna from two ground-based radiometers

    Science.gov (United States)

    Ryan, Niall J.; Walker, Kaley A.; Raffalski, Uwe; Kivi, Rigel; Gross, Jochen; Manney, Gloria L.

    2016-09-01

    This paper presents new atmospheric ozone concentration profiles retrieved from measurements made with two ground-based millimetre-wave radiometers in Kiruna, Sweden. The instruments are the Kiruna Microwave Radiometer (KIMRA) and the Millimeter wave Radiometer 2 (MIRA 2). The ozone concentration profiles are retrieved using an optimal estimation inversion technique, and they cover an altitude range of ˜ 16-54 km, with an altitude resolution of, at best, 8 km. The KIMRA and MIRA 2 measurements are compared to each other, to measurements from balloon-borne ozonesonde measurements at Sodankylä, Finland, and to measurements made by the Microwave Limb Sounder (MLS) aboard the Aura satellite. KIMRA has a correlation of 0.82, but shows a low bias, with respect to the ozonesonde data, and MIRA 2 shows a smaller magnitude low bias and a 0.98 correlation coefficient. Both radiometers are in general agreement with each other and with MLS data, showing high correlation coefficients, but there are differences between measurements that are not explained by random errors. An oscillatory bias with a peak of approximately ±1 ppmv is identified in the KIMRA ozone profiles over an altitude range of ˜ 18-35 km, and is believed to be due to baseline wave features that are present in the spectra. A time series analysis of KIMRA ozone for winters 2008-2013 shows the existence of a local wintertime minimum in the ozone profile above Kiruna. The measurements have been ongoing at Kiruna since 2002 and late 2012 for KIMRA and MIRA 2, respectively.

  8. Project management for complex ground-based instruments: MEGARA plan

    Science.gov (United States)

    García-Vargas, María. Luisa; Pérez-Calpena, Ana; Gil de Paz, Armando; Gallego, Jesús; Carrasco, Esperanza; Cedazo, Raquel; Iglesias, Jorge

    2014-08-01

    The project management of complex instruments for ground-based large telescopes is a challenge in itself. Good management is key to project success in terms of performance, schedule and budget. Being on time has become a strict requirement for two reasons: to assure arrival at the telescope, given the pressure of demand for new instrumentation at these first world-class telescopes, and to avoid cost overruns. The budget and cash flow are not always as expected and have to be properly handled across administrative departments at funding centers distributed worldwide. The complexity of the organizations, the technological and scientific return to the Consortium partners, and the participation in the project of all kinds of professional centers working in astronomical instrumentation (universities, research centers, small and large private companies, workshops, providers, etc.) make the project management strategy, and the tools and procedures tuned to the project's needs, crucial for success. MEGARA (Multi-Espectrógrafo en GTC de Alta Resolución para Astronomía) is a facility instrument of the 10.4 m GTC (La Palma, Spain) working at optical wavelengths that provides both Integral-Field Unit (IFU) and Multi-Object Spectrograph (MOS) capabilities at resolutions in the range R=6,000-20,000. The project is an initiative led by Universidad Complutense de Madrid (Spain) in collaboration with INAOE (Mexico), IAA-CSIC (Spain) and Universidad Politécnica de Madrid (Spain). MEGARA is being developed under contract with GRANTECAN.

  9. Automatic Contract Generation Method Based on the TASC Model

    Institute of Scientific and Technical Information of China (English)

    张恒; 李自臣

    2011-01-01

    The trusted autonomic service cooperation (TASC) model addresses the cooperation trust crisis that arises because the behavior of autonomous individuals is difficult to predict and control. Under this model, service contracts establish the cooperation relations between Agents. This paper proposes an effective service contract generation method that uses a service classification system, a scenario-based service discovery mechanism, and service contract protocol templates to generate contracts in a fast, reusable way, thereby improving the automation of service composition. The trusted autonomic service cooperation process and the automatic service contract generation process are also described.

  10. Research on Automatic Code Generation for SVPWM Based on MATLAB

    Institute of Scientific and Technical Information of China (English)

    杨蕊; 张建军; 马昭; 路瑜

    2015-01-01

    To address the complexity of DSP programming and its long development cycle, a method is given for automatically generating SVPWM code using MATLAB, Code Composer Studio (CCS), its embedded tools and the link software. MATLAB/Simulink is used to build the corresponding algorithmic model; after the model is verified, code is generated, compiled and downloaded automatically to the DSP platform for execution. The resulting waveforms agree with the theoretical result. Compared with hand-written code, this method is simple to apply, shortens the development cycle, and generates code efficiently.
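The SVPWM algorithm that the Simulink model implements can be sketched outside MATLAB as well. The following Python sketch (an illustration, not the generated DSP code; the function name and normalization are assumptions) computes the sector and the active/zero-vector times for one PWM period from the αβ reference voltage:

```python
import math

def svpwm_times(v_alpha, v_beta, vdc, ts):
    """Return (sector, t1, t2, t0) switching times for one PWM period.

    Standard space-vector PWM: the reference vector (v_alpha, v_beta)
    is synthesized from the two adjacent active vectors (durations t1, t2)
    plus the zero vectors (duration t0).
    """
    mag = math.hypot(v_alpha, v_beta)
    theta = math.atan2(v_beta, v_alpha) % (2 * math.pi)
    sector = int(theta // (math.pi / 3)) + 1       # sectors 1..6
    phi = theta - (sector - 1) * math.pi / 3       # angle within the sector
    m = math.sqrt(3) * mag / vdc                   # modulation index
    t1 = ts * m * math.sin(math.pi / 3 - phi)      # first adjacent active vector
    t2 = ts * m * math.sin(phi)                    # second adjacent active vector
    t0 = ts - t1 - t2                              # remainder: zero vectors
    return sector, t1, t2, t0

sector, t1, t2, t0 = svpwm_times(1.0, 0.0, 2.0, 1.0)
```

In a code-generation flow like the one described, this arithmetic is what ends up as fixed-point C on the DSP; the Python version only illustrates the sector/duty decomposition.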

  11. Research on Automatic Support Generation Technology in Fused Deposition Modeling

    Institute of Scientific and Technical Information of China (English)

    何新英; 潘夕琪

    2012-01-01

    Because of the nature of the FDM forming process, support structures must be added during fabrication. The reasonableness of the support structure strongly affects both the accuracy of the finished part and the efficiency of the build. A scan-line-based automatic support generation technique is proposed that generates supports quickly and without omissions; on this basis, the scanning path can also be optimized quickly. The algorithm is robust and efficient, and performs well in practice.
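As a minimal illustration of where supports are needed before any scan-line processing, the sketch below flags overhanging facets from their normals. The 45° printable-overhang threshold and the function names are illustrative assumptions, written in Python rather than the authors' implementation language:

```python
import math

def needs_support(normal, max_overhang_deg=45.0):
    """A facet needs support when it faces downward more steeply than
    the printable overhang angle (measured from the vertical build axis)."""
    nx, ny, nz = normal
    length = math.sqrt(nx * nx + ny * ny + nz * nz)
    if length == 0 or nz >= 0:       # degenerate, vertical or upward-facing facet
        return False
    # angle between the facet normal and the downward axis (0, 0, -1)
    angle = math.degrees(math.acos(-nz / length))
    return angle < (90.0 - max_overhang_deg)

# flag overhanging facets of a mesh given per-facet normals
normals = [(0, 0, 1), (0, 0, -1), (2, 0, -1), (1, 0, -2)]
flags = [needs_support(n) for n in normals]
```

The scan-line step described in the abstract would then rasterize the flagged regions to place actual support columns.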

  12. Overview and Initial Results from the DEEPWAVE Airborne and Ground-Based Measurement Program

    Science.gov (United States)

    Fritts, D. C.

    2015-12-01

    The deep-propagating gravity wave experiment (DEEPWAVE) was performed on and over New Zealand, the Tasman Sea, and the Southern Ocean with core airborne measurements extending from 5 June to 21 July 2014 and supporting ground-based measurements spanning a longer interval. The NSF/NCAR GV employed standard flight-level measurements and new airborne lidar and imaging measurements of gravity waves (GWs) from sources at lower altitudes throughout the stratosphere and into the mesosphere and lower thermosphere (MLT). The new GV lidars included a Rayleigh lidar measuring atmospheric density and temperature from ~20-60 km and a sodium resonance lidar measuring sodium density and temperature at ~75-105 km. An airborne Advanced Mesosphere Temperature Mapper (AMTM) and two IR "wing" cameras imaged the OH airglow temperature and/or intensity fields extending ~900 km across the GV flight track. The DLR Falcon was equipped with its standard flight-level instruments and an aerosol Doppler lidar measuring radial winds below the Falcon. DEEPWAVE also included extensive ground-based measurements in New Zealand, Tasmania, and Southern Ocean Islands. DEEPWAVE performed 26 GV flights and 13 Falcon flights, and ground-based measurements occurred whether or not the aircraft were flying. Collectively, many diverse cases of GW forcing, propagation, refraction, and dissipation spanning altitudes of 0-100 km were observed. Examples include strong mountain wave (MW) forcing and breaking in the lower and middle stratosphere, weak MW forcing yielding MW penetration into the MLT having very large amplitudes and momentum fluxes, MW scales at higher altitudes ranging from ~10-250 km, large-scale trailing waves from orography refracting into the polar vortex and extending to high altitudes, GW generation by deep convection, large-scale GWs arising from jet stream sources, and strong MWs in the MLT arising from strong surface flow over a small island. 
DEEPWAVE yielded a number of surprises, among

  13. An autonomous receiver/digital signal processor applied to ground-based and rocket-borne wave experiments

    Science.gov (United States)

    Dombrowski, M. P.; LaBelle, J.; McGaw, D. G.; Broughton, M. C.

    2016-07-01

    The programmable combined receiver/digital signal processor platform presented in this article is designed for digital downsampling and processing of general waveform inputs with a 66 MHz initial sampling rate and multi-input synchronized sampling. Systems based on this platform are capable of fully autonomous low-power operation, can be programmed to preprocess and filter the data for preselection and reduction, and may output to a diverse array of transmission or telemetry media. We describe three versions of this system, one for deployment on sounding rockets and two for ground-based applications. The rocket system was flown on the Correlation of High-Frequency and Auroral Roar Measurements (CHARM)-II mission launched from Poker Flat Research Range, Alaska, in 2010. It measured auroral "roar" signals at 2.60 MHz. The ground-based systems have been deployed at Sondrestrom, Greenland, and South Pole Station, Antarctica. The Greenland system synchronously samples signals from three spaced antennas providing direction finding of 0-5 MHz waves. It has successfully measured auroral signals and man-made broadcast signals. The South Pole system synchronously samples signals from two crossed antennas, providing polarization information. It has successfully measured the polarization of auroral kilometric radiation-like signals as well as auroral hiss. Further systems are in development for future rocket missions and for installation in Antarctic Automatic Geophysical Observatories.

  14. Production optimization of 99Mo/99mTc zirconium molybdate gel generators at a semi-automatic device: DISIGEG.

    Science.gov (United States)

    Monroy-Guzman, F; Rivero Gutiérrez, T; López Malpica, I Z; Hernández Cortes, S; Rojas Nava, P; Vazquez Maldonado, J C; Vazquez, A

    2012-01-01

    DISIGEG is a synthesis installation of zirconium (99)Mo-molybdate gels for (99)Mo/(99m)Tc generator production, which has been designed, built and installed at the ININ. The device consists of a synthesis reactor and five systems controlled via keyboard: (1) raw material access, (2) chemical air stirring, (3) gel drying by air and infrared heating, (4) moisture removal and (5) gel extraction. DISIGEG operation is described, and the effects of the drying conditions of zirconium (99)Mo-molybdate gels on (99)Mo/(99m)Tc generator performance were evaluated, as well as some physical-chemical properties of these gels. The results reveal that the temperature, time and air flow applied during the drying process directly affect zirconium (99)Mo-molybdate gel generator performance. All gels prepared have a similar chemical structure, probably constituted by a three-dimensional network based on zirconium pentagonal bipyramids and molybdenum octahedra. Basic structural variations cause a change in gel porosity and permeability, favouring or inhibiting (99m)TcO(4)(-) diffusion into the matrix. The (99m)TcO(4)(-) eluates produced by (99)Mo/(99m)Tc zirconium (99)Mo-molybdate gel generators prepared in DISIGEG, air dried at 80°C for 5 h and using an air flow of 90 mm, satisfied all the Pharmacopoeias regulations: (99m)Tc yield between 70-75%, (99)Mo breakthrough less than 3×10(-3)%, radiochemical purities of about 97%, and sterile, pyrogen-free eluates with a pH of 6.

  15. High Resolution Spectral Analysis of Hiss and Chorus Emissions in Ground Based Data

    Science.gov (United States)

    Hosseini Aliabad, S. P.; Golkowski, M.; Gibby, A. R.

    2015-12-01

    The dynamic evolution of the radiation belts is believed to be controlled in large part by two separate but related classes of naturally occurring plasma waves: ELF/VLF chorus and hiss emissions. Although whistler mode chorus has been extensively studied since the first reports by Storey in 1953, the source mechanism and properties are still subjects of active research. Moreover, the origin of plasmaspheric hiss, the electromagnetic emission believed to be responsible for the gap between the inner and outer radiation belts, has been debated for over four decades. Although these waves can be observed in situ on spacecraft, ground-based observing stations can provide orders of magnitude higher data volumes and decades-long data coverage essential for certain long-term and statistical studies of wave properties. Recent observational and theoretical works suggest that high resolution analysis of the spectral features of both hiss and chorus emissions can provide insight into generation processes and be used to validate existing theories. Application of the classic Fourier (FFT) technique unfortunately yields a tradeoff between time and frequency resolution. In addition to Fourier spectra, we employ novel methods to make spectrograms with independently high time and frequency resolutions using the minimum variance distortionless response (MVDR). These techniques are applied to ground-based data observations of hiss and chorus made in Alaska. Plasmaspheric hiss has been widely regarded as a broadband, structureless, incoherent emission. We quantify the extent to which plasmaspheric hiss can be a coherent emission with complex fine structure. Likewise, to date, researchers have differentiated between hiss and chorus coherency primarily using qualitative "naked eye" approaches to amplitude spectra. Using a quantitative approach to observed amplitude spectra, we present more rigorous classification criteria for these emissions.

  16. LanHEP—a package for the automatic generation of Feynman rules in field theory. Version 3.0

    Science.gov (United States)

    Semenov, A. V.

    2009-03-01

    The LanHEP program version 3.0 for Feynman rules generation from the Lagrangian is described. It reads the Lagrangian written in a compact form, close to the one used in publications. It means that Lagrangian terms can be written with summation over indices of broken symmetries and using special symbols for complicated expressions, such as covariant derivative and strength tensor for gauge fields. Supersymmetric theories can be described using the superpotential formalism and the 2-component fermion notation. The output is Feynman rules in terms of physical fields and independent parameters in the form of CompHEP model files, which allows one to start calculations of processes in the new physical model. Alternatively, Feynman rules can be generated in FeynArts format or as LaTeX table. One-loop counterterms can be generated in FeynArts format. Program summaryProgram title: LanHEP Catalogue identifier: ADZV_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AECH_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 83 041 No. of bytes in distributed program, including test data, etc.: 1 090 931 Distribution format: tar.gz Programming language: C Computer: PC Operating system: Linux RAM: 2 MB (SM), 12 MB (MSSM), 120 MB (MSSM with counterterms) Classification: 4.4 Nature of problem: Deriving Feynman rules from the Lagrangian Solution method: The program reads the Lagrangian written in a compact form, close to the one used in publications. It means that Lagrangian terms can be written with summation over indices of broken symmetries and using special symbols for complicated expressions, such as covariant derivative and strength tensor for gauge fields. Tools for checking the correctness of the model, and for simplifying the output expressions are provided. 
The output is

  17. Model-Driven Automatic Database Generation

    Institute of Scientific and Technical Information of China (English)

    王海林

    2011-01-01

    To improve software development efficiency, a model-driven approach to automatic database generation is proposed. The approach uses MetaEdit+ as the meta-modeling tool: domain experts build domain meta-models and models, and with the generator definition language MERL, software developers can conveniently design code generators that produce Java code directly from the graphical domain models built by the domain experts; running the generated code then creates the database. The design of the database conceptual meta-model, the E-R model, and the E-R-model-to-Java code generator are presented in detail through an example. Testing shows that the generated Java code runs correctly on the Java platform under the Windows operating system and correctly creates an Oracle 10g database instance.
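The generation chain (model → code → database) can be illustrated with a toy generator. This Python sketch emits SQL DDL from a minimal E-R-style model; the model format and type names are hypothetical stand-ins for the MetaEdit+/MERL pipeline, not its actual output:

```python
def generate_ddl(model):
    """Emit CREATE TABLE statements from a minimal E-R-style model.

    `model` maps table name -> list of (column, sql_type, is_primary_key).
    """
    statements = []
    for table, columns in model.items():
        cols = [f"{name} {sqltype}" for name, sqltype, _ in columns]
        pks = [name for name, _, is_pk in columns if is_pk]
        if pks:
            cols.append("PRIMARY KEY (" + ", ".join(pks) + ")")
        statements.append(
            f"CREATE TABLE {table} (\n  " + ",\n  ".join(cols) + "\n);"
        )
    return "\n".join(statements)

# a one-entity model, in the spirit of a generated E-R diagram
er_model = {
    "student": [("id", "NUMBER(10)", True), ("name", "VARCHAR2(50)", False)],
}
ddl = generate_ddl(er_model)
```

In the paper's setting the equivalent template lives in MERL and emits Java/JDBC code instead of raw DDL, but the mapping from model elements to schema text is the same idea.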

  18. Design concepts for the Cherenkov Telescope Array CTA: an advanced facility for ground-based high-energy gamma-ray astronomy

    OpenAIRE

    Actis, M.; Agnetta, G.; Aharonian, F.; Akhperjanian, A.; Aleksić, J.; Aliu, E.; Allan, D.; Allekotte, I.; Antico, F.; Antonelli, L.A.; Antoranz, P.; Aravantinos, A.; Arlen, T.; Arnaldi, H.; Artmann, S

    2011-01-01

    Ground-based gamma-ray astronomy has had a major breakthrough with the impressive results obtained using systems of imaging atmospheric Cherenkov telescopes. Ground-based gamma-ray astronomy has a huge potential in astrophysics, particle physics and cosmology. CTA is an international initiative to build the next generation instrument, with a factor of 5-10 improvement in sensitivity in the 100 GeV to 10 TeV range and the extension to energies well below 100 GeV and above 100 TeV. CTA will con...

  19. Ground Based Investigation of Electrostatic Accelerometer in HUST

    Science.gov (United States)

    Bai, Y.; Zhou, Z.

    2013-12-01

    High-precision electrostatic accelerometers with six degrees of freedom (DOF) of acceleration measurement were successfully used in the CHAMP, GRACE and GOCE missions to measure the Earth's gravity field. In our group, a space inertial sensor based on capacitive transducers and electrostatic control techniques has been investigated for tests of the equivalence principle (TEPO), searches for non-Newtonian forces at micrometer range, and satellite recovery of the Earth's field. The key techniques, a capacitive position sensor with a noise level of 2×10-7 pF/Hz1/2 and a μV/Hz1/2-level electrostatic actuator, have been realized, and all six servo-loop controls are implemented with a discrete PID algorithm in an FPGA device. For testing on the ground, a fiber torsion pendulum facility is adopted to compensate one g of Earth's gravity and to measure parameters of the electrostatically controlled inertial sensor such as the resolution, the electrostatic stiffness, and the cross-coupling between different DOFs. A simple short-distance double-capsule facility, with a valid free-fall duration of about 0.5 second, has been set up in our lab for free-fall tests of the engineering model, which can directly verify the function of the six-DOF control. Meanwhile, a high-voltage suspension method has also been realized, and preliminary results show an acceleration noise of about 10-8 m/s2/Hz1/2 on the horizontal axis, limited mainly by seismic noise. Reference: [1] Fen Gao, Ze-Bing Zhou, Jun Luo, Feasibility for Testing the Equivalence Principle with Optical Readout in Space, Chin. Phys. Lett. 28(8) (2011) 080401. [2] Z. Zhu, Z. B. Zhou, L. Cai, Y. Z. Bai, J. Luo, Electrostatic gravity gradiometer design for the advanced GOCE mission, Adv. Sp. Res. 51 (2013) 2269-2276. [3] Z B Zhou, L Liu, H B Tu, Y Z Bai, J Luo, Seismic noise limit for ground-based performance measurements of an inertial sensor using a torsion balance, Class. Quantum Grav. 27 (2010) 175012. [4] H B Tu, Y Z Bai, Z B Zhou, L Liu, L
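The discrete PID servo loop mentioned above has a standard positional form. The Python sketch below uses illustrative gains and a toy first-order plant (not the flight parameters) to show one loop step and closed-loop convergence:

```python
class DiscretePID:
    """Positional discrete PID:
    u[k] = Kp*e[k] + Ki*Ts*sum(e) + Kd*(e[k] - e[k-1])/Ts."""

    def __init__(self, kp, ki, kd, ts):
        self.kp, self.ki, self.kd, self.ts = kp, ki, kd, ts
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.ts
        derivative = (error - self.prev_error) / self.ts
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# drive a simple first-order plant (y' = u - y) toward a unit setpoint
pid = DiscretePID(kp=2.0, ki=0.5, kd=0.1, ts=0.01)
y = 0.0
for _ in range(3000):
    u = pid.step(1.0, y)
    y += 0.01 * (u - y)          # forward-Euler plant update, dt = 0.01
final = y
```

In the instrument this update runs per control cycle in the FPGA, one loop per degree of freedom, with fixed-point arithmetic in place of floats.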

  20. Ground-Based Observing Campaign of Briz-M Debris

    Science.gov (United States)

    Lederer, S. M.; Buckalew, B.; Frith, J.; Cowardin, H. M.; Hickson, P.; Matney, M.; Anz-Meador, P.

    2017-01-01

    In 2015, NASA's Orbital Debris Program Office (ODPO) completed the installation of the Meter Class Autonomous Telescope (MCAT) on Ascension Island. MCAT is a 1.3m optical telescope designed with a fast tracking capability for observing orbital debris in all orbital regimes (low-Earth orbit to geosynchronous (GEO) orbit) from a low-latitude site. This new asset is dedicated year-round to debris observations, and its location fills a geographical gap in the Ground-based Electro-Optical Deep Space Surveillance (GEODSS) network. A commercial off-the-shelf (COTS) research-grade 0.4m telescope (named the Benbrook telescope) will also be installed on Ascension at the end of 2016. This smaller version is controlled by the same master software, designed by Euclid Research, and can be tasked to work independently or in concert with MCAT. Like MCAT, it has the same suite of filters, a similar field of view, and a fast-tracking Astelco mount, and is also capable of tracking debris in all orbital regimes. These assets are well suited for targeted campaigns or surveys of debris. Since 2013, NASA's ODPO has also had extensive access to the 3.8m infrared UKIRT telescope, located on Mauna Kea. At nearly 14,000 ft, this site affords excellent conditions for collecting both photometry and spectroscopy in the near-IR (0.9 - 2.5 micrometers; SWIR) and thermal-IR (8 - 25 micrometers; LWIR) regimes, ideal for investigating material properties as well as thermal characteristics and sizes of debris. For the purposes of understanding orbital debris, taking data in survey mode as well as targeting individual objects for more in-depth characterization are both desired. With the recent break-ups of Briz-M rocket bodies, we have collected a suite of optical, near-infrared, and mid-infrared data on intact objects as well as those classified as debris. A break-up at GEO of a Briz-M rocket occurred in January 2016, well timed for the first remote observing survey-campaign with MCAT. 
Access to

  1. SU-E-J-141: Comparison of Dose Calculation on Automatically Generated MR-Based ED Maps and Corresponding Patient CT for Clinical Prostate EBRT Plans

    Energy Technology Data Exchange (ETDEWEB)

    Schadewaldt, N; Schulz, H; Helle, M; Renisch, S [Philips Research Laboratories Hamburg, Hamburg (Germany); Frantzen-Steneker, M; Heide, U [The Netherlands Cancer Institute, Amsterdam (Netherlands)

    2014-06-01

    Purpose: To analyze the effect of computing radiation dose on automatically generated MR-based simulated CT images compared to true patient CTs. Methods: Six prostate cancer patients received a regular planning CT for RT planning as well as a conventional 3D fast-field dual-echo scan on a Philips 3.0T Achieva, adding approximately 2 min of scan time to the clinical protocol. Simulated CTs (simCT) were synthesized by assigning known average CT values to the tissue classes air, water, fat, cortical and cancellous bone. For this, Dixon reconstruction of the nearly out-of-phase (echo 1) and in-phase (echo 2) images allowed for water and fat classification. Model-based bone segmentation was performed on a combination of the Dixon images, and a subsequent automatic threshold divides bone into cortical and cancellous bone. For validation, the simCT was registered to the true CT and clinical treatment plans were re-computed on the simCT in Pinnacle{sup 3}. To differentiate effects related to the 5 tissue classes from changes in the patient anatomy not compensated by rigid registration, we also calculated the dose on a stratified CT, where HU values are sorted into the same 5 tissue classes as in the simCT. Results: Dose and volume parameters on the PTV and risk organs as used for clinical approval were compared. All deviations are below 1.1%, except the anal sphincter mean dose, which is at most 2.2% but well below the clinical acceptance threshold. Average deviations are below 0.4% for the PTV and risk organs and 1.3% for the anal sphincter. The deviations of the stratified CT are in the same range as for the simCT. All plans would have passed clinical acceptance thresholds on the simulated CT images. Conclusion: This study demonstrated the clinical usability of MR-based dose calculation with the presented Dixon acquisition and subsequent fully automatic image processing. N. Schadewaldt, H. Schulz, M. Helle and S. Renisch are employed by Philips Technologie Innovative Technologies, a
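The bulk-assignment step, giving each tissue-class voxel a known average CT number, can be sketched as a lookup table applied to a label image. The HU values below are illustrative assumptions, not those used in the study:

```python
import numpy as np

# representative CT numbers (HU) for the five tissue classes;
# these values are illustrative, not the study's calibration
HU = {"air": -1000, "water": 0, "fat": -100, "cancellous": 300, "cortical": 1200}

classes = ["air", "water", "fat", "cancellous", "cortical"]
lut = np.array([HU[c] for c in classes])

# label image: each voxel holds a class index 0..4 (from the Dixon
# water/fat classification plus the bone segmentation)
labels = np.array([[0, 1],
                   [2, 4]])
sim_ct = lut[labels]        # bulk-assigned simulated CT
```

NumPy's fancy indexing applies the lookup to the whole volume at once, which is how such a stratification scales to full 3D label maps.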

  2. Design of an Automatic Solar Photovoltaic Power Generation Device

    Institute of Scientific and Technical Information of China (English)

    周天沛

    2011-01-01

    This paper introduces an automatic sun-tracking photovoltaic device built around a DS1302 real-time clock and an AT89S51 microcontroller. A stepper motor is controlled by comparing the outputs of photoresistors, and the motor drives a two-axis mechanical tracking system that keeps the solar panels perpendicular to the incident sunlight, thus improving the conversion efficiency of the solar energy. At the end of each day's tracking the device returns to its initial position and resumes automatic tracking the next day, eliminating accumulated error. The results show that during the test period the device consistently generated more electricity than a fixed photovoltaic system, producing approximately 3.12 times as much in total. The anticipated performance targets were reached, and the device has wide application potential.
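The photoresistor-comparison control law reduces to a sign comparison with a deadband. A hedged Python sketch (the sensor names, deadband value and step convention are assumptions, not the paper's firmware):

```python
def tracker_step(ldr_east, ldr_west, deadband=0.05):
    """Photoresistor-comparison control: step the motor toward the
    brighter sensor; hold within the deadband to avoid hunting."""
    diff = ldr_east - ldr_west
    if abs(diff) <= deadband:
        return 0                    # balanced illumination: hold position
    return 1 if diff > 0 else -1    # +1 = step east, -1 = step west

direction = tracker_step(0.8, 0.5)
```

On the AT89S51 the same comparison would run on ADC readings each control tick, with the returned sign driving the stepper's direction pin.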

  3. Design of Controller for Automatic Tracking Solar Power Generation

    Institute of Scientific and Technical Information of China (English)

    郑锋; 王炜灵; 陈健强; 陈泽群; 张晓薇

    2014-01-01

    An all-weather automatic solar tracking system is proposed. For the detection system, a photoelectric tracking model is used in hardware to track the light, while the software runs a program that follows the sun's apparent daily trajectory. For the control system, a two-axis mechanical transmission is adopted: a DC motor drives the solar panel to its optimal position, and a linkage allows a single motor to move a whole row of solar panels together. A series of protective measures is taken by the control system against rainy and high-wind weather, and an intelligent energy-saving design is included. The device aims at all-day power generation with a simple structure, low energy consumption and high efficiency.

  4. The ear, the eye, earthquakes and feature selection: listening to automatically generated seismic bulletins for clues as to the differences between true and false events.

    Science.gov (United States)

    Kuzma, H. A.; Arehart, E.; Louie, J. N.; Witzleben, J. L.

    2012-04-01

    Listening to the waveforms generated by earthquakes is not new. The recordings of seismometers have been sped up and played to generations of introductory seismology students, published on educational websites and even included in the occasional symphony. The modern twist on earthquakes as music is an interest in using state-of-the-art computer algorithms for seismic data processing and evaluation. Algorithms such as Hidden Markov Models, Bayesian Network models and Support Vector Machines have been highly developed for applications in speech recognition, and might also be adapted for automatic seismic data analysis. Over the last three years, the International Data Centre (IDC) of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) has supported an effort to apply computer learning and data mining algorithms to IDC data processing, particularly to the problem of weeding through automatically generated event bulletins to find events which are non-physical and would otherwise have to be eliminated by the hand of highly trained human analysts. Analysts are able to evaluate events, distinguish between phases, pick new phases and build new events by looking at waveforms displayed on a computer screen. Human ears, however, are much better suited to waveform processing than are the eyes. Our hypothesis is that combining an auditory representation of seismic events with visual waveforms would reduce the time it takes to train an analyst and the time they need to evaluate an event. Since it takes almost two years for a person of extraordinary diligence to become a professional analyst and IDC contracts are limited to seven years by Treaty, faster training would significantly improve IDC operations. Furthermore, once a person learns to distinguish between true and false events by ear, various forms of audio compression can be applied to the data. The compression scheme which yields the smallest data set in which relevant signals can still be heard is likely an
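The classic audification described here, speeding up a seismic trace until it is audible, amounts to writing the samples with an inflated playback rate. A stdlib-only Python sketch (the rates, speed-up factor and file layout are illustrative assumptions):

```python
import io
import math
import struct
import wave

def audify(samples, native_rate, speedup, out_file):
    """Time-compression audification: write the trace as 16-bit mono PCM,
    declaring a playback rate `speedup` times the native sampling rate."""
    with wave.open(out_file, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)                          # 16-bit PCM
        w.setframerate(int(native_rate * speedup))
        peak = max(abs(s) for s in samples) or 1.0
        w.writeframes(b"".join(
            struct.pack("<h", int(32767 * s / peak)) for s in samples))

# a 1 Hz "seismic" oscillation sampled at 100 Hz becomes a 440 Hz tone
trace = [math.sin(2 * math.pi * 1.0 * i / 100.0) for i in range(1000)]
buf = io.BytesIO()
audify(trace, native_rate=100, speedup=440, out_file=buf)
buf.seek(0)
with wave.open(buf, "rb") as r:
    out_rate, n_frames = r.getframerate(), r.getnframes()
```

No resampling is needed: the waveform is unchanged, only the declared rate shifts every frequency up by the same factor, which is exactly the "sped up" playback the abstract refers to.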

  5. Improved Automatic Generation Model of Worm Signatures

    Institute of Scientific and Technical Information of China (English)

    汪颖; 康松林

    2012-01-01

    This paper presents a worm-signature automatic generation model based on sequence alignment. To address the single-source suspicious-sample and coarse-preprocessing problems of existing signature-generation systems, suspicious traffic from the network boundary and traffic captured by honeypots are given a unified clustering preprocessing to enhance the purity of the suspicious traffic samples, and an improved T-Coffee multiple sequence alignment algorithm is then used to generate the worm signatures. In the experiments, signatures were extracted for the Apache-Knacker and TSIG worms; the results show that the signatures produced by the proposed model are of higher quality than those of the two popular techniques Polygraph and Hamsa.
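As a much simplified stand-in for the multiple-sequence-alignment step, the sketch below reduces a set of suspicious payloads to their longest common substring. The sample payloads and the "EXPLOIT_CORE" token are fabricated for illustration, not real worm traffic, and a real system would align tokens rather than take a single substring:

```python
from difflib import SequenceMatcher
from functools import reduce

def common_signature(payloads):
    """Reduce suspicious payloads to their longest common substring:
    a crude stand-in for alignment-based signature extraction."""
    def lcs(a, b):
        m = SequenceMatcher(None, a, b).find_longest_match(0, len(a), 0, len(b))
        return a[m.a:m.a + m.size]
    return reduce(lcs, payloads)

samples = [
    "GET /cgi-bin/x.sh?cmd=EXPLOIT_CORE&pad=aa",
    "POST /idx.php?q=EXPLOIT_COREzzz",
    "HEAD /EXPLOIT_CORE.gif",
]
sig = common_signature(samples)
```

The clustering preprocessing described in the abstract matters precisely because one noisy non-worm sample in `payloads` would collapse this common signature to almost nothing.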

  6. Automatic Generation of Fuzzy Controller HDL Code Based on Matlab

    Institute of Scientific and Technical Information of China (English)

    诸葛俊贵

    2012-01-01

    Taking water level control in a tank as an example, this article proposes a method for automatically generating fuzzy controller HDL code based on Matlab; the generated code can be ported to an FPGA control system. The method has four steps: (1) design the fuzzy controller with the Fuzzy Logic Toolbox in Matlab; (2) convert the fuzzy controller into lookup-table form; (3) implement the controller lookup table with a state machine; (4) translate the state-machine implementation of the fuzzy controller into HDL code with HDL Coder.
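Step (2), converting the controller to a lookup table, can be illustrated with a toy single-input controller. The memberships, input universe and quantization below are assumptions in Python, not the article's Matlab tank controller:

```python
def fuzzy_valve(level_error):
    """Toy water-tank controller: membership-weighted average of three
    rules (error negative -> close, zero -> hold, positive -> open)."""
    def tri(x, a, b, c):
        # triangular membership function with corners a < b < c
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)

    neg = tri(level_error, -2.0, -1.0, 0.0)
    zero = tri(level_error, -1.0, 0.0, 1.0)
    pos = tri(level_error, 0.0, 1.0, 2.0)
    weight = neg + zero + pos
    if weight == 0.0:
        return -1.0 if level_error < 0 else 1.0   # saturate outside the universe
    return (neg * -1.0 + zero * 0.0 + pos * 1.0) / weight

# step (2): quantize the input universe into a lookup table, which a
# state machine (or an HDL ROM) can then index directly
STEPS = 16
table = [fuzzy_valve(-1.0 + 2.0 * i / (STEPS - 1)) for i in range(STEPS)]
```

Once the controller is a fixed table, the HDL side needs only an address computation and a ROM read, which is what makes the state-machine translation in steps (3)-(4) straightforward.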

  7. Design of an Automatic Generator Model for XML Documents

    Institute of Scientific and Technical Information of China (English)

    孙煜飞; 马良荔; 吴清怡

    2013-01-01

    As the description format for technical data in authoring Interactive Electronic Technical Manuals (IETM), XML is the key technology for sharing and exchanging technical information. Based on the corresponding XML-Schema constraints, an automatic generator model is designed that lets users select a data module category and submit data files, and outputs the corresponding XML documents. The generator effectively removes the tedious manual input of the XML editing process and provides an approach for developing similar data module editors.
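The generator's core, turning a selected category plus submitted field values into an XML document, can be sketched with the standard library. The element and attribute names here are illustrative, not the IETM schema:

```python
import xml.etree.ElementTree as ET

def build_data_module(category, fields):
    """Assemble a data-module XML document from submitted field values.
    (Element names are illustrative, not a real IETM/S1000D schema.)"""
    root = ET.Element("dataModule", category=category)
    for name, value in fields.items():
        ET.SubElement(root, name).text = str(value)
    return ET.tostring(root, encoding="unicode")

doc = build_data_module("maintenance", {"title": "Pump removal", "step": 1})
```

A production generator would additionally validate `doc` against the XML-Schema constraints before export, which is the part the abstract's model delegates to the schema.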

  8. Automatic generation of boundary conditions using Demons non-rigid image registration for use in 3D modality-independent elastography

    Science.gov (United States)

    Pheiffer, Thomas S.; Ou, Jao J.; Miga, Michael I.

    2010-02-01

    Modality-independent elastography (MIE) is a method of elastography that reconstructs the elastic properties of tissue using images acquired under different loading conditions and a biomechanical model. Boundary conditions are a critical input to the algorithm, and are often determined by time-consuming point correspondence methods requiring manual user input. Unfortunately, generation of accurate boundary conditions for the biomechanical model is often difficult due to the challenge of accurately matching points between the source and target surfaces and consequently necessitates the use of large numbers of fiducial markers. This study presents a novel method of automatically generating boundary conditions by non-rigidly registering two image sets with a Demons diffusion-based registration algorithm. The use of this method was successfully performed in silico using magnetic resonance and X-ray computed tomography image data with known boundary conditions. These preliminary results have produced boundary conditions with accuracy of up to 80% compared to the known conditions. Finally, these boundary conditions were utilized within a 3D MIE reconstruction to determine an elasticity contrast ratio between tumor and normal tissue. Preliminary results show a reasonable characterization of the material properties on this first attempt and a significant improvement in the automation level and viability of the method.
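Once the Demons registration has produced a dense displacement field, boundary conditions follow by sampling that field at the mesh surface nodes. A Python sketch using nearest-voxel lookup (the field layout, node format and function name are assumptions; a real pipeline would likely interpolate):

```python
import numpy as np

def boundary_conditions(disp_field, surface_nodes):
    """Sample a dense (Z, Y, X, 3) displacement field, as produced by a
    Demons-style registration, at mesh surface nodes (nearest voxel)."""
    idx = np.rint(surface_nodes).astype(int)
    z, y, x = idx[:, 0], idx[:, 1], idx[:, 2]
    return disp_field[z, y, x]

# synthetic field: a uniform 1-voxel shift along x everywhere
field = np.zeros((4, 4, 4, 3))
field[..., 2] = 1.0
nodes = np.array([[0.2, 1.0, 2.8],
                  [3.0, 3.0, 3.0]])
bcs = boundary_conditions(field, nodes)
```

The sampled vectors become the prescribed displacements on the biomechanical model's surface, replacing the manual point-correspondence step the abstract describes.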

  9. Automatic generation of attack vectors for stored-XSS

    Institute of Scientific and Technical Information of China (English)

    陈景峰; 王一丁; 张玉清; 刘奇旭

    2012-01-01

    Targeting the characteristics and trigger mechanisms of stored XSS (cross-site scripting), generally the most serious form of XSS vulnerability, a tool that automatically generates stored-XSS attack vectors was designed and implemented. Testing the blog-publishing systems of two large Chinese video-sharing websites with this tool uncovered six classes of attack vectors that trigger stored-XSS vulnerabilities. The experimental results demonstrate the effectiveness of the method and of the testing tool, and show that Chinese video websites still carry considerable security risks.

  10. Automatic Camera Control

    DEFF Research Database (Denmark)

    Burelli, Paolo; Preuss, Mike

    2014-01-01

    Automatically generating computer animations is a challenging and complex problem with applications in games and film production. In this paper, we investigate how to translate a shot list for a virtual scene into a series of virtual camera configurations, i.e. automatically controlling the virtual camera. We approach this problem by modelling it as a dynamic multi-objective optimisation problem and show how this metaphor allows a much richer expressiveness than a classical single-objective approach. Finally, we showcase the application of a multi-objective evolutionary algorithm to generate a shot...
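The multi-objective framing means candidate camera configurations are compared by Pareto dominance rather than by a single scalar score. A minimal Python sketch (the three objectives and candidate scores are illustrative assumptions):

```python
def dominates(a, b):
    """Pareto dominance for maximization: a dominates b if it is no worse
    in every objective and strictly better in at least one."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

# camera candidates scored on (visibility, framing, smoothness)
candidates = [(0.9, 0.6, 0.4), (0.8, 0.6, 0.4), (0.2, 0.9, 0.9)]
front = [c for c in candidates if not any(dominates(o, c) for o in candidates)]
```

An evolutionary algorithm of the kind the paper showcases keeps such a non-dominated front per generation instead of a single best camera, which is the richer expressiveness being claimed.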

  11. Design and Implementation of the Parameterized Standard Graphics Automatic Generating System

    Institute of Scientific and Technical Information of China (English)

    赵明洁; 徐岩

    2012-01-01

    Combining the specific requirements of an automatic drawing system for standard parts with the data characteristics of different standard parts, a parameterized automatic graphics generation system for standard parts was developed in the AutoLISP language under the AutoCAD environment. Handling its data through interactive input, procedural processing and data-file management, the system can automatically generate drawings of various standard parts.

  12. Production optimization of {sup 99}Mo/{sup 99m}Tc zirconium molybdate gel generators at a semi-automatic device: DISIGEG

    Energy Technology Data Exchange (ETDEWEB)

    Monroy-Guzman, F., E-mail: fabiola.monroy@inin.gob.mx [Instituto Nacional de Investigaciones Nucleares, Carretera Mexico-Toluca S/N, La Marquesa, Ocoyoacac, 52750, Estado de Mexico (Mexico); Rivero Gutierrez, T., E-mail: tonatiuh.rivero@inin.gob.mx [Instituto Nacional de Investigaciones Nucleares, Carretera Mexico-Toluca S/N, La Marquesa, Ocoyoacac, 52750, Estado de Mexico (Mexico); Lopez Malpica, I.Z.; Hernandez Cortes, S.; Rojas Nava, P.; Vazquez Maldonado, J.C. [Instituto Nacional de Investigaciones Nucleares, Carretera Mexico-Toluca S/N, La Marquesa, Ocoyoacac, 52750, Estado de Mexico (Mexico); Vazquez, A. [Instituto Mexicano del Petroleo, Eje Central Norte Lazaro Cardenas 152, Col. San Bartolo Atepehuacan, 07730, Mexico D.F. (Mexico)

    2012-01-15

DISIGEG is a synthesis installation of zirconium {sup 99}Mo-molybdate gels for {sup 99}Mo/{sup 99m}Tc generator production, which has been designed, built and installed at the ININ. The device consists of a synthesis reactor and five systems controlled via keyboard: (1) raw material access, (2) chemical air stirring, (3) gel drying by air and infrared heating, (4) moisture removal and (5) gel extraction. DISIGEG operation is described, and the effects of the drying conditions of zirconium {sup 99}Mo-molybdate gels on {sup 99}Mo/{sup 99m}Tc generator performance were evaluated, as well as some physical-chemical properties of these gels. The results reveal that the temperature, time and air flow applied during the drying process directly affect zirconium {sup 99}Mo-molybdate gel generator performance. All gels prepared have a similar chemical structure, probably constituted by a three-dimensional network based on zirconium pentagonal bipyramids and molybdenum octahedra. Basic structural variations cause a change in gel porosity and permeability, favouring or inhibiting {sup 99m}TcO{sub 4}{sup -} diffusion into the matrix. The {sup 99m}TcO{sub 4}{sup -} eluates produced by {sup 99}Mo/{sup 99m}Tc zirconium {sup 99}Mo-molybdate gel generators prepared in DISIGEG, air dried at 80 °C for 5 h using an air flow of 90 mm, satisfied all the Pharmacopoeia regulations: {sup 99m}Tc yield between 70-75%, {sup 99}Mo breakthrough less than 3 × 10{sup -3}%, radiochemical purities of about 97%, and sterile, pyrogen-free eluates with a pH of 6. - Highlights: ► {sup 99}Mo/{sup 99m}Tc generators based on {sup 99}Mo-molybdate gels were synthesized at a semi-automatic device. ► Generator performances depend on the synthesis conditions of the zirconium {sup 99}Mo-molybdate gel. ► {sup 99m}TcO{sub 4}{sup -} diffusion and yield into the generator depend on gel porosity and permeability.

  13. Automatic Generation of Agents using Reusable Soft Computing Code Libraries to develop Multi Agent System for Healthcare

    Directory of Open Access Journals (Sweden)

    Priti Srinivas Sajja

    2015-04-01

Full Text Available This paper illustrates an architecture for a multi-agent system in the healthcare domain. The architecture is generic and designed in the form of multiple layers. One layer of the architecture contains many proactive, co-operative and intelligent agents, such as a resource management agent, query agent, pattern detection agent and patient management agent. Another layer is a collection of libraries to auto-generate code for agents using soft computing techniques. At this stage, code for artificial neural networks and fuzzy logic has been developed and encompassed in this layer. The agents use this code to develop neural network, fuzzy logic or hybrid solutions such as neuro-fuzzy solutions. The third layer encompasses a knowledge base, metadata and other local databases. The multi-layer architecture is supported by personalized user interfaces for friendly interaction with its users. The framework is generic, flexible, and designed for a distributed environment like the Web; with minor modifications it can be employed on grid or cloud platforms. The paper also discusses detailed design issues, suitable applications and future enhancements of the work.

  14. Automatable on-line generation of calibration curves and standard additions in solution-cathode glow discharge optical emission spectrometry

    Science.gov (United States)

    Schwartz, Andrew J.; Ray, Steven J.; Hieftje, Gary M.

    2015-03-01

Two methods are described that enable on-line generation of calibration standards and standard additions in solution-cathode glow discharge optical emission spectrometry (SCGD-OES). The first method employs a gradient high-performance liquid chromatography pump to perform on-line mixing and delivery of a stock standard, sample solution, and diluent to achieve a desired solution composition. The second method makes use of a simpler system of three peristaltic pumps to perform the same function of on-line solution mixing. Both methods can be computer-controlled and automated, and thereby enable both simple and standard-addition calibrations to be rapidly performed on-line. Performance of the on-line approaches is shown to be comparable to that of traditional methods of sample preparation in terms of calibration curves, signal stability, accuracy, and limits of detection. Potential drawbacks to the on-line procedures include a signal lag following changes in solution composition, and pump-induced multiplicative noise. Though the new on-line methods were applied here to SCGD-OES to improve sample throughput, they are not limited in application to SCGD-OES: any instrument that samples from flowing solution streams (flame atomic absorption spectrometry, ICP-OES, ICP-mass spectrometry, etc.) could benefit from them.
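Whatever the mixing hardware, the standard-addition arithmetic is the same: fit signal versus added concentration and extrapolate to zero signal; the magnitude of the x-intercept is the unknown concentration. A minimal sketch with synthetic data (concentrations and sensitivity are illustrative):

```python
import numpy as np

def standard_addition_conc(added, signal):
    """Sample concentration estimated by the standard-addition extrapolation."""
    slope, intercept = np.polyfit(added, signal, 1)  # signal = slope*c_added + intercept
    return intercept / slope                         # |x-intercept| = c_sample

# Synthetic example: true sample concentration 2.0 (arbitrary units), sensitivity 3.0
added = np.array([0.0, 1.0, 2.0, 4.0])
signal = 3.0 * (added + 2.0)
c_sample = standard_addition_conc(added, signal)  # ≈ 2.0
```

On-line, the `added` values would be set by the pump flow ratios rather than by manual spiking.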

  15. Unattended instruments for ground-based hyperspectral measurements: development and application for plant photosynthesis monitoring

    Science.gov (United States)

    Cogliati, S.; Rossini, M.; Meroni, M.; Barducci, A.; Julitta, T.; Colombo, R.

    2011-12-01

The aim of the present work is the development of ground-based hyperspectral systems capable of collecting continuous, long-term hyperspectral measurements of the Earth's surface. The development of such instruments includes the optical design, the development of the data acquisition (Auto3S) and processing software, as well as the definition of the calibration procedures. In particular, an in-field calibration methodology based on the comparison between field spectra and data modelled using a Radiative Transfer (RT) approach has been proposed to regularly update the instrument calibration coefficients. Two different automatic spectrometric systems have been developed: the HyperSpectral Irradiometer (HSI) [Meroni et al., 2011] and the Multiplexer Radiometer Irradiometer (MRI) [Cogliati, 2011]. Both instruments continuously measure the incoming solar irradiance (ETOT) and the irradiance (ES, HSI)/radiance (LS, MRI) upwelling from the investigated surface. Both instruments employ two Ocean Optics HR4000 spectrometers sharing the same optical signal, which simultaneously collect "fine" (1 nm Full Width at Half Maximum, FWHM) spectra in the 400-1000 nm range and "ultra-fine" (0.1 nm FWHM) spectra within the 700-800 nm range. The collected optical data allow estimation of biochemical/structural properties of vegetation (e.g. NDVI) as well as its photosynthetic efficiency, through the Photochemical Reflectance Index (PRI) and the analysis of sun-induced chlorophyll fluorescence in the O2-A Fraunhofer line (F@760). The automatic instruments were operated in coordination with eddy-covariance flux-tower measurements of carbon exchange in the framework of several field campaigns: HSI was employed in a subalpine pasture (2009-ongoing) (www.phenoalp.eu), while MRI was employed in 2009 in the Sen3Exp field survey promoted by the European Space Agency as a consolidation study for the future Sentinel-3 mission.
Results show that the proposed instruments succeeded in collecting continuous
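The two vegetation indices named above are simple normalized band ratios, computable directly from the reflectance spectra these instruments deliver. A minimal sketch (band choices follow the standard NDVI and PRI definitions; the reflectance values are hypothetical):

```python
def ndvi(r_nir, r_red):
    """Normalized Difference Vegetation Index from NIR and red reflectance."""
    return (r_nir - r_red) / (r_nir + r_red)

def pri(r_531, r_570):
    """Photochemical Reflectance Index from 531 nm and 570 nm reflectance."""
    return (r_531 - r_570) / (r_531 + r_570)

# Hypothetical reflectances for a healthy canopy
v = ndvi(0.45, 0.05)   # ≈ 0.8, strong red-edge contrast
p = pri(0.048, 0.050)  # small negative value, typical daytime PRI
```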

  16. Ground-based monitoring of solar radiation in Moldova

    Science.gov (United States)

    Aculinin, Alexandr; Smicov, Vladimir

    2010-05-01

Integrated measurements of solar radiation in Kishinev, Moldova have been carried out by the Atmospheric Research Group (ARG) at the Institute of Applied Physics since 2003. The direct, diffuse and total components of solar and atmospheric long-wave radiation are measured with the radiometric complex at the ground-based solar-radiation monitoring station. Measurements are performed on stationary and moving platforms equipped with a set of 9 broadband solar-radiation sensors covering the wavelength range from UV-B to IR. A detailed description of the station can be found at http://arg.phys.asm.md. The ground station is located in the urban environment of Kishinev city (47.00N; 28.56E). A summary of observational data acquired at the station over the short-term period from 2004 to 2009 is presented below. Solar-radiation measurements were made using CM11 (280-3000 nm) and CH1 sensors (Kipp&Zonen). Over the course of a year, the maximum and minimum monthly sums of total radiation were ~706.4 MJm-2 in June and ~82.1 MJm-2 in December, respectively. Monthly sums of direct solar radiation (on a horizontal plane) show maximum and minimum values of ~456.9 MJm-2 in July and ~25.5 MJm-2 in December, respectively. On average, within a year direct radiation slightly predominates over scattered radiation, 51% versus 49%. During the year, the percentage contribution of direct radiation to total radiation is ~55-65% from May to September; in the remaining months it decreases, reaching a minimum of ~28% in December. On average, the annual sum of total solar radiation is ~4679.9 MJm-2, and the period from April to September accounts for ~76% of the annual total. The annual sum of sunshine duration is ~2149 hours, which is ~48% of the possible sunshine duration. On average, within a year the maximum and minimum sunshine duration is ~304 hours in

  17. Biosensors for EVA: Improved Instrumentation for Ground-based Studies

    Science.gov (United States)

    Soller, B.; Ellerby, G.; Zou, F.; Scott, P.; Jin, C.; Lee, S. M. C.; Coates, J.

    2010-01-01

During lunar excursions in the EVA suit, real-time measurement of metabolic rate is required to manage consumables and guide activities to ensure safe return to the base. Metabolic rate, or oxygen consumption (VO2), is normally measured from pulmonary parameters but cannot be determined with standard techniques in the oxygen-rich environment of a spacesuit. Our group has developed novel near-infrared spectroscopic (NIRS) methods to calculate muscle oxygen saturation (SmO2), hematocrit, and pH, and we recently demonstrated that we can use our NIRS sensor to measure VO2 on the leg during cycling. Our NSBRI project has 4 objectives: (1) increase the accuracy of the metabolic rate calculation through improved prediction of stroke volume; (2) investigate the relative contributions of calf and thigh oxygen consumption to the metabolic rate calculation for walking and running; (3) demonstrate that the NIRS-based noninvasive metabolic rate methodology is sensitive enough to detect a decrement in VO2 in a space analog; and (4) improve instrumentation to allow testing within a spacesuit. Over the past year we have made progress on all four objectives, but the most significant progress was made in improving the instrumentation. The NIRS system currently in use at JSC is based on fiber-optic technology. Optical fiber bundles deliver light from a source in the monitor to the patient, and return light reflected from the patient's muscle to the monitor for spectroscopic analysis. The fiber-optic cables are large and fragile, and there is no way to get them in and out of the test spacesuit used for ground-based studies. With complementary funding from the US Army, we undertook a complete redesign of the sensor and control electronics to build a novel system small enough to be used within the spacesuit and portable enough to be used by a combat medic.
In the new system the filament lamp used in the fiber optic system was replaced with a novel broadband near infrared

  19. The experience of FURNAS in the development and implementation of automatic generation control through microprocessors

    Energy Technology Data Exchange (ETDEWEB)

    Ferreira, Luiz Edmundo S.; Espirito Santo, Sergio do; Freitas, Cesar A.B.; Santos, Jorge Luiz R.; Cotrim, Sergio P.R.; Fernandes, Lucilia M.D.; Praca, Antonio A.S.; Garrofe, Paulo H.S. [FURNAS, Rio de Janeiro, RJ (Brazil)

    1993-12-31

The objective of this work is to present the main points related to the development and implementation of the several phases of a digital system for the Automatic Generation Control (AGC) of FURNAS, using specific, dedicated microprocessors to control the processes. 1 ref., 3 figs.

  20. A program for assisting the automatic generation control of ELETRONORTE using an artificial neural network

    Energy Technology Data Exchange (ETDEWEB)

    Brito Filho, Pedro Rodrigues de; Nascimento Garcez, Jurandyr do [Para Univ., Belem, PA (Brazil). Centro Tecnologico; Charone Junior, Wady [Centrais Eletricas do Nordeste do Brasil S.A. (ELETRONORTE), Belem, PA (Brazil)

    1994-12-31

This work presents an application of an artificial neural network as a support for decision making in the automatic generation control (AGC) of ELETRONORTE. The software assists the real-time decisions of the AGC. (author) 2 refs., 6 figs., 1 tab.

  1. Evaluating the Potential of RTK-UAV for Automatic Point Cloud Generation in 3D Rapid Mapping

    Science.gov (United States)

    Fazeli, H.; Samadzadegan, F.; Dadrasjavan, F.

    2016-06-01

During disaster and emergency situations, 3D geospatial data can provide essential information for decision support systems. The use of geospatial data, with digital surface models as a basic reference, is mandatory to provide an accurate, quick emergency response in so-called rapid-mapping activities. The trade-off between accuracy requirements and time restrictions is critical in these situations. UAVs offer potential as alternative platforms for 3D point cloud acquisition because of their flexibility and practicability combined with low-cost implementation. Moreover, the high-resolution data collected from UAV platforms can provide a quick overview of the disaster area. The target of this paper is to test and evaluate a low-cost system for the generation of point clouds using imagery collected from a low-altitude small autonomous UAV equipped with a customized single-frequency RTK module. A customized multi-rotor platform is used in this study. Moreover, electronic hardware is used to simplify user interaction with the UAV, such as RTK-GPS/camera synchronization; besides the synchronization, lever-arm calibration is performed. The platform is equipped with a Sony NEX-5N 16.1-megapixel camera as the imaging sensor. The lens attached to the camera is ZEISS optics, a prime lens with F1.8 maximum aperture and 24 mm focal length, delivering outstanding images. All necessary calibrations were performed, and the flight was carried out over the area of interest at a flight height of 120 m above ground level, resulting in a 2.38 cm GSD. Prior to image acquisition, 12 signalized GCPs and 20 check points were distributed in the study area and measured with dual-frequency GPS via the RTK technique, with a horizontal accuracy of σ = 1.5 cm and a vertical accuracy of σ = 2.3 cm. Results of direct georeferencing are compared to these points, and experimental results show that decimeter-level accuracy for the 3D point cloud is achievable with the proposed system, that is suitable
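The reported 2.38 cm GSD follows from the usual nadir pinhole relation GSD = H · p / f (flight height times pixel pitch over focal length). A quick check; note the ~4.78 µm pixel pitch assumed here for the NEX-5N sensor is an estimate, not stated in the record:

```python
def ground_sample_distance(height_m, pixel_pitch_m, focal_length_m):
    """Ground footprint of one pixel for a nadir-looking pinhole camera."""
    return height_m * pixel_pitch_m / focal_length_m

# 120 m flight height, ~4.78 um pixel pitch (assumed), 24 mm lens
gsd = ground_sample_distance(120.0, 4.78e-6, 0.024)  # ≈ 0.0239 m, i.e. ~2.4 cm
```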

  2. Simulation Research on an Algorithm for Automatic Software Test Data Generation

    Institute of Scientific and Technical Information of China (English)

    黄丽芬

    2012-01-01

Test data generation is the most crucial part of software testing, and improving methods for automatic test-data generation is important for the degree of software test automation. Traditional genetic algorithms suffer from local optima and slow convergence, which makes automatic test-data generation inefficient. Addressing the defects of the genetic algorithm and the ant colony algorithm, this paper proposes a new software test-data generation algorithm that combines the two. First, the genetic algorithm, with its global search ability, is used to find a near-optimal solution; this solution is then converted into the initial pheromone of the ant colony algorithm; finally, the positive-feedback mechanism of the ant colony algorithm quickly finds the best test data. Experimental results show that the genetic-ant colony algorithm improves the efficiency of software test-data generation and has significant practical value.
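The hand-off described above (GA optimum seeding the ACO pheromone table) can be sketched on a toy search space; the fitness function, parameter values and names below are illustrative, not the paper's actual algorithm:

```python
import random

# Toy branch-distance style fitness: a test input x should drive execution
# toward a target branch, here modelled as "x == 37" on the domain [0, 100].
def fitness(x):
    return -abs(x - 37)

def ga_stage(pop_size=20, gens=30, lo=0, hi=100):
    """Global search: a simple GA returning its best candidate test input."""
    pop = [random.randint(lo, hi) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]
        children = [min(hi, max(lo, random.choice(parents) + random.randint(-5, 5)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=fitness)

def aco_stage(seed, iters=50, ants=10, lo=0, hi=100, evap=0.1):
    """Local search: pheromone table initialised from the GA optimum."""
    pheromone = {x: 1.0 for x in range(lo, hi + 1)}
    pheromone[seed] += 5.0  # transfer the GA result into the initial pheromone
    best = seed
    for _ in range(iters):
        values = list(pheromone)
        walked = random.choices(values, weights=[pheromone[v] for v in values], k=ants)
        for x in walked:
            if fitness(x) > fitness(best):
                best = x
            # evaporation plus fitness-proportional deposit (positive feedback)
            pheromone[x] = (1 - evap) * pheromone[x] + 1.0 + fitness(x) / 100.0
    return best

best_input = aco_stage(ga_stage())
```

In a real test-data generator the fitness would be a branch-coverage or branch-distance measure computed by instrumenting the program under test.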

  3. DEVELOPMENT AND TESTING OF GEO-PROCESSING MODELS FOR THE AUTOMATIC GENERATION OF REMEDIATION PLAN AND NAVIGATION DATA TO USE IN INDUSTRIAL DISASTER REMEDIATION

    Directory of Open Access Journals (Sweden)

    G. Lucas

    2015-08-01

Full Text Available This paper introduces research on the automatic preparation of remediation plans and navigation data for the precise guidance of heavy machinery in clean-up work after an industrial disaster. The input test data consist of a pollution-extent shapefile derived from the processing of hyperspectral aerial survey data from the Kolontár red mud disaster. Three algorithms were developed and the respective scripts written in Python. The first model draws a parcel clean-up plan. The model tests four different parcel orientations (0, 90, 45 and 135 degrees) and keeps the plan in which the clean-up parcels are least numerous, considering it the optimal spatial configuration. The second model shifts the clean-up parcels of a work plan both vertically and horizontally following a grid pattern, with a sampling distance of a fifth of a parcel width, and keeps the most optimal shifted version, again with the aim of reducing the final number of parcel features. The last model draws a navigation line in the middle of each clean-up parcel. The models work efficiently and achieve automatic optimized plan generation (parcels and navigation lines). Applying the first model, we demonstrated that depending on the size and geometry of the features of the contaminated-area layer, the number of clean-up parcels generated by the model varies in a range of 4% to 38% from plan to plan. Such a significant variation in the resulting feature numbers shows that identifying the optimal orientation can save work, time and money in remediation. The various tests demonstrated that the model gains efficiency when (1) the individual features of the contaminated area have a pronounced orientation in their geometry (features are long), and (2) the size of the pollution-extent features becomes closer to the size of the parcels (scale effect). The second model shows only a 1% difference with the variation of feature number, so this last model is less interesting for
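The orientation test in the first model can be sketched with pure geometry: rotate the pollution footprint, lay an axis-aligned parcel grid over its bounding box, and keep the rotation needing the fewest parcels (the polygon and parcel size below are hypothetical, and a real implementation would clip parcels to the polygon rather than use its bounding box):

```python
import math

def rotate(points, deg):
    """Rotate a polygon (list of (x, y)) about the origin by deg degrees."""
    t = math.radians(deg)
    return [(x * math.cos(t) - y * math.sin(t),
             x * math.sin(t) + y * math.cos(t)) for x, y in points]

def parcel_count(points, w, h):
    """Number of w x h parcels needed to cover the axis-aligned bounding box."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return math.ceil((max(xs) - min(xs)) / w) * math.ceil((max(ys) - min(ys)) / h)

def best_orientation(points, w, h, angles=(0, 90, 45, 135)):
    """Return the tested rotation that needs the fewest clean-up parcels."""
    return min(angles, key=lambda a: parcel_count(rotate(points, a), w, h))

# A long, diagonally oriented contaminated strip: rotating it by 45 degrees
# aligns it with the grid and sharply reduces the parcel count (36 -> 8 here).
strip = [(0, 0), (10, 10), (11, 9), (1, -1)]
angle = best_orientation(strip, 2.0, 2.0)  # → 45
```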

  4. Gravitational waves from merging intermediate-mass black holes: II. Event rates at ground-based detectors

    CERN Document Server

    Shinkai, Hisa-aki; Ebisuzaki, Toshikazu

    2016-01-01

Based on a dynamical formation model of a supermassive black hole (SMBH), we estimate the expected observational profile of gravitational waves at ground-based detectors, such as KAGRA or advanced LIGO/VIRGO. Given that the second generation of detectors has enough sensitivity from 10 Hz and up (especially KAGRA, due to its location with less seismic noise), we are able to detect the ring-down gravitational wave of a BH of the mass $M 1$ per year for $\rho=10$. Thus if we observe a BH with more than $100 M_\odot$ in future gravitational-wave observations, our model naturally explains its source.

  5. The advances in airglow study and observation by the ground-based airglow observation network over China

    Science.gov (United States)

    Xu, Jiyao; Li, Qinzeng; Yuan, Wei; Liu, Xiao; Liu, Weijun; Sun, Longchang

    2017-04-01

at 630.0 nm over Xinglong, we studied the evolution (generation, amplification, and dissipation) of mesoscale field-aligned irregularity structures (FAIs) (~150 km) associated with a medium-scale traveling ionospheric disturbance (MSTID) event. We also investigate the statistical features of equatorial plasma bubbles (EPBs) using airglow images from 2012 to 2014 from a ground-based network of four imagers in the equatorial region of China.

  6. Development of Software for Automatic DVH Generation and Comparison in the Eclipse TPS

    Institute of Scientific and Technical Information of China (English)

    谢朝; 骆科林; 邹炼; 胡金有

    2016-01-01

Objective: To automatically and rapidly calculate the dose-volume histogram (DVH) of a treatment plan and compare it with the requirements of the physician's prescription. Methods: The hotkey scripting language AutoHotkey and the programming language C# were used to develop ShowDVH, software for automatic DVH generation and comparison in the Eclipse 11.0 treatment planning system. ShowDVH consists of modules for prescription-document generation, DVH operation functions in C#, software visualization, and DVH comparison-report generation. Results: For ten clinical cases of each of several different cancers run under Eclipse TPS 11.0, ShowDVH not only generated DVH reports but also accurately determined whether the treatment plans met the requirements of the physician's prescription, giving direction for setting the optimization parameters of intensity-modulated radiation therapy. Conclusions: ShowDVH is user-friendly and powerful software that rapidly generates and compares DVHs in Eclipse TPS 11.0, greatly saving plan-design time and improving the working efficiency of radiation therapy physicists.
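The comparison step reduces to checking DVH-derived metrics against prescription constraints. A hedged sketch over a flattened voxel-dose array (the metric names, doses and threshold below are illustrative, not ShowDVH's actual interface):

```python
import numpy as np

def v_at_dose(dose_gy, threshold_gy):
    """V_x: fraction of the structure volume receiving at least threshold_gy."""
    return float(np.mean(dose_gy >= threshold_gy))

def d_at_volume(dose_gy, volume_fraction):
    """D_x: minimum dose received by the hottest volume_fraction of the structure."""
    return float(np.percentile(dose_gy, 100.0 * (1.0 - volume_fraction)))

# Hypothetical voxel doses (Gy) for one structure
dose = np.array([10.0, 20.0, 30.0, 40.0])
v20 = v_at_dose(dose, 20.0)       # 0.75: three of four voxels get >= 20 Gy
d95 = d_at_volume(dose, 0.95)     # dose covering 95% of the volume
meets_rx = d95 >= 11.0            # example prescription check
```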

  7. Automatic generation of customized report forms in Delphi based on ADO technology

    Institute of Scientific and Technical Information of China (English)

    丁风海; 夏伟; 邱斌; 王琳娜

    2012-01-01

ADO (ActiveX Data Objects) is an OLE DB technology developed by Microsoft. Based on COM data-access rules and APIs, it provides a consistent, high-performance and highly compatible data-access interface. Developing application programs with ADO allows developers to control database access more easily and thus to produce database-access programs that meet user requirements. This paper focuses on the database-access mechanism and implementation of ADO technology in Delphi and, through a simple example, describes in detail how highly specialized customized report forms are generated automatically in combination with OLE Automation technology.

  8. Automatic Generation of the Forward and Inverse Kinematics of a Reconfigurable Robot

    Institute of Scientific and Technical Information of China (English)

    费燕琼; 冯光涛; 赵锡芳; 徐卫良

    2000-01-01

Once the configuration of a reconfigurable modular robot is finalized, the ordered link and joint modules are obtained and the product-of-matrix-exponentials (PME) method is adopted, so the forward kinematics is generated automatically. For the inverse kinematics, the equations in the body coordinate system are transformed to the fixed coordinate system; after identifying the underlying regularities, the problem is divided into 4 sub-problems, and with the PME method an algorithm that automatically solves the inverse kinematics of the modular robot, the inverse-kinematics product-of-exponentials (PMEI) method, is obtained. The automatic generation of forward and inverse solutions for examples under the Windows environment demonstrates the feasibility of the method.
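The forward-kinematics step is the standard product-of-exponentials chain T = e^{[S1]θ1} ··· e^{[Sn]θn} M. A minimal numpy sketch for a planar 2R arm (the screw axes, link lengths and home pose are chosen for this toy example, not taken from the paper; `exp_twist` assumes a unit rotation axis):

```python
import numpy as np

def exp_twist(S, theta):
    """SE(3) exponential of a unit screw axis S = (w, v) scaled by theta."""
    w, v = np.asarray(S[:3], float), np.asarray(S[3:], float)
    W = np.array([[0, -w[2], w[1]],
                  [w[2], 0, -w[0]],
                  [-w[1], w[0], 0]])
    R = np.eye(3) + np.sin(theta) * W + (1 - np.cos(theta)) * (W @ W)
    p = (np.eye(3) * theta + (1 - np.cos(theta)) * W
         + (theta - np.sin(theta)) * (W @ W)) @ v
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, p
    return T

def fk_poe(screws, thetas, M):
    """Forward kinematics: product of exponentials times the home pose M."""
    T = np.eye(4)
    for S, th in zip(screws, thetas):
        T = T @ exp_twist(S, th)
    return T @ M

# Planar 2R arm with unit links along x; joint axes are z through (0,0,0) and (1,0,0)
S1 = [0, 0, 1, 0, 0, 0]
S2 = [0, 0, 1, 0, -1, 0]           # v = -w x q with q = (1, 0, 0)
M = np.eye(4); M[0, 3] = 2.0        # home pose: end-effector at (2, 0, 0)
T = fk_poe([S1, S2], [0.0, np.pi / 2], M)  # end-effector ≈ (1, 1, 0)
```

Because each module contributes one screw axis, concatenating the exponentials in module order is exactly what makes automatic generation of the forward kinematics possible for any assembled configuration.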

  9. Automatic control of direct steam generation plants with parabolic-trough solar collectors

    Energy Technology Data Exchange (ETDEWEB)

    Valenzuela Gutierrez, L.

    2008-07-01

The main objective of this dissertation has been to contribute to the automatic-mode operation of a new generation of direct-steam-generation solar plants with parabolic-trough collectors. The dissertation starts by introducing parabolic-trough collector solar thermal technology for the generation of process steam, or of steam for a Rankine cycle in the case of power generation, which is currently the most developed and commercialized technology. At present, parabolic-trough collector technology is based on the configuration known as the heat-exchanger system, in which a heat transfer fluid recirculating through the absorber tubes of the solar collectors is heated in the solar field and then transfers that thermal energy to a heat exchanger for steam generation. Direct steam generation in the absorber tubes has long been identified as an ideal pathway to reduce generation cost by 15% and increase conversion efficiency by 20% (DISS, 1999). (Author)

  10. Research on a LabVIEW-Based Method for Automatically Generating Detection Reports

    Institute of Scientific and Technical Information of China (English)

    李磊; 杨峰; 何耀

    2012-01-01

In order to automatically print out the detection results and judgment conclusions of a composite detector for radar receivers, developed on the LabVIEW platform, in the designated template format and order, the global-variable technology and Word-document report-generation technology in LabVIEW were studied. Using the Report Generation Toolkit and LabVIEW programming, an automatic report-generation program for composite detection reports was designed. The program records the various detection results and judgment conclusions in real time and automatically generates a detection report based on the format of a designated Word template.

  11. Space-borne detection of volcanic carbon dioxide anomalies: The importance of ground-based validation networks

    Science.gov (United States)

    Schwandner, F. M.; Carn, S. A.; Corradini, S.; Merucci, L.; Salerno, G.; La Spina, A.

    2012-04-01

disparity of data formats and frequencies. The solution to achieving high-quality regional or global coverage may lie in the combination of the two methods, and satellite campaigns already rely on ground-based validation methods for their data products. To optimize a combination of space-borne and ground-based techniques, two aspects need to be addressed: (a) database and data-format compatibility, and (b) questions of instrumentation network design and compatibility. For regional spatio-temporal atmospheric variances, no homogeneous data formats and databases exist to date. In volcanology, such an approach already exists in the emerging WOVOdat database and its data-format convention (http://www.wovodat.org). For ground-based CO2 network designs in volcanology, and in greenhouse-gas monitoring in general, no approach has yet found common use that, besides integrating existing networks, would also enable nesting, scaling and fractal growth. In summary, promising first results from the GOSAT instrument indicate that space-based detection of volcanic CO2 pulses may indeed be possible. Once newer generations of satellite sensors become operational (e.g., OCO-2, OCO-3, and GOSAT-2), better ground resolution and scanning capabilities might enable detection and imaging of volcanic CO2 plumes. To derive the best-quality product from such measurements, we anticipate an urgent need to design new approaches to ground-based validation networks and their data products. Such networks would ideally become more compatible, with a higher degree of instrumentation compatibility and a common data-exchange format to connect distributed databases. Community-based efforts will be necessary to progress toward these goals.

  12. Ground-based Infrared Observations of Water Vapor and Hydrogen Peroxide in the Atmosphere of Mars

    Science.gov (United States)

    Encrenaz, T.; Greathouse, T. K.; Bitner, M.; Kruger, A.; Richter, M. J.; Lacy, J. H.; Bézard, B.; Fouchet, T.; Lefevre, F.; Forget, F.; Atreya, S. K.

    2008-11-01

    Ground-based observations of water vapor and hydrogen peroxide have been obtained in the thermal infrared range, using the TEXES instrument at the NASA Infrared Telescope Facility, for different times of the seasonal cycle.

  13. Informing hydrological models with ground-based time-lapse relative gravimetry: potential and limitations

    DEFF Research Database (Denmark)

    Bauer-Gottwein, Peter; Christiansen, Lars; Rosbjerg, Dan

    2011-01-01

    Coupled hydrogeophysical inversion emerges as an attractive option to improve the calibration and predictive capability of hydrological models. Recently, ground-based time-lapse relative gravity (TLRG) measurements have attracted increasing interest because there is a direct relationship between ...

  14. Changes in ground-based solar ultraviolet radiation during fire episodes: a case study

    CSIR Research Space (South Africa)

    Wright, CY

    2013-09-01

    Full Text Available about the relationship between fires and solar UVR without local high-quality column or ground-based ambient air pollution (particulate matter in particular) data; however, the threat to public health from fires was acknowledged....

  15. A decade of dark matter searches with ground-based Cherenkov telescopes

    Energy Technology Data Exchange (ETDEWEB)

    Doro, Michele, E-mail: michele.doro@pd.infn.it [University and INFN Padova, via Marzolo 8, 35131 Padova (Italy); Department of Physics and CERES, Campus Universitat Autonoma Barcelona, 08135 Bellaterra (Spain)

    2014-04-01

    In the general scenario of Weakly Interacting Massive Particles (WIMPs), dark matter (DM) can be observed via astrophysical gamma rays because photons are produced in various DM annihilation or decay processes, either as broad-band or line emission, or through the secondary processes of charged particles in the final stages of the annihilations or decays. The energy range of the former processes is accessible to current ground-based Imaging Atmospheric Cherenkov Telescopes (IACTs, like H.E.S.S., MAGIC and VERITAS). The strengths of this technique are (a) the expected DM gamma-ray spectra show peculiar features like bumps, spikes and cutoffs that make them clearly distinguishable from the smoother astrophysical spectra, and (b) the expected DM spectrum is universal, so by observing two or more DM targets with the same spectrum, a clear identification (beyond mere detection) of DM would be enabled. The role of IACTs may gain more importance in the future, as results from the LHC may hint at a DM particle with mass at the TeV scale or above, where the sensitivity of IACTs is unsurpassed by other experiments. In this contribution, a review of the search for DM with the current generation of IACTs is presented.

  16. On Advanced Estimation Techniques for Exoplanet Detection and Characterization using Ground-Based Coronagraphs

    Science.gov (United States)

    Lawson, Peter R.; Frazin, Richard; Barrett, Harrison; Caucci, Luca; Devaney, Nicholas; Furenlid, Lars; Gladysz, Szymon; Guyon, Olivier; Krist, John; Maire, Jerome; Marois, Christian; Mawet, Dimitri; Mouillet, David; Mugnier, Laurent; Perrin, Marshall; Poyneer, Lisa; Pueyo, Laurent; Savransky, Dmitry; Soummer, Remi

    2012-01-01

    The direct imaging of planets around nearby stars is exceedingly difficult. Only about 14 exoplanets with masses less than 13 times that of Jupiter have been imaged to date. The next generation of planet-finding coronagraphs, including VLT-SPHERE, the Gemini Planet Imager, Palomar P1640, and Subaru HiCIAO, have predicted contrast performance roughly a thousand times short of what would be needed to detect Earth-like planets. In this paper we review the state of the art in exoplanet imaging, most notably the method of Locally Optimized Combination of Images (LOCI), and we investigate the potential for improving the detectability of faint exoplanets through the use of advanced statistical methods based on the concepts of the ideal observer and the Hotelling observer. We provide a formal comparison of techniques through a blind data challenge and evaluate performance using the Receiver Operating Characteristic (ROC) and Localization ROC (LROC) curves. We place particular emphasis on the understanding and modeling of realistic sources of measurement noise in ground-based AO-corrected coronagraphs. The work reported in this paper is the result of interactions between the co-authors during a week-long workshop on exoplanet imaging that was held in Squaw Valley, California, in March of 2012.
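    As an illustrative sketch only (not the workshop's actual data challenge or noise model), an ROC curve of the kind used for this evaluation can be traced by sweeping a detection threshold over scores from planet-absent and planet-present trials; the Gaussian score model and all names below are assumptions for demonstration:

```python
import numpy as np

def roc_curve(scores_absent, scores_present):
    """Sweep a detection threshold over all observed scores and return
    arrays of false-positive rate and true-positive rate."""
    thresholds = np.sort(np.concatenate([scores_absent, scores_present]))[::-1]
    fpr = np.array([(scores_absent >= t).mean() for t in thresholds])
    tpr = np.array([(scores_present >= t).mean() for t in thresholds])
    return fpr, tpr

# Toy Gaussian detection statistics (assumed model, not real coronagraph data)
rng = np.random.default_rng(0)
absent = rng.normal(0.0, 1.0, 5000)    # detection statistic, no planet
present = rng.normal(2.0, 1.0, 5000)   # detection statistic, planet injected
fpr, tpr = roc_curve(absent, present)
# Area under the curve via the trapezoid rule; 0.5 = guessing, 1.0 = perfect
auc = np.sum(np.diff(fpr) * 0.5 * (tpr[1:] + tpr[:-1]))
```

    The area under the ROC curve summarizes detectability in a single number, which is what makes ROC/LROC analysis suitable for a formal, observer-independent comparison of techniques.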

  17. Complementing the ground-based CMB Stage-4 experiment on large scales with the PIXIE satellite

    CERN Document Server

    Calabrese, Erminia; Dunkley, Jo

    2016-01-01

    We present forecasts for cosmological parameters from future Cosmic Microwave Background (CMB) data measured by the Stage-4 (S4) generation of ground-based experiments in combination with large-scale anisotropy data from the PIXIE satellite. We demonstrate the complementarity of the two experiments and focus on science targets that benefit from their combination. We show that a cosmic-variance-limited measurement of the optical depth to reionization provided by PIXIE, with error $\\sigma(\\tau)=0.002$, is vital for enabling a 5$\\sigma$ detection of the sum of the neutrino masses when combined with a CMB-S4 lensing measurement, and with lower-redshift constraints on the growth of structure and the distance-redshift relation. Parameters characterizing the epoch of reionization will also be tightly constrained; PIXIE's $\\tau$ constraint converts into $\\sigma(\\rm{z_{re}})=0.2$ for the mean time of reionization, and a kinematic Sunyaev-Zel'dovich measurement from S4 gives $\\sigma(\\Delta \\rm{z_{re}})=0.03$ for the du...

  18. Ground-Based Tests of Spacecraft Polymeric Materials under Oxygen Plasma Beam

    Science.gov (United States)

    Chernik, Vladimir; Novikov, Lev; Gaidar, Anna

    2016-07-01

    Spacecraft LEO missions are accompanied by destruction of polymeric material surfaces under the influence of atomic oxygen flow. Sources of molecular, plasma, and ion beams are used for accelerated ground-based tests of spacecraft materials. In this work, the application of an oxygen plasma accelerator of the duoplasmatron type is described. Plasma particles were accelerated to average speeds of 13-16 km/s. The influence of such a beam on materials leads to more intensive destruction of polymers than in LEO. This fact allows tests to be executed on an accelerated time scale using the method of effective fluence. Special measures were taken to decrease the concentration of both gaseous and electrode-material impurities in the oxygen beam. The results of simulative tests of spacecraft materials are considered alongside experiments in LEO. Comparison of the plasma-beam simulation with LEO data has shown conformity for the structures of a number of polymeric materials. The relative erosion yields (normalized with respect to polyimide) of the tested materials are practically equal to those in LEO. The obtained results justify using the plasma-generation mode with ion energies of 20-30 eV for accelerated testing of spacecraft materials for long-term LEO missions.

  19. The Effect of Pulsar Timing Noise and Glitches on Timing Analysis for Ground Based Telescopes Observation

    Science.gov (United States)

    Oña-Wilhelmi, E.; de Jager, O. C.; Contreras, J. L.; de los Reyes, R.; Fonseca, V.; López, M.; Lucarelli, F.; MAGIC Collaboration

    2003-07-01

    Pulsed emission from a number of gamma-ray pulsars is expected to be detectable with next-generation ground-based gamma-ray telescopes such as MAGIC and possibly H.E.S.S. within a few hours of observations. The sensitivity is, however, not sufficient to enable a detection within a few seconds as reached by radio surveys. In some cases we may be fortunate enough to do a period search given a few hours' data, but if the signal is marginal, the correct period parameters must be known to allow a folding of the gamma-ray arrival times. The residual phases are then subjected to a test for uniformity, from which the significance of a signal can be assessed. If contemporary radio parameters are not available, we have to extrapolate archival radio parameters to the observation time in question. Such an extrapolation must then be accurate enough to avoid significant pulse smearing. The pulsar ephemerides from the archival data of HartRAO and Princeton (between 1989 and 1998) provide an excellent opportunity to study the accuracy of extrapolations of such ephemerides to the present moment, if an appropriate time shift is introduced. The aim of this study is to investigate the smear in the gamma-ray pulse profile during a single night of observations.
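    The folding-and-uniformity-test procedure described above can be sketched generically as follows; the two-term spin model and the Rayleigh Z statistic used here are a standard textbook illustration, not necessarily the authors' exact pipeline:

```python
import numpy as np

def fold(toas, f0, f1, t0):
    """Fold photon arrival times (s) with spin frequency f0 (Hz) and
    frequency derivative f1 (Hz/s) referenced to epoch t0.
    Returns pulse phases in [0, 1)."""
    dt = np.asarray(toas) - t0
    return np.mod(f0 * dt + 0.5 * f1 * dt**2, 1.0)

def rayleigh_z(phases):
    """Rayleigh Z statistic testing phase uniformity on the circle.
    Under uniformity Z ~ chi-squared with 2 dof (mean 2); a strongly
    pulsed profile gives a large Z."""
    ang = 2.0 * np.pi * np.asarray(phases)
    n = len(ang)
    return 2.0 * (np.cos(ang).sum() ** 2 + np.sin(ang).sum() ** 2) / n

rng = np.random.default_rng(1)
uniform = rng.uniform(0.0, 1.0, 1000)             # no pulsation
pulsed = np.mod(rng.normal(0.5, 0.02, 1000), 1)   # narrow pulse at phase 0.5
z_flat, z_pulse = rayleigh_z(uniform), rayleigh_z(pulsed)
```

    Pulse smearing from an inaccurate ephemeris spreads the folded phases toward uniformity, which is exactly why extrapolation accuracy limits the achievable significance.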

  20. Simulated forecasts for primordial B-mode searches in ground-based experiments

    CERN Document Server

    Alonso, David; Naess, Sigurd; Thorne, Ben

    2016-01-01

    Detecting the imprint of inflationary gravitational waves on the $B$-mode polarization of the Cosmic Microwave Background (CMB) is one of the main science cases for current and next-generation CMB experiments. In this work we explore some of the challenges that ground-based facilities will have to face in order to carry out this measurement in the presence of Galactic foregrounds and correlated atmospheric noise. We present forecasts for Stage-3 (S3) and planned Stage-4 (S4) experiments based on the analysis of simulated sky maps using a map-based Bayesian foreground cleaning method. Our results thus consistently propagate the uncertainties on foreground parameters such as spatially-varying spectral indices, as well as the bias on the measured tensor-to-scalar ratio $r$ caused by an incorrect modelling of the foregrounds. We find that S3 and S4-like experiments should be able to put constraints on $r$ of the order $\\sigma(r)=(0.5-1.0)\\times10^{-2}$ and $\\sigma(r)=(0.5-1.0)\\times10^{-3}$ respectively, assuming...

  1. Ground-based Optical Observations of Geophysical Phenomena: Aurora Borealis and Meteors

    Science.gov (United States)

    Samara, Marilia

    2010-10-01

    Advances in low-light-level imaging technology have enabled significant improvements in the ground-based study of geophysical phenomena. In this talk we focus on two such phenomena that occur in the Earth's ionosphere: aurorae and meteors. Imaging the aurora, which is created by the interplay of the Earth's magnetosphere, ionosphere and atmosphere, provides a tool for remote sensing of physical processes that are otherwise very difficult to study. By quantifying the intensities, scale sizes and lifetimes of auroral structures, we can gain significant insight into the physics behind the generation of the aurora and the interaction of the magnetosphere with the solar wind. Additionally, the combination of imaging with radars provides complementary data and therefore more information than either method on its own. Meteor observations are a perfect example of this, because radar can accurately determine only the line-of-sight component of velocity, while imaging provides the direction of motion, the perpendicular velocity and brightness (a proxy for mass), enabling a much more accurate determination of the full velocity vector and mass.
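    The velocity reconstruction just described amounts to vector addition of two orthogonal components; the following is a schematic sketch (the function name and the example numbers are assumptions, not the observing pipeline):

```python
import numpy as np

def full_velocity(v_los, los_dir, v_perp, perp_dir):
    """Combine a radar line-of-sight speed with an imaging-derived
    perpendicular speed into a full 3-D velocity vector (same units)."""
    los_dir = np.asarray(los_dir, float)
    los_dir /= np.linalg.norm(los_dir)
    perp_dir = np.asarray(perp_dir, float)
    perp_dir /= np.linalg.norm(perp_dir)
    if abs(np.dot(los_dir, perp_dir)) > 1e-6:
        raise ValueError("perpendicular direction must be orthogonal to LOS")
    return v_los * los_dir + v_perp * perp_dir

# 30 km/s along the radar beam plus 40 km/s across it -> 50 km/s total speed
v = full_velocity(30.0, [0, 0, 1], 40.0, [1, 0, 0])
speed = np.linalg.norm(v)
```

    Neither instrument alone recovers `speed`: the radar sees only the 30 km/s component, the imager only the 40 km/s component; combined they yield the full vector.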

  2. CO2 Total Column Variability From Ground-Based FTIR Measurements Over Central Mexico

    Science.gov (United States)

    Baylon, J. L.; Stremme, W.; Plaza, E.; Bezanilla, A.; Grutter, M.; Hase, F.; Blumenstock, T.

    2014-12-01

    There are now several space missions dedicated to measuring greenhouse gases in order to improve understanding of the carbon cycle. Ground-based measurement sites are of great value in the validation process; however, there are only a few stations at tropical latitudes. We present measurements of solar-absorption infrared spectra recorded at two locations over Central Mexico: the High-Altitude Station Altzomoni (19.12 N, 98.65 W), located in the Izta-Popo National Park outside of Mexico City, and the UNAM Atmospheric Observatory (19.32 N, 99.17 W) in Mexico City. These measurements were performed using a high-resolution Fourier transform infrared (FTIR) spectrometer (Bruker HR 120/5) at Altzomoni and a moderate-resolution FTIR (Bruker Vertex 80) within the city. In this work, we present the first results for total vertical columns of CO2 derived from near-infrared spectra recorded at both locations using the retrieval code PROFFIT. We present the seasonal cycle and variability from the measurements, as well as the full diagnostics of the retrieval in order to assess its quality, and discuss the differences between the two instruments and locations (altitudes, urban vs. remote). This work aims to contribute to generating high-quality datasets for satellite validation.

  3. Heavy precipitation retrieval from combined satellite observations and ground-based lightning measurements

    Science.gov (United States)

    Mugnai, A.; Dietrich, S.; Casella, D.; di Paola, F.; Formenton, M.; Sanò, P.

    2010-09-01

    We have developed a series of algorithms for the retrieval of precipitation (especially, heavy precipitation) over the Mediterranean area using satellite observations from the available microwave (MW) radiometers onboard low Earth orbit (LEO) satellites and from the visible-infrared (VIS-IR) SEVIRI radiometer onboard the European geosynchronous (GEO) satellite Meteosat Second Generation (MSG), in conjunction with lightning data from ground-based networks - such as ZEUS and LINET. These are: • A new approach for precipitation retrieval from space (which we call the Cloud Dynamics and Radiation Database approach, CDRD) that incorporates lightning and environmental/dynamical information in addition to the upwelling microwave brightness temperatures (TB’s) so as to reduce the retrieval uncertainty and improve the retrieval performance; • A new combined MW-IR technique for producing frequent precipitation retrievals from space (which we call PM-GCD technique), that uses passive-microwave (PM) retrievals in conjunction with lightning information and the Global Convection Detection (GCD) technique to discriminate deep convective clouds within the GEO observations; • A new morphing approach (which we call the Lightning-based Precipitation Evolving Technique, L-PET) that uses the available lightning measurements for propagating the rainfall estimates from satellite-borne MW radiometers to a much higher time resolution than the MW observations. We will present and discuss our combined MW/IR/lightning precipitation algorithms and analyses with special reference to some case studies over the western Mediterranean.

  4. Simulated forecasts for primordial B -mode searches in ground-based experiments

    Science.gov (United States)

    Alonso, David; Dunkley, Joanna; Thorne, Ben; Næss, Sigurd

    2017-02-01

    Detecting the imprint of inflationary gravitational waves on the B -mode polarization of the cosmic microwave background (CMB) is one of the main science cases for current and next-generation CMB experiments. In this work we explore some of the challenges that ground-based facilities will have to face in order to carry out this measurement in the presence of galactic foregrounds and correlated atmospheric noise. We present forecasts for stage-3 (S3) and planned stage-4 (S4) experiments based on the analysis of simulated sky maps using a map-based Bayesian foreground-cleaning method. Our results thus consistently propagate the uncertainties on foreground parameters such as spatially varying spectral indices, as well as the bias on the measured tensor-to-scalar ratio r caused by an incorrect modeling of the foregrounds. We find that S3 and S4-like experiments should be able to put constraints on r of the order σ (r )=(0.5 - 1.0 )×10-2 and σ (r )=(0.5 - 1.0 )×10-3 respectively, assuming instrumental systematic effects are under control. We further study deviations from the fiducial foreground model, finding that, while the effects of a second polarized dust component would be minimal on both S3 and S4, a 2% polarized anomalous dust emission component would be clearly detectable by stage-4 experiments.

  5. Ground-based infrared surveys: imaging the thermal fields at volcanoes and revealing the controlling parameters.

    Science.gov (United States)

    Pantaleo, Michele; Walter, Thomas

    2013-04-01

    Temperature monitoring is a widespread procedure in the frame of volcano hazard monitoring, as temperature changes are expected to reflect changes in volcanic activity. We propose a new approach within thermal monitoring that is meant to shed light on the parameters controlling fluid pathways and fumarole sites by using infrared measurements. Ground-based infrared cameras allow one to remotely image the spatial distribution, geometric pattern and amplitude of fumarole fields on volcanoes at metre to centimetre resolution. Infrared mosaics and time series are generated and interpreted, by integrating geological field observations and modeling, to define the setting of the volcanic degassing system at shallow level. We present results for different volcano morphologies and show that lithology, structures and topography control the appearance of fumarole fields through the creation of permeability contrasts. We also show that the relative importance of these parameters is site-dependent. Deciphering the setting of the degassing system is essential for hazard assessment studies because it would improve our understanding of how the system responds to endogenous or exogenous modifications.

  6. Petascale Computing for Ground-Based Solar Physics with the DKIST Data Center

    Science.gov (United States)

    Berukoff, Steven J.; Hays, Tony; Reardon, Kevin P.; Spiess, DJ; Watson, Fraser; Wiant, Scott

    2016-05-01

    When construction is complete in 2019, the Daniel K. Inouye Solar Telescope will be the most capable large-aperture, high-resolution, multi-instrument solar physics facility in the world. The telescope is designed as a four-meter off-axis Gregorian, with a rotating Coude laboratory designed to simultaneously house and support five first-light imaging and spectropolarimetric instruments. In the current design, the facility and its instruments will generate data volumes of 3 PB per year and produce 10⁷-10⁹ metadata elements. The DKIST Data Center is being designed to store, curate, and process this flood of information, while providing association of science data and metadata to their acquisition and processing provenance. The Data Center will produce quality-controlled calibrated data sets and make them available freely and openly through modern search interfaces and APIs. Documented software and algorithms will also be made available through community repositories like Github for further collaboration and improvement. We discuss the current design and approach of the DKIST Data Center, describing the development cycle, early technology analysis and prototyping, and the roadmap ahead. We discuss our iterative development approach, the underappreciated challenges of calibrating ground-based solar data, the crucial integration of the Data Center within the larger Operations lifecycle, and how software and hardware support, intelligently deployed, will enable high-caliber solar physics research and community growth for the DKIST's 40-year lifespan.

  7. MetaSensing's FastGBSAR: ground based radar for deformation monitoring

    Science.gov (United States)

    Rödelsperger, Sabine; Meta, Adriano

    2014-10-01

    The continuous monitoring of ground deformation and structural movement has become an important task in engineering. MetaSensing introduces a novel sensor system, the Fast Ground Based Synthetic Aperture Radar (FastGBSAR), based on innovative technologies that have already been successfully applied to airborne SAR applications. The FastGBSAR allows the remote sensing of deformations of a slope or infrastructure from distances of up to 4 km. The FastGBSAR can be set up in two different configurations: in Real Aperture Radar (RAR) mode it is capable of accurately measuring displacements along a linear range profile, ideal for monitoring vibrations of structures like bridges and towers (displacement accuracy up to 0.01 mm); modal parameters can be determined within half an hour. Alternatively, in Synthetic Aperture Radar (SAR) configuration it produces two-dimensional displacement images with an acquisition time of less than 5 seconds, ideal for monitoring areal structures like dams, landslides and open-pit mines (displacement accuracy up to 0.1 mm). The MetaSensing FastGBSAR is the first ground-based SAR instrument on the market able to produce two-dimensional deformation maps at this high acquisition rate. In this way, deformation time series with high temporal and spatial resolution can be generated, giving detailed information useful for determining the deformation mechanisms involved and eventually predicting an incoming failure. The system is fully portable and can be quickly installed on bedrock or a basement. Data acquisition and processing can be fully automated, leading to low effort in instrument operation and maintenance. Due to the short acquisition time of the FastGBSAR, the coherence between two acquisitions is very high and the phase unwrapping is simplified enormously. This yields a high density of resolution cells with good quality and high reliability of the acquired deformations. The deformation maps can directly be used as input into an Early

  8. Geospace Science from Ground-based Magnetometer Arrays: Advances in Sensors, Data Collection, and Data Integration

    Science.gov (United States)

    Mann, Ian; Chi, Peter

    2016-07-01

    , acceleration, and loss of electrons in the radiation belts promise high profile science returns. Integrated, global scale data products also have potential importance and application for real-time monitoring of the space weather threats to electrical power grids from geomagnetically induced currents. Such data exploitation increasingly relies on the collaborations between multiple national magnetometer arrays to generate single data products with common file format and data properties. We review advances in geospace science which can be delivered by networks of ground-based magnetometers - in terms of advances in sensors, data collection, and data integration - including through collaborations within the Ultra-Large Terrestrial International Magnetometer Array (ULTIMA) consortium.

  9. Automatic Control System for Neutron Laboratory Safety

    Institute of Scientific and Technical Information of China (English)

    ZHAO; Xiao; ZHANG; Guo-guang; FENG; Shu-qiang; SU; Dan; YANG; Guo-zhao; ZHANG; Shuai

    2015-01-01

    In order to cooperate with the neutron generator experiment and realize automatic control during the experiment, a set of automatic control systems for neutron laboratory safety was designed. The system block diagram is shown as Fig.1. The automatic control device processes switch signals, so a PLC was selected as the core component.

  10. Cloud and precipitation properties from ground-based remote sensing instruments in East Antarctica

    Directory of Open Access Journals (Sweden)

    I. V. Gorodetskaya

    2014-07-01

    Full Text Available A new comprehensive cloud-precipitation-meteorological observatory has been established at Princess Elisabeth base, located in the escarpment zone of Dronning Maud Land, East Antarctica. The observatory consists of a set of ground-based remote sensing instruments (ceilometer, infrared pyrometer and vertically profiling precipitation radar) combined with automatic weather station measurements of near-surface meteorology, radiative fluxes, and snow accumulation. In this paper, the observatory is presented and the potential for studying the evolution of clouds and precipitating systems is illustrated by case studies. It is shown that the synergetic use of the set of instruments allows ice, mixed-phase and precipitating clouds to be distinguished, including some information on their vertical extent. In addition, wind-driven blowing-snow events can be distinguished from deeper precipitating systems. Cloud properties largely affect the surface radiative fluxes, with liquid-containing clouds dominating the radiative impact. A statistical analysis of all measurements (14 months in total, mainly occurring in summer/autumn) indicates that these liquid-containing clouds occur during as much as 20% of the cloudy periods. The cloud occurrence shows a strong bimodal distribution, with clear-sky conditions 51% of the time and complete overcast conditions 35% of the time. Snowfall occurred during 17% of the cloudy periods, with a predominance of light precipitation and only rare events with snowfall > 1 mm h−1 water equivalent (w.e.). Three such intensive snowfall events occurred during 2011, contributing to anomalously large annual snow accumulation. This is the first deployment of a precipitation radar in Antarctica, allowing assessment of the contribution of snowfall to the local surface mass balance. It is shown that on the one hand large accumulation events (>10 mm w.e. day−1) during the measurement period of 26 months were always associated with snowfall, but that

  11. Calibrating ground-based microwave radiometers: Uncertainty and drifts

    Science.gov (United States)

    Küchler, N.; Turner, D. D.; Löhnert, U.; Crewell, S.

    2016-04-01

    The quality of microwave radiometer (MWR) calibrations, including both the absolute radiometric accuracy and the spectral consistency, determines the accuracy of geophysical retrievals. The Microwave Radiometer Calibration Experiment (MiRaCalE) was conducted to evaluate the performance of MWR calibration techniques, especially the so-called Tipping Curve Calibrations (TCC) and Liquid Nitrogen Calibrations (LN2cal), by repeatedly calibrating a fourth-generation Humidity and Temperature Profiler (HATPRO-G4) that measures downwelling radiance between 20 GHz and 60 GHz. MiRaCalE revealed two major points for improving MWR calibrations: (i) the repetition frequency needed for MWR calibration techniques to correct drifts, which ensures stable long-term measurements; and (ii) that the spectral consistency of control measurements of a well-known reference is useful for estimating calibration accuracy. In addition, we determined the accuracy of the temperature of the HATPRO's liquid-nitrogen-cooled blackbody. TCCs and LN2cals were found to agree within 0.5 K when observing the liquid-nitrogen-cooled blackbody at a physical temperature of 77 K. This agreement of two different calibration techniques suggests that the brightness temperature of the LN2-cooled blackbody is accurate to within at least 0.5 K, a significant reduction of the uncertainties that had been assumed to vary between 0.6 K and 1.5 K when calibrating the HATPRO-G4. The error propagation of both techniques was found to behave almost linearly, leading to maximum uncertainties of 0.7 K when observing a scene associated with a brightness temperature of 15 K.

  12. Evaluation of Real-Time Ground-Based GPS Meteorology

    Science.gov (United States)

    Fang, P.; Bock, Y.; Gutman, S.

    2003-04-01

    We demonstrate and evaluate a system to estimate zenith tropospheric delays in real time (5-10 minute latency) based on the technique of instantaneous GPS positioning as described by Bock et al. [2000], using data from the Orange County Real Time GPS Network (OCRTN). OCRTN is an upgrade of a sub-network of SCIGN sites in southern California to low-latency (1-2 s), high-rate (1 Hz) data streaming. Currently, ten sites are streaming data (Ashtech binary MBEN format) by means of dedicated, point-to-point radio modems to a network hub that translates the asynchronous serial data to TCP/IP and onto a PC workstation residing on a local area network. Software residing on the PC allows multiple clients to access the raw data simultaneously through TCP/IP. One of the clients is a Geodetics RTD server that receives and archives (1) the raw 1 Hz network data, (2) estimates of instantaneous positions and zenith tropospheric delays, and (3) RINEX data decimated to 30 seconds. The distribution of nine of the sites approximates a right triangle with two 60 km legs, with the tenth site on Catalina Island a distance of about 50 km (over water) from the hypotenuse of the triangle. Relative zenith delays are estimated every second with a latency of less than a second. Median values are computed at a user-specified interval (e.g., 10 minutes), with outliers greater than 4 times the interquartile range rejected. We compare the results with those generated by our operational system using the GAMIT software, with a latency of 30-60 minutes. Earlier results (from a similar network) comparing 30-minute median RTD values to GAMIT 30-minute estimates indicate that the two solutions differ by about 1 cm. We also describe our approach to determining absolute zenith delays. If an Internet connection is available we will present a real-time demonstration. [Bock, Y., R. Nikolaidis, P. J. de Jonge, and M. Bevis, Instantaneous resolution of crustal motion at medium
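    The interval median with IQR-based outlier rejection mentioned above might look like the following sketch; this is one reading of the "4 times the interquartile range" rule, and the function name and parameters are assumptions:

```python
import numpy as np

def median_zenith_delay(samples, k=4.0):
    """Median of 1 Hz zenith-delay estimates over an interval, after
    rejecting values more than k interquartile ranges outside the
    quartiles (assumed interpretation of the 4xIQR rejection rule)."""
    s = np.asarray(samples, dtype=float)
    q1, q3 = np.percentile(s, [25, 75])
    iqr = q3 - q1
    keep = (s >= q1 - k * iqr) & (s <= q3 + k * iqr)
    return np.median(s[keep])

# 10 minutes of 1 Hz estimates (~2.4 m zenith delay) with two bad fixes
rng = np.random.default_rng(2)
interval = np.concatenate([rng.normal(2.4, 0.005, 600), [10.0, -5.0]])
zd = median_zenith_delay(interval)
```

    Rejecting outliers before taking the median guards against the occasional bad instantaneous position fix dominating the short-interval estimate.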

  13. Automatic Construction of Finite Algebras

    Institute of Scientific and Technical Information of China (English)

    张健

    1995-01-01

    This paper deals with model generation for equational theories, i.e., automatically generating (finite) models of a given set of (logical) equations. Our method of finite model generation and a tool for the automatic construction of finite algebras are described. Some examples are given to show the applications of our program. We argue that the combination of model generators and theorem provers enables us to get a better understanding of logical theories. A brief comparison between our tool and other similar tools is also presented.

  14. Study of Upper Albian rudist buildups in the Edwards Formation using ground-based hyperspectral imaging and terrestrial laser scanning

    Science.gov (United States)

    Krupnik, Diana; Khan, Shuhab; Okyay, Unal; Hartzell, Preston; Zhou, Hua-Wei

    2016-11-01

    Ground-based hyperspectral imaging is used for development of digital outcrop models which can facilitate detailed qualitative and quantitative sedimentological analysis and augment the study of depositional environment, diagenetic processes, and hydrocarbon reservoir characterization in areas which are physically inaccessible. For this investigation, ground-based hyperspectral imaging is combined with terrestrial laser scanning to produce mineralogical maps of Late Albian rudist buildups of the Edwards formation in the Lake Georgetown Spillway in Williamson County, Texas. The Edwards Formation consists of shallow water deposits of reef and associated interreef facies. It is an aquifer in western Texas and was investigated as a hydrocarbon play in south Texas. Hyperspectral data were registered to a geometrically accurate laser point cloud-generated mesh with sub-pixel accuracy and were used to map compositional variation by distinguishing spectral properties unique to each material. More calcitic flat-topped toucasid-rich bioherm facies were distinguished from overlying porous sucrosic dolostones, and peloid wackestones and packstones of back-reef facies. Ground truth was established by petrographic study of samples from this area. This research integrates high-resolution datasets to analyze geometrical and compositional properties of this carbonate formation at a finer scale than traditional methods have achieved and to model the geometry and composition of rudist buildups.

  15. DEM Development from Ground-Based LiDAR Data: A Method to Remove Non-Surface Objects

    Directory of Open Access Journals (Sweden)

    Maneesh Sharma

    2010-11-01

    Full Text Available Topography and land cover characteristics can have significant effects on infiltration, runoff, and erosion processes on watersheds. The ability to model the timing and routing of surface water and erosion is affected by the resolution of the digital elevation model (DEM). High-resolution ground-based Light Detection and Ranging (LiDAR) technology can be used to collect detailed topographic and land cover characteristic data. In this study, a method was developed to remove vegetation from ground-based LiDAR data to create high-resolution DEMs. Research was conducted on intensively studied rainfall-runoff plots on the USDA-ARS Walnut Gulch Experimental Watershed in Southeast Arizona. LiDAR data were used to generate 1 cm resolution digital surface models (DSMs) for 5 plots. DSMs created directly from LiDAR data contain non-surface objects such as vegetation cover. A vegetation removal method was developed which uses a slope threshold and a focal mean filter to remove vegetation and create bare-earth DEMs. The method was validated on a synthetic plot, where rocks and vegetation were added incrementally. Results of the validation showed a vertical error of ±7.5 mm in the final DEM.
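    A minimal sketch of the slope-threshold plus focal-mean idea described above; the threshold value, window size, and fill strategy here are illustrative assumptions, not the authors' published parameters:

```python
import numpy as np

def focal_mean(z, size=5):
    """Moving-window mean with edge padding."""
    pad = size // 2
    zp = np.pad(z, pad, mode="edge")
    out = np.zeros(z.shape, dtype=float)
    for dy in range(size):
        for dx in range(size):
            out += zp[dy:dy + z.shape[0], dx:dx + z.shape[1]]
    return out / (size * size)

def local_slope(z, cell):
    """Steepest slope (degrees) to any of the four direct neighbours,
    so an isolated spike is flagged at the spike cell itself."""
    zp = np.pad(z, 1, mode="edge")
    c = zp[1:-1, 1:-1]
    diffs = np.max(np.abs(np.stack([
        c - zp[:-2, 1:-1], c - zp[2:, 1:-1],
        c - zp[1:-1, :-2], c - zp[1:-1, 2:]])), axis=0)
    return np.degrees(np.arctan(diffs / cell))

def bare_earth(dsm, cell=0.01, slope_max=60.0, size=5):
    """Replace cells steeper than slope_max degrees (assumed to be
    vegetation or rocks) with the focal-mean surface: a crude bare-earth
    DEM from a 1 cm-resolution DSM."""
    mask = local_slope(dsm, cell) > slope_max
    return np.where(mask, focal_mean(dsm, size), dsm)

# Synthetic validation in the spirit of the paper: a flat plot with one
# 30 cm plant added at the centre
dsm = np.zeros((20, 20))
dsm[10, 10] = 0.3
dem = bare_earth(dsm)
```

    At 1 cm cell size, a 30 cm object produces near-vertical local slopes, so the slope threshold isolates vegetation returns that a coarse-resolution filter would miss.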

  16. Traveling magnetopause distortion related to a large-scale magnetosheath plasma jet: THEMIS and ground-based observations

    CERN Document Server

    Dmitriev, A V; 10.1029/2011JA016861

    2013-01-01

    Here, we present a case study of THEMIS and ground-based observations of the dayside magnetopause and of geomagnetic field perturbations related to the interaction of an interplanetary directional discontinuity (DD), as observed by ACE, with the magnetosphere on 16 June 2007. The interaction resulted in a large-scale local magnetopause distortion of an 'expansion-compression-expansion' (ECE) sequence that lasted for 15 min. The compression was caused by a very dense, cold, and fast high-beta magnetosheath plasma flow, a so-called plasma jet, whose kinetic energy was approximately three times higher than the energy of the incident solar wind. The plasma jet resulted in the effective penetration of the magnetosheath plasma inside the magnetosphere. A strong distortion of the Chapman-Ferraro current in the ECE sequence generated a tripolar magnetic pulse 'decrease-peak-decrease' (DPD) that was observed at low and middle latitudes by the INTERMAGNET network of ground-based magnetometers. The characteristics of th...

  17. Automatic Reading

    Institute of Scientific and Technical Information of China (English)

    胡迪

    2007-01-01

    Reading is the key to school success and, like any skill, it takes practice. A child learns to walk by practising until he no longer has to think about how to put one foot in front of the other. The great athlete practises until he can play quickly, accurately and without thinking. Educators call it automaticity.

  18. Tentative detection of clear-air turbulence using a ground-based Rayleigh lidar.

    Science.gov (United States)

    Hauchecorne, Alain; Cot, Charles; Dalaudier, Francis; Porteneuve, Jacques; Gaudo, Thierry; Wilson, Richard; Cénac, Claire; Laqui, Christian; Keckhut, Philippe; Perrin, Jean-Marie; Dolfi, Agnès; Cézard, Nicolas; Lombard, Laurent; Besson, Claudine

    2016-05-01

    Atmospheric gravity waves and turbulence generate small-scale fluctuations of wind, pressure, density, and temperature in the atmosphere. These fluctuations represent a real hazard for commercial aircraft and are known by the generic name of clear-air turbulence (CAT). Numerical weather prediction models do not resolve CAT and therefore provide only a probability of occurrence. A ground-based Rayleigh lidar was designed and implemented to remotely detect and characterize the atmospheric variability induced by turbulence at vertical scales between 40 m and a few hundred meters. Field measurements were performed at Observatoire de Haute-Provence (OHP, France) on 8 December 2008 and 23 June 2009. The estimate of the mean squared amplitude of two-dimensional fluctuations of the lidar signal showed an excess over the estimated contribution of instrumental noise. This excess can be attributed to atmospheric turbulence with a 95% confidence level. During the first night, data from a collocated stratosphere-troposphere (ST) radar were available. Altitudes of the turbulent layers detected by the lidar were roughly consistent with those of layers with enhanced radar echo. The derived values of the turbulence parameters Cn2 and CT2 were in the range of those published in the literature using ST radar data. However, the detection was at the limit of the instrumental noise, and additional measurement campaigns are highly desirable to confirm these initial results. This is, to our knowledge, the first successful attempt to detect CAT in the free troposphere using an incoherent Rayleigh lidar system. The lidar that was built may serve as a test bed for defining CAT-detection lidar systems carried aboard airliners.

  19. Mapping the East African Ionosphere Using Ground-based GPS TEC Measurements

    Science.gov (United States)

    Mengist, Chalachew Kindie; Kim, Yong Ha; Yeshita, Baylie Damtie; Workayehu, Abyiot Bires

    2016-03-01

    The East African ionosphere (3°S-18°N, 32°E-50°E) was mapped using Total Electron Content (TEC) measurements from ground-based GPS receivers situated at Asmara, Mekelle, Bahir Dar, Robe, Arbaminch, and Nairobi. Assuming a thin-shell ionosphere at 350 km altitude, we project the Ionospheric Pierce Point (IPP) of a slant TEC measurement with an elevation angle of >10° to its corresponding location on the map. We then infer the estimated values at any point of interest from the vertical TEC values at the projected locations by means of interpolation. The total number of projected IPPs is in the range of 24-66 at any one time. Since the projected IPPs are irregularly spaced, we used an inverse distance weighted interpolation method to obtain a spatial grid resolution of 1° in latitude and 1° in longitude. The TEC maps were generated for the year 2008, with a 2 hr temporal resolution. We note that TEC varies diurnally, with a peak in the late afternoon (at 1700 LT), due to the equatorial ionospheric anomaly. We have observed higher TEC values at low latitudes in both hemispheres compared to the magnetic equatorial region, capturing the ionospheric distribution of the equatorial anomaly. We have also confirmed the equatorial seasonal variation in the ionosphere, characterized by minimum TEC values during the solstices and maximum values during the equinoxes. We evaluate the reliability of the map, demonstrating a mean error (the difference between measured and interpolated values) in the range of 0.04-0.2 TECU (Total Electron Content Units). As more measured TEC values become available in this region, the TEC map will become more reliable, allowing us to study in detail the equatorial ionosphere of the African sector, where ionospheric measurements are currently very few.
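
    The inverse distance weighted interpolation step can be sketched as follows. This is an illustrative implementation under simplifying assumptions (planar latitude/longitude distances rather than great-circle distances, no quality screening of IPPs); the function and parameter names are hypothetical.

    ```python
    import numpy as np

    def idw_grid(lats, lons, tec, grid_lat, grid_lon, power=2.0):
        """Inverse-distance-weighted interpolation of scattered vertical-TEC
        values onto a regular grid (e.g. 1x1 degree). Each grid node gets a
        weighted mean of all measurements, with weights 1/d**power."""
        glon, glat = np.meshgrid(grid_lon, grid_lat)
        grid = np.empty_like(glat, dtype=float)
        for i in range(glat.shape[0]):
            for j in range(glat.shape[1]):
                d = np.hypot(lats - glat[i, j], lons - glon[i, j])
                if d.min() < 1e-6:
                    # Grid node coincides with a measurement: use it directly
                    grid[i, j] = tec[d.argmin()]
                else:
                    w = 1.0 / d**power
                    grid[i, j] = np.sum(w * tec) / np.sum(w)
        return grid
    ```

    A basic sanity property of IDW is that it reproduces measurement values exactly at measurement locations and never extrapolates outside the range of the input values.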

  20. Fine spectral structures in Jovian decametric radio emission observed by ground-based radio telescope.

    Science.gov (United States)

    Panchenko, M.; Brazhenko, A. I.; Shaposhnikov, V. E.; Konovalenko, A. A.; Rucker, H. O.

    2014-04-01

    Jupiter, with the largest planetary magnetosphere in the solar system, emits intense coherent non-thermal radio emission in a wide frequency range. This emission is a result of a complicated interaction between the dynamic Jovian magnetosphere and energetic particles, with the free energy supplied by planetary rotation and the interaction between Jupiter and the Galilean moons. Decametric radio emission (DAM) is the strongest component of Jovian radiation, observed in a frequency range from a few MHz up to 40 MHz. This emission is generated via the cyclotron maser mechanism in sources located along Jovian magnetic field lines. Depending on the time scale, Jovian DAM exhibits different complex spectral structures. We present observations of the Jovian decametric radio emission using the large ground-based radio telescope URAN-2 (Poltava, Ukraine), operated in the decametric frequency range. This telescope, one of the largest low-frequency telescopes in Europe, is equipped with high-performance digital radio spectrometers. The antenna array of URAN-2 consists of 512 crossed dipoles with an effective area of 28,000 m2 and a beam pattern of 3.5 x 7 deg. (at 25 MHz). The instrument enables continuous observations of Jovian radio emission over long periods of time. Jovian DAM has been observed continuously since Sep. 2012 (depending on Jupiter visibility) with relatively high time-frequency resolution (4 kHz, 100 ms) in a broad frequency range (8-32 MHz). We have detected a large number of fine spectral structures in the dynamic spectra of DAM, such as trains of S-bursts, quasi-continuous narrowband emission, narrow-band splitting events, and zebra stripe-like patterns. We analyzed mainly the fine structures associated with non-Io-controlled DAM. We discuss how the observed narrowband structures, which are most probably related to the propagation of the decametric radiation in Jupiter's ionosphere, can be used to study the plasma parameters in the inner Jovian magnetosphere.

  1. Ground based interferometric radar initial look at Longview, Blue Springs, Tuttle Creek, and Milford Dams

    Science.gov (United States)

    Deng, Huazeng

    Measurement of millimeter and smaller deformations using radar has been demonstrated in the literature. To address in part the limitations of current commercial satellite-based SAR datasets, a University of Missouri (MU) team worked with GAMMA Remote Sensing to develop a specialized (dual-frequency, polarimetric, and interferometric) ground-based real-aperture radar (GBIR) instrument. The GBIR device is portable with its tripod system and control electronics. It can be deployed to obtain data with high spatial resolution (on the order of 1 meter) and high temporal resolution (on the order of 1 minute). The high temporal resolution is well suited to measurements of rapid deformation. From the same geodetic position, the GBIR can collect dual-frequency data sets using C-band and Ku-band. The overall goal of this project is to measure deformation in various scenarios by applying the GBIR system. Initial efforts focused on testing the system's performance on different types of targets. This thesis details a number of my experimental and processing activities at the start of the MU GBIR imaging project. For improved close-range capability, a wideband dual-polarized antenna option was produced and tested. For GBIR calibration, several trihedral corner reflectors were designed and fabricated. In addition to experimental activities and site selection, I participated in advanced data processing activities. I processed GBIR data in several ways, including single-look-complex (SLC) image generation, image registration, and interferometric processing. A number of initially processed GBIR image products are presented from four dams: Longview, Blue Springs, Tuttle Creek, and Milford. Excellent imaging performance of the MU GBIR has been observed for various target types such as riprap, concrete, soil, rock, metal, and vegetation. Strong coherence of the test scene has been observed in the initial interferograms.
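
    The core of the interferometric processing step mentioned above, forming an interferogram from two co-registered SLC images, can be sketched as follows. This is a simplified illustration, not the GAMMA processing chain: coherence is estimated over the whole scene rather than in a moving window, and flat-earth phase removal and phase unwrapping are omitted.

    ```python
    import numpy as np

    def interferogram(slc1, slc2):
        """Form the wrapped interferometric phase and a scene-wide coherence
        estimate from two co-registered single-look-complex (SLC) images."""
        ifg = slc1 * np.conj(slc2)      # cross product: phase = phi1 - phi2
        phase = np.angle(ifg)           # wrapped phase in (-pi, pi]
        # Coherence magnitude (normally computed in a small moving window)
        num = np.abs(np.sum(ifg))
        den = np.sqrt(np.sum(np.abs(slc1)**2) * np.sum(np.abs(slc2)**2))
        return phase, num / den
    ```

    For two identical scenes differing only by a uniform phase offset, the interferogram recovers that offset everywhere and the coherence is 1, which is the idealized version of the "strong coherence" reported for the dam test scenes.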

  2. Initial Results from the DEEPWAVE Airborne and Ground-Based Measurement Program in New Zealand in 2014

    Science.gov (United States)

    Fritts, Dave; Smith, Ron; Taylor, Mike; Doyle, Jim; Eckermann, Steve; Dörnbrack, Andreas; Rapp, Markus; Williams, Biff; Bossert, Katrina; Pautet, Dominique

    2015-04-01

    The deep-propagating gravity wave experiment (DEEPWAVE) was performed on and over New Zealand, Tasmania, the Tasman Sea, and the Southern Ocean with core airborne measurements extending from 5 June to 21 July 2014 and supporting ground-based measurements beginning in late May and extending beyond the airborne component. DEEPWAVE employed two aircraft, the NSF/NCAR GV and the German DLR Falcon. The GV carried the standard flight-level instruments, dropsondes, and the Microwave Temperature Profiler (MTP). It also hosted new airborne lidar and imaging instruments built specifically to allow quantification of gravity waves (GWs) from sources at lower altitudes (e.g., orography, convection, jet streams, fronts, and secondary GW generation) throughout the stratosphere and into the mesosphere and lower thermosphere (MLT). The new GV lidars included a Rayleigh lidar measuring atmospheric density and temperature from ~20-60 km and a sodium resonance lidar measuring sodium density and temperature at ~75-100 km. An airborne Advanced Mesosphere Temperature Mapper (AMTM) was also developed for the GV, and together with additional IR "wing" cameras, imaged the OH airglow temperature and/or intensity fields extending ~900 km across the GV flight track. The DLR Falcon was equipped with its standard flight-level instruments and an aerosol Doppler lidar able to measure radial winds below the Falcon where aerosol backscatter was sufficient. Additional ground-based instruments included a 449 MHz boundary layer radar, balloons at multiple sites, two ground-based Rayleigh lidars, a second ground-based AMTM, a Fabry Perot interferometer measuring winds and temperatures at ~87 and 95 km, and a meteor radar measuring winds from ~80-100 km. DEEPWAVE performed 26 GV flights, 13 Falcon flights, and an extensive series of ground-based measurements whether or not the aircraft were flying. Together, these observed many diverse cases of GW forcing, propagation, refraction, and dissipation.

  3. Ground-based follow-up in relation to Kepler Asteroseismic Investigation

    CERN Document Server

    Uytterhoeven, K; Bruntt, H; De Cat, P; Frandsen, S; Gutierrez-Soto, J; Kiss, L; Kurtz, D W; Marconi, M; Molenda-Zakowicz, J; Ostensen, R; Randall, S; Southworth, J; Szabo, R

    2010-01-01

    The Kepler space mission, successfully launched in March 2009, is providing continuous, high-precision photometry of thousands of stars simultaneously. The uninterrupted time-series of stars of all known pulsation types are a precious source for asteroseismic studies. The Kepler data do not provide information on the physical parameters, such as effective temperature, surface gravity, metallicity, and vsini, which are crucial for successful asteroseismic modelling. Additional ground-based time-series data are needed to characterize mode parameters in several types of pulsating stars. Therefore, ground-based multi-colour photometry and mid/high-resolution spectroscopy are needed to complement the space data. We present ground-based activities within KASC on selected asteroseismic Kepler targets of several pulsation types. (Based on observations made with the Isaac Newton Telescope, William Herschel Telescope, Nordic Optical Telescope, Telescopio Nazionale Galileo, Mercator Telescope (La Palma, Spain), and IAC-...

  4. BigBOSS: The Ground-Based Stage IV BAO Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Schlegel, David; Bebek, Chris; Heetderks, Henry; Ho, Shirley; Lampton, Michael; Levi, Michael; Mostek, Nick; Padmanabhan, Nikhil; Perlmutter, Saul; Roe, Natalie; Sholl, Michael; Smoot, George; White, Martin; Dey, Arjun; Abraham, Tony; Jannuzi, Buell; Joyce, Dick; Liang, Ming; Merrill, Mike; Olsen, Knut; Salim, Samir

    2009-04-01

    The BigBOSS experiment is a proposed DOE-NSF Stage IV ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with an all-sky galaxy redshift survey. The project is designed to unlock the mystery of dark energy using existing ground-based facilities operated by NOAO. A new 4000-fiber R=5000 spectrograph covering a 3-degree diameter field will measure BAO and redshift-space distortions in the distribution of galaxies and hydrogen gas spanning redshifts 0.2 < z < 3.5. The Dark Energy Task Force figure of merit (DETF FoM) for this experiment is expected to be equal to that of a JDEM mission for BAO, with the lower risk and cost typical of a ground-based experiment.

  5. Comparing Dawn, Hubble Space Telescope, and Ground-Based Interpretations of (4) Vesta

    CERN Document Server

    Reddy, Vishnu; Corre, Lucille Le; Scully, Jennifer E C; Gaskell, Robert; Russell, Christopher T; Park, Ryan S; Nathues, Andreas; Raymond, Carol; Gaffey, Michael J; Sierks, Holger; Becker, Kris J; McFadden, Lucy A

    2013-01-01

    Observations of asteroid 4 Vesta by NASA's Dawn spacecraft are interesting because its surface has the largest range of albedo, color, and composition of any asteroid visited by spacecraft to date. These hemispherical and rotational variations in surface brightness and composition have been attributed to impact processes since Vesta's formation. Prior to Dawn's arrival at Vesta, its surface properties were the focus of intense telescopic investigations for nearly a hundred years. Ground-based photometric and spectroscopic observations first revealed these variations, followed later by those using the Hubble Space Telescope. Here we compare interpretations of Vesta's rotation period, pole, albedo, topographic, color, and compositional properties from ground-based telescopes and HST with those from Dawn. Rotational spectral variations observed from ground-based studies are also consistent with those observed by Dawn. While the interpretation of some of these features was tenuous from past data, the interpretati...

  6. Ka-band bistatic ground-based SAR using noise signals

    Science.gov (United States)

    Lukin, K.; Mogyla, A.; Vyplavin, P.; Palamarchuk, V.; Zemlyaniy, O.; Tarasenko, V.; Zaets, N.; Skretsanov, V.; Shubniy, A.; Glamazdin, V.; Natarov, M.; Nechayev, O.

    2008-01-01

    Currently, one of the pressing problems is remote monitoring of the technical state of large objects. Different methods can be used for that purpose. The most promising of these relies on ground-based synthetic aperture radars (SAR) and differential interferometry. We have designed and tested a ground-based noise-waveform SAR based on noise radar technology [1] and synthetic aperture antennas [2]. This enabled us to build an instrument for precise all-weather monitoring of large objects in real time. We describe the main performance characteristics of the ground-based interferometric SAR, which uses a continuous Ka-band noise waveform as a probe signal. Results of laboratory trials and an evaluation of its main performance are presented as well.

  7. Automated Ground-based Time-lapse Camera Monitoring of West Greenland ice sheet outlet Glaciers: Challenges and Solutions

    Science.gov (United States)

    Ahn, Y.; Box, J. E.; Balog, J.; Lewinter, A.

    2008-12-01

    Monitoring Greenland outlet glaciers using remotely sensed data has drawn great attention in the earth science community for decades, and time series analysis of sensor data has provided important information on glacier flow variability by detecting speed and thickness changes, tracking features, and acquiring model input. Thanks to advancements in commercial digital camera technology and increased solid-state storage, we activated automatic ground-based time-lapse camera stations with high spatial/temporal resolution at west Greenland outlet glaciers and collected one-hour-interval data continuously for more than one year at some, but not all, sites. We believe that important information on ice dynamics is contained in these data and that terrestrial mono-/stereo-photogrammetry, along with digital image processing techniques, can provide the theoretical and practical fundamentals for data processing. Time-lapse images over these periods in west Greenland reveal various phenomena. Problems include rain, snow, fog, shadows, freezing of water on the camera enclosure window, image over-exposure, camera motion, sensor platform drift, foxes chewing instrument cables, and ravens pecking the plastic window. Other problems include feature identification, camera orientation, image registration, feature matching in image pairs, and feature tracking. Another obstacle is that non-metric digital cameras contain large distortions that must be compensated for precise photogrammetric use. Further, a massive number of images need to be processed in a way that is sufficiently computationally efficient. We meet these challenges by 1) identifying problems in possible photogrammetric processes, 2) categorizing them based on feasibility, and 3) clarifying limitations and alternatives, while emphasizing displacement computation and analyzing regional/temporal variability. We experiment with mono- and stereo-photogrammetric techniques with the aid of automatic correlation matching for efficiently handling the enormous number of images.

  8. Generating Capacity Prediction of the Automatic Tracking Power Generation System of a Photovoltaic Inflatable Membrane Greenhouse

    Institute of Scientific and Technical Information of China (English)

    徐小力; 刘秋爽; 见浪護

    2012-01-01

    A method to forecast the generating capacity of the automatic tracking power system of a photovoltaic inflatable membrane greenhouse was proposed, based on a self-adaptive mutation particle swarm neural network augmented with weather information. First, by combining historical electricity production data with meteorological data, the main factors affecting the generating capacity of the system were analyzed. A neural network forecasting model incorporating the weather forecast was then established. The self-adaptive mutation particle swarm algorithm was introduced to improve training, addressing the slow convergence, tendency to fall into local optima, and difficulty of convergence of the traditional neural network forecasting model based on the gradient-descent BP algorithm. The neural network was optimized with the adaptable mutation particle swarm optimization (AMPSO) algorithm, which introduces a mutation step into particle swarm optimization (PSO) to escape local optima. Experimental results showed that overall convergence performance was significantly improved by adopting AMPSO and that the premature convergence problem of PSO can be effectively avoided.
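
    The mutation-augmented PSO idea can be sketched as follows. This is a generic illustration, not the authors' algorithm: it minimizes an arbitrary objective (in their work the objective would be the neural network training error), and the inertia schedule, coefficients, and mutation rate are assumed values.

    ```python
    import numpy as np

    def ampso_minimize(f, dim, n_particles=20, iters=200, p_mut=0.1, seed=0):
        """PSO with a simple adaptive mutation step: a few particles are
        randomly perturbed each iteration, more strongly early on, to help
        the swarm escape local optima (the AMPSO idea from the abstract)."""
        rng = np.random.default_rng(seed)
        x = rng.uniform(-5, 5, (n_particles, dim))
        v = np.zeros_like(x)
        pbest = x.copy()
        pbest_val = np.array([f(p) for p in x])
        g = pbest[pbest_val.argmin()].copy()
        for t in range(iters):
            w = 0.9 - 0.5 * t / iters  # inertia weight decreases over time
            r1, r2 = rng.random(x.shape), rng.random(x.shape)
            v = w * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (g - x)
            x = x + v
            # Adaptive mutation: perturbation strength shrinks as t grows
            mask = rng.random(n_particles) < p_mut
            x[mask] += rng.normal(0.0, 1.0 - t / iters, (mask.sum(), dim))
            vals = np.array([f(p) for p in x])
            better = vals < pbest_val
            pbest[better], pbest_val[better] = x[better], vals[better]
            g = pbest[pbest_val.argmin()].copy()
        return g, pbest_val.min()
    ```

    Applied to neural network training, `f` would map a flattened weight vector to the training loss, replacing gradient-descent BP updates with swarm search.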

  9. Validating MODIS and Sentinel-2 NDVI Products at a Temperate Deciduous Forest Site Using Two Independent Ground-Based Sensors.

    Science.gov (United States)

    Lange, Maximilian; Dechant, Benjamin; Rebmann, Corinna; Vohland, Michael; Cuntz, Matthias; Doktor, Daniel

    2017-08-11

    Quantifying the accuracy of remote sensing products is a timely endeavor given the rapid increase in Earth observation missions. A validation site for Sentinel-2 products was hence established in central Germany. Automatic multispectral and hyperspectral sensor systems were installed in parallel with an existing eddy covariance flux tower, providing spectral information of the vegetation present at high temporal resolution. Normalized Difference Vegetation Index (NDVI) values from ground-based hyperspectral and multispectral sensors were compared with NDVI products derived from Sentinel-2A and Moderate-resolution Imaging Spectroradiometer (MODIS). The influence of different spatial and temporal resolutions was assessed. High correlations and similar phenological patterns between in situ and satellite-based NDVI time series demonstrated the reliability of satellite-based phenological metrics. Sentinel-2-derived metrics showed better agreement with in situ measurements than MODIS-derived metrics. Dynamic filtering with the best index slope extraction algorithm was nevertheless beneficial for Sentinel-2 NDVI time series despite the availability of quality information from the atmospheric correction procedure.
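
    The quantity compared across all four sensor systems is the standard Normalized Difference Vegetation Index, NDVI = (NIR - Red) / (NIR + Red). A minimal implementation, with a small constant added to guard against division by zero (an implementation choice, not part of the index definition):

    ```python
    import numpy as np

    def ndvi(nir, red):
        """Normalized Difference Vegetation Index from near-infrared and red
        reflectance. Values fall in [-1, 1]; dense vegetation is near 1."""
        nir = np.asarray(nir, dtype=float)
        red = np.asarray(red, dtype=float)
        return (nir - red) / np.maximum(nir + red, 1e-12)
    ```

    The same formula applies whether the bands come from a ground-based spectrometer, Sentinel-2, or MODIS; differences in the resulting time series then reflect sensor bandpasses, atmospheric correction, and spatial/temporal sampling rather than the index itself.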

  10. Comparison of Snow Albedo from MISR, MODIS and AVHRR with ground-based observations on the Greenland Ice Sheet

    Science.gov (United States)

    Stroeve, J. C.; Nolin, A.

    2001-12-01

    The surface albedo is an important climate parameter, as it controls the amount of solar radiation absorbed by the surface. For snow-covered surfaces, the albedo may be greater than 0.80, thereby allowing very little solar energy to be absorbed by the snowpack. As the snow ages and/or begins to melt, the albedo is reduced considerably, leading to enhanced absorption of solar radiation. Consequently, snow melt comprises an unstable, positive-feedback component of the climate system, which amplifies small perturbations to that system. Satellite remote sensing offers a means of measuring and monitoring the surface albedo of snow-covered areas. This study evaluates snow surface albedo retrievals from MISR, MODIS, and AVHRR through comparisons with surface albedo measurements obtained in Greenland. Data from automatic weather stations, in addition to other in situ data collected during 2000, provide the ground-based measurements with which to compare coincident clear-sky satellite albedo retrievals. In general, agreement with the satellite data is good. However, satellite calibration and difficulties in accurately representing the angular signature of the snow surface make it difficult to reach an albedo accuracy within 0.05.

  11. First ground-based FTIR-observations of methane in the tropics

    Directory of Open Access Journals (Sweden)

    A. K. Petersen

    2010-02-01

    Full Text Available Total column concentrations and volume mixing ratio profiles of methane have been retrieved from ground-based solar absorption FTIR spectra in the near-infrared recorded in Paramaribo (Suriname. The methane FTIR observations are compared with TM5 model simulations and satellite observations from SCIAMACHY, and represent the first validation of SCIAMACHY retrievals in the tropics using ground-based remote sensing techniques. Apart from local biomass burning features, our methane FTIR observations agree well with the SCIAMACHY retrievals and TM5 model simulations.

  12. Extended lateral heating of the nighttime ionosphere by ground-based VLF transmitters

    OpenAIRE

    İnan, Umran Savaş; Graf, K. L.; Spasojevic, M.; Marshall, R. A.; Lehtinen, N. G.; Foust, F. R.

    2013-01-01

    Published in the Journal of Geophysical Research: Space Physics, Vol. 118, 7783–7797, doi:10.1002/2013JA019337, 2013. The effects of ground-based very low frequency (VLF) transmitters on the lower ionospher...

  13. Status of advanced ground-based laser interferometers for gravitational-wave detection

    CERN Document Server

    Dooley, Katherine L; Dwyer, Sheila; Puppo, Paola

    2014-01-01

    Ground-based laser interferometers for gravitational-wave (GW) detection were first constructed starting 20 years ago, and as of 2010 the collection of several years' worth of science data at initial design sensitivities was complete. Upgrades to the initial detectors, together with construction of brand new detectors, are ongoing and feature advanced technologies to improve the sensitivity to GWs. This conference proceeding provides an overview of the common design features of ground-based laser interferometric GW detectors and establishes the context for the status updates of each of the four gravitational-wave detectors around the world: Advanced LIGO, Advanced Virgo, GEO600 and KAGRA.

  14. Asteroseismology of solar-type stars with Kepler: III. Ground-based data

    DEFF Research Database (Denmark)

    Karoff, Christoffer; Molenda-Żakowicz, J.

    2010-01-01

    We report on the ground-based follow-up program of spectroscopic and photometric observations of solar-like asteroseismic targets for the Kepler space mission. These stars constitute a large group of more than a thousand objects which are the subject of an intensive study by the Kepler Asteroseis...

  15. Estimation of solar irradiance using ground-based whole sky imagers

    CERN Document Server

    Dev, Soumyabrata; Lee, Yee Hui; Winkler, Stefan

    2016-01-01

    Ground-based whole sky imagers (WSIs) can provide localized images of the sky of high temporal and spatial resolution, which permits fine-grained cloud observation. In this paper, we show how images taken by WSIs can be used to estimate solar radiation. Sky cameras are useful here because they provide additional information about cloud movement and coverage, which are otherwise not available from weather station data. Our setup includes ground-based weather stations at the same location as the imagers. We use their measurements to validate our methods.

  16. Development of an automatic positioning system of photovoltaic panels for electric energy generation

    Energy Technology Data Exchange (ETDEWEB)

    Alves, Alceu F.; Cagnon, Odivaldo Jose [Universidade Estadual Paulista (DEE/FEB/UNESP), Bauru, SP (Brazil). Fac. de Engenharia. Dept. de Engenharia Eletrica; Seraphin, Odivaldo Jose [Universidade Estadual Paulista (DER/FCA/UNESP), Botucatu, SP (Brazil). Fac. de Ciencias Agronomicas. Dept. de Engenharia Rural

    2008-07-01

    This work presents an automatic positioning system for photovoltaic panels to improve the conversion of solar energy to electric energy. A prototype with automatic movement was developed, and its efficiency in generating electric energy was compared to that of another panel with the same characteristics but fixed in space. Preliminary results point to a significant increase in efficiency, obtained from a simplified movement process in which sensors are not used to determine the sun's apparent position; instead, equations for the relative Sun-Earth position are used. An innovative mechanical movement system is also presented, using two stepper motors to move the panel along two axes with independent movement, thereby saving energy during the positioning periods. The use of the proposed system in rural areas is suggested. (author)
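
    The sensorless approach described above relies on standard Sun-Earth geometry. A minimal sketch using Cooper's declination formula and the solar hour angle is shown below; it is illustrative only (the paper's exact equations are not given), and it ignores the equation of time and atmospheric refraction.

    ```python
    import math

    def solar_position(day_of_year, hour, latitude_deg):
        """Approximate solar elevation (degrees) and declination (degrees)
        from date, local solar time, and latitude, as a sensorless tracker
        controller might compute them. Simplifications: no equation of
        time, no refraction, hour is local solar time."""
        # Cooper's formula for solar declination
        decl = 23.45 * math.sin(math.radians(360.0 * (284 + day_of_year) / 365.0))
        # Hour angle: 15 degrees per hour, zero at solar noon
        hour_angle = 15.0 * (hour - 12.0)
        lat, d, h = map(math.radians, (latitude_deg, decl, hour_angle))
        elevation = math.degrees(math.asin(
            math.sin(lat) * math.sin(d)
            + math.cos(lat) * math.cos(d) * math.cos(h)))
        return elevation, decl
    ```

    A two-axis tracker can drive one stepper from the declination (seasonal axis) and the other from the hour angle (daily axis), which is why the two motors can move independently and only at scheduled positioning times.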

  17. Supporting a Diverse Community of Undergraduate Researchers in Satellite and Ground-Based Remote Sensing

    Science.gov (United States)

    Blake, R.; Liou-Mark, J.

    2012-12-01

    The U.S. remains in grave danger of losing its global competitive edge in STEM. To find solutions to this problem, the Obama Administration proposed two new national initiatives: the Educate to Innovate Initiative and the $100 million government/private industry initiative to train 100,000 STEM teachers and graduate 1 million additional STEM students over the next decade. To assist in ameliorating the national STEM plight, the New York City College of Technology has designed its NSF Research Experiences for Undergraduates (REU) program in satellite and ground-based remote sensing to target underrepresented minority students. Since the inception of the program in 2008, a total of 45 undergraduate students, of whom 38 (84%) are considered underrepresented minorities in STEM, have finished or are continuing with their research or are pursuing their STEM endeavors. The program comprises three primary components. The first component, Structured Learning Environments: Preparation and Mentorship, provides the REU Scholars with the skill sets necessary for proficiency in satellite and ground-based remote sensing research. The students are offered mini-courses in Geographic Information Systems, MATLAB, and Remote Sensing. They also participate in workshops on the Ethics of Research. Each REU student is a member of a team that consists of faculty mentors, postdoctoral/graduate students, and high school students. The second component, Student Support and Safety Nets, provides undergraduates a learning environment that supports them in becoming successful researchers. Special networking and Brown Bag sessions and an annual picnic with research scientists are organized so that REU Scholars are provided with opportunities to expand their professional community. Graduate school support is provided by offering free Graduate Record Examination preparation courses and workshops on the graduate school application process. Additionally, students are supported by college

  18. Sensitivity Comparison of Searches for Binary Black Hole Coalescences with Ground-based Gravitational-Wave Detectors

    CERN Document Server

    Mohapatra, Satya; Caudill, Sarah; Clark, James; Hanna, Chad; Klimenko, Sergey; Pankow, Chris; Vaulin, Ruslan; Vedovato, Gabriele; Vitale, Salvatore

    2014-01-01

    Searches for gravitational-wave transients from binary black hole coalescences typically rely on one of two approaches: matched filtering with templates and morphology-independent excess power searches. Multiple algorithmic implementations in the analysis of data from the first generation of ground-based gravitational-wave interferometers have used different strategies for the suppression of non-Gaussian noise transients, and targeted different regions of the binary black hole parameter space. In this paper we compare the sensitivity of three such algorithms: matched filtering with full coalescence templates, matched filtering with ringdown templates, and a morphology-independent excess power search. The comparison is performed at a fixed false alarm rate and relies on Monte Carlo simulations of binary black hole coalescences for spinning, non-precessing systems with total masses of 25-350 solar masses, which covers the parameter space of stellar-mass and intermediate-mass black hole binaries. We find that in the mas...

  19. Complete Attack Graph Automatic Generation Method Based on Attack Pattern

    Institute of Scientific and Technical Information of China (English)

    刘龙; 陈秀真; 李建华

    2013-01-01

    Attack graphs without loops are simple in structure, but their construction can omit some attack paths. To address this, the concept of a complete attack graph is introduced and an automatic generation method based on attack patterns is proposed. Network connectivity is obtained automatically by analyzing firewall configuration files, eliminating tedious manual input. The attack-pattern knowledge base is then enriched to better model attacker capabilities and cover most network attack types, and on this basis a breadth-first forward-search attack graph generation algorithm is designed and implemented in a prototype that generates complete attack graphs automatically. Experimental results show that the method offers a high degree of automation and low time consumption, and can be applied to large networks.
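    The breadth-first forward search at the core of the method can be sketched as follows; the toy network, host names, vulnerability table, and privilege labels are illustrative assumptions, not data from the paper:

    ```python
    from collections import deque

    # Illustrative toy inputs (assumptions, not from the paper): connectivity
    # as derived from firewall rules, and the privilege an attack pattern
    # grants on each exploitable host.
    connectivity = {"attacker": ["web"], "web": ["db"], "db": []}
    privilege_gained = {"web": "user", "db": "root"}

    def generate_attack_graph(start="attacker"):
        """Breadth-first forward search: expand every reachable host exactly
        once while recording every exploit edge, so no attack path is dropped."""
        edges = []
        visited = {start}
        queue = deque([start])
        while queue:
            host = queue.popleft()
            for target in connectivity[host]:
                if target in privilege_gained:
                    edges.append((host, target, privilege_gained[target]))
                    if target not in visited:
                        visited.add(target)
                        queue.append(target)
        return edges

    print(generate_attack_graph())
    # → [('attacker', 'web', 'user'), ('web', 'db', 'root')]
    ```

    Because each host enters the queue only once while every incoming exploit edge is still recorded, the search stays linear in the number of edges yet keeps the graph complete.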

  20. Design concepts for the Cherenkov Telescope Array CTA: an advanced facility for ground-based high-energy gamma-ray astronomy

    Science.gov (United States)

    Actis, M.; Agnetta, G.; Aharonian, F.; Akhperjanian, A.; Aleksić, J.; Aliu, E.; Allan, D.; Allekotte, I.; Antico, F.; Antonelli, L. A.; Antoranz, P.; Aravantinos, A.; Arlen, T.; Arnaldi, H.; Artmann, S.; Asano, K.; Asorey, H.; Bähr, J.; Bais, A.; Baixeras, C.; Bajtlik, S.; Balis, D.; Bamba, A.; Barbier, C.; Barceló, M.; Barnacka, A.; Barnstedt, J.; Barres de Almeida, U.; Barrio, J. A.; Basso, S.; Bastieri, D.; Bauer, C.; Becerra, J.; Becherini, Y.; Bechtol, K.; Becker, J.; Beckmann, V.; Bednarek, W.; Behera, B.; Beilicke, M.; Belluso, M.; Benallou, M.; Benbow, W.; Berdugo, J.; Berger, K.; Bernardino, T.; Bernlöhr, K.; Biland, A.; Billotta, S.; Bird, T.; Birsin, E.; Bissaldi, E.; Blake, S.; Blanch, O.; Bobkov, A. A.; Bogacz, L.; Bogdan, M.; Boisson, C.; Boix, J.; Bolmont, J.; Bonanno, G.; Bonardi, A.; Bonev, T.; Borkowski, J.; Botner, O.; Bottani, A.; Bourgeat, M.; Boutonnet, C.; Bouvier, A.; Brau-Nogué, S.; Braun, I.; Bretz, T.; Briggs, M. S.; Brun, P.; Brunetti, L.; Buckley, J. H.; Bugaev, V.; Bühler, R.; Bulik, T.; Busetto, G.; Buson, S.; Byrum, K.; Cailles, M.; Cameron, R.; Canestrari, R.; Cantu, S.; Carmona, E.; Carosi, A.; Carr, J.; Carton, P. H.; Casiraghi, M.; Castarede, H.; Catalano, O.; Cavazzani, S.; Cazaux, S.; Cerruti, B.; Cerruti, M.; Chadwick, P. M.; Chiang, J.; Chikawa, M.; Cieślar, M.; Ciesielska, M.; Cillis, A.; Clerc, C.; Colin, P.; Colomé, J.; Compin, M.; Conconi, P.; Connaughton, V.; Conrad, J.; Contreras, J. L.; Coppi, P.; Corlier, M.; Corona, P.; Corpace, O.; Corti, D.; Cortina, J.; Costantini, H.; Cotter, G.; Courty, B.; Couturier, S.; Covino, S.; Croston, J.; Cusumano, G.; Daniel, M. K.; Dazzi, F.; Angelis, A. De; de Cea Del Pozo, E.; de Gouveia Dal Pino, E. M.; de Jager, O.; de La Calle Pérez, I.; de La Vega, G.; de Lotto, B.; de Naurois, M.; de Oña Wilhelmi, E.; de Souza, V.; Decerprit, B.; Deil, C.; Delagnes, E.; Deleglise, G.; Delgado, C.; Dettlaff, T.; di Paolo, A.; di Pierro, F.; Díaz, C.; Dick, J.; Dickinson, H.; Digel, S. 
W.; Dimitrov, D.; Disset, G.; Djannati-Ataï, A.; Doert, M.; Domainko, W.; Dorner, D.; Doro, M.; Dournaux, J.-L.; Dravins, D.; Drury, L.; Dubois, F.; Dubois, R.; Dubus, G.; Dufour, C.; Durand, D.; Dyks, J.; Dyrda, M.; Edy, E.; Egberts, K.; Eleftheriadis, C.; Elles, S.; Emmanoulopoulos, D.; Enomoto, R.; Ernenwein, J.-P.; Errando, M.; Etchegoyen, A.; Falcone, A. D.; Farakos, K.; Farnier, C.; Federici, S.; Feinstein, F.; Ferenc, D.; Fillin-Martino, E.; Fink, D.; Finley, C.; Finley, J. P.; Firpo, R.; Florin, D.; Föhr, C.; Fokitis, E.; Font, Ll.; Fontaine, G.; Fontana, A.; Förster, A.; Fortson, L.; Fouque, N.; Fransson, C.; Fraser, G. W.; Fresnillo, L.; Fruck, C.; Fujita, Y.; Fukazawa, Y.; Funk, S.; Gäbele, W.; Gabici, S.; Gadola, A.; Galante, N.; Gallant, Y.; García, B.; García López, R. J.; Garrido, D.; Garrido, L.; Gascón, D.; Gasq, C.; Gaug, M.; Gaweda, J.; Geffroy, N.; Ghag, C.; Ghedina, A.; Ghigo, M.; Gianakaki, E.; Giarrusso, S.; Giavitto, G.; Giebels, B.; Giro, E.; Giubilato, P.; Glanzman, T.; Glicenstein, J.-F.; Gochna, M.; Golev, V.; Gómez Berisso, M.; González, A.; González, F.; Grañena, F.; Graciani, R.; Granot, J.; Gredig, R.; Green, A.; Greenshaw, T.; Grimm, O.; Grube, J.; Grudzińska, M.; Grygorczuk, J.; Guarino, V.; Guglielmi, L.; Guilloux, F.; Gunji, S.; Gyuk, G.; Hadasch, D.; Haefner, D.; Hagiwara, R.; Hahn, J.; Hallgren, A.; Hara, S.; Hardcastle, M. J.; Hassan, T.; Haubold, T.; Hauser, M.; Hayashida, M.; Heller, R.; Henri, G.; Hermann, G.; Herrero, A.; Hinton, J. A.; Hoffmann, D.; Hofmann, W.; Hofverberg, P.; Horns, D.; Hrupec, D.; Huan, H.; Huber, B.; Huet, J.-M.; Hughes, G.; Hultquist, K.; Humensky, T. B.; Huppert, J.-F.; Ibarra, A.; Illa, J. 
M.; Ingjald, J.; Inoue, Y.; Inoue, S.; Ioka, K.; Jablonski, C.; Jacholkowska, A.; Janiak, M.; Jean, P.; Jensen, H.; Jogler, T.; Jung, I.; Kaaret, P.; Kabuki, S.; Kakuwa, J.; Kalkuhl, C.; Kankanyan, R.; Kapala, M.; Karastergiou, A.; Karczewski, M.; Karkar, S.; Karlsson, N.; Kasperek, J.; Katagiri, H.; Katarzyński, K.; Kawanaka, N.; Kȩdziora, B.; Kendziorra, E.; Khélifi, B.; Kieda, D.; Kifune, T.; Kihm, T.; Klepser, S.; Kluźniak, W.; Knapp, J.; Knappy, A. R.; Kneiske, T.; Knödlseder, J.; Köck, F.; Kodani, K.; Kohri, K.; Kokkotas, K.; Komin, N.; Konopelko, A.; Kosack, K.; Kossakowski, R.; Kostka, P.; Kotuła, J.; Kowal, G.; Kozioł, J.; Krähenbühl, T.; Krause, J.; Krawczynski, H.; Krennrich, F.; Kretzschmann, A.; Kubo, H.; Kudryavtsev, V. A.; Kushida, J.; La Barbera, N.; La Parola, V.; La Rosa, G.; López, A.; Lamanna, G.; Laporte, P.; Lavalley, C.; Le Flour, T.; Le Padellec, A.; Lenain, J.-P.; Lessio, L.; Lieunard, B.; Lindfors, E.; Liolios, A.; Lohse, T.; Lombardi, S.; Lopatin, A.; Lorenz, E.; Lubiński, P.; Luz, O.; Lyard, E.; Maccarone, M. C.; Maccarone, T.; Maier, G.; Majumdar, P.; Maltezos, S.; Małkiewicz, P.; Mañá, C.; Manalaysay, A.; Maneva, G.; Mangano, A.; Manigot, P.; Marín, J.; Mariotti, M.; Markoff, S.; Martínez, G.; Martínez, M.; Mastichiadis, A.; Matsumoto, H.; Mattiazzo, S.; Mazin, D.; McComb, T. J. L.; McCubbin, N.; McHardy, I.; Medina, C.; Melkumyan, D.; Mendes, A.; Mertsch, P.; Meucci, M.; Michałowski, J.; Micolon, P.; Mineo, T.; Mirabal, N.; Mirabel, F.; Miranda, J. M.; Mirzoyan, R.; Mizuno, T.; Moal, B.; Moderski, R.; Molinari, E.; Monteiro, I.; Moralejo, A.; Morello, C.; Mori, K.; Motta, G.; Mottez, F.; Moulin, E.; Mukherjee, R.; Munar, P.; Muraishi, H.; Murase, K.; Murphy, A. Stj.; Nagataki, S.; Naito, T.; Nakamori, T.; Nakayama, K.; Naumann, C.; Naumann, D.; Nayman, P.; Nedbal, D.; Niedźwiecki, A.; Niemiec, J.; Nikolaidis, A.; Nishijima, K.; Nolan, S. J.; Nowak, N.; O'Brien, P. 
T.; Ochoa, I.; Ohira, Y.; Ohishi, M.; Ohka, H.; Okumura, A.; Olivetto, C.; Ong, R. A.; Orito, R.; Orr, M.; Osborne, J. P.; Ostrowski, M.; Otero, L.; Otte, A. N.; Ovcharov, E.; Oya, I.; Oziȩbło, A.; Paiano, S.; Pallota, J.; Panazol, J. L.; Paneque, D.; Panter, M.; Paoletti, R.; Papyan, G.; Paredes, J. M.; Pareschi, G.; Parsons, R. D.; Paz Arribas, M.; Pedaletti, G.; Pepato, A.; Persic, M.; Petrucci, P. O.; Peyaud, B.; Piechocki, W.; Pita, S.; Pivato, G.; Płatos, Ł.; Platzer, R.; Pogosyan, L.; Pohl, M.; Pojmański, G.; Ponz, J. D.; Potter, W.; Prandini, E.; Preece, R.; Prokoph, H.; Pühlhofer, G.; Punch, M.; Quel, E.; Quirrenbach, A.; Rajda, P.; Rando, R.; Rataj, M.; Raue, M.; Reimann, C.; Reimann, O.; Reimer, A.; Reimer, O.; Renaud, M.; Renner, S.; Reymond, J.-M.; Rhode, W.; Ribó, M.; Ribordy, M.; Rico, J.; Rieger, F.; Ringegni, P.; Ripken, J.; Ristori, P.; Rivoire, S.; Rob, L.; Rodriguez, S.; Roeser, U.; Romano, P.; Romero, G. E.; Rosier-Lees, S.; Rovero, A. C.; Roy, F.; Royer, S.; Rudak, B.; Rulten, C. B.; Ruppel, J.; Russo, F.; Ryde, F.; Sacco, B.; Saggion, A.; Sahakian, V.; Saito, K.; Saito, T.; Sakaki, N.; Salazar, E.; Salini, A.; Sánchez, F.; Sánchez Conde, M. Á.; Santangelo, A.; Santos, E. M.; Sanuy, A.; Sapozhnikov, L.; Sarkar, S.; Scalzotto, V.; Scapin, V.; Scarcioffolo, M.; Schanz, T.; Schlenstedt, S.; Schlickeiser, R.; Schmidt, T.; Schmoll, J.; Schroedter, M.; Schultz, C.; Schultze, J.; Schulz, A.; Schwanke, U.; Schwarzburg, S.; Schweizer, T.; Seiradakis, J.; Selmane, S.; Seweryn, K.; Shayduk, M.; Shellard, R. C.; Shibata, T.; Sikora, M.; Silk, J.; Sillanpää, A.; Sitarek, J.; Skole, C.; Smith, N.; Sobczyńska, D.; Sofo Haro, M.; Sol, H.; Spanier, F.; Spiga, D.; Spyrou, S.; Stamatescu, V.; Stamerra, A.; Starling, R. L. C.; Stawarz, Ł.; Steenkamp, R.; Stegmann, C.; Steiner, S.; Stergioulas, N.; Sternberger, R.; Stinzing, F.; Stodulski, M.; Straumann, U.; Suárez, A.; Suchenek, M.; Sugawara, R.; Sulanke, K. H.; Sun, S.; Supanitsky, A. 
D.; Sutcliffe, P.; Szanecki, M.; Szepieniec, T.; Szostek, A.; Szymkowiak, A.; Tagliaferri, G.; Tajima, H.; Takahashi, H.; Takahashi, K.; Takalo, L.; Takami, H.; Talbot, R. G.; Tam, P. H.; Tanaka, M.; Tanimori, T.; Tavani, M.; Tavernet, J.-P.; Tchernin, C.; Tejedor, L. A.; Telezhinsky, I.; Temnikov, P.; Tenzer, C.; Terada, Y.; Terrier, R.; Teshima, M.; Testa, V.; Tibaldo, L.; Tibolla, O.; Tluczykont, M.; Todero Peixoto, C. J.; Tokanai, F.; Tokarz, M.; Toma, K.; Torres, D. F.; Tosti, G.; Totani, T.; Toussenel, F.; Vallania, P.; Vallejo, G.; van der Walt, J.; van Eldik, C.; Vandenbroucke, J.; Vankov, H.; Vasileiadis, G.; Vassiliev, V. V.; Vegas, I.; Venter, L.; Vercellone, S.; Veyssiere, C.; Vialle, J. P.; Videla, M.; Vincent, P.; Vink, J.; Vlahakis, N.; Vlahos, L.; Vogler, P.; Vollhardt, A.; Volpe, F.; von Gunten, H. P.; Vorobiov, S.; Wagner, S.; Wagner, R. M.; Wagner, B.; Wakely, S. P.; Walter, P.; Walter, R.; Warwick, R.; Wawer, P.; Wawrzaszek, R.; Webb, N.; Wegner, P.; Weinstein, A.; Weitzel, Q.; Welsing, R.; Wetteskind, H.; White, R.; Wierzcholska, A.; Wilkinson, M. I.; Williams, D. A.; Winde, M.; Wischnewski, R.; Wiśniewski, Ł.; Wolczko, A.; Wood, M.; Xiong, Q.; Yamamoto, T.; Yamaoka, K.; Yamazaki, R.; Yanagita, S.; Yoffo, B.; Yonetani, M.; Yoshida, A.; Yoshida, T.; Yoshikoshi, T.; Zabalza, V.; Zagdański, A.; Zajczyk, A.; Zdziarski, A.; Zech, A.; Ziȩtara, K.; Ziółkowski, P.; Zitelli, V.; Zychowski, P.

    2011-12-01

    Ground-based gamma-ray astronomy has had a major breakthrough with the impressive results obtained using systems of imaging atmospheric Cherenkov telescopes. Ground-based gamma-ray astronomy has huge potential in astrophysics, particle physics and cosmology. CTA is an international initiative to build the next-generation instrument, with a factor of 5-10 improvement in sensitivity in the 100 GeV-10 TeV range and an extension to energies well below 100 GeV and above 100 TeV. CTA will consist of two arrays (one in the north, one in the south) for full-sky coverage and will be operated as an open observatory. The design of CTA is based on currently available technology. This document reports on the status and presents the major design concepts of CTA.

  1. Design Concepts for the Cherenkov Telescope Array CTA: An Advanced Facility for Ground-Based High-Energy Gamma-Ray Astronomy

    Energy Technology Data Exchange (ETDEWEB)

    Actis, M

    2012-04-17

    Ground-based gamma-ray astronomy has had a major breakthrough with the impressive results obtained using systems of imaging atmospheric Cherenkov telescopes. Ground-based gamma-ray astronomy has huge potential in astrophysics, particle physics and cosmology. CTA is an international initiative to build the next-generation instrument, with a factor of 5-10 improvement in sensitivity in the 100 GeV-10 TeV range and an extension to energies well below 100 GeV and above 100 TeV. CTA will consist of two arrays (one in the north, one in the south) for full-sky coverage and will be operated as an open observatory. The design of CTA is based on currently available technology. This document reports on the status and presents the major design concepts of CTA.

  2. Natural language processing techniques for automatic test ...

    African Journals Online (AJOL)

    Journal of Computer Science and Its Application ... The questions were generated by first extracting the text from the materials supplied by the ... Keywords: Discourse Connectives, Machine Learning, Automatic Test Generation, E-Learning.

  3. Ground-Based VIS/NIR Reflectance Spectra of 25143 Itokawa: What Hayabusa will See and How Ground-Based Data can Augment Analyses

    Science.gov (United States)

    Vilas, Faith; Abell, P. A.; Jarvis, K. S.

    2004-01-01

    Planning for the arrival of the Hayabusa spacecraft at asteroid 25143 Itokawa includes consideration of the spectral information expected to be obtained using the AMICA and NIRS instruments. The rotationally resolved spatial coverage of the asteroid that we have obtained with ground-based telescopic spectrophotometry in the visible and near-infrared can be used here to anticipate the spacecraft data. We use spectrophotometry to simulate the types of data that Hayabusa will receive with the NIRS and AMICA instruments, and demonstrate them here. The NIRS will cover a wavelength range from 0.85 micrometers up to a cut-off near 2.1 micrometers, with a dispersion per element of 250 Angstroms. Thus, we are limited in coverage of the 1.0 micrometer and 2.0 micrometer mafic silicate absorption features. The ground-based reflectance spectra of Itokawa show a large component of olivine in its surface material, and the 2.0 micrometer feature is shallow. Determining the olivine-to-pyroxene abundance ratio is critically dependent on the attributes of the 1.0 and 2.0 micrometer features. With a cut-off near 2.1 micrometers, the longer edge of the 2.0 micrometer feature will not be captured by the NIRS. Reflectance spectra obtained using ground-based telescopes can be used to determine the regional composition around space-based spectral observations, and possibly augment the longer-wavelength spectral attributes. Similarly, the shorter-wavelength end of the 1.0 micrometer absorption feature will be partially lost to the NIRS. The AMICA filters mimic the ECAS filters and have wavelength coverage overlapping with the NIRS spectral range. We demonstrate how merging photometry from AMICA will extend the spectral coverage of the NIRS. Lessons learned from earlier spacecraft missions to asteroids should be considered.

  4. Development of binary image masks for TPF-C and ground-based AO coronagraphs

    Science.gov (United States)

    Ge, Jian; Crepp, Justin; Vanden Heuvel, Andrew; Miller, Shane; McDavitt, Dan; Kravchenko, Ivan; Kuchner, Marc

    2006-06-01

    We report progress on the development of precision binary notch-filter focal-plane coronagraphic masks for directly imaging Earth-like planets at visible wavelengths with the Terrestrial Planet Finder Coronagraph (TPF-C), and substellar companions at near-infrared wavelengths from the ground with coronagraphs coupled to high-order adaptive optics (AO) systems. Our recent theoretical studies show that 8th-order image masks (Kuchner, Crepp & Ge 2005, KCG05) are capable of achieving unlimited dynamic range in an ideal optical system, while simultaneously remaining relatively insensitive to low-spatial-frequency optical aberrations, such as tip/tilt errors, defocus, coma, and astigmatism. These features offer a suite of advantages for the TPF-C by relaxing many control and stability requirements, and can also provide resistance to common practical problems associated with ground-based observations; for example, telescope flexure and low-order errors left uncorrected by the AO system due to wavefront-sensor/deformable-mirror lag time can leak light at significant levels. Our recent lab experiments show that prototype image masks can generate contrast levels on the order of 2×10⁻⁶ at 3λ/D and 6×10⁻⁷ at 10λ/D without deformable mirror correction using monochromatic light (Crepp et al. 2006), and that this contrast is limited primarily by light scattered by imperfections in the optics and extra diffraction created by mask construction errors. These experiments also indicate that the tilt and defocus sensitivities of high-order masks follow the theoretical predictions of Shaklan and Green 2005. In this paper, we discuss these topics as well as review our progress on developing techniques for fabricating a new series of image masks that are "free-standing", as such construction designs may alleviate some of the (mostly chromatic) problems associated with masks that rely on glass substrates for mechanical support. Finally, results obtained from our AO coronagraph

  5. On reconciling ground-based with spaceborne normalized radar cross section measurements

    DEFF Research Database (Denmark)

    Baumgartner, Francois; Munk, Jens; Jezek, K C

    2002-01-01

    This study examines differences in the normalized radar cross section derived from ground-based versus spaceborne radar data. A simple homogeneous half-space model indicates that agreement between the two improves as 1) the distance from the scatterer is increased; and/or 2) the extinction...

  6. Precision simulation of ground-based lensing data using observations from space

    CERN Document Server

    Mandelbaum, Rachel; Leauthaud, Alexie; Massey, Richard J; Rhodes, Jason

    2011-01-01

    Current and upcoming wide-field, ground-based, broad-band imaging surveys promise to address a wide range of outstanding problems in galaxy formation and cosmology. Several such uses of ground-based data, especially weak gravitational lensing, require highly precise measurements of galaxy image statistics with careful correction for the effects of the point-spread function (PSF). In this paper, we introduce the SHERA (SHEar Reconvolution Analysis) software to simulate ground-based imaging data with realistic galaxy morphologies and observing conditions, starting from space-based data (from COSMOS, the Cosmological Evolution Survey) and accounting for the effects of the space-based PSF. This code simulates ground-based data, optionally with a weak lensing shear applied, in a model-independent way using a general Fourier space formalism. The utility of this pipeline is that it allows for a precise, realistic assessment of systematic errors due to the method of data processing, for example in extracting weak len...
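    The Fourier-space reconvolution idea behind this kind of pipeline can be illustrated with a minimal NumPy sketch; this is not the SHERA code itself, and the Gaussian PSFs and toy "galaxy" image are stand-in assumptions (real data additionally require noise handling and optional shear application):

    ```python
    import numpy as np

    def gaussian_psf(n, sigma):
        """Normalized 2-D Gaussian PSF on an n x n grid (a stand-in PSF model)."""
        y, x = np.indices((n, n)) - n // 2
        psf = np.exp(-(x ** 2 + y ** 2) / (2.0 * sigma ** 2))
        return psf / psf.sum()

    def reconvolve(image, narrow_psf, broad_psf):
        """Fourier-space reconvolution: divide out the narrow (space-based)
        PSF and apply the broad (ground-based) one. Stable only while the
        broad PSF dominates, so the deconvolution never amplifies high
        frequencies."""
        F = np.fft.fft2
        kernel = F(np.fft.ifftshift(broad_psf)) / F(np.fft.ifftshift(narrow_psf))
        return np.real(np.fft.ifft2(F(image) * kernel))

    n = 64
    space_psf = gaussian_psf(n, 1.0)    # sharp space-based PSF
    ground_psf = gaussian_psf(n, 4.0)   # broad ground-based PSF
    galaxy = gaussian_psf(n, 3.0)       # toy galaxy image, unit total flux
    simulated = reconvolve(galaxy, space_psf, ground_psf)
    print(round(simulated.sum(), 6))    # flux preserved: 1.0
    ```

    Because both PSFs are normalized, the zero-frequency component of the transfer kernel is exactly 1, so total flux is conserved by construction.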

  7. Analysis of the substorm trigger phase using multiple ground-based instrumentation

    Energy Technology Data Exchange (ETDEWEB)

    Kauristie, K.; Pulkkinen, T.I.; Pellinen, R.J. [Finnish Meteorological Institute, Helsinki (Finland)] [and others]

    1995-08-01

    The authors discuss in detail an observed event of auroral activity fading during the trigger, or growth, phase of a magnetic storm. This event was observed by all-sky cameras, EISCAT radar, magnetometers, riometers, and pulsation magnetometers at ground-based stations in Finland and Scandinavia. Based on their detailed analysis, they present a possible cause for the observed fading.

  8. Simulation of the imaging quality of ground-based telescopes affected by atmospheric disturbances

    Science.gov (United States)

    Ren, Yubin; Kou, Songfeng; Gu, Bozhong

    2014-08-01

    A ground-based telescope imaging model is developed in this paper, and the relationship between atmospheric disturbances and ground-based telescope image quality is studied. Simulation of the wave-front distortions caused by atmospheric turbulence has long been an important method in the study of the propagation of light through the atmosphere. The phase of the starlight wave-front changes over time, but within a suitably short exposure time the atmospheric disturbance can be considered "frozen". In accordance with Kolmogorov turbulence theory, the atmospheric disturbance is modeled with phase screens generated by the fast Fourier transform (FFT). A Geiger-mode avalanche photodiode array (APD array) model is used for atmospheric wave-front detection, and the image is recovered by a photon-counting inversion method after the target starlight passes through the phase screens and the ground-based telescope. The imaging model established in this paper can accurately capture the relationship between telescope imaging quality and single-layer or multilayer atmospheric disturbances, which is of great significance for wave-front detection and optical correction in a Multi-conjugate Adaptive Optics (MCAO) system.
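    The FFT phase-screen recipe described above can be sketched as follows; the grid size, Fried parameter, and normalization convention are illustrative assumptions, and a production screen generator would also add subharmonics to recover low-frequency power:

    ```python
    import numpy as np

    def kolmogorov_phase_screen(n=256, r0=0.1, dx=0.01, seed=0):
        """One 'frozen' phase screen: filter complex Gaussian white noise
        with the square root of the Kolmogorov spectrum
            Phi(f) = 0.023 * r0**(-5/3) * f**(-11/3)
        and inverse-FFT to the spatial domain. n: grid points, r0: Fried
        parameter [m], dx: grid spacing [m]; values here are illustrative."""
        rng = np.random.default_rng(seed)
        f = np.fft.fftfreq(n, dx)                # spatial frequency axis [1/m]
        fx, fy = np.meshgrid(f, f)
        fr = np.hypot(fx, fy)
        fr[0, 0] = np.inf                        # suppress the undefined piston term
        psd = 0.023 * r0 ** (-5 / 3) * fr ** (-11 / 3)
        noise = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
        df = 1.0 / (n * dx)                      # frequency-grid spacing
        screen = np.fft.ifft2(noise * np.sqrt(psd) * df) * n * n
        return screen.real                       # phase [rad], up to normalization

    screen = kolmogorov_phase_screen()
    print(screen.shape)  # (256, 256)
    ```

    Each call with a new seed yields a statistically independent "frozen" realization; a time series of short exposures is simulated by translating or regenerating screens between frames.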

  9. Ground-based LIDAR: a novel approach to quantify fine-scale fuelbed characteristics

    Science.gov (United States)

    E.L. Loudermilk; J.K. Hiers; J.J. O’Brien; R.J. Mitchell; A. Singhania; J.C. Fernandez; W.P. Cropper; K.C. Slatton

    2009-01-01

    Ground-based LIDAR (also known as laser ranging) is a novel technique that may precisely quantify fuelbed characteristics important in determining fire behavior. We measured fuel properties within a south-eastern US longleaf pine woodland at the individual plant and fuelbed scale. Data were collected using a mobile terrestrial LIDAR unit at sub-cm scale for individual...

  10. Use of neural networks in ground-based aerosol retrievals from multi-angle spectropolarimetric observations

    NARCIS (Netherlands)

    Di Noia, A.; Hasekamp, O.P.; Harten, G. van; Rietjens, J.H.H.; Smit, J.M.; Snik, F.; Henzing, J.S.; Boer, J. de; Keller, C.U.; Volten, H.

    2015-01-01

    In this paper, the use of a neural network algorithm for the retrieval of the aerosol properties from ground-based spectropolarimetric measurements is discussed. The neural network is able to retrieve the aerosol properties with an accuracy that is almost comparable to that of an iterative retrieval

  11. Retrieval of liquid water cloud properties from ground-based remote sensing observations

    NARCIS (Netherlands)

    Knist, C.L.

    2014-01-01

    Accurate ground-based remotely sensed microphysical and optical properties of liquid water clouds are essential references to validate satellite-observed cloud properties and to improve cloud parameterizations in weather and climate models. This requires the evaluation of algorithms for retrieval of

  12. Ground-based remote sensing scheme for monitoring aerosol–cloud interactions (discussion)

    NARCIS (Netherlands)

    Sarna, K.; Russchenberg, H.W.J.

    2015-01-01

    A method for continuous observation of aerosol–cloud interactions with ground-based remote sensing instruments is presented. The main goal of this method is to enable the monitoring of cloud microphysical changes due to the changing aerosol concentration. We use high resolution measurements from lid

  13. Ground-based remote sensing scheme for monitoring aerosol-cloud interactions

    NARCIS (Netherlands)

    Sarna, K.; Russchenberg, H.W.J.

    2016-01-01

    A new method for continuous observation of aerosol–cloud interactions with ground-based remote sensing instruments is presented. The main goal of this method is to enable the monitoring of the change of the cloud droplet size due to the change in the aerosol concentration. We use high-resolution mea

  14. Asteroseismology of solar-type stars with Kepler: III. Ground-based data

    DEFF Research Database (Denmark)

    Karoff, Christoffer; Molenda-Żakowicz, J.

    2010-01-01

    We report on the ground-based follow-up program of spectroscopic and photometric observations of solar-like asteroseismic targets for the Kepler space mission. These stars constitute a large group of more than a thousand objects which are the subject of an intensive study by the Kepler Asteroseis...

  15. A Fast Method for Embattling Optimization of Ground-Based Radar Surveillance Network

    Science.gov (United States)

    Jiang, H.; Cheng, H.; Zhang, Y.; Liu, J.

    A growing number of space activities have created an orbital debris environment that poses increasing impact risks to existing space systems and human space flight. For the safety of in-orbit spacecraft, many observation facilities are needed to catalog space objects, especially in low Earth orbit. Surveillance of low Earth orbit objects relies mainly on ground-based radar; owing to the limited capability of existing radar facilities, a large number of ground-based radars will need to be built in the next few years to meet current space surveillance demands. How to optimize the deployment of a ground-based radar surveillance network is therefore a problem that needs to be solved. The traditional method for deployment optimization of a ground-based radar surveillance network is to run detection simulations of all possible stations against cataloged data, make a comprehensive comparative analysis of the simulation results by combinatorial methods, and then select an optimal result as the station layout scheme. This method is time-consuming for a single simulation and computationally complex for the combinatorial analysis; as the number of stations increases, the complexity of the optimization problem grows exponentially and cannot be handled by the traditional method. No better way to solve this problem has existed until now. In this paper, the target detection procedure was simplified. First, the space coverage of a ground-based radar was simplified and a projection model of radar coverage at different orbit altitudes was built; then a simplified model of objects crossing the radar coverage was established according to the characteristics of orbital motion. After these two simplifications, the computational complexity of target detection was greatly reduced, and simulation results confirmed the correctness of the simplified model. In addition, the detection areas of a ground-based radar network can be easily computed with the
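    The coverage-projection idea can be sketched with standard spherical geometry; this is a simplification in the spirit of the paper's model, not its actual algorithm, and the altitude and elevation-mask values are illustrative:

    ```python
    import math

    RE = 6371.0  # mean Earth radius [km]

    def coverage_half_angle(alt_km, min_elev_deg):
        """Earth-central half-angle (deg) of the circle a ground radar covers
        on a spherical orbital shell at altitude alt_km, given a minimum
        elevation mask min_elev_deg. Standard spherical geometry; ignores
        radar range limits and terrain masking."""
        eps = math.radians(min_elev_deg)
        lam = math.acos(RE * math.cos(eps) / (RE + alt_km)) - eps
        return math.degrees(lam)

    # The same radar projects a wider footprint onto a higher shell:
    for alt_km in (400, 1000, 2000):
        print(alt_km, round(coverage_half_angle(alt_km, 5.0), 2))
    ```

    Intersecting such per-station footprints with simplified object ground tracks is what replaces the full detection simulation in this kind of fast deployment screening.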

  16. Ground-Based Lidar Measurements During the CALIPSO and Twilight Zone (CATZ) Campaign

    Science.gov (United States)

    Berkoff, Timothy; Qian, Li; Kleidman, Richard; Stewart, Sebastian; Welton, Ellsworth; Li, Zhu; Holben, Brent

    2008-01-01

    The CALIPSO and Twilight Zone (CATZ) field campaign was carried out between June 26th and August 29th of 2007 in the multi-state Maryland-Virginia-Pennsylvania region of the U.S. to study aerosol properties and cloud-aerosol interactions during overpasses of the CALIPSO satellite. Field work was conducted on selected days when CALIPSO ground tracks occurred in the region. Ground-based measurements included data from multiple Cimel sunphotometers that were placed at intervals along a segment of the CALIPSO ground track. These measurements provided sky radiance and AOD measurements to enable joint inversions and comparisons with CALIPSO retrievals. As part of this activity, four ground-based lidars provided backscatter measurements (at 523 nm) in the region. Lidars at the University of Maryland Baltimore County (Catonsville, MD) and Goddard Space Flight Center (Greenbelt, MD) provided continuous data during the campaign, while two micro-pulse lidar (MPL) systems were temporarily stationed at various field locations directly on CALIPSO ground tracks. As a result, thirteen on-track ground-based lidar observations were obtained from eight different locations in the region. In some cases, nighttime CALIPSO-coincident measurements were also obtained. In most studies reported to date, ground-based lidar validation efforts for CALIPSO rely on systems at fixed locations some distance away from the satellite ground track. The CATZ ground-based lidar data provide an opportunity to examine the vertical-structure properties of aerosols and clouds both on and off track simultaneously during a CALIPSO overpass. A table of available ground-based lidar measurements during this campaign will be presented, along with example backscatter imagery for a number of coincident cases with CALIPSO. Results indicate that even for ground-based measurements directly on-track, comparisons can still pose a challenge due to the differing spatio-temporal properties of the ground and satellite

  17. Automatic trend estimation

    CERN Document Server

    Vamoş, Călin

    2013-01-01

    Our book introduces a method to evaluate the accuracy of trend estimation algorithms under conditions similar to those encountered in real time series processing. This method is based on Monte Carlo experiments with artificial time series numerically generated by an original algorithm. The second part of the book contains several automatic algorithms for trend estimation and time series partitioning. The source codes of the computer programs implementing these original automatic algorithms are given in the appendix and will be freely available on the web. The book contains a clear statement of the conditions and approximations under which the algorithms work, as well as the proper interpretation of their results. We illustrate the functioning of the analyzed algorithms by processing time series from astrophysics, finance, biophysics, and paleoclimatology. The numerical experiment method used extensively in our book is already in common use in computational and statistical physics.
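    The numerical-experiment method can be illustrated with a small Monte Carlo sketch; the trend shape, noise level, and moving-average estimator below are illustrative choices, not the book's algorithms:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def monte_carlo_trend_rmse(n_series=200, n=500, noise_sigma=1.0, window=51):
        """Monte Carlo evaluation of a trend estimator: generate artificial
        series with a known trend plus Gaussian noise, estimate the trend
        with a centered moving average (an illustrative estimator), and
        average the root-mean-square error away from the edges."""
        t = np.linspace(0.0, 1.0, n)
        trend = np.sin(2 * np.pi * t)            # known smooth trend
        kernel = np.ones(window) / window
        core = slice(window, n - window)         # discard edge effects
        errs = []
        for _ in range(n_series):
            series = trend + rng.normal(0.0, noise_sigma, n)
            estimate = np.convolve(series, kernel, mode="same")
            errs.append(np.sqrt(np.mean((estimate[core] - trend[core]) ** 2)))
        return float(np.mean(errs))

    print(monte_carlo_trend_rmse())  # well below noise_sigma = 1.0
    ```

    Sweeping the noise level, trend shape, or series length in such experiments is how an estimator's accuracy is mapped out under conditions resembling real data.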

  18. Zero-Gravity Locomotion Simulators: New Ground-Based Analogs for Microgravity Exercise Simulation

    Science.gov (United States)

    Perusek, Gail P.; DeWitt, John K.; Cavanagh, Peter R.; Grodsinsky, Carlos M.; Gilkey, Kelly M.

    2007-01-01

    Maintaining health and fitness in crewmembers during space missions is essential for preserving performance of mission-critical tasks. NASA's Exercise Countermeasures Project (ECP) provides space exploration exercise hardware and monitoring requirements that lead to devices that are reliable; meet medical, vehicle, and habitat constraints; and use minimal vehicle and crew resources. ECP will also develop and validate efficient exercise prescriptions that minimize the daily time needed for exercise yet maximize performance for mission activities. In meeting these mission goals, NASA Glenn Research Center (Cleveland, OH, USA), in collaboration with the Cleveland Clinic (Cleveland, Ohio, USA), has developed a suite of zero-gravity locomotion simulators and associated technologies to address the need for a ground-based test analog capable of simulating in-flight (microgravity) and surface (partial-gravity) exercise, to advance the health and safety of astronaut crews and the next generation of space explorers. Various research areas can be explored. These include improving crew comfort during exercise, and understanding joint kinematics and muscle activation pattern differences relative to external loading mechanisms. In addition, exercise protocol and hardware optimization can be investigated, along with characterizing system dynamic response and the physiological demand associated with advanced exercise device concepts and performance of critical mission tasks for Exploration-class missions. Three zero-gravity locomotion simulators are currently in use, and the research focus for each will be presented. All of the devices are based on a supine subject suspension system, which simulates a reduced-gravity environment by completely or partially offloading the weight of the exercising test subject's body. A platform for mounting a treadmill is positioned perpendicular to the test subject. The Cleveland Clinic Zero-g Locomotion Simulator (ZLS) utilizes a

  19. Application of Technical Measures and Software in Constructing Photorealistic 3D Models of Historical Building Using Ground-Based and Aerial (UAV) Digital Images

    Science.gov (United States)

    Zarnowski, Aleksander; Banaszek, Anna; Banaszek, Sebastian

    2015-12-01

    Preparing digital documentation of historical buildings is a form of protecting cultural heritage. Recently there have been several intensive studies using non-metric digital images to construct realistic 3D models of historical buildings. Increasingly often, non-metric digital images are obtained with unmanned aerial vehicles (UAVs). The technologies and methods of UAV flights differ considerably from traditional photogrammetric approaches, and the lack of technical guidelines for using drones inhibits the adoption of new methods of data acquisition. This paper presents the results of experiments in the use of digital images to construct a photo-realistic 3D model of a historical building (Raphaelsohns' Sawmill in Olsztyn). The aim of the first stage of the study was to determine the meteorological and technical conditions for the acquisition of aerial and ground-based photographs. At the next stage, the technology of 3D modelling was developed using only ground-based or only aerial non-metric digital images. At the last stage of the study, an experiment was conducted to assess the possibility of 3D modelling with the combined use of aerial (UAV) and ground-based digital photographs, in terms of labour intensity and the precision of the resulting model. Data integration and automatic construction of the photo-realistic 3D models were performed with the Pix4Dmapper and Agisoft PhotoScan software. Analyses have shown that, when the parameters established in the experiment are maintained, the process of developing stock-taking documentation for a historical building moves from analogue to digital technology at a considerably reduced cost.
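The "technical conditions for the acquisition" of such photographs are usually anchored in the ground sample distance (GSD), which ties flying height or camera-to-facade distance, focal length, and sensor pixel pitch to the size of one pixel on the object. A minimal sketch of that standard thin-lens relation, with hypothetical camera parameters (not taken from the paper):

```python
# Flight-planning sketch (illustrative numbers): ground sample distance
# from similar triangles of the thin-lens model.

def ground_sample_distance(pixel_pitch_um: float, focal_length_mm: float,
                           distance_m: float) -> float:
    """Metres on the object covered by one sensor pixel:
    GSD = pixel_pitch * distance / focal_length."""
    return (pixel_pitch_um * 1e-6) * distance_m / (focal_length_mm * 1e-3)

# Hypothetical small-UAV camera: 4.4 um pixels, 8.8 mm lens, imaging from
# 50 m away -> 2.5 cm per pixel on the building.
gsd = ground_sample_distance(4.4, 8.8, 50.0)
print(f"GSD = {gsd * 100:.1f} cm/pixel")
```

Running the relation in reverse (solving for distance at a required GSD) is how a target documentation accuracy is converted into a flight altitude or standoff distance before acquisition.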

  20. Monitoring of atmospheric ozone and nitrogen dioxide over the south of Portugal by ground-based and satellite observations.

    Science.gov (United States)

    Bortoli, Daniele; Silva, Ana Maria; Costa, Maria João; Domingues, Ana Filipa; Giovanelli, Giorgio

    2009-07-20

    The SPATRAM (Spectrometer for Atmospheric TRAcers Monitoring) instrument has been developed as a result of the collaboration between CGE-UE, ISAC-CNR and the Italian National Agency for New Technologies, Energy and the Environment (ENEA). SPATRAM is a multi-purpose UV-Vis scanning spectrometer (250-950 nm) and has been installed at the Observatory of the CGE in Evora since April 2004. A brief description of the instrument is given, highlighting the technological innovations with respect to the previous version of similar equipment. The need for such measurements, taken automatically on a routine basis in south-western European regions and specifically in Portugal, encouraged the development and installation of the equipment and constitutes a major driving force for the present work. The main features of, and some improvements introduced in, the DOAS (Differential Optical Absorption Spectroscopy) algorithms are discussed. The results obtained by applying the DOAS methodology to SPATRAM measurements of diffuse spectral sky radiation are presented in terms of diurnal and seasonal variations of nitrogen dioxide (NO2) and ozone (O3). NO2 exhibits the typical seasonal cycle, reaching a maximum of (6.5 +/- 0.3) x 10^15 molecules cm^-2 in the sunset (PM) values during the summer season and a minimum of (1.55 +/- 0.07) x 10^15 molecules cm^-2 in the sunrise (AM) values in winter. O3 presents a maximum total column of (433 +/- 5) Dobson Units (DU) in the spring season and a minimum of (284 +/- 3) DU during the fall period. The large daily variations of the O3 total column during the spring season are analyzed and discussed.
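In its simplest form, the DOAS retrieval mentioned above is a linear least-squares problem: the measured differential optical depth ln(I0/I) is modelled as an absorber cross section scaled by the slant column density (SCD) plus a low-order polynomial absorbing the broadband (Rayleigh/Mie) structure. A minimal sketch with fully synthetic, noise-free numbers (the cross-section shape, polynomial coefficients and units scaling are all invented for illustration and do not reproduce the paper's algorithm):

```python
# DOAS fit sketch on synthetic data: optical depth = sigma(lambda)*SCD
# + a + b*lambda, solved by linear least squares.
import math

def solve_normal_equations(A, y):
    """Least-squares solve of min ||A x - y|| via (A^T A) x = A^T y,
    using Gaussian elimination with partial pivoting (fine for tiny systems)."""
    m, n = len(A), len(A[0])
    ata = [[sum(A[k][i] * A[k][j] for k in range(m)) for j in range(n)]
           for i in range(n)]
    aty = [sum(A[k][i] * y[k] for k in range(m)) for i in range(n)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(ata[r][col]))
        ata[col], ata[piv] = ata[piv], ata[col]
        aty[col], aty[piv] = aty[piv], aty[col]
        for r in range(col + 1, n):
            f = ata[r][col] / ata[col][col]
            for c in range(col, n):
                ata[r][c] -= f * ata[col][c]
            aty[r] -= f * aty[col]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (aty[i] - sum(ata[i][j] * x[j] for j in range(i + 1, n))) / ata[i][i]
    return x

# Synthetic differential cross section, in units of 1e-19 cm^2/molecule,
# on an arbitrary wavelength grid; SCD in units of 1e15 molecules cm^-2,
# so the optical-depth contribution is sigma * SCD * 1e-4.
wavelengths = [i / 10.0 for i in range(50)]
sigma = [math.sin(3.0 * w) for w in wavelengths]
true_scd = 6.5            # ~ the reported PM summer maximum, in 1e15 cm^-2
a, b = 0.02, -0.001       # broadband polynomial terms (invented)

tau = [sigma[i] * 1e-4 * true_scd + a + b * w
       for i, w in enumerate(wavelengths)]

# Design matrix columns: scaled cross section, constant, linear in wavelength.
A = [[sigma[i] * 1e-4, 1.0, wavelengths[i]] for i in range(len(wavelengths))]
scd_fit, a_fit, b_fit = solve_normal_equations(A, tau)
print(f"retrieved SCD = {scd_fit:.3f} x 10^15 molecules cm^-2")
```

A real retrieval fits several absorbers simultaneously and converts slant to vertical columns with an air-mass factor; the sketch only shows the core linear-fit step.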
The ground-based results obtained for the NO2 and O3 column contents are compared with data from satellite-borne instruments (GOME - Global Ozone Monitoring Experiment; SCIAMACHY - Scanning Imaging Absorption Spectrometer for Atmospheric CHartographY; TOMS - Total Ozone Mapping Spectrometer), and it is shown that the two data