WorldWideScience

Sample records for 2nd generation software

  1. STARS 2.0: 2nd-generation open-source archiving and query software

    Winegar, Tom

    2008-07-01

    The Subaru Telescope is in the process of developing an open-source alternative to the 1st-generation software and databases (STARS 1) used for archiving and query. For STARS 2, we have chosen PHP and Python for scripting and MySQL as the database software. We have collected feedback from staff and observers and used it to significantly improve the design and functionality of our future archiving and query software. Archiving - We identified two weaknesses in the 1st-generation STARS archiving software: a complex and inflexible table structure, and uncoordinated system administration for our business model of taking pictures at the summit and archiving them in both Hawaii and Japan. We adopted a simplified and normalized table structure with passive keyword collection, and we are designing an archive-to-archive file transfer system that automatically reports real-time status and error conditions and permits error recovery. Query - We identified several weaknesses in the 1st-generation STARS query software: inflexible query tools, poor sharing of calibration data, and no automatic file transfer mechanisms for observers. We are developing improved query tools, better sharing of calibration data, and multi-protocol unassisted file transfer mechanisms for observers. In the process, we have redefined a 'query': from an invisible search result that can be transferred only once, in-house and immediately, with little status and error reporting and no error recovery - to a stored search result that can be monitored, transferred to different locations with multiple protocols, reporting status and error conditions and permitting recovery from errors.
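
    The redefined 'query' described above - a stored search result that can be monitored, retried, and reports its errors - can be sketched as a small state machine. The class, status names, and fields below are illustrative, not the actual STARS 2 schema; a minimal sketch in Python, one of the project's stated scripting languages:

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional


class Status(Enum):
    PENDING = "pending"
    TRANSFERRING = "transferring"
    DONE = "done"
    ERROR = "error"


@dataclass
class StoredQuery:
    """A stored search result whose transfer can be monitored and retried."""
    query_id: int
    files: list
    status: Status = Status.PENDING
    error: Optional[str] = None
    attempts: int = 0

    def transfer(self, send):
        """Attempt to transfer all files with `send`; record status and errors."""
        self.status = Status.TRANSFERRING
        self.attempts += 1
        try:
            for f in self.files:
                send(f)
        except OSError as exc:
            self.status = Status.ERROR   # error condition is reported, not lost
            self.error = str(exc)
        else:
            self.status = Status.DONE
            self.error = None
```

    Because the result is stored rather than transient, a failed transfer leaves the query in an inspectable ERROR state, and calling `transfer` again (possibly with a different protocol's `send` function) retries it.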

  2. 2nd Generation Alkaline Electrolysis

    Yde, Lars; Kjartansdóttir, Cecilia Kristin; Allebrod, Frank; Mogensen, Mogens Bjerg; Møller, Per; Hilbert, Lisbeth R.; Nielsen, Peter Tommy; Mathiesen, Troels; Jensen, Jørgen; Andersen, Lars; Dierking, Alexander

    This report provides the results of the 2nd Generation Alkaline Electrolysis project which was initiated in 2008. The project has been conducted from 2009-2012 by a consortium comprising Århus University Business and Social Science – Centre for Energy Technologies (CET (former HIRC)), Technical...

  3. 2nd generation biogas. BioSNG

    The substitution of natural gas by a renewable equivalent is an interesting option to reduce the use of fossil fuels and the accompanying greenhouse gas emissions, as well as from the point of view of security of supply. The renewable alternative to natural gas is green natural gas, i.e. gaseous energy carriers produced from biomass, comprising both biogas and Synthetic Natural Gas (SNG). This route benefits from all the advantages of natural gas, such as the existing dense infrastructure, trade and supply network, and natural gas applications. This presentation addresses the differences between first-generation biogas and second-generation bioSNG; the market for bioSNG: grid injection vs. transportation fuel; the latest update on the lab- and pilot-scale bioSNG development at ECN; and an overview of ongoing bioSNG activities worldwide

  4. 2nd Generation alkaline electrolysis. Final report

    Yde, L. [Aarhus Univ. Business and Social Science - Centre for Energy Technologies (CET), Aarhus (Denmark); Kjartansdottir, C.K. [Technical Univ. of Denmark. DTU Mechanical Engineering, Kgs. Lyngby (Denmark); Allebrod, F. [Technical Univ. of Denmark. DTU Energy Conversion, DTU Risoe Campus, Roskilde (Denmark)] [and others

    2013-03-15

    The overall purpose of this project has been to contribute to this load management by developing a 2nd generation alkaline electrolysis system characterized by being compact, reliable, inexpensive and energy efficient. The specific targets for the project have been to: 1) Increase cell efficiency to more than 88% (according to the higher heating value (HHV)) at a current density of 200 mA/cm²; 2) Increase operation temperature to more than 100 °C to make the cooling energy more valuable; 3) Obtain an operation pressure of more than 30 bar, thereby minimizing the need for further compression of hydrogen for storage; 4) Improve stack architecture, decreasing the price of the stack by at least 50%; 5) Develop a modular design making it easy to customize plants in sizes from 20 to 200 kW; 6) Demonstrate a 20 kW 2nd generation stack in H2College at the campus of Aarhus University in Herning. The project has included research and development on three different electrode technology tracks: electrochemical plating, atmospheric plasma spray (APS), and finally a high temperature and pressure (HTP) track with operating temperature around 250 °C and pressure around 40 bar. The results show that all three electrode tracks have reached high energy efficiencies. In the electrochemical plating track a stack efficiency of 86.5% at a current density of 177 mA/cm² and a temperature of 74.4 °C has been shown. The APS track showed cell efficiencies of 97%; however, coatings for the anode side still need to be developed. The HTP cell has reached 100% electric efficiency operating at 1.5 V (the thermoneutral voltage) with a current density of 1.1 A/cm². This track only tested small cells in an externally heated laboratory set-up, and thus the thermal loss to the surroundings cannot be given. The goal set for the 2nd generation electrolyser system has been to generate 30 bar pressure in the cell stack. An obstacle to be
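
    The efficiency figures above can be related directly to cell voltage: on an HHV basis, electrolysis efficiency is commonly taken as the thermoneutral voltage (about 1.48 V, which the abstract rounds to 1.5 V) divided by the actual cell voltage. A sketch of that arithmetic - the 1.48 V constant is a standard textbook value, not a project-specific parameter:

```python
E_TN_HHV = 1.48  # thermoneutral voltage of water electrolysis (HHV basis), volts


def hhv_efficiency(cell_voltage: float) -> float:
    """HHV efficiency of an electrolysis cell operated at cell_voltage (V)."""
    return E_TN_HHV / cell_voltage


def implied_cell_voltage(efficiency: float) -> float:
    """Cell voltage implied by a quoted HHV efficiency (fraction, not percent)."""
    return E_TN_HHV / efficiency


# The 86.5% stack efficiency reported for the plating track implies an
# average cell voltage of about 1.48 / 0.865, i.e. roughly 1.71 V per cell.
```

    Operating at exactly the thermoneutral voltage gives 100% HHV efficiency by this definition, which is consistent with the HTP cell result quoted in the abstract.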

  5. Performance and validation of COMPUCEA 2nd generation for uranium measurements in physical inventory verifications

    Full text: In order to somewhat alleviate the kind of logistical problems encountered in the in-field measurements with the current COMPUCEA equipment (COMbined Product for Uranium Content and Enrichment Assay), and with the expected benefits of saving some time and costs for the missions in mind, ITU is presently developing a 2nd generation of the COMPUCEA device. This new development also forms a task in the support programme of the Joint Research Centre of the European Commission to the IAEA. To validate the in-field performance of the newly developed 2nd generation COMPUCEA, a prototype has been tested together with the 1st generation equipment during physical inventory verification (PIV) measurements in different uranium fuel fabrication plants in Europe. In this paper we will present the prototype of COMPUCEA 2nd generation, its hardware as well as the software developed for the evaluation of the U content and 235U enrichment. We will show a comparison of the performance of the 2nd generation with the 1st generation on a larger number of uranium samples measured during the in-field PIVs. The observed excellent performance of the new COMPUCEA represents an important step in the validation of this new instrument. (author)

  6. The 2nd Generation Real Time Mission Monitor (RTMM) Development

    Blakeslee, R. J.; Goodman, M.; Hardin, D. M.; Hall, J.; Yubin He, M.; Regner, K.; Conover, H.; Smith, T.; Meyer, P.; Lu, J.; Garrett, M.

    2009-12-01

    The NASA Real Time Mission Monitor (RTMM) is a visualization and information system that fuses multiple Earth science data sources to enable real-time decision-making for airborne and ground validation experiments. Developed at the National Aeronautics and Space Administration (NASA) Marshall Space Flight Center, RTMM is a situational awareness, decision-support system that integrates satellite imagery and orbit data, radar and other surface observations (e.g., lightning location network data), airborne navigation and instrument data sets, model output parameters, and other applicable Earth science data sets. The integration and delivery of this information is made possible using data acquisition systems, network communication links, network server resources, and visualizations through the Google Earth virtual globe application. In order to improve the usefulness and efficiency of the RTMM system, capabilities are being developed to allow the end user to easily configure RTMM applications based on their mission-specific requirements and objectives. This second generation RTMM is being redesigned to take advantage of the Google plug-in capabilities to run multiple applications in a web browser rather than the original single-application Google Earth approach. Currently RTMM employs a limited Service Oriented Architecture approach to enable discovery of mission-specific resources. We are expanding the RTMM architecture such that it will more effectively utilize the Open Geospatial Consortium Sensor Web Enablement services and other new technology software tools and components. These modifications and extensions will result in a robust, versatile RTMM system that will greatly increase the flexibility of the user to choose which science data sets and support applications to view and/or use. The improvements brought about by the 2nd generation RTMM system will provide mission planners and airborne scientists with enhanced decision-making tools and capabilities to more

  7. A Zero-Dimensional Model of a 2nd Generation Planar SOFC Using Calibrated Parameters

    Brian Elmegaard; Niels Houbak; Thomas Frank Petersen

    2006-01-01

    This paper presents a zero-dimensional mathematical model of a planar 2nd generation coflow SOFC developed for simulation of power systems. The model accounts for the electrochemical oxidation of hydrogen as well as the methane reforming reaction and the water-gas shift reaction. An important part of the paper is the electrochemical sub-model, where experimental data was used to calibrate specific parameters. The SOFC model was implemented in the DNA simulation software which is designed for ...

  8. The 2nd Generation VLTI path to performance

    Woillez, Julien; Berger, Jean-Philippe; Bonnet, Henri; de Wit, Willem-Jan; Egner, Sebastian; Eisenhauer, Frank; Gonté, Frédéric; Guieu, Sylvain; Haguenauer, Pierre; Mérand, Antoine; Pettazzi, Lorenzo; Poupar, Sébastien; Schöller, Markus; Schuhler, Nicolas

    2016-01-01

    The upgrade of the VLTI infrastructure for the 2nd generation instruments is now complete with the transformation of the laboratory, and installation of star separators on both the 1.8-m Auxiliary Telescopes (ATs) and the 8-m Unit Telescopes (UTs). The Gravity fringe tracker has had a full semester of commissioning on the ATs, and a first look at the UTs. The CIAO infrared wavefront sensor is about to demonstrate its performance relative to the visible wavefront sensor MACAO. First astrometric measurements on the ATs and astrometric qualification of the UTs are ongoing. Now is a good time to revisit the performance roadmap for VLTI initiated in 2014, which aims at coherently driving the developments of the interferometer, and especially its performance, in support of the new generation of instruments: Gravity and MATISSE.

  9. Super Boiler 2nd Generation Technology for Watertube Boilers

    Mr. David Cygan; Dr. Joseph Rabovitser

    2012-03-31

    This report describes Phase I of a proposed two-phase project to develop and demonstrate an advanced industrial watertube boiler system with the capability of reaching 94% (HHV) fuel-to-steam efficiency and emissions below 2 ppmv NOx, 2 ppmv CO, and 1 ppmv VOC on natural gas fuel. The boiler design would have the capability to produce >1500 F, >1500 psig superheated steam, burn multiple fuels, and be 50% smaller/lighter than currently available watertube boilers of similar capacity. This project builds upon the successful Super Boiler project at GTI, which employed a unique two-stage intercooled combustion system and an innovative heat recovery system to reduce NOx to below 5 ppmv and demonstrated a fuel-to-steam efficiency of 94% (HHV). This project was carried out under the leadership of GTI with project partners Cleaver-Brooks, Inc., Nebraska Boiler, a Division of Cleaver-Brooks, and Media and Process Technology Inc., and project advisors Georgia Institute of Technology, Alstom Power Inc., Pacific Northwest National Laboratory and Oak Ridge National Laboratory. Phase I efforts focused on developing 2nd generation boiler concepts and performance modeling; incorporating multi-fuel (natural gas and oil) capabilities; assessing heat recovery, heat transfer and steam superheating approaches; and developing the overall conceptual engineering boiler design. Based on our analysis, the 2nd generation Industrial Watertube Boiler, when developed and commercialized, could potentially save 265 trillion Btu and $1.6 billion in fuel costs across U.S. industry through increased efficiency. Its ultra-clean combustion could eliminate 57,000 tons of NOx, 460,000 tons of CO, and 8.8 million tons of CO2 annually from the atmosphere. Reduction in boiler size will bring cost-effective package boilers into a size range previously dominated by more expensive field-erected boilers, benefiting manufacturers and end users through lower capital costs.

  10. A Zero-Dimensional Model of a 2nd Generation Planar SOFC Using Calibrated Parameters

    Brian Elmegaard

    2006-12-01

    This paper presents a zero-dimensional mathematical model of a planar 2nd generation coflow SOFC developed for simulation of power systems. The model accounts for the electrochemical oxidation of hydrogen as well as the methane reforming reaction and the water-gas shift reaction. An important part of the paper is the electrochemical sub-model, where experimental data was used to calibrate specific parameters. The SOFC model was implemented in the DNA simulation software which is designed for energy system simulation. The result is an accurate and flexible tool suitable for simulation of many different SOFC-based power systems.
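
    A zero-dimensional SOFC voltage model of the kind described here typically computes a Nernst potential from the gas partial pressures and subtracts lumped losses proportional to current density. The sketch below uses an illustrative standard potential and area-specific resistance - not the calibrated parameters from the paper's DNA implementation:

```python
from math import log, sqrt

R = 8.314    # gas constant, J/(mol K)
F = 96485.0  # Faraday constant, C/mol


def nernst_voltage(T, p_h2, p_h2o, p_o2, e0=0.98):
    """Open-circuit (Nernst) voltage for H2 oxidation; pressures in bar, T in K.
    e0 is an illustrative standard potential at operating temperature, volts."""
    return e0 + (R * T / (2.0 * F)) * log(p_h2 * sqrt(p_o2) / p_h2o)


def cell_voltage(i, T, p_h2, p_h2o, p_o2, asr=0.35):
    """Zero-dimensional cell voltage: Nernst potential minus lumped losses.
    i is current density in A/cm2; asr is an illustrative area-specific
    resistance in ohm cm2 standing in for the calibrated loss terms."""
    return nernst_voltage(T, p_h2, p_h2o, p_o2) - i * asr
```

    For example, at 1073 K with anode gas of 97% H2 / 3% H2O and air at the cathode, the open-circuit voltage comes out around 1.1 V, and the terminal voltage then falls linearly with current density.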

  11. Proceedings 2nd Workshop on Formal Methods in the Development of Software

    Andrés, César; Llana, Luis

    2012-01-01

    This volume contains the proceedings of the 2nd Workshop on Formal Methods in the Development of Software (WS-FMDS 2012). The workshop was held in Paris, France on August 30th, 2012 as a satellite event to the 18th International Symposium on Formal Methods (FM-2012). The aim of WS-FMDS 2012 is to provide a forum for researchers who are interested in the application of formal methods to systems being developed with a software methodology. In particular, this workshop is intended to ...

  12. From 1st- to 2nd-Generation Biofuel Technologies: Extended Executive Summary

    NONE

    2008-07-01

    This report looks at the technical challenges facing 2nd-generation biofuels, evaluates their costs and examines related current policies to support their development and deployment. The potential for production of more advanced biofuels is also discussed. Although significant progress continues to be made to overcome the technical and economic challenges, 2nd-generation biofuels still face major constraints to their commercial deployment.

  13. Intergenerational Transmission and the School-to-work Transition for 2nd Generation Immigrants

    Nielsen, Helena Skyt; Rosholm, Michael; Smith, Nina;

    2001-01-01

    We analyse the extent of intergenerational transmission through parental capital, ethnic capital and neighbourhood effects on several aspects of the school-to-work transition of 2nd generation immigrants and young ethnic Danes. The main findings are that parental capital has strong positive effects on the probability of completing a qualifying education and on the entry into the labour market, but it has a much smaller impact on the duration of the first employment spell and on the wage level. Growing up in neighbourhoods with a high concentration of immigrants is associated with negative labour market prospects both for young natives and 2nd generation immigrants.


  15. The 1997 Protocol and the European Union (European Union and '2nd generation' responsibility conventions)

    The issue of accession of the Eastern European Member States to the 1997 Protocol is discussed with a focus on the European Union's authority and enforcement powers. Following up on the article published in the preceding issue of this journal, the present contribution analyses the relations of the '2nd generation' responsibility conventions to the law of the European Union. (orig.)

  16. Performance and validation of COMPUCEA 2nd generation for uranium measurements in physical inventory verification

    A new instrumental version of COMPUCEA has been developed with the aim to provide a simplified and more practical instrumentation for in-field use. The main design goals were to eliminate the radioactive sources and the liquid nitrogen-cooled Ge detectors used in the 1st generation of COMPUCEA. This paper describes the major technical features of the 2nd generation of equipment together with typical performance data. The performance tests carried out during first in-field measurements in the course of physical inventory verification campaigns represent an important step in the validation of this new instrument. (author)

  17. Systems Engineering Approach to Technology Integration for NASA's 2nd Generation Reusable Launch Vehicle

    Thomas, Dale; Smith, Charles; Thomas, Leann; Kittredge, Sheryl

    2002-01-01

    The overall goal of the 2nd Generation RLV Program is to substantially reduce technical and business risks associated with developing a new class of reusable launch vehicles. NASA's specific goals are to improve the safety of a 2nd-generation system by 2 orders of magnitude - equivalent to a crew risk of 1-in-10,000 missions - and decrease the cost tenfold, to approximately $1,000 per pound of payload launched. Architecture definition is being conducted in parallel with the maturation of key technologies specifically identified to improve safety and reliability, while reducing operational costs. An architecture broadly includes an Earth-to-orbit reusable launch vehicle, on-orbit transfer vehicles and upper stages, mission planning, ground and flight operations, and support infrastructure, both on the ground and in orbit. The systems engineering approach ensures that the technologies developed - such as lightweight structures, long-life rocket engines, reliable crew escape, and robust thermal protection systems - will synergistically integrate into the optimum vehicle. To best direct technology development decisions, analytical models are employed to accurately predict the benefits of each technology toward potential space transportation architectures as well as the risks associated with each technology. Rigorous systems analysis provides the foundation for assessing progress toward safety and cost goals. The systems engineering review process factors in comprehensive budget estimates, detailed project schedules, and business and performance plans, against the goals of safety, reliability, and cost, in addition to overall technical feasibility. This approach forms the basis for investment decisions in the 2nd Generation RLV Program's risk-reduction activities. Through this process, NASA will continually refine its specialized needs and identify where Defense and commercial requirements overlap those of civil missions.

  18. Boosting biogas yield of anaerobic digesters by utilizing concentrated molasses from 2nd generation bioethanol plant

    Shiplu Sarker, Henrik Bjarne Møller

    2013-01-01

    Concentrated molasses (C5 molasses) from a 2nd generation bioethanol plant has been investigated for enhancing the productivity of manure-based digesters. A batch study at mesophilic conditions (35 ± 1 °C) showed a maximum methane yield from molasses of 286 L CH4/kg VS, which was approximately 63% of the calculated theoretical yield. In addition to the batch study, co-digestion of molasses with cattle manure in a semi-continuously stirred reactor at thermophilic temperature (50 ± 1 °C) was also performed wi...

  19. Effects of Thermal Cycling on Control and Irradiated EPC 2nd Generation GaN FETs

    Patterson, Richard L.; Scheick, Leif; Lauenstein, Jean-Marie; Casey, Megan; Hammoud, Ahmad

    2013-01-01

    The power systems for use in NASA space missions must work reliably under harsh conditions including radiation, thermal cycling, and exposure to extreme temperatures. Gallium nitride semiconductors show great promise, but information pertaining to their performance is scarce. Gallium nitride N-channel enhancement-mode field effect transistors made by EPC Corporation in a 2nd generation of manufacturing were exposed to radiation followed by long-term thermal cycling in order to address their reliability for use in space missions. Results of the experimental work are presented and discussed.

  20. Advanced Electron Beam Ion Sources (EBIS) for 2-nd generation carbon radiotherapy facilities

    In this work we analyze how advanced Electron Beam Ion Sources (EBIS) can facilitate the progress of carbon therapy facilities. We will demonstrate that advanced ion sources enable operation of 2nd generation ion beam therapy (IBT) accelerators. These new accelerator concepts, with designs dedicated to IBT, provide beams better suited for therapy and are more cost-efficient than contemporary IBT facilities. We will give a short overview of the existing new IBT concepts and focus on those where ion source technology is the limiting factor. We will analyse whether this limitation can be overcome in the near future thanks to ongoing EBIS development

  1. Advanced Electron Beam Ion Sources (EBIS) for 2-nd generation carbon radiotherapy facilities

    Shornikov, A.; Wenander, F.

    2016-04-01

    In this work we analyze how advanced Electron Beam Ion Sources (EBIS) can facilitate the progress of carbon therapy facilities. We will demonstrate that advanced ion sources enable operation of 2nd generation ion beam therapy (IBT) accelerators. These new accelerator concepts, with designs dedicated to IBT, provide beams better suited for therapy and are more cost-efficient than contemporary IBT facilities. We will give a short overview of the existing new IBT concepts and focus on those where ion source technology is the limiting factor. We will analyse whether this limitation can be overcome in the near future thanks to ongoing EBIS development.

  2. Emotional and Behavioral Disorders in 1.5th Generation, 2nd Generation Immigrant Children, and Foreign Adoptees.

    Tan, Tony Xing

    2016-10-01

    Existing theories (e.g., acculturative stress theory) cannot adequately explain why mental disorders in immigrants are less prevalent than in non-immigrants. In this paper, the culture-gene co-evolutionary theory of mental disorders was utilized to generate a novel hypothesis that connection to heritage culture reduces the risk for mental disorders in immigrant children. Four groups of children aged 2-17 years were identified from the 2007 United States National Survey of Children's Health: 1.5th generation immigrant children (n = 1378), 2nd generation immigrant children (n = 4194), foreign adoptees (n = 270), and non-immigrant children (n = 54,877). The 1.5th generation immigrant children's connection to their heritage culture is stronger than or similar to that of the 2nd generation immigrants, while the foreign adoptees have little connection to their birth culture. Controlling for age, sex, family type and SES, the odds of having an ADD/ADHD, Conduct Disorder, Anxiety Disorder, or Depression diagnosis were lowest for the 1.5th generation immigrant children, followed by the 2nd generation immigrant children and the foreign adoptees. The foreign adoptees and non-adopted children were similar in the odds of having these disorders. Connection to heritage culture might be the underlying mechanism explaining recent immigrants' lower rates of mental disorders. PMID:26972324

  3. The Planar Optics Phase Sensor: a study for the VLTI 2nd Generation Fringe Tracker

    Blind, Nicolas; Absil, Olivier; Alamir, Mazen; Berger, Jean-Philippe; Defrère, Denis; Feautrier, Philippe; Hénault, François; Jocou, Laurent; Kern, Pierre; Laurent, Thomas; Malbet, Fabien; Mourard, Denis; Rousselet-Perrault, Karine; Sarlette, Alain; Surdej, Jean; Tarmoul, Nassima; Tatulli, Eric; Vincent, Lionel; 10.1117/12.857114

    2010-01-01

    In a few years, the second generation instruments of the Very Large Telescope Interferometer (VLTI) will routinely provide observations with 4 to 6 telescopes simultaneously. To reach their ultimate performance, they will need a fringe sensor capable of measuring in real time the randomly varying optical path differences. A collaboration between LAOG (PI institute), IAGL, OCA and GIPSA-Lab has proposed the Planar Optics Phase Sensor concept to ESO for the 2nd Generation Fringe Tracker. This concept is based on integrated optics technologies, enabling the conception of extremely compact interferometric instruments that naturally provide single-mode spatial filtering. It allows operation with 4 and 6 telescopes by measuring the fringe position with a spectrally dispersed ABCD method. We present here the main analysis which led to the current concept as well as the expected on-sky performance and the proposed design.
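
    The ABCD method mentioned above estimates the fringe phase from four intensity samples taken in phase quadrature. A generic, single-channel (non-dispersed) illustration of the phase extraction - the sampling convention below is the textbook one, not necessarily the instrument's exact implementation:

```python
from math import atan2, cos, pi


def abcd_phase(a, b, c, d):
    """Fringe phase from four samples taken 90 degrees apart
    (A at 0, B at 90, C at 180, D at 270 degrees of added phase shift)."""
    return atan2(b - d, a - c)


def fringe_intensity(phi, shift, visibility=0.8):
    """Interferometric intensity for fringe phase phi sampled at a given shift."""
    return 1.0 + visibility * cos(phi - shift)
```

    Sampling at shifts of 0, π/2, π and 3π/2 gives B - D = 2V sin(φ) and A - C = 2V cos(φ), so the arctangent recovers φ independently of the fringe visibility V.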

  4. Enabling the 2nd Generation in Space: Building Blocks for Large Scale Space Endeavours

    Barnhardt, D.; Garretson, P.; Will, P.

    Today the world operates within a "first generation" space industrial enterprise, i.e. all industry is on Earth, all value from space is from bits (essentially data), and the focus is Earth-centric, with very limited parts of our population and industry participating in space. We are limited in access, manoeuvring, on-orbit servicing, in-space power, and in-space manufacturing and assembly. The transition to a "Starship culture" requires the Earth to progress to a "second generation" space industrial base, which implies the need to expand the economic sphere of activity of mankind outside of an Earth-centric zone and into cis-lunar space and beyond, with an equal ability to tap the indigenous resources in space (energy, location, materials) that will contribute to an expanding space economy. Right now, there is no comfortable place for space applications that are not discovery science, exploration, military, or established Earth-bound services. For the most part, space applications leave out - or at least leave nebulous, unconsolidated, and without a critical mass - programs and development efforts for infrastructure, industrialization, space resources (survey and process maturation), non-traditional and persistent security situational awareness, and global utilities - all of which, to a far greater extent than a discovery and exploration program, may help determine the elements of a 2nd generation space capability. We propose a focus to seed the pre-competitive research that will enable global industry to develop the competencies that we currently lack to build large-scale space structures on-orbit, which in turn would lay the foundation for long-duration spacecraft travel (i.e. key technologies in access, manoeuvrability, etc.). This paper will posit a vision-to-reality for a stepwise approach to the types of activities the US and global space providers could embark upon to lay the foundation for the 2nd generation of Earth in space.

  5. The New 2nd-Generation SRF R&D Facility at Jefferson Lab: TEDF

    Reece, Charles E.; Reilly, Anthony V.

    2012-09-01

    The US Department of Energy has funded a near-complete renovation of the SRF-based accelerator research and development facilities at Jefferson Lab. The project to accomplish this, the Technical and Engineering Development Facility (TEDF) Project, has completed the first of two phases. An entirely new 3,100 m² purpose-built SRF technical work facility has been constructed and was occupied in summer of 2012. All SRF work processes with the exception of cryogenic testing have been relocated into the new building. All cavity fabrication, processing, thermal treatment, chemistry, cleaning, and assembly work is collected conveniently into a new LEED-certified building. An innovatively designed 800 m² cleanroom/chemroom suite provides long-term flexibility for support of multiple R&D and construction projects as well as continued process evolution. The characteristics of this first 2nd-generation SRF facility are described.

  6. Boosting biogas yield of anaerobic digesters by utilizing concentrated molasses from 2nd generation bioethanol plant

    Sarker, Shiplu [Department of Renewable Energy, Faculty of Engineering and Science, University of Agder, Grimstad-4879 (Norway); Moeller, Henrik Bjarne [Department of Biosystems Engineering, Faculty of Science and Technology, Aarhus University, Research center Foulum, Blichers Alle, Post Box 50, Tjele-8830 (Denmark)

    2013-07-01

    Concentrated molasses (C5 molasses) from a 2nd generation bioethanol plant has been investigated for enhancing the productivity of manure-based digesters. A batch study at mesophilic conditions (35 ± 1 °C) showed a maximum methane yield from molasses of 286 L CH4/kg VS, which was approximately 63% of the calculated theoretical yield. In addition to the batch study, co-digestion of molasses with cattle manure in a semi-continuously stirred reactor at thermophilic temperature (50 ± 1 °C) was also performed with a stepwise increase in molasses concentration. The results from this experiment revealed a maximum average biogas yield of 1.89 L/L/day when 23% VS from molasses was co-digested with cattle manure. However, digesters fed with more than 32% VS from molasses and with a short adaptation period resulted in VFA accumulation and reduced methane productivity, indicating that when using molasses as a biogas booster this level should not be exceeded.
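
    The "calculated theoretical yield" against which the 286 L CH4/kg VS figure is compared is typically derived from the substrate's elemental composition via the Buswell stoichiometry. A sketch for a generic substrate CnHaOb - the polysaccharide composition used in the comment is illustrative; the actual calculation would use the measured composition of the molasses:

```python
def buswell_ch4_yield(n, a, b):
    """Theoretical CH4 yield (L at STP per kg VS) for a substrate CnHaOb,
    from the Buswell stoichiometry:
    CnHaOb + (n - a/4 - b/2) H2O -> (n/2 + a/8 - b/4) CH4 + (n/2 - a/8 + b/4) CO2
    """
    mol_ch4 = n / 2.0 + a / 8.0 - b / 4.0       # mol CH4 per mol substrate
    molar_mass = 12.0 * n + 1.0 * a + 16.0 * b  # g/mol
    return 22400.0 * mol_ch4 / molar_mass       # L CH4 per kg substrate


# For a polysaccharide unit C6H10O5 this gives about 415 L CH4/kg VS.
# The abstract's 286 L/kg at 63% of theoretical implies roughly
# 286 / 0.63, i.e. about 454 L/kg, for the actual molasses composition.
```

    The gap between the carbohydrate value and the implied ~454 L/kg simply reflects that molasses VS is not pure polysaccharide; the Buswell figure shifts with the H and O content of the real substrate.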

  7. 2nd International Workshop on Crowd Sourcing in Software Engineering (CSI-SE 2015)

    Fraser, G; Latoza, T.D.; Mariani, L

    2015-01-01

    Crowdsourcing is increasingly revolutionizing the ways in which software is engineered. Programmers increasingly crowdsource answers to their questions through Q&A sites. Non-programmers may contribute human intelligence to development projects by, for example, usability testing software or playing games with a purpose to implicitly construct formal specifications. Crowdfunding helps to democratize decisions about what software to build. Software engineering researchers may even benefit...

  8. Geodesign from Theory to Practice: From Metaplanning to 2nd Generation of Planning Support Systems

    Michele Campagna

    2014-05-01

    This paper deals with the concept of Geodesign, a new approach to spatial planning and design which is grounded in extensive use of Geographic Information Science methods and tools. As a method, Geodesign is intended to inform projects from their conceptualization, through analysis and diagnosis, to the design of alternatives and impact simulation, and eventually the final choice. This approach appears particularly urgent and timely to many scholars from academia and practitioners from industry and planning practice, for advances in GIScience nowadays offer unprecedented data and tools to manage territorial knowledge for decision-making support. The author argues that research in Geodesign may contribute to solving major pitfalls in sustainable spatial planning: namely, it may offer methods to help planners inform sustainable design alternatives with environmental considerations and contextually assess their impacts; secondly, it may help to ensure more transparent, responsible, and accountable democratic decision-making processes. The argumentation is supported by the author's recent research results with regard to the evolution from 1st generation Planning Support Systems (PSS), to metaplanning and 2nd generation PSS.

  9. Improved beam spot measurements in the 2nd generation proton beam writing system

    Nanosized ion beams (especially proton and helium) play a pivotal role in the field of ion beam lithography and ion beam analysis. Proton beam writing has shown lithographic details down to the 20 nm level, limited by the proton beam spot size. Introducing a smaller spot size will allow smaller lithographic features. Smaller probe sizes will also drastically improve the spatial resolution of ion beam analysis techniques. Among many other requirements, having an ideal resolution standard, used for beam focusing, and a reliable focusing method is an important prerequisite for sub-10 nm beam spot focusing. In this paper we present the fabrication processes of a free-standing resolution standard with reduced side-wall projection and high side-wall verticality. The resulting grid is orthogonal (90.0° ± 0.1°) and has smooth edges with better than 6 nm side-wall projection. The new resolution standard has been used in focusing a 2 MeV H2+ beam in the 2nd generation PBW system at the Center for Ion Beam Applications, NUS. The beam size has been characterized using on- and off-axis scanning transmission ion microscopy (STIM) and ion induced secondary electron detection, carried out with a newly installed micro channel plate electron detector. The latter has been shown to be a realistic alternative to STIM measurements, as the drawback of PIN diode detector damage is alleviated. With these improvements we show reproducible beam focusing down to 14 nm.

  10. Conceptual design study of $Nb_{3} Sn$ low-beta quadrupoles for 2nd generation LHC IRs

    Zlobin, A V; Andreev, N; Barzi, E; Bauer, P; Chichili, D R; Huang, Y; Imbasciati, L; Kashikhin, V V; Lamm, M J; Limon, P; Novitski, I; Peterson, T; Strait, J B; Yadav, S; Yamada, R

    2003-01-01

    Conceptual designs of 90-mm aperture high-gradient quadrupoles based on the Nb3Sn superconductor are being developed at Fermilab for possible 2nd generation IRs with optics similar to the current low-beta insertions. Magnet designs and results of magnetic, mechanical, thermal and quench protection analysis for these magnets are presented and discussed. (10 refs).

  11. Multi-objective Optimization of a Solar Assisted 1st and 2nd Generation Sugarcane Ethanol Production Plant

    Zevenhoven, Ron; Wallerand, Anna Sophia; Queiroz Albarelli, Juliana; Viana Ensinas, Adriano; Ambrosetti, Gianluca; Mian, Alberto; Maréchal, François

    2014-01-01

    Ethanol production sites utilizing sugarcane as feedstock are usually located in regions with high land availability and decent solar radiation. This offers the opportunity to cover parts of the process energy demand with concentrated solar power (CSP) and thereby increase the fuel production and carbon conversion efficiency. A plant is examined that produces 1st and 2nd generation ethanol by fermentation of sugars (from sugarcane) and enzymatic hydrolysis of the lignocellulosic residues (bag...

  12. Generative Software Development

    Rumpe, Bernhard; Schindler, Martin; Völkel, Steven; Weisemöller, Ingo

    2014-01-01

    Generation of software from modeling languages such as UML and domain specific languages (DSLs) has become an important paradigm in software engineering. In this contribution, we present some positions on software development in a model based, generative manner based on home grown DSLs as well as the UML. This includes development of DSLs as well as development of models in these languages in order to generate executable code, test cases or models in different languages. Development of formal...
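    As a minimal illustration of the model-to-code idea described above (a hypothetical sketch, not the authors' toolchain): a tiny "model" of a data class, standing in for a DSL or UML model, is turned into executable source by a template-based generator.

```python
# Hypothetical sketch of model-based code generation: a dict acts as a
# tiny DSL model of a data class, and a template renders it to Python
# source, which can then be compiled and used like hand-written code.
from string import Template

CLASS_TEMPLATE = Template('''\
class $name:
    def __init__(self, $args):
$assigns
''')

def generate_class(model):
    """Generate Python source for a simple data class from a model."""
    fields = model["fields"]
    args = ", ".join(fields)
    assigns = "\n".join(f"        self.{f} = {f}" for f in fields)
    return CLASS_TEMPLATE.substitute(
        name=model["name"], args=args, assigns=assigns)

model = {"name": "Point", "fields": ["x", "y"]}
source = generate_class(model)

namespace = {}
exec(source, namespace)          # "compile" the generated code
p = namespace["Point"](3, 4)
print(p.x, p.y)                  # the generated class behaves as written
```

Real generators (for UML or home-grown DSLs) differ mainly in scale: richer metamodels, template engines, and multiple target languages, but the model-inspect-render pipeline is the same.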

  13. BMI differences in 1st and 2nd generation immigrants of Asian and European origin to Australia.

    Hauck, Katharina; Hollingsworth, Bruce; Morgan, Lawrie

    2011-01-01

    We estimate assimilation of immigrants' body mass index (BMI) to the host population of Australia over one generation, conducting separate analyses for immigrants from 7 regions of Europe and Asia. We use quantile regressions to allow for differing impact of generational status across 19 quantiles of BMI from under-weight to morbidly obese individuals. We find that 1st generation South European immigrants have higher, and South and East Asian immigrants have lower BMI than Australians, but have assimilated to the BMI of their hosts in the 2nd generation. There are no or only small BMI differences between Australians and 1st and 2nd generation immigrants from East Europe, North-West Europe, Middle East and Pacific regions. We conclude that both upward and downward assimilation in some immigrant groups is most likely caused by factors which can change over one generation (such as acculturation), and not factors which would take longer to change (such as genetics). Our results suggest that public health policies targeting the lifestyles of well educated Asian immigrants may be effective in preventing BMI increase in this subgroup. PMID:20869292
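    The quantile-regression design described above can be illustrated with a toy sketch (data here are hypothetical): with a single binary regressor such as generational status, quantile regression at level q reduces to the difference between the two groups' q-th quantiles.

```python
# Toy sketch of the paper's method (hypothetical BMI data, kg/m^2):
# quantile regression of BMI on a generational-status dummy is, for a
# single binary regressor, the gap between group quantiles at each q.

def quantile(xs, q):
    """Linear-interpolated sample quantile (numpy's default method)."""
    xs = sorted(xs)
    pos = q * (len(xs) - 1)
    lo = int(pos)
    hi = min(lo + 1, len(xs) - 1)
    frac = pos - lo
    return xs[lo] * (1 - frac) + xs[hi] * frac

def quantile_gap(group, reference, levels):
    """BMI gap (group minus reference) at each quantile level."""
    return {q: quantile(group, q) - quantile(reference, q)
            for q in levels}

hosts = [19, 21, 22, 23, 24, 25, 26, 27, 29, 33]       # host population
immigrants = [18, 20, 21, 22, 23, 23, 24, 25, 27, 30]  # 1st generation

gaps = quantile_gap(immigrants, hosts, [0.1, 0.5, 0.9])
for q, gap in sorted(gaps.items()):
    print(f"q={q:.1f}: gap={gap:+.2f}")
```

The paper's full analysis adds covariates, which requires proper quantile regression (minimizing the pinball loss, e.g. statsmodels' QuantReg), but the interpretation per quantile is the same.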

  14. White paper on perspectives of biofuels in Denmark - with focus on 2nd generation bioethanol; Hvidbog om perspektiver for biobraendstoffer i Danmark - med fokus paa 2. generations bioethanol

    Larsen, Gy.; Foghmar, J.

    2009-11-15

    The white paper presents the perspectives - both options and barriers - for a Danish focus on production and use of biomass, including sustainable 2nd generation bioethanol, for transport. The white paper presents the current knowledge of biofuels and bioethanol and recommendations for a Danish strategy. (ln)

  15. Time resolved 2nd harmonic generation at LaAlO3/SrTiO3 Interfaces

    Adhikari, Sanjay; Eom, Chang-Beom; Ryu, Sangwoo; Cen, Cheng

    2014-03-01

    Ultrafast spectroscopy can produce information on carrier/lattice dynamics, which is especially valuable for understanding phase transitions at LaAlO3/SrTiO3 interfaces. LaAlO3 (LAO) and SrTiO3 (STO) both have wide band gaps, which allow deep penetration of commonly used laser wavelengths and therefore usually lead to an overwhelming bulk signal background. Here we report a time resolved study of the 2nd harmonic generation (SHG) signal resulting from impulsive below-the-band-gap optical pumping. The nonlinear nature of the signal enables us to probe the interface directly. The output of a home built Ti:Sapphire laser and a BBO crystal were used to generate 30 fs pulses of two colors (405 nm and 810 nm). The 405 nm pulse was used to pump the LAO/STO interfaces, while the 2nd harmonic of the 810 nm pulse generated at the interfaces was probed as a function of the time delay. Signals from samples with varying LAO thicknesses clearly correlate with the metal-insulator transition. Distinct time dependent signals were observed at LAO/STO interfaces grown on different substrates. Experiments performed at different optical polarization geometries, interface electric fields and temperatures allow us to paint a clearer picture of the novel oxide heterostructures under investigation.
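    Why SHG isolates the interface follows from standard nonlinear optics (background reasoning, not a claim from the abstract):

```latex
% Second-harmonic intensity scales quadratically with the pump intensity:
I(2\omega) \propto \left|\chi^{(2)}\right|^{2} I(\omega)^{2}
% In the electric-dipole approximation \chi^{(2)} vanishes in
% centrosymmetric media. Bulk LAO and STO are centrosymmetric, so the
% second harmonic is generated predominantly at the interface, where
% inversion symmetry is broken.
```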

  16. Clinical evaluation of the 2nd generation radio-receptor assay for anti-thyrotropin receptor antibodies (TRAb) in Graves' disease

    Full text: Detection of autoantibodies to the TSH receptor by radioreceptor assays (RRA) is widely requested in clinical practice for the diagnosis of Graves' disease and its differentiation from diffuse thyroid autonomy. Additionally, TRAb measurement during antithyroid drug treatment can be useful to evaluate the risk of disease relapse after therapy discontinuation. Nevertheless, some patients affected by Graves' disease are TRAb negative when the 1st generation assay is used. Recently a new RRA method for the TRAb assay was developed using the human recombinant TSH receptor and a solid-phase technique. The aim of our work was the comparison between 1st and 2nd generation TRAb assays in Graves' disease patients and, particularly, the evaluation of the 2nd generation test in a sub-group of patients affected by Graves' disease but with a negative 1st generation TRAb assay. We evaluated the diagnostic performance of a newly developed 2nd generation TRAb assay (DYNOtest(r) TRAK human, BRAHMS Diagnostica GmbH, Germany) in 46 patients affected by Graves' disease with a negative 1st generation TRAb assay (TRAK Assay(r), BRAHMS Diagnostica GmbH, Germany). Control groups of 50 Graves' disease patients with a positive 1st generation TRAb assay, 50 patients affected by Hashimoto's thyroiditis and 50 patients affected by nodular goiter were also examined. 41 out of 46 patients affected by Graves' disease with a negative 1st generation TRAb assay showed a positive 2nd generation test. The overall sensitivity of the 2nd generation test was significantly improved with respect to the 1st generation assay in Graves' disease patients (χ2 = 22.5, p<0.0001). 1 and 3 out of 50 patients affected by Hashimoto's thyroiditis were positive by the 1st and 2nd generation TRAb assay, respectively. All these patients showed primary hypothyroidism. No differences resulted in the euthyroid Hashimoto's thyroiditis sub-group or in the nodular goiter control group.
The 2nd generation TRAb assay is clearly more sensitive than the 1st generation assay.

  17. Large-aperture $Nb_{3}Sn$ quadrupoles for $2^{nd}$ generation LHC IRs

    Zlobin, A V; Chichili, D R; Huang Yu; Kashikhin, V V; Lamm, M J; Limon, P J; Mokhov, N V; Novitski, I; Peterson, T; Strait, J B; Yadav, S

    2002-01-01

    The 1st generation of low-beta quadrupoles for the LHC interaction region (IR) was designed to achieve the nominal LHC luminosity of 10^34 cm^-2 s^-1. Given that the lifetime of the 1st generation IR quadrupoles is limited by ionizing radiation to 6-7 years, the 2nd generation of IR quadrupoles has to be developed with the goal of achieving the ultimate luminosity of up to 10^35 cm^-2 s^-1. The IR quadrupole parameters such as nominal gradient, dynamic aperture, physical aperture and operation margins are the main factors limiting the machine performance. Conceptual designs of 90-mm aperture high-gradient quadrupoles, suitable for use in 2nd generation high-luminosity LHC IRs with similar optics, are presented. The issues related to the field gradient, field quality and operation margins are discussed. (5 refs).

  18. Mobile Radio Communications: Second and Third Generation Cellular and WATM Systems: 2nd

    Steele, R; Hanzo, L

    1999-01-01

    This comprehensive all-in-one reference work covers the fundamental physical aspects of mobile communications and explains the latest techniques employed in second and third generation digital cellular mobile radio systems. Mobile radio communications technology has progressed rapidly and it is now capable of the transmission of voice, data and image signals. This new edition reflects the current state-of-the-art by featuring: * Expanded and updated sections on voice compression techniques, i...

  19. Bellman's GAP : a 2nd generation language and system for algebraic dynamic programming

    Sauthoff, Georg

    2010-01-01

    The dissertation describes Bellman's GAP, a new programming system for writing dynamic programming algorithms over sequential data. It is the second generation implementation of the algebraic dynamic programming framework (ADP). The system includes the multi-paradigm language (GAP-L), its compiler (GAP-C), functional modules (GAP-M) and a web site (GAP Pages) to experiment with GAP-L programs. GAP-L includes declarative constructs, e.g. tree grammars to model the search space, and...

  20. Next generation LP system for maintenance in nuclear power reactors (2nd report)

    Laser peening is a surface enhancement process that introduces compressive residual stress in materials by irradiating laser pulses under an aqueous environment. The process utilizes the impulsive effect of the high-pressure plasma generated by ablative interaction of each laser pulse. Around a decade ago, the authors invented a new laser peening (LP) process that requires no surface preparation, whereas conventional processes required a coating to prevent the surface from melting. Taking advantage of the new process without surface preparation, we have applied laser peening without coating to nuclear power plants as preventive maintenance against stress corrosion cracking (SCC). Toshiba released the first LP system in 1999, which delivered laser pulses through waterproof pipes with mirrors. In 2002, fiber delivery was attained and significantly extended the applicability. Now the development of a new system has just been accomplished; it is extremely simple, reliable and easy to handle. (author)

  1. Self-assembling software generator

    Bouchard, Ann M. (Albuquerque, NM); Osbourn, Gordon C. (Albuquerque, NM)

    2011-11-25

    A technique to generate an executable task includes inspecting a task specification data structure to determine what software entities are to be generated to create the executable task, inspecting the task specification data structure to determine how the software entities will be linked after generating the software entities, inspecting the task specification data structure to determine logic to be executed by the software entities, and generating the software entities to create the executable task.
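    The three inspection steps in the abstract can be sketched as follows (all names and the specification layout are hypothetical illustrations, not the patented implementation): inspect the task specification for the entities to generate, for how they link, and for the logic each executes, then assemble the executable task.

```python
# Hedged sketch of the patented idea (names hypothetical): a task
# specification names the software entities, how they are linked, and
# the logic each executes; the generator assembles an executable task.

LOGIC_TABLE = {                 # logic fragments an entity may execute
    "double": lambda x: x * 2,
    "increment": lambda x: x + 1,
    "square": lambda x: x * x,
}

def generate_task(spec):
    # 1) inspect: which software entities are to be generated
    entities = {name: LOGIC_TABLE[spec["logic"][name]]
                for name in spec["entities"]}
    # 2) inspect: how the entities will be linked (here: a linear chain)
    order = spec["links"]
    # 3) generate the executable task by composing the linked entities
    def task(value):
        for name in order:
            value = entities[name](value)
        return value
    return task

spec = {
    "entities": ["a", "b"],
    "logic": {"a": "double", "b": "increment"},
    "links": ["a", "b"],
}
task = generate_task(spec)
print(task(5))   # double, then increment
```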

  2. Cogeneration and production of 2nd generation bio fuels using biomass gasification; Cogeneracion y produccion de biocombustibles de 2 generacion mediante gasificacion de biomasa

    Uruena Leal, A.; Diez Rodriguez, D.; Antolin Giraldo, G.

    2011-07-01

    Gasification is a thermochemical decomposition process in which a carbonaceous fuel, under certain conditions of temperature and oxygen deficiency, undergoes a series of reactions that produce gaseous products. It is now widely used because of its high energy performance and the versatility of these gaseous products, which can be used for energy and for 2nd generation bio fuels, reducing the emission of greenhouse gases. (Author)

  3. Open pit mine planning and design. Vol 1. Fundamentals; Vol. 2. CSMine software package and orebody case examples. 2nd.

    Hustrulid, W.; Kuchta, M. [University of Utah, Salt Lake City, UT (United States)

    2006-04-15

    This book is designed to be both a textbook and a reference book describing the principles involved in the planning and design of open pit mines. Volume 1 deals with the fundamental concepts involved in the planning and design of an open pit mine. The eight chapters cover mine planning, mining revenues and costs, orebody description, geometrical considerations, pit limits, production planning, mineral resources and ore reserves, and responsible mining. There is extensive coverage of environmental considerations and basic economic principles. A large number of examples have been included to illustrate the applications. A second volume is devoted to a mine design and software package, CSMine. CSMine is user-friendly mine planning and design software developed specifically to illustrate the practical application of the principles involved. It also comprises the CSMine tutorial, the CSMine user's manual and eight orebody case examples, including drillhole data sets for performing a complete open pit mine evaluation. 545 ills., 211 tabs.

  4. Strategies for 2nd generation biofuels in EU - Co-firing to stimulate feedstock supply development and process integration to improve energy efficiency and economic competitiveness

    The present biofuel policies in the European Union primarily stimulate 1st generation biofuels that are produced from conventional food crops. They may be a distraction from lignocellulose-based 2nd generation biofuels - and also from biomass use for heat and electricity - by keeping farmers' attention and significant investments focused on first generation biofuels and the cultivation of conventional food crops as feedstocks. This article presents two strategies that can contribute to the development of 2nd generation biofuels based on lignocellulosic feedstocks. The integration of gasification-based biofuel plants into district heating systems is one option for increasing the energy efficiency and improving the economic competitiveness of such biofuels. Another option, biomass co-firing with coal, generates high-efficiency biomass electricity and reduces CO2 emissions by replacing coal. It also offers a near-term market for lignocellulosic biomass, which can stimulate the development of supply systems for biomass also suitable as feedstock for 2nd generation biofuels. Regardless of the long-term priorities of biomass use for energy, the stimulation of lignocellulosic biomass production by the development of near-term and cost-effective markets is judged to be a no-regrets strategy for Europe. Strategies that induce a relevant development and exploit existing energy infrastructures in order to reduce risk and reach lower costs are proposed as an attractive complement to the present and prospective biofuel policies. (author)

  5. Strategies for 2nd generation biofuels in EU - Co-firing to stimulate feedstock supply development and process integration to improve energy efficiency and economic competitiveness

    The present biofuel policies in the European Union primarily stimulate 1st generation biofuels that are produced from conventional food crops. They may be a distraction from lignocellulose-based 2nd generation biofuels - and also from biomass use for heat and electricity - by keeping farmers' attention and significant investments focused on first generation biofuels and the cultivation of conventional food crops as feedstocks. This article presents two strategies that can contribute to the development of 2nd generation biofuels based on lignocellulosic feedstocks. The integration of gasification-based biofuel plants into district heating systems is one option for increasing the energy efficiency and improving the economic competitiveness of such biofuels. Another option, biomass co-firing with coal, generates high-efficiency biomass electricity and reduces CO2 emissions by replacing coal. It also offers a near-term market for lignocellulosic biomass, which can stimulate the development of supply systems for biomass also suitable as feedstock for 2nd generation biofuels. Regardless of the long-term priorities of biomass use for energy, the stimulation of lignocellulosic biomass production by the development of near-term and cost-effective markets is judged to be a no-regrets strategy for Europe. Strategies that induce a relevant development and exploit existing energy infrastructures in order to reduce risk and reach lower costs are proposed as an attractive complement to the present and prospective biofuel policies.

  6. Immobilized High Level Waste (HLW) Interim Storage Alternative Generation and analysis and Decision Report 2nd Generation Implementing Architecture

    CALMUS, R.B.

    2000-09-14

    Two alternative approaches were previously identified to provide second-generation interim storage of Immobilized High-Level Waste (IHLW). One approach was retrofit modification of the Fuel and Materials Examination Facility (FMEF) to accommodate IHLW. The results of the evaluation of the FMEF as the second-generation IHLW interim storage facility and subsequent decision process are provided in this document.

  7. Generative Software Engineering

    Jézéquel, Jean-Marc

    2007-01-01

    Researching ever more abstract and powerful ways of composing programs has been the meat of software engineering for half a century. Important early steps were subroutines (to encapsulate actions) and records (to encapsulate data). A large step forward came with the introduction of object-oriented concepts (classes, subclasses and virtual methods), where classes can encapsulate both data and behaviors in a very powerful, but still flexible, way. For a long time, these concepts dominated the scene...

  8. Advances with the new AIMS fab 193 2nd generation: a system for the 65 nm node including immersion

    Zibold, Axel M.; Poortinga, E.; Doornmalen, H. v.; Schmid, R.; Scherubl, T.; Harnisch, W.

    2005-06-01

    The Aerial Image Measurement System, AIMS, for 193nm lithography emulation is established as a standard for the rapid prediction of wafer printability for critical structures including dense patterns and defects or repairs on masks. The main benefit of AIMS is to save expensive image qualification consisting of test wafer exposures followed by wafer CD-SEM resist or wafer analysis. By adjustment of numerical aperture (NA), illumination type and partial coherence (σ) to match any given stepper/ scanner, AIMS predicts the printability of 193nm reticles such as binary with, or without OPC and phase shifting. A new AIMS fab 193 second generation system with a maximum NA of 0.93 is now available. Improvements in field uniformity, stability over time, measurement automation and higher throughput meet the challenging requirements of the 65nm node. A new function, "Global CD Map" can be applied to automatically measure and analyse the global CD uniformity of repeating structures across a reticle. With the options of extended depth-of-focus (EDOF) software and the upcoming linear polarisation capability in the illumination the new AIMS fab 193 second generation system is able to cover both dry and immersion requirements for NA < 1. Rigorous simulations have been performed to study the effects of polarisation for imaging by comparing the aerial image of the AIMS to the resist image of the scanner.

  9. Advances with the new AIMS fab 193 2nd generation: a system for the 65 nm node including immersion

    Zibold, Axel M.; Poortinga, E.; Doornmalen, H. v.; Schmid, R.; Scherubl, T.; Harnisch, W.

    2005-06-01

    The Aerial Image Measurement System, AIMS, for 193nm lithography emulation is established as a standard for the rapid prediction of wafer printability for critical structures including dense patterns and defects or repairs on masks. The main benefit of AIMS is to save expensive image qualification consisting of test wafer exposures followed by wafer CD-SEM resist or wafer analysis. By adjustment of numerical aperture (NA), illumination type and partial coherence (σ) to match any given stepper/ scanner, AIMS predicts the printability of 193nm reticles such as binary with, or without OPC and phase shifting. A new AIMS fab 193 second generation system with a maximum NA of 0.93 is now available. Improvements in field uniformity, stability over time, measurement automation and higher throughput meet the challenging requirements of the 65nm node. A new function, "Global CD Map" can be applied to automatically measure and analyse the global CD uniformity of repeating structures across a reticle. With the options of extended depth-of-focus (EDOF) software and the upcoming linear polarisation capability in the illumination the new AIMS fab 193 second generation system is able to cover both dry and immersion requirements for NA < 1. Rigorous simulations have been performed to study the effects of polarisation for imaging by comparing the aerial image of the AIMS to the resist image of the scanner.

  10. Experimental Investigation of 2nd Generation Bioethanol Derived from Empty-fruit-bunch (EFB) of Oil-palm on Performance and Exhaust Emission of SI Engine

    Yanuandri Putrasari; Haznan Abimanyu; Achmad Praptijanto; Arifin Nur; Yan Irawan; Sabar Pangihutan Simanungkalit

    2014-01-01

    The experimental investigation of 2nd generation bioethanol derived from EFB of oil-palm blended with gasoline at 10, 20 and 25% by volume, and of pure gasoline, was conducted through performance and exhaust emission tests on an SI engine. A four stroke, four cylinder, programmed fuel injection (PGMFI), 16 valve variable valve timing and electronic lift control (VTEC), single overhead camshaft (SOHC), 1,497 cm3 SI engine (Honda/L15A) was used in this investigation. Engine performance test was carried...

  11. Anaerobic digestion in combination with 2nd generation ethanol production for maximizing biofuels yield from lignocellulosic biomass – testing in an integrated pilot-scale biorefinery plant

    Uellendahl, Hinrich; Ahring, Birgitte Kiær

    An integrated biorefinery concept for 2nd generation bioethanol production together with biogas production from the fermentation effluent was tested in pilot-scale. The pilot plant comprised pretreatment, enzymatic hydrolysis, hexose and pentose fermentation into ethanol, and anaerobic digestion of the fermentation effluent in a UASB (upflow anaerobic sludge blanket) reactor. Operation of the 770 liter UASB reactor was tested under both mesophilic (38ºC) and thermophilic (53ºC) conditions with increasing loading rates of the liquid fraction of the effluent from ethanol fermentation. At an OLR of ... was higher for mesophilic than for thermophilic operation. The effluent from the ethanol fermentation showed no signs of toxicity to the anaerobic microorganisms. Implementation of the biogas production from the fermentation effluent accounted for about 30% higher biofuels yield in the biorefinery...

  12. Methodology for measuring the impact of mobile technology change from 2nd to 3rd generation as perceived by users of SMEs in Barranquilla

    Jairo Polo

    2011-06-01

    Full Text Available This article presents the results of a research project undertaken to obtain a Masters in Business Administration from the Business School at the Universidad del Norte. Its purpose was to identify and test a methodology for measuring the impact of the change from 2nd to 3rd generation mobile technology, based on the perception of users belonging to Barranquilla SMEs. The work is motivated by the influence of technological change on behavior and knowledge creation among members of society, and by the importance that the adoption of applications for process automation and of web-based voice, data and video applications has taken on for the survival of organizations, enabling the development of competitive advantages based on information and creativity for new and better products or services.

  13. Efficient 2nd and 4th harmonic generation of a single-frequency, continuous-wave fiber amplifier.

    Sudmeyer, Thomas; Imai, Yutaka; Masuda, Hisashi; Eguchi, Naoya; Saito, Masaki; Kubota, Shigeo

    2008-02-01

    We demonstrate efficient cavity-enhanced second and fourth harmonic generation of an air-cooled, continuous-wave (cw), single-frequency 1064 nm fiber-amplifier system. The second harmonic generator achieves up to 88% total external conversion efficiency, generating more than 20 W of power at 532 nm wavelength in a diffraction-limited beam (M^2 crystal operated at 25 degrees C. The fourth harmonic generator is based on an AR-coated, Czochralski-grown beta-BaB2O4 (BBO) crystal optimized for low loss and high damage threshold. Up to 12.2 W of 266-nm deep-UV (DUV) output is obtained using a 6-mm long critically phase-matched BBO operated at 40 degrees C. This power level is more than two times higher than previously reported for cw 266-nm generation. The total external conversion efficiency from the fundamental at 1064 nm to the fourth harmonic at 266 nm is >50%. PMID:18542230
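    A quick consistency check on the quoted figures (back-of-the-envelope arithmetic, not from the paper):

```latex
% Fourth-harmonic stage: >50% total external efficiency at 12.2 W DUV
\eta_{1064\to 266} = \frac{P_{266}}{P_{1064}} > 0.5
\;\Rightarrow\; P_{1064} < \frac{12.2\ \mathrm{W}}{0.5} = 24.4\ \mathrm{W}
% Second-harmonic stage: 88% external efficiency and >20 W at 532 nm
% imply a fundamental power P_{1064} \gtrsim 20/0.88 \approx 22.7\ \mathrm{W},
% so the two quoted efficiencies are mutually consistent.
```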

  14. Lignocellulosic ethanol in Brazil : technical assessment of 1st and 2nd generation sugarcane ethanol in a Brazilian setting

    Stojanovic, M.; Bakker, R.R.C.

    2009-01-01

    Brazil is currently the largest ethanol-biofuel producer worldwide. Ethanol is produced by fermenting the sucrose part of the sugarcane that contains only one third of the sugarcane energy. The rest of the plant is burned to produce energy to run the process and to generate electricity that is sold to the public grid, making the process a net energy producer.

  15. Lignocellulosic ethanol in Brazil : technical assessment of 1st and 2nd generation sugarcane ethanol in a Brazilian setting

    Stojanovic, M.; Bakker, R.R.C.

    2009-01-01

    Brazil is currently the largest ethanol-biofuel producer worldwide. Ethanol is produced by fermenting the sucrose part of the sugarcane that contains only one third of the sugarcane energy. The rest of the plant is burned to produce energy to run the process and to generate electricity that is sold to the public grid, making the process a net energy producer. This paper evaluates current technology from an energy efficiency point of view and quantifies additional benefits from extra energy ge...

  16. FT-IR Investigation of Hoveyda-Grubbs'2nd Generation Catalyst in Self-Healing Epoxy Mixtures

    The development of smart composites capable of self-repair on aeronautical structures is still at the planning stage owing to complex issues to overcome. A very important issue to solve concerns the stability of the proposed composites' components, which is compromised at the cure temperatures necessary for good performance of the composite. In this work we analyzed the possibility of applying Hoveyda-Grubbs' second generation catalyst (HG2) to develop self-healing systems. Our experimental results have shown critical issues in the use of epoxy precursors in conjunction with the Hoveyda-Grubbs II metathesis catalyst. However, an appropriate curing cycle of the self-healing mixture makes it possible to overcome these critical issues, allowing high curing temperatures without deactivating the self-repair activity.

  17. Contribution of ion beam analysis methods to the development of 2nd generation high temperature superconducting (HTS) wires

    Usov, Igor O [Los Alamos National Laboratory; Arendt, Paul N [Los Alamos National Laboratory; Stan, Liliana [Los Alamos National Laboratory; Holesinger, Terry G [Los Alamos National Laboratory; Foltyn, Steven R [Los Alamos National Laboratory; Depaula, Raymond F [Los Alamos National Laboratory

    2009-01-01

    One of the crucial steps in the second generation high temperature superconducting wire program was development of the buffer layer architecture. The architecture designed at the Superconductivity Technology Center at Los Alamos National Laboratory consists of several oxide layers wherein each layer plays a specific role, namely: nucleation layer, diffusion barrier, biaxially textured template, and an intermediate layer with a good match to the lattice parameter of the superconducting Y1Ba2Cu3O7 (YBCO) compound. This report demonstrates how a wide range of ion beam analysis techniques (SIMS, RBS, channeling, PIXE, PIGE, NRA, ERD) was employed for analysis of each buffer layer and the YBCO films. These results assisted in understanding a variety of physical processes occurring during buffer layer fabrication and helped to optimize the buffer layer architecture as a whole.

  18. Control system for the 2nd generation Berkeley automounters (BAM2) at GM/CA-CAT macromolecular crystallography beamlines

    GM/CA-CAT at Sector 23 of the Advanced Photon Source (APS) is an NIH funded facility for crystallographic structure determination of biological macromolecules by X-ray diffraction. A second-generation Berkeley automounter is being integrated into the beamline control system at the 23BM experimental station. This new device replaces the previous all-pneumatic gripper motions with a combination of pneumatics and XYZ motorized linear stages. The latter adds a higher degree of flexibility to the robot including auto-alignment capability, accommodation of a larger capacity sample Dewar of arbitrary shape, and support for advanced operations such as crystal washing, while preserving the overall simplicity and efficiency of the Berkeley automounter design.

  19. New approaches for improving the production of the 1st and 2nd generation ethanol by yeast.

    Kurylenko, Olena; Semkiv, Marta; Ruchala, Justyna; Hryniv, Orest; Kshanovska, Barbara; Abbas, Charles; Dmytruk, Kostyantyn; Sibirny, Andriy

    2016-01-01

    Increase in the production of 1st generation ethanol from glucose is possible through a reduction in the production of ethanol co-products, especially biomass. We have developed a method to reduce biomass accumulation of Saccharomyces cerevisiae by manipulating the intracellular ATP level via overexpression of genes of alkaline phosphatase, apyrase or enzymes involved in futile cycles. The strains constructed accumulated up to 10% more ethanol on a cornmeal hydrolysate medium. A similar increase in ethanol accumulation was observed in mutants resistant to toxic inhibitors of glycolysis like 3-bromopyruvate and others. Substantial increases in fuel ethanol production will be obtained by the development of new strains of yeasts that ferment sugars of the abundant lignocellulosic feedstocks, especially xylose, a pentose sugar. We have found that xylose can be fermented at elevated temperatures by the thermotolerant yeast Hansenula polymorpha. We combined protein engineering of the gene coding for xylose reductase (XYL1), along with overexpression of the other two genes responsible for xylose metabolism in yeast (XYL2, XYL3) and deletion of the global transcriptional activator CAT8, with the selection of mutants defective in utilizing ethanol as a carbon source using the anticancer drug 3-bromopyruvate. The resulting strains accumulated 20-25 times more ethanol from xylose at the elevated temperature of 45°C, with up to 12.5 g L^-1 produced. Increases in ethanol yield and productivity from xylose were also achieved by overexpression of genes coding for the peroxisomal enzymes transketolase (DAS1) and transaldolase (TAL2), and deletion of the ATG13 gene. PMID:26619255

  20. Experimental Investigation of 2nd Generation Bioethanol Derived from Empty-fruit-bunch (EFB) of Oil-palm on Performance and Exhaust Emission of SI Engine

    Yanuandri Putrasari

    2014-07-01

    The experimental investigation of 2nd generation bioethanol derived from the EFB of oil-palm, blended with gasoline at 10, 20 and 25% by volume and compared with pure gasoline, was conducted through performance and exhaust emission tests on an SI engine. A four-stroke, four-cylinder, programmed fuel injection (PGMFI), 16-valve, variable valve timing and electronic lift control (VTEC), single overhead camshaft (SOHC), 1,497 cm3 SI engine (Honda/L15A) was used in this investigation. The engine performance test covered brake torque, power and fuel consumption. The exhaust emission was analyzed for carbon monoxide (CO) and hydrocarbons (HC). The engine was operated over a speed range from 1,500 to 4,500 rev/min at 85% throttle opening position. The results showed that the highest brake torque of the bioethanol blends was achieved with 10% bioethanol content at 3,000 to 4,500 rpm, the brake power of the 10% blend was greater than that of pure gasoline at 3,500 to 4,500 rpm, and the 10 and 20% bioethanol-gasoline blends resulted in greater bsfc than pure gasoline at low speeds from 1,500 to 3,500 rpm. CO and HC emissions tended to decrease as the engine speed increased.
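The performance quantities measured above are related by standard formulas: brake power P = 2·π·N·T/60 and brake-specific fuel consumption bsfc = (fuel mass flow)/P. A minimal sketch, with invented sample figures (not data from the paper):

```python
# Brake power and bsfc from torque and engine speed. The numeric inputs
# below are hypothetical illustration values, not measurements from the study.
import math

def brake_power_kw(torque_nm, rpm):
    """Brake power in kW: P = 2*pi*N*T/60 (N in rev/min, T in N*m)."""
    return 2 * math.pi * rpm * torque_nm / 60 / 1000

def bsfc_g_per_kwh(fuel_g_per_h, torque_nm, rpm):
    """Brake-specific fuel consumption: fuel mass flow divided by brake power."""
    return fuel_g_per_h / brake_power_kw(torque_nm, rpm)

p = brake_power_kw(120.0, 3000)   # ~37.7 kW for 120 N*m at 3,000 rpm
print(round(p, 1), round(bsfc_g_per_kwh(9500.0, 120.0, 3000), 1))
```

A higher bsfc for the ethanol blends, as reported above, means more fuel mass is consumed per unit of brake work, consistent with ethanol's lower energy density.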

  1. Next generation software process improvement

    Turnas, Daniel

    2003-01-01

    Approved for public release; distribution is unlimited. Software is often developed under a process that can at best be described as ad hoc. While it is possible to develop quality software under an ad hoc process, formal processes can be developed to help increase the overall quality of the software under development. The application of these processes allows an organization to mature. The software maturity level, and process improvement, of an organization can be measured with the Cap...

  2. Online Rule Generation Software Process Model

    Sudeep Marwaha

    2013-07-01

    For production systems such as expert systems, rule generation software can facilitate faster deployment. The software process model for rule generation using a decision tree classifier refers to the various steps required for the development of a web-based software model for decision rule generation. Royce's final waterfall model has been used in this paper to explain the software development process. The paper presents the specific output of the various steps of the modified waterfall model for decision rule generation.
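The core step of such a rule generation tool can be sketched as a walk over a trained decision tree that emits one IF-THEN rule per leaf. The tiny hand-built tree below is hypothetical; a real system would first induce it from data (e.g. with ID3/C4.5):

```python
# Convert a decision tree into decision rules: each root-to-leaf path
# becomes one IF-THEN rule. The tree structure here is invented for
# illustration, not taken from the paper's system.

def tree_to_rules(node, conditions=()):
    if "label" in node:                       # leaf: emit the accumulated conditions
        cond = " AND ".join(conditions) or "TRUE"
        return [f"IF {cond} THEN class = {node['label']}"]
    rules = []
    attr = node["attr"]
    for value, child in node["branches"].items():
        rules += tree_to_rules(child, conditions + (f"{attr} = {value}",))
    return rules

tree = {
    "attr": "outlook",
    "branches": {
        "sunny": {"attr": "humidity",
                  "branches": {"high": {"label": "no"}, "normal": {"label": "yes"}}},
        "overcast": {"label": "yes"},
        "rainy": {"label": "no"},
    },
}
for rule in tree_to_rules(tree):
    print(rule)
```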

  3. Quantification of left and right ventricular function and myocardial mass: Comparison of low-radiation dose 2nd generation dual-source CT and cardiac MRI

    Objective: To prospectively evaluate the accuracy of left and right ventricular function and myocardial mass measurements based on a dual-step, low radiation dose protocol with prospectively ECG-triggered 2nd generation dual-source CT (DSCT), using cardiac MRI (cMRI) as the reference standard. Materials and methods: Twenty patients underwent 1.5 T cMRI and prospectively ECG-triggered dual-step pulsing cardiac DSCT. This image acquisition mode performs low-radiation (20% tube current) imaging over the majority of the cardiac cycle and applies full radiation only during a single adjustable phase. Full-radiation-phase images were used to assess cardiac morphology, while low-radiation-phase images were used to measure left and right ventricular function and mass. Quantitative CT measurements based on contiguous multiphase short-axis reconstructions from the axial CT data were compared with short-axis SSFP cardiac cine MRI. Contours were manually traced around the ventricular borders for calculation of left and right ventricular end-diastolic volume, end-systolic volume, stroke volume, ejection fraction and myocardial mass for both modalities. Statistical methods included independent t-tests, the Mann–Whitney U test, Pearson correlation statistics, and Bland–Altman analysis. Results: All CT measurements of left and right ventricular function and mass correlated well with those from cMRI: for left/right end-diastolic volume r = 0.885/0.801, left/right end-systolic volume r = 0.947/0.879, left/right stroke volume r = 0.620/0.697, left/right ejection fraction r = 0.869/0.751, and left/right myocardial mass r = 0.959/0.702. Mean radiation dose was 6.2 ± 1.8 mSv. Conclusions: Prospectively ECG-triggered, dual-step pulsing cardiac DSCT accurately quantifies left and right ventricular function and myocardial mass in comparison with cMRI with substantially lower radiation exposure than reported for traditional retrospective ECG-gating.
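The agreement statistics used in this comparison can be illustrated compactly: Pearson r for correlation and Bland-Altman bias with 95% limits of agreement for paired CT vs. MRI readings. The numbers below are invented for illustration, not data from the study:

```python
# Pearson correlation and Bland-Altman analysis for paired measurements,
# e.g. ejection fraction (%) read from CT and from MRI in the same patients.
# All values are hypothetical.
import statistics

def pearson_r(xs, ys):
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def bland_altman(xs, ys):
    """Return mean difference (bias) and 95% limits of agreement."""
    diffs = [x - y for x, y in zip(xs, ys)]
    bias = statistics.fmean(diffs)
    sd = statistics.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

ct  = [55.0, 62.0, 48.0, 70.0, 59.0]   # hypothetical CT ejection fractions (%)
mri = [57.0, 60.0, 50.0, 71.0, 58.0]   # hypothetical MRI ejection fractions (%)
print(round(pearson_r(ct, mri), 3))
bias, (low, high) = bland_altman(ct, mri)
print(round(bias, 2), round(low, 2), round(high, 2))
```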

  4. Stroke Symbol Generation Software for Fighter Aircraft

    G.K. Tripathi

    2013-03-01

    This paper gives an overview of the stroke symbol generation software developed by Hindustan Aeronautics Limited for fighter aircraft. The paper covers the working principle of the head-up display, an overview of the target hardware on which the developed software has been integrated and tested, the software architecture, hardware-software interfaces, and design details of the stroke symbol generation software. The paper also covers the issues related to stroke symbol quality which were encountered by the design team, and how these issues were resolved during the integration and test phase. Defence Science Journal, 2013, 63(2), pp. 153-156. DOI: http://dx.doi.org/10.14429/dsj.63.4257

  5. TOWARDS TEST CASES GENERATION FROM SOFTWARE SPECIFICATIONS

    R. Jeevarathinam,

    2010-11-01

    Verification and validation of software systems often consumes up to 70% of the development resources. Testing is one of the most frequently used verification and validation techniques for verifying systems. Many agencies that certify software systems for use require that the software be tested to certain specified levels of coverage. Currently, developing test cases to meet these requirements takes a major portion of the resources. Automating this task would result in significant time and cost savings. This testing research is aimed at the generation of such test cases. In the proposed approach, a formal model of the required software behavior (a formal specification) is used for test-case generation and as an oracle to determine whether the implementation produced the correct output during testing. This is referred to as specification-based testing. Specification-based testing offers several advantages over traditional code-based testing. The formal specification can be used as the source artifact to generate functional tests for the final product, and since the test cases are produced at an earlier stage in the software development, they are available before the implementation is completed. Central to this approach is the use of model checkers as test-case generation engines. Model checking is a technique for exploring the reachable state space of a system model to verify properties of interest. There are several research challenges that must be addressed to realize this test generation approach.
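The dual role of the specification described above (test generator plus oracle) can be shown in miniature: exhaustively explore a formal model up to a bounded depth, and record the model's own behavior as the expected result for each input sequence. The toy traffic-light state machine below is invented for the example; real specification-based testing drives a model checker over a much richer specification:

```python
# Bounded-exhaustive test generation from a state-machine specification.
# Every input sequence up to the given depth becomes a test case, and the
# specification itself supplies the expected state trace (the oracle).
from itertools import product

SPEC = {  # (state, input) -> next state; a hypothetical specification
    ("red", "go"): "green",
    ("green", "caution"): "yellow",
    ("yellow", "stop"): "red",
}
INPUTS = ["go", "caution", "stop"]

def generate_tests(depth):
    tests = []
    for seq in product(INPUTS, repeat=depth):
        state, trace = "red", []
        for inp in seq:
            state = SPEC.get((state, inp), state)  # undefined inputs leave the state unchanged
            trace.append(state)
        tests.append((seq, trace))                 # (input sequence, expected trace)
    return tests

tests = generate_tests(2)
print(len(tests))   # 3^2 = 9 test cases
```

An implementation under test would then be run on each `seq` and its observed states compared against the recorded `trace`.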

  6. An Impact Motion Generation Support Software

    Tsujita, Teppei; Konno, Atsushi; Nomura, Yuki; Komizunai, Shunsuke; Ayaz, Yasar; Uchiyama, Masaru

    2010-01-01

    The details of the impact motion generation support software are presented in this paper. The developed software supports impact motion design with OpenHRP or OpenHRP3. A preliminary impact motion experiment was performed with a humanoid robot, and analyses of its results are presented. The analysis reveals that the designed motion is not robust against error in the position of the nail, since the timing of pulling up the hammer is defined in the designed motion in advance. Therefore, ...

  7. 2nd Tourism Postdisciplinarity Conference

    2016-01-01

    Following the noted success of the 1st international conference on postdisciplinary approaches to tourism studies (held in Neuchatel, Switzerland, 19-22 June, 2013), we are happy to welcome you to the 2nd Tourism Postdisciplinarity Conference. Postdisciplinarity surpasses the boundaries of disciplinary thinking and opens up the possibility to question the established phenomena – touristic or otherwise – we take for granted. It does not claim that disciplinarity is essentially wrong, but it...

  8. 2nd Tourism Postdisciplinarity Conference

    Following the noted success of the 1st international conference on postdisciplinary approaches to tourism studies (held in Neuchatel, Switzerland, 19-22 June, 2013), we are happy to welcome you to the 2nd Tourism Postdisciplinarity Conference. Postdisciplinarity surpasses the boundaries of...... study less embedded in that system of thought. Postdisciplinarity is an epistemological endeavour that speaks of knowledge production and the ways in which the world of physical and social phenomena can be known. It is also an ontological discourse as it concerns what we call ‘tourism...

  9. Monte Carlo generators in ATLAS software

    This document describes how Monte Carlo (MC) generators can be used in the ATLAS software framework (Athena). The framework is written in C++, using Python scripts for job configuration. Monte Carlo generators that provide the four-vectors describing the results of LHC collisions are in general written by third parties and are not part of Athena. These libraries are linked from the LCG Generator Services (GENSER) distribution. Generators are run from within Athena and the generated event output is put into a transient store, in HepMC format, using StoreGate. A common interface, implemented via inheritance from a GeneratorModule class, guarantees common functionality for the basic generation steps. The generator information can be accessed and manipulated by helper packages like TruthHelper. The ATLAS detector simulation can also access the truth information from StoreGate. Steering is done through specific interfaces to allow for flexible configuration using ATLAS Python scripts. Interfaces to most general-purpose generators, including Pythia6, Pythia8, Herwig, Herwig++ and Sherpa, are provided, as well as to more specialized packages, for example Phojet and Cascade. A second type of interface exists for the so-called Matrix Element generators, which only generate the particles produced in the hard scattering process and write events in the Les Houches event format. A generic interface to pass these events to Pythia6 and Herwig for parton showering and hadronisation has been written.
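The "common interface via inheritance" pattern described above can be sketched in Python (Athena itself is C++ configured through Python; the class and method names below are illustrative stand-ins, not the real Athena API):

```python
# Sketch of a common generator interface enforced through inheritance:
# every concrete generator wrapper must provide the same basic generation
# steps. Names like ToyPythiaInterface are hypothetical.
from abc import ABC, abstractmethod

class GeneratorModule(ABC):
    """Base class guaranteeing common functionality for the generation steps."""

    @abstractmethod
    def initialize(self):
        """Set up the underlying third-party generator library."""

    @abstractmethod
    def fill_event(self):
        """Produce one event (four-vectors) for the transient store."""

class ToyPythiaInterface(GeneratorModule):
    def initialize(self):
        self.counter = 0          # a real wrapper would configure the generator here

    def fill_event(self):
        self.counter += 1
        return {"event_number": self.counter, "particles": []}

gen = ToyPythiaInterface()
gen.initialize()
event = gen.fill_event()
print(event["event_number"])   # 1
```

The framework can then drive any generator through the base-class interface without knowing which third-party library sits behind it.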

  10. Performance Study of Intergenerational Succession and Strategic Transformation Driven by the 2nd Generation of Family Business

    汪祥耀; 金一禾

    2015-01-01

    This paper divides the intergenerational succession of family businesses into three phases: involvement management, co-management, and takeover management. It examines firm performance in each phase and the effect on performance of strategic transformation driven by the 2nd generation. An empirical study of the listed family firms in China's A-share main market over 2010-2012 finds that family firms in which the 2nd generation participates in daily operations and strategic decision-making as senior managers, or co-manages the firm with the 1st generation, show better performance; the relationship between takeover management and firm performance could not be confirmed due to the small number of samples. In addition, strategic transformation driven by the 2nd generation negatively moderates the relationship between co-management and firm performance, and implementing a strategic transformation during the co-management phase reduces the original real performance.

  11. "Me - A Different 2nd Generation of the Wealthy"

    Guo Yan

    2010-01-01

    In recent years, phrases like "G2 of the Wealthy" are frequently mentioned, and discussion of the second generation of wealthy Chinese is a hot topic. However, people still have the impression that "G2 of the Wealthy" is a generation which lacks nothing but significance and pursuit, a generation without responsibility and with only meaningless individuality.

  12. Enhanced animal productivity and health with improved manure management in 2nd Generation Environmentally Superior Technology in North Carolina: II. Air quality

    The aim of this study was to evaluate the effects of improved manure management on air quality and the beneficial effect of a cleaner environment on animal productivity and health using a second generation of Environmentally Superior Technology. The second generation system combines solid-liquid sep...

  13. 2nd Historic Mortars Conference

    Hughes, John; Groot, Caspar; Historic Mortars : Characterisation, Assessment and Repair

    2012-01-01

    This volume focuses on research and practical issues connected with mortars on historic structures. The book is divided into four sections: Characterisation of Historic Mortars, Repair Mortars and Design Issues, Experimental Research into Properties of Repair Mortars, and Assessment and Testing. The papers present the latest work of researchers in their field. The individual contributions were selected from the contributions to the 2nd Historic Mortars Conference, which took place in Prague, September 22-24, 2010. All papers were reviewed and improved as necessary before publication. This peer review process by the editors resulted in the 34 individual contributions included here. One extra paper reviewing and summarising state-of-the-art knowledge covered by this publication was added as a starting and navigational point for the reader. The editors believe that having these papers in print is important, and they hope that it will stimulate further research into historic mortars and related subjects.

  14. Experimental Stochastics (2nd edition)

    Otto Moeschlin and his co-authors have written a book about simulation of stochastic systems. The book comes with a CD-ROM that contains the experiments discussed in the book, and the text from the book is repeated on the CD-ROM. According to the authors, the aim of the book is to give a quick introduction to stochastic simulation for 'all persons interested in experimental stochastics'. To please this diverse audience, the authors offer a book that has four parts. Part 1, called 'Artificial Randomness', is the longest of the four parts. It gives an overview of the generation, testing and basic usage of pseudo random numbers in simulation. Although algorithms for generating sequences of random numbers are fundamental to simulation, it is a slightly unusual choice to give it such weight in comparison to other algorithmic topics. The remaining three parts consist of simulation case studies. Part 2, 'Stochastic Models', treats four problems - Buffon's needle, a queuing system, and two problems related to the kinetic theory of gases. Part 3 is called 'Stochastic Processes' and discusses the simulation of discrete time Markov chains, birth-death processes, Brownian motion and diffusions. The last section of Part 3 is about simulation as a tool to understand the traffic flow in a system controlled by stoplights, an area of research for the authors. Part 4 is called 'Evaluation of Statistical Procedures'. This section contains examples where simulation is used to test the performance of statistical methods. It covers four examples: the Neymann-Pearson lemma, the Wald sequential test, Bayesian point estimation and Hartigan procedures. The CD-ROM contains an easy-to-install software package that runs under Microsoft Windows. The software contains the text and simulations from the book. What I found most enjoyable about this book is the number of topics covered in the case studies. The highly individual selection of applications, which may serve as a source of inspiration
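Among the case studies mentioned in the review, Buffon's needle is the classic example of stochastic simulation: a needle of length l dropped on lines spaced d apart (l ≤ d) crosses a line with probability 2l/(πd), so the observed crossing frequency yields an estimate of π. A short sketch:

```python
# Buffon's needle simulation: estimate pi from the fraction of random
# needle drops that cross a line. Parameters l, d and the seed are
# arbitrary illustration choices.
import math
import random

def estimate_pi(trials, l=1.0, d=1.0, seed=42):
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        y = rng.uniform(0, d / 2)            # distance of needle centre to nearest line
        theta = rng.uniform(0, math.pi / 2)  # acute angle between needle and lines
        if y <= (l / 2) * math.sin(theta):   # crossing condition
            hits += 1
    # P(cross) = 2l/(pi*d)  =>  pi ~ 2*l*trials/(d*hits)
    return 2 * l * trials / (d * hits)

print(estimate_pi(100_000))   # close to 3.14 for large trial counts
```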

  15. Enhanced animal productivity and health with improved manure management in 2nd Generation Environmentally Superior Technology in North Carolina: I. Water quality

    New legislation in North Carolina promotes the replacement of old lagoon technology with new Environmentally Superior Technology. Scientists at ARS Florence Center and industry cooperators completed design and demonstration of a second generation treatment system for swine waste that can achieve hig...

  16. Next-Generation Lightweight Mirror Modeling Software

    Arnold, William R., Sr.; Fitzgerald, Mathew; Rosa, Rubin Jaca; Stahl, Phil

    2013-01-01

    The advances in manufacturing techniques for lightweight mirrors, such as EXELSIS deep core low temperature fusion, Corning's continued improvements in the Frit bonding process and the ability to cast large complex designs, combined with water-jet and conventional diamond machining of glasses and ceramics, have created the need for more efficient means of generating finite element models of these structures. Traditional methods of assembling 400,000+ element models can take weeks of effort, severely limiting the range of possible optimization variables. This paper will introduce model generation software developed under NASA sponsorship for the design of both terrestrial and space-based mirrors. The software deals with any current mirror manufacturing technique, single substrates, multiple arrays of substrates, as well as the ability to merge submodels into a single large model. The modeler generates both mirror and suspension system elements; suspensions can be created either for each individual petal or for the whole mirror. A typical model generation of 250,000 nodes and 450,000 elements takes only 5-10 minutes, much of that time being variable input time. The program can create input decks for ANSYS, ABAQUS and NASTRAN. An archive/retrieval system permits creation of complete trade studies, varying cell size, depth, petal size and suspension geometry, with the ability to recall a particular set of parameters and make small or large changes with ease. The input decks created by the modeler are text files which can be modified by any editor; all the key shell thickness parameters are accessible, and comments in the deck identify which groups of elements are associated with these parameters. This again makes optimization easier. With ANSYS decks, the nodes representing support attachments are grouped into components; in ABAQUS these are SETS and in NASTRAN GRIDPOINT SETS; this makes integration of these models into large telescope or satellite models possible.

  17. Next Generation Lightweight Mirror Modeling Software

    Arnold, William R., Sr.; Fitzgerald, Mathew; Rosa, Rubin Jaca; Stahl, H. Philip

    2013-01-01

    The advances in manufacturing techniques for lightweight mirrors, such as EXELSIS deep core low temperature fusion, Corning's continued improvements in the Frit bonding process and the ability to cast large complex designs, combined with water-jet and conventional diamond machining of glasses and ceramics, have created the need for more efficient means of generating finite element models of these structures. Traditional methods of assembling 400,000+ element models can take weeks of effort, severely limiting the range of possible optimization variables. This paper will introduce model generation software developed under NASA sponsorship for the design of both terrestrial and space-based mirrors. The software deals with any current mirror manufacturing technique, single substrates, multiple arrays of substrates, as well as the ability to merge submodels into a single large model. The modeler generates both mirror and suspension system elements; suspensions can be created either for each individual petal or for the whole mirror. A typical model generation of 250,000 nodes and 450,000 elements takes only 5-10 minutes, much of that time being variable input time. The program can create input decks for ANSYS, ABAQUS and NASTRAN. An archive/retrieval system permits creation of complete trade studies, varying cell size, depth, petal size and suspension geometry, with the ability to recall a particular set of parameters and make small or large changes with ease. The input decks created by the modeler are text files which can be modified by any editor; all the key shell thickness parameters are accessible, and comments in the deck identify which groups of elements are associated with these parameters. This again makes optimization easier. With ANSYS decks, the nodes representing support attachments are grouped into components; in ABAQUS these are SETS and in NASTRAN GRIDPOINT SETS; this makes integration of these models into large telescope or satellite models easier.

  18. Next generation lightweight mirror modeling software

    Arnold, William R.; Fitzgerald, Matthew; Rosa, Rubin Jaca; Stahl, H. Philip

    2013-09-01

    The advances in manufacturing techniques for lightweight mirrors, such as EXELSIS deep core low temperature fusion, Corning's continued improvements in the Frit bonding process and the ability to cast large complex designs, combined with water-jet and conventional diamond machining of glasses and ceramics, have created the need for more efficient means of generating finite element models of these structures. Traditional methods of assembling 400,000+ element models can take weeks of effort, severely limiting the range of possible optimization variables. This paper will introduce model generation software developed under NASA sponsorship for the design of both terrestrial and space-based mirrors. The software deals with any current mirror manufacturing technique, single substrates, multiple arrays of substrates, as well as the ability to merge submodels into a single large model. The modeler generates both mirror and suspension system elements; suspensions can be created either for each individual petal or for the whole mirror. A typical model generation of 250,000 nodes and 450,000 elements takes only 3-5 minutes, much of that time being variable input time. The program can create input decks for ANSYS, ABAQUS and NASTRAN. An archive/retrieval system permits creation of complete trade studies, varying cell size, depth, petal size and suspension geometry, with the ability to recall a particular set of parameters and make small or large changes with ease. The input decks created by the modeler are text files which can be modified by any text editor; all the shell thickness parameters and suspension spring rates are accessible, and comments in the deck identify which groups of elements are associated with these parameters. This again makes optimization easier. With ANSYS decks, the nodes representing support attachments are grouped into components; in ABAQUS these are SETS and in NASTRAN GRIDPOINT SETS; this makes integration of these models into large telescope or satellite

  19. Using Test Generating Software for Assessment

    Singh Aurora, Tarlok

    2007-04-01

    Assessment is an important part of teaching and learning. Designing suitable tests and quizzes for assessment is a time-consuming task. With faculty's many commitments at work, it is sometimes not easy to find enough time to design a good test before the test day. Searching for and modifying older tests can take a considerable amount of time. There is a need to develop a customized test bank that one can use to generate a quiz or test quickly before class time or before a test. A number of commercial e-learning software packages have this capability, among them Test Generator, ExamView and Test Pro Developer. An application of ExamView software in developing a test bank for physics will be presented. A physics test bank, with applications in other disciplines, can be built gradually over time and used to create a test or quiz quickly. Multiple scrambled versions of a single test (and answer sheets) can be created to discourage cheating in a large class setting. The presentation will show how to build a test bank.
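The "multiple scrambled versions" feature described above amounts to per-version shuffling of question order and answer order while retaining an answer key for grading. A sketch with an invented three-question bank (this is not the ExamView implementation, just the underlying idea):

```python
# Generate scrambled versions of a quiz from a question bank, keeping an
# answer key per version. In BANK the first listed choice is always the
# correct one; question content is hypothetical.
import random

BANK = [
    ("What is the SI unit of force?", ["newton", "joule", "watt", "pascal"]),
    ("Speed of light in vacuum?", ["3e8 m/s", "3e6 m/s", "3e10 m/s", "340 m/s"]),
    ("Unit of electric charge?", ["coulomb", "ampere", "volt", "ohm"]),
]

def make_version(seed):
    rng = random.Random(seed)           # one seed per printed version
    questions = BANK[:]
    rng.shuffle(questions)              # scramble question order
    version, key = [], []
    for text, choices in questions:
        shuffled = choices[:]
        rng.shuffle(shuffled)           # scramble answer order
        version.append((text, shuffled))
        key.append("ABCD"[shuffled.index(choices[0])])  # letter of correct answer
    return version, key

v1, key1 = make_version(1)
v2, key2 = make_version(2)
print(key1, key2)   # different scrambles of the same underlying questions
```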

  20. Techno-economic evaluation of 2nd generation bioethanol production from sugar cane bagasse and leaves integrated with the sugar-based ethanol process

    Macrelli Stefano

    2012-04-01

    Background: Bioethanol produced from the lignocellulosic fractions of sugar cane (bagasse and leaves), i.e. second generation (2G) bioethanol, has a promising market potential as an automotive fuel; however, the process is still under investigation on pilot/demonstration scale. From a process perspective, improvements in plant design can lower the production cost, providing better profitability and competitiveness if the conversion of the whole sugar cane is considered. Simulations have been performed with AspenPlus to investigate how process integration can affect the minimum ethanol selling price of this 2G process (MESP-2G), as well as improve the plant energy efficiency. This is achieved by integrating the well-established sucrose-to-bioethanol process with the enzymatic process for lignocellulosic materials. Bagasse and leaves were steam pretreated using H3PO4 as catalyst and separately hydrolysed and fermented. Results: The addition of a steam dryer, doubling of the enzyme dosage in enzymatic hydrolysis, inclusion of leaves as raw material in the 2G process, heat integration and the use of more energy-efficient equipment led to a 37% reduction in MESP-2G compared to the base case. Modelling showed that the MESP for 2G ethanol was 0.97 US$/L, while in the future it could be reduced to 0.78 US$/L. In this case the overall production cost of 1G + 2G ethanol would be about 0.40 US$/L with an output of 102 L/ton dry sugar cane including 50% leaves. Sensitivity analysis of the future scenario showed that a 50% decrease in the cost of enzymes, electricity or leaves would lower the MESP-2G by about 20%, 10% and 4.5%, respectively. Conclusions: According to the simulations, the production of 2G bioethanol from sugar cane bagasse and leaves in Brazil is already competitive (without subsidies) with 1G starch-based bioethanol production in Europe.
Moreover 2G bioethanol could be produced at a lower cost if subsidies were used to compensate for the

  1. [Implications of TCGA Network Data on 2nd Generation Immunotherapy Concepts Based on PD-L1 and PD-1 Target Structures].

    Peters, I; Tezval, H; Kramer, M W; Wolters, M; Grünwald, V; Kuczyk, M A; Serth, J

    2015-11-01

    The era of cytokines, given to patients with metastatic renal cell carcinoma (mRCC) as part of an unspecific immunomodulatory treatment concept, seems to have ended with the introduction of targeted therapies. However, preliminary data from studies on treatment with checkpoint inhibitors (e.g. anti-PD-1 and anti-PD-L1) may point the way to second-generation immunotherapy. The rationale of such immunomodulatory treatment is to stop or interrupt the tumour's "escape" from the body's immune defence. Thompson et al. report that increased protein expression of PD-L1 (CD274/B7-H1) in tumour cells and tumour-infiltrating immune cells (TILs; lymphocytes and histiocytes) is associated with unfavourable clinicopathological parameters as well as poor survival. In small pilot groups of mRCC patients it was found that increased PD-L1 protein expression in tumours and TILs may be correlated with the objective response to anti-PD-1 treatment. Sometimes, however, a very wide variety of response rates was observed, which raises the question whether this can be explained by individual expression levels of PD-L1 (CD274) or PD-1 (PDCD1). Recently published data from the Cancer Genome Atlas (TCGA) Kidney Renal Clear Cell Carcinoma (KIRC) Network now provide a genome-wide database that allows us to review and validate the molecular results obtained in clear cell renal cell carcinomas (ccRCC) to date. In this study, we analysed the TCGA KIRC mRNA expression data for PD-L1 and PD-1 for a possible association with clinicopathological parameters and the survival of 417 ccRCC patients. The mRNA expression of PD-L1 in primary nephrectomy specimens revealed no significant association with unfavourable clinical parameters. Interestingly, though, a positive correlation with patient survival was found (HR = 0.59, p = 0.006). These results, which partly contradict the concept applied to date, point out the necessity to ascertain the characteristics of PD-L1 and PD-1 expression at mRNA and protein

  2. A Practical GLR Parser Generator for Software Reverse Engineering

    Teng Geng; Fu Xu; Han Mei; Wei Meng; Zhibo Chen; Changqing Lai

    2014-01-01

    Traditional parser generators use deterministic parsing methods. These methods cannot effectively meet the parsing requirements of software reverse engineering. A new parser generator is presented which can generate a GLR parser with automatic error recovery. The generated GLR parser has parsing speed comparable to a traditional LALR(1) parser and can be used in the parsing of software reverse engineering.

  3. A Practical GLR Parser Generator for Software Reverse Engineering

    Teng Geng

    2014-03-01

    Traditional parser generators use deterministic parsing methods. These methods cannot effectively meet the parsing requirements of software reverse engineering. A new parser generator is presented which can generate a GLR parser with automatic error recovery. The generated GLR parser has parsing speed comparable to a traditional LALR(1) parser and can be used in the parsing of software reverse engineering.
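What distinguishes GLR from deterministic LALR(1) parsing is that, at a conflict, the parser forks and pursues all alternatives in parallel, merging identical configurations. The toy recognizer below illustrates only that fork/merge idea with a breadth-first search over parse stacks; a real GLR parser drives LR tables over a graph-structured stack, and the grammar and input here are invented for the example (the grammar must be free of left recursion for this sketch to terminate):

```python
# Breadth-first recognizer that keeps a set of (pending symbols, position)
# configurations: nonterminals fork one configuration per production,
# terminals shift, mismatching configurations die, duplicates merge.

def recognize(grammar, start, tokens):
    frontier = {((start,), 0)}          # set of (symbol stack, input position)
    seen = set()
    accepted = False
    while frontier:
        stack, pos = frontier.pop()
        if (stack, pos) in seen:        # merge: identical configurations explored once
            continue
        seen.add((stack, pos))
        if not stack:
            accepted |= (pos == len(tokens))   # accept only if all input consumed
            continue
        top, rest = stack[0], stack[1:]
        if top in grammar:              # nonterminal: fork per production
            for prod in grammar[top]:
                frontier.add((tuple(prod) + rest, pos))
        elif pos < len(tokens) and tokens[pos] == top:
            frontier.add((rest, pos + 1))      # terminal matches: shift
        # else: this branch dies, like an abandoned GLR stack
    return accepted

# Toy grammar where "abc" derives two ways: S -> A c and S -> a B
GRAMMAR = {
    "S": [["A", "c"], ["a", "B"]],
    "A": [["a", "b"]],
    "B": [["b", "c"]],
}
print(recognize(GRAMMAR, "S", list("abc")))   # True
print(recognize(GRAMMAR, "S", list("ab")))    # False
```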

  4. A 2nd generation cosmic axion experiment

    Hagmann, C; Stoeffl, W.; Van Bibber, K.; Daw, E.; Kinion, D.; Rosenberg, L; Sikivie, P.; Sullivan, N.; D. Tanner; Moltz, D.; Nezrick, F.; Turner, M; Golubev, N.; Kravchuk, L.

    1995-01-01

    An experiment is described to detect dark matter axions trapped in the halo of our galaxy. Galactic axions are converted into microwave photons via the Primakoff effect in a static background field provided by a superconducting magnet. The photons are collected in a high Q microwave cavity and detected by a low noise receiver. The axion mass range accessible by this experiment is 1.3-13 micro-eV. The expected sensitivity will be roughly 50 times greater than achieved by previous experiments i...

  5. A 2nd generation cosmic axion experiment

    Hagmann, C A; Van Bibber, K; Daw, E J; Kinion, D S; Rosenberg, L J; Sikivie, P; Sullivan, N; Tanner, D B; Moltz, D M; Nezrick, F A; Turner, M; Golubev, N A; Kravchuk, L V

    1995-01-01

    An experiment is described to detect dark matter axions trapped in the halo of our galaxy. Galactic axions are converted into microwave photons via the Primakoff effect in a static background field provided by a superconducting magnet. The photons are collected in a high Q microwave cavity and detected by a low noise receiver. The axion mass range accessible by this experiment is 1.3-13 micro-eV. The expected sensitivity will be roughly 50 times greater than achieved by previous experiments in this mass range. The assembly of the detector is well under way at LLNL and data taking will start in mid-1995.

  6. Automatic Testcase Generation for Flight Software

    Bushnell, David Henry; Pasareanu, Corina; Mackey, Ryan M.

    2008-01-01

    The TacSat3 project is applying Integrated Systems Health Management (ISHM) technologies to an Air Force spacecraft for operational evaluation in space. The experiment will demonstrate the effectiveness and cost of ISHM and vehicle systems management (VSM) technologies through onboard operation for extended periods. We present two approaches to automatic testcase generation for ISHM: 1) a blackbox approach that views the system as a black box and uses a grammar-based specification of the system's inputs to automatically generate *all* inputs that satisfy the specifications (up to prespecified limits); these inputs are then used to exercise the system; 2) a whitebox approach that performs analysis and testcase generation directly on a representation of the internal behaviour of the system under test. The enabling technologies for both approaches are model checking and symbolic execution, as implemented in the Ames Java PathFinder (JPF) tool suite. Model checking is an automated technique for software verification. Unlike simulation and testing, which check only some of the system executions and may therefore miss errors, model checking exhaustively explores all possible executions. Symbolic execution evaluates programs with symbolic rather than concrete values and represents variable values as symbolic expressions. We are applying the blackbox approach to generating input scripts for the Spacecraft Command Language (SCL) from Interface and Control Systems. SCL is an embedded interpreter for controlling spacecraft systems. TacSat3 will be using SCL as the controller for its ISHM systems. We translated the SCL grammar into a program that outputs scripts conforming to the grammar. Running JPF on this program generates all legal input scripts up to a prespecified size. Script generation can also be targeted to specific parts of the grammar of interest to the developers. These scripts are then fed to the SCL Executive.
ICS's in-house coverage tools will be run to
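
    In miniature, the blackbox approach reads as follows: from a grammar of legal inputs, generate every script up to a size bound. The sketch below is an invented toy (plain Python enumeration rather than JPF, and a made-up command grammar rather than the real SCL one):

```python
# Toy sketch of grammar-based exhaustive input generation (not the real
# TacSat3/JPF tooling, and not the real SCL grammar): enumerate every
# script the grammar can produce, up to a recursion-depth bound.
GRAMMAR = {
    "script": [["cmd"], ["cmd", "script"]],   # one or more commands
    "cmd":    [["verb", " ", "target", ";"]],
    "verb":   [["SET"], ["GET"]],
    "target": [["power"], ["mode"]],
}

def expand(symbol, depth):
    """Yield all terminal strings derivable from `symbol` within `depth` steps."""
    if symbol not in GRAMMAR:          # terminal symbol: yield it as-is
        yield symbol
        return
    if depth == 0:                     # depth bound reached for a nonterminal
        return
    for production in GRAMMAR[symbol]:
        expansions = [[]]
        for sym in production:         # cartesian product over the RHS symbols
            expansions = [e + [s] for e in expansions
                                  for s in expand(sym, depth - 1)]
        for e in expansions:
            yield "".join(e)

scripts = sorted(set(expand("script", depth=4)))
print(len(scripts))                    # all legal scripts up to the bound
```

    Raising `depth` enumerates longer scripts; in the work described above, JPF's exhaustive state exploration plays the role of this brute-force enumeration.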

  7. Generating Protocol Software from CPN Models Annotated with Pragmatics

    Simonsen, Kent Inge; Kristensen, Lars M.; Kindler, Ekkart

    Model-driven software engineering (MDSE) provides a foundation for automatically generating software based on models that focus on the problem domain while abstracting from the details of underlying implementation platforms. Coloured Petri Nets (CPNs) have been widely used to formally model and … verify protocol software, but limited work exists on using CPN models of protocols as a basis for automated code generation. The contribution of this paper is a method for generating protocol software from a class of CPN models annotated with code generation pragmatics. Our code generation method…

  8. Automatic program generation: future of software engineering

    Robinson, J.H.

    1979-01-01

    At this moment software development is still more of an art than an engineering discipline. Each piece of software is lovingly engineered, nurtured, and presented to the world as a tribute to the writer's skill. When will this change? When will the craftsmanship be removed and the programs be turned out like so many automobiles from an assembly line? Sooner or later it will happen: economic necessities will demand it. With the advent of cheap microcomputers and ever more powerful supercomputers doubling capacity, much more software must be produced. The choices are to double the number of programmers, double the efficiency of each programmer, or find a way to produce the needed software automatically. Producing software automatically is the only logical choice. How will automatic programming come about? Some of the preliminary actions which need to be done, and are being done, are to encourage programmer plagiarism of existing software through public library mechanisms, produce well-understood packages such as compilers automatically, develop languages capable of producing software as output, and learn enough about the whole process of programming to be able to automate it. Clearly, the emphasis must not be on efficiency or size, since ever larger and faster hardware is coming.

  9. Sistema especialista de 2ª geração para diagnose técnica: modelo e procedimento 2nd generation expert system for technical diagnosis: a model and a procedure

    Néocles Alves Pereira

    1994-04-01

    This paper deals with the diagnosis of industrial equipment through the use of Expert Systems. Intending to develop procedures that result in diagnosis knowledge bases for Industrial Maintenance, we have considered 2nd Generation Expert Systems. We have proposed a modified model and a diagnosis procedure. For the diagnosis strategy we used a top-down best-first search that combines two types of uncertainty treatment: (i) entropy, to find the best way through the knowledge structures during the search, and (ii) belief in the symptoms, to validate the resulting diagnoses. This proposal has the following advantages: a more complete knowledge base, and better explanation and presentation of the final diagnoses. We have developed a prototype based on real information about centrifugal pumps.
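
    The diagnosis strategy described above (entropy to rank branches, belief in symptoms to validate) can be illustrated with a small hypothetical sketch; the fault tree, probabilities, and threshold below are invented:

```python
import math

# Hypothetical sketch of the entropy-guided "top-down best-first" search:
# expand the child whose outcome distribution has the lowest entropy (the
# most discriminating evidence), and accept a diagnosis only if the combined
# belief in its symptoms clears a threshold. All numbers are invented.

def entropy(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# one node of an invented fault tree: child name -> outcome distribution
TREE = {"pump_fault": {"bearing": [0.9, 0.1], "seal": [0.5, 0.5]}}

def best_child(node):
    # lower entropy means the branch discriminates better between outcomes
    return min(TREE[node], key=lambda child: entropy(TREE[node][child]))

def confirmed(symptom_beliefs, threshold=0.7):
    # validate a candidate diagnosis by the combined belief in its symptoms
    combined = 1.0
    for belief in symptom_beliefs:
        combined *= belief
    return combined >= threshold

print(best_child("pump_fault"))   # "bearing": entropy 0.47 vs 1.0 bits
print(confirmed([0.95, 0.9]))     # True: combined belief 0.855 >= 0.7
```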

  10. A Practical Evaluation of Next Generation Sequencing & Molecular Cloning Software

    Meintjes, Peter; Qaadri, Kashef; Olsen, Christian

    2013-01-01

    Laboratories using Next Generation Sequencing (NGS) technologies and/or high-throughput molecular cloning experiments can spend a significant amount of their research budget on data analysis and data management. The decision to develop in-house software, to rely on combinations of free software packages, or to purchase commercial software can significantly affect productivity and ROI. In this talk, we will describe a practical software evaluation process that was developed to assist core fac...

  11. Automatic generation of hardware/software interfaces

    King, Myron Decker; Dave, Nirav H.; Mithal, Arvind

    2012-01-01

    Enabling new applications for mobile devices often requires the use of specialized hardware to reduce power consumption. Because of time-to-market pressure, current design methodologies for embedded applications require an early partitioning of the design, allowing the hardware and software to be developed simultaneously, each adhering to a rigid interface contract. This approach is problematic for two reasons: (1) a detailed hardware-software interface is difficult to specify until one is de...

  12. Improved Ant Algorithms for Software Testing Cases Generation

    Shunkun Yang; Tianlong Man; Jiaqi Xu

    2014-01-01

    Existing ant colony optimization (ACO) for software testing cases generation is a very popular domain in software testing engineering. However, the traditional ACO has flaws: early-search pheromone is relatively scarce, search efficiency is low, the search model is too simple, and the positive feedback mechanism easily produces stagnation and precocity. This paper introduces improved ACO for software testing cases generation: improved local pheromone update strategy for ant colony...

  13. Search-based software test data generation using evolutionary computation

    Maragathavalli, P.

    2011-01-01

    Search-based Software Engineering has been utilized for a number of software engineering activities. One area where Search-Based Software Engineering has seen much application is test data generation. Evolutionary testing designates the use of metaheuristic search methods for test case generation. The search space is the input domain of the test object, with each individual, or potential solution, being an encoded set of inputs to that test object. The fitness function is tailored to find...

  14. Next-generation business intelligence software with Silverlight 3

    Czernicki, Bart

    2010-01-01

    Business Intelligence (BI) software is the code and tools that allow you to view different components of a business using a single visual platform, making comprehending mountains of data easier. Applications that include reports, analytics, statistics, and historical and predictive modeling are all examples of BI applications. Currently, we are in the second generation of BI software, called BI 2.0. This generation is focused on writing BI software that is predictive, adaptive, simple, and interactive. As computers and software have evolved, more data can be presented to end users with increas

  15. Creating the next generation control system software

    A new 1980's style support package for future accelerator control systems is proposed. It provides a way to create accelerator applications software without traditional programming. Visual Interactive Applications (VIA) is designed to meet the needs of expanded accelerator complexes in a more cost effective way than past experience with procedural languages by using technology from the personal computer and artificial intelligence communities. 4 refs

  16. Using DSL for Automatic Generation of Software Connectors

    Bureš, Tomáš; Malohlava, M.; Hnětynka, P.

    Los Alamitos: IEEE Computer Society, 2008, s. 138-147. ISBN 978-0-7695-3091-8. [ICCBSS 2008. International Conference on Composition-Based Software Systems /7./. Madrid (ES), 25.02.2008-29.02.2008,] R&D Projects: GA AV ČR 1ET400300504 Institutional research plan: CEZ:AV0Z10300504 Keywords : component based systems * software connectors * code generation * domain-specific languages Subject RIV: JC - Computer Hardware ; Software

  17. Abstracts: 2nd interventional MRI symposium

    Main topics of the 2nd interventional MRI symposium were: MR compatibility and pulse sequences; MR thermometry, biopsy, musculoskeletal system; laser-induced interstitial thermotherapy, radiofrequency ablations; intraoperative MR; vascular applications, breast, endoscopy; focused ultrasound, cryotherapy, perspectives; poster session with 34 posters described. (AJ)

  18. Abstracts: 2nd interventional MRI symposium

    Anon.

    1997-09-01

    Main topics of the 2nd interventional MRI symposium were: MR compatibility and pulse sequences; MR thermometry, biopsy, musculoskeletal system; laser-induced interstitial thermotherapy, radiofrequency ablations; intraoperative MR; vascular applications, breast, endoscopy; focused ultrasound, cryotherapy, perspectives; poster session with 34 posters described. (AJ)

  19. Generating Protocol Software from CPN Models Annotated with Pragmatics

    Simonsen, Kent Inge; Kristensen, Lars M.; Kindler, Ekkart

    2013-01-01

    verify protocol software, but limited work exists on using CPN models of protocols as a basis for automated code generation. The contribution of this paper is a method for generating protocol software from a class of CPN models annotated with code generation pragmatics. Our code generation method… consists of three main steps: automatically adding so-called derived pragmatics to the CPN model, computing an abstract template tree, which associates pragmatics with code templates, and applying the templates to generate code which can then be compiled. We illustrate our method using a unidirectional…

  20. Automating Traceability for Generated Software Artifacts

    Richardson, Julian; Green, Jeffrey

    2004-01-01

    Program synthesis automatically derives programs from specifications of their behavior. One advantage of program synthesis, as opposed to manual coding, is that there is a direct link between the specification and the derived program. This link is, however, not very fine-grained: it can be best characterized as Program is-derived-from Specification. When the generated program needs to be understood or modified, more fine-grained linking is useful. In this paper, we present a novel technique for automatically deriving traceability relations between parts of a specification and parts of the synthesized program. The technique is very lightweight and works, with varying degrees of success, for any process in which one artifact is automatically derived from another. We illustrate the generality of the technique by applying it to two kinds of automatic generation: synthesis of Kalman Filter programs from specifications using the AutoFilter program synthesis system, and generation of assembly language programs from C source code using the GCC C compiler. We evaluate the effectiveness of the technique in the latter application.

  1. Generating DEM from LIDAR data - comparison of available software tools

    Korzeniowska, K.; Lacka, M.

    2011-12-01

    In recent years many software tools and applications have appeared that offer procedures, scripts and algorithms to process and visualize ALS data. This variety of software tools and of "point cloud" processing methods contributed to the aim of this study: to assess algorithms available in various software tools that are used to classify LIDAR "point cloud" data, through a careful examination of Digital Elevation Models (DEMs) generated from LIDAR data on the basis of these algorithms. The work focused on the most important available software tools, both commercial and open source. Two sites in a mountain area were selected for the study. The area of each site is 0.645 sq km. DEMs generated with the analysed software tools were compared with a reference dataset, generated using manual methods to eliminate non-ground points. Surfaces were analysed using raster analysis. Minimum, maximum and mean differences between the reference DEM and the DEMs generated with the analysed software tools were calculated, together with the Root Mean Square Error. Differences between DEMs were also examined visually using transects along the grid axes in the test sites.
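
    The comparison statistics named above (minimum, maximum and mean differences plus Root Mean Square Error) reduce to a few lines; the two tiny grids below are invented stand-ins for real DEM rasters:

```python
import math

# Sketch of the DEM comparison statistics over two tiny invented grids
# standing in for a reference DEM and a software-generated DEM (metres).
reference = [[10.0, 10.5], [11.0, 11.5]]
generated = [[10.2, 10.4], [11.3, 11.5]]

# per-cell elevation differences: generated minus reference
diffs = [g - r for row_r, row_g in zip(reference, generated)
               for r, g in zip(row_r, row_g)]

stats = {
    "min":  min(diffs),
    "max":  max(diffs),
    "mean": sum(diffs) / len(diffs),
    "rmse": math.sqrt(sum(d * d for d in diffs) / len(diffs)),
}
print(stats)
```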

  2. Microcomputers: Instrument Generation Software. Evaluation Guides. Guide Number 11.

    Gray, Peter J.

    Designed to assist evaluators in selecting the appropriate software for the generation of various data collection instruments, this guide discusses such key program characteristics as text entry, item storage and manipulation, item retrieval, and printing. Some characteristics of a good instrument generation program are discussed; these include…

  3. Radioisotope thermoelectric generator transportation system subsystem 143 software development plan

    This plan describes the activities to be performed and the controls to be applied to the process of specifying, developing, and qualifying the data acquisition software for the Radioisotope Thermoelectric Generator (RTG) Transportation System Subsystem 143 Instrumentation and Data Acquisition System (IDAS). This plan will serve as a software quality assurance plan, a verification and validation (V and V) plan, and a configuration management plan

  4. Software Surface Modeling and Grid Generation Steering Committee

    Smith, Robert E. (Editor)

    1992-01-01

    It is a NASA objective to promote improvements in the capability and efficiency of computational fluid dynamics. Grid generation, the creation of a discrete representation of the solution domain, is an essential part of computational fluid dynamics. However, grid generation about complex boundaries requires sophisticated surface-model descriptions of the boundaries. The surface modeling and the associated computation of surface grids consume an extremely large percentage of the total time required for volume grid generation. Efficient and user friendly software systems for surface modeling and grid generation are critical for computational fluid dynamics to reach its potential. The papers presented here represent the state-of-the-art in software systems for surface modeling and grid generation. Several papers describe improved techniques for grid generation.

  5. Development of the software generation method using model driven software engineering tool

    Jang, H. S.; Jeong, J. C.; Kim, J. H.; Han, H. W.; Kim, D. Y.; Jang, Y. W. [KOPEC, Taejon (Korea, Republic of); Moon, W. S. [NEXTech Inc., Seoul (Korea, Republic of)

    2003-10-01

    Methodologies to generate automated software design specifications and source code for nuclear I and C systems software using a model-driven language are developed in this work. For qualitative analysis of the algorithm, the activity diagram is modeled and generated using the Unified Modeling Language (UML), and the sequence diagram is then designed for automated source code generation. For validation of the generated code, code audits and module tests are performed using a test and QA tool. The code coverage and complexity of the example code are examined in this stage. The low-pressure pressurizer reactor trip module of the Plant Protection System was programmed as the subject for this task. The test results show that errors were easily detected in the generated source code using the test tool. The accuracy of input/output processing by the execution modules was clearly identified.

  6. Computer Software Configuration Item-Specific Flight Software Image Transfer Script Generator

    Bolen, Kenny; Greenlaw, Ronald

    2010-01-01

    A K-shell UNIX script enables the International Space Station (ISS) Flight Control Team (FCT) operators in NASA's Mission Control Center (MCC) in Houston to transfer an entire or partial computer software configuration item (CSCI) from a flight software compact disk (CD) to the onboard Portable Computer System (PCS). The tool is designed to read the content stored on a flight software CD and generate individual CSCI transfer scripts that are capable of transferring the flight software content in a given subdirectory on the CD to the scratch directory on the PCS. The flight control team can then transfer the flight software from the PCS scratch directory to the Electronically Erasable Programmable Read Only Memory (EEPROM) of an ISS Multiplexer/ Demultiplexer (MDM) via the Indirect File Transfer capability. The individual CSCI scripts and the CSCI Specific Flight Software Image Transfer Script Generator (CFITSG), when executed a second time, will remove all components from their original execution. The tool will identify errors in the transfer process and create logs of the transferred software for the purposes of configuration management.

  7. Search-Based Software Test Data Generation Using Evolutionary Computation

    P. Maragathavalli

    2011-02-01

    Search-based Software Engineering has been utilized for a number of software engineering activities. One area where Search-Based Software Engineering has seen much application is test data generation. Evolutionary testing designates the use of metaheuristic search methods for test case generation. The search space is the input domain of the test object, with each individual, or potential solution, being an encoded set of inputs to that test object. The fitness function is tailored to find test data for the type of test that is being undertaken. Evolutionary Testing (ET) uses optimizing search techniques such as evolutionary algorithms to generate test data. The effectiveness of a GA-based testing system is compared with a random testing system. For simple programs both testing systems work fine, but as the complexity of the program or the complexity of the input domain grows, the GA-based testing system significantly outperforms random testing.
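
    A minimal sketch of the GA-versus-random idea, assuming a hypothetical function under test with one hard branch (x == 4242) and the standard branch-distance fitness |x - target|; the program, constants, and GA parameters are all invented for illustration:

```python
import random

# Minimal sketch of evolutionary test-data generation (not the paper's
# system). The hypothetical test object has one hard branch, x == 4242;
# fitness is the usual branch-distance heuristic, which the GA minimises.
random.seed(1)
TARGET = 4242

def fitness(x):
    return abs(x - TARGET)            # 0 means the branch is covered

def evolve(pop_size=20, generations=300):
    pop = [random.randint(0, 10_000) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        if fitness(pop[0]) == 0:      # branch covered: stop early
            break
        parents = pop[:pop_size // 2]
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = random.sample(parents, 2)
            child = (a + b) // 2                   # arithmetic crossover
            if random.random() < 0.3:
                child += random.randint(-50, 50)   # mutation
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```

    A pure random tester with the same evaluation budget would hit x == 4242 with probability under 50%, which illustrates the gap the abstract reports for complex input domains.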

  8. A rule-based software test data generator

    Deason, William H.; Brown, David B.; Chang, Kai-Hsiung; Cross, James H., II

    1991-01-01

    Rule-based software test data generation is proposed as an alternative to either path/predicate analysis or random data generation. A prototype rule-based test data generator for Ada programs is constructed and compared to a random test data generator. Four Ada procedures are used in the comparison. Approximately 2000 rule-based test cases and 100,000 randomly generated test cases are automatically generated and executed. The success of the two methods is compared using standard coverage metrics. Simple statistical tests show that even the primitive rule-based test data generation prototype is significantly better than random data generation. This result demonstrates that rule-based test data generation is feasible and shows great promise in assisting test engineers, especially when the rule base is developed further.

  9. Evaluation of the efficiency and fault density of software generated by code generators

    Schreur, Barbara

    1993-01-01

    Flight computers and flight software are used for GN&C (guidance, navigation, and control), engine controllers, and avionics during missions. Software development requires the generation of a considerable amount of code. The engineers who generate the code make mistakes, and the generation of a large body of code with high reliability requires considerable time. Computer-aided software engineering (CASE) tools are available which generate code automatically from inputs given through graphical interfaces. These tools are referred to as code generators. In theory, code generators could write highly reliable code quickly and inexpensively. The various code generators offer different levels of reliability checking: some check only the finished product, while some allow checking of individual modules and combined sets of modules as well. Considering NASA's requirement for reliability, in-house manually generated code is needed for comparison. Furthermore, automatically generated code is reputed to be as efficient as the best manually generated code when executed. In-house verification is warranted.

  10. A code generation framework for the ALMA common software

    Troncoso, Nicolás; von Brand, Horst H.; Ibsen, Jorge; Mora, Matias; Gonzalez, Victor; Chiozzi, Gianluca; Jeram, Bogdan; Sommer, Heiko; Zamora, Gabriel; Tejeda, Alexis

    2010-07-01

    Code generation helps in smoothing the learning curve of a complex application framework and in reducing the number of Lines Of Code (LOC) that a developer needs to craft. The ALMA Common Software (ACS) has adopted code generation in specific areas, but we are now exploiting the more comprehensive approach of Model Driven code generation to transform directly an UML Model into a full implementation in the ACS framework. This approach makes it easier for newcomers to grasp the principles of the framework. Moreover, a lower handcrafted LOC reduces the error rate. Additional benefits achieved by model driven code generation are: software reuse, implicit application of design patterns and automatic test generation. A model driven approach to design also makes it possible to use the same model with different frameworks, by generating for different targets. The generation framework presented in this paper uses openArchitectureWare as the model-to-text translator. OpenArchitectureWare provides a powerful functional language that makes it easier to implement the correct mapping of data types, the main difficulty encountered in the translation process. The output is an ACS application readily usable by the developer, including the necessary deployment configuration, thus minimizing any configuration burden during testing. The specific application code is implemented by extending generated classes. Therefore, generated and manually crafted code are kept apart, simplifying the code generation process and aiding the developers by keeping a clean logical separation between the two. Our first results show that code generation dramatically improves code productivity.

  11. Improved ant algorithms for software testing cases generation.

    Yang, Shunkun; Man, Tianlong; Xu, Jiaqi

    2014-01-01

    Existing ant colony optimization (ACO) for software testing cases generation is a very popular domain in software testing engineering. However, the traditional ACO has flaws: early-search pheromone is relatively scarce, search efficiency is low, the search model is too simple, and the positive feedback mechanism easily produces stagnation and precocity. This paper introduces improved ACO for software testing cases generation: an improved local pheromone update strategy for ant colony optimization, an improved pheromone volatilization coefficient for ant colony optimization (IPVACO), and an improved global path pheromone update strategy for ant colony optimization (IGPACO). At last, we put forward a comprehensive improved ant colony optimization (ACIACO), which is based on all the above three methods. The proposed technique will be compared with random algorithm (RND) and genetic algorithm (GA) in terms of both efficiency and coverage. The results indicate that the improved method can effectively improve the search efficiency, restrain precocity, promote case coverage, and reduce the number of iterations. PMID:24883391
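
    The pheromone mechanics this line of work tunes can be sketched on a toy branch graph (invented here, generic ACO rather than the paper's IPVACO/IGPACO variants): ants walk the graph, paths that reach a rarely covered branch are reinforced, and a volatilization coefficient lets stale trails fade:

```python
import random

# Hedged sketch of ACO-style test-path selection on an invented branch
# graph. Ants walk from "start" to "end"; paths through the target branch
# "b" are rewarded, and the volatilization coefficient RHO controls how
# fast old trails evaporate (the positive-feedback knob discussed above).
random.seed(7)

GRAPH = {"start": ["a", "b"], "a": ["end"], "b": ["end"], "end": []}
pheromone = {("start", "a"): 1.0, ("start", "b"): 1.0,
             ("a", "end"): 1.0, ("b", "end"): 1.0}
RHO = 0.9          # fraction of pheromone kept after each iteration

def walk():
    node, path = "start", []
    while GRAPH[node]:
        nxt = random.choices(
            GRAPH[node],
            weights=[pheromone[(node, n)] for n in GRAPH[node]])[0]
        path.append((node, nxt))
        node = nxt
    return path

def update(path, reward):
    for edge in pheromone:
        pheromone[edge] *= RHO        # evaporation on every edge
    for edge in path:
        pheromone[edge] += reward     # reinforce the walked path

for _ in range(30):
    p = walk()
    # reward only the paths that cover the target branch "b"
    update(p, reward=2.0 if ("start", "b") in p else 0.0)

print(pheromone[("start", "b")] > pheromone[("start", "a")])
```

    After a few iterations the rewarded edge dominates the choice weights, which is the positive-feedback behaviour (and, when overdone, the stagnation risk) the abstract describes.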

  12. 2nd International Conference on Mobile and Wireless Technology

    Wattanapongsakorn, Naruemon

    2015-01-01

    This book provides a snapshot of the current state-of-the-art in the fields of mobile and wireless technology, security and applications. As the proceedings of the 2nd International Conference on Mobile and Wireless Technology (ICMWT2015), it represents the outcome of a unique platform for researchers and practitioners from academia and industry to share cutting-edge developments in the field of mobile and wireless science and technology, including those working on data management and mobile security. The contributions presented here describe the latest academic and industrial research from the international mobile and wireless community. The scope covers four major topical areas: mobile and wireless networks and applications; security in mobile and wireless technology; mobile data management and applications; and mobile software. The book will be a valuable reference for current researchers in academia and industry, and a useful resource for graduate-level students working on mobile and wireless technology...

  13. Computer aided power flow software engineering and code generation

    Bacher, R. [Swiss Federal Inst. of Tech., Zuerich (Switzerland)

    1995-12-31

    In this paper a software engineering concept is described which permits the automatic solution of a non-linear set of network equations. The power flow equation set can be seen as a defined subset of a network equation set. The automated solution process is the numerical Newton-Raphson solution process of the power flow equations where the key code parts are the numeric mismatch and the numeric Jacobian term computation. It is shown that both the Jacobian and the mismatch term source code can be automatically generated in a conventional language such as Fortran or C. Thereby one starts from a high level, symbolic language with automatic differentiation and code generation facilities. As a result of this software engineering process an efficient, very high quality Newton-Raphson solution code is generated which allows easier implementation of network equation model enhancements and easier code maintenance as compared to hand-coded Fortran or C code.
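
    The Newton-Raphson core this approach automates, mismatch (residual) evaluation plus Jacobian evaluation, looks like this for a hand-coded toy 2x2 system; in the paper, exactly these two routines are what the symbolic tool generates, and the system below is invented for illustration:

```python
# Hedged illustration of the Newton-Raphson solution process described
# above. The "mismatch" (residual) and Jacobian routines are hand-coded
# here for an invented 2x2 system (x^2 + y^2 = 4, x*y = 1).

def mismatch(x, y):
    return [x * x + y * y - 4.0, x * y - 1.0]

def jacobian(x, y):
    return [[2 * x, 2 * y],
            [y,     x]]

def newton(x, y, tol=1e-10, max_iter=50):
    for _ in range(max_iter):
        f = mismatch(x, y)
        if max(abs(v) for v in f) < tol:
            break
        (a, b), (c, d) = jacobian(x, y)
        det = a * d - b * c
        # solve J * delta = -f by Cramer's rule (fine for the 2x2 case)
        dx = (-f[0] * d + f[1] * b) / det
        dy = (-f[1] * a + f[0] * c) / det
        x, y = x + dx, y + dy
    return x, y

x, y = newton(2.0, 0.5)
print(x, y)        # a root of the system, near (1.932, 0.518)
```

    For a real power flow the residual is the active/reactive power mismatch at each bus and the Jacobian has thousands of entries, which is why generating both routines from a symbolic model, as the paper does, pays off.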

  14. Computer aided power flow software engineering and code generation

    Bacher, R. [Swiss Federal Inst. of Tech., Zuerich (Switzerland)

    1996-02-01

    In this paper a software engineering concept is described which permits the automatic solution of a non-linear set of network equations. The power flow equation set can be seen as a defined subset of a network equation set. The automated solution process is the numerical Newton-Raphson solution process of the power flow equations where the key code parts are the numeric mismatch and the numeric Jacobian term computation. It is shown that both the Jacobian and the mismatch term source code can be automatically generated in a conventional language such as Fortran or C. Thereby one starts from a high level, symbolic language with automatic differentiation and code generation facilities. As a result of this software engineering process an efficient, very high quality Newton-Raphson solution code is generated which allows easier implementation of network equation model enhancements and easier code maintenance as compared to hand-coded Fortran or C code.

  15. Open Source Next Generation Visualization Software for Interplanetary Missions

    Trimble, Jay; Rinker, George

    2016-01-01

    Mission control is evolving quickly, driven by the requirements of new missions, and enabled by modern computing capabilities. Distributed operations, access to data anywhere, data visualization for spacecraft analysis that spans multiple data sources, flexible reconfiguration to support multiple missions, and operator use cases, are driving the need for new capabilities. NASA's Advanced Multi-Mission Operations System (AMMOS), Ames Research Center (ARC) and the Jet Propulsion Laboratory (JPL) are collaborating to build a new generation of mission operations software for visualization, to enable mission control anywhere, on the desktop, tablet and phone. The software is built on an open source platform that is open for contributions (http://nasa.github.io/openmct).

  16. Software Test Case Automated Generation Algorithm with Extended EDPN Model

    Jinlong Tao

    2013-08-01

    To improve the sufficiency of software testing and the performance of testing algorithms, an improved event-driven Petri net model using a combination method is proposed, abbreviated as the OEDPN model. It is then applied to the OATS method to extend the implementation of OATS. On the basis of the OEDPN model, the marked associative recursive method of state combination on category is presented to solve the problems of combination conflict, as well as the test case explosion caused by redundant test cases and the hard extension of the OATS method. Meanwhile, generation methods for the interactive test cases of extended OATS are also presented.

  17. Overview of the next generation of Fermilab collider software

    Fermilab is entering an era of operating a more complex collider facility. In addition, new operator workstations are available that have increased capabilities. The task of providing updated software in this new environment precipitated a project called Colliding Beam Software (CBS). It was soon evident that a new approach was needed for developing console software. Hence CBS, although a common acronym, is too narrow a description. A new generation of the application program subroutine library has been created to enhance the existing programming environment with a set of value added tools. Several key Collider applications were written that exploit CBS tools. This paper will discuss the new tools and the underlying change in methodology in application program development for accelerator control at Fermilab. (author)

  18. 2nd International Conference on Natural Fibers

    Rana, Sohel

    2016-01-01

    This book collects selected high-quality articles submitted to the 2nd International Conference on Natural Fibers (ICNF2015). A wide range of topics is covered related to various aspects of natural fibres, such as agriculture, extraction and processing, surface modification and functionalization, advanced structures, nano fibres, composites and nanocomposites, design and product development, applications, market potential, and environmental impact. Divided into separate sections on these various topics, the book presents the latest high-quality research work addressing different approaches and techniques to improve the processing, performance, functionalities and cost-effectiveness of natural fibre and natural-based products, in order to promote their applications in various advanced technical sectors. This book is a useful source of information for materials scientists, teachers and students from various disciplines as well as for R&D staff in industries using natural fibre based materials.

  19. PUS Services Software Building Block Automatic Generation for Space Missions

    Candia, S.; Sgaramella, F.; Mele, G.

    2008-08-01

    The Packet Utilization Standard (PUS) has been specified by the European Committee for Space Standardization (ECSS) and issued as ECSS-E-70-41A to define the application-level interface between Ground Segments and Space Segments. The ECSS-E-70-41A complements the ECSS-E-50 and the Consultative Committee for Space Data Systems (CCSDS) recommendations for packet telemetry and telecommand. The ECSS-E-70-41A characterizes the identified PUS Services from a functional point of view, and the ECSS-E-70-31 standard specifies the rules for their mission-specific tailoring. The current on-board software design for a space mission implies the production of several PUS terminals, each providing a specific tailoring of the PUS services. The associated on-board software building blocks are developed independently, leading to very different design choices and implementations even when the mission tailoring requires very similar services (from the Ground operative perspective). In this scenario, the automatic production of the PUS services building blocks for a mission would be a way to optimize the overall mission economy and improve the robustness and reliability of the on-board software and of the Ground-Space interactions. This paper presents the Space Software Italia (SSI) activities for the development of an integrated environment to support: the PUS services tailoring activity for a specific mission; the mission-specific PUS services configuration; and the generation of the UML model of the software building block implementing the mission-specific PUS services and the related source code, support documentation (software requirements, software architecture, test plans/procedures, operational manuals), and the TM/TC database. The paper deals with: (a) the project objectives, (b) the tailoring, configuration, and generation process, (c) the description of the environments supporting the process phases, (d) the characterization of the meta-model used for the generation, (e) the

  20. AUTOMATED TEST CASES GENERATION FOR OBJECT ORIENTED SOFTWARE

    A.V.K.SHANTHI

    2011-09-01

Full Text Available Software testing is an important activity in the software development life cycle. Testing involves executing a program on a set of test cases and comparing the actual results with the expected results. To test a system, the implementation must first be understood, which can be done by creating a suitable model of the system. UML is widely accepted and used by industry for the modeling and design of software systems. A novel method is presented to automatically generate test cases based on UML class diagrams. Test case generation from design specifications has the added advantage of making test cases available early in the software development cycle, thereby making test planning more effective. A technique is presented in which data mining concepts are applied to generate test cases. The tool generates automated test cases that are less complex and easier to implement in any testing system. Information extracted from the UML class diagram is mapped into a tree structure; a genetic algorithm is applied as the data mining technique, with the genetic crossover operator used to discover all patterns, and a depth-first search is applied to the resulting binary trees to derive the knowledge, i.e., the test cases. Path coverage is an important criterion to consider in test case generation. This paper presents a fully automated scheme for generating valid test cases that satisfy the transition path coverage criterion.
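The pipeline described in this abstract (class-model information mapped to parameter domains, genetic crossover as the data mining step, and a coverage-driven search) can be sketched roughly as follows. This is a minimal illustration, not the authors' tool: the parameter domains, the pairwise-coverage fitness function, and all names are assumptions introduced for the example.

```python
import random

random.seed(7)

# A candidate test case is a tuple of parameter choices for a class under test
# (illustrative domains; a real tool would extract these from the UML model).
DOMAINS = [(0, 1, 2), ("a", "b"), (True, False), (10, 20, 30)]

def random_case():
    return tuple(random.choice(d) for d in DOMAINS)

def fitness(case, covered):
    """Coverage proxy: how many new pairwise value combinations this case adds."""
    pairs = {(i, j, case[i], case[j])
             for i in range(len(case)) for j in range(i + 1, len(case))}
    return len(pairs - covered)

def crossover(p1, p2):
    """Single-point genetic crossover on the parameter vector."""
    cut = random.randrange(1, len(p1))
    return p1[:cut] + p2[cut:]

def generate(n_cases=8, pop_size=20, generations=30):
    """Evolve a population and repeatedly harvest the fittest case into the suite."""
    covered, suite = set(), []
    population = [random_case() for _ in range(pop_size)]
    for _ in range(n_cases):
        for _ in range(generations):
            parents = sorted(population, key=lambda c: fitness(c, covered),
                             reverse=True)[:pop_size // 2]
            children = [crossover(random.choice(parents), random.choice(parents))
                        for _ in range(pop_size - len(parents))]
            population = parents + children
        best = max(population, key=lambda c: fitness(c, covered))
        suite.append(best)
        covered |= {(i, j, best[i], best[j])
                    for i in range(len(best)) for j in range(i + 1, len(best))}
    return suite
```

Selecting by a coverage-based fitness is one plausible reading of the abstract's goal of less complex, path-covering suites; the paper's actual fitness and tree encoding may differ.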

  1. Improved Ant Algorithms for Software Testing Cases Generation

    Shunkun Yang

    2014-01-01

Full Text Available Ant colony optimization (ACO) for software test case generation is a very popular domain in software testing engineering. However, the traditional ACO has flaws: early search pheromone is relatively scarce, search efficiency is low, the search model is too simple, and the positive feedback mechanism easily produces stagnation and precocity. This paper introduces improved ACO methods for software test case generation: an improved local pheromone update strategy for ant colony optimization, an improved pheromone volatilization coefficient for ant colony optimization (IPVACO), and an improved global path pheromone update strategy for ant colony optimization (IGPACO). Finally, we put forward a comprehensively improved ant colony optimization (ACIACO), which is based on all three methods. The proposed technique is compared with a random algorithm (RND) and a genetic algorithm (GA) in terms of both efficiency and coverage. The results indicate that the improved method can effectively improve search efficiency, restrain precocity, promote case coverage, and reduce the number of iterations.
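The core ACO loop for test-path generation, including the pheromone volatilization (evaporation) coefficient that the paper tunes, can be sketched as follows. This is a generic simplification, not the ACIACO algorithm itself; the control-flow graph, the coverage-based deposit rule, and all parameter values are illustrative assumptions.

```python
import random

random.seed(1)

# Control-flow graph of a program under test: node -> successor branches.
CFG = {"start": ["a", "b"], "a": ["c", "d"], "b": ["c", "d"],
       "c": ["end"], "d": ["end"]}

pheromone = {(u, v): 1.0 for u, succ in CFG.items() for v in succ}
RHO = 0.3      # volatilization (evaporation) coefficient
Q = 1.0        # pheromone deposit strength

def walk():
    """One ant walks from start to end, choosing branches by pheromone weight."""
    node, path = "start", []
    while node != "end":
        succ = CFG[node]
        weights = [pheromone[(node, v)] for v in succ]
        nxt = random.choices(succ, weights=weights)[0]
        path.append((node, nxt))
        node = nxt
    return path

def run(ants=30, iterations=20):
    """Iterate ants, rewarding paths that exercise not-yet-covered branches."""
    covered = set()
    for _ in range(iterations):
        for _ in range(ants):
            path = walk()
            new_edges = set(path) - covered
            for edge in path:
                pheromone[edge] += Q * len(new_edges)
            covered |= new_edges
        # Volatilize so early trails do not dominate (anti-stagnation).
        for edge in pheromone:
            pheromone[edge] *= (1.0 - RHO)
    return covered

branches = run()
```

Depositing in proportion to newly covered branches and evaporating each iteration is one standard way to fight the stagnation and precocity the abstract mentions; the paper's specific local and global update rules are more elaborate.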

  2. Optimized generation of high resolution breast anthropomorphic software phantoms

Purpose: The authors present an efficient method for generating anthropomorphic software breast phantoms with high spatial resolution. Employing the same region growing principles as in their previous algorithm for breast anatomy simulation, the present method has been optimized for computational complexity to allow for fast generation of the large number of phantoms required in virtual clinical trials of breast imaging. Methods: The new breast anatomy simulation method performs a direct calculation of the Cooper's ligaments (i.e., the borders between simulated adipose compartments). The calculation corresponds to quadratic decision boundaries of a maximum a posteriori classifier. The method is multiscale due to the use of octree-based recursive partitioning of the phantom volume. The method also provides user control of the thickness of the simulated Cooper's ligaments and skin. Results: Using the proposed method, the authors have generated phantoms with voxel sizes in the range of (25-1000 μm)³. The power regression of the simulation time as a function of the reciprocal voxel size yielded a log-log slope of 1.95 (compared to a slope of 4.53 for our previous region growing algorithm). Conclusions: A new algorithm for computer simulation of breast anatomy has been proposed that allows for fast generation of high resolution anthropomorphic software phantoms.
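The octree-based recursive partitioning idea can be illustrated in miniature: subdivide the volume recursively and fill a whole cell in one step when a classifier gives the same compartment label at all eight corners. A nearest-seed rule stands in here for the paper's quadratic MAP decision boundaries, and the corner-agreement test is a simplification of the real algorithm's safeguards; all names and values are illustrative.

```python
SIZE = 16                     # phantom edge length in voxels (power of two)
SEEDS = [(2, 3, 4), (12, 11, 5), (7, 14, 13), (9, 2, 10)]   # compartment seeds

def label(x, y, z):
    """Assign a voxel to the nearest simulated adipose compartment."""
    return min(range(len(SEEDS)),
               key=lambda i: sum((a - b) ** 2 for a, b in zip(SEEDS[i], (x, y, z))))

grid = {}
cells_visited = 0

def fill(x0, y0, z0, size):
    """Recursively subdivide; fill a cell in one step when its 8 corners agree.

    The corner test is a heuristic: a boundary can cross a cell whose corners
    agree, so a real implementation adds safeguards (e.g. a minimum split depth).
    """
    global cells_visited
    cells_visited += 1
    corners = [label(x0 + dx, y0 + dy, z0 + dz)
               for dx in (0, size - 1) for dy in (0, size - 1) for dz in (0, size - 1)]
    if size == 1 or len(set(corners)) == 1:
        for x in range(x0, x0 + size):
            for y in range(y0, y0 + size):
                for z in range(z0, z0 + size):
                    grid[(x, y, z)] = corners[0]
        return
    h = size // 2
    for dx in (0, h):
        for dy in (0, h):
            for dz in (0, h):
                fill(x0 + dx, y0 + dy, z0 + dz, h)

fill(0, 0, 0, SIZE)
```

Large homogeneous regions are filled without visiting every voxel, which is the multiscale saving behind the reported drop in the log-log slope of simulation time.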

  3. 2nd International Arctic Ungulate Conference

    A. Anonymous

    1996-01-01

Full Text Available The 2nd International Arctic Ungulate Conference was held 13-17 August 1995 on the University of Alaska Fairbanks campus. The Institute of Arctic Biology and the Alaska Cooperative Fish and Wildlife Research Unit were responsible for organizing the conference with assistance from biologists with state and federal agencies and commercial organizations. David R. Klein was chair of the conference organizing committee. Over 200 people attended the conference, coming from 10 different countries. The United States, Canada, and Norway had the largest representation. The conference included invited lectures, panel discussions, and about 125 contributed papers. There were five technical sessions on Physiology and Body Condition; Habitat Relationships; Population Dynamics and Management; Behavior, Genetics and Evolution; and Reindeer and Muskox Husbandry. Three panel sessions discussed comparative caribou management strategies; management of introduced, reestablished, and expanding muskox populations; and health risks in translocation of arctic ungulates. Invited lectures focused on the physiology and population dynamics of arctic ungulates; contaminants in food chains of arctic ungulates and lessons learned from the Chernobyl accident; and ecosystem level relationships of the Porcupine Caribou Herd.

  4. Exogenous attention enhances 2nd-order contrast sensitivity.

    Barbot, Antoine; Landy, Michael S; Carrasco, Marisa

    2011-05-11

    Natural scenes contain a rich variety of contours that the visual system extracts to segregate the retinal image into perceptually coherent regions. Covert spatial attention helps extract contours by enhancing contrast sensitivity for 1st-order, luminance-defined patterns at attended locations, while reducing sensitivity at unattended locations, relative to neutral attention allocation. However, humans are also sensitive to 2nd-order patterns such as spatial variations of texture, which are predominant in natural scenes and cannot be detected by linear mechanisms. We assess whether and how exogenous attention--the involuntary and transient capture of spatial attention--affects the contrast sensitivity of channels sensitive to 2nd-order, texture-defined patterns. Using 2nd-order, texture-defined stimuli, we demonstrate that exogenous attention increases 2nd-order contrast sensitivity at the attended location, while decreasing it at unattended locations, relative to a neutral condition. By manipulating both 1st- and 2nd-order spatial frequency, we find that the effects of attention depend both on 2nd-order spatial frequency of the stimulus and the observer's 2nd-order spatial resolution at the target location. At parafoveal locations, attention enhances 2nd-order contrast sensitivity to high, but not to low 2nd-order spatial frequencies; at peripheral locations attention also enhances sensitivity to low 2nd-order spatial frequencies. Control experiments rule out the possibility that these effects might be due to an increase in contrast sensitivity at the 1st-order stage of visual processing. Thus, exogenous attention affects 2nd-order contrast sensitivity at both attended and unattended locations. PMID:21356228

  5. Next Generation of ECT Software for Data Analysis of Steam Generator Tubes

Improvements to the existing EddyOne eddy current analysis software are presented. These improvements are geared towards better interaction between the software and the ECT analyst through an improved, more featured user interface, while keeping industry-standard signal display norms intact to preserve familiarity and ease the transition to the next generation of EddyOne. The improvements presented in this paper thus ease the transition to the new software by reducing training requirements for existing analysts and for new analysts entering the industry. Furthermore, by utilizing modern technologies, the next generation of the software is able to further reduce maintenance and deployment costs of the whole system for years to come. (author)

  6. High-Quality Random Number Generation Software for High-Performance Computing Project

    National Aeronautics and Space Administration — Random number (RN) generation is the key software component that permits random sampling. Software for parallel RN generation (RNG) should be based on RNGs that are...
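One common design for parallel random sampling, used by several scientific RNG libraries, is a counter-based generator: the n-th output of stream i is a bijective hash of (seed, stream, counter), so streams share no state and any output can be computed independently on any processor. The sketch below uses SplitMix64-style mixing; it illustrates the general idea only and is not the software proposed in this record. Real parallel libraries (e.g. Philox) use wider keys to guarantee stream independence; the increment constants here are illustrative.

```python
MASK = (1 << 64) - 1

def mix64(v):
    """SplitMix64 finalizer: a bijective 64-bit mixing function."""
    v = (v ^ (v >> 30)) * 0xBF58476D1CE4E5B9 & MASK
    v = (v ^ (v >> 27)) * 0x94D049BB133111EB & MASK
    return v ^ (v >> 31)

def counter_rng(seed, stream, n):
    """n-th 64-bit output of the given stream: no shared state, so every
    (stream, n) pair can be evaluated independently and reproducibly."""
    # Weyl-style increments keep distinct (stream, n) inputs distinct
    # for modest stream counts and sequence lengths.
    x = (seed + stream * 0x9E3779B97F4A7C15 + n * 0xD1B54A32D192ED03) & MASK
    return mix64(x)

def uniform(seed, stream, n):
    """Map the top 53 bits to a float in [0, 1)."""
    return (counter_rng(seed, stream, n) >> 11) / 2.0 ** 53
```

Because the generator is a pure function of its arguments, a simulation partitioned across processors reproduces the same samples regardless of how the work is scheduled.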

  7. Evaluation of the efficiency and reliability of software generated by code generators

    Schreur, Barbara

    1994-01-01

    There are numerous studies which show that CASE Tools greatly facilitate software development. As a result of these advantages, an increasing amount of software development is done with CASE Tools. As more software engineers become proficient with these tools, their experience and feedback lead to further development with the tools themselves. What has not been widely studied, however, is the reliability and efficiency of the actual code produced by the CASE Tools. This investigation considered these matters. Three segments of code generated by MATRIXx, one of many commercially available CASE Tools, were chosen for analysis: ETOFLIGHT, a portion of the Earth to Orbit Flight software, and ECLSS and PFMC, modules for Environmental Control and Life Support System and Pump Fan Motor Control, respectively.

  8. 2nd International technical meeting on small reactors

    The 2nd International Technical Meeting on Small Reactors was held on November 7-9, 2012 in Ottawa, Ontario. The meeting was hosted by Atomic Energy of Canada Limited (AECL) and Canadian Nuclear Society (CNS). There is growing international interest and activity in the development of small nuclear reactor technology. This meeting provided participants with an opportunity to share ideas and exchange information on new developments. This Technical Meeting covered topics of interest to designers, operators, researchers and analysts involved in the design, development and deployment of small reactors for power generation and research. A special session focussed on small modular reactors (SMR) for generating electricity and process heat, particularly in small grids and remote locations. Following the success of the first Technical Meeting in November 2010, which captured numerous accomplishments of low-power critical facilities and small reactors, the second Technical Meeting was dedicated to the achievements, capabilities, and future prospects of small reactors. This meeting also celebrated the 50th Anniversary of the Nuclear Power Demonstration (NPD) reactor which was the first small reactor (20 MWe) to generate electricity in Canada.

  9. Assessment of nursing care using indicators generated by software

    Ana Paula Souza Lima

    2015-04-01

Full Text Available OBJECTIVE: to analyze the efficacy of the Nursing Process in an Intensive Care Unit using indicators generated by software. METHOD: cross-sectional study using data collected over four months. RNs and students registered patients daily, took histories (at admission), performed physical assessments, established nursing diagnoses and nursing plans/prescriptions, and assessed the care delivered to 17 patients using the software. Indicators concerning the incidence and prevalence of nursing diagnoses, the rate of effectiveness of risk diagnoses, and the rate of effective prevention of complications were computed. RESULTS: the Risk for imbalanced body temperature was the most frequent diagnosis (23.53%), while the least frequent was Risk for constipation (0%). The Risk for impaired skin integrity was prevalent in 100% of the patients, while Risk for acute confusion was the least prevalent (11.76%). Risk for constipation and Risk for impaired skin integrity obtained a rate of risk diagnostic effectiveness of 100%. The rate of effective prevention of acute confusion and falls was 100%. CONCLUSION: the efficacy of the Nursing Process using indicators was analyzed because these indicators reveal how nurses have identified patients' risks and conditions, and planned care in a systematized manner.
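The indicators named in this abstract are simple ratios over patient records. A minimal illustration with hypothetical data (the record structure and the values are invented for the example, not taken from the study):

```python
# Each record lists a patient's risk diagnoses and any complications observed.
patients = [
    {"id": 1, "dx": {"risk_skin"}, "events": set()},
    {"id": 2, "dx": {"risk_skin", "risk_falls"}, "events": {"fall"}},
    {"id": 3, "dx": {"risk_skin"}, "events": set()},
]

def prevalence(dx):
    """Fraction of all patients carrying the given nursing diagnosis."""
    return sum(dx in p["dx"] for p in patients) / len(patients)

def effective_prevention(dx, event):
    """Among patients with the risk diagnosis, fraction who never had the event."""
    at_risk = [p for p in patients if dx in p["dx"]]
    return sum(event not in p["events"] for p in at_risk) / len(at_risk)
```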

  10. Effective Test Case Generation Using Antirandom Software Testing

    Kulvinder Singh,

    2010-11-01

Full Text Available Random Testing is a primary technique for software testing. Antirandom Testing improves on the fault-detection capability of Random Testing by employing the location information of previously executed test cases. Antirandom Testing selects each test case such that it is as different as possible from all previously executed test cases. The implementation is illustrated using basic examples. Moreover, compared with Random Testing, test cases generated in Antirandom Testing are more evenly spread across the input domain. Antirandom Testing has conventionally been applied to programs that have only numerical input types, because the distance between numerical inputs is readily measurable. The vast majority of research involves distance techniques for generating the antirandom test cases. Different from these techniques, we focus on the concrete values of program inputs by proposing a new method to generate antirandom test cases. The proposed method enables Antirandom Testing to be applied to all kinds of programs regardless of their input types. Empirical studies are conducted for comparison, and an evaluation of the effectiveness of these methods is also presented. Experimental results show that, compared with random and Hamming distance techniques, the proposed method significantly reduces the number of test cases required to detect the first failure. Overall, the proposed antirandom testing gives higher fault coverage than antirandom testing with the Hamming distance method, which in turn gives higher fault coverage than pure random testing.
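The basic antirandom selection rule (pick each new test case to maximize its total distance to all previously executed cases) can be sketched as follows. The per-position mismatch distance and the tiny input domains are illustrative assumptions; the paper's value-based method differs in how distance is defined.

```python
from itertools import product

def distance(a, b):
    """Per-position mismatch count (Hamming-style); it applies to any value
    types, which is the motivation for value-based antirandom methods."""
    return sum(x != y for x, y in zip(a, b))

def antirandom_suite(domains, n_cases):
    """Greedily pick each next test case to be as different as possible
    (maximum total distance) from every previously executed case."""
    pool = list(product(*domains))
    suite = [pool[0]]                      # arbitrary first test case
    while len(suite) < n_cases:
        candidates = [c for c in pool if c not in suite]
        best = max(candidates, key=lambda c: sum(distance(c, s) for s in suite))
        suite.append(best)
    return suite

suite = antirandom_suite([(0, 1), ("x", "y"), (False, True)], 4)
```

With these domains the second case chosen is the full complement of the first, which shows the "as different as possible" behavior directly.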

  11. A NEO population generation and observation simulation software tool

    Müller, Sven; Gelhaus, Johannes; Hahn, Gerhard; Franco, Raffaella

One of the main targets of ESA's Space Situational Awareness (SSA) program is to build a wide knowledge base about objects that can potentially harm Earth (Near-Earth Objects, NEOs). An important part of this effort is to create the Small Bodies Data Centre (SBDC), which is going to aggregate measurement data from a fully-integrated NEO observation sensor network. Until this network is developed, artificial NEO measurement data are needed in order to validate SBDC algorithms. Moreover, to establish a functioning NEO observation sensor network, it has to be determined where to place sensors, what technical requirements have to be met in order to be able to detect NEOs, and which observation strategies work best. Because of this, sensor simulation software was needed. This paper presents a software tool which allows users to create and analyse NEO populations and to simulate and analyse population observations. It is a console program written in Fortran and comes with a Graphical User Interface (GUI) written in Java and C. The tool comprises two components: the ``Population Generator'' and the ``Observation Simulator''. The Population Generator component is responsible for generating and analysing a NEO population. Users can choose between creating fictitious (random) and synthetic populations. The latter are based on one of two models describing the orbital and size distribution of observed NEOs: the existing so-called ``Bottke Model'' (Bottke et al. 2000, 2002) and the new ``Granvik Model'' (Granvik et al. 2014, in preparation), which has been developed in parallel to the tool. Generated populations can be analysed by defining 2D, 3D and scatter plots using various NEO attributes. As a result, the tool creates the appropriate files for the plotting tool ``gnuplot''. The tool's Observation Simulator component provides the Observation Simulation and Observation Analysis functions.
Users can define sensor systems using ground- or space-based locations as well as

  12. Florida Investigates 2nd Possible Local Transmission of Zika Virus

Florida Investigates 2nd Possible Local Transmission of Zika Virus. If confirmed, cases would be first instances of ... Broward County, north of Miami. Infection with the Zika virus, which in most cases is transmitted by mosquitoes, ...

  13. Generation and Optimization of Test cases for Object-Oriented Software Using State Chart Diagram

    Ranjita Kumari Swain

    2012-05-01

Full Text Available The process of testing any software system is an enormous task which is time consuming and costly. The time and effort required to do sufficient testing grow as the size and complexity of the software grow, which may cause the project budget to be overrun, delay the development of the software system, or leave some test cases uncovered. During the SDLC (software development life cycle), the software testing phase generally takes around 40-70% of the time and cost. State-based testing is frequently used in software testing. Test data generation is one of the key issues in software testing. A properly generated test suite may not only locate the errors in a software system, but also help in reducing the high cost associated with software testing. It is often desired that test data in the form of test sequences within a test suite can be automatically generated to achieve required test coverage.
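Deriving test sequences from a state transition graph can be sketched with a breadth-first traversal that yields one event sequence per transition (all-transitions coverage). The vending-machine statechart here is a hypothetical example, not one from the paper:

```python
from collections import deque

# Illustrative state transition graph extracted from a statechart:
# state -> list of (event, next_state).
STG = {
    "Idle":       [("insert_coin", "Ready")],
    "Ready":      [("select", "Dispensing"), ("refund", "Idle")],
    "Dispensing": [("done", "Idle")],
}

def transition_cover(stg, start):
    """Emit one event sequence per transition, giving all-transitions coverage."""
    # BFS: shortest event path reaching each state from the start state.
    reach = {start: []}
    queue = deque([start])
    while queue:
        s = queue.popleft()
        for event, t in stg.get(s, []):
            if t not in reach:
                reach[t] = reach[s] + [event]
                queue.append(t)
    # One test sequence per transition: the path to its source, then the event.
    return [reach[s] + [event] for s in stg for event, _t in stg[s] if s in reach]

tests = transition_cover(STG, "Idle")
```

Each returned sequence is a runnable test driving the system from its initial state through one target transition; a real generator would add oracle checks on the reached states.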

  14. Test case generation based on orthogonal table for software black-box testing

    LIU Jiu-fu; YANG Zhong; YANG Zhen-xing; SUN Lin

    2008-01-01

Software testing is an important means of assuring software quality. This paper presents a practicable method for generating software test cases which is operational and highly efficient. We discuss the identification of software specification categories and choices and construct a classification tree. Based on an orthogonal array, it is easy to generate test cases. The number of test cases produced by this method is less than that of all combinations of the choices.
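The orthogonal-array idea can be shown with the classic L4(2^3) array: four rows cover every pairwise combination of three two-level factors, versus 2^3 = 8 rows for the full combination. The specification categories and choices below are hypothetical:

```python
# L4(2^3) orthogonal array: any two columns contain all four level pairs.
L4 = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]

# Hypothetical software-specification categories and their choices.
FACTORS = [("os", ["linux", "windows"]),
           ("browser", ["firefox", "chrome"]),
           ("network", ["wifi", "wired"])]

def test_cases():
    """Map each array row to a concrete test case via the factor choices."""
    return [{name: choices[level] for (name, choices), level in zip(FACTORS, row)}
            for row in L4]

cases = test_cases()
```

The saving grows quickly with more factors: larger standard arrays (L8, L9, L18, ...) keep the pairwise-coverage guarantee while the full combination count grows exponentially.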

  15. GENESIS: Agile Generation of Information Management Oriented Software

    Juan Erasmo Gómez

    2010-06-01

Full Text Available The specification for an information system can be clear from the beginning: it must acquire, display, query and modify data, using a database. The main issue is to decide which information to manage. In the case originating this work, the information was always evolving, even up to the end of the project. This implies the construction of a new system each time the information is redefined. This article presents Genesis, an agile development infrastructure, and proposes an approach for the immediate construction of required information systems. Experts describe their information needs and queries, and Genesis generates the corresponding application, with the appropriate graphical interfaces and database.

  16. Software Defined Radio Architecture Contributions to Next Generation Space Communications

    Kacpura, Thomas J.; Eddy, Wesley M.; Smith, Carl R.; Liebetreu, John

    2015-01-01

systems, as well as those communications and navigation systems operated by international space agencies and civilian and government agencies. In this paper, we review the philosophies, technologies, architectural attributes, mission services, and communications capabilities that form the structure of candidate next-generation integrated communication architectures for space communications and navigation. A key area that this paper explores is the development and operation of the software defined radio for the NASA Space Communications and Navigation (SCaN) Testbed currently on the International Space Station (ISS). Lessons learned from its development and operation feed back into the communications architecture. Leveraging reconfigurability changes the way operations are done, and this must be considered in the architecture. Quantifying the impact on the NASA Space Telecommunications Radio System (STRS) software defined radio architecture provides feedback to keep the standard useful and up to date. NASA is not the only customer of these radios: software defined radios are developed for other applications, and taking advantage of those developments promotes an architecture that is cost effective and sustainable. Developments in areas such as updated operating environments, higher data rates, networking, and security can be leveraged. The ability to sustain an architecture that uses radios for multiple markets can lower costs and keep new technology infused.

  17. 2nd European Advanced Accelerator Concepts Workshop

    Assmann, Ralph; Grebenyuk, Julia

    2015-01-01

    The European Advanced Accelerator Concepts Workshop has the mission to discuss and foster methods of beam acceleration with gradients beyond state of the art in operational facilities. The most cost effective and compact methods for generating high energy particle beams shall be reviewed and assessed. This includes diagnostics methods, timing technology, special need for injectors, beam matching, beam dynamics with advanced accelerators and development of adequate simulations. This workshop is organized in the context of the EU-funded European Network for Novel Accelerators (EuroNNAc2), that includes 52 Research Institutes and universities.

  18. Model of Next Generation Energy-Efficient Design Software for Buildings

    MA Zhiliang; ZHAO Yili

    2008-01-01

Energy-efficient design for buildings (EEDB) is a vital step towards building energy-saving. In order to greatly improve the EEDB, the next generation of EEDB software that makes use of the latest technologies needs to be developed. This paper mainly focuses on establishing the model of the next generation EEDB software. Based on the investigation of the literature and interviews with designers, the requirements on the next generation EEDB software were identified, where lifecycle assessment of both energy consumption and environmental impacts, 3D graphics support, and building information modeling (BIM) support were stressed. Then the workflow for using the next generation EEDB software was established. Finally, based on the workflow, the framework model for the software was proposed, and the partial models and the corresponding functions were systematically analyzed. The model lays a solid foundation for developing the next generation EEDB software.

  19. Thermoluminescent characteristics of ZrO2:Nd films

This work presents the results obtained from analysing the photoluminescent and thermoluminescent characteristics of neodymium-activated zirconium oxide (ZrO2:Nd) and its possible application in UV radiation dosimetry. The experiments were aimed at studying characteristics such as the optimum thermal erasing treatment, the influence of light on the response, the response as a function of wavelength, the fading of the stored information, the effect of temperature, the response as a function of time, and the reproducibility of the response. The results show that ZrO2:Nd is a promising material for use as a TL dosimeter for UV radiation. (Author)

  20. 2nd Quarter Transportation Report FY 2014

    Gregory, L.

    2014-07-30

    This report satisfies the U.S. Department of Energy (DOE), National Nuclear Security Administration Nevada Field Office (NNSA/NFO) commitment to prepare a quarterly summary report of radioactive waste shipments to the Nevada National Security Site (NNSS) Radioactive Waste Management Complex (RWMC) at Area 5. There were no shipments sent for offsite treatment and returned to the NNSS this quarter. This report summarizes the second quarter of fiscal year (FY) 2014 low-level radioactive waste (LLW) and mixed low-level radioactive waste (MLLW) shipments. This report also includes annual summaries for FY 2014 in Tables 4 and 5. Tabular summaries are provided which include the following: Sources of and carriers for LLW and MLLW shipments to and from the NNSS; Number and external volume of LLW and MLLW shipments; Highway routes used by carriers; and Incident/accident data applicable to LLW and MLLW shipments. In this report shipments are accounted for upon arrival at the NNSS, while disposal volumes are accounted for upon waste burial. The disposal volumes presented in this report do not include minor volumes of non-radioactive materials that were approved for disposal. Volume reports showing cubic feet (ft3) generated using the Low-Level Waste Information System may vary slightly due to differing rounding conventions.

  1. S-Cube: Enabling the Next Generation of Software Services

    Metzger, Andreas; Pohl, Klaus

The Service Oriented Architecture (SOA) paradigm is increasingly adopted by industry for building distributed software systems. However, when designing, developing and operating innovative software services and service-based systems, several challenges exist. Those challenges include how to manage the complexity of those systems, how to establish, monitor and enforce Quality of Service (QoS) and Service Level Agreements (SLAs), as well as how to build those systems such that they can proactively adapt to dynamically changing requirements and context conditions. Developing foundational solutions for those challenges requires joint efforts of different research communities such as Business Process Management, Grid Computing, Service Oriented Computing and Software Engineering. This paper provides an overview of S-Cube, the European Network of Excellence on Software Services and Systems. S-Cube brings together researchers from leading research institutions across Europe, who join their competences to develop foundations, theories, as well as methods and tools for future service-based systems.

  2. 2nd International Conference on Multiscale Computational Methods for Solids and Fluids

    2016-01-01

This volume contains the best papers presented at the 2nd ECCOMAS International Conference on Multiscale Computations for Solids and Fluids, held June 10-12, 2015. Topics dealt with include multiscale strategies for the efficient development of scientific software for large-scale computations, coupled probability-nonlinear-mechanics problems and solution methods, and the modern mathematical and computational setting for multi-phase flows and fluid-structure interaction. The papers consist of contributions by six experts who taught short courses prior to the conference, along with several selected articles from other participants dealing with complementary issues, covering both solid mechanics and applied mathematics.

  3. 2nd ISPRA nuclear electronics symposium, Stresa, Italy May 20-23, 1975

Two round tables were held in conjunction with the 2nd Ispra Nuclear Electronics Symposium. The first was concerned with software support for the implementation of microprocessors, MOS and bipolar microprocessors, environmental data systems, and the use of microprocessors and minicomputers in the nuclear, biomedical and environmental fields. The future of nuclear electronics and its diversification, gravitational waves and electronics, and environmental measurements of air and water quality were discussed during the second round table, along with views that emerged during the discussion on extending nuclear electronics techniques to other fields.

  4. A Handbook for Classroom Instruction That Works, 2nd Edition

    Association for Supervision and Curriculum Development, 2012

    2012-01-01

    Perfect for self-help and professional learning communities, this handbook makes it much easier to apply the teaching practices from the ASCD-McREL best-seller "Classroom Instruction That Works: Research-Based Strategies for Increasing Student Achievement, 2nd Edition." The authors take you through the refined Instructional Planning Guide, so you…

  5. 2nd International Conference on Nuclear Physics in Astrophysics

    Fülöp, Zsolt; Somorjai, Endre; The European Physical Journal A : Volume 27, Supplement 1, 2006

    2006-01-01

    Launched in 2004, "Nuclear Physics in Astrophysics" has established itself in a successful topical conference series addressing the forefront of research in the field. This volume contains the selected and refereed papers of the 2nd conference, held in Debrecen in 2005 and reprinted from "The European Physical Journal A - Hadrons and Nuclei".

  6. 2nd International Conference on Data Management Technologies and Applications

    2013-01-01

The 2nd International Conference on Data Management Technologies and Applications (DATA) aims to bring together researchers, engineers and practitioners interested in databases, data warehousing, data mining, data management, data security and other aspects of information systems and technology involving advanced applications of data.

  7. The 2nd Seminar on Standardization Cooperation in Northeast Asia

    2004-01-01

The 2nd Seminar on Standardization Cooperation in Northeast Asia (2003) was held in Beijing from Oct 30th to Oct 31st, following the first seminar held in Korea in 2002, with participants coming from standardization circles in China, Japan and Korea.

  8. Open-source software for generating electrocardiogram signals

McSharry, Patrick E.; Clifford, Gari D.

    2004-01-01

    ECGSYN, a dynamical model that faithfully reproduces the main features of the human electrocardiogram (ECG), including heart rate variability, RR intervals and QT intervals is presented. Details of the underlying algorithm and an open-source software implementation in Matlab, C and Java are described. An example of how this model will facilitate comparisons of signal processing techniques is provided.
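The dynamical model can be sketched with a short Euler integration: a trajectory attracted to the unit circle in (x, y) sets the quasi-periodicity, and z is pushed by Gaussian events at angles corresponding to the P, Q, R, S and T waves. The parameter values follow the published model description but should be treated as illustrative defaults; the released ECGSYN code additionally models heart rate variability, which is omitted here.

```python
import math

THETA = [-math.pi / 3, -math.pi / 12, 0.0, math.pi / 12, math.pi / 2]  # P, Q, R, S, T
A = [1.2, -5.0, 30.0, -7.5, 0.75]       # event magnitudes
B = [0.25, 0.1, 0.1, 0.1, 0.4]          # event angular widths

def ecgsyn(heart_rate=60.0, seconds=2.0, dt=1.0 / 512.0):
    """Euler-integrate the three coupled ODEs and return the z (ECG) trace."""
    omega = 2.0 * math.pi * heart_rate / 60.0   # angular velocity of the cycle
    x, y, z = 1.0, 0.0, 0.0
    trace = []
    for _ in range(int(round(seconds / dt))):
        alpha = 1.0 - math.hypot(x, y)          # attracts trajectory to unit circle
        theta = math.atan2(y, x)
        dz = -z                                 # relax toward a flat baseline (z0 = 0)
        for th_i, a_i, b_i in zip(THETA, A, B):
            # Angular distance wrapped into (-pi, pi].
            dth = (theta - th_i + math.pi) % (2.0 * math.pi) - math.pi
            dz -= a_i * dth * math.exp(-dth * dth / (2.0 * b_i * b_i))
        x, y = x + dt * (alpha * x - omega * y), y + dt * (alpha * y + omega * x)
        z += dt * dz
        trace.append(z)
    return trace

ecg = ecgsyn()
```

A fixed-step Euler scheme keeps the sketch short; the published implementations use an adaptive ODE solver, which matters for the sharp QRS transitions.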

  9. Automated Software Test Data Generation: Direction of Research

    Hitesh Tahbildar

    2011-02-01

Full Text Available In this paper we give an overview of automatic test data generation. The basic objective of this paper is to convey the basic concepts related to automated test data generation research. The different implementation techniques are described with their relative merits and demerits. The future challenges and problems of test data generation are explained. Finally, we describe the areas where more focus is required to make automatic test data generation more effective in industry.

  10. Minimal TestCase Generation for Object-Oriented Software with State Charts

    Ranjita Kumari Swain; Prafulla Kumar Behera; Durga Prasad Mohapatra

    2012-01-01

Today statecharts are a de facto standard in industry for modeling system behavior. Test data generation is one of the key issues in software testing. This paper proposes a reduction approach to test data generation for state-based software testing. First, a state transition graph is derived from the state chart diagram. Then, all the required information is extracted from the state chart diagram. Then, test cases are generated. Lastly, the set of test cases is minimized by calcu...

  11. Development of a Prototype Automation Simulation Scenario Generator for Air Traffic Management Software Simulations

    Khambatta, Cyrus F.

    2007-01-01

    A technique for automated development of scenarios for use in the Multi-Center Traffic Management Advisor (McTMA) software simulations is described. The resulting software is designed and implemented to automate the generation of simulation scenarios with the intent of reducing the time it currently takes using an observational approach. The software program is effective in achieving this goal. The scenarios created for use in the McTMA simulations are based on data files from the McTMA system, and were manually edited before incorporation into the simulations to ensure accuracy. Despite the software's overall favorable performance, several key software issues are identified. Proposed solutions to these issues are discussed. Future enhancements to the scenario generator software may address the limitations identified in this paper.

  12. 2nd International Conference on Green Communications and Networks 2012

    Ma, Maode; GCN 2012

    2013-01-01

    The objective of the 2nd International Conference on Green Communications and Networks 2012 (GCN 2012) is to facilitate an exchange of information on best practices for the latest research advances in the area of communications, networks and intelligence applications. These mainly involve computer science and engineering, informatics, communications and control, electrical engineering, information computing, and business intelligence and management. Proceedings of the 2nd International Conference on Green Communications and Networks 2012 (GCN 2012) will focus on green information technology and applications, which will provide in-depth insights for engineers and scientists in academia, industry, and government. The book addresses the most innovative research developments including technical challenges, social and economic issues, and presents and discusses the authors’ ideas, experiences, findings, and current projects on all aspects of advanced green information technology and applications. Yuhang Yang is ...

  13. 2nd Interdisciplinary Conference on Production, Logistics and Traffic 2015

    Friedrich, Hanno; Thaller, Carina; Geiger, Christiane

    2016-01-01

    This contributed volume contains the selected and reviewed papers of the 2nd Interdisciplinary Conference on Production, Logistics and Traffic (ICPLT) 2015, Dortmund, Germany. The topical focus lies on economic, ecological and societal issues related to commercial transport. The authors are international experts and the paper collection presents the state-of-the-art in the field, thus making this book a valuable read for both practitioners and researchers.

  14. 2nd International Open and Distance Learning (IODL) Symposium

    Reviewed by Murat BARKAN

    2006-01-01

    These closing remarks were prepared and presented by Prof. Dr. Murat BARKAN, Anadolu University, Eskisehir, TURKEY. DEAR GUESTS, As the 2nd International Open and Distance Learning (IODL) Symposium is now drawing to an end, I would like to thank you all for your outstanding speeches, distinguished presentations, constructive roundtable and session discussions, and active participation during the last five days. I hope you all share my view that the whole symposium has been...

  15. Exogenous attention enhances 2nd-order contrast sensitivity

    Barbot, Antoine; Landy, Michael S.; Carrasco, Marisa

    2011-01-01

    Natural scenes contain a rich variety of contours that the visual system extracts to segregate the retinal image into perceptually coherent regions. Covert spatial attention helps extract contours by enhancing contrast sensitivity for 1st-order, luminance-defined patterns at attended locations, while reducing sensitivity at unattended locations, relative to neutral attention allocation. However, humans are also sensitive to 2nd-order patterns such as spatial variations of texture, which are ...

  16. 2nd International Conference on Electric and Electronics (EEIC 2012)

    Advances in Electric and Electronics

    2012-01-01

    This volume contains 108 full length papers presented at the 2nd International Conference on Electric and Electronics (EEIC 2012), held on April 21-22 in Sanya, China, which brings together researchers working in many different areas of education and learning to foster international collaborations and exchange of new ideas. This volume can be divided into two sections on the basis of the classification of manuscripts considered: the first section deals with Electric and the second section with Electronics.

  17. 2nd International Conference on Intelligent Technologies and Engineering Systems

    Chen, Cheng-Yi; Yang, Cheng-Fu

    2014-01-01

    This book includes the original, peer reviewed research papers from the 2nd International Conference on Intelligent Technologies and Engineering Systems (ICITES2013), which took place on December 12-14, 2013 at Cheng Shiu University in Kaohsiung, Taiwan. Topics covered include: laser technology, wireless and mobile networking, lean and agile manufacturing, speech processing, microwave dielectrics, intelligent circuits and systems, 3D graphics, communications, and structure dynamics and control.

  18. Introduction on the 2nd annual general meeting of ARCCNM

    This paper outlines general information on the 2nd annual general meeting of ARCCNM (Asian Regional Cooperative Council for Nuclear Medicine). The international symposium exchanged recent developments in basic and clinical nuclear medicine. The Asian School of Nuclear Medicine is an educational enterprise of ARCCNM whose objective is to organize and coordinate academic and training programs in nuclear medicine. It will promote nuclear medicine in the Asian region by enhancing regional scientific activities and research collaboration.

  19. Specification and Generation of Environment for Model Checking of Software Components

    Pařízek, P.; Plášil, František

    2007-01-01

    Roč. 176, - (2007), s. 143-154. ISSN 1571-0661 R&D Projects: GA AV ČR 1ET400300504 Institutional research plan: CEZ:AV0Z10300504 Keywords : software components * behavior protocols * model checking * automated generation of environment Subject RIV: JC - Computer Hardware ; Software

  20. An application generator for rapid prototyping of Ada real-time control software

    Johnson, Jim; Biglari, Haik; Lehman, Larry

    1990-01-01

    The need to increase engineering productivity and decrease software life cycle costs in real-time system development establishes a motivation for a method of rapid prototyping. The design by iterative rapid prototyping technique is described. A tool which facilitates such a design methodology for the generation of embedded control software is described.

  1. Using Automatic Code Generation in the Attitude Control Flight Software Engineering Process

    McComas, David; O'Donnell, James R., Jr.; Andrews, Stephen F.

    1999-01-01

    This paper presents an overview of the attitude control subsystem flight software development process, identifies how the process has changed due to automatic code generation, analyzes each software development phase in detail, and concludes with a summary of our lessons learned.

  2. Rapid Prototyping Software for Developing Next-Generation Air Traffic Management Algorithms Project

    National Aeronautics and Space Administration — Research on next-generation air traffic management systems is being conducted at several laboratories using custom software. In order to provide a more uniform...

  3. Rapid Prototyping Software for Developing Next-Generation Air Traffic Management Algorithms Project

    National Aeronautics and Space Administration — Research on next-generation air traffic control systems is being conducted at several laboratories. Most of this work is being carried out using custom software....

  4. Automatically generated acceptance test: A software reliability experiment

    Protzel, Peter W.

    1988-01-01

    This study presents results of a software reliability experiment investigating the feasibility of a new error detection method. The method can be used as an acceptance test and is solely based on empirical data about the behavior of internal states of a program. The experimental design uses the existing environment of a multi-version experiment previously conducted at the NASA Langley Research Center, in which the launch interceptor problem is used as a model. This allows the controlled experimental investigation of versions with well-known single and multiple faults, and the availability of an oracle permits the determination of the error detection performance of the test. Fault interaction phenomena are observed that have an amplifying effect on the number of error occurrences. Preliminary results indicate that all faults examined so far are detected by the acceptance test. This shows promise for further investigations, and for the employment of this test method on other applications.
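
The idea of an acceptance test driven solely by empirical data about internal program states can be sketched as follows. The envelope-of-observed-values check and the tolerance parameter are illustrative assumptions for this sketch, not the experiment's actual detector.

```python
# Sketch of an empirical acceptance test: bounds on monitored internal
# variables are learned from reference executions, and a new run is
# rejected if any variable leaves its (slightly widened) envelope.

def learn_envelope(reference_runs):
    """reference_runs: list of dicts {variable_name: observed_value}."""
    envelope = {}
    for run in reference_runs:
        for var, value in run.items():
            lo, hi = envelope.get(var, (value, value))
            envelope[var] = (min(lo, value), max(hi, value))
    return envelope

def acceptance_test(run, envelope, tolerance=0.1):
    """Accept iff every variable stays within the widened learned bounds."""
    for var, value in run.items():
        lo, hi = envelope[var]
        margin = tolerance * (hi - lo)   # hypothetical widening rule
        if not (lo - margin <= value <= hi + margin):
            return False
    return True
```

Such a check needs no oracle for the program's final output, which is what makes it usable as an acceptance test; its error detection power depends entirely on how well the reference runs cover normal behavior.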

  5. Analysis of Software Test Item Generation- Comparison Between High Skilled and Low Skilled Engineers

    Masayuki Hirayama; Osamu Mizuno; Tohru Kikuno

    2005-01-01

    Recent software systems contain many functions to provide various services. Given this tendency, it is difficult to ensure software quality and to eliminate crucial faults with conventional software testing methods. Taking the effect of a test engineer's skill on test item generation into consideration, we propose a new test item generation method, which supports the generation of test items for illegal behavior of the system. The proposed method can generate test items based on use-case analysis, deviation analysis of legal behavior, and fault tree analysis of system fault situations. From the results of experimental applications of our method, we confirmed that test items for illegal behavior of a system were generated effectively, and that the proposed method can effectively assist test item generation by an engineer with a low skill level.
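
The deviation-analysis step described above, deriving illegal-behavior test items from a legal use-case sequence, might look like this in outline. The use-case steps and the three deviation rules (omit, repeat, swap) are illustrative, not the authors' exact method.

```python
# Sketch of deviation-based test item generation: each step of a legal
# use-case sequence is systematically deviated to produce candidate
# test items exercising illegal behavior. Step names are hypothetical.

LEGAL_SEQUENCE = ["insert_card", "enter_pin", "select_amount", "take_cash"]

def deviations(seq):
    """Return (description, deviated_sequence) pairs for one legal sequence."""
    items = []
    for i in range(len(seq)):
        omitted = seq[:i] + seq[i + 1:]
        items.append(("omit step %d" % i, omitted))
        duplicated = seq[:i + 1] + [seq[i]] + seq[i + 1:]
        items.append(("repeat step %d" % i, duplicated))
        if i + 1 < len(seq):
            swapped = seq[:i] + [seq[i + 1], seq[i]] + seq[i + 2:]
            items.append(("swap steps %d,%d" % (i, i + 1), swapped))
    return items
```

Each deviated sequence becomes a test item whose expected result is that the system rejects or safely handles the illegal behavior, which is exactly the class of test item the abstract says low-skilled engineers tend to miss.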

  6. Psychosocial Risks Generated By Assets Specific Design Software

    Remus, Furtună; Angela, Domnariu; Petru, Lazăr

    2015-07-01

    Human occupational activity results from the interaction between psycho-biological, socio-cultural and organizational-occupational factors. Technological development, automation and computerization, which are found in all branches of activity, the speed at which things develop, and their growing complexity require fewer and fewer physical aptitudes and more cognitive qualifications. The person included in the work process is bound in most cases to adapt to the organizational-occupational situations specific to the demands of the job. The role of the programmer is essential in the process of developing commissioned software, and truly brilliant ideas can only come from well-rested minds concentrated on their tasks. The actual requirements of these jobs, besides a high number of benefits and opportunities, also create a series of psycho-social risks, which can increase the level of stress during work activity, especially for those who work under pressure.

  7. Learning from examples - Generation and evaluation of decision trees for software resource analysis

    Selby, Richard W.; Porter, Adam A.

    1988-01-01

    A general solution method for the automatic generation of decision (or classification) trees is investigated. The approach is to provide insights through in-depth empirical characterization and evaluation of decision trees for software resource data analysis. The trees identify classes of objects (software modules) that had high development effort. Sixteen software systems ranging from 3,000 to 112,000 source lines were selected for analysis from a NASA production environment. The collection and analysis of 74 attributes (or metrics), for over 4,700 objects, captured information about the development effort, faults, changes, design style, and implementation style. A total of 9,600 decision trees were automatically generated and evaluated. The trees correctly identified 79.3 percent of the software modules that had high development effort or faults, and the trees generated from the best parameter combinations correctly identified 88.4 percent of the modules on the average.
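
A decision tree over this kind of resource data classifies modules by thresholding metrics. The minimal sketch below is a one-level tree (a decision "stump", far simpler than the trees evaluated in the study), with hypothetical metric names and toy data:

```python
# Learn the single (attribute, threshold) split that best separates
# high-effort modules from the rest. Attribute names are hypothetical.

def best_stump(samples):
    """samples: list of (metrics_dict, is_high_effort). Returns (attr, threshold)."""
    best = None
    attrs = samples[0][0].keys()
    for attr in attrs:
        for threshold in sorted({m[attr] for m, _ in samples}):
            # Count samples where "metric >= threshold" matches the label.
            correct = sum((m[attr] >= threshold) == label for m, label in samples)
            if best is None or correct > best[0]:
                best = (correct, attr, threshold)
    return best[1], best[2]

# Toy module data: (metrics, high development effort?)
modules = [
    ({"source_lines": 1200, "changes": 40}, True),
    ({"source_lines": 150,  "changes": 3},  False),
    ({"source_lines": 900,  "changes": 25}, True),
    ({"source_lines": 300,  "changes": 5},  False),
]
attr, threshold = best_stump(modules)
```

A full tree-generation method recurses on each side of the chosen split and uses a purity measure rather than raw accuracy; the stump only shows the split-selection core of the technique.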

  8. Generating Safety-Critical PLC Code From a High-Level Application Software Specification

    2008-01-01

    The benefits of automatic-application code generation are widely accepted within the software engineering community. These benefits include raised abstraction level of application programming, shorter product development time, lower maintenance costs, and increased code quality and consistency. Surprisingly, code generation concepts have not yet found wide acceptance and use in the field of programmable logic controller (PLC) software development. Software engineers at Kennedy Space Center recognized the need for PLC code generation while developing the new ground checkout and launch processing system, called the Launch Control System (LCS). Engineers developed a process and a prototype software tool that automatically translates a high-level representation or specification of application software into ladder logic that executes on a PLC. All the computer hardware in the LCS is planned to be commercial off the shelf (COTS), including industrial controllers or PLCs that are connected to the sensors and end items out in the field. Most of the software in LCS is also planned to be COTS, with only small adapter software modules that must be developed in order to interface between the various COTS software products. A domain-specific language (DSL) is a programming language designed to perform tasks and to solve problems in a particular domain, such as ground processing of launch vehicles. The LCS engineers created a DSL for developing test sequences of ground checkout and launch operations of future launch vehicle and spacecraft elements, and they are developing a tabular specification format that uses the DSL keywords and functions familiar to the ground and flight system users. The tabular specification format, or tabular spec, allows most ground and flight system users to document how the application software is intended to function and requires little or no software programming knowledge or experience. A small sample from a prototype tabular spec application is
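
As a rough illustration of the concept (not the LCS tool itself), a tabular specification row can be translated mechanically into PLC source. Here the table layout, tag names, and the choice of emitting IEC 61131-3 Structured Text rather than ladder logic are all assumptions for the sketch:

```python
# Hypothetical tabular spec: each row maps a condition on input tags
# to an output coil. Tag names and layout are illustrative only.
TABULAR_SPEC = [
    {"when": "LOX_TANK_PRESSURE > 350.0", "set": "VENT_VALVE_OPEN"},
    {"when": "GSE_POWER_ON AND NOT FAULT", "set": "PAD_LIGHTS_ON"},
]

def generate_structured_text(spec):
    """Emit one IF/ELSE block of Structured Text per specification row."""
    lines = []
    for row in spec:
        lines.append(f"IF {row['when']} THEN")
        lines.append(f"    {row['set']} := TRUE;")
        lines.append("ELSE")
        lines.append(f"    {row['set']} := FALSE;")
        lines.append("END_IF;")
    return "\n".join(lines)
```

The payoff described in the abstract comes from exactly this kind of mechanical translation: the spec stays readable to ground and flight system users, while the generated code is uniform and consistent by construction.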

  9. Two 2nd Circuit decisions represent mixed bag on insurance.

    2000-01-21

    The 2nd U.S. Circuit Court of Appeals in New York issued two important rulings within a week on the extent to which the Americans with Disabilities Act (ADA) regulates insurance practices. [Name removed] v. Allstate Life Insurance Co. was a plaintiff-friendly decision, finding that the insurance company illegally refused to sell life insurance to a married couple because of their mental disability, major depression. [Name removed]. v. Israel Discount Bank of New York was more defendant friendly and tackled the issue of whether the ADA permits different benefit caps for mental and physical disabilities. PMID:11367226

  10. BIPHASIC TREATMENT OF 2ND CLASS ANGLE ANOMALIES

    C. Romanec

    2011-09-01

    Our approach aims at presenting, based on clinical observations and complementary examinations, the effects of setting up treatment during the mixed dentition period. The objectives include identifying the optimal time of treatment of Angle Class II/1 and II/2 malocclusions, as well as the therapeutic possibilities for treating Angle Class II malocclusion during the periods of mixed and permanent dentition. The study is based on data collected from 114 clinical cases (69 girls and 45 boys) with ages spanning between 7 and 18 years.

  11. 2nd conference on Continuous Media with Microstructure

    Kuczma, Mieczysław

    2016-01-01

    This book presents research advances in the field of Continuous Media with Microstructure and considers the three complementary pillars of mechanical sciences: theory, research and computational simulation. It focuses on the following problems: thermodynamic and mathematical modeling of materials with extensions of classical constitutive laws, single and multicomponent media including modern multifunctional materials, wave propagation, multiscale and multiphysics processes, phase transformations, and porous, granular and composite materials. The book presents the proceedings of the 2nd Conference on Continuous Media with Microstructure, which was held in 2015 in Łagów, Poland, in memory of Prof. Krzysztof Wilmański.

  12. TagGD: fast and accurate software for DNA Tag generation and demultiplexing.

    Paul Igor Costea

    Multiplexing is of vital importance for utilizing the full potential of next-generation sequencing technologies. We here report TagGD (DNA-based Tag Generator and Demultiplexor), a fully customisable, fast and accurate software package that can generate thousands of barcodes satisfying user-defined constraints and can guarantee full demultiplexing accuracy. The barcodes are designed to minimise their interference with the experiment. Insertion, deletion and substitution events are considered when designing and demultiplexing barcodes. 20,000 barcodes of length 18 were designed in 5 minutes, and 2 million barcoded Illumina HiSeq-like reads generated with an error rate of 2% were demultiplexed with full accuracy in 5 minutes. We believe that our software meets a central demand in current high-throughput biology and can be utilised in any field with ample sample abundance. The software is available on GitHub (https://github.com/pelinakan/UBD.git.
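
The core constraint-satisfying generation and nearest-barcode demultiplexing can be sketched as below. This toy version enforces only a minimum substitution (Hamming) distance, whereas TagGD also models insertions and deletions, so treat every name and parameter here as illustrative, not as the tool's algorithm.

```python
import itertools

def hamming(a, b):
    """Substitution distance between two equal-length DNA strings."""
    return sum(x != y for x, y in zip(a, b))

def generate_barcodes(length, min_dist, count):
    """Greedy sketch: accept a candidate only if it stays min_dist away
    from every barcode accepted so far."""
    barcodes = []
    for candidate in itertools.product("ACGT", repeat=length):
        tag = "".join(candidate)
        if all(hamming(tag, b) >= min_dist for b in barcodes):
            barcodes.append(tag)
            if len(barcodes) == count:
                break
    return barcodes

def demultiplex(read_prefix, barcodes):
    """Assign a read to its nearest barcode; ambiguous reads return None."""
    dists = sorted((hamming(read_prefix, b), b) for b in barcodes)
    if len(dists) > 1 and dists[0][0] == dists[1][0]:
        return None
    return dists[0][1]
```

With a minimum pairwise distance of d, any read carrying fewer than d/2 substitutions is still strictly closest to its true barcode, which is what makes error-tolerant demultiplexing possible.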

  13. FACTORS GENERATING RISKS DURING REQUIREMENT ENGINEERING PROCESS IN GLOBAL SOFTWARE DEVELOPMENT ENVIRONMENT

    Huma Hayat Khan

    2014-03-01

    The challenges of Requirements Engineering become acute when it is performed in a global software development paradigm. There can be many reasons behind this challenging nature; risks can be one of them, as there is more risk exposure in a global development paradigm, and risk may therefore be one of the main reasons that makes Requirements Engineering more challenging. First, there is a need to identify the factors which actually generate these risks. This paper therefore identifies not only the factors, but also the risks which these factors may generate. A systematic literature review was conducted to identify these factors and the risks which may occur during the requirements engineering process in a global software development paradigm. The resulting list leads to progressive enhancement for assisting requirements engineering activities in a global software development paradigm. This work is especially useful for less experienced people working in global software development.

  14. 2nd International Conference on Computer Science, Applied Mathematics and Applications

    Thi, Hoai; Nguyen, Ngoc

    2014-01-01

    The proceedings consist of 30 papers which have been selected and invited from the submissions to the 2nd International Conference on Computer Science, Applied Mathematics and Applications (ICCSAMA 2014) held on 8-9 May, 2014 in Budapest, Hungary. The conference is organized into 7 sessions: Advanced Optimization Methods and Their Applications, Queueing Models and Performance Evaluation, Software Development and Testing, Computational Methods for Mobile and Wireless Networks, Computational Methods for Knowledge Engineering, Logic Based Methods for Decision Making and Data Mining, and Nonlinear Systems and Applications, respectively. All chapters in the book discuss theoretical and practical issues connected with computational methods and optimization methods for knowledge engineering. The editors hope that this volume can be useful for graduate and Ph.D. students and researchers in Computer Science and Applied Mathematics. It is the hope of the editors that readers of this volume can find many inspiring idea...

  15. Afs password expiration starts Feb 2nd 2004

    2004-01-01

    Due to security reasons, and in agreement with CERN management, afs/lxplus passwords will fall into line with Nice/Mail passwords on February 2nd and expire annually. As of the above date, afs account holders who have not changed their passwords for over a year will have a 60 day grace period to make a change. Following this date their passwords will become invalid. What does this mean to you? If you have changed your afs password in the past 10 months the only difference is that 60 days before expiration you will receive a warning message. Similar warnings will also appear nearer the time of expiration. If you have not changed your password for more than 10 months, then, as of February 2nd you will have 60 days to change it using the command 'kpasswd'. Help to choose a good password can be found at: http://security.web.cern.ch/security/passwords/ If you have been given a temporary password at any time by the Helpdesk or registration team this will automatically fall into the expiration category ...

  16. A Practical Comparison of De Novo Genome Assembly Software Tools for Next-Generation Sequencing Technologies

    Zhang, Wenyu; Chen, Jiajia; Yang, Yang; Tang, Yifei; Shang, Jing; Shen, Bairong

    2011-01-01

    The advent of next-generation sequencing technologies is accompanied with the development of many whole-genome sequence assembly methods and software, especially for de novo fragment assembly. Due to the poor knowledge about the applicability and performance of these software tools, choosing a befitting assembler becomes a tough task. Here, we provide the information of adaptivity for each program, then above all, compare the performance of eight distinct tools against eight groups of simulat...

  17. Scoping analysis of the Advanced Test Reactor using SN2ND

    Wolters, E.; Smith, M. (NE NEAMS PROGRAM)

    2012-07-26

    A detailed set of calculations was carried out for the Advanced Test Reactor (ATR) using the SN2ND solver of the UNIC code which is part of the SHARP multi-physics code being developed under the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program in DOE-NE. The primary motivation of this work is to assess whether high fidelity deterministic transport codes can tackle coupled dynamics simulations of the ATR. The successful use of such codes in a coupled dynamics simulation can impact what experiments are performed and what power levels are permitted during those experiments at the ATR. The advantages of the SN2ND solver over comparable neutronics tools are its superior parallel performance and demonstrated accuracy on large scale homogeneous and heterogeneous reactor geometries. However, it should be noted that virtually no effort from this project was spent constructing a proper cross section generation methodology for the ATR usable in the SN2ND solver. While attempts were made to use cross section data derived from SCALE, the minimal number of compositional cross section sets were generated to be consistent with the reference Monte Carlo input specification. The accuracy of any deterministic transport solver is impacted by such an approach and clearly it causes substantial errors in this work. The reasoning behind this decision is justified given the overall funding dedicated to the task (two months) and the real focus of the work: can modern deterministic tools actually treat complex facilities like the ATR with heterogeneous geometry modeling. SN2ND has been demonstrated to solve problems with upwards of one trillion degrees of freedom which translates to tens of millions of finite elements, hundreds of angles, and hundreds of energy groups, resulting in a very high-fidelity model of the system unachievable by most deterministic transport codes today. A space-angle convergence study was conducted to determine the meshing and angular cubature

  19. Software module for geometric product modeling and NC tool path generation

    The intelligent CAD/CAM system named VIRTUAL MANUFACTURE has been created. It consists of four intelligent software modules: the module for virtual NC machine creation, the module for geometric product modeling and automatic NC path generation, the module for virtual NC machining, and the module for virtual product evaluation. In this paper the second intelligent software module is presented. This module enables feature-based product modeling, carried out via automatic saving of the designed product's geometric features as knowledge data. The knowledge data are afterwards applied for automatic generation of the NC program for machining the designed product. (Author)
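
As a hedged illustration of the module's role (stored feature data in, NC program out), a geometric feature saved as knowledge data can be expanded into a G-code contour path. The feature dictionary, feed rate, and single-pass pocket strategy below are hypothetical, not the system's actual representation.

```python
# Sketch: a rectangular pocket feature (a dict of dimensions in mm)
# is turned into a simple G-code contour toolpath.

def pocket_toolpath(feature, feed=200):
    """feature: {'x0', 'y0', 'width', 'height', 'depth'} in mm."""
    x0, y0 = feature["x0"], feature["y0"]
    x1 = x0 + feature["width"]
    y1 = y0 + feature["height"]
    z = -feature["depth"]
    return [
        "G21 ; millimetre units",
        f"G0 X{x0:.3f} Y{y0:.3f} ; rapid to pocket corner",
        f"G1 Z{z:.3f} F{feed} ; plunge to depth",
        f"G1 X{x1:.3f} Y{y0:.3f}",
        f"G1 X{x1:.3f} Y{y1:.3f}",
        f"G1 X{x0:.3f} Y{y1:.3f}",
        f"G1 X{x0:.3f} Y{y0:.3f} ; close contour",
        "G0 Z5.000 ; retract",
    ]

nc = pocket_toolpath({"x0": 10, "y0": 10, "width": 40, "height": 20, "depth": 2})
```

A production module would of course add tool compensation, stepover passes and machine-specific post-processing; the sketch only shows the feature-to-path mapping idea.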

  20. 2nd International Open and Distance Learning (IODL) Symposium

    Reviewed by Murat BARKAN

    2006-10-01

    These closing remarks were prepared and presented by Prof. Dr. Murat BARKAN, Anadolu University, Eskisehir, TURKEY. DEAR GUESTS, As the 2nd International Open and Distance Learning (IODL) Symposium is now drawing to an end, I would like to thank you all for your outstanding speeches, distinguished presentations, constructive roundtable and session discussions, and active participation during the last five days. I hope you all share my view that the whole symposium has been a very stimulating and successful experience. Also, on behalf of all the participants, I would like to take this opportunity to thank and congratulate the symposium organization committee for their excellent job in organizing and hosting our 2nd meeting. Throughout the symposium, five workshops, six keynote speeches and 66 papers, which were prepared by more than 150 academicians and practitioners from 23 different countries, reflected remarkable and various views and approaches to open and flexible learning. Besides all these academic endeavors, 13 educational films were displayed during the symposium. The technology exhibition, hosted by seven companies, was very effective in showcasing the current level of the technology and the success of applying theory into practice. Now I would like to go over what our workshop and keynote presenters shared with us: Prof. Marina McIsaac from Arizona State University dwelled on how to determine research topics worthwhile to be examined and how to choose appropriate research design and methods. She gave us clues on how to get articles published in professional journals. Prof. Colin Latchem from Australia and Prof. Dr. Ali Ekrem Ozkul from Anadolu University pointed to the importance of strategic planning for distance and flexible learning. They highlighted the advantages of strategic planning for policy-makers, planners, managers and staff. Dr. Wolfram Laaser from Fern University of Hagen presented different multimedia clips and

  1. 2nd international conference on advanced nanomaterials and nanotechnology

    Goswami, D; Perumal, A

    2013-01-01

    Nanoscale science and technology have occupied centre stage globally in modern scientific research and discourses in the early twenty-first century. The enabling nature of the technology makes it important in modern electronics, computing, materials, healthcare, energy and the environment. This volume contains selected articles presented (as invited/oral/poster presentations) at the 2nd international conference on advanced materials and nanotechnology (ICANN-2011), held recently at the Indian Institute of Technology Guwahati during Dec 8-10, 2011. The topics covered in these proceedings include: synthesis and self-assembly of nanomaterials; nanoscale characterisation; nanophotonics & nanoelectronics; nanobiotechnology; nanocomposites; nanomagnetism; nanomaterials for energy; computational nanotechnology; and commercialization of nanotechnology. The conference was represented by around 400 participants from several countries, including delegates invited from USA, Germany, Japan, UK, Taiwan, Italy, Singapor...

  2. Isotope effects on vapour phase 2nd viral coefficients

    Vapor phase 2nd virial coefficient isotope effects (VCIE's) are interpreted. A useful correlation is developed between -Δ(B-b0)/(B-b0) = -VCIE and the reference condensed phase reduced isotopic partition function ratio [ln(fc/fg)]*. B is the second virial coefficient, b0 = 2πσ3/3, σ is the Lennard-Jones size parameter, and Δ is an isotopic difference, light-heavy. [ln(fc/fg)]* can be obtained from vapor pressure isotope effects for T/TCRITICAL p/f2g), where ln(fp/f2g) is the reduced isotopic partition function ratio describing the equilibrium between monomers and interacting pairs. At temperatures well removed from crossovers in ln(fp/f2g) or [ln(fc/fg)]*, ln(fp/f2g) = (0.4±0.2)[ln(fc/fg)]*. (author)

  3. 2nd International Conference on NeuroRehabilitation

    Andersen, Ole; Akay, Metin

    2014-01-01

    The book is the proceedings of the 2nd International Conference on NeuroRehabilitation (ICNR 2014), held 24th-26th June 2014 in Aalborg, Denmark. The conference featured the latest highlights in the emerging and interdisciplinary field of neural rehabilitation engineering and identified important healthcare challenges the scientific community will be faced with in the coming years. Edited and written by leading experts in the field, the book includes keynote papers, regular conference papers, and contributions to special and innovation sessions, covering the following main topics: neuro-rehabilitation applications and solutions for restoring impaired neurological functions; cutting-edge technologies and methods in neuro-rehabilitation; and translational challenges in neuro-rehabilitation. Thanks to its highly interdisciplinary approach, the book will not only be a  highly relevant reference guide for academic researchers, engineers, neurophysiologists, neuroscientists, physicians and physiotherapists workin...

  4. 2nd Colombian Congress on Computational Biology and Bioinformatics

    Cristancho, Marco; Isaza, Gustavo; Pinzón, Andrés; Rodríguez, Juan

    2014-01-01

    This volume compiles accepted contributions for the 2nd Edition of the Colombian Computational Biology and Bioinformatics Congress CCBCOL, after a rigorous review process in which 54 papers were accepted for publication from 119 submitted contributions. Bioinformatics and Computational Biology are areas of knowledge that have emerged due to advances in the Biological Sciences and their integration with the Information Sciences. The expansion of projects involving the study of genomes has led the way in the production of vast amounts of sequence data, which need to be organized, analyzed and stored in order to understand phenomena associated with living organisms, related to their evolution and behavior in different ecosystems, and to the development of applications that can be derived from this analysis.

  5. 2nd International Afro-European Conference for Industrial Advancement

    Wegrzyn-Wolska, Katarzyna; Hassanien, Aboul; Snasel, Vaclav; Alimi, Adel

    2016-01-01

    This volume contains papers presented at the 2nd International Afro-European Conference for Industrial Advancement -- AECIA 2015. The conference aimed at bringing together the foremost experts and excellent young researchers from Africa, Europe and the rest of the world to disseminate the latest results from various fields of engineering, information, and communication technologies. The topics discussed at the conference covered a broad range of domains, spanning from ICT and engineering to the prediction, modeling, and analysis of complex systems. The 2015 edition of AECIA featured a distinguished special track on prediction, modeling and analysis of complex systems -- Nostradamus -- and special sessions on Advances in Image Processing and Colorization and Data Processing, Protocols, and Applications in Wireless Sensor Networks.

  6. 2nd CEAS Specialist Conference on Guidance, Navigation and Control

    Mulder, Bob; Choukroun, Daniel; Kampen, Erik-Jan; Visser, Coen; Looye, Gertjan

    2013-01-01

    Following the successful 1st CEAS (Council of European Aerospace Societies) Specialist Conference on Guidance, Navigation and Control (CEAS EuroGNC) held in Munich, Germany in 2011, Delft University of Technology happily accepted the invitation to organize the 2nd CEAS EuroGNC in Delft, The Netherlands in 2013. The goal of the conference is to promote new advances in aerospace GNC theory and technologies for enhancing the safety, survivability, efficiency, performance, autonomy and intelligence of aerospace systems using on-board sensing, computing and systems. A great push for new developments in GNC is the ever higher safety and sustainability requirements in aviation. Impressive progress was made in new research fields such as sensor and actuator fault detection and diagnosis, reconfigurable and fault-tolerant flight control, online safe flight envelope prediction and protection, online global aerodynamic model identification, online global optimization and flight upset recovery. All of these challenges de...

  7. 2nd International Multidisciplinary Microscopy and Microanalysis Congress

    Oral, Ahmet; Ozer, Mehmet

    2015-01-01

    The 2nd International Multidisciplinary Microscopy and Microanalysis Congress & Exhibition (InterM 2014) was held on 16–19 October 2014 in Oludeniz, Fethiye/Mugla, Turkey. The aim of the congress was to gather scientists from various branches and discuss the latest improvements in the field of microscopy. The focus of the congress has been widened in an "interdisciplinary" manner, so as to allow all scientists working on several related subjects to participate and present their work. These proceedings include 33 peer-reviewed technical papers, submitted by leading academic and research institutions from over 17 countries and representing some of the most cutting-edge research available. The papers were presented at the congress in two sessions: Applications of Microscopy in the Physical Sciences, and Applications of Microscopy in the Biological Sciences.

  8. 2nd International Conference on Communication and Computer Engineering

    Othman, Mohd; Othman, Mohd; Rahim, Yahaya; Pee, Naim

    2016-01-01

    This book covers diverse aspects of advanced computer and communication engineering, focusing specifically on industrial and manufacturing theory and applications of electronics, communications, computing and information technology. Experts in research, industry, and academia present the latest developments in technology, describe applications involving cutting-edge communication and computer systems, and explore likely future trends. In addition, a wealth of new algorithms that assist in solving computer and communication engineering problems are presented. The book is based on presentations given at ICOCOE 2015, the 2nd International Conference on Communication and Computer Engineering. It will appeal to a wide range of professionals in the field, including telecommunication engineers, computer engineers and scientists, researchers, academics and students.

  9. 2nd International Conference on Harmony Search Algorithm

    Geem, Zong

    2016-01-01

    The Harmony Search Algorithm (HSA) is one of the most well-known techniques in the field of soft computing, an important paradigm in the science and engineering community.  This volume, the proceedings of the 2nd International Conference on Harmony Search Algorithm 2015 (ICHSA 2015), brings together contributions describing the latest developments in the field of soft computing with a special focus on HSA techniques. It includes coverage of new methods that have potentially immense application in various fields. Contributed articles cover aspects of the following topics related to the Harmony Search Algorithm: analytical studies; improved, hybrid and multi-objective variants; parameter tuning; and large-scale applications.  The book also contains papers discussing recent advances on the following topics: genetic algorithms; evolutionary strategies; the firefly algorithm and cuckoo search; particle swarm optimization and ant colony optimization; simulated annealing; and local search techniques.   This book ...

  10. 2nd International Conference on Construction and Building Research

    Fernández-Plazaola, Igor; Hidalgo-Delgado, Francisco; Martínez-Valenzuela, María; Medina-Ramón, Francisco; Oliver-Faubel, Inmaculada; Rodríguez-Abad, Isabel; Salandin, Andrea; Sánchez-Grandia, Rafael; Tort-Ausina, Isabel; Construction and Building Research

    2014-01-01

    Many areas of knowledge converge in the building industry and therefore research in this field necessarily involves an interdisciplinary approach. Effective research requires strong relations between a broad variety of scientific and technological domains and more conventional construction or craft processes, while also considering advanced management processes, where all the main actors permanently interact. This publication takes an interdisciplinary approach grouping various studies on the building industry chosen from among the works presented for the 2nd International Conference on Construction and Building Research. The papers examine aspects of materials and building systems; construction technology; energy and sustainability; construction management; heritage, refurbishment and conservation. The information contained within these pages may be of interest to researchers and practitioners in construction and building activities from the academic sphere, as well as public and private sectors.

  11. 2nd International Congress on Neurotechnology, Electronics and Informatics

    Encarnação, Pedro

    2016-01-01

    This book is a timely report on current neurotechnology research. It presents a snapshot of the state of the art in the field, discusses current challenges and identifies new directions. The book includes a selection of extended and revised contributions presented at the 2nd International Congress on Neurotechnology, Electronics and Informatics (NEUROTECHNIX 2014), held October 25-26 in Rome, Italy. The chapters are varied: some report on novel theoretical methods for studying neuronal connectivity or neural system behaviour; others report on advanced technologies developed for similar purposes; while further contributions concern new engineering methods and technological tools supporting medical diagnosis and neurorehabilitation. All in all, this book provides graduate students, researchers and practitioners dealing with different aspects of neurotechnologies with a unified view of the field, thus fostering new ideas and research collaborations among groups from different disciplines.

  12. Advanced User Interface Generation in the Software Framework for Magnetic Measurements at CERN

    Arpaia, P; La Commara, Giuseppe; Arpaia, Pasquale

    2010-01-01

    A model-based approach, the Model-View-Interactor Paradigm, for automatic generation of user interfaces in software frameworks for measurement systems is proposed. The Model-View-Interactor Paradigm is focused on the "interaction" typical in a software framework for measurement applications: the final user interacts with the automatic measurement system by executing a suitable high-level script previously written by a test engineer. According to the main design goal of frameworks, the proposed approach allows the user interfaces to be easily separated from the application logic, enhancing the flexibility and reusability of the software. As a practical case study, this approach has been applied to the flexible software framework for magnetic measurements at the European Organization for Nuclear Research (CERN). In particular, experimental results for the scenario of permeability measurements are reported.
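The separation the Model-View-Interactor paradigm describes can be caricatured in a few lines. Everything here (class names, the mock measurement, the script format) is hypothetical and only illustrates how a user-level script drives the model through an interactor while the view stays decoupled from the application logic:

```python
class Model:
    """Application logic: performs a (mock) permeability measurement."""
    def measure(self, current_a):
        return 0.42 * current_a  # stand-in for real data acquisition

class View:
    """Presentation only: knows nothing about measurement logic."""
    def show(self, label, value):
        print(f"{label}: {value:.3f}")

class Interactor:
    """Binds a user-level script to model and view, keeping them decoupled."""
    def __init__(self, model, view):
        self.model, self.view = model, view
    def run_script(self, script):
        # each step of the hypothetical script: (label, excitation current in A)
        for label, current in script:
            self.view.show(label, self.model.measure(current))

Interactor(Model(), View()).run_script([("sample-1", 1.0), ("sample-2", 2.5)])
```

Swapping in a different View (GUI, web, logger) requires no change to Model or to the scripts, which is the reusability argument the abstract makes.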

  13. New Software Developments for Quality Mesh Generation and Optimization from Biomedical Imaging Data

    Yu, Zeyun; Wang, Jun; Gao, Zhanheng; Xu, Ming; Hoshijima, Masahiko

    2013-01-01

    In this paper we present a new software toolkit for generating and optimizing surface and volumetric meshes from three-dimensional (3D) biomedical imaging data, targeted at image-based finite element analysis of some biomedical activities in a single material domain. Our toolkit includes a series of geometric processing algorithms including surface re-meshing and quality-guaranteed tetrahedral mesh generation and optimization. All methods described have been encapsulated into a user-friendly ...

  14. Using Genetic Algorithm for Automated Efficient Software Test Case Generation for Path Testing

    Premal B. Nirpal

    2011-05-01

    This paper discusses genetic algorithms that can automatically generate test cases to test selected path. This algorithm takes a selected path as a target and executes sequences of operators iteratively for test cases to evolve. The evolved test case can lead the program execution to achieve the target path. An automatic path-oriented test data generation is not only a crucial problem but also a hot issue in the research area of software testing today.
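A minimal sketch of the idea, assuming a toy program whose target path is taken when the input equals a particular value. The branch-distance fitness, the midpoint crossover, and all constants are illustrative choices, not the authors' algorithm:

```python
import random

random.seed(1)

# Hypothetical program under test: the target path is taken when x == 73.
def branch_distance(x):
    """Fitness: 0 when the input drives execution down the target branch."""
    return abs(x - 73)

def evolve(pop_size=20, generations=100, lo=0, hi=1000):
    """Evolve an integer input until the target path is covered."""
    pop = [random.randint(lo, hi) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=branch_distance)
        if branch_distance(pop[0]) == 0:
            return pop[0]                 # test input covering the target path
        parents = pop[: pop_size // 2]    # selection: keep the fitter half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = (a + b) // 2          # crossover: midpoint of two parents
            if random.random() < 0.3:     # mutation: small perturbation
                child += random.randint(-5, 5)
            children.append(max(lo, min(hi, child)))
        pop = parents + children
    return min(pop, key=branch_distance)

print(evolve())  # with luck, prints 73
```

Real path-oriented generators replace `branch_distance` with an instrumented execution of the program under test, accumulating distances over every branch on the target path.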

  15. Minimal Testcase Generation for Object-Oriented Software with State Charts

    Ranjita Kumari Swain

    2012-08-01

    Today statecharts are a de facto standard in industry for modeling system behavior, and test data generation is one of the key issues in software testing. This paper proposes a reduction approach to test data generation for state-based software testing. First, a state transition graph is derived from the statechart diagram. Then all the required information is extracted from the statechart diagram and test cases are generated. Lastly, the set of test cases is minimized by calculating the node coverage of each test case; it is also determined which test cases are covered by other test cases. The advantage of our test generation technique is that it optimizes test coverage while minimizing time and cost. The present test data generation scheme generates test cases which satisfy transition path coverage criteria, path coverage criteria and action coverage criteria. A case study on a Railway Ticket Vending Machine (RTVM) has been presented to illustrate our approach.
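The minimization step described above is essentially a set-cover problem: keep the fewest test cases whose combined node coverage equals that of the full suite. A greedy sketch over hypothetical coverage data (test names and node labels are invented) might look like this:

```python
def minimize(suite):
    """Greedy set cover: keep the fewest test cases covering all nodes.

    suite: dict mapping test-case name -> set of state-graph nodes it covers.
    """
    remaining = set().union(*suite.values())
    chosen = []
    while remaining:
        # pick the test case covering the most still-uncovered nodes
        best = max(suite, key=lambda t: len(suite[t] & remaining))
        if not suite[best] & remaining:
            break
        chosen.append(best)
        remaining -= suite[best]
    return chosen

# Hypothetical coverage data for a ticket-vending-machine state graph
suite = {
    "t1": {"idle", "select", "pay"},
    "t2": {"select", "pay", "print"},
    "t3": {"idle", "print"},
    "t4": {"pay"},
}
print(minimize(suite))  # ['t1', 't2']
```

Here `t3` and `t4` are dropped because every node they cover is already covered by `t1` and `t2`, which is the "covered by other test cases" check the abstract mentions.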

  16. Software R&D for Next Generation of HEP Experiments, Inspired by Theano

    CERN. Geneva

    2015-01-01

    In the next decade, the frontiers of High Energy Physics (HEP) will be explored by three machines: the High Luminosity Large Hadron Collider (HL-LHC) in Europe, the Long-Baseline Neutrino Facility (LBNF) in the US, and the International Linear Collider (ILC) in Japan. These next generation experiments must address two fundamental problems in the current generation of HEP experimental software: the inability to take advantage of and adapt to the rapidly evolving processor landscape, and the difficulty of developing and maintaining increasingly complex software systems by physicists. I will propose a strategy, inspired by the automatic optimization and code generation in Theano, to simultaneously address both problems. I will describe three R&D projects with short-term physics deliverables aimed at developing this strategy. The first project is to develop a maximally sensitive General Search for New Physics at the LHC by applying the Matrix Element Method running on GPUs of HPCs. The second is to classify and reconstru...
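The Theano-style strategy referenced here (declare a symbolic computation graph, optimize it, then evaluate or generate code for a target processor) can be caricatured in a toy sketch. The graph representation and the constant-folding pass below are illustrative inventions, not the talk's actual design:

```python
class Node:
    """A node in a symbolic expression graph."""
    def __init__(self, op, args=(), value=None):
        self.op, self.args, self.value = op, tuple(args), value

def const(v): return Node("const", value=v)
def var(name): return Node("var", value=name)
def add(a, b): return Node("add", (a, b))
def mul(a, b): return Node("mul", (a, b))

OPS = {"add": lambda x, y: x + y, "mul": lambda x, y: x * y}

def simplify(n):
    """Optimization pass: fold sub-graphs whose inputs are all constants."""
    if n.op in ("const", "var"):
        return n
    args = [simplify(a) for a in n.args]
    if all(a.op == "const" for a in args):
        return const(OPS[n.op](args[0].value, args[1].value))
    return Node(n.op, args)

def evaluate(n, env):
    """Interpret the graph; a code generator would emit C/CUDA here instead."""
    if n.op == "const": return n.value
    if n.op == "var":   return env[n.value]
    return OPS[n.op](*(evaluate(a, env) for a in n.args))

expr = add(mul(const(2), const(3)), var("x"))  # symbolic 2*3 + x
opt = simplify(expr)                           # folds 2*3 into 6
print(opt.op, evaluate(opt, {"x": 4}))         # add 10
```

The point of the indirection is that the same graph can be optimized once and then lowered to whatever processor is available, which is the adaptability argument in the abstract.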

  17. Reliable Mining of Automatically Generated Test Cases from Software Requirements Specification (SRS)

    Raamesh, Lilly

    2010-01-01

    Writing requirements is a two-way process. In this paper we classify Functional Requirements (FR) and Non-Functional Requirements (NFR) statements from Software Requirements Specification (SRS) documents. These are systematically transformed into state charts considering all relevant information. The current paper outlines how test cases can be automatically generated from these state charts. The application of the states yields the different test cases as solutions to a planning problem. The test cases can be used for automated or manual software testing on the system level. The paper also presents a method for reducing the test suite by using mining methods, thereby facilitating mining and knowledge extraction from test cases.

  18. Photonic generation and independent steering of multiple RF signals for software defined radars.

    Ghelfi, Paolo; Laghezza, Francesco; Scotti, Filippo; Serafino, Giovanni; Pinna, Sergio; Bogoni, Antonella

    2013-09-23

    As the improvement of radar systems calls for digital approaches, photonics is becoming a solution for software-defined, high-frequency and high-stability signal generation. We report on our recent activities on the photonic generation of flexible wideband RF signals, extending the proposed architecture to the independent optical beamforming of multiple signals. The scheme has been tested by generating two wideband signals at 10 GHz and 40 GHz and controlling their independent delays at two antenna elements. Thanks to the multiple functionalities, the proposed scheme improves the effectiveness of the photonic approach, reducing its cost while allowing flexibility, extremely wide bandwidth, and high stability. PMID:24104176

  19. The design of real time infrared image generation software based on Creator and Vega

    Wang, Rui-feng; Wu, Wei-dong; Huo, Jun-xiu

    2013-09-01

    To meet the requirements for realistic, real-time dynamic infrared imagery in infrared image simulation, a method for designing a real-time infrared image simulation application on the VC++ platform is proposed, based on the visual simulation software Creator and Vega. The functions of Creator are introduced briefly, and the main features of the Vega development environment are analyzed. Methods for infrared modeling of targets and backgrounds are offered, the flow chart of the development process of the IR image real-time generation software is given, and the functions of the TMM Tool, the MAT Tool and the sensor module are explained; the real-time behavior of the software is also addressed in the design.

  20. Book review: Psychology in a work context (2nd Ed.

    Nanette Tredoux

    2003-10-01

    Bergh, Z. & Theron, A.L. (Eds.) (2003). Psychology in a work context (2nd Ed.). Cape Town: Oxford University Press. This book is an overview of and introduction to Industrial and Organisational Psychology. It is a work of ambitious scope, and it is clear that the contributors have invested a great deal of thought and effort in the planning and execution of the book. The current version is the second edition, and it looks set to become one of those standard textbooks that are revised every few years to keep up with changing times. It is a handsome volume, produced to a high standard of editorial care, pleasingly laid out and organised well enough to be useful as an occasional reference source. An English-Afrikaans glossary, tables of contents for every chapter as well as for the entire book, a comprehensive index and an extensive bibliography make it easy to retrieve the information relating to a particular topic. Every chapter ends with a conclusion summarising the gist of the material covered. Quality illustrations lighten the tone and help to bring some of the concepts to life. Learning outcomes and self-assessment exercises and questions for every chapter will be useful to the lecturer using the book as a source for a tutored course, and to the student studying by distance learning. If sold at the suggested retail price, the book represents good value compared to imported textbooks that cover similar ground.

  1. PREFACE: 2nd International Symposium "Optics and its Applications"

    Calvo, Maria L.; Dolganova, Irina N.; Gevorgyan, Narine; Guzman, Angela; Papoyan, Aram; Sarkisyan, Hayk; Yurchenko, Stanislav

    2016-01-01

    The ICTP smr2633: 2nd International Symposium "Optics and its Applications" (OPTICS-2014) http://indico.ictp.it/event/a13253/ was held in Yerevan and Ashtarak, Armenia, on 1-5 September 2014. The Symposium was organized by the Abdus Salam International Center for Theoretical Physics (ICTP) with the collaboration of the SPIE Armenian Student Chapter, the Armenian TC of ICO, the Russian-Armenian University (RAU), the Institute for Physical Research of the National Academy of Sciences of Armenia (IPR of NAS), the Greek-Armenian industrial company LT-Pyrkal, and the Yerevan State University (YSU). The Symposium was co-organized by the BMSTU SPIE & OSA student chapters. The International Symposium OPTICS-2014 was dedicated to the 50th anniversary of the Abdus Salam International Center for Theoretical Physics. This symposium "Optics and its Applications" was the First Official ICTP Scientific Event in Armenia. The presentations at OPTICS-2014 were centered on these topics: optical properties of nanostructures; quantum optics & information; singular optics and its applications; laser spectroscopy; strong field optics; nonlinear & ultrafast optics; photonics & fiber optics; optics of liquid crystals; and mathematical methods in optics.

  2. APTWG: 2nd Asia-Pacific Transport Working Group Meeting

    This conference report summarizes the contributions to and discussions at the 2nd Asia-Pacific Transport Working Group Meeting held in Chengdu, China, from 15 to 18 May 2012. The topics of the meeting were organized under five main headings: momentum transport, non-locality in transport, edge turbulence and L–H transition, three-dimensional effects on transport physics, and particle, momentum and heat pinches. It is found that lower hybrid waves and ion cyclotron waves induce co-current rotation while electron cyclotron waves induce counter-current rotation. A four-stage imaging for the low (L) to high (H) confinement transition gradually emerges, and a more detailed verification is urgently expected. The new edge-localized mode mitigation technique with supersonic molecular beam injection proved to be effective to some extent on HL-2A and KSTAR. It is also found that low collisionality, trapped electron mode to ion temperature gradient transition (or transition of higher to lower density and temperature gradients), fuelling and lithium coating are in favour of inward pinch of particles in tokamak plasmas. (paper)

  3. Next generation of decision making software for nanopatterns characterization: application to semiconductor industry

    Dervilllé, A.; Labrosse, A.; Zimmermann, Y.; Foucher, J.; Gronheid, R.; Boeckx, C.; Singh, A.; Leray, P.; Halder, S.

    2016-03-01

    The dimensional scaling in IC manufacturing strongly drives the demands on CD and defect metrology techniques and their measurement uncertainties. Defect review has become as important as CD metrology, and together they create a new metrology paradigm with a completely new need for flexible, robust and scalable metrology software. Current software architectures and metrology algorithms perform well, but they must be pushed to a higher level in order to keep pace with roadmap speed and requirements, for example: managing defects and CD in a one-step algorithm, customizing algorithms and output features for each R&D team environment, and providing software updates every day or every week so that R&D teams can easily explore various development strategies. The final goal is to avoid spending hours and days manually tuning algorithms to analyze metrology data, and to allow R&D teams to stay focused on their expertise. The benefits are drastic cost reductions, more efficient R&D teams and better process quality. In this paper, we propose a new generation of software platform and development infrastructure which can integrate specific metrology business modules; for example, we will show the integration of a chemistry module dedicated to electronic materials such as Directed Self-Assembly features. We will show a new generation of image analysis algorithms able to manage defect rates, image classification, CD and roughness measurements at the same time, with high-throughput performance compatible with HVM. In a second part, we will assess the reliability and customizability of the algorithms and the software platform's capability to meet new semiconductor metrology software requirements: flexibility, robustness, high throughput and scalability. Finally, we will demonstrate how such an environment has allowed a drastic reduction of the data analysis cycle time.

  4. Software for Generating Troposphere Corrections for InSAR Using GPS and Weather Model Data

    Moore, Angelyn W.; Webb, Frank H.; Fishbein, Evan F.; Fielding, Eric J.; Owen, Susan E.; Granger, Stephanie L.; Bjoerndahl, Fredrik; Loefgren, Johan; Fang, Peng; Means, James D.; Bock, Yehuda; Tong, Xiaopeng

    2013-01-01

    Atmospheric errors due to the troposphere are a limiting error source for spaceborne interferometric synthetic aperture radar (InSAR) imaging. This software generates tropospheric delay maps that can be used to correct atmospheric artifacts in InSAR data. The software automatically acquires all needed GPS (Global Positioning System), weather, and Digital Elevation Map data, and generates a tropospheric correction map using a novel algorithm for combining GPS and weather information while accounting for terrain. Existing JPL software was prototypical in nature, required a MATLAB license, required additional steps to acquire and ingest needed GPS and weather data, and did not account for topography in interpolation. Previous software did not achieve a level of automation suitable for integration in a Web portal. This software overcomes these issues. GPS estimates of tropospheric delay are a source of corrections that can be used to form correction maps to be applied to InSAR data, but the spacing of GPS stations is insufficient to remove short-wavelength tropospheric artifacts. This software combines interpolated GPS delay with weather model precipitable water vapor (PWV) and a digital elevation model to account for terrain, increasing the spatial resolution of the tropospheric correction maps and thus removing short wavelength tropospheric artifacts to a greater extent. It will be integrated into a Web portal request system, allowing use in a future L-band SAR Earth radar mission data system. This will be a significant contribution to its technology readiness, building on existing investments in in situ space geodetic networks, and improving timeliness, quality, and science value of the collected data.
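The terrain-aware interpolation step can be sketched as follows, assuming inverse-distance weighting of sea-level-reduced GPS zenith delays and a simple exponential height dependence. The scale height, station coordinates, and delay values are invented for illustration and are not the algorithm of this software:

```python
import math

H_SCALE = 8000.0  # hypothetical delay scale height, metres

def delay_at(x, y, h, stations):
    """Interpolate GPS zenith delays to (x, y), then rescale to elevation h.

    stations: list of (x_m, y_m, elev_m, zenith_delay_m).
    Each delay is first reduced to sea level so that stations at different
    elevations are comparable, then inverse-distance weighted, then scaled
    back up/down to the target elevation from a DEM.
    """
    num = den = 0.0
    for sx, sy, sh, d in stations:
        d0 = d * math.exp(sh / H_SCALE)            # reduce to sea level
        w = 1.0 / ((x - sx) ** 2 + (y - sy) ** 2 + 1e-12)
        num += w * d0
        den += w
    return (num / den) * math.exp(-h / H_SCALE)    # rescale to DEM height

# Hypothetical stations on a 100 km grid (coordinates in metres)
stations = [(0, 0, 100, 2.40), (100e3, 0, 500, 2.28), (0, 100e3, 1500, 2.02)]
print(round(delay_at(50e3, 50e3, 800, stations), 3))
```

A production system would additionally blend in weather-model PWV between stations, which is exactly the short-wavelength gap this record describes.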

  5. Software architecture for control and data acquisition of linear plasma generator Magnum-PSI

    Highlights: an architecture based on a modular design; the design offers flexibility and extendability; the design covers the overall software architecture as well as its (sub)systems' internal structure. Abstract: The FOM Institute DIFFER - Dutch Institute for Fundamental Energy Research has completed the construction phase of Magnum-PSI, a magnetized, steady-state, large area, high-flux linear plasma beam generator to study plasma surface interactions under ITER divertor conditions. Magnum-PSI consists of several hardware subsystems and a variety of diagnostic systems. The COntrol, Data Acquisition and Communication (CODAC) system integrates these subsystems and provides a complete interface for the Magnum-PSI users. Integrating it all, from the lowest hardware level of sensors and actuators, via the level of networked PLCs and computer systems, up to functions and classes in programming languages, demands a sound and modular software architecture which is extendable and scalable for future changes. This paper describes this architecture and the modular design of the software subsystems. The design is implemented in the CODAC system at the level of services and subsystems (the overall software architecture), as well as internally in the software subsystems.
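The modular, extendable structure described might be sketched as subsystems implementing a common interface and registering with a central coordinator. All class and subsystem names here are hypothetical, not the actual CODAC design:

```python
from abc import ABC, abstractmethod

class Subsystem(ABC):
    """Common interface every hardware or diagnostic subsystem implements."""
    @abstractmethod
    def start(self): ...
    @abstractmethod
    def stop(self): ...

class PlasmaSource(Subsystem):
    def start(self): return "plasma source: on"
    def stop(self):  return "plasma source: off"

class Diagnostics(Subsystem):
    def start(self): return "diagnostics: acquiring"
    def stop(self):  return "diagnostics: idle"

class Codac:
    """Central coordinator: integrates subsystems behind one user interface."""
    def __init__(self):
        self.subsystems = []
    def register(self, sub):
        self.subsystems.append(sub)   # new subsystems plug in here unchanged
    def start_all(self):
        return [s.start() for s in self.subsystems]

codac = Codac()
codac.register(PlasmaSource())
codac.register(Diagnostics())
print(codac.start_all())  # ['plasma source: on', 'diagnostics: acquiring']
```

Extendability falls out of the interface: a new diagnostic is a new `Subsystem` subclass plus one `register` call, with no change to the coordinator.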

  6. Resolution 519/012. The company R DEL SUR S.A. is authorized to generate electricity from a wind source at a generating power plant located in the 2nd and 4th Cadastral Sections of Maldonado province, as well as to connect it to the National Interconnected System

    Resolution 519 follows the Electric Wholesale Market regulation and authorizes power generation using wind as the primary source. The project was presented by the company R DEL SUR S.A. with the aim of installing a wind power plant in Maldonado province.

  7. NgsRelate: a software tool for estimating pairwise relatedness from next-generation sequencing data

    Korneliussen, Thorfinn Sand; Moltke, Ida

    2015-01-01

    MOTIVATION: Pairwise relatedness estimation is important in many contexts such as disease mapping and population genetics. However, all existing estimation methods are based on called genotypes, which is not ideal for next-generation sequencing (NGS) data of low depth, from which genotypes cannot be called with high certainty. RESULTS: We present a software tool, NgsRelate, for estimating pairwise relatedness from NGS data. It provides maximum likelihood estimates that are based on genotype lik...

  8. An automation of design and modelling tasks in NX Siemens environment with original software - generator module

    Zbiciak, M.; Grabowik, C.; Janik, W.

    2015-11-01

    Nowadays the constructional design process is almost exclusively aided by CAD/CAE/CAM systems. It is estimated that nearly 80% of design activities have a routine nature, and these routine design tasks are highly susceptible to automation. Design automation is usually implemented with API tools which allow building original software supporting different engineering activities. In this paper, original software worked out to automate engineering tasks at the stage of designing a product's geometrical shape is presented. The software works exclusively in the NX Siemens CAD/CAM/CAE environment and was prepared in Microsoft Visual Studio with application of the .NET technology and the NX SNAP library. The software's functionality allows designing and modelling of spur and helical involute gears; moreover, it is possible to estimate relative manufacturing costs. With the Generator module it is possible to design and model both standard and non-standard gear wheels. The main advantage of a model generated in this way is its better representation of the involute curve in comparison to those drawn in standard CAD system tools. This comes from the fact that in CAD systems an involute curve is usually drawn through 3 points, corresponding to points located on the addendum circle, the reference diameter of the gear and the base circle, respectively. In the Generator module the involute curve is drawn through 11 points located on and above the base and addendum circles, so the 3D gear wheel models are highly accurate. Application of the Generator module makes the modelling process very rapid, reducing the gear wheel modelling time to several seconds. During the conducted research, an analysis of the differences between standard 3-point and 11-point involutes was made. The results and conclusions drawn from this analysis are presented in detail.
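The multi-point involute construction can be illustrated directly from the standard involute parametrization x = r_b(cos t + t sin t), y = r_b(sin t - t cos t), where r_b is the base-circle radius; sampling 11 parameter values from the base circle out to the addendum circle gives the denser point set the abstract describes. The radii below are illustrative, not taken from the paper:

```python
import math

def involute_points(r_base, r_addendum, n=11):
    """Sample n points of an involute from the base circle to the addendum.

    At parameter t the point lies at radius r_base * sqrt(1 + t^2), so the
    addendum circle is reached at t_max = sqrt((r_addendum/r_base)^2 - 1).
    """
    t_max = math.sqrt((r_addendum / r_base) ** 2 - 1.0)
    pts = []
    for i in range(n):
        t = t_max * i / (n - 1)
        x = r_base * (math.cos(t) + t * math.sin(t))
        y = r_base * (math.sin(t) - t * math.cos(t))
        pts.append((x, y))
    return pts

# Illustrative gear: base radius 40 mm, addendum radius 48 mm
pts = involute_points(40.0, 48.0)
print(len(pts), round(math.hypot(*pts[0]), 3), round(math.hypot(*pts[-1]), 3))
# 11 40.0 48.0
```

A 3-point spline through only the base, reference and addendum radii would cut corners between these samples, which is the accuracy gap the Generator module closes.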

  9. Aptaligner: Automated Software for Aligning Pseudorandom DNA X-Aptamers from Next-Generation Sequencing Data

    Lu, Emily; Elizondo-Riojas, Miguel-Angel; Chang, Jeffrey T.; Volk, David E.

    2014-01-01

    Next-generation sequencing results from bead-based aptamer libraries have demonstrated that traditional DNA/RNA alignment software is insufficient. This is particularly true for X-aptamers containing specialty bases (W, X, Y, Z, ...) that are identified by special encoding. Thus, we sought an automated program that uses the inherent design scheme of bead-based X-aptamers to create a hypothetical reference library and Markov modeling techniques to provide improved alignments. Aptaligner provid...

  10. Simulation of photovoltaic systems electricity generation using homer software in specific locations in Serbia

    Pavlović Tomislav M.; Milosavljević Dragana D.; Pirsl Danica S.

    2013-01-01

    In this paper, basic information about the Homer software for PV system electricity generation and about the NASA Surface Meteorology and Solar Energy, RETScreen, PVGIS and HMIRS (Hydrometeorological Institute of Republic of Serbia) solar databases is given. A comparison is made of the monthly average daily solar radiation per square meter received by a horizontal surface, taken from the NASA, RETScreen, PVGIS and HMIRS solar databases, for three locations in Serbia (Belgrade, Negotin and Zlati...

  11. Towards a Pattern-based Automatic Generation of Logical Specifications for Software Models

    Klimek, Radoslaw

    2014-01-01

    The work relates to the automatic generation of logical specifications, considered as sets of temporal logic formulas, extracted directly from developed software models. The extraction process is based on the assumption that the whole developed model is structured using only predefined workflow patterns. A method of automatic transformation of workflow patterns to logical specifications is proposed. Applying the presented concepts enables bridging the gap between the benefits of deductive rea...
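As an illustration of pattern-to-formula extraction, a Sequence workflow pattern might be mapped to temporal-logic formulas rendered as strings. The notation (G for "globally", F for "eventually") is standard LTL, but this particular mapping and the activity names are simplified assumptions, not the paper's full method:

```python
def sequence_to_ltl(activities):
    """Sequence pattern: each activity is eventually followed by the next.

    Emits one formula per consecutive pair, in conventional LTL notation:
    G(a -> F(b)) -- globally, doing a implies eventually doing b.
    """
    return [f"G({a} -> F({b}))"
            for a, b in zip(activities, activities[1:])]

print(sequence_to_ltl(["receive_order", "check_stock", "ship"]))
# ['G(receive_order -> F(check_stock))', 'G(check_stock -> F(ship))']
```

Other predefined patterns (parallel split, exclusive choice, loop) would each get their own template, and the union of the emitted formulas forms the logical specification handed to a prover.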

  12. Examples to Accompany "Descriptive Cataloging of Rare Books, 2nd Edition."

    Association of Coll. and Research Libraries, Chicago, IL.

    This book is intended to be used with "Descriptive Cataloging of Rare Books," 2nd edition (DCRB) as an illustrative aid to catalogers and others interested in or needing to interpret rare book cataloging. As such, it is to be used in conjunction with the rules it illustrates, both in DCRB and in "Anglo-American Cataloging Rules," 2nd edition…

  13. Development of a Hydrologic Characterization Technology for Fault Zones Phase II 2nd Report

    Karasaki, Kenzi [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Doughty, Christine [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Gasperikova, Erika [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Peterson, John [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Conrad, Mark [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Cook, Paul [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Tiemi, Onishi [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2011-03-31

    This is the 2nd report on the three-year program of the 2nd phase of the NUMO-LBNL collaborative project: Development of Hydrologic Characterization Technology for Fault Zones under NUMO-DOE/LBNL collaboration agreement. As such, this report is a compendium of the results by Kiho et al. (2011) and those by LBNL.

  14. SEMANTIC WEB-BASED SOFTWARE ENGINEERING BY AUTOMATED REQUIREMENTS ONTOLOGY GENERATION IN SOA

    Vahid Rastgoo

    2014-04-01

    Full Text Available This paper presents an approach for the automated generation of a requirements ontology using UML diagrams in service-oriented architecture (SOA). The goal is to facilitate software engineering processes such as software design, software reuse and service discovery. The proposed method is based on four conceptual layers. The first layer includes requirements obtained from stakeholders; the second designs service-oriented diagrams from the data in the first layer and extracts their XMI code. The third layer includes a requirements ontology and a protocol ontology to describe the behavior of services and the relationships between them semantically. Finally, the fourth layer standardizes the concepts that exist in the ontologies of the previous layer. The generated ontology goes beyond a pure domain ontology because it considers the behavior of services as well as their hierarchical relationships. Experimental results on a set of UML4SoA diagrams in different scopes demonstrate the improvement of the proposed approach from different points of view, such as completeness of the requirements ontology, automatic generation and consideration of SOA.

  15. Analysis of quality raw data of second generation sequencers with Quality Assessment Software

    2011-01-01

    Background Second generation technologies have advantages over Sanger; however, they have resulted in new challenges for the genome construction process, especially because of the small size of the reads, despite the high degree of coverage. Independent of the program chosen for the construction process, DNA sequences are superimposed, based on identity, to extend the reads, generating contigs; mismatches indicate a lack of homology and are not included. This process improves our confidence in the sequences that are generated. Findings We developed Quality Assessment Software, with which one can review graphs showing the distribution of quality values from the sequencing reads. This software allows us to adopt more stringent quality standards for sequence data, based on quality-graph analysis and estimated coverage after applying the quality filter, providing acceptable sequence coverage for genome construction from short reads. Conclusions Quality filtering is a fundamental step in the process of constructing genomes, as it reduces the frequency of incorrect alignments that are caused by measuring errors, which can occur during the construction process due to the size of the reads, provoking misassemblies. Application of quality filters to sequence data, using the software Quality Assessment, along with graphing analyses, provided greater precision in the definition of cutoff parameters, which increased the accuracy of genome construction. PMID:21501521
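The kind of cutoff-based filtering described above can be sketched in a few lines. This is not the Quality Assessment Software itself, just an illustration assuming Sanger-style FASTQ quality strings (Phred scores encoded with ASCII offset 33):

```python
# Minimal Phred-quality filter sketch (assumes ASCII-offset-33 encoding).

def mean_quality(qual):
    """Mean Phred score of a FASTQ quality string."""
    return sum(ord(c) - 33 for c in qual) / len(qual)

def filter_reads(reads, min_mean=20):
    """Keep (sequence, quality) pairs whose mean quality meets the cutoff."""
    return [(seq, q) for seq, q in reads if mean_quality(q) >= min_mean]

# 'I' encodes Q40, '!' encodes Q0:
reads = [("ACGT", "IIII"), ("ACGT", "!!!!")]
kept = filter_reads(reads, min_mean=20)  # only the high-quality read survives
```

Raising `min_mean` trades coverage for confidence, which is exactly the cutoff decision the quality graphs are meant to inform.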

  16. PREFACE: 2nd National Conference on Nanotechnology 'NANO 2008'

    Czuba, P.; Kolodziej, J. J.; Konior, J.; Szymonski, M.

    2009-03-01

    This issue of Journal of Physics: Conference Series contains selected papers presented at the 2nd National Conference on Nanotechnology 'NANO2008', that was held in Kraków, Poland, 25-28 June 2008. It was organized jointly by the Polish Chemical Society, Polish Physical Society, Polish Vacuum Society, and the Centre for Nanometer-scale Science and Advanced Materials (NANOSAM) of the Jagiellonian University. The meeting presentations were categorized into the following topics: 1. Nanomechanics and nanotribology 2. Characterization and manipulation in nanoscale 3. Quantum effects in nanostructures 4. Nanostructures on surfaces 5. Applications of nanotechnology in biology and medicine 6. Nanotechnology in education 7. Industrial applications of nanotechnology, presentations of the companies 8. Nanoengineering and nanomaterials (international sessions shared with the fellows of Maria-Curie Host Fellowships within the 6th FP of the European Community Project 'Nano-Engineering for Expertise and Development, NEED') 9. Nanopowders 10. Carbon nanostructures and nanosystems 11. Nanoelectronics and nanophotonics 12. Nanomaterials in catalysis 13. Nanospintronics 14. Ethical, social, and environmental aspects of nanotechnology The Conference was attended by 334 participants. The presentations were delivered as 7 invited plenary lectures, 25 invited topical lectures, 78 oral and 108 poster contributions. Only 1/6 of the contributions presented during the Conference were submitted for publication in this Proceedings volume. From the submitted material, this volume of Journal of Physics: Conference Series contains 37 articles that were positively evaluated by independent referees. The Organizing Committee gratefully acknowledges all these contributions. We also thank all the referees of the papers submitted for the Proceedings for their timely and thorough work. We would like to thank all members of the National Program Committee for their work in the selection process of

  17. Pragmatics Annotated Coloured Petri Nets for Protocol Software Generation and Verification

    Fagerland Simonsen, Kent Inge; Kristensen, Lars Michael; Kindler, Ekkart

    PetriCode is a tool that supports automated generation of protocol software from a restricted class of Coloured Petri Nets (CPNs) called Pragmatics Annotated Coloured Petri Nets (PA-CPNs). PetriCode and PA-CPNs have been designed with five main requirements in mind, which include the same model ... being used for verification and code generation. The PetriCode approach has been discussed and evaluated in earlier papers already. In this paper, we give a formal definition of PA-CPNs and demonstrate how the specific structure of PA-CPNs can be exploited for verification purposes.

  18. Library perceptions of using social software as blogs in the idea generation phase of service innovations

    Scupola, Ada; Nicolajsen, Hanne Westh

    2013-01-01

    This article investigates the use of social software such as blogs to communicate with and to involve users in the idea generation process of service innovations. After a theoretical discussion of user involvement and, more specifically, user involvement using web tools with a specific focus on blogs ..., the article reports findings and lessons from a field experiment at a university library. In the experiment, a blog was established to collect service innovation ideas from the library users. The experiment shows that a blog may engage a limited number of users in the idea generation process and...

  19. 2nd interface between ecology and land development in California

    Keeley, Jon E.; Baer-Keeley, Melanie; Fotheringham, C.J.

    2000-01-01

    The 2nd Interface Between Ecology and Land Development Conference was held in association with Earth Day 1997, five years after the first Interface Conference. Rapid population growth in California has intensified the inevitable conflict between land development and preservation of natural ecosystems. Sustainable development requires wise use of diminishing natural resources and, where possible, restoration of damaged landscapes. These Earth Week celebrations brought together resource managers, scientists, politicians, environmental consultants, and concerned citizens in an effort to improve the communication necessary to maintain our natural biodiversity, ecosystem processes and general quality of life. As discussed by our keynote speaker, Michael Soule, the best predictor of habitat loss is population growth, and nowhere is this better illustrated than in California. As urban perimeters expand, the interface between wildlands and urban areas increases. Few problems are more vexing than how to manage the fire-prone ecosystems indigenous to California at this urban interface. Today resource managers face increasing challenges in dealing with this problem, and the lead-off section of the proceedings considers both the theoretical basis for making decisions related to prescribed burning and the practical application. Habitat fragmentation is an inevitable consequence of development patterns, with significant impacts on animal and plant populations. Managers must be increasingly resourceful in dealing with problems of fragmentation and the often inevitable consequences, including susceptibility to invasive organisms. One approach to dealing with fragmentation problems is through careful land planning. California is the national leader in the integration of conservation and economics. On Earth Day 1991, Governor Pete Wilson presented an environmental agenda that promised to create, between landowners and environmentalists, agreements that would guarantee the protection of

  20. Pragmatics Annotated Coloured Petri Nets for Protocol Software Generation and Verification

    Simonsen, Kent Inge; Kristensen, Lars Michael; Kindler, Ekkart

    This paper presents the formal definition of Pragmatics Annotated Coloured Petri Nets (PA-CPNs). PA-CPNs represent a class of Coloured Petri Nets (CPNs) that are designed to support automated code generation of protocol software. PA-CPNs restrict the structure of CPN models and allow Petri net ... elements to be annotated with so-called pragmatics, which are exploited for code generation. The approach and tool for generating code is called PetriCode and has been discussed and evaluated in earlier work already. The contribution of this paper is to give a formal definition of PA-CPNs; in addition ..., we show how the structural restrictions of PA-CPNs can be exploited for making the verification of the modelled protocols more efficient. This is done by automatically deriving progress measures for the sweep-line method, and by introducing so-called service testers, that can be used to control the...

  1. Easy Steps to STAIRS. 2nd Revised Edition.

    National Library of Australia, Canberra.

    This manual for computer searchers describes the software package--IBM's STAIRS (Storage And Information Retrieval System)--used for searching databases in AUSINET (AUStralian Information NETwork). Whereas the first edition explained STAIRS in the context of the National Library's Online ERIC Project and the ERIC data base, this second edition…

  2. EVENT GENERATION OF STANDARD MODEL HIGGS DECAY TO DIMUON PAIRS USING PYTHIA SOFTWARE

    Yusof, Adib

    2015-01-01

    My project for the CERN Summer Student Programme 2015 is on event generation of Standard Model Higgs decay to dimuon pairs using the Pythia software. Briefly, Pythia, or specifically Pythia 8.1, is a program for the generation of high-energy physics events that is able to describe collisions at any given energy between elementary particles such as electrons, positrons, protons and antiprotons. It contains theory and models for a number of physics aspects, including hard and soft interactions, parton distributions, initial-state and final-state parton showers, multiparton interactions, fragmentation and decay. All programming code is written in the C++ language for this version (the previous version used FORTRAN) and can be linked to the ROOT software for displaying output in the form of histograms. For my project, I need to generate events for Standard Model Higgs boson decay into muon-antimuon pairs (H→μ⁺μ⁻) to study the expected significance value for this particular process at a centre-of-mass energy of 13 TeV...
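Pythia itself is a C++ library, but the analysis step behind such a study, reconstructing the Higgs peak from its decay products, reduces to computing the dimuon invariant mass. A minimal kinematics sketch with hypothetical four-momenta (not actual Pythia output):

```python
import math

# Invariant mass of a two-particle system from four-momenta (E, px, py, pz)
# in GeV; the momenta below are illustrative, not generated events.

def invariant_mass(p1, p2):
    e = p1[0] + p2[0]
    px = p1[1] + p2[1]
    py = p1[2] + p2[2]
    pz = p1[3] + p2[3]
    # max() guards against tiny negative arguments from rounding
    return math.sqrt(max(e * e - px * px - py * py - pz * pz, 0.0))

# Back-to-back muons of 62.5 GeV each (muon mass neglected):
m = invariant_mass((62.5, 0.0, 0.0, 62.5), (62.5, 0.0, 0.0, -62.5))
```

Histogramming this quantity over many generated events is what produces the signal peak near 125 GeV.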

  3. PREFACE: 2nd Workshop on Germanium Detectors and Technologies

    Abt, I.; Majorovits, B.; Keller, C.; Mei, D.; Wang, G.; Wei, W.

    2015-05-01

    The 2nd workshop on Germanium (Ge) detectors and technology was held at the University of South Dakota on September 14-17th 2014, with more than 113 participants from 8 countries, 22 institutions, 15 national laboratories, and 8 companies. The participants represented the following big projects: (1) GERDA and Majorana for the search for neutrinoless double-beta decay (0νββ); (2) SuperCDMS, EDELWEISS, CDEX, and CoGeNT for the search for dark matter; (3) TEXONO for sub-keV neutrino physics; (4) AGATA and GRETINA for gamma tracking; (5) AARM and others for low-background radiation counting; and (6) PNNL and LBNL for applications of Ge detectors in homeland security. All participants expressed a strong desire for a better understanding of Ge detector performance and for advancing Ge technology for large-scale applications. The purpose of this workshop was to leverage the unique aspects of the underground laboratories in the world and the germanium (Ge) crystal-growing infrastructure at the University of South Dakota (USD) by bringing researchers from several institutions taking part in the Experimental Program to Stimulate Competitive Research (EPSCoR) together with key leaders from international laboratories and prestigious universities working at the forefront of underground physics, focusing on the searches for dark matter, neutrinoless double-beta decay (0νββ), and neutrino properties. The goal of the workshop was to develop opportunities for EPSCoR institutions to play key roles in the planned world-class research experiments. The workshop was to integrate individual talents and existing research capabilities, from multiple disciplines and multiple institutions, to develop research collaborations, including EPSCoR institutions from South Dakota, North Dakota, Alabama, Iowa, and South Carolina, to support multi-ton-scale experiments in the future. The topic areas covered in the workshop were: 1) science related to Ge

  4. Evolution of a Reconfigurable Processing Platform for a Next Generation Space Software Defined Radio

    Kacpura, Thomas J.; Downey, Joseph A.; Anderson, Keffery R.; Baldwin, Keith

    2014-01-01

    The National Aeronautics and Space Administration (NASA)/Harris Ka-Band Software Defined Radio (SDR) is the first fully reprogrammable space-qualified SDR operating in the Ka-band frequency range. Providing exceptionally higher data communication rates than previously possible, this SDR offers in-orbit reconfiguration, multi-waveform operation, and fast deployment due to its highly modular hardware and software architecture. Currently in operation on the International Space Station (ISS), this new paradigm of reconfigurable technology is enabling experimenters to investigate navigation and networking in the space environment. The modular SDR and the NASA-developed Space Telecommunications Radio System (STRS) architecture standard are the basis for Harris' reusable digital signal processing space platform, trademarked AppSTAR. As a result, two new space radio products are a synthetic aperture radar payload and an Automatic Dependent Surveillance-Broadcast (ADS-B) receiver. In addition, Harris is currently developing many new products similar to the Ka-band software defined radio for other applications. For NASA's next-generation flight Ka-band radio development, leveraging these advancements could lead to a more robust and more capable software defined radio. The space environment has special considerations, different from terrestrial applications, that must be taken into account for any system operated in space. Each space mission has unique requirements that can make these systems unique, and such unique requirements can make products that are expensive and limited in reuse. Space systems put a premium on size, weight and power. A key trade is the amount of reconfigurability in a space system. The more reconfigurable the hardware platform, the easier it is to adapt the platform to the next mission, which reduces the amount of non-recurring engineering costs. However, more reconfigurable platforms often use more spacecraft resources. Software has similar considerations

  5. NEXT GENERATION ANALYSIS SOFTWARE FOR COMPONENT EVALUATION - Results of Rotational Seismometer Evaluation

    Hart, D. M.; Merchant, B. J.; Abbott, R. E.

    2012-12-01

    The Component Evaluation project at Sandia National Laboratories supports the Ground-based Nuclear Explosion Monitoring program by performing testing and evaluation of the components that are used in seismic and infrasound monitoring systems. In order to perform this work, Component Evaluation maintains a testing facility called the FACT (Facility for Acceptance, Calibration, and Testing) site, a variety of test bed equipment, and a suite of software tools for analyzing test data. Recently, Component Evaluation has successfully integrated several improvements to its software analysis tools and test bed equipment that have substantially improved our ability to test and evaluate components. The software tool that is used to analyze test data is called TALENT: Test and AnaLysis EvaluatioN Tool. TALENT is designed to be a single, standard interface to all test configuration, metadata, parameters, waveforms, and results that are generated in the course of testing monitoring systems. It provides traceability by capturing everything about a test in a relational database that is required to reproduce the results of that test. TALENT provides a simple, yet powerful, user interface to quickly acquire, process, and analyze waveform test data. The software tool has also been expanded recently to handle sensors whose output is proportional to rotation angle, or rotation rate. As an example of this new processing capability, we show results from testing the new ATA ARS-16 rotational seismometer. The test data was collected at the USGS ASL. Four datasets were processed: 1) 1 Hz with increasing amplitude, 2) 4 Hz with increasing amplitude, 3) 16 Hz with increasing amplitude and 4) twenty-six discrete frequencies between 0.353 Hz and 64 Hz. The results are compared to manufacturer-supplied data sheets.

  6. A proposed metamodel for the implementation of object oriented software through the automatic generation of source code

    CARVALHO, J. S. C.

    2008-12-01

    Full Text Available During the development of software, one of the most visible risks, and perhaps the biggest implementation obstacle, relates to time management. All delivery deadlines for software versions must be met, but this is not always possible, sometimes due to delays in coding. This paper presents a metamodel for software implementation, which will give rise to a development tool for the automatic generation of source code, in order to make any development pattern transparent to the programmer, significantly reducing the time spent coding the artifacts that make up the software.
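Such template-driven generation of source artifacts from a model can be pictured with a toy example; the "metamodel" (a class name plus field names) and the template here are hypothetical, not the paper's:

```python
# Toy model-to-code generation: a class description becomes source text.

CLASS_TEMPLATE = """class {name}:
    def __init__(self, {args}):
{assigns}
"""

def generate_class(name, fields):
    """Render a Python class from a (name, fields) model element."""
    args = ", ".join(fields)
    assigns = "\n".join(f"        self.{f} = {f}" for f in fields)
    return CLASS_TEMPLATE.format(name=name, args=args, assigns=assigns)

source = generate_class("Invoice", ["number", "total"])
```

The programmer works only with the model element; the repetitive constructor boilerplate is produced mechanically.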

  7. ANT: Software for Generating and Evaluating Degenerate Codons for Natural and Expanded Genetic Codes.

    Engqvist, Martin K M; Nielsen, Jens

    2015-08-21

    The Ambiguous Nucleotide Tool (ANT) is a desktop application that generates and evaluates degenerate codons. Degenerate codons are used to represent DNA positions that have multiple possible nucleotide alternatives. This is useful for protein engineering and directed evolution, where primers specified with degenerate codons are used as a basis for generating libraries of protein sequences. ANT is intuitive and can be used in a graphical user interface or by interacting with the code through a defined application programming interface. ANT comes with full support for nonstandard, user-defined, or expanded genetic codes (translation tables), which is important because synthetic biology is being applied to an ever widening range of natural and engineered organisms. The Python source code for ANT is freely distributed so that it may be used without restriction, modified, and incorporated in other software or custom data pipelines. PMID:25901796
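The core idea of a degenerate codon, one position standing for several possible nucleotides, can be sketched with the standard IUPAC ambiguity codes (a subset shown; ANT's actual evaluation logic is considerably richer):

```python
from itertools import product

# Expansion of a degenerate codon into the concrete codons it represents,
# using a subset of the standard IUPAC nucleotide ambiguity codes.
IUPAC = {"A": "A", "C": "C", "G": "G", "T": "T",
         "R": "AG", "Y": "CT", "S": "CG", "W": "AT",
         "K": "GT", "M": "AC", "N": "ACGT"}

def expand(codon):
    """All concrete codons matched by a degenerate codon such as 'RAY'."""
    return ["".join(bases) for bases in product(*(IUPAC[c] for c in codon))]

codons = expand("RAY")  # R = A/G, Y = C/T
```

Scoring each expanded set against a translation table (standard or user-defined) is then what lets a tool rank candidate degenerate codons for library design.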

  8. 76 FR 29750 - Filing Dates for the Nevada Special Election in the 2nd Congressional District

    2011-05-23

    ... General Election on September 13, 2011, to fill the U.S. House seat in ] the 2nd Congressional District... report, the first report must cover all activity that occurred before the committee registered as...

  9. 77 FR 75161 - Filing Dates for the Illinois Special Election in the 2nd Congressional District

    2012-12-19

    ... February 26, 2013, and April 9, 2013, to fill the U.S. House seat in the 2nd Congressional District vacated... not previously filed a report, the first report must cover all activity that occurred before...

  10. 2nd U.S. Case of Bacteria Resistant to Last-Resort Antibiotic

    ... Scientists concerned it ... the United States who was infected with a bacteria that is resistant to an antibiotic of last ...

  11. Optimized Pump Power Ratio on 2nd Order Pumping Discrete Raman Amplifier

    Renxiang Huang; Youichi Akasaka; David L. Harris; James Pan

    2003-01-01

    By optimizing the pump power ratio between the 1st-order backward pump and the 2nd-order forward pump in a discrete Raman amplifier, we demonstrated over 2 dB of noise figure improvement without excessive non-linearity degradation.

  12. Combustion synthesis and characterization of Ba2NdSbO6 nanocrystals

    V T Kavitha; R Jose; S Ramakrishna; P R S Wariar; J Koshy

    2011-07-01

    Nanocrystalline powders of Ba2NdSbO6, a complex cubic perovskite metal oxide, were synthesized by a self-sustained combustion method employing citric acid. The product was characterized by X-ray diffraction, differential thermal analysis, thermogravimetric analysis, Fourier transform infrared spectroscopy, transmission electron microscopy and scanning electron microscopy. The as-prepared powders were single-phase Ba2NdSbO6 and a mixture of polycrystalline spheroidal particles and single-crystalline nanorods. The Ba2NdSbO6 sample sintered at 1500°C for 4 h has a high density (∼95% of the theoretical density). Sintered nanocrystalline Ba2NdSbO6 had a dielectric constant of ∼21 and a dielectric loss of 8 × 10⁻³ at 5 MHz.

  13. File list: His.Lar.50.AllAg.2nd_instar [ChIP-Atlas Archive]

    Full Text Available His.Lar.50.AllAg.2nd_instar dm3 Histone Larvae 2nd instar SRX013015,SRX013042,SRX01...3112,SRX013043,SRX013087,SRX013096 http://dbarchive.biosciencedbc.jp/kyushu-u/dm3/assembled/His.Lar.50.AllAg.2nd_instar.bed ...

  14. File list: ALL.Lar.10.AllAg.2nd_instar [ChIP-Atlas Archive]

    Full Text Available ALL.Lar.10.AllAg.2nd_instar dm3 All antigens Larvae 2nd instar SRX013087,SRX013015,...SRX013112,SRX013042,SRX013043,SRX013096,SRX013113,SRX013016,SRX013114 http://dbarchive.biosciencedbc.jp/kyushu-u/dm3/assembled/ALL.Lar.10.AllAg.2nd_instar.bed ...

  15. File list: His.Lar.05.AllAg.2nd_instar [ChIP-Atlas Archive]

    Full Text Available His.Lar.05.AllAg.2nd_instar dm3 Histone Larvae 2nd instar SRX013087,SRX013096,SRX01...3043,SRX013015,SRX013112,SRX013042 http://dbarchive.biosciencedbc.jp/kyushu-u/dm3/assembled/His.Lar.05.AllAg.2nd_instar.bed ...

  16. File list: His.Lar.20.AllAg.2nd_instar [ChIP-Atlas Archive]

    Full Text Available His.Lar.20.AllAg.2nd_instar dm3 Histone Larvae 2nd instar SRX013015,SRX013042,SRX01...3112,SRX013043,SRX013096,SRX013087 http://dbarchive.biosciencedbc.jp/kyushu-u/dm3/assembled/His.Lar.20.AllAg.2nd_instar.bed ...

  17. File list: ALL.Lar.20.AllAg.2nd_instar [ChIP-Atlas Archive]

    Full Text Available ALL.Lar.20.AllAg.2nd_instar dm3 All antigens Larvae 2nd instar SRX013015,SRX013042,...SRX013112,SRX013043,SRX013016,SRX013114,SRX013096,SRX013087,SRX013113 http://dbarchive.biosciencedbc.jp/kyushu-u/dm3/assembled/ALL.Lar.20.AllAg.2nd_instar.bed ...

  18. File list: ALL.Lar.50.AllAg.2nd_instar [ChIP-Atlas Archive]

    Full Text Available ALL.Lar.50.AllAg.2nd_instar dm3 All antigens Larvae 2nd instar SRX013015,SRX013042,...SRX013112,SRX013016,SRX013114,SRX013043,SRX013087,SRX013096,SRX013113 http://dbarchive.biosciencedbc.jp/kyushu-u/dm3/assembled/ALL.Lar.50.AllAg.2nd_instar.bed ...

  19. Meteosat Second Generation station: processing software and computing architecture; Estacion de Recepcion de Imagenes del Satelite Meteosat Segunda generacion: Arquitectura Informatica y Software de Proceso

    Martin, L.; Cony, M.; Navarro, A. A.; Zarzalejo, L. F.; Polo, J.

    2010-05-01

    The Renewable Energy Division of CIEMAT houses a station for receiving Meteosat Second Generation images, which is of interest for the ongoing work on solar radiation derived from satellite images. The complexity, the huge amount of information received and the particular characteristics of the MSG images encouraged the design and development of a specific computing structure, and of the associated software, for a better and more suitable use of the images. This document describes the mentioned structure and software. (Author) 8 refs.

  20. Severe weather phenomena: SQUALL LINES The case of July 2nd 2009

    Paraschivescu, Mihnea; Tanase, Adrian

    2010-05-01

    Among dangerous meteorological phenomena, wind intensity plays an important role in producing negative effects on the economy and on social activities, particularly when the wind turns into a storm. During the past years one can notice an increase in wind frequency and intensity due to climate change and, consequently, in extreme meteorological phenomena, not only on a planetary level but also on a regional one. Although dangerous meteorological phenomena cannot be avoided, since they are natural, they can nevertheless be anticipated, and decision-making institutions and mass media can be informed. This is the reason why, in this paper, we set out to identify the synoptic conditions that led to the severe storm in Bucharest on July 2nd, 2009, as well as the matrices that generate such cases. At the same time, we sought to identify indicative evidence, especially from radar data, that could improve the time interval between the nowcasting warning and the actual occurrence of the phenomenon.

  1. 2nd International Conference on Computer and Communication Technologies

    Raju, K; Mandal, Jyotsna; Bhateja, Vikrant

    2016-01-01

    The book is about all aspects of computing, communication, general sciences and educational research covered at the Second International Conference on Computer & Communication Technologies, held during 24-26 July 2015 at Hyderabad and hosted by CMR Technical Campus in association with Division V (Education & Research), CSI, India. After a rigorous review, only quality papers were selected and included in this book. The entire book is divided into three volumes. The three volumes cover a variety of topics, which include medical imaging, networks, data mining, intelligent computing, software design, image processing, mobile computing, digital signals and speech processing, video surveillance and processing, web mining, wireless sensor networks, circuit analysis, fuzzy systems, antenna and communication systems, biomedical signal processing and applications, cloud computing, embedded systems applications, and cyber security and digital forensics. The readers of these volumes will benefit greatly from the te...

  2. Next generation hyper-scale software and hardware systems for big data analytics

    CERN. Geneva

    2013-01-01

    Building on foundational technologies such as many-core systems, non-volatile memories and photonic interconnects, we describe some current technologies and future research to create real-time, big data analytics IT infrastructure. We will also briefly describe some of our biologically inspired software and hardware architectures for creating radically new hyper-scale cognitive computing systems. About the speaker: Rich Friedrich is the director of Strategic Innovation and Research Services (SIRS) at HP Labs. In this strategic role, he is responsible for research investments in nano-technology, exascale computing, cyber security, information management, cloud computing, immersive interaction, sustainability, social computing and commercial digital printing. Rich's philosophy is to fuse strategy and inspiration to create compelling capabilities for next-generation information devices, systems and services. Using essential insights gained from the metaphysics of innovation, he effectively leads ...

  3. Next Generation Astronomical Data Processing using Big Data Technologies from the Apache Software Foundation

    Mattmann, Chris

    2014-04-01

    In this era of exascale instruments for astronomy, we must naturally develop next-generation capabilities for the unprecedented data volume and velocity that will arrive due to the veracity of these ground-based sensors and observatories. Integrating scientific algorithms stewarded by scientific groups unobtrusively and rapidly; intelligently selecting data movement technologies; making use of cloud computing for storage and processing; and automatically extracting text, metadata and science from any type of file are all needed capabilities in this exciting time. Our group at NASA JPL has promoted the use of open source data management technologies available from the Apache Software Foundation (ASF) in pursuit of constructing next-generation data management and processing systems for astronomical instruments, including the Expanded Very Large Array (EVLA) in Socorro, NM and the Atacama Large Millimeter/submillimeter Array (ALMA), as well as for the KAT-7 project led by SKA South Africa as a precursor to the full MeerKAT telescope. In addition, we are currently funded by the National Science Foundation in the US to work with MIT Haystack Observatory and the University of Cambridge in the UK to construct a Radio Array of Portable Interferometric Devices (RAPID) that will undoubtedly draw from the rich technology advances underway. NASA JPL is investing in a strategic initiative for Big Data that is pulling in these capabilities and technologies for astronomical instruments and also for Earth science remote sensing. In this talk I will describe the above collaborative efforts underway and point to solutions in open source from the Apache Software Foundation that can be deployed and used today and that are already bringing our teams and projects benefits. I will describe how others can take advantage of our experience and point towards future application and contribution of these tools.

  4. A Soliton Hierarchy Associated with a Spectral Problem of 2nd Degree in a Spectral Parameter and Its Bi-Hamiltonian Structure

    Yuqin Yao

    2016-01-01

    Full Text Available Associated with so(3,R), a new matrix spectral problem of 2nd degree in a spectral parameter is proposed, and its corresponding soliton hierarchy is generated within the zero curvature formulation. Bi-Hamiltonian structures of the presented soliton hierarchy are furnished by using the trace identity, and thus all presented equations possess infinitely many commuting symmetries and conservation laws, which implies their Liouville integrability.
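The zero curvature formulation referred to above is the standard compatibility condition of a Lax pair: for an auxiliary linear system with spectral matrix U and time-evolution matrix V, the two equations are consistent exactly when

```latex
% Zero-curvature (compatibility) condition of the Lax pair
% \phi_x = U\phi, \qquad \phi_t = V\phi:
U_t - V_x + [U, V] = 0
```

Each member of the hierarchy corresponds to a particular choice of V solving this equation for the given spectral matrix U.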

  5. Computer-generated holograms (CGH) realization: the integration of dedicated software tool with digital slides printer

    Guarnieri, Vittorio; Francini, Franco

    1997-12-01

    The latest generation of digital printers is usually characterized by a spatial resolution high enough to allow the designer to realize a binary CGH directly on a transparent film, avoiding photographic reduction techniques. These devices are able to produce slides or offset prints. Furthermore, the services supplied by commercial printing companies provide an inexpensive way to rapidly verify the validity of a design through a test-and-trial process. Notably, this low-cost approach appears well suited to a didactical environment. On the basis of these considerations, a set of software tools for designing CGHs has been developed. The guidelines inspiring the work were the following: (1) a ray-tracing approach, considering the object to be reproduced as a source of spherical waves; (2) optimization and speed-up of the algorithms used, in order to produce portable code runnable on several hardware platforms. In this paper, calculation methods to obtain some fundamental geometric functions (points, lines, curves) are described. Furthermore, by the juxtaposition of these primitive functions it is possible to produce holograms of more complex objects. Many examples of generated CGHs are presented.
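
    As an illustration of the ray-tracing approach described above (a minimal sketch with hypothetical parameters, not the paper's actual code), the binary CGH of a single point source can be computed by sampling the phase of its spherical wave at each printer dot and thresholding it against an on-axis plane reference wave:

    ```python
    import numpy as np

    # Hypothetical parameters (not from the paper): wavelength, dot pitch, source depth.
    wavelength = 633e-9      # HeNe laser, metres
    pitch = 10e-6            # printer dot pitch, metres
    N = 512                  # hologram side length, in dots
    z = 0.2                  # distance of the point source behind the hologram, metres

    # Ray-tracing view: each hologram cell receives a spherical wave from the
    # point source; its phase relative to the plane reference wave decides the
    # binary value of that cell.
    x = (np.arange(N) - N / 2) * pitch
    X, Y = np.meshgrid(x, x)
    r = np.sqrt(X**2 + Y**2 + z**2)           # optical path from source to each cell
    phase = 2 * np.pi * (r - z) / wavelength  # phase of the spherical wave
    cgh = (np.cos(phase) > 0).astype(np.uint8)  # binarize: 1 transparent, 0 opaque
    ```

    Each primitive (point, line, curve) contributes such a fringe pattern; juxtaposing primitives amounts to summing the complex fields of the sources before binarization.
    
    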

  6. Automatic Generation of Just-in-Time Online Assessments from Software Design Models

    Zualkernan, Imran A.; El-Naaj, Salim Abou; Papadopoulos, Maria; Al-Amoudi, Budoor K.; Matthews, Charles E.

    2009-01-01

    Computer software is pervasive in today's society. The rate at which new versions of computer software products are released is phenomenal when compared to the release rate of new products in traditional industries such as aircraft building. This rapid rate of change can partially explain why most certifications in the software industry are…

  8. Software for evaluating magnetic induction field generated by power lines: implementation of a new algorithm

    Comelli, M.; Benes, M.; Bampo, A.; Villalta, R. [Regional Environment Protection Agency of Friuli Venezia Giulia (ARPA FVG), Environmental Physics, Udine (Italy)

    2006-07-01

    The Regional Environment Protection Agency of Friuli Venezia Giulia (A.R.P.A. F.V.G., Italy) has performed an analysis of existing software designed to calculate the magnetic induction field generated by power lines. With respect to the agency's requirements, the tested programs display some difficulties in the immediate processing of the electrical and geometrical data supplied by plant owners, and in certain cases turn out to be inadequate for representing complex configurations of power lines. Furthermore, none of them supports cyclic calculation to determine the time evolution of induction in a given exposure area. Finally, the output data are not immediately importable into ArcView, the G.I.S. used by A.R.P.A. F.V.G., and it is not always possible to include the territory's orography in order to determine the field at specified heights above the ground. P.h.i.d.e.l., an innovative software package, tackles and solves all the above-mentioned problems. The power line wires are represented by polylines, and the field is calculated analytically, with no further approximation, even when several power lines are involved. Therefore, the obtained results, when compared with those of other programs, are the closest to experimental measurements. The output data can be used in both G.I.S. and Excel environments, allowing the immediate overlaying of digital cartography and the determination of the 3 and 10 μT bands, in compliance with the Italian Decree of the President of the Council of Ministers of 8 July 2003. (authors)
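
    The underlying physics is superposition: a long straight conductor contributes B = μ0·I/(2πd) at distance d, and the vector sum over all conductors gives the resultant induction at each point. A minimal sketch of that calculation (hypothetical geometry and currents, with in-phase currents rather than the true three-phase phasors that a tool like P.h.i.d.e.l. must handle):

    ```python
    import numpy as np

    MU0 = 4e-7 * np.pi  # vacuum permeability, T*m/A

    def b_field(conductors, px, py):
        """Magnetic flux density magnitude (T) at point (px, py) produced by
        infinite straight conductors parallel to the z axis.  Each conductor is
        (x, y, current_A); currents are assumed in phase (a simplification)."""
        bx = by = 0.0
        for cx, cy, i in conductors:
            dx, dy = px - cx, py - cy
            d2 = dx**2 + dy**2
            # |B| = mu0*I/(2*pi*d), directed perpendicular to the radius vector
            bx += -MU0 * i * dy / (2 * np.pi * d2)
            by += MU0 * i * dx / (2 * np.pi * d2)
        return np.hypot(bx, by)

    # One 1000 A conductor 10 m above the observation point:
    # B = mu0 * 1000 / (2*pi*10) = 20 microtesla
    print(b_field([(0.0, 10.0, 1000.0)], 0.0, 0.0))  # -> 2e-05
    ```

    Evaluating such a sum over a grid of points at a fixed height above ground is what yields the 3 and 10 μT band contours mentioned above.
    
    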

  9. Power system economics : the Nordic electricity market. 2nd ed.

    This book, written as a textbook for engineering students, is designed for the Norwegian Power Markets course, which is part of the Energy and Environment Master's Program and the recently established international MSc program in Electric Power Engineering. As the title indicates, the book deals with both power system economics in general and the practical implementation of, and experience from, the Nordic market. Areas of coverage include: restructuring/deregulation of the power supply system; grid access, including tariffs and congestion management; generation planning; market modeling; ancillary services; and regulation of grid monopolies. Although Power System Economics is written primarily as a textbook for students, other readers will also find it interesting. It deals with problems that have received considerable attention in the power sector for some years, and it addresses issues that are still relevant and important. (au)
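
    To give a concrete flavor of the market-modeling topic, a uniform-price spot market of the Nordic type can be sketched as merit-order dispatch: generators are stacked in order of marginal cost, and the last unit needed to cover demand sets the clearing price for everyone (illustrative names and numbers, not taken from the book):

    ```python
    # Each generator: (name, capacity_MW, marginal_cost_EUR_per_MWh) -- hypothetical.
    generators = [
        ("hydro",   400, 5.0),
        ("nuclear", 300, 12.0),
        ("coal",    250, 35.0),
        ("gas",     200, 60.0),
    ]
    demand_mw = 800.0

    # Dispatch cheapest units first; the marginal unit sets the uniform price.
    dispatched, price, remaining = [], 0.0, demand_mw
    for name, cap, cost in sorted(generators, key=lambda g: g[2]):
        if remaining <= 0:
            break
        take = min(cap, remaining)
        dispatched.append((name, take))
        price = cost          # last dispatched unit's cost = clearing price
        remaining -= take

    print(price)       # -> 35.0 (coal is the marginal unit)
    print(dispatched)  # -> [('hydro', 400), ('nuclear', 300.0), ('coal', 100.0)]
    ```

    Congestion management and ancillary services, also covered in the book, layer additional constraints and products on top of this basic energy-only clearing.
    
    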

  10. 2nd International Conference on Robot Intelligence Technology and Applications

    Matson, Eric; Myung, Hyun; Xu, Peter; Karray, Fakhri

    2014-01-01

    We are facing a new technological challenge: how to store and retrieve knowledge and manipulate intelligence for autonomous services by intelligent systems that should be capable of carrying out real-world tasks autonomously. To address this issue, robot researchers have been developing intelligence technology (InT) for “robots that think”, which is the focus of this book. The book covers all aspects of intelligence, from perception at the sensor level and reasoning at the cognitive level to behavior planning at the execution level for each low-level segment of the machine. It also presents technologies for cognitive reasoning, social interaction with humans, behavior generation, the ability to cooperate with other robots, ambience awareness, and an artificial genome that can be passed on to other robots. These technologies serve to materialize cognitive intelligence, social intelligence, behavioral intelligence, collective intelligence, ambient intelligence and genetic intelligence. The book aims at serving resear...