WorldWideScience

Sample records for 2nd generation software

  1. BASE - 2nd generation software for microarray data management and analysis

    Directory of Open Access Journals (Sweden)

    Nordborg Nicklas

    2009-10-01

    Background: Microarray experiments are increasing in size and samples are collected asynchronously over long periods. Available data are re-analysed as more samples are hybridized. Systematic use of collected data requires tracking of biomaterials, array information, raw data, and assembly of annotations. To meet the information tracking and data analysis challenges in microarray experiments we reimplemented and improved BASE version 1.2. Results: The new BASE presented in this report is a comprehensive, annotable, local microarray data repository and analysis application providing researchers with an efficient information management and analysis tool. The information management system tracks all material from biosource, via sample, and through extraction and labelling to raw data and analysis. All items in BASE can be annotated, and the annotations can be used as experimental factors in downstream analysis. BASE stores all microarray experiment related data regardless of whether analysis tools for specific techniques or data formats are readily available. The BASE team is committed to continuing to improve and extend BASE to make it usable for even more experimental setups and techniques, and we encourage other groups to target their specific needs by leveraging the infrastructure provided by BASE. Conclusion: BASE is a comprehensive management application for information, data, and analysis of microarray experiments, available as free open source software at http://base.thep.lu.se under the terms of the GPLv3 license.
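    The biosource → sample → extract → labelled extract → raw data lineage that BASE tracks can be illustrated with a minimal sketch (hypothetical Python, not BASE's actual schema; BASE itself is a web application backed by a relational database):

```python
from dataclasses import dataclass, field

@dataclass
class Item:
    """Any trackable item; annotations can serve as experimental factors."""
    name: str
    parent: "Item | None" = None
    annotations: dict = field(default_factory=dict)

    def lineage(self):
        """Walk back from raw data to the originating biosource."""
        node, chain = self, []
        while node is not None:
            chain.append(node.name)
            node = node.parent
        return list(reversed(chain))

# biosource -> sample -> extract -> labelled extract -> raw data
biosource = Item("patient_42", annotations={"disease": "lymphoma"})
sample = Item("biopsy_1", parent=biosource)
extract = Item("rna_extract_1", parent=sample)
labelled = Item("cy3_labelled_1", parent=extract)
raw = Item("scan_001", parent=labelled)

print(raw.lineage())
# ['patient_42', 'biopsy_1', 'rna_extract_1', 'cy3_labelled_1', 'scan_001']
```

    The point of such a chain is that an annotation attached anywhere along it (here, the disease of the biosource) remains linked to every downstream data item.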

  2. 2nd Generation RLV Risk Definition Program

    Science.gov (United States)

    Davis, Robert M.; Stucker, Mark (Technical Monitor)

    2000-01-01

    The 2nd Generation RLV Risk Reduction Mid-Term Report summarizes the status of Kelly Space & Technology's activities during the first two and one half months of the program. This report was presented to the cognizant Contracting Officer's Technical Representative (COTR) and selected Marshall Space Flight Center staff members on 26 September 2000. The report has been approved and is distributed on CD-ROM (as a PowerPoint file) in accordance with the terms of the subject contract, and contains information and data addressing the following: (1) Launch services demand and requirements; (2) Architecture, alternatives, and requirements; (3) Costs, pricing, and business case analysis; (4) Commercial financing requirements, plans, and strategy; (5) System engineering processes and derived requirements; and (6) RLV system trade studies and design analysis.

  3. 2nd Generation QUATARA Flight Computer Project

    Science.gov (United States)

    Falker, Jay; Keys, Andrew; Fraticelli, Jose Molina; Capo-Lugo, Pedro; Peeples, Steven

    2015-01-01

    Single core flight computer boards have been designed, developed, and tested (DD&T) to be flown in small satellites for the last few years. In this project, a prototype flight computer will be designed as a distributed multi-core system containing four microprocessors running code in parallel. This flight computer will be capable of performing multiple computationally intensive tasks such as processing digital and/or analog data, controlling actuator systems, managing cameras, operating robotic manipulators and transmitting to and receiving from a ground station. In addition, this flight computer will be designed to be fault tolerant by creating both a robust physical hardware connection and by using a software voting scheme to determine the processors' performance. This voting scheme will leverage the work done for the Space Launch System (SLS) flight software. The prototype flight computer will be constructed with Commercial Off-The-Shelf (COTS) components which are estimated to survive for two years in a low-Earth orbit.
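    The software voting scheme described above is, in essence, majority voting across redundant processors. A minimal sketch (illustrative only, not the QUATARA or SLS flight code):

```python
from collections import Counter

def majority_vote(outputs):
    """Return the majority value among redundant processor outputs and
    the indices of dissenting processors (candidates to flag as faulty)."""
    value, count = Counter(outputs).most_common(1)[0]
    if count <= len(outputs) // 2:
        raise RuntimeError("no majority -- voter cannot decide")
    dissenters = [i for i, v in enumerate(outputs) if v != value]
    return value, dissenters

# Four cores compute the same control word; core 2 has faulted:
value, faulty = majority_vote([0x3FA2, 0x3FA2, 0x0000, 0x3FA2])
print(hex(value), faulty)  # 0x3fa2 [2]
```

    With four processors, a single faulty core is always outvoted 3-to-1; two simultaneous disagreeing faults leave no majority, which is why the voter raises rather than guesses.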

  4. 2nd generation biogas. BioSNG

    International Nuclear Information System (INIS)

    The substitution of natural gas by a renewable equivalent is an interesting option to reduce the use of fossil fuels and the accompanying greenhouse gas emissions, as well as from the point of view of security of supply. The renewable alternative for natural gas is green natural gas, i.e. gaseous energy carriers produced from biomass, comprising both biogas and Synthetic Natural Gas (SNG). This route makes it possible to benefit from all the advantages of natural gas, such as the existing dense infrastructure, trade and supply network, and natural gas applications. This presentation addresses the differences between first generation biogas and second generation bioSNG; the market for bioSNG: grid injection vs. transportation fuel; the latest update on the lab- and pilot-scale bioSNG development at ECN; and an overview of ongoing bioSNG activities worldwide.

  5. 2nd Generation alkaline electrolysis. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Yde, L. [Aarhus Univ. Business and Social Science - Centre for Energy Technologies (CET), Aarhus (Denmark)]; Kjartansdottir, C.K. [Technical Univ. of Denmark. DTU Mechanical Engineering, Kgs. Lyngby (Denmark)]; Allebrod, F. [Technical Univ. of Denmark. DTU Energy Conversion, DTU Risoe Campus, Roskilde (Denmark)] [and others]

    2013-03-15

    The overall purpose of this project has been to contribute to this load management by developing a 2nd generation alkaline electrolysis system characterized by being compact, reliable, inexpensive and energy efficient. The specific targets for the project have been to: 1) Increase cell efficiency to more than 88% (relative to the higher heating value (HHV)) at a current density of 200 mA/cm²; 2) Increase operation temperature to more than 100 °C to make the cooling energy more valuable; 3) Obtain an operation pressure of more than 30 bar, thereby minimizing the need for further compression of hydrogen for storage; 4) Improve stack architecture, decreasing the price of the stack by at least 50%; 5) Develop a modular design making it easy to customize plants in the size from 20 to 200 kW; 6) Demonstrate a 20 kW 2nd generation stack in H2College at the campus of Aarhus University in Herning. The project has included research and development on three electrode technology tracks: electrochemical plating, atmospheric plasma spray (APS), and a high temperature and pressure (HTP) track with operating temperature around 250 °C and pressure around 40 bar. The results show that all three electrode tracks have reached high energy efficiencies. In the electrochemical plating track a stack efficiency of 86.5% at a current density of 177 mA/cm² and a temperature of 74.4 °C has been shown. The APS track showed cell efficiencies of 97%; however, coatings for the anode side still need to be developed. The HTP cell has reached 100% electric efficiency operating at 1.5 V (the thermoneutral voltage) with a current density of 1.1 A/cm². This track only tested small cells in an externally heated laboratory set-up, and thus the thermal loss to the surroundings cannot be given. The goal set for the 2nd generation electrolyser system has been to generate 30 bar pressure in the cell stack.
An obstacle to be overcome has been to ensure equalisation of the H₂ and O₂ pressures to prevent mixing of the gases. To solve this problem, a special equilibrium valve has been developed to mechanically ensure that the pressure on the H₂ side at all times equals that on the O₂ side. The developments have resulted in a stack design which is a cylindrical pressure vessel, with each cell having a cell "wall" sufficiently thick to resist the high pressure, sealed with O-rings for tight sealing at high pressures. The stack has in tests proved to resist a pressure of 45 bar, though some adjustment is still needed to optimize the pressure resistance and efficiency. When deciding on the new stack design, both a "zero gap" and a "non-zero gap" design were considered. The zero gap design is more efficient than non-zero gap, but it is more complex and very costly, primarily because of the additional materials and production costs for zero gap electrodes. From these considerations, the concept of a "low gap", low diameter, high pressure, high cell number electrolyser stack was born, which could offer an improved efficiency without the high material and production cost of a zero gap solution. As a result, the low gap design and pressurized stack have reduced the price of the total system by 60%, as well as the system footprint. The progress of the project required a special focus on corrosion testing and examination of polymers in order to find alternative durable membrane and gasket materials. The initial literature survey and the first tests indicated that the chemical resistance of polymers presented a greater challenge than anticipated, and that test data from commercial suppliers were insufficient to model the conditions in the electrolyser. The alkali resistant polymers (e.g. Teflon) are costly, and the search for cheaper alternatives turned into a major aim.
A number of different tests were run under accelerated conditions and the degradation mechani
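    The cell-efficiency figures quoted above (HHV basis) follow from comparing the cell voltage with the thermoneutral voltage: operating exactly at the thermoneutral voltage gives 100% electric efficiency, and any excess voltage is dissipated as heat. A minimal sketch, assuming the standard value of about 1.48 V:

```python
V_TN = 1.48  # thermoneutral voltage of water electrolysis, V (HHV basis, ~25 degC)

def hhv_efficiency(cell_voltage):
    """HHV electric efficiency of an electrolysis cell: energy applied
    above the thermoneutral voltage is dissipated as heat."""
    return V_TN / cell_voltage

print(f"{hhv_efficiency(1.48):.0%}")  # 100% -- operating at the thermoneutral voltage
print(f"{hhv_efficiency(1.68):.0%}")  # 88%  -- roughly the project's cell target
```

    By this measure the project's 88% cell-efficiency target corresponds to a cell voltage of roughly 1.68 V; the thermoneutral voltage itself shifts slightly with temperature, which the report's 1.5 V figure reflects.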

  6. Super Boiler 2nd Generation Technology for Watertube Boilers

    Energy Technology Data Exchange (ETDEWEB)

    Mr. David Cygan; Dr. Joseph Rabovitser

    2012-03-31

    This report describes Phase I of a proposed two-phase project to develop and demonstrate an advanced industrial watertube boiler system with the capability of reaching 94% (HHV) fuel-to-steam efficiency and emissions below 2 ppmv NOx, 2 ppmv CO, and 1 ppmv VOC on natural gas fuel. The boiler design would have the capability to produce >1500 F, >1500 psig superheated steam, burn multiple fuels, and be 50% smaller/lighter than currently available watertube boilers of similar capacity. This project builds upon the successful Super Boiler project at GTI, which employed a unique two-stage intercooled combustion system and an innovative heat recovery system to reduce NOx to below 5 ppmv and demonstrated fuel-to-steam efficiency of 94% (HHV). The project was carried out under the leadership of GTI with project partners Cleaver-Brooks, Inc., Nebraska Boiler, a Division of Cleaver-Brooks, and Media and Process Technology Inc., and project advisors Georgia Institute of Technology, Alstom Power Inc., Pacific Northwest National Laboratory and Oak Ridge National Laboratory. Phase I efforts focused on developing 2nd generation boiler concepts and performance modeling; incorporating multi-fuel (natural gas and oil) capabilities; assessing heat recovery, heat transfer and steam superheating approaches; and developing the overall conceptual engineering boiler design. Based on our analysis, the 2nd generation Industrial Watertube Boiler, when developed and commercialized, could potentially save 265 trillion Btu and $1.6 billion in fuel costs across U.S. industry through increased efficiency. Its ultra-clean combustion could eliminate 57,000 tons of NOx, 460,000 tons of CO, and 8.8 million tons of CO2 annually from the atmosphere. Reduction in boiler size will bring cost-effective package boilers into a size range previously dominated by more expensive field-erected boilers, benefiting manufacturers and end users through lower capital costs.

  7. A ZeroDimensional Model of a 2nd Generation Planar SOFC Using Calibrated Parameters

    Directory of Open Access Journals (Sweden)

    Brian Elmegaard

    2006-12-01

    This paper presents a zero-dimensional mathematical model of a planar 2nd generation coflow SOFC developed for simulation of power systems. The model accounts for the electrochemical oxidation of hydrogen as well as the methane reforming reaction and the water-gas shift reaction. An important part of the paper is the electrochemical sub-model, where experimental data was used to calibrate specific parameters. The SOFC model was implemented in the DNA simulation software, which is designed for energy system simulation. The result is an accurate and flexible tool suitable for simulation of many different SOFC-based power systems.
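    A zero-dimensional SOFC model of this kind reduces the cell to a single operating point: an open-circuit (Nernst) voltage minus lumped losses. A minimal sketch with illustrative numbers (not the paper's calibrated parameters, which were fitted to experimental data):

```python
import math

F, R = 96485.0, 8.314  # Faraday constant (C/mol), gas constant (J/mol/K)

def cell_voltage(T, p_H2, p_H2O, p_O2, j, asr):
    """Zero-dimensional SOFC cell voltage: Nernst potential for
    H2 + 1/2 O2 -> H2O minus a lumped area-specific-resistance loss.
    T in K, partial pressures in bar, j in A/cm^2, asr in ohm*cm^2."""
    E0 = 1.253 - 2.4516e-4 * T  # standard EMF linearized in T (illustrative fit)
    E_nernst = E0 + (R * T / (2 * F)) * math.log(p_H2 * math.sqrt(p_O2) / p_H2O)
    return E_nernst - j * asr

# One operating point: 750 degC, 50% utilized fuel, air, 0.3 A/cm^2:
print(round(cell_voltage(1023.0, 0.5, 0.5, 0.21, 0.3, 0.3), 3))
```

    The paper's calibration step amounts to fitting quantities like the area-specific resistance so the predicted voltage matches measured polarization curves.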

  8. From 1st- to 2nd-Generation Biofuel Technologies: Extended Executive Summary

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2008-07-01

    This report looks at the technical challenges facing 2nd-generation biofuels, evaluates their costs and examines related current policies to support their development and deployment. The potential for production of more advanced biofuels is also discussed. Although significant progress continues to be made to overcome the technical and economic challenges, 2nd-generation biofuels still face major constraints to their commercial deployment.

  9. Colchicine treatment of jute seedlings in the 1st and 2nd generation after irradiation

    International Nuclear Information System (INIS)

    Colchicine treatment (0.05% for 12 h) of 15-day-old seedlings in the 1st generation after X-ray or gamma-ray exposure was lethal. In contrast, the same colchicine treatment of 15-day-old seedlings in the 2nd generation was effective in inducing polyploids. (author)

  10. Intergenerational Transmission and the School-to-work Transition for 2nd Generation Immigrants

    DEFF Research Database (Denmark)

    Nielsen, Helena Skyt; Rosholm, Michael

    2001-01-01

    We analyse the extent of intergenerational transmission through parental capital, ethnic capital and neighbourhood effects on several aspects of the school-to-work transition of 2nd generation immigrants and young ethnic Danes. The main findings are that parental capital has strong positive effects on the probability of completing a qualifying education and on the entry into the labour market, but it has a much smaller impact on the duration of the first employment spell and on the wage level. Growing up in neighbourhoods with a high concentration of immigrants is associated with negative labour market prospects both for young natives and 2nd generation immigrants.

  11. Rapeseed residues utilization for energy and 2nd generation biofuels

    Energy Technology Data Exchange (ETDEWEB)

    A. Zabaniotou; O. Ioannidou; V. Skoulou [Aristotle University of Thessaloniki, Thessaloniki (Greece). Chemical Engineering Department]

    2008-07-15

    Lignocellulosic biomass is an interesting and necessary enlargement of the biomass used for the production of renewable biofuels. Second generation biofuels are expected to be more energy efficient than first generation ones, because the substrate can be completely transformed into energy. The present study is part of a research program aiming at the integrated utilization of rapeseed suited to Greek conditions for biodiesel production and the parallel use of its solid residues for energy and second generation biofuel production. In that context, fast pyrolysis at high temperature and fixed bed air gasification of the rapeseed residues were studied. Thermogravimetric analysis and a kinetic study were also carried out. The results indicated that high temperature pyrolysis could produce higher yields of syngas and hydrogen compared to air fixed bed gasification. 27 refs., 17 figs., 4 tabs.
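    The kinetic side of such a thermogravimetric study is commonly a first-order Arrhenius model of mass loss. A minimal sketch with illustrative parameters (not the values fitted in this study):

```python
import math

R = 8.314  # gas constant, J/(mol K)

def arrhenius_k(T, A, Ea):
    """Rate constant k = A * exp(-Ea / (R*T)); T in K, Ea in J/mol."""
    return A * math.exp(-Ea / (R * T))

def conversion(T, A, Ea, t):
    """Isothermal first-order conversion alpha(t) = 1 - exp(-k*t)."""
    return 1.0 - math.exp(-arrhenius_k(T, A, Ea) * t)

# Illustrative devolatilization parameters (hypothetical, for shape only):
A, Ea = 1.0e6, 1.0e5  # pre-exponential factor (1/s), activation energy (J/mol)
for T in (600.0, 700.0, 800.0):
    print(T, round(conversion(T, A, Ea, 60.0), 3))
```

    Fitting A and Ea to the measured mass-loss curves is what the TGA kinetic study provides; the exponential temperature dependence explains why high temperature pyrolysis drives conversion toward completion.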

  12. The 1997 Protocol and the European Union (European Union and '2nd generation' responsibility conventions)

    International Nuclear Information System (INIS)

    The issue of accession of the Eastern European Member States to the 1997 Protocol is discussed with a focus on the European Union's authority and enforcement powers. Following up on the article published in the preceding issue of this journal, the present contribution analyses the relations of the '2nd generation' responsibility conventions to the law of the European Union. (orig.)

  13. Operations Analysis of the 2nd Generation Reusable Launch Vehicle

    Science.gov (United States)

    Noneman, Steven R.; Smith, C. A. (Technical Monitor)

    2002-01-01

    The Space Launch Initiative (SLI) program is developing a second-generation reusable launch vehicle. The program goals include lowering the risk of loss of crew to 1 in 10,000 and reducing annual operations cost to one third of the cost of the Space Shuttle. The SLI missions include NASA, military and commercial satellite launches and crew and cargo launches to the space station. The SLI operations analyses provide an assessment of the operational support and infrastructure needed to operate candidate system architectures. Measures of operability (i.e. system dependability, responsiveness, and efficiency) are estimated. Operations analysis is used to determine the impact of specific technologies on operations. A conceptual path to reducing annual operations costs by two thirds is based on key design characteristics, such as reusability, and on improved processes lowering labor costs. New operations risks can be expected to emerge; they can be mitigated by effective risk management through careful identification, assignment, tracking, and closure. SLI design characteristics such as nearly full reusability, high reliability, advanced automation, and lowered maintenance and servicing, coupled with improved processes, are contributors to operability and large operating cost reductions.

  14. Sustainable Production of Fuel : A Study for Customer Adoption of 2nd Generation of Biofuel

    OpenAIRE

    Jin, Ying

    2010-01-01

    Finding a new fuel to substitute for gasoline, the supply of which is shrinking every year, is an urgent problem worldwide. In this situation, biofuel is considered to be a new kind of fuel that causes no pollution. Nowadays, 1st generation biofuel is familiar to people and adopted by customers, which gives it a stable market share. Since it also brings new problems, 2nd generation biofuel has appeared to solve them. In the thesis, I compared the pros and cons between the 1st and 2n...

  15. Systems Engineering Approach to Technology Integration for NASA's 2nd Generation Reusable Launch Vehicle

    Science.gov (United States)

    Thomas, Dale; Smith, Charles; Thomas, Leann; Kittredge, Sheryl

    2002-01-01

    The overall goal of the 2nd Generation RLV Program is to substantially reduce technical and business risks associated with developing a new class of reusable launch vehicles. NASA's specific goals are to improve the safety of a 2nd-generation system by 2 orders of magnitude - equivalent to a crew risk of 1-in-10,000 missions - and decrease the cost tenfold, to approximately $1,000 per pound of payload launched. Architecture definition is being conducted in parallel with the maturation of key technologies specifically identified to improve safety and reliability, while reducing operational costs. An architecture broadly includes an Earth-to-orbit reusable launch vehicle, on-orbit transfer vehicles and upper stages, mission planning, ground and flight operations, and support infrastructure, both on the ground and in orbit. The systems engineering approach ensures that the technologies developed - such as lightweight structures, long-life rocket engines, reliable crew escape, and robust thermal protection systems - will synergistically integrate into the optimum vehicle. To best direct technology development decisions, analytical models are employed to accurately predict the benefits of each technology toward potential space transportation architectures as well as the risks associated with each technology. Rigorous systems analysis provides the foundation for assessing progress toward safety and cost goals. The systems engineering review process factors in comprehensive budget estimates, detailed project schedules, and business and performance plans, against the goals of safety, reliability, and cost, in addition to overall technical feasibility. This approach forms the basis for investment decisions in the 2nd Generation RLV Program's risk-reduction activities. Through this process, NASA will continually refine its specialized needs and identify where Defense and commercial requirements overlap those of civil missions.

  16. Life Cycle Systems Engineering Approach to NASA's 2nd Generation Reusable Launch Vehicle

    Science.gov (United States)

    Thomas, Dale; Smith, Charles; Safie, Fayssal; Kittredge, Sheryl

    2002-01-01

    The overall goal of the 2nd Generation RLV Program is to substantially reduce technical and business risks associated with developing a new class of reusable launch vehicles. NASA's specific goals are to improve the safety of a 2nd-generation system by 2 orders of magnitude - equivalent to a crew risk of 1-in-10,000 missions - and decrease the cost tenfold, to approximately $1,000 per pound of payload launched. Architecture definition is being conducted in parallel with the maturation of key technologies specifically identified to improve safety and reliability, while reducing operational costs. An architecture broadly includes an Earth-to-orbit reusable launch vehicle, on-orbit transfer vehicles and upper stages, mission planning, ground and flight operations, and support infrastructure, both on the ground and in orbit. The systems engineering approach ensures that the technologies developed - such as lightweight structures, long-life rocket engines, reliable crew escape, and robust thermal protection systems - will synergistically integrate into the optimum vehicle. Given a candidate architecture that possesses credible physical processes and realistic technology assumptions, the next set of analyses address the system's functionality across the spread of operational scenarios characterized by the design reference missions. The safety/reliability and cost/economics associated with operating the system will also be modeled and analyzed to answer the questions "How safe is it?" and "How much will it cost to acquire and operate?" The systems engineering review process factors in comprehensive budget estimates, detailed project schedules, and business and performance plans, against the goals of safety, reliability, and cost, in addition to overall technical feasibility. This approach forms the basis for investment decisions in the 2nd Generation RLV Program's risk-reduction activities.
Through this process, NASA will continually refine its specialized needs and identify where Defense and commercial requirements overlap those of civil missions.

  17. Boosting biogas yield of anaerobic digesters by utilizing concentrated molasses from 2nd generation bioethanol plant

    OpenAIRE

    Shiplu Sarker, Henrik Bjarne Møller

    2013-01-01

    Concentrated molasses (C5 molasses) from a 2nd generation bioethanol plant has been investigated for enhancing the productivity of manure based digesters. A batch study at mesophilic conditions (35±1°C) showed a maximum methane yield from molasses of 286 L CH4/kgVS, which was approximately 63% of the calculated theoretical yield. In addition to the batch study, co-digestion of molasses with cattle manure in a semi-continuously stirred reactor at thermophilic temperature (50±1°C) was also performed wi...

  18. Effects of Thermal Cycling on Control and Irradiated EPC 2nd Generation GaN FETs

    Science.gov (United States)

    Patterson, Richard L.; Scheick, Leif; Lauenstein, Jean-Marie; Casey, Megan; Hammoud, Ahmad

    2013-01-01

    The power systems for use in NASA space missions must work reliably under harsh conditions including radiation, thermal cycling, and exposure to extreme temperatures. Gallium nitride semiconductors show great promise, but information pertaining to their performance is scarce. Gallium nitride N-channel enhancement-mode field effect transistors made by EPC Corporation in a 2nd generation of manufacturing were exposed to radiation followed by long-term thermal cycling in order to address their reliability for use in space missions. Results of the experimental work are presented and discussed.

  19. The Planar Optics Phase Sensor: a study for the VLTI 2nd Generation Fringe Tracker

    CERN Document Server

    Blind, Nicolas; Absil, Olivier; Alamir, Mazen; Berger, Jean-Philippe; Defrère, Denis; Feautrier, Philippe; Hénault, François; Jocou, Laurent; Kern, Pierre; Laurent, Thomas; Malbet, Fabien; Mourard, Denis; Rousselet-Perrault, Karine; Sarlette, Alain; Surdej, Jean; Tarmoul, Nassima; Tatulli, Eric; Vincent, Lionel; DOI: 10.1117/12.857114

    2010-01-01

    In a few years, the second generation instruments of the Very Large Telescope Interferometer (VLTI) will routinely provide observations with 4 to 6 telescopes simultaneously. To reach their ultimate performance, they will need a fringe sensor capable of measuring in real time the randomly varying optical path differences. A collaboration between LAOG (PI institute), IAGL, OCA and GIPSA-Lab has proposed the Planar Optics Phase Sensor concept to ESO for the 2nd Generation Fringe Tracker. This concept is based on integrated optics technologies, enabling the design of extremely compact interferometric instruments that naturally provide single-mode spatial filtering. It allows operation with 4 and 6 telescopes by measuring the fringe positions by means of a spectrally dispersed ABCD method. We present here the main analysis which led to the current concept as well as the expected on-sky performance and the proposed design.
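    The ABCD method mentioned above estimates the fringe phase from four intensity samples taken in phase quadrature; the instrument disperses the fringes spectrally and applies this per channel. A minimal sketch (conventions are illustrative; sign and ordering vary between implementations):

```python
import math

def abcd_phase(A, B, C, D):
    """Fringe phase from four quadrature samples:
    A = I(phi), B = I(phi + pi/2), C = I(phi + pi), D = I(phi + 3pi/2)."""
    return math.atan2(D - B, A - C)

def fringe(phi, shift, I0=1.0, V=0.7):
    """Two-beam fringe intensity with mean level I0 and visibility V."""
    return I0 * (1.0 + V * math.cos(phi + shift))

phi_true = 0.8  # phase to recover (radians)
samples = [fringe(phi_true, s) for s in (0.0, math.pi / 2, math.pi, 3 * math.pi / 2)]
print(round(abcd_phase(*samples), 3))  # 0.8
```

    The differences A - C and D - B cancel the constant background, leaving terms proportional to cos(phi) and sin(phi), so atan2 recovers the phase regardless of the fringe visibility.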

  20. Boosting biogas yield of anaerobic digesters by utilizing concentrated molasses from 2nd generation bioethanol plant

    Energy Technology Data Exchange (ETDEWEB)

    Sarker, Shiplu [Department of Renewable Energy, Faculty of Engineering and Science, University of Agder, Grimstad-4879 (Norway); Moeller, Henrik Bjarne [Department of Biosystems Engineering, Faculty of Science and Technology, Aarhus University, Research center Foulum, Blichers Alle, Post Box 50, Tjele-8830 (Denmark)

    2013-07-01

    Concentrated molasses (C5 molasses) from a 2nd generation bioethanol plant has been investigated for enhancing the productivity of manure based digesters. A batch study at mesophilic conditions (35±1°C) showed a maximum methane yield from molasses of 286 L CH4/kgVS, which was approximately 63% of the calculated theoretical yield. In addition to the batch study, co-digestion of molasses with cattle manure in a semi-continuously stirred reactor at thermophilic temperature (50±1°C) was also performed with a stepwise increase in molasses concentration. The results from this experiment revealed a maximum average biogas yield of 1.89 L/L/day when 23% VS from molasses was co-digested with cattle manure. However, digesters fed with more than 32% VS from molasses and with a short adaptation period showed VFA accumulation and reduced methane productivity, indicating that when using molasses as a biogas booster this level should not be exceeded.
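    The 63% figure is simply the measured yield divided by the theoretical one, which implies a calculated theoretical yield of roughly 454 L CH4/kgVS (back-calculated here; the abstract does not state it):

```python
def fraction_of_theoretical(measured, theoretical):
    """Measured methane yield as a fraction of the theoretical yield,
    both in L CH4 per kg volatile solids (VS)."""
    return measured / theoretical

# 286 L CH4/kgVS measured vs. ~454 L CH4/kgVS theoretical:
print(f"{fraction_of_theoretical(286.0, 454.0):.0%}")  # 63%
```

    The theoretical yield itself is usually computed from the substrate's elemental composition (e.g. via the Buswell equation), which is why it serves as the 100% reference.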

  1. Boosting biogas yield of anaerobic digesters by utilizing concentrated molasses from 2nd generation bioethanol plant

    Directory of Open Access Journals (Sweden)

    Shiplu Sarker, Henrik Bjarne Møller

    2013-01-01

    Concentrated molasses (C5 molasses) from a 2nd generation bioethanol plant has been investigated for enhancing the productivity of manure based digesters. A batch study at mesophilic conditions (35±1°C) showed a maximum methane yield from molasses of 286 L CH4/kgVS, which was approximately 63% of the calculated theoretical yield. In addition to the batch study, co-digestion of molasses with cattle manure in a semi-continuously stirred reactor at thermophilic temperature (50±1°C) was also performed with a stepwise increase in molasses concentration. The results from this experiment revealed a maximum average biogas yield of 1.89 L/L/day when 23% VS from molasses was co-digested with cattle manure. However, digesters fed with more than 32% VS from molasses and with a short adaptation period showed VFA accumulation and reduced methane productivity, indicating that when using molasses as a biogas booster this level should not be exceeded.

  2. The New 2nd-Generation SRF R&D Facility at Jefferson Lab: TEDF

    International Nuclear Information System (INIS)

    The US Department of Energy has funded a near-complete renovation of the SRF-based accelerator research and development facilities at Jefferson Lab. The project to accomplish this, the Technical and Engineering Development Facility (TEDF) Project, has completed the first of two phases. An entirely new 3,100 m² purpose-built SRF technical work facility has been constructed and was occupied in the summer of 2012. All SRF work processes, with the exception of cryogenic testing, have been relocated into the new building. All cavity fabrication, processing, thermal treatment, chemistry, cleaning, and assembly work is collected conveniently into a new LEED-certified building. An innovatively designed 800 m² cleanroom/chemroom suite provides long-term flexibility for support of multiple R&D and construction projects as well as continued process evolution. The characteristics of this first 2nd-generation SRF facility are described.

  3. Conceptual design study of Nb3Sn low-beta quadrupoles for 2nd generation LHC IRs

    Energy Technology Data Exchange (ETDEWEB)

    Alexander V Zlobin et al.

    2002-10-22

    Conceptual designs of 90-mm aperture high-gradient quadrupoles based on the Nb3Sn superconductor are being developed at Fermilab for possible 2nd generation IRs with similar optics to the current low-beta insertions. Magnet designs and results of magnetic, mechanical, thermal and quench protection analysis for these magnets are presented and discussed.

  4. Improved beam spot measurements in the 2nd generation proton beam writing system

    International Nuclear Information System (INIS)

    Nanosized ion beams (especially proton and helium) play a pivotal role in the field of ion beam lithography and ion beam analysis. Proton beam writing has shown lithographic details down to the 20 nm level, limited by the proton beam spot size. Introducing a smaller spot size will allow smaller lithographic features. Smaller probe sizes will also drastically improve the spatial resolution of ion beam analysis techniques. Among many other requirements, an ideal resolution standard for beam focusing and a reliable focusing method are important prerequisites for sub-10 nm beam spot focusing. In this paper we present the fabrication process of a free-standing resolution standard with reduced side-wall projection and high side-wall verticality. The resulting grid is orthogonal (90.0° ± 0.1°) and has smooth edges with better than 6 nm side-wall projection. The new resolution standard has been used in focusing a 2 MeV H2+ beam in the 2nd generation PBW system at the Center for Ion Beam Applications, NUS. The beam size has been characterized using on- and off-axis scanning transmission ion microscopy (STIM) and ion induced secondary electron detection, carried out with a newly installed micro channel plate electron detector. The latter has been shown to be a realistic alternative to STIM measurements, as the drawback of PIN diode detector damage is alleviated. With these improvements we show reproducible beam focusing down to 14 nm.

  5. Next Generation Millimeter/Submillimeter Array to Search for 2nd Earth

    CERN Document Server

    Saito, Masao

    2011-01-01

    ALMA is a revolutionary radio telescope; its full operation will start in 2012. It is expected that ALMA will resolve several cosmic questions and show us a new view of the cosmos. Our passion for astronomy naturally goes beyond ALMA, because we believe that 21st-century astronomy should pursue the new scientific frontier. In this conference, we propose a future radio telescope project to search for habitable planets and finally detect a 2nd Earth as a migratable planet. The detection of a 2nd Earth is one of the ultimate dreams not only of astronomers but of everyone.

  6. The 2nd Generation Street Children (SGSC) in Accra: Developing Teaching Strategies To Enhance Positive Learning Outcomes in Schools

    OpenAIRE

    Alhassan Abdul-Razak Kuyini; Okechuwu Abosi

    2011-01-01

    Ghana is witnessing an increasing number of 2nd generation street children (SGSC) living on the streets of Accra, the capital city, as a result of many factors including teenage pregnancy among street girls, ethnic conflicts and rural-urban migration. The street presents enormous risks to street children; they are excluded from a safe family environment, from basic services like health and education, and from protection against exploitation. This article explored the inclusion of 27 SGSC in regular schools in ...

  7. Generation of higher order Gauss-Laguerre modes in single-pass 2nd harmonic generation

    DEFF Research Database (Denmark)

    Buchhave, Preben; Tidemand-Lichtenberg, Peter

    2008-01-01

    We present a realistic method for dynamic simulation of the development of higher order modes in second harmonic generation. The deformation of the wave fronts due to the nonlinear interaction is expressed by expansion in higher order Gauss-Laguerre modes.

  8. Generative Software Development

    OpenAIRE

    Rumpe, Bernhard; Schindler, Martin; Völkel, Steven; Weisemöller, Ingo

    2014-01-01

    Generation of software from modeling languages such as UML and domain-specific languages (DSLs) has become an important paradigm in software engineering. In this contribution, we present some positions on software development in a model-based, generative manner, based on home-grown DSLs as well as the UML. This includes the development of DSLs as well as the development of models in these languages in order to generate executable code, test cases or models in different languages. Dev...
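
    The model-based, generative style described here can be reduced to a very small sketch: a declarative model is inspected and executable code is emitted from it. The model layout, names and generated class below are all hypothetical, not taken from the authors' DSL infrastructure:

```python
# Toy model-to-code generator: a minimal, hypothetical stand-in for
# DSL-based generation (not the authors' toolchain).
model = {"name": "Point", "fields": ["x", "y"]}

def generate_class(model):
    """Emit Python source for a simple value class from a declarative model."""
    args = ", ".join(model["fields"])
    body = "\n".join(f"        self.{f} = {f}" for f in model["fields"])
    return (
        f"class {model['name']}:\n"
        f"    def __init__(self, {args}):\n"
        f"{body}\n"
    )

source = generate_class(model)
namespace = {}
exec(source, namespace)          # compile and load the generated code
p = namespace["Point"](3, 4)
print(p.x, p.y)                  # -> 3 4
```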

  9. 1st or 2nd generation bioethanol-impacts of technology integration & on feed production and land use

    DEFF Research Database (Denmark)

    Bentsen, Niclas Scott; Felby, Claus

    2009-01-01

    "1st or 2nd generation bioethanol-impacts of technology integration & on feed production and land use" Liquid bio fuels are perceived as a means of mitigating CO2 emissions from transport and thus climate change, but much concern has been raised to the energy consumption from refining biomass to liquid fuels. Integrating technologies such that waste stream can be used will reduce energy consumption in the production of bioethanol from wheat. We show that the integration of bio refining and combined heat an power generation reduces process energy requirements with 30-40 % and makes bioethanol production comparable to gasoline production in terms of energy loss. Utilisation of biomass in the energy sector is inevitably linked to the utilisation of land. This is a key difference between fossil and bio based energy systems. Thus evaluations of bioethanol production based on energy balances alone are inadequate. 1st and 2nd generation bioethanol production exhibits major differences when evaluated on characteristics as feed energy and feed protein production and subsequently on land use changes. 1st generation bioethanol production based on wheat grain in Denmark may in fact reduce the pressure on agricultural land on a global scale, but increase the pressure on local/national scale. In contrast to that 2nd generation bioethanol based on wheat straw exhibits a poorer energy balance than 1st generation, but the induced imbalances on feed energy are smaller. Proteins are some of the plant components with the poorest bio synthesis efficiency and as such the area demand for their production is relatively high. Preservation of the proteins in the biomass such as in feed by-products from bioethanol production is of paramount importance in developing sustainable utilisation of biomass in the energy sector.

  10. White paper on perspectives of biofuels in Denmark - with focus on 2nd generation bioethanol; Hvidbog om perspektiver for biobraendstoffer i Danmark - med fokus paa 2. generations bioethanol

    Energy Technology Data Exchange (ETDEWEB)

    Larsen, Gy.; Foghmar, J.

    2009-11-15

    The white paper presents the perspectives - both options and barriers - for a Danish focus on production and use of biomass, including sustainable 2nd generation bioethanol, for transport. The white paper presents the current knowledge of biofuels and bioethanol and recommendations for a Danish strategy. (ln)

  11. Time resolved 2nd harmonic generation at LaAlO3/SrTiO3 Interfaces

    Science.gov (United States)

    Adhikari, Sanjay; Eom, Chang-Beom; Ryu, Sangwoo; Cen, Cheng

    2014-03-01

    Ultrafast spectroscopy can produce information on carrier/lattice dynamics, which is especially valuable for understanding phase transitions at LaAlO3/SrTiO3 interfaces. LaAlO3 (LAO) and SrTiO3 (STO) both have wide band gaps, which allows deep penetration of commonly used laser wavelengths and therefore usually leads to an overwhelming bulk signal background. Here we report a time-resolved study of a 2nd harmonic generation (SHG) signal resulting from impulsive below-the-band-gap optical pumping. The nonlinear nature of the signal enables us to probe the interface directly. The output of a home-built Ti:Sapphire laser and a BBO crystal were used to generate 30 fs pulses of two colors (405 nm and 810 nm). The 405 nm pulse was used to pump the LAO/STO interfaces, while the 2nd harmonic of the 810 nm pulse generated at the interfaces was probed as a function of the time delay. Signals from samples with varying LAO thicknesses clearly correlate with the metal-insulator transition. Distinct time-dependent signals were observed at LAO/STO interfaces grown on different substrates. Experiments performed at different optical polarization geometries, interface electric fields and temperatures allow us to paint a clearer picture of the novel oxide heterostructures under investigation.

  12. Clinical evaluation of the 2nd generation radio-receptor assay for anti-thyrotropin receptor antibodies (TRAb) in Graves' disease

    International Nuclear Information System (INIS)

    Full text: Detection of autoantibodies to the TSH receptor by radioreceptor assays (RRA) is widely requested in clinical practice for the diagnosis of Graves' disease and its differentiation from diffuse thyroid autonomy. Additionally, TRAb measurement during antithyroid drug treatment can be useful to evaluate the risk of disease relapse after therapy discontinuation. Nevertheless, some patients affected by Graves' disease are TRAb negative when the 1st generation assay is used. Recently a new RRA method for the TRAb assay was developed using the human recombinant TSH receptor and a solid-phase technique. The aim of our work was to compare 1st and 2nd generation TRAb assays in Graves' disease patients and, particularly, to evaluate the 2nd generation test in a sub-group of patients affected by Graves' disease but negative by the 1st generation TRAb assay. We evaluated the diagnostic performance of a newly developed 2nd generation TRAb assay (DYNOtest(r) TRAK human, BRAHMS Diagnostica GmbH, Germany) in 46 patients affected by Graves' disease with a negative 1st generation TRAb assay (TRAK Assay(r), BRAHMS Diagnostica GmbH, Germany). Control groups of 50 Graves' disease patients with a positive 1st generation TRAb assay, 50 patients affected by Hashimoto's thyroiditis and 50 patients affected by nodular goiter were also examined. 41 out of 46 patients affected by Graves' disease with a negative 1st generation TRAb assay showed a positive 2nd generation test. The overall sensitivity of the 2nd generation test was significantly improved with respect to the 1st generation assay in Graves' disease patients (χ2 = 22.5, p<0.0001). 1 and 3 out of 50 patients affected by Hashimoto's thyroiditis were positive by the 1st and 2nd generation TRAb assays, respectively. All these patients showed primary hypothyroidism. No differences were found in the euthyroid Hashimoto's thyroiditis sub-group or in the nodular goiter control group.
The 2nd generation TRAb assay is clearly more sensitive than the 1st generation test and should be used in clinical practice to minimize the incidence of TRAb negative Graves' disease. Long-term prospective studies are needed to evaluate the prognostic role of the 2nd generation TRAb assay in Graves' disease treated by antithyroid drugs. (author)
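
    For paired assay comparisons of this kind, a McNemar-style statistic on the discordant pairs is a common choice. The sketch below is only illustrative: the 41-of-46 discordant count comes from the abstract, the assumption that no patient was 1st-generation-positive but 2nd-generation-negative is ours, and the paper's reported χ2 = 22.5 was presumably computed over the full series, so this toy calculation does not reproduce it:

```python
# Hedged sketch: McNemar-style comparison of paired assay results.
# b comes from the abstract; c = 0 is an assumption made for illustration.
b = 41   # negative by 1st-gen assay, positive by 2nd-gen assay
c = 0    # positive by 1st-gen, negative by 2nd-gen (assumed)

chi2 = (abs(b - c) - 1) ** 2 / (b + c)   # McNemar with continuity correction
print(f"McNemar chi2 = {chi2:.1f}")       # a large value -> the assays clearly differ
```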

  13. REFUEL. Potential and realizable cost reduction of 2nd generation biofuels

    International Nuclear Information System (INIS)

    In the REFUEL project, steering possibilities for and impacts of a greater market penetration of biofuels are assessed. Several benefits are attributed to second generation biofuels, fuels made from lignocellulosic feedstock, such as higher productivity, smaller impacts on land use and food markets, and improved greenhouse gas emission reductions. The chances of second generation biofuels entering the market autonomously are assessed and several policy measures enhancing those chances are evaluated. It shows that most second generation biofuels might become competitive in the biofuel market if the production of biodiesel from oil crops becomes limited by land availability. Setting high biofuel targets, setting greenhouse gas emission caps on biofuels and setting subtargets for second generation biofuels all have a similar impact of stimulating second generation's entrance into the biofuel market. Conversely, low biofuel targets and high imports can have a discouraging impact on second generation biofuel development, and thereby on overall greenhouse gas performance. Since this paper shows preliminary results from the REFUEL study, one is advised to contact the authors before quantitatively referring to this paper.

  14. Self-assembling software generator

    Science.gov (United States)

    Bouchard, Ann M. (Albuquerque, NM); Osbourn, Gordon C. (Albuquerque, NM)

    2011-11-25

    A technique to generate an executable task includes inspecting a task specification data structure to determine what software entities are to be generated to create the executable task, inspecting the task specification data structure to determine how the software entities will be linked after generating the software entities, inspecting the task specification data structure to determine logic to be executed by the software entities, and generating the software entities to create the executable task.
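
    Read as pseudocode, the three inspection steps can be sketched in a few lines: a task specification lists the entities, their links and their logic, and a generator assembles them into an executable task. Everything below (the dictionary layout, names and toy pipeline) is our illustrative reading of the abstract, not the patented implementation:

```python
# Hypothetical task specification: which entities to create, how they link,
# and the logic each one executes.
task_spec = {
    "entities": ["source", "doubler", "sink"],
    "links": [("source", "doubler"), ("doubler", "sink")],
    "logic": {"source": lambda _: 5, "doubler": lambda v: 2 * v, "sink": lambda v: v},
}

def build_task(spec):
    """Inspect the spec, generate linked entities, return the executable task."""
    downstream = dict(spec["links"])           # entity -> next entity
    def run():
        entity, value = spec["entities"][0], None
        while entity is not None:
            value = spec["logic"][entity](value)   # execute this entity's logic
            entity = downstream.get(entity)        # follow the link
        return value
    return run

task = build_task(task_spec)
print(task())   # -> 10
```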

  15. Bellman's GAP : a 2nd generation language and system for algebraic dynamic programming

    OpenAIRE

    Sauthoff, Georg

    2010-01-01

    The dissertation describes the new Bellman's GAP, which is a programming system for writing dynamic programming algorithms over sequential data. It is the second generation implementation of the algebraic dynamic programming framework (ADP). The system includes the multi-paradigm language (GAP-L), its compiler (GAP-C), functional modules (GAP-M) and a web site (GAP Pages) to experiment with GAP-L programs. GAP-L includes declarative constructs, e.g. tree grammars to model the search space, an...
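
    The core ADP idea that Bellman's GAP implements, a declarative search-space description evaluated under interchangeable scoring algebras, can be illustrated with a toy recursion (this is plain Python, not GAP-L, and the example is ours):

```python
# Toy ADP illustration: one recursion over the search space of all full binary
# bracketings of a sequence, parameterized by an "algebra" of leaf/pair/choice.
from functools import lru_cache

def bracketings(n, algebra):
    """Evaluate all ways to fully bracket n leaves under the given algebra."""
    @lru_cache(maxsize=None)
    def go(i, j):
        if j - i == 1:
            return algebra["leaf"](i)
        results = [algebra["pair"](go(i, k), go(k, j)) for k in range(i + 1, j)]
        return algebra["choice"](results)
    return go(0, n)

# A counting algebra: how many bracketings exist (the Catalan numbers).
count = {"leaf": lambda i: 1, "pair": lambda a, b: a * b, "choice": sum}
print(bracketings(4, count))   # -> 5 (Catalan number C_3)
```

Swapping in a different algebra (e.g. max-scoring instead of counting) reuses the same search-space description, which is the separation ADP is built on.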

  16. Cogeneration and production of 2nd generation bio fuels using biomass gasification; Cogeneracion y produccion de biocombustibles de 2 generacion mediante gasificacion de biomasa

    Energy Technology Data Exchange (ETDEWEB)

    Uruena Leal, A.; Diez Rodriguez, D.; Antolin Giraldo, G.

    2011-07-01

    Gasification is a thermochemical decomposition process in which a carbonaceous fuel, under certain conditions of temperature and oxygen deficiency, undergoes a series of reactions that produce gaseous products. It is now widely used because of its high energy performance and the versatility of these gaseous products for energy and 2nd generation biofuels, and because it reduces the emission of greenhouse gases. (Author)

  17. Next generation LP system for maintenance in nuclear power reactors (2nd report)

    International Nuclear Information System (INIS)

    Laser peening is a surface enhancement process that introduces compressive residual stress in materials by irradiating them with laser pulses under an aqueous environment. The process utilizes the impulsive effect of the high-pressure plasma generated by the ablative interaction of each laser pulse. Around a decade ago, the authors invented a new laser peening (LP) process that requires no surface preparation, while conventional types required a coating that prevented the surface from melting. Taking advantage of the new process without surface preparation, we have applied laser peening without coating to nuclear power plants as preventive maintenance against stress corrosion cracking (SCC). Toshiba released the first LP system in 1999, which delivered laser pulses through waterproof pipes with mirrors. In 2002, fiber delivery was attained and significantly extended the applicability. Now, the development of a new system has just been accomplished, which is extremely simple, reliable and easy to handle. (author)

  18. Using 2nd generation tyrosine kinase inhibitors in frontline management of chronic phase chronic myeloid leukemia

    Science.gov (United States)

    Jayakar, Vishal

    2014-01-01

    Choices in medicine come with responsibility. With several TKIs (tyrosine kinase inhibitors) available for front-line management of CML (chronic myeloid leukemia), an astute clinician has to personalise, rationalise and take a pragmatic approach towards selection of the best drug for the ‘patient in question’. Though it is hotly debated as to which TKI will triumph, the truth of this debate lies in individualising treatment rather than a general ‘one size fits all’ approach with imatinib. I personally believe that the second generation TKIs will suit most patient clinical profiles rather than prescribing imatinib to all, and I have strived to make a strong case for them in front-line treatment of CML. Though imatinib may remain the first-line choice for some patients, my efforts in this debate are mainly geared towards breaking the myth that imatinib is the sole ‘blockbuster’ on the CML landscape. PMID:24665456


  20. The 2nd Generation Street Children (SGSC) in Accra: Developing Teaching Strategies To Enhance Positive Learning Outcomes in Schools

    Directory of Open Access Journals (Sweden)

    Alhassan Abdul-Razak Kuyini

    2011-10-01

    Full Text Available Ghana is witnessing an increasing number of 2nd generation street children (SGSC) living on the streets of Accra, the capital city, as a result of many factors including teenage pregnancy among street girls, ethnic conflicts and rural-urban migration. The street presents enormous risks to street children; they are excluded from a safe family environment, from basic services like health and education, and from protection against exploitation. This article explored the inclusion of 27 SGSC in regular schools in Accra. Using qualitative methods, we obtained data from 15 teachers and social workers directly involved in the inclusion project. Findings revealed that SGSC were provided with pedagogical materials, daily feeding and school-related needs to encourage school attendance. To enhance positive learning outcomes for SGSC, teachers employed explicit instruction, cooperative learning, and social skills instruction. The study concludes that inclusion of SGSC in regular schools requires a willing and responsible school leadership; a comprehensive needs assessment including street mapping and social investigation on SGSC; financial support; and training of school personnel on streetism.

  1. Strategies for 2nd generation biofuels in EU - Co-firing to stimulate feedstock supply development and process integration to improve energy efficiency and economic competitiveness

    International Nuclear Information System (INIS)

    The present biofuel policies in the European Union primarily stimulate 1st generation biofuels that are produced from conventional food crops. They may be a distraction from lignocellulose-based 2nd generation biofuels - and also from biomass use for heat and electricity - by keeping farmers' attention and significant investments focused on first generation biofuels and the cultivation of conventional food crops as feedstocks. This article presents two strategies that can contribute to the development of 2nd generation biofuels based on lignocellulosic feedstocks. The integration of gasification-based biofuel plants in district heating systems is one option for increasing the energy efficiency and improving the economic competitiveness of such biofuels. Another option, biomass co-firing with coal, generates high-efficiency biomass electricity and reduces CO2 emissions by replacing coal. It also offers a near-term market for lignocellulosic biomass, which can stimulate the development of supply systems for biomass that is also suitable as feedstock for 2nd generation biofuels. Regardless of the long-term priorities of biomass use for energy, the stimulation of lignocellulosic biomass production by developing near-term and cost-effective markets is judged to be a no-regrets strategy for Europe. Strategies that induce a relevant development and exploit existing energy infrastructures in order to reduce risk and reach lower costs are proposed as an attractive complement to present and prospective biofuel policies.

  2. Experimental Investigation of 2nd Generation Bioethanol Derived from Empty-Fruit-Bunch (EFB) of Oil-Palmon Performance and Exhaust Emission of SI Engine

    OpenAIRE

    Yanuandri Putrasari; Haznan Abimanyu; Achmad Praptijanto; Arifin Nur; Yan Irawan; Sabar Pangihutan

    2014-01-01

    The experimental investigation of 2nd generation bioethanol derived from EFB of oil-palm blended with gasoline at 10, 20 and 25% by volume, and of pure gasoline, was conducted with performance and exhaust emission tests on an SI engine. A four-stroke, four-cylinder, programmed fuel injection (PGMFI), 16-valve variable valve timing and electronic lift control (VTEC), single overhead camshaft (SOHC), 1497 cm3 SI engine (Honda/L15A) was used in this investigation. The engine performance test was carried ...

  3. Development of WWER-440 fuel. Use of fuel assemblies of 2-nd and 3-rd generations with increased enrichment

    International Nuclear Information System (INIS)

    The problem of increasing the power of units at NPPs with WWER-440 reactors is of current importance. All the necessary prerequisites for solving this problem exist as a result of updating the design of fuel assemblies and codes. The decrease of the power peaking factor in the core is achieved by using profiled fuel assemblies, fuel-integrated burnable absorber and FAs with a modernized docking unit, together with modern codes, which allow decreasing the conservatism of RP safety substantiation. A wide range of experimental studies of fuel behaviour in transition and emergency conditions has been performed for fuel that has reached burn-up of (50-60) MW·day/kgU, and post-reactor studies of fuel assemblies, fuel rods and fuel pellets with a 5-year operating period have been performed, which prove the high reliability of the fuel and the presence of a large margin in the fuel column, which supports reactor operation at increased power. The results of the work performed on introduction of 5-6 fuel cycles show that the ultimate fuel state on operability in WWER-440 reactors is far from being reached. Neutron-physical and thermal-hydraulic characteristics of the cores of operating power units with RP V-213 are such that actual (design and measured) power peaking factors on fuel assemblies and fuel rods are, as a rule, smaller than the maximum design values. This factor is a real reserve for power uprating. There is experience of operating Units 1, 2 and 4 of the Kola NPP and Unit 2 of the Rovno NPP at increased power. Units of the Loviisa NPP are operated at 109 % power. During transition to operation at increased power it is reasonable to use fuel assemblies with an increased height of the fuel column, which allows decreasing the average linear power density.
Further development of the 2nd generation fuel assembly design and consequent transition to working fuel assemblies of the 3rd generation provides significant improvement of fuel consumption under the conditions of WWER-440 reactor operation with longer fuel cycles and increased power.

  4. Open pit mine planning and design. Vol 1. Fundamentals; Vol. 2. CSMine software package and orebody case examples. 2nd ed.

    Energy Technology Data Exchange (ETDEWEB)

    Hustrulid, W.; Kuchta, M. [University of Utah, Salt Lake City, UT (United States)

    2006-04-15

    This book is designed to be both a textbook and a reference book describing the principles involved in the planning and design of open pit mines. Volume 1 deals with the fundamental concepts involved in the planning and design of an open pit mine. The eight chapters cover mine planning, mining revenues and costs, orebody description, geometrical considerations, pit limits, production planning, mineral resources and ore reserves, and responsible mining. There is extensive coverage of environmental considerations and basic economic principles. A large number of examples have been included to illustrate the applications. The second volume is devoted to a mine design and software package, CSMine. CSMine is user-friendly mine planning and design software developed specifically to illustrate the practical application of the principles involved. The volume also comprises the CSMine tutorial, the CSMine user's manual and eight orebody case examples, including drillhole data sets for performing a complete open pit mine evaluation. 545 ills., 211 tabs.
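
    Pit-limit and production-planning work of the kind the book covers rests on block economic values. A generic net-value formula with made-up numbers (not taken from the book or from CSMine) looks like this:

```python
# Illustrative block-value calculation of the kind used when defining pit
# limits; all figures below are hypothetical.
def block_value(tonnes, grade, price, recovery, mining_cost, processing_cost):
    """Net value of mining and processing one ore block (grade as a fraction)."""
    revenue = tonnes * grade * recovery * price
    cost = tonnes * (mining_cost + processing_cost)
    return revenue - cost

# 10,000 t block at 1.2% Cu, copper at $8,000/t metal, 85% recovery,
# $2/t mining and $10/t processing cost (all values invented)
value = block_value(10_000, 0.012, 8_000, 0.85, 2.0, 10.0)
print(f"block value: ${value:,.0f}")
```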

  5. Generative Software Engineering

    OpenAIRE

    Jézéquel, Jean-Marc

    2007-01-01

    Researching ever more abstract and powerful ways of composing programs has been the meat of software engineering for half a century. Important early steps were subroutines (to encapsulate actions) and records (to encapsulate data). A large step forward came with the introduction of the object-oriented concepts (classes, subclasses and virtual methods), where classes can encapsulate both data and behaviors in a very powerful, but still flexible, way. For a long time, these concepts dominated the scene...

  6. 2nd Avenue Online

    Science.gov (United States)

    Over a century ago, Yiddish theater was all the rage in New York and other major American cities with a sizable Jewish population. A wide range of well known performers (such as Paul Muni and Leonard Nimoy) cut their teeth on these stages. Of course, the 2nd Avenue corridor in New York City held many of these Yiddish theaters and this site from the New York University Libraries seeks "to capture the memory and to convey the feel of 2nd Avenue as a living part of the history and culture of New York and America." Visitors to the site can browse around the Multimedia area to listen to oral histories or check out some video clips. The Photos area includes a history of Yiddish theater in New York along with several family photo albums. The site is rounded out by a collection of related radio programs and stations.

  7. Space Ops 2002: Bringing Space Operations into the 21st Century. Track 3: Operations, Mission Planning and Control. 2nd Generation Reusable Launch Vehicle-Concepts for Flight Operations

    Science.gov (United States)

    Hagopian, Jeff

    2002-01-01

    With the successful implementation of the International Space Station (ISS), the National Aeronautics and Space Administration (NASA) enters a new era of opportunity for scientific research. The ISS provides a working laboratory in space, with tremendous capabilities for scientific research. Utilization of these capabilities requires a launch system capable of routinely transporting crew and logistics to/from the ISS, as well as supporting ISS assembly and maintenance tasks. The Space Shuttle serves as NASA's launch system for performing these functions. The Space Shuttle also serves as NASA's launch system for supporting other science and servicing missions that require a human presence in space. The Space Shuttle provides proof that reusable launch vehicles are technically and physically implementable. However, two problems faced by NASA are the prohibitive cost of operating and maintaining the Space Shuttle and its relative inability to support high launch rates. The 2nd Generation Reusable Launch Vehicle (2nd Gen RLV) is NASA's solution to this problem. The 2nd Gen RLV will provide a robust launch system with increased safety, improved reliability and performance, and less cost. The improved performance and reduced costs of the 2nd Gen RLV will free up resources currently spent on launch services. These resource savings can then be applied to scientific research, which in turn can be supported by the higher launch rate capability of the 2nd Gen RLV. The result is a win-win situation for science and NASA. While meeting NASA's needs, the 2nd Gen RLV also provides the United States aerospace industry with a commercially viable launch capability. One of the keys to achieving the goals of the 2nd Gen RLV is to develop and implement new technologies and processes in the area of flight operations. 
NASA's experience in operating the Space Shuttle and the ISS has brought to light several areas where automation can be used to augment or eliminate functions performed by crew and ground controllers. This experience has also identified the need for new approaches to staffing and training for both crew and ground controllers. This paper provides a brief overview of the mission capabilities provided by the 2nd Gen RLV, a description of NASA's approach to developing the 2nd Gen RLV, a discussion of operations concepts, and a list of challenges to implementing those concepts.

  8. Anaerobic digestion in combination with 2nd generation ethanol production for maximizing biofuels yield from lignocellulosic biomass – testing in an integrated pilot-scale biorefinery plant.

    DEFF Research Database (Denmark)

    Uellendahl, Hinrich; Ahring, Birgitte Kiær

    An integrated biorefinery concept for 2nd generation bioethanol production together with biogas production from the fermentation effluent was tested at pilot scale. The pilot plant comprised pretreatment, enzymatic hydrolysis, hexose and pentose fermentation into ethanol, and anaerobic digestion of the fermentation effluent in a UASB (upflow anaerobic sludge blanket) reactor. Operation of the 770 liter UASB reactor was tested under both mesophilic (38°C) and thermophilic (53°C) conditions with increasing loading rates of the liquid fraction of the effluent from ethanol fermentation. At an OLR of 3.5 kg-VS/(m3·d), a methane yield of 340 L/kg-VS was achieved for thermophilic operation, while 270 L/kg-VS was obtained under mesophilic conditions. Thermophilic operation was, however, less robust towards further increases of the loading rate, and for loading rates higher than 5 kg-VS/(m3·d) the yield was higher for mesophilic than for thermophilic operation. The effluent from the ethanol fermentation showed no signs of toxicity to the anaerobic microorganisms. Implementation of biogas production from the fermentation effluent accounted for an about 30% higher biofuels yield in the biorefinery compared to a system with only bioethanol production.
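
    The reported figures can be combined into a back-of-the-envelope estimate of daily methane production; the numbers below are the abstract's, the arithmetic is ours:

```python
# Daily methane production implied by the abstract's figures for the 770 L
# UASB reactor under thermophilic operation.
reactor_volume_m3 = 0.770
olr = 3.5              # organic loading rate, kg-VS per m3 per day
methane_yield = 340.0  # L CH4 per kg-VS (thermophilic, at OLR 3.5)

vs_fed_per_day = reactor_volume_m3 * olr           # ~2.7 kg-VS/day
methane_per_day = vs_fed_per_day * methane_yield   # L CH4/day
print(f"{methane_per_day:.0f} L CH4/day")          # ~916 L/day
```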

  9. Experimental Investigation of 2nd Generation Bioethanol Derived from Empty-fruit-bunch (EFB) of Oil-palm on Performance and Exhaust Emission of SI Engine

    Directory of Open Access Journals (Sweden)

    Yanuandri Putrasari

    2014-07-01

    Full Text Available The experimental investigation of 2nd generation bioethanol derived from EFB of oil-palm blended with gasoline at 10, 20 and 25% by volume, and of pure gasoline, was conducted with performance and exhaust emission tests on an SI engine. A four-stroke, four-cylinder, programmed fuel injection (PGMFI), 16-valve variable valve timing and electronic lift control (VTEC), single overhead camshaft (SOHC), 1,497 cm3 SI engine (Honda/L15A) was used in this investigation. The engine performance test covered brake torque, power, and fuel consumption. The exhaust emission was analyzed for carbon monoxide (CO) and hydrocarbon (HC). The engine was operated over a speed range from 1,500 to 4,500 rev/min at an 85% throttle opening position. The results showed that the highest brake torque of the bioethanol blends was achieved with 10% bioethanol content at 3,000 to 4,500 rpm, the brake power was greater than pure gasoline at 3,500 to 4,500 rpm for 10% bioethanol, and bioethanol-gasoline blends of 10 and 20% resulted in greater bsfc than pure gasoline at low speeds from 1,500 to 3,500 rpm. CO and HC emissions tended to decrease as the engine speed increased.

  10. Power plant intake quantification of wheat straw composition for 2nd generation bioethanol optimization : A Near Infrared Spectroscopy (NIRS) feasibility study

    DEFF Research Database (Denmark)

    Lomborg, Carina J.; Thomsen, Mette Hedegaard

    2010-01-01

    Optimization of 2nd generation bioethanol production from wheat straw requires comprehensive knowledge of the composition of the plant-intake feedstock. Near Infrared Spectroscopy is evaluated as a potential method for instantaneous quantification of the salient fermentation wheat straw components: cellulose (glucan), hemicelluloses (xylan, arabinan), and lignin. Aiming at chemometric multivariate calibration, 44 pre-selected samples were subjected to spectroscopy and reference analysis. For glucan and xylan, prediction accuracies (slope: 0.89, 0.94) and precisions (r2: 0.87) were obtained, corresponding to errors of prediction at the 8-9% level. Models for arabinan and lignin were marginally less good, and especially for lignin a further expansion of the feasibility dataset was deemed necessary. The results are related to significant influences from sub-sampling/mass reduction errors in the laboratory regimen. A relatively high proportion of outliers excluded from the present models (10-20%) may indicate that comminution sample preparation is most likely always needed. Different solutions to these issues are suggested.

  11. Direct and non-destructive proof of authenticity for the 2nd generation of Brazilian real banknotes via easy ambient sonic spray ionization mass spectrometry.

    Science.gov (United States)

    Schmidt, Eduardo Morgado; Franco, Marcos Fernando; Regino, Karen Gomes; Lehmann, Eraldo Luiz; Arruda, Marco Aurélio Zezzi; de Carvalho Rocha, Werickson Fortunato; Borges, Rodrigo; de Souza, Wanderley; Eberlin, Marcos Nogueira; Correa, Deleon Nascimento

    2014-12-01

    Using a desorption/ionization technique, easy ambient sonic-spray ionization coupled to mass spectrometry (EASI-MS), documents related to the 2nd generation of Brazilian Real currency (R$) were screened in the positive ion mode for authenticity, based on chemical profiles obtained directly from the banknote surface. Characteristic profiles were observed for authentic, seized suspect counterfeit, and counterfeit homemade banknotes from inkjet and laserjet printers. The chemicals on the authentic banknotes' surface were detected via a few minor sets of ions, namely from the plasticizers bis(2-ethylhexyl) phthalate (DEHP) and dibutyl phthalate (DBP), most likely related to the official offset printing process, and other common quaternary ammonium cations, presenting a chemical profile similar to that of 1st-generation R$. The seized suspect counterfeit banknotes, however, displayed abundant diagnostic ions in the m/z 400-800 range due to the presence of oligomers. High-accuracy FT-ICR MS analysis enabled molecular formula assignment for each ion. The ions were separated by 44 m/z, which enabled their characterization as Surfynol® 4XX (S4XX, XX = 40, 65, and 85), wherein increasing XX values indicate increasing amounts of ethoxylation on a backbone of 2,4,7,9-tetramethyl-5-decyne-4,7-diol (Surfynol® 104). Sodiated triethylene glycol monobutyl ether (TBG) of m/z 229 (C10H22O4Na) was also identified in the seized counterfeit banknotes via EASI(+) FT-ICR MS. Surfynol® and TBG are constituents of inks used for inkjet printing. PMID:25498934
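
    The diagnostic 44 m/z spacing (one ethylene-oxide repeat unit) can be picked out of a peak list with a simple scan. The function and the peak values below are invented for illustration, except m/z 229 (TBG), which is taken from the abstract:

```python
# Hedged sketch: detect ion series spaced by ~44 m/z, the signature of
# ethoxylated surfactants such as Surfynol(R) oligomers.
def find_44_series(peaks, spacing=44.0, tol=0.2, min_length=3):
    """Return runs of peaks separated by ~`spacing` m/z."""
    peaks = sorted(peaks)
    series, run = [], [peaks[0]]
    for mz in peaks[1:]:
        if abs(mz - run[-1] - spacing) <= tol:
            run.append(mz)          # continue the current oligomer series
        else:
            if len(run) >= min_length:
                series.append(run)
            run = [mz]              # start a new candidate series
    if len(run) >= min_length:
        series.append(run)
    return series

# Invented peak list: one 44-spaced oligomer run plus unrelated peaks
peaks = [229.1, 411.3, 455.3, 499.4, 543.4, 587.4, 610.0]
print(find_44_series(peaks))   # one run: 411.3 ... 587.4
```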

  12. Overview on 1st and 2nd generation coal-fired membrane power plants (with and without turbo machinery in the membrane environment)

    Energy Technology Data Exchange (ETDEWEB)

    L. Blum; E. Riensche; J. Nazarko; R. Menzer; D. Stolten [Forschungszentrum Juelich GmbH Institute of Energy Research - Fuel Cells (IEF-3), Juelich (Germany)

    2009-07-01

    A systematic classification of the capture concepts with conventional separation as well as membrane separation is discussed in a 2-dimensional matrix: The 4 capture principles (post-combustion, oxyfuel, pre-combustion-capture of CO{sub 2} and pre-combustion-capture of H{sub 2}), characterized by the 4 separation tasks CO{sub 2}/N{sub 2}, O{sub 2}/N{sub 2}, CO{sub 2}/H{sub 2} and H{sub 2}/CO{sub 2}, have to be applied to the 3 different coal power plant (PP) routes: SPP (steam PP), IGCC/standard and IGCC/CO-shift/H{sub 2}-turbine. In the case of membrane separation, a further dimension of PP concepts is created by the fact that different measures exist for realizing positive driving forces for permeation. For example, the O{sub 2}/N{sub 2} separating membranes in oxyfuel SPPs can be operated with feed gas compression, permeate vacuum, application of a sweep gas at the permeate side, or combinations of these 3 measures. An overview is given of the membrane PP concepts currently under development (post-combustion and oxyfuel in SPPs, pre-combustion in IGCC). In all cases energy-consuming turbo machinery is required for membrane operation or for CO{sub 2} or H{sub 2} recompression in the case of pre-combustion (1st generation of membrane coal PPs). Calculated efficiency losses are not significantly below 10 %-points. An outlook is given to a new IGCC concept, where a suitable sweep gas (N{sub 2} with low O{sub 2} content) of sufficiently high flow rate is produced (related to the permeated H{sub 2}). Now the swept H{sub 2}/CO{sub 2} membrane operates without turbo machinery (2nd generation of membrane coal PPs). Lower efficiency losses (between 5 and 10 %-points) now seem possible. 10 refs., 18 figs.

  13. Automatic Code Generation for Instrument Flight Software

    Science.gov (United States)

    Wagstaff, Kiri L.; Benowitz, Edward; Byrne, D. J.; Peters, Ken; Watney, Garth

    2008-01-01

    Automatic code generation can be used to convert software state diagrams into executable code, enabling a model-based approach to software design and development. The primary benefits of this process are reduced development time and continuous consistency between the system design (statechart) and its implementation. We used model-based design and code generation to produce software for the Electra UHF radios that is functionally equivalent to software that will be used by the Mars Reconnaissance Orbiter (MRO) and the Mars Science Laboratory to communicate with each other. The resulting software passed all of the relevant MRO flight software tests, and the project provides a useful case study for future work in model-based software development for flight software systems.
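
    The statechart-to-code idea can be sketched compactly: hold the state diagram as a transition table and emit source text that implements it. The toy radio model and all names below are assumptions for illustration; the actual Electra/MRO toolchain is not described in this record.

```python
# Toy sketch of statechart-to-code generation (the radio model and all names
# are assumptions for illustration, not the actual Electra/MRO toolchain).
# A state diagram is held as a transition table, Python source implementing
# it is emitted, and the result is "compiled" with exec().
transitions = {  # (state, event) -> next state
    ("idle", "carrier"): "receiving",
    ("receiving", "done"): "idle",
    ("idle", "send"): "transmitting",
    ("transmitting", "done"): "idle",
}

def generate(table, initial="idle"):
    """Emit Python source for a class implementing the transition table."""
    lines = [
        "class StateMachine:",
        f"    def __init__(self): self.state = {initial!r}",
        "    def fire(self, event):",
    ]
    for (src, event), dst in table.items():
        lines.append(f"        if self.state == {src!r} and event == {event!r}:")
        lines.append(f"            self.state = {dst!r}; return")
    lines.append("        pass  # unmatched events are ignored")
    return "\n".join(lines)

namespace = {}
exec(generate(transitions), namespace)  # compile the generated source
radio = namespace["StateMachine"]()
radio.fire("carrier")
print(radio.state)  # receiving
radio.fire("done")
print(radio.state)  # idle
```

    Because the class is generated from the table, the "design" (the table) and the "implementation" (the emitted source) cannot drift apart, which is the consistency benefit the abstract describes.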

  14. Quantification of left and right ventricular function and myocardial mass: Comparison of low-radiation dose 2nd generation dual-source CT and cardiac MRI

    International Nuclear Information System (INIS)

    Objective: To prospectively evaluate the accuracy of left and right ventricular function and myocardial mass measurements based on a dual-step, low radiation dose protocol with prospectively ECG-triggered 2nd generation dual-source CT (DSCT), using cardiac MRI (cMRI) as the reference standard. Materials and methods: Twenty patients underwent 1.5 T cMRI and prospectively ECG-triggered dual-step pulsing cardiac DSCT. This image acquisition mode performs low-radiation (20% tube current) imaging over the majority of the cardiac cycle and applies full radiation only during a single adjustable phase. Full-radiation-phase images were used to assess cardiac morphology, while low-radiation-phase images were used to measure left and right ventricular function and mass. Quantitative CT measurements based on contiguous multiphase short-axis reconstructions from the axial CT data were compared with short-axis SSFP cardiac cine MRI. Contours were manually traced around the ventricular borders for calculation of left and right ventricular end-diastolic volume, end-systolic volume, stroke volume, ejection fraction and myocardial mass for both modalities. Statistical methods included independent t-tests, the Mann–Whitney U test, Pearson correlation statistics, and Bland–Altman analysis. Results: All CT measurements of left and right ventricular function and mass correlated well with those from cMRI: for left/right end-diastolic volume r = 0.885/0.801, left/right end-systolic volume r = 0.947/0.879, left/right stroke volume r = 0.620/0.697, left/right ejection fraction r = 0.869/0.751, and left/right myocardial mass r = 0.959/0.702. Mean radiation dose was 6.2 ± 1.8 mSv. Conclusions: Prospectively ECG-triggered, dual-step pulsing cardiac DSCT accurately quantifies left and right ventricular function and myocardial mass in comparison with cMRI with substantially lower radiation exposure than reported for traditional retrospective ECG-gating.

  15. Test Sequence Generation for Distributed Software System

    OpenAIRE

    Shuai Wang; Yindong Ji; Shiyuan Yang

    2011-01-01

    This paper considers test case generation for distributed software (a test case contains one or more test sequences). When the single finite state machine (FSM) test approach is applied to distributed software, several problems arise: 1) state combinatorial explosion; 2) some unexecutable test cases may be generated; 3) some faults may be masked and cannot be isolated accurately. This paper proposes a new test case generation method based on the FSM net model. Instead of te...
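
    For the single-FSM baseline this record starts from, test sequence generation can be sketched as a transition cover: for every transition, emit a shortest input word reaching its source state, followed by the transition's own input. The two-state machine below is an assumed toy, not the paper's model.

```python
from collections import deque

# Sketch of FSM-based test sequence generation (the two-state machine is an
# assumed toy, not the paper's model): for every transition, emit a shortest
# input word reaching its source state followed by the transition's own
# input, i.e. a "transition cover" test suite.
fsm = {  # state -> {input: next_state}
    "s0": {"a": "s1", "b": "s0"},
    "s1": {"a": "s0", "b": "s1"},
}

def shortest_inputs(fsm, start, goal):
    """Breadth-first search returning a shortest input word from start to goal."""
    seen, queue = {start}, deque([(start, [])])
    while queue:
        state, word = queue.popleft()
        if state == goal:
            return word
        for inp, nxt in fsm[state].items():
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, word + [inp]))
    return None  # goal unreachable

def transition_cover(fsm, start="s0"):
    """One test sequence per transition of the machine."""
    return [
        shortest_inputs(fsm, start, src) + [inp]
        for src, edges in fsm.items()
        for inp in edges
    ]

for test in transition_cover(fsm):
    print("".join(test))  # a, b, aa, ab
```

    The distributed-software problems the abstract lists (state explosion, unexecutable cases) arise precisely because the product of several such machines grows multiplicatively, which motivates the paper's net-based model.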

  16. Developing software for Symbian OS 2nd edition: a beginner's guide to creating Symbian OS V9 smartphone applications in C++

    CERN Document Server

    Babin, Steve

    2008-01-01

    Many problems encountered by engineers developing code for specialized Symbian subsystems boil down to a lack of understanding of the core Symbian programming concepts. Developing Software for Symbian OS remedies this problem by providing comprehensive coverage of all the key concepts. Numerous examples and descriptions are also included, which focus on the concepts the author has seen developers struggle with the most. The book covers development ranging from low-level system programming to end-user GUI applications. It also covers the development and packaging tools, as well as providing detailed reference and examples for key APIs. The new edition includes a completely new chapter on platform security. The overall goal of the book is to provide introductory coverage of Symbian OS v9 and help developers with little or no knowledge of Symbian OS to develop as quickly as possible. There are few people with long Symbian development experience compared to demand, due to the rapid growth of Symbian in re...

  17. Experimental Stochastics (2nd edition)

    International Nuclear Information System (INIS)

    Otto Moeschlin and his co-authors have written a book about simulation of stochastic systems. The book comes with a CD-ROM that contains the experiments discussed in the book, and the text from the book is repeated on the CD-ROM. According to the authors, the aim of the book is to give a quick introduction to stochastic simulation for 'all persons interested in experimental stochastics'. To please this diverse audience, the authors offer a book that has four parts. Part 1, called 'Artificial Randomness', is the longest of the four parts. It gives an overview of the generation, testing and basic usage of pseudo random numbers in simulation. Although algorithms for generating sequences of random numbers are fundamental to simulation, it is a slightly unusual choice to give it such weight in comparison to other algorithmic topics. The remaining three parts consist of simulation case studies. Part 2, 'Stochastic Models', treats four problems - Buffon's needle, a queuing system, and two problems related to the kinetic theory of gases. Part 3 is called 'Stochastic Processes' and discusses the simulation of discrete time Markov chains, birth-death processes, Brownian motion and diffusions. The last section of Part 3 is about simulation as a tool to understand the traffic flow in a system controlled by stoplights, an area of research for the authors. Part 4 is called 'Evaluation of Statistical Procedures'. This section contains examples where simulation is used to test the performance of statistical methods. It covers four examples: the Neyman-Pearson lemma, the Wald sequential test, Bayesian point estimation and Hartigan procedures. The CD-ROM contains an easy-to-install software package that runs under Microsoft Windows. The software contains the text and simulations from the book. What I found most enjoyable about this book is the number of topics covered in the case studies. 
The highly individual selection of applications, which may serve as a source of inspiration for teachers of computational stochastic methods, is the main contribution of this electronic monograph. However, both the book and software suffer from several severe problems. Firstly, I feel that the structure of the text is weak. Probably this is partly the result of the text from the CD-ROM being put into a book format, but the short paragraphs and poorly structured sentences destroy the reading experience. Secondly, although the software is functional, I believe that, like me, many users will be disappointed by the quality of the user interface and the visualizations. The opportunities to interact with the simulations are limited. Thirdly, the presentation is slightly old fashioned and lacking in pedagogical structure. For example, flow charts and Pascal programs are used to present algorithms. To conclude, I am surprised that this electronic monograph warranted a second edition in this form. Teachers may find the examples useful as a starting point, but students and researchers are advised to look elsewhere. (book review)
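
    Buffon's needle, the first of the case studies mentioned above, is easy to reproduce: a needle of length l dropped on a floor ruled with parallel lines spaced d >= l apart crosses a line with probability 2l/(pi*d), so the observed hit rate yields a Monte Carlo estimate of pi. A minimal sketch:

```python
import math
import random

# Buffon's needle, one of the book's case studies: a needle of length l
# dropped on a floor ruled with parallel lines spaced d >= l apart crosses a
# line with probability 2*l/(pi*d), so the hit rate yields an estimate of pi.
def estimate_pi(drops=200_000, l=1.0, d=1.0, seed=42):
    rng = random.Random(seed)
    hits = 0
    for _ in range(drops):
        y = rng.uniform(0, d / 2)            # centre distance to nearest line
        theta = rng.uniform(0, math.pi / 2)  # needle angle against the lines
        if y <= (l / 2) * math.sin(theta):   # needle crosses the line
            hits += 1
    return 2 * l * drops / (d * hits)

print(round(estimate_pi(), 2))  # an estimate close to 3.14
```
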

  18. Integration of health management and support systems is key to achieving cost reduction and operational concept goals of the 2nd generation reusable launch vehicle

    Science.gov (United States)

    Koon, Phillip L.; Greene, Scott

    2002-07-01

    Our aerospace customers are demanding that we drastically reduce the cost of operating and supporting our products. Our space customer in particular is looking for the next generation of reusable launch vehicle systems to support more aircraft-like operation. Achieving this goal requires more than an evolution in materials, processes and systems; what is required is a paradigm shift in the design of the launch vehicles and the processing systems that support them. This paper describes the Automated Informed Maintenance System (AIM) we are developing for NASA's Space Launch Initiative (SLI) Second Generation Reusable Launch Vehicle (RLV). Our system includes an Integrated Health Management (IHM) system for the launch vehicles and ground support systems, which features model-based diagnostics and prognostics. Health Management data is used by our AIM decision support and process aids to automatically plan maintenance, generate work orders and schedule maintenance activities along with the resources required to execute these processes. Our system will automate the ground processing for a spaceport handling multiple RLVs executing multiple missions. To accomplish this task we are applying the latest web-based distributed computing technologies and application development techniques.

  19. A Practical GLR Parser Generator for Software Reverse Engineering

    OpenAIRE

    Teng Geng; Fu Xu; Han Mei; Wei Meng; Zhibo Chen; Changqing Lai

    2014-01-01

    Traditional parser generators use deterministic parsing methods. These methods cannot meet the parsing requirements of software reverse engineering effectively. A new parser generator is presented which can generate a GLR parser with automatic error recovery. The generated GLR parser has parsing speed comparable with the traditional LALR(1) parser and can be used for parsing in software reverse engineering.

  20. A Practical GLR Parser Generator for Software Reverse Engineering

    Directory of Open Access Journals (Sweden)

    Teng Geng

    2014-03-01

    Full Text Available Traditional parser generators use deterministic parsing methods. These methods cannot meet the parsing requirements of software reverse engineering effectively. A new parser generator is presented which can generate a GLR parser with automatic error recovery. The generated GLR parser has parsing speed comparable with the traditional LALR(1) parser and can be used for parsing in software reverse engineering.
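
    The core GLR idea, pursuing every shift/reduce alternative in parallel instead of committing deterministically, can be illustrated with a toy parser. The sketch below forks whole stack configurations rather than using GLR's graph-structured stack, and the ambiguous grammar is an assumed example:

```python
# Toy illustration of the core GLR idea: when shift and reduce both apply,
# fork the configuration and pursue every alternative in parallel. Real GLR
# parsers use a graph-structured stack for efficiency; this sketch forks
# whole stacks, and the ambiguous grammar is an assumed example.
GRAMMAR = [("S", ("S", "+", "S")), ("S", ("1",))]  # S -> S + S | 1

def head(node):
    """A stack element is a raw token or a (lhs, children) tree."""
    return node[0] if isinstance(node, tuple) else node

def glr_parses(tokens):
    """Return every parse tree deriving `tokens` from S."""
    frontier, seen, trees = {((), 0)}, set(), set()
    while frontier:
        nxt = set()
        for stack, pos in frontier:
            if (stack, pos) in seen:
                continue
            seen.add((stack, pos))
            if pos == len(tokens) and len(stack) == 1 and head(stack[0]) == "S":
                trees.add(stack[0])           # a complete parse
                continue
            if pos < len(tokens):             # alternative 1: shift
                nxt.add((stack + (tokens[pos],), pos + 1))
            for lhs, rhs in GRAMMAR:          # alternative 2: reduce the top
                top = stack[len(stack) - len(rhs):]
                if len(top) == len(rhs) and all(
                        head(n) == s for n, s in zip(top, rhs)):
                    nxt.add((stack[:len(stack) - len(rhs)] + ((lhs, top),), pos))
        frontier = nxt
    return trees

print(len(glr_parses(("1", "+", "1", "+", "1"))))  # 2 (both associativities)
```

    Dead-end configurations simply die out, which is how a GLR parser tolerates local conflicts that would stop an LALR(1) parser.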

  1. Techno-economic evaluation of 2nd generation bioethanol production from sugar cane bagasse and leaves integrated with the sugar-based ethanol process

    Directory of Open Access Journals (Sweden)

    Macrelli Stefano

    2012-04-01

    Full Text Available Abstract Background Bioethanol produced from the lignocellulosic fractions of sugar cane (bagasse and leaves), i.e. second generation (2G) bioethanol, has a promising market potential as an automotive fuel; however, the process is still under investigation on pilot/demonstration scale. From a process perspective, improvements in plant design can lower the production cost, providing better profitability and competitiveness if the conversion of the whole sugar cane is considered. Simulations have been performed with AspenPlus to investigate how process integration can affect the minimum ethanol selling price of this 2G process (MESP-2G), as well as improve the plant energy efficiency. This is achieved by integrating the well-established sucrose-to-bioethanol process with the enzymatic process for lignocellulosic materials. Bagasse and leaves were steam pretreated using H3PO4 as catalyst and separately hydrolysed and fermented. Results The addition of a steam dryer, doubling of the enzyme dosage in enzymatic hydrolysis, including leaves as raw material in the 2G process, heat integration and the use of more energy-efficient equipment led to a 37 % reduction in MESP-2G compared to the Base case. Modelling showed that the MESP for 2G ethanol was 0.97 US$/L, while in the future it could be reduced to 0.78 US$/L. In this case the overall production cost of 1G + 2G ethanol would be about 0.40 US$/L with an output of 102 L/ton dry sugar cane including 50 % leaves. Sensitivity analysis of the future scenario showed that a 50 % decrease in the cost of enzymes, electricity or leaves would lower the MESP-2G by about 20%, 10% and 4.5%, respectively. Conclusions According to the simulations, the production of 2G bioethanol from sugar cane bagasse and leaves in Brazil is already competitive (without subsidies) with 1G starch-based bioethanol production in Europe. 
Moreover 2G bioethanol could be produced at a lower cost if subsidies were used to compensate for the opportunity cost from the sale of excess electricity and if the cost of enzymes continues to fall.

  2. The 2nd Generation z(Redshift) and Early Universe Spectrometer Part I: First-light observation of a highly lensed local-ULIRG analog at high-z

    CERN Document Server

    Ferkinhoff, Carl; Parshley, Stephen; Nikola, Thomas; Stacey, Gordon J; Schoenwald, Justin; Higdon, James L; Higdon, Sarah J U; Verma, Aprajita; Riechers, Dominik; Hailey-Dunsheath, Steven; Menten, Karl; Güsten, Rolf; Weiß, Axel; Irwin, Kent; Cho, Hsiao M; Niemack, Michael; Halpern, Mark; Amiri, Mandana; Hasselfield, Matthew; Wiebe, D V; Ade, Peter A R; Tucker, Carol E

    2013-01-01

    We report first science results from our new spectrometer, the 2nd generation z(Redshift) and Early Universe Spectrometer (ZEUS-2), recently commissioned on the Atacama Pathfinder Experiment telescope (APEX). ZEUS-2 is a submillimeter grating spectrometer optimized for detecting the faint and broad lines from distant galaxies that are redshifted into the telluric windows from 200 to 850 microns. It utilizes a focal plane array of transition-edge sensed bolometers, the first use of these arrays for astrophysical spectroscopy. ZEUS-2 promises to be an important tool for studying galaxies in the years to come due to its synergy with ALMA and its capabilities in the short submillimeter windows that are unique in the post Herschel era. Here we report on our first detection of the [CII] 158 $\\mu m$ line with ZEUS-2. We detect the line at z ~ 1.8 from H-ATLAS J091043.1-000322 with a line flux of $(6.44 \\pm 0.42) \\times 10^{-18} W m^{-2}$. Combined with its far-infrared luminosity and a new Herschel-PACS detection of...

  3. Automatic Testcase Generation for Flight Software

    Science.gov (United States)

    Bushnell, David Henry; Pasareanu, Corina; Mackey, Ryan M.

    2008-01-01

    The TacSat3 project is applying Integrated Systems Health Management (ISHM) technologies to an Air Force spacecraft for operational evaluation in space. The experiment will demonstrate the effectiveness and cost of ISHM and vehicle systems management (VSM) technologies through onboard operation for extended periods. We present two approaches to automatic testcase generation for ISHM: 1) A blackbox approach that views the system as a blackbox, and uses a grammar-based specification of the system's inputs to automatically generate *all* inputs that satisfy the specifications (up to prespecified limits); these inputs are then used to exercise the system. 2) A whitebox approach that performs analysis and testcase generation directly on a representation of the internal behaviour of the system under test. The enabling technologies for both these approaches are model checking and symbolic execution, as implemented in the Ames' Java PathFinder (JPF) tool suite. Model checking is an automated technique for software verification. Unlike simulation and testing which check only some of the system executions and therefore may miss errors, model checking exhaustively explores all possible executions. Symbolic execution evaluates programs with symbolic rather than concrete values and represents variable values as symbolic expressions. We are applying the blackbox approach to generating input scripts for the Spacecraft Command Language (SCL) from Interface and Control Systems. SCL is an embedded interpreter for controlling spacecraft systems. TacSat3 will be using SCL as the controller for its ISHM systems. We translated the SCL grammar into a program that outputs scripts conforming to the grammars. Running JPF on this program generates all legal input scripts up to a prespecified size. Script generation can also be targeted to specific parts of the grammar of interest to the developers. These scripts are then fed to the SCL Executive. 
ICS's in-house coverage tools will be run to measure code coverage. Because the scripts exercise all parts of the grammar, we expect them to provide high code coverage. This blackbox approach is suitable for systems for which we do not have access to the source code. We are applying whitebox test generation to the Spacecraft Health INference Engine (SHINE) that is part of the ISHM system. In TacSat3, SHINE will execute an on-board knowledge base for fault detection and diagnosis. SHINE converts its knowledge base into optimized C code which runs onboard TacSat3. SHINE can translate its rules into an intermediate representation (Java) suitable for analysis with JPF. JPF will analyze SHINE's Java output using symbolic execution, producing testcases that can provide either complete or directed coverage of the code. Automatically generated test suites can provide full code coverage and be quickly regenerated when code changes. Because our tools analyze executable code, they fully cover the delivered code, not just models of the code. This approach also provides a way to generate tests that exercise specific sections of code under specific preconditions. This capability gives us more focused testing of specific sections of code.
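
    The blackbox approach above can be sketched in miniature: take a small input grammar and enumerate every sentence up to a depth bound, then feed each one to the system under test. The command grammar below is an assumed toy, not the actual SCL grammar:

```python
from itertools import product

# Blackbox sketch: enumerate every sentence of a small input grammar up to a
# depth bound and use each one as a test input. The command grammar here is
# an assumed toy, not the actual SCL grammar.
GRAMMAR = {
    "cmd": [["verb", " ", "target"]],
    "verb": [["SET"], ["GET"]],
    "target": [["power"], ["mode"]],
}

def expand(symbol, depth=4):
    """Yield every string derivable from `symbol` within `depth` expansions."""
    if depth == 0:
        return
    if symbol not in GRAMMAR:  # terminal symbol
        yield symbol
        return
    for production in GRAMMAR[symbol]:
        # all combinations of the expansions of each symbol in the production
        pools = [list(expand(part, depth - 1)) for part in production]
        for combo in product(*pools):
            yield "".join(combo)

print(sorted(expand("cmd")))  # ['GET mode', 'GET power', 'SET mode', 'SET power']
```

    Because every sentence of the grammar (up to the bound) is generated, scripts built this way exercise all grammar productions, which is the coverage argument the abstract makes.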

  4. Generating Protocol Software from CPN Models Annotated with Pragmatics

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge; Kristensen, Lars M.

    2013-01-01

    Model-driven software engineering (MDSE) provides a foundation for automatically generating software based on models that focus on the problem domain while abstracting from the details of underlying implementation platforms. Coloured Petri Nets (CPNs) have been widely used to formally model and verify protocol software, but limited work exists on using CPN models of protocols as a basis for automated code generation. The contribution of this paper is a method for generating protocol software from a class of CPN models annotated with code generation pragmatics. Our code generation method consists of three main steps: automatically adding so-called derived pragmatics to the CPN model, computing an abstract template tree, which associates pragmatics with code templates, and applying the templates to generate code which can then be compiled. We illustrate our method using a unidirectional data framing protocol.
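
    The final template-application step can be sketched as follows; the pragmatics, template bodies and node names are assumptions for illustration, not the authors' actual tool:

```python
from string import Template

# Rough sketch of template-based code generation in the spirit of the paper
# (pragmatics, template bodies and node names are assumptions, not the
# authors' tool): each annotated model node maps to a code template, and the
# templates are rendered into source text.
TEMPLATES = {
    "<<sender>>": Template(
        "def send_${name}(sock, payload):\n    sock.sendall(payload)"),
    "<<receiver>>": Template(
        "def recv_${name}(sock):\n    return sock.recv(1024)"),
}

model = [  # (pragmatic annotation, node name) pairs from an annotated model
    ("<<sender>>", "data"),
    ("<<receiver>>", "ack"),
]

code = "\n\n".join(TEMPLATES[prag].substitute(name=node) for prag, node in model)
print(code)  # two generated function definitions
```
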

  5. Next-generation business intelligence software with Silverlight 3

    CERN Document Server

    Czernicki, Bart

    2010-01-01

    Business Intelligence (BI) software is the code and tools that allow you to view different components of a business using a single visual platform, making comprehending mountains of data easier. Applications that include reports, analytics, statistics, and historical and predictive modeling are all examples of BI applications. Currently, we are in the second generation of BI software, called BI 2.0. This generation is focused on writing BI software that is predictive, adaptive, simple, and interactive. As computers and software have evolved, more data can be presented to end users with increas

  6. Building Knowledge Bases for the Generation of Software Documentation

    OpenAIRE

    Paris, Cecile; Linden, Keith Vander

    1996-01-01

    Automated text generation requires an underlying knowledge base from which to generate, which is often difficult to produce. Software documentation is one domain in which parts of this knowledge base may be derived automatically. In this paper, we describe DRAFTER, an authoring support tool for generating user-centred software documentation, and in particular, we describe how parts of its required knowledge base can be obtained automatically.

  7. Using DSL for Automatic Generation of Software Connectors.

    Czech Academy of Sciences Publication Activity Database

    Bureš, Tomáš; Malohlava, M.; Hnětynka, P.

    Los Alamitos : IEEE Computer Society, 2008, s. 138-147. ISBN 978-0-7695-3091-8. [ICCBSS 2008. International Conference on Composition-Based Software Systems /7./. Madrid (ES), 25.02.2008-29.02.2008,] R&D Projects: GA AV ČR 1ET400300504 Institutional research plan: CEZ:AV0Z10300504 Keywords : component based systems * software connectors * code generation * domain-specific languages Subject RIV: JC - Computer Hardware ; Software

  8. Creating the next generation control system software

    International Nuclear Information System (INIS)

    A new 1980s-style support package for future accelerator control systems is proposed. It provides a way to create accelerator applications software without traditional programming. Visual Interactive Applications (VIA) is designed to meet the needs of expanded accelerator complexes in a more cost-effective way than past experience with procedural languages, by using technology from the personal computer and artificial intelligence communities. 4 refs

  9. Beyond the 2nd Fermi Pulsar Catalog

    CERN Document Server

    Hou, Xian; Reposeur, Thierry; Rousseau, Romain

    2013-01-01

    Over thirteen times more gamma-ray pulsars have now been studied with the Large Area Telescope on NASA's Fermi satellite than the ten seen with the Compton Gamma-Ray Observatory in the nineteen-nineties. The large sample is diverse, allowing better understanding both of the pulsars themselves and of their roles in various cosmic processes. Here we explore the prospects for even more gamma-ray pulsars as Fermi enters the 2nd half of its nominal ten-year mission. New pulsars will naturally tend to be fainter than the first ones discovered. Some of them will have unusual characteristics compared to the current population, which may help discriminate between models. We illustrate a vision of the future with a sample of six pulsars discovered after the 2nd Fermi Pulsar Catalog was written.

  10. Saturn V SII (2nd Stage)

    Science.gov (United States)

    1964-01-01

    This illustration, with callouts, is of the Saturn V SII (2nd Stage) developed by the Space Division of North American Aviation under the direction of the Marshall Space Flight Center. The 82-foot-long and 33-foot-diameter S-II stage utilized five J-2 engines, each with a 200,000-pound thrust capability. The engine used liquid oxygen and liquid hydrogen as its propellants.

  11. Radioisotope thermoelectric generator transportation system subsystem 143 software development plan

    International Nuclear Information System (INIS)

    This plan describes the activities to be performed and the controls to be applied to the process of specifying, developing, and qualifying the data acquisition software for the Radioisotope Thermoelectric Generator (RTG) Transportation System Subsystem 143 Instrumentation and Data Acquisition System (IDAS). This plan will serve as a software quality assurance plan, a verification and validation (V and V) plan, and a configuration management plan

  12. Automating Traceability for Generated Software Artifacts

    Science.gov (United States)

    Richardson, Julian; Green, Jeffrey

    2004-01-01

    Program synthesis automatically derives programs from specifications of their behavior. One advantage of program synthesis, as opposed to manual coding, is that there is a direct link between the specification and the derived program. This link is, however, not very fine-grained: it can be best characterized as Program is-derived-from Specification. When the generated program needs to be understood or modified, more fine-grained linking is useful. In this paper, we present a novel technique for automatically deriving traceability relations between parts of a specification and parts of the synthesized program. The technique is very lightweight and works, with varying degrees of success, for any process in which one artifact is automatically derived from another. We illustrate the generality of the technique by applying it to two kinds of automatic generation: synthesis of Kalman Filter programs from specifications using the AutoFilter program synthesis system, and generation of assembly language programs from C source code using the GCC C compiler. We evaluate the effectiveness of the technique in the latter application.
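
    The derivation-time idea can be sketched in a few lines: while one artifact is generated from another, record which source element produced which output fragment, yielding fine-grained part-to-part traceability links. All names below are illustrative:

```python
# Lightweight sketch of derivation-time traceability (all names are
# illustrative): while emitting a program from a specification, record which
# spec element produced which output line, yielding fine-grained
# part-of-spec -> part-of-program links.
spec = {"init": "x = 0", "step": "x += 1", "emit": "print(x)"}

program, trace = [], {}
for element, statement in spec.items():
    trace[element] = len(program)  # output line this spec element produced
    program.append(statement)

print(trace)  # {'init': 0, 'step': 1, 'emit': 2}
```

    The point is that the links come for free during generation, rather than being reverse-engineered afterwards, which is why the technique is described as lightweight.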

  13. Search-based software test data generation using evolutionary computation

    CERN Document Server

    Maragathavalli, P

    2011-01-01

    Search-based Software Engineering has been utilized for a number of software engineering activities. One area where Search-Based Software Engineering has seen much application is test data generation. Evolutionary testing designates the use of metaheuristic search methods for test case generation. The search space is the input domain of the test object, with each individual, or potential solution, being an encoded set of inputs to that test object. The fitness function is tailored to find test data for the type of test that is being undertaken. Evolutionary Testing (ET) uses optimizing search techniques such as evolutionary algorithms to generate test data. The effectiveness of the GA-based testing system is compared with a Random testing system. For simple programs both testing systems work fine, but as the complexity of the program or the complexity of the input domain grows, the GA-based testing system significantly outperforms Random testing.
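
    A minimal GA-based test data generator can be sketched as follows. The target branch (`x == 3617`) and every parameter are assumptions for illustration; fitness is the usual branch distance, minimized until the branch is covered:

```python
import random

# Minimal GA sketch for search-based test data generation. The target branch
# (x == 3617) and all parameters are assumptions for illustration; fitness is
# the usual branch distance, minimized until the branch is covered.
TARGET = 3617

def branch_distance(x):
    """How far input x is from taking the branch `if x == TARGET:`."""
    return abs(x - TARGET)

def evolve(pop_size=40, generations=200, seed=1):
    rng = random.Random(seed)
    pop = [rng.randint(0, 10_000) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=branch_distance)
        if branch_distance(pop[0]) == 0:
            return pop[0]                # branch covered
        parents = pop[: pop_size // 2]   # elitist selection: keep fitter half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            child = (a + b) // 2         # crossover: midpoint of two parents
            if rng.random() < 0.3:       # mutation: small perturbation
                child += rng.randint(-50, 50)
            children.append(child)
        pop = parents + children
    return min(pop, key=branch_distance)

print(branch_distance(evolve()))  # approaches 0 as the search converges
```

    Random testing would need, on average, thousands of draws to hit a single value in a 10,001-element domain; the fitness gradient lets the GA home in far faster, which mirrors the comparison the abstract reports.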

  14. Computer Software Configuration Item-Specific Flight Software Image Transfer Script Generator

    Science.gov (United States)

    Bolen, Kenny; Greenlaw, Ronald

    2010-01-01

    A K-shell UNIX script enables the International Space Station (ISS) Flight Control Team (FCT) operators in NASA's Mission Control Center (MCC) in Houston to transfer an entire or partial computer software configuration item (CSCI) from a flight software compact disk (CD) to the onboard Portable Computer System (PCS). The tool is designed to read the content stored on a flight software CD and generate individual CSCI transfer scripts that are capable of transferring the flight software content in a given subdirectory on the CD to the scratch directory on the PCS. The flight control team can then transfer the flight software from the PCS scratch directory to the Electronically Erasable Programmable Read Only Memory (EEPROM) of an ISS Multiplexer/Demultiplexer (MDM) via the Indirect File Transfer capability. The individual CSCI scripts and the CSCI Specific Flight Software Image Transfer Script Generator (CFITSG), when executed a second time, will remove all components from their original execution. The tool will identify errors in the transfer process and create logs of the transferred software for the purposes of configuration management.

  15. 2nd International Conference on Mobile and Wireless Technology

    CERN Document Server

    Wattanapongsakorn, Naruemon

    2015-01-01

    This book provides a snapshot of the current state-of-the-art in the fields of mobile and wireless technology, security and applications.  The proceedings of the 2nd International Conference on Mobile and Wireless Technology (ICMWT2015), it represents the outcome of a unique platform for researchers and practitioners from academia and industry to share cutting-edge developments in the field of mobile and wireless science technology, including those working on data management and mobile security.   The contributions presented here describe the latest academic and industrial research from the international mobile and wireless community.  The scope covers four major topical areas: mobile and wireless networks and applications; security in mobile and wireless technology; mobile data management and applications; and mobile software.  The book will be a valuable reference for current researchers in academia and industry, and a useful resource for graduate-level students working on mobile and wireless technology...

  16. A code generation framework for the ALMA common software

    Science.gov (United States)

    Troncoso, Nicolás; von Brand, Horst H.; Ibsen, Jorge; Mora, Matias; Gonzalez, Victor; Chiozzi, Gianluca; Jeram, Bogdan; Sommer, Heiko; Zamora, Gabriel; Tejeda, Alexis

    2010-07-01

    Code generation helps in smoothing the learning curve of a complex application framework and in reducing the number of Lines Of Code (LOC) that a developer needs to craft. The ALMA Common Software (ACS) has adopted code generation in specific areas, but we are now exploiting the more comprehensive approach of Model Driven code generation to transform a UML Model directly into a full implementation in the ACS framework. This approach makes it easier for newcomers to grasp the principles of the framework. Moreover, a lower handcrafted LOC reduces the error rate. Additional benefits achieved by model driven code generation are: software reuse, implicit application of design patterns and automatic test generation. A model driven approach to design also makes it possible to use the same model with different frameworks, by generating for different targets. The generation framework presented in this paper uses openArchitectureWare as the model-to-text translator. OpenArchitectureWare provides a powerful functional language that makes it easier to implement the correct mapping of data types, the main difficulty encountered in the translation process. The output is an ACS application readily usable by the developer, including the necessary deployment configuration, thus minimizing any configuration burden during testing. The specific application code is implemented by extending generated classes. Therefore, generated and manually crafted code are kept apart, simplifying the code generation process and aiding the developers by keeping a clean logical separation between the two. Our first results show that code generation dramatically improves code productivity.

  17. Model Based Analysis and Test Generation for Flight Software

    Science.gov (United States)

    Pasareanu, Corina S.; Schumann, Johann M.; Mehlitz, Peter C.; Lowry, Mike R.; Karsai, Gabor; Nine, Harmon; Neema, Sandeep

    2009-01-01

    We describe a framework for model-based analysis and test case generation in the context of a heterogeneous model-based development paradigm that uses and combines MathWorks and UML 2.0 models and the associated code generation tools. This paradigm poses novel challenges to analysis and test case generation that, to the best of our knowledge, have not been addressed before. The framework is based on a common intermediate representation for different modeling formalisms and leverages and extends model checking and symbolic execution tools for model analysis and test case generation, respectively. We discuss the application of our framework to software models for a NASA flight mission.

  18. Improved ant algorithms for software testing cases generation.

    Science.gov (United States)

    Yang, Shunkun; Man, Tianlong; Xu, Jiaqi

    2014-01-01

    Ant colony optimization (ACO) for software test case generation is a very popular domain in software testing engineering. However, the traditional ACO has flaws: pheromone is relatively scarce in the early search, search efficiency is low, the search model is too simple, and the positive-feedback mechanism easily produces stagnation and premature convergence. This paper introduces improved ACO algorithms for software test case generation: an improved local pheromone update strategy for ant colony optimization, an improved pheromone volatilization coefficient for ant colony optimization (IPVACO), and an improved global path pheromone update strategy for ant colony optimization (IGPACO). Finally, we put forward a comprehensive improved ant colony optimization (ACIACO), which is based on all three of the above methods. The proposed technique is compared with a random algorithm (RND) and a genetic algorithm (GA) in terms of both efficiency and coverage. The results indicate that the improved method can effectively improve search efficiency, restrain premature convergence, promote case coverage, and reduce the number of iterations. PMID:24883391
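
    As a rough illustration of the general idea (not the paper's ACIACO algorithm; the toy program under test, the parameters and the reward rule are our assumptions), ants can sample a small input domain with pheromone-biased selection, rewarding inputs that reach uncovered branches and evaporating pheromone so early deposits do not cause stagnation:

```python
import random

def branches_hit(x):
    """Toy program under test: returns the set of branch ids exercised by x."""
    hit = set()
    if x < 0:
        hit.add("neg")
    else:
        hit.add("nonneg")
        if x % 2 == 0:
            hit.add("even")
        else:
            hit.add("odd")
    return hit

def aco_test_generation(domain, n_ants=10, n_iter=20, rho=0.1, seed=42):
    rng = random.Random(seed)
    pheromone = {x: 1.0 for x in domain}
    covered, suite = set(), []
    for _ in range(n_iter):
        total = sum(pheromone.values())
        for _ in range(n_ants):
            # roulette-wheel selection biased by pheromone
            r, acc = rng.uniform(0, total), 0.0
            for x in domain:
                acc += pheromone[x]
                if acc >= r:
                    break
            new = branches_hit(x) - covered
            if new:                      # reward inputs reaching new branches
                covered |= new
                suite.append(x)
                pheromone[x] += len(new)
        # evaporation counteracts the positive-feedback stagnation effect
        for x in domain:
            pheromone[x] *= (1.0 - rho)
    return suite, covered

suite, covered = aco_test_generation(range(-5, 6))
print(suite, covered)
```

    Real ACO-based test generation works over paths in a control-flow graph rather than a flat input domain, but the pheromone-update and evaporation mechanics are the same.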

  19. Two Sources of Control over the Generation of Software Instructions

    OpenAIRE

    Hartley, Anthony; Paris, Cecile

    1996-01-01

    This paper presents an analysis conducted on a corpus of software instructions in French in order to establish whether task structure elements (the procedural representation of the users' tasks) are alone sufficient to control the grammatical resources of a text generator. We show that the construct of genre provides a useful additional source of control enabling us to resolve undetermined cases.

  20. 2nd International Conference on Pattern Recognition

    CERN Document Server

    Marsico, Maria

    2015-01-01

    This book contains the extended and revised versions of a set of selected papers from the 2nd International Conference on Pattern Recognition (ICPRAM 2013), held in Barcelona, Spain, from 15 to 18 February, 2013. ICPRAM was organized by the Institute for Systems and Technologies of Information, Control and Communication (INSTICC) and was held in cooperation with the Association for the Advancement of Artificial Intelligence (AAAI). The hallmark of this conference was to encourage theory and practice to meet in a single venue. The focus of the book is on contributions describing applications of Pattern Recognition techniques to real-world problems, interdisciplinary research, experimental and/or theoretical studies yielding new insights that advance Pattern Recognition methods.

  1. Web Style Guide, 2nd Edition

    Science.gov (United States)

    The Web Style Guide, 2nd Edition, which is the online version of a book with the same name, demonstrates the step-by-step process involved in designing a Web site. Visitors are assumed to be familiar with whatever Web publishing tool they are using. The guide gives few technical details but instead focuses on the usability, layout, and attractiveness of a Web site, with the goal being to make it as popular with the intended audience as possible. Considerations such as graphics, typography, and multimedia enhancements are discussed. Web site structure, fine-tuned features on individual pages, and almost everything in between is addressed by the guide, making it a handy resource for people who place great importance on the effectiveness of their online creations.

  2. Oxygen Generation System Laptop Bus Controller Flight Software

    Science.gov (United States)

    Rowe, Chad; Panter, Donna

    2009-01-01

    The Oxygen Generation System Laptop Bus Controller Flight Software was developed to allow the International Space Station (ISS) program to activate specific components of the Oxygen Generation System (OGS) to perform a checkout of key hardware operation in a microgravity environment, as well as to perform preventative maintenance operations of system valves during a long period of what would otherwise be hardware dormancy. The software provides direct connectivity to the OGS Firmware Controller with pre-programmed tasks operated by on-orbit astronauts to exercise OGS valves and motors. The software is used to manipulate the pump, separator, and valves to alleviate the concerns of hardware problems due to long-term inactivity and to allow for operational verification of microgravity-sensitive components early enough so that, if problems are found, they can be addressed before the hardware is required for operation on-orbit. The decision was made to use existing on-orbit IBM ThinkPad A31p laptops and MIL-STD-1553B interface cards as the hardware configuration. The software at the time of this reporting was developed and tested for use under the Windows 2000 Professional operating system to ensure compatibility with the existing on-orbit computer systems.

  3. Effective Test Case Generation Using Antirandom Software Testing

    OpenAIRE

    Kulvinder Singh,; Rakesh Kumar,; Iqbal Kaur

    2010-01-01

    Random Testing is a primary technique for software testing. Antirandom Testing improves the fault-detection capability of Random Testing by employing the location information of previously executed test cases. Antirandom testing selects each test case such that it is as different as possible from all the previously executed test cases. The implementation is illustrated using basic examples. Moreover, compared with Random Testing, test cases generated in Antirandom Testing are more evenly spread ...

  4. Curvature of 2nd type induced on plane distribution

    Directory of Open Access Journals (Sweden)

    Omelyan O.

    2014-11-01

    Full Text Available In a multidimensional projective space, a plane distribution is considered. The curvature of a group connection of the 2nd type, induced by a composite clothing of the plane distribution, is constructed. It is proved that the immovability of Cartan's plane and Bortolotti's hyperplane in the case of a holonomic distribution implies the vanishing of the curvature tensor of the 2nd type.

  5. 2nd International Arctic Ungulate Conference

    Directory of Open Access Journals (Sweden)

    A. Anonymous

    1996-01-01

    Full Text Available The 2nd International Arctic Ungulate Conference was held 13-17 August 1995 on the University of Alaska Fairbanks campus. The Institute of Arctic Biology and the Alaska Cooperative Fish and Wildlife Research Unit were responsible for organizing the conference with assistance from biologists with state and federal agencies and commercial organizations. David R. Klein was chair of the conference organizing committee. Over 200 people attended the conference, coming from 10 different countries. The United States, Canada, and Norway had the largest representation. The conference included invited lectures, panel discussions, and about 125 contributed papers. There were five technical sessions on Physiology and Body Condition; Habitat Relationships; Population Dynamics and Management; Behavior, Genetics and Evolution; and Reindeer and Muskox Husbandry. Three panel sessions discussed comparative caribou management strategies; management of introduced, reestablished, and expanding muskox populations; and health risks in translocation of arctic ungulates. Invited lectures focused on the physiology and population dynamics of arctic ungulates; contaminants in food chains of arctic ungulates and lessons learned from the Chernobyl accident; and ecosystem-level relationships of the Porcupine Caribou Herd.

  6. Anti-Random Test Generation In Software Testing

    OpenAIRE

    seema rani

    2011-01-01

    The main purpose of software testing is to find errors and then correct them. Random testing selects test cases randomly, but it does not exploit information from previous tests. In anti-random testing, each test is applied so that its total distance from all previous tests is maximal. Anti-random testing is thus a variation of random testing, which is the process of generating random input and sending that input to a system under test. Hamming distance and Cartesian distance are used to measure the difference between test cases.
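
    The Hamming-distance variant of this selection rule can be sketched as follows (a minimal illustration over 3-bit binary inputs; the exhaustive candidate scan is only practical for tiny domains):

```python
from itertools import product

def hamming(a, b):
    """Number of positions where two equal-length tuples differ."""
    return sum(x != y for x, y in zip(a, b))

def antirandom_suite(width, n_tests):
    domain = list(product((0, 1), repeat=width))
    suite = [domain[0]]                 # seed with an arbitrary first test
    while len(suite) < n_tests:
        # pick the candidate whose summed distance to the suite is maximal
        best = max((c for c in domain if c not in suite),
                   key=lambda c: sum(hamming(c, t) for t in suite))
        suite.append(best)
    return suite

suite = antirandom_suite(3, 4)
print(suite)
```

    Starting from (0, 0, 0), the farthest candidate is its complement (1, 1, 1); subsequent picks keep spreading tests across the input space, which is exactly the even coverage property the abstract describes.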

  7. Evaluation of the efficiency and reliability of software generated by code generators

    Science.gov (United States)

    Schreur, Barbara

    1994-01-01

    There are numerous studies which show that CASE Tools greatly facilitate software development. As a result of these advantages, an increasing amount of software development is done with CASE Tools. As more software engineers become proficient with these tools, their experience and feedback lead to further development with the tools themselves. What has not been widely studied, however, is the reliability and efficiency of the actual code produced by the CASE Tools. This investigation considered these matters. Three segments of code generated by MATRIXx, one of many commercially available CASE Tools, were chosen for analysis: ETOFLIGHT, a portion of the Earth to Orbit Flight software, and ECLSS and PFMC, modules for Environmental Control and Life Support System and Pump Fan Motor Control, respectively.

  8. 2nd International technical meeting on small reactors

    International Nuclear Information System (INIS)

    The 2nd International Technical Meeting on Small Reactors was held on November 7-9, 2012 in Ottawa, Ontario. The meeting was hosted by Atomic Energy of Canada Limited (AECL) and Canadian Nuclear Society (CNS). There is growing international interest and activity in the development of small nuclear reactor technology. This meeting provided participants with an opportunity to share ideas and exchange information on new developments. This Technical Meeting covered topics of interest to designers, operators, researchers and analysts involved in the design, development and deployment of small reactors for power generation and research. A special session focussed on small modular reactors (SMR) for generating electricity and process heat, particularly in small grids and remote locations. Following the success of the first Technical Meeting in November 2010, which captured numerous accomplishments of low-power critical facilities and small reactors, the second Technical Meeting was dedicated to the achievements, capabilities, and future prospects of small reactors. This meeting also celebrated the 50th Anniversary of the Nuclear Power Demonstration (NPD) reactor which was the first small reactor (20 MWe) to generate electricity in Canada.

  9. Software para la Evaluación y Selección de Generadores Independientes / Independent Generator Evaluation and Selection Software

    Scientific Electronic Library Online (English)

    Marcos Alberto, de Armas Teyra; Miguel, Castro Fernández.

    2014-04-01

    Full Text Available En muchas industrias, edificios, empresas y servicios por razones productivas, emergentes o de confiablidad, es necesario generar energía eléctrica independientemente del Sistema Eléctrico Nacional. En otros casos se necesitan plantas de generación de energía eléctrica que puedan trabajar de forma a [...] islada alimentando un circuito determinado. Para realizar la selección más económica y eficiente de la capacidad a instalar se debe tomar en consideración no sólo el comportamiento del sistema donde se va a insertar la unidad, sino también, las características operacionales del generador ante las fluctuaciones de la carga, sus límites operacionales y la estabilidad resultante. Este trabajo presenta un software que permite realizar la selección más adecuada considerando la curva de capacidad y la estabilidad del generador ante las particularidades del sistema. Con su aplicación es posible reducir los gastos y las pérdidas económicas debidas a una selección inapropiada. Como caso se presenta su aplicación a una planta de una fábrica de envases de alimentos. Abstract in english In many industries, buildings and services it is necessary to employ independent power plants to supply electric power to a particular system. In other cases, islanded operation is desired due to several specific situations. In order to make the most efficient, economic and adequate selection of [...] the generator capacity, it is necessary to consider not only the system's load characteristic, but also the generator's capability limits and the resulting stability. This paper presents a software tool that allows selecting the adequate generator, exposing its operating limits and the resulting stability under fluctuating load conditions. With its application it is possible to reduce economic losses and costs due to an inappropriate generator selection with an oversized or underutilized machine. As a case study, a 2500 kVA power plant is analyzed.

  10. Software for Generating Strip Maps from SAR Data

    Science.gov (United States)

    Hensley, Scott; Michel, Thierry; Madsen, Soren; Chapin, Elaine; Rodriguez, Ernesto

    2004-01-01

    Jurassicprok is a computer program that generates strip-map digital elevation models and other data products from raw data acquired by an airborne synthetic-aperture radar (SAR) system. This software can process data from a variety of airborne SAR systems but is designed especially for the GeoSAR system, which is a dual-frequency (P- and X-band), single-pass interferometric SAR system for measuring elevation both at the bare ground surface and top of the vegetation canopy. Jurassicprok is a modified version of software developed previously for airborne-interferometric- SAR applications. The modifications were made to accommodate P-band interferometric processing, remove approximations that are not generally valid, and reduce processor-induced mapping errors to the centimeter level. Major additions and other improvements over the prior software include the following: a) A new, highly efficient multi-stage-modified wave-domain processing algorithm for accurately motion compensating ultra-wideband data; b) Adaptive regridding algorithms based on estimated noise and actual measured topography to reduce noise while maintaining spatial resolution; c) Exact expressions for height determination from interferogram data; d) Fully calibrated volumetric correlation data based on rigorous removal of geometric and signal-to-noise decorrelation terms; e) Strip range-Doppler image output in user-specified Doppler coordinates; f) An improved phase-unwrapping and absolute-phase-determination algorithm; g) A more flexible user interface with many additional processing options; h) Increased interferogram filtering options; and i) Ability to use disk space instead of random- access memory for some processing steps.

  11. Elements of the Next Generation Science Standards' (NGSS) New Framework for K-12 Science Education aligned with STEM designed projects created by Kindergarten, 1st and 2nd grade students in a Reggio Emilio project approach setting

    Science.gov (United States)

    Facchini, Nicole

    This paper examines how elements of the Next Generation Science Standards' (NGSS) New Framework for K-12 Science Education standards (National Research Council 2011)---specifically the cross-cutting concept "cause and effect"---are aligned with early childhood students' creation of projects of their choice. The study took place in a Reggio Emilia-inspired K-12 school, in a multi-aged kindergarten, first and second grade classroom with 14 students. Students worked on their projects independently with the assistance of their peers and teachers. The students' projects and their alignment with the Next Generation Science Standards' New Framework were analyzed using pre- and post-assessments, student interviews, and discourse analysis. Results indicate that elements of the New Framework for K-12 Science Education emerged through the students' project presentations, particularly regarding the notion of "cause and effect". More specifically, results show that initially students perceived the relationship between "cause and effect" to be negative.

  12. Effective Test Case Generation Using Antirandom Software Testing

    Directory of Open Access Journals (Sweden)

    Kulvinder Singh,

    2010-11-01

    Full Text Available Random Testing is a primary technique for software testing. Antirandom Testing improves the fault-detection capability of Random Testing by employing the location information of previously executed test cases. Antirandom testing selects each test case such that it is as different as possible from all the previously executed test cases. The implementation is illustrated using basic examples. Moreover, compared with Random Testing, test cases generated in Antirandom Testing are more evenly spread across the input domain. Antirandom Testing has conventionally been applied to programs that have only numerical input types, because the distance between numerical inputs is readily measurable. The vast majority of research involves distance techniques for generating the antirandom test cases. Different from these techniques, we focus on the concrete values of program inputs by proposing a new method to generate the antirandom test cases. The proposed method enables Antirandom Testing to be applied to all kinds of programs regardless of their input types. Empirical studies are further conducted for comparison, and an evaluation of the effectiveness of these methods is also presented. Experimental results show that, compared with random and hamming distance techniques, the proposed method significantly reduces the number of test cases required to detect the first failure. Overall, the proposed antirandom testing gives higher fault coverage than antirandom testing with the hamming distance method, which in turn gives higher fault coverage than pure random testing.

  13. Proceedings of the 2nd Educators' Symposium

    OpenAIRE

    3

    2007-01-01

    Preface Putting into practice the vision of model-driven development (MDD) approaches and technologies for software-based systems, in which development is centered around the manipulation of models, requires not only sophisticated modeling approaches and tools, but also considerable training and education efforts. To make people ready for MDD, its principles and applications need to be taught to practitioners in industry, incorporated in university curricula, and probab...

  14. A NEO population generation and observation simulation software tool

    Science.gov (United States)

    Müller, Sven; Gelhaus, Johannes; Hahn, Gerhard; Franco, Raffaella

    One of the main targets of ESA's Space Situational Awareness (SSA) program is to build a wide knowledge base about objects that can potentially harm Earth (Near-Earth Objects, NEOs). An important part of this effort is to create the Small Bodies Data Centre (SBDC), which is going to aggregate measurement data from a fully-integrated NEO observation sensor network. Until this network is developed, artificial NEO measurement data is needed in order to validate SBDC algorithms. Moreover, to establish a functioning NEO observation sensor network, it has to be determined where to place sensors, what technical requirements have to be met in order to be able to detect NEOs, and which observation strategies work best. Because of this, a sensor simulation software was needed. This paper presents a software tool which allows users to create and analyse NEO populations and to simulate and analyse population observations. It is a console program written in Fortran and comes with a Graphical User Interface (GUI) written in Java and C. The tool is divided into two components: the "Population Generator" and the "Observation Simulator". The Population Generator component is responsible for generating and analysing a NEO population. Users can choose between creating fictitious (random) and synthetic populations. The latter are based on one of two models describing the orbital and size distribution of observed NEOs: the existing so-called "Bottke Model" (Bottke et al. 2000, 2002) and the new "Granvik Model" (Granvik et al. 2014, in preparation), which has been developed in parallel to the tool. Generated populations can be analysed by defining 2D, 3D and scatter plots using various NEO attributes. As a result, the tool creates the appropriate files for the plotting tool "gnuplot". The tool's Observation Simulator component provides the Observation Simulation and Observation Analysis functions. Users can define sensor systems using ground- or space-based locations as well as optical or radar sensors and simulate observation campaigns. The tool outputs field-of-view crossings and actual detections of the selected NEO population objects. Using the Observation Analysis, users are able to process and plot the results of the Observation Simulation. In order to enable end-users to handle the tool in a user-intuitive and comfortable way, a GUI has been created based on the modular Eclipse Rich Client Platform (RCP) technology. Through the GUI users can easily enter input data for the tool, execute it and view its output data in a clear way. Additionally, the GUI runs gnuplot to create plot pictures and presents them to the user. Furthermore, users can create projects to organise executions of the tool.
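
    A toy "fictitious population" generator in the spirit of the tool's random mode can be sketched as follows (the distribution ranges and the q < 1.3 AU cut are illustrative conventions, not the Bottke or Granvik models):

```python
import random

def generate_population(n, seed=1):
    """Draw n objects with uniformly distributed elements (illustrative only)."""
    rng = random.Random(seed)
    pop = []
    for _ in range(n):
        pop.append({
            "a_au": rng.uniform(0.5, 4.0),    # semi-major axis [AU]
            "e": rng.uniform(0.0, 0.9),       # eccentricity
            "i_deg": rng.uniform(0.0, 40.0),  # inclination [deg]
            "H_mag": rng.uniform(15.0, 28.0), # absolute magnitude
        })
    return pop

def perihelion(obj):
    """Perihelion distance q = a * (1 - e)."""
    return obj["a_au"] * (1.0 - obj["e"])

pop = generate_population(1000)
# NEOs are conventionally objects with perihelion distance q < 1.3 AU
neos = [o for o in pop if perihelion(o) < 1.3]
print(len(pop), len(neos))
```

    A realistic generator would replace the uniform draws with a debiased orbital and size distribution model, which is exactly what the Bottke and Granvik models supply.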

  15. Automatic Question Generation Using Software Agents for Technical Institutions

    Directory of Open Access Journals (Sweden)

    Shivank Pandey

    2013-12-01

    Full Text Available In the attempt to produce quality graduates, a major factor that needs a considerable amount of attention is the institution's evaluation system. To produce quality results, the examination system must be very effective, and questions must be able to assess students in every domain. Preparing question papers of a high standard that really kindle students' thinking ability is a very challenging task for academicians. Hence arises the need for automatic question generation. In all existing automatic question generating systems, the problem lies either in eliminating the user's role or in developing factual questions based on Bloom's taxonomy. To overcome these issues, the proposed system takes as input a text file from the user containing the text from which questions are to be generated; the output is a text file containing questions based on Bloom's taxonomy. The entire process is carried out by software agents, which eliminates the major problems of existing systems.
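
    A heavily simplified stand-in for such a system (our illustration; the real system uses software agents and Bloom's taxonomy) can already show the text-to-questions pipeline: each sentence yields a fill-in-the-blank recall question by blanking its longest word as a crude "key term" heuristic.

```python
import re

def generate_questions(text):
    """Turn each sentence into a (question, answer) fill-in-the-blank pair."""
    questions = []
    for sentence in re.split(r"(?<=[.!?])\s+", text.strip()):
        words = re.findall(r"[A-Za-z]+", sentence)
        if not words:
            continue
        keyword = max(words, key=len)          # crude "key term" heuristic
        stem = sentence.replace(keyword, "_____", 1)
        questions.append((f"Fill in the blank: {stem}", keyword))
    return questions

text = ("Water boils at one hundred degrees Celsius. "
        "Photosynthesis occurs in chloroplasts.")
for q, answer in generate_questions(text):
    print(q, "->", answer)
```

    Higher levels of Bloom's taxonomy (application, analysis, evaluation) would need real NLP rather than string surgery, which is where the agent-based system above goes further.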

  16. 2nd ISPRA nuclear electronics symposium, Stresa, Italy May 20-23, 1975

    International Nuclear Information System (INIS)

    Two round tables were annexed to the 2nd Ispra Nuclear Electronics Symposium. The first one was concerned with software support for the implementation of microprocessors, MOS and bipolar microprocessors, environmental data systems, and the use of microprocessors and minicomputers in the nuclear, biomedical and environmental fields. The future of nuclear electronics and its diversification, gravitational waves and electronics, and environmental measurements of air and water quality were discussed during the second round table, along with relevant views brought out during the discussion on the extension of nuclear electronics techniques to other fields.

  17. GENESIS: Agile Generation of Information Management Oriented Software

    Directory of Open Access Journals (Sweden)

    Juan Erasmo Gómez

    2010-06-01

    Full Text Available The specification for an information system can be clear from the beginning: it must acquire, display, query and modify data, using a database. The main issue is to decide which information to manage. In the case originating this work, information was always evolving, even up to the end of the project. This implies the construction of a new system each time the information is redefined. This article presents Genesis, an agile development infrastructure, and proposes an approach for the immediate construction of required information systems. Experts describe their information needs and queries, and Genesis generates the corresponding application, with the appropriate graphical interfaces and database.La especificación de un sistema de información puede estar clara desde el principio: debe adquirir, desplegar, consultar y modificar datos, usando una base de datos. El asunto es decidir cuál información manejar. En el caso que origina este trabajo, la información evoluciona permanentemente, incluso hasta el final del proyecto. Esto implica la construcción de un nuevo sistema cada vez que se redefine la información. Este artículo presenta Genesis, una infraestructura ágil para la construcción inmediata del sistema de información que sea requerido. Los expertos describen su información y consultas. Genesis produce el software correspondiente, generando las interfaces gráficas y la base de datos apropiados.

  18. A Practical Evaluation of Next Generation Sequencing & Molecular Cloning Software

    OpenAIRE

    Qaadri, K.; Moir, R.; Kearse, M.; Buxton, S.; WILSON, A; Cheung, M; Milicevic, B.; Hengjie, W.; Kuhn, J.; Stones-Havas, S.; Olsen, C.

    2014-01-01

    Research biologists increasingly face the arduous process of assessing and implementing a combination of freeware, commercial software, and web services for the management and analysis of data from high-throughput experiments. Laboratories can spend a remarkable amount of their research budgets on software, data analysis, and data management systems. The National Institutes of Health (NIH) and the National Science Foundation (NSF) have emphasized the need for contemporary software to be well-...

  19. Safety profile of bilastine: 2nd generation H1-antihistamines.

    Science.gov (United States)

    Scaglione, F

    2012-12-01

    Bilastine is a new H1 antagonist with no sedative side effects, no cardiotoxic effects, and no hepatic metabolism. In addition, bilastine has proved to be effective for the symptomatic treatment of allergic rhinoconjunctivitis and urticaria. Pharmacological studies have shown that bilastine is highly selective for the H1 receptor in both in vivo and in vitro studies, with no apparent affinity for other receptors. The absorption of bilastine is fast, linear and dose-proportional; it appears to be safe and well tolerated at all dose levels in the healthy population. Multiple administration of bilastine has confirmed the linearity of its kinetic parameters. Distribution in the brain is undetectable. The safety profile in terms of adverse effects is very similar to placebo in all Phase I, II and III clinical trials. Bilastine (20 mg), unlike cetirizine, does not increase alcohol effects on the CNS. Bilastine 20 mg does not increase the CNS depressant effect of lorazepam. Bilastine 20 mg is similar to placebo in the driving test. Therefore, it meets the current criteria for medication used in the treatment of allergic rhinitis and urticaria. PMID:23242729

  20. Meta-programming composers in 2nd generation component systems.

    OpenAIRE

    Assmann, Uwe

    2007-01-01

    Future component systems will require that components can be composed flexibly. In contrast to current systems which only support a fixed set of composition mechanisms, the component system should provide a composition language in which users can define their own specific composers. It is argued for an object-oriented setting that this will be possible by meta-programming the class-graph. Composers will be based on two important elements. First, they will express coup...

  1. Thermoluminescent characteristics of ZrO2:Nd films

    International Nuclear Information System (INIS)

    In this work, the results obtained from analysing the photoluminescent and thermoluminescent characteristics of neodymium-activated zirconium oxide (ZrO2:Nd) are presented, together with its possible application in UV radiation dosimetry. The experiments aimed to study characteristics such as the optimum thermal erasing treatment, the influence of light on the response, the response as a function of wavelength, fading of the stored information, the effect of temperature, the response as a function of time, and the reproducibility of the response. The results show that ZrO2:Nd is a promising material for use as a TL dosimeter for UV radiation. (Author)

  2. The 2nd reactor core of the NS Otto Hahn

    International Nuclear Information System (INIS)

    Details of the design of the 2nd reactor core are given, followed by a brief report summarising the operating experience gained with this 2nd core, as well as by an evaluation of measured data and statements concerning the usefulness of the knowledge gained for the development of future reactor cores. Quite a number of these data have been used to improve the concept and thus the specifications for the fuel elements of the 3rd core of the reactor of the NS Otto Hahn. (orig./HP)

  3. Software Defined Radio Architecture Contributions to Next Generation Space Communications

    Science.gov (United States)

    Kacpura, Thomas J.; Eddy, Wesley M.; Smith, Carl R.; Liebetreu, John

    2015-01-01

    Space communications architecture concepts, comprising the elements of the system, the interactions among them, and the principles that govern their development, are essential factors in developing National Aeronautics and Space Administration (NASA) future exploration and science missions. Accordingly, vital architectural attributes encompass flexibility, the extensibility to insert future capabilities, and to enable evolution to provide interoperability with other current and future systems. Space communications architectures and technologies for this century must satisfy a growing set of requirements, including those for Earth sensing, collaborative observation missions, robotic scientific missions, human missions for exploration of the Moon and Mars where surface activities require supporting communications, and in-space observatories for observing the earth, as well as other star systems and the universe. An advanced, integrated, communications infrastructure will enable the reliable, multipoint, high-data-rate capabilities needed on demand to provide continuous, maximum coverage for areas of concentrated activity. Importantly, the cost/value proposition of the future architecture must be an integral part of its design; an affordable and sustainable architecture is indispensable within anticipated future budget environments. Effective architecture design informs decision makers with insight into the capabilities needed to efficiently satisfy the demanding space-communication requirements of future missions and formulate appropriate requirements. A driving requirement for the architecture is the extensibility to address new requirements and provide low-cost on-ramps for new capabilities insertion, ensuring graceful growth as new functionality and new technologies are infused into the network infrastructure. 
In addition to extensibility, another key architectural attribute is the space communication equipment's interoperability with other NASA communications systems, as well as with the communications and navigation systems operated by international space agencies and civilian and government agencies. In this paper, we review the philosophies, technologies, architectural attributes, mission services, and communications capabilities that form the structure of candidate next-generation integrated communication architectures for space communications and navigation. A key area that this paper explores is the development and operation of the software defined radio for the NASA Space Communications and Navigation (SCaN) Testbed currently on the International Space Station (ISS). Lessons learned from its development and operation feed back into the communications architecture. Leveraging reconfigurability changes the way operations are done, and this must be considered. Quantifying its impact on the NASA Space Telecommunications Radio System (STRS) software defined radio architecture provides feedback to keep the standard useful and up to date. NASA is not the only customer of these radios: software defined radios are developed for other applications, and taking advantage of those developments promotes an architecture that is cost effective and sustainable. Developments in areas such as updated operating environments, higher data rates, networking, and security can be leveraged. The ability to sustain an architecture that uses radios for multiple markets can lower costs and keep new technology infused.

  4. Generation and Optimization of Test cases for Object-Oriented Software Using State Chart Diagram

    CERN Document Server

    Swain, Ranjita Kumari; Mohapatra, Durga Prasad

    2012-01-01

    Testing any software system is an enormous task, both time consuming and costly. The time and effort required for sufficient testing grow with the size and complexity of the software, which may cause the project budget to overrun, delay the development of the software system, or leave some test cases uncovered. During the software development life cycle (SDLC), the testing phase generally takes around 40-70% of the time and cost. State-based testing is frequently used in software testing. Test data generation is one of the key issues in software testing. A properly generated test suite may not only locate the errors in a software system, but also help in reducing the high cost associated with software testing. It is often desired that test data, in the form of test sequences within a test suite, can be automatically generated to achieve required test coverage. This paper proposes an optimization approach to test data generation for state-based software testing. In this paper, ...
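As a concrete illustration of state-based test generation (a minimal sketch, not the paper's optimization algorithm), the code below derives a transition-coverage test suite from a state machine: each test case is the shortest event path, found by BFS, that reaches a transition's source state and then fires it. The ATM-like state chart is invented for the example.

```python
from collections import deque

def transition_coverage_suite(transitions, start):
    """Generate test sequences (event paths from the start state) that
    together cover every reachable transition of a state machine.

    `transitions` maps (state, event) -> next_state.
    Returns a list of event sequences (each a list of events).
    """
    # Shortest event path from the start state to each state (BFS).
    paths = {start: []}
    queue = deque([start])
    while queue:
        s = queue.popleft()
        for (src, ev), dst in transitions.items():
            if src == s and dst not in paths:
                paths[dst] = paths[s] + [ev]
                queue.append(dst)

    suite = []
    for (src, ev), dst in sorted(transitions.items()):
        if src in paths:                      # reachable transitions only
            suite.append(paths[src] + [ev])   # reach src, then fire ev
    return suite

# A small, invented ATM-like state chart.
fsm = {
    ("idle", "insert_card"): "auth",
    ("auth", "pin_ok"): "menu",
    ("auth", "pin_bad"): "idle",
    ("menu", "withdraw"): "idle",
}
suite = transition_coverage_suite(fsm, "idle")
```

Each of the four transitions appears as the final event of exactly one test case; an optimizer like the one the paper proposes would then minimize or prioritize such a suite.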

  5. 2nd Quarter Transportation Report FY 2014

    Energy Technology Data Exchange (ETDEWEB)

    Gregory, L.

    2014-07-30

    This report satisfies the U.S. Department of Energy (DOE), National Nuclear Security Administration Nevada Field Office (NNSA/NFO) commitment to prepare a quarterly summary report of radioactive waste shipments to the Nevada National Security Site (NNSS) Radioactive Waste Management Complex (RWMC) at Area 5. There were no shipments sent for offsite treatment and returned to the NNSS this quarter. This report summarizes the second quarter of fiscal year (FY) 2014 low-level radioactive waste (LLW) and mixed low-level radioactive waste (MLLW) shipments. This report also includes annual summaries for FY 2014 in Tables 4 and 5. Tabular summaries are provided which include the following: Sources of and carriers for LLW and MLLW shipments to and from the NNSS; Number and external volume of LLW and MLLW shipments; Highway routes used by carriers; and Incident/accident data applicable to LLW and MLLW shipments. In this report shipments are accounted for upon arrival at the NNSS, while disposal volumes are accounted for upon waste burial. The disposal volumes presented in this report do not include minor volumes of non-radioactive materials that were approved for disposal. Volume reports showing cubic feet (ft3) generated using the Low-Level Waste Information System may vary slightly due to differing rounding conventions.

  6. 2nd International Conference on Data Management Technologies and Applications

    CERN Document Server

    2013-01-01

    The 2nd International Conference on Data Management Technologies and Applications (DATA) aims to bring together researchers, engineers and practitioners interested in databases, data warehousing, data mining, data management, data security and other aspects of information systems and technology involving advanced applications of data.

  7. Test Review: The Profile of Mood States 2nd Edition

    Science.gov (United States)

    Lin, Shuqiong; Hsiao, Yu-Yu; Wang, Miao

    2014-01-01

    The "Profile of Mood States 2nd Edition" (POMS 2) was published in 2012 by Multi-Health Systems (MHS) to assess transient feelings and mood among individuals aged 13 years and above. Evolving from the original POMS (McNair, Lorr, & Droppleman, 1971, 1992), the POMS 2 was designed for youth (13-17 years old) and adults (18 years old…

  8. Basis Principles of Software Development for Eddy Current Inspection of PWR/WWER Steam Generator Tubes

    International Nuclear Information System (INIS)

    Extensive inspection of PWR/WWER steam generators, together with the development of its own designs of eddy current inspection systems including manipulators, push-pullers, controllers, probes, etc., influenced INETEC's decision to start developing its own software for EC inspections. In the last year remarkable results were obtained: the main software packages were finished, with increased capabilities compared to other software available on the world market. In this article some basic principles of EC NDT software development are described, including organizational aspects of the software team, a description of tasks, and a description of the main achievements. Associated problems and future development directions are also discussed. (author)

  9. Software Reliability Testing Data Generation Approach Based on a Mixture Model

    OpenAIRE

    Qin Zheng; Han Feng-Yan; Wang Xin

    2010-01-01

    To address the generation of reliability test cases and test data for real-time control systems, this study applies a test case generation approach based on a mixture of operational profile and Markov chain. It describes the software operational profile through UML use cases, establishes a usage model based on the UML model for automatically deriving the testing model from the usage model, generates a reliability test case set based on the testing model and...

  10. Development of a Prototype Automation Simulation Scenario Generator for Air Traffic Management Software Simulations

    Science.gov (United States)

    Khambatta, Cyrus F.

    2007-01-01

    A technique for automated development of scenarios for use in the Multi-Center Traffic Management Advisor (McTMA) software simulations is described. The resulting software is designed and implemented to automate the generation of simulation scenarios with the intent of reducing the time it currently takes using an observational approach. The software program is effective in achieving this goal. The scenarios created for use in the McTMA simulations are based on data taken from data files from the McTMA system, and were manually edited before incorporation into the simulations to ensure accuracy. Despite the software's overall favorable performance, several key software issues are identified. Proposed solutions to these issues are discussed. Future enhancements to the scenario generator software may address the limitations identified in this paper.

  11. SEMANTIC WEB-BASED SOFTWARE ENGINEERING BY AUTOMATED REQUIREMENTS ONTOLOGY GENERATION IN SOA

    OpenAIRE

    Vahid Rastgoo; Monireh-Sadat Hosseini; Esmaeil Kheirkhah

    2014-01-01

    This paper presents an approach for automated generation of a requirements ontology using UML diagrams in service-oriented architecture (SOA). The goal is to facilitate software engineering processes such as software design, software reuse, and service discovery. The proposed method is based on four conceptual layers. The first layer includes requirements obtained from stakeholders; the second one designs service-oriented diagrams from the data in f...

  12. Specification and Generation of Environment for Model Checking of Software Components.

    Czech Academy of Sciences Publication Activity Database

    Pařízek, P.; Plášil, František

    2007-01-01

    Roč. 176, - (2007), s. 143-154. ISSN 1571-0661 R&D Projects: GA AV ČR 1ET400300504 Institutional research plan: CEZ:AV0Z10300504 Keywords: software components * behavior protocols * model checking * automated generation of environment Subject RIV: JC - Computer Hardware; Software

  13. Using Automatic Code Generation in the Attitude Control Flight Software Engineering Process

    Science.gov (United States)

    McComas, David; O'Donnell, James R., Jr.; Andrews, Stephen F.

    1999-01-01

    This paper presents an overview of the attitude control subsystem flight software development process, identifies how the process has changed due to automatic code generation, analyzes each software development phase in detail, and concludes with a summary of our lessons learned.

  14. Open-source software for generating electrocardiogram signals

    CERN Document Server

    McSharry, Patrick E.; Clifford, Gari D.

    2004-01-01

    ECGSYN, a dynamical model that faithfully reproduces the main features of the human electrocardiogram (ECG), including heart rate variability, RR intervals and QT intervals is presented. Details of the underlying algorithm and an open-source software implementation in Matlab, C and Java are described. An example of how this model will facilitate comparisons of signal processing techniques is provided.
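ECGSYN's core idea of placing Gaussian wave events (P, Q, R, S, T) at fixed phases of the cardiac cycle can be sketched in a few lines. The simplified model below omits the coupled ODEs and RR-interval variability of the published algorithm, and its wave parameters are illustrative, not the published values.

```python
import math

# Per-wave parameters (phase angle theta in radians, amplitude a, width b),
# loosely following the Gaussian-event idea behind ECGSYN; the numbers here
# are illustrative, not the published ones.
WAVES = {  # P, Q, R, S, T
    "P": (-math.pi / 3, 0.12, 0.25),
    "Q": (-math.pi / 12, -0.10, 0.10),
    "R": (0.0, 1.00, 0.10),
    "S": (math.pi / 12, -0.15, 0.10),
    "T": (math.pi / 2, 0.30, 0.40),
}

def synthetic_ecg(t, heart_rate=60.0):
    """One-lead synthetic ECG value (mV) at time t seconds: a sum of
    Gaussians placed at fixed phases of the cardiac cycle."""
    phase = 2 * math.pi * (t * heart_rate / 60.0 % 1.0) - math.pi
    z = 0.0
    for theta, a, b in WAVES.values():
        d = phase - theta
        z += a * math.exp(-d * d / (2 * b * b))
    return z

samples = [synthetic_ecg(i / 250.0) for i in range(250)]  # one beat at 250 Hz
```

At 60 bpm the R peak lands mid-beat; the full ECGSYN model instead generates the waveform as the z-coordinate of a trajectory on a limit cycle, which is what gives it realistic heart rate variability.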

  15. NGSUtils: a software suite for analyzing and manipulating next-generation sequencing datasets

    OpenAIRE

    Breese, Marcus R.; Liu, Yunlong

    2013-01-01

    Summary: NGSUtils is a suite of software tools for manipulating data common to next-generation sequencing experiments, such as FASTQ, BED and BAM format files. These tools provide a stable and modular platform for data management and analysis.
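As a hint of the kind of record handling such suites automate (this is a generic sketch, not NGSUtils' own API or CLI), here is a minimal FASTQ parser with a mean-quality read filter:

```python
def parse_fastq(lines):
    """Yield (name, sequence, quality) records from FASTQ text, which
    stores each read as four lines: @name, sequence, '+', quality string."""
    it = iter(lines)
    for header in it:
        seq, plus, qual = next(it), next(it), next(it)
        assert header.startswith("@") and plus.startswith("+")
        yield header[1:].strip(), seq.strip(), qual.strip()

def mean_quality(qual, offset=33):
    """Mean Phred quality of a read (Sanger / Illumina 1.8+ encoding,
    where quality = ASCII code - 33)."""
    return sum(ord(c) - offset for c in qual) / len(qual)

# Toy two-read FASTQ file, invented for the example.
fastq = [
    "@read1", "ACGTACGT", "+", "IIIIIIII",   # Phred 40 throughout
    "@read2", "ACGTNNNN", "+", "II!!!!!!",   # low-quality tail
]
# Keep only reads whose mean quality is at least 20.
kept = [r for r in parse_fastq(fastq) if mean_quality(r[2]) >= 20]
```

A real suite layers many such filters and format conversions (FASTQ, BED, BAM) behind a stable command-line interface.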

  16. 2nd International Conference on Green Communications and Networks 2012

    CERN Document Server

    Ma, Maode; GCN 2012

    2013-01-01

    The objective of the 2nd International Conference on Green Communications and Networks 2012 (GCN 2012) is to facilitate an exchange of information on best practices for the latest research advances in the area of communications, networks and intelligence applications. These mainly involve computer science and engineering, informatics, communications and control, electrical engineering, information computing, and business intelligence and management. Proceedings of the 2nd International Conference on Green Communications and Networks 2012 (GCN 2012) will focus on green information technology and applications, which will provide in-depth insights for engineers and scientists in academia, industry, and government. The book addresses the most innovative research developments including technical challenges, social and economic issues, and presents and discusses the authors’ ideas, experiences, findings, and current projects on all aspects of advanced green information technology and applications. Yuhang Yang is ...

  17. 2nd International Conference on Intelligent Technologies and Engineering Systems

    CERN Document Server

    Chen, Cheng-Yi; Yang, Cheng-Fu

    2014-01-01

    This book includes the original, peer reviewed research papers from the 2nd International Conference on Intelligent Technologies and Engineering Systems (ICITES2013), which took place on December 12-14, 2013 at Cheng Shiu University in Kaohsiung, Taiwan. Topics covered include: laser technology, wireless and mobile networking, lean and agile manufacturing, speech processing, microwave dielectrics, intelligent circuits and systems, 3D graphics, communications, and structure dynamics and control.

  18. Writing TAFs for Convective Weather, 2nd Edition

    Science.gov (United States)

    COMET

    2014-01-28

    "Writing TAFs for Convective Weather, 2nd Edition" uses a severe thunderstorm event to illustrate techniques for producing an effective Terminal Aerodrome Forecast (TAF) following current National Weather Service directives. The unit offers guidance for developing TAFs for different types of convection and discusses how to concisely communicate logic and uncertainty in an aviation forecast discussion (AvnFD) or by other means. It also addresses the importance of maintaining an effective TAF weather watch and updating the TAF proactively.

  19. 2nd International Open and Distance Learning (IODL) Symposium

    OpenAIRE

    Barkan, Reviewed By Murat

    2006-01-01

    These closing remarks were prepared and presented by Prof. Dr. Murat BARKAN, Anadolu University, Eskisehir, TURKEY. DEAR GUESTS, As the 2nd International Open and Distance Learning (IODL) Symposium is now drawing to an end, I would like to thank you all for your outstanding speeches, distinguished presentations, constructive roundtable and session discussions, and active participation during the last five days. I hope you all share my view that the whole symposium has been...

  20. Energy, environment and technological innovation: Rome 2nd international congress

    International Nuclear Information System (INIS)

    From the three volumes containing the proceedings of the October 12-16, 1992 2nd International Congress on Energy, Environment and Technological Innovation held at the University of Rome 'La Sapienza', separate abstracts were prepared for 41 papers. The selection of papers included recent developments and research trends in the following high-tech areas: biomass plantations, wind turbines, photovoltaic power plants, solar architecture, building energy management, global warming, automobile air pollution abatement, district heating with cogeneration, and hydrogen fuels for transportation

  1. Mapping and industrial IT project to a 2nd semester design-build project

    DEFF Research Database (Denmark)

    Nyborg, Mads; Høgh, Stig

    2010-01-01

    CDIO means bringing the engineer's daily life and working practice into the educational system. In our opinion this is best done by selecting an appropriate project from industry. In this paper we describe how we have mapped an industrial IT project to a 2nd semester design-build project in the Diploma IT program at the Technical University of Denmark. The system in question is a weighing system operating in a LAN environment, used in the medical industry for producing tablets. We present the design of a curriculum to support the development of major components of the weighing system. A simple teaching model for software engineering is presented which combines technical disciplines with disciplines from sections 2-4 in the CDIO syllabus. The implementation of a joint project involving several courses supports the CDIO perspective, and the traditional IT-diploma education has for decades included many of the essential features of the CDIO (for example, a focus on teamwork, the development of social skills, and the open nature of design problems). The specific project was previously conducted in the 5th semester; it has now been brought forward to the 2nd semester of study. A successful implementation at this level requires careful planning of activities through the semester, and the principles of the CDIO have been of great help in this regard. Finally we draw conclusions and give our recommendations based on those.

  2. Generation of Embedded Hardware/Software from SystemC

    OpenAIRE

    Ouadjaout Salim; Houzet Dominique

    2006-01-01

    Designers increasingly rely on reusing intellectual property (IP) and on raising the level of abstraction to respect system-on-chip (SoC) market characteristics. However, most hardware and embedded software codes are recoded manually from system level. This recoding step often results in new coding errors that must be identified and debugged. Thus, shorter time-to-market requires automation of the system synthesis from high-level specifications. In this paper, we propose a design flow intend...

  3. A Novel Scheme to Design Software-Controllable Vector Microwave Signal Generator and its Application

    Directory of Open Access Journals (Sweden)

    L. Meng

    2010-01-01

    With the rapid development of wireless communications, there will be many communication standards in the future, and buying a dedicated vector microwave signal generator for each can be very costly. Hence, this study investigated a novel vector microwave signal generation method, which models the vector baseband signal in CAD software (Agilent ADS) and then controls conventional microwave signal generation hardware to output vector microwave signals. Compared with the dedicated vector microwave signal generators developed by Agilent, Anritsu, etc., our software-controllable microwave signal source is cheaper, more flexible and more convenient. Moreover, as an application of our method, we model the TD-SCDMA baseband signal corrupted by a multipath channel and Additive White Gaussian Noise (AWGN) in ADS software and then control the hardware (Agilent E4432B) to generate the TD-SCDMA microwave signals. Measurements of the TD-SCDMA microwave signals confirm the validity of our method.
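The approach of modelling the baseband signal in software before handing it to generator hardware can be illustrated with a toy baseband modeller. The sketch below produces Gray-coded QPSK symbols with optional additive white Gaussian noise, a deliberately simple stand-in for the far richer TD-SCDMA waveform the paper models in ADS.

```python
import math, random

def qpsk_baseband(bits, noise_std=0.0, seed=1):
    """Map a bit stream onto Gray-coded QPSK symbols (I, Q) and add white
    Gaussian noise, mimicking in miniature the software-modelled baseband
    that is then uploaded to signal-generator hardware."""
    rng = random.Random(seed)
    amp = 1 / math.sqrt(2)                      # unit-energy symbols
    symbols = []
    for b0, b1 in zip(bits[0::2], bits[1::2]):  # two bits per symbol
        i = amp * (1 - 2 * b0)                  # bit 0 -> +amp, bit 1 -> -amp
        q = amp * (1 - 2 * b1)
        symbols.append((i + rng.gauss(0, noise_std),
                        q + rng.gauss(0, noise_std)))
    return symbols

clean = qpsk_baseband([0, 0, 0, 1, 1, 0, 1, 1])        # noiseless
noisy = qpsk_baseband([0, 0, 0, 1, 1, 0, 1, 1], 0.05)  # with AWGN
```

Because each quadrature carries one bit independently, adjacent constellation points differ in exactly one bit, which is the Gray-coding property.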

  4. Learning from examples - Generation and evaluation of decision trees for software resource analysis

    Science.gov (United States)

    Selby, Richard W.; Porter, Adam A.

    1988-01-01

    A general solution method for the automatic generation of decision (or classification) trees is investigated. The approach is to provide insights through in-depth empirical characterization and evaluation of decision trees for software resource data analysis. The trees identify classes of objects (software modules) that had high development effort. Sixteen software systems ranging from 3,000 to 112,000 source lines were selected for analysis from a NASA production environment. The collection and analysis of 74 attributes (or metrics), for over 4,700 objects, captured information about the development effort, faults, changes, design style, and implementation style. A total of 9,600 decision trees were automatically generated and evaluated. The trees correctly identified 79.3 percent of the software modules that had high development effort or faults, and the trees generated from the best parameter combinations correctly identified 88.4 percent of the modules on the average.
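The core of such automatic tree generation is repeatedly choosing the attribute-threshold split with the highest information gain over the metric data. A one-level sketch, with invented module metrics rather than the paper's 74 NASA attributes:

```python
import math

def entropy(labels):
    """Shannon entropy of a binary label list."""
    if not labels:
        return 0.0
    p = sum(labels) / len(labels)
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def best_split(rows, labels):
    """Find the (attribute index, threshold) split with the highest
    information gain -- the step a decision-tree generator repeats
    recursively. Toy version, one level deep."""
    base, best = entropy(labels), (None, None, 0.0)
    for j in range(len(rows[0])):
        for t in sorted({r[j] for r in rows}):
            left = [l for r, l in zip(rows, labels) if r[j] <= t]
            right = [l for r, l in zip(rows, labels) if r[j] > t]
            gain = base - (len(left) * entropy(left)
                           + len(right) * entropy(right)) / len(labels)
            if gain > best[2]:
                best = (j, t, gain)
    return best

# (source lines, number of changes) per module; label 1 = high effort.
modules = [(120, 3), (4500, 40), (300, 5), (7000, 55), (250, 2), (5200, 38)]
effort = [0, 1, 0, 1, 0, 1]
attr, threshold, gain = best_split(modules, effort)
```

On this toy data the split "source lines <= 300" separates the classes perfectly (gain 1 bit); a full generator would recurse on each side until a stopping criterion is met.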

  5. GEOGRAPHICAL DESCRIPTION OF THE 2ND DISTRICT OF BUCHAREST

    Directory of Open Access Journals (Sweden)

    Mariana Cârstea

    2009-10-01

    Located in the north-east of Bucharest, with a population of approx. 400,000 inhabitants, the current territory of the 2nd district was once part of the Vlăsiei forests, crossed by the river Colentina. It is a tabular plain with a low NW-SE declivity; the only major relief features are the Colentina terrace, the tablelands, and anthropic landforms. The Colentina Plain covers 36% of the Bucharest Municipality and is characterized by altitudes that vary between 88.9 m in the Free Press Square and 55 m at Cățelu. The Otopeni Plain overlaps the north of Bucharest (the northern districts of Colentina, Băneasa, Pipera) and is characterized by altitudes of 85-90 m, a relief fragmentation of 0.5 km/sq km, a high frequency of tablelands, and increased local slopes (common values of 10 degrees). The 2nd district ranks second in terms of total area of green space (4,187,000 square meters), with 13.6 square meters of green space per capita, although unevenly distributed across the district. The vegetation of the 2nd district is represented in particular by the vegetation of parks (Circus Park, Plumbuita, Morarilor, Tei) and of gardens and green spaces among housing blocks. The valleys, cut into loess, generally have steep sides with intense slope processes, and present meadows sometimes covered by lakes or swamps. The largest lakes of the valley, created by dams, are located on the Colentina river. The defining geomorphologic characteristics are the result of erosion, transport and deposition along the lower course of the Dâmbovița river. The average altitude of the capital is 80 m.

  6. Book review: Psychology in a work context (2nd Ed.)

    OpenAIRE

    Nanette Tredoux

    2003-01-01

    Bergh, Z. & Theron, A.L. (Eds) (2003) Psychology in a work context (2nd Ed.). Cape Town: Oxford University Press.

    This book is an overview and introduction to Industrial and Organisational Psychology. It is a work of ambitious scope, and it is clear that the contributors have invested a great deal of thought and effort in the planning and execution of the book. The current version is the second edition, and it looks set to become one of those standard textbooks th...

  7. GEOGRAPHICAL DESCRIPTION OF THE 2ND DISTRICT OF BUCHAREST

    OpenAIRE

    Mariana Cârstea

    2009-01-01

    Located in the north-east of Bucharest, with a population of approx. 400,000 inhabitants, the current territory of the 2nd district was once part of the Vlăsiei forests, crossed by the river Colentina. It is a tabular plain with a low NW-SE declivity; the only major relief features are the Colentina terrace, the tablelands, and anthropic landforms. The Colentina Plain covers 36% of the Bucharest Municipality and it is characterized by altitudes that vary between 88.9 m in the Free ...

  8. CELEBRATED APRIL 2nd – INTERNATIONAL DAY OF PERSONS WITH AUTISM

    OpenAIRE

    Manuela KRCHANOSKA

    2014-01-01

    On April 2nd, the Macedonian Scientific Society for Autism organized, for the fourth time, an event on the occasion of the International Day of Persons with Autism. The event, cultural and artistic in character, was held at the Museum of the Macedonian Struggle under the motto “They are not alone, we are with them”. The huge number of citizens only confirmed the motto. It seemed that the hall of the Museum of the Macedonian Struggle was too small for the warm hearts of the audience. More than 3...

  9. 2nd Meeting of DRDO Library Information Officers

    Directory of Open Access Journals (Sweden)

    Director Director

    1981-01-01

    The 2nd Meeting of the Officers-in-Charge of Libraries/TICs of the DRDO was held at DESIDOC on 2-3 December, 1980. The main aim of the Meeting was to identify the problems common to the Libraries/TICs in the dissemination of information to scientists and to suggest remedial measures to improve efficiency. The Meeting was inaugurated by Prof M Krishnamurthi, CCR&D(K). Forty-six delegates from R&D HQrs/Labs/Estts attended the Meeting.

  10. Software tool to generate MLC leaf sequence for the delivery of a given intensity profile

    International Nuclear Information System (INIS)

    In the step and shoot method of achieving beam intensity modulation, a field is divided into a set of subfields, each of which is irradiated with a uniform beam intensity level. Usually the fluence map for each beam, provided by the inverse treatment planning software, is normalized into a specified number of intensity levels. The leaf sequencing software then generates the multileaf collimator (MLC) leaf position sequence required to produce the fluence profile.
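A minimal step-and-shoot sequencer for a single leaf pair can be sketched as follows: each unit intensity level contributes one open segment per contiguous run of the profile at or above that level. This is an illustrative decomposition only, not a clinical algorithm, and real sequencers additionally minimize the number of segments and total beam-on time.

```python
def leaf_segments(profile):
    """Decompose one integer fluence row into unit-weight MLC segments:
    (left, right) leaf-position pairs, one pair per exposure, such that
    re-accumulating the segments reproduces the profile exactly."""
    segments = []
    for level in range(1, max(profile) + 1):
        start = None
        for i, f in enumerate(profile + [0]):   # sentinel closes open runs
            if f >= level and start is None:
                start = i
            elif f < level and start is not None:
                segments.append((start, i))     # leaves open over [start, i)
                start = None
    return segments

def delivered(profile_len, segments):
    """Re-accumulate fluence from segments to verify the decomposition."""
    out = [0] * profile_len
    for left, right in segments:
        for i in range(left, right):
            out[i] += 1
    return out

# An invented normalized fluence row with 3 intensity levels.
profile = [1, 3, 2, 2, 0, 1]
segs = leaf_segments(profile)
```

Here the six-bixel row decomposes into four segments, and summing them reproduces the input fluence exactly.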

  11. Semi-automatic generation of web-based computing environments for software libraries

    OpenAIRE

    Kressner, D.; Johansson, P.

    2002-01-01

    A set of utilities for generating web computing environments related to mathematical and engineering library software is presented. The web interface can be accessed from a standard world wide web browser with no need for additional software installations on the local machine. The environment provides a user-friendly access to computational routines, workspace management, reusable sessions and support of various data formats, including MATLAB binaries. The creation of new interfaces is a stra...

  12. Scoping analysis of the Advanced Test Reactor using SN2ND

    Energy Technology Data Exchange (ETDEWEB)

    Wolters, E.; Smith, M. (NE NEAMS PROGRAM); ( SC)

    2012-07-26

    A detailed set of calculations was carried out for the Advanced Test Reactor (ATR) using the SN2ND solver of the UNIC code, which is part of the SHARP multi-physics code being developed under the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program in DOE-NE. The primary motivation of this work is to assess whether high fidelity deterministic transport codes can tackle coupled dynamics simulations of the ATR. The successful use of such codes in a coupled dynamics simulation can impact what experiments are performed and what power levels are permitted during those experiments at the ATR. The advantages of the SN2ND solver over comparable neutronics tools are its superior parallel performance and demonstrated accuracy on large scale homogeneous and heterogeneous reactor geometries. However, it should be noted that virtually no effort from this project was spent constructing a proper cross section generation methodology for the ATR usable in the SN2ND solver. While attempts were made to use cross section data derived from SCALE, only a minimal number of compositional cross section sets was generated, to be consistent with the reference Monte Carlo input specification. The accuracy of any deterministic transport solver is impacted by such an approach, and it clearly causes substantial errors in this work. The reasoning behind this decision is justified given the overall funding dedicated to the task (two months) and the real focus of the work: can modern deterministic tools actually treat complex facilities like the ATR with heterogeneous geometry modeling? SN2ND has been demonstrated to solve problems with upwards of one trillion degrees of freedom, which translates to tens of millions of finite elements, hundreds of angles, and hundreds of energy groups, resulting in a very high-fidelity model of the system unachievable by most deterministic transport codes today.
A space-angle convergence study was conducted to determine the meshing and angular cubature requirements for the ATR, and also to demonstrate the feasibility of performing this analysis with a deterministic transport code capable of modeling heterogeneous geometries. The work performed indicates that a minimum of 260,000 linear finite elements combined with an L3T11 cubature (96 angles on the sphere) is required for both eigenvalue and flux convergence of the ATR. A critical finding was that the fuel meat and water channels must each be meshed with at least 3 'radial zones' for accurate flux convergence. A small number of 3D calculations were also performed to show axial mesh and eigenvalue convergence for a full core problem. Finally, a brief analysis was performed with different cross section sets generated from DRAGON and SCALE, and the findings show that more effort will be required to improve the multigroup cross section generation process. The total number of degrees of freedom for a converged 27 group, 2D ATR problem is approximately 340 million. This number increases to approximately 25 billion for a 3D ATR problem. This scoping study shows that both 2D and 3D calculations are well within the capabilities of the current SN2ND solver, given the availability of a large-scale computing center such as BlueGene/P. However, dynamics calculations are not realistic without the implementation of improvements in the solver.

  13. TagGD: Fast and Accurate Software for DNA Tag Generation and Demultiplexing

    OpenAIRE

    Costea, Paul Igor; Lundeberg, Joakim; Akan, Pelin

    2013-01-01

    Multiplexing is of vital importance for utilizing the full potential of next generation sequencing technologies. We here report TagGD (DNA-based Tag Generator and Demultiplexor), a fully-customisable, fast and accurate software package that can generate thousands of barcodes satisfying user-defined constraints and can guarantee full demultiplexing accuracy. The barcodes are designed to minimise their interference with the experiment. Insertion, deletion and substitution events are considered ...

  14. TagGD: Fast and Accurate Software for DNA Tag Generation and Demultiplexing

    OpenAIRE

    Costea, Paul Igor; Lundeberg, Joakim; Akan, Pelin

    2013-01-01

    Multiplexing is of vital importance for utilizing the full potential of next generation sequencing technologies. We here report TagGD (DNA-based Tag Generator and Demultiplexor), a fully-customisable, fast and accurate software package that can generate thousands of barcodes satisfying user-defined constraints and can guarantee full demultiplexing accuracy. The barcodes are designed to minimise their interference with the experiment. Insertion, deletion and substitution events are considered ...
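The heart of such barcode design, pairwise edit-distance screening, can be sketched with a greedy filter. The example below is a toy stand-in for TagGD's search: it handles the insertion/deletion/substitution distance but ignores the GC-content and other experimental constraints the real tool enforces.

```python
import itertools

def levenshtein(a, b):
    """Edit distance counting the insertions, deletions and substitutions
    that barcode designs of this kind guard against."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,          # deletion
                           cur[j - 1] + 1,       # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]

def greedy_barcodes(length, min_dist, limit):
    """Greedily keep candidate tags whose edit distance to every accepted
    tag is at least `min_dist`, scanning candidates in lexicographic order."""
    accepted = []
    for tag in map("".join, itertools.product("ACGT", repeat=length)):
        if all(levenshtein(tag, t) >= min_dist for t in accepted):
            accepted.append(tag)
            if len(accepted) == limit:
                break
    return accepted

codes = greedy_barcodes(length=4, min_dist=3, limit=4)
```

Every accepted pair is at edit distance 3 or more, so any single sequencing error still leaves each read closer to its true barcode than to any other.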

  15. A Novel Scheme to Design Software-Controllable Vector Microwave Signal Generator and its Application

    OpenAIRE

    Meng, L.; Chen, F.; Liu, B.; Hua, J.; Liu, J.

    2010-01-01

    With the rapid development of wireless communications, there will be many communication standards in the future, which may cost much to buy the corresponding vector microwave signal generator. Hence, this study investigated a novel vector microwave signal generation method, which modeled the vector baseband signal by the CAD software (Agilent ADS) and then control the conventional microwave signal generation hardware to output vector microwave signals. Compared with the specified vector micro...

  16. Afs password expiration starts Feb 2nd 2004

    CERN Document Server

    2004-01-01

    Due to security reasons, and in agreement with CERN management, afs/lxplus passwords will fall into line with Nice/Mail passwords on February 2nd and expire annually. As of the above date afs account holders who have not changed their passwords for over a year will have a 60 day grace period to make a change. Following this date their passwords will become invalid. What does this mean to you? If you have changed your afs password in the past 10 months the only difference is that 60 days before expiration you will receive a warning message. Similar warnings will also appear nearer the time of expiration. If you have not changed your password for more than 10 months, then, as of February 2nd you will have 60 days to change it using the command 'kpasswd'. Help to choose a good password can be found at: http://security.web.cern.ch/security/passwords/ If you have been given a temporary password at any time by the Helpdesk or registration team this will automatically fall into the expiration category ...

  17. Software Reliability Testing Data Generation Approach Based on a Mixture Model

    Directory of Open Access Journals (Sweden)

    Qin Zheng

    2010-01-01

    To address the generation of reliability test cases and test data for real-time control systems, this study applies a test case generation approach based on a mixture of operational profile and Markov chain. It describes the software operational profile through UML use cases, establishes a usage model based on the UML model for automatically deriving the testing model from the usage model, generates a reliability test case set based on the testing model, and semi-automatically generates reliability test input data by eliciting input and output variables and abstracting input and output classes. The results of reliability testing on actual airborne software show that the structure of the test case set generated by the proposed model is fairly stable. Thus, the proposed approach is suitable for semi-automatically generating reliability testing data for real-time control system software; it greatly simplifies the reliability testing process, improves testing efficiency, and ensures testing validity.
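The operational-profile-plus-Markov-chain idea can be illustrated with a small usage model: a random walk samples events according to their profile probabilities, and each walk is one reliability test case. The states, events, and probabilities below are invented for illustration, not taken from the paper's airborne software.

```python
import random

# Usage model: from each state, the events a user may trigger, their
# operational-profile probabilities, and the resulting state. Invented.
USAGE = {
    "start":   [("login", 1.0, "menu")],
    "menu":    [("query", 0.7, "results"), ("logout", 0.3, "end")],
    "results": [("back", 1.0, "menu")],
}

def generate_test_case(max_events=20, seed=7):
    """Walk the Markov usage model from 'start' towards 'end', sampling
    each event according to its profile probability; the resulting event
    sequence is one reliability test case."""
    rng = random.Random(seed)
    state, events = "start", []
    while state != "end" and len(events) < max_events:
        r, acc = rng.random(), 0.0
        for event, p, nxt in USAGE[state]:
            acc += p
            if r <= acc:            # pick the event whose bucket r falls in
                events.append(event)
                state = nxt
                break
    return events

case = generate_test_case()
```

Generating many such walks yields a test suite whose event frequencies mirror the operational profile, which is exactly what makes the measured failure rate interpretable as a reliability estimate.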

  18. Skel: Generative Software for Producing Skeletal I/O Applications

    Energy Technology Data Exchange (ETDEWEB)

    Logan, J.; Klasky, S.; Lofstead, J.; Abbasi, H.; Ethier, S.; Grout, R.; Ku, S. H.; Liu, Q.; Ma, X.; Parashar, M.; Podhorszki, N.; Schwan, K.; Wolf, M.

    2011-01-01

    Massively parallel computations consist of a mixture of computation, communication, and I/O. As part of the co-design for the inevitable progress towards exascale computing, we must apply lessons learned from past work to succeed in this new age of computing. Of the three components listed above, implementing an effective parallel I/O solution has often been overlooked by application scientists and was usually added to large scale simulations only when existing serial techniques had failed. As science teams scaled their codes to run on hundreds of processors, it was common to call on an I/O expert to implement a set of more scalable I/O routines. These routines were easily separated from the calculations and communication, and in many cases an I/O kernel was derived from the application which could be used for testing I/O performance independent of the application. These I/O kernels took on a life of their own, used as a broad measure for comparing different I/O techniques. Unfortunately, as years passed and changes in computation and communication required changes to the I/O, the separate I/O kernel used for benchmarking remained static, no longer providing an accurate indicator of the simulation's I/O performance and making I/O research less relevant for the application scientists. In this paper we describe a new approach to this problem in which I/O kernels are replaced with skeletal I/O applications automatically generated from an abstract set of simulation I/O parameters. We realize this abstraction by leveraging the ADIOS middleware's XML I/O specification with additional runtime parameters. Skeletal applications offer all of the benefits of I/O kernels, including allowing I/O optimizations to focus on useful I/O patterns. Moreover, since they are automatically generated, it is easy to produce an updated I/O skeleton whenever the simulation's I/O changes.
In this paper we analyze the performance of automatically generated I/O skeletal applications for the S3D and GTS codes. We show that these skeletal applications achieve performance comparable to that of the production applications. We wrap up the paper with a discussion of future changes to make the skeletal application better approximate the actual I/O performed in the simulation.
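    The generation step can be pictured as a small code generator driven by a declarative I/O description. The XML schema, element names, and emitted routine below are purely illustrative assumptions — they are not the actual ADIOS specification:

```python
import xml.etree.ElementTree as ET

# Made-up, minimal I/O description (NOT the real ADIOS schema), plus a
# generator that emits a skeletal write routine from it.
SPEC = """
<io group="restart">
  <var name="temperature" type="double" dims="nx,ny"/>
  <var name="step" type="int" dims="1"/>
</io>
"""

def generate_skeleton(xml_text):
    """Emit C-like skeleton source reproducing the declared I/O pattern."""
    root = ET.fromstring(xml_text)
    lines = ["void write_%s(void) {" % root.get("group")]
    for var in root.findall("var"):
        # One dummy write per declared variable: for I/O benchmarking only
        # the sizes and access patterns matter, not the payload values.
        lines.append('    write_var("%s", sizeof(%s) /* dims: %s */);'
                     % (var.get("name"), var.get("type"), var.get("dims")))
    lines.append("}")
    return "\n".join(lines)

skeleton = generate_skeleton(SPEC)
```

    Because the skeleton is regenerated from the specification, it stays in sync with the simulation's I/O whenever the specification changes — the property the paper emphasizes.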

  19. THR Simulator – the software for generating radiographs of THR prosthesis

    Directory of Open Access Journals (Sweden)

    Hou Sheng-Mou

    2009-01-01

    Full Text Available Abstract Background Measuring the orientation of the acetabular cup after total hip arthroplasty is important for prognosis. Verification of these measurement methods will be easier and more feasible if prosthesis radiographs can be synthesized under each simulated condition. One reported method used an expensive mechanical device of indeterminable precision. We therefore developed a program, THR Simulator, to directly synthesize digital radiographs of prostheses for further analysis. THR Simulator was developed under the Windows platform using the Borland C++ Builder programming tool. We first built a mathematical model of the acetabulum and femoral head. Data on the real dimensions of the prosthesis were used to generate the radiograph of the hip prosthesis. Then, with a ray tracing algorithm, we calculated the thickness each X-ray beam passed through and transformed it to grey scale with a mapping function derived by fitting an exponential function to a phantom image. Finally, a simulated radiograph could be generated for further analysis. Results Using THR Simulator, users can combine many parameters for radiograph synthesis. These parameters include thickness, film size, tube distance, film distance, anteversion, abduction, upper wear, medial wear, and posterior wear, and are adequate for any radiographic measurement research. THR Simulator has been used in two studies, and the errors are within 2° for anteversion and 0.2 mm for wear measurement. Conclusion We have designed a program, THR Simulator, that can synthesize prosthesis radiographs. Such a program can be applied in future studies for further analysis and validation of measurements of various parameters of the pelvis after total hip arthroplasty.
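    The thickness-to-grey step follows the familiar exponential attenuation of X-rays with traversed material (Beer-Lambert). A minimal sketch, in which the attenuation coefficient and intensity scale are invented constants rather than the values the paper fit from its phantom image:

```python
import math

MU = 0.3    # assumed attenuation coefficient per mm (illustrative)
I0 = 255.0  # unattenuated beam mapped to white in an 8-bit image

def thickness_to_grey(thickness_mm):
    """Map the metal thickness a beam traverses (mm) to an 8-bit grey level."""
    return round(I0 * math.exp(-MU * thickness_mm))

# One simulated scanline: thicker prosthesis sections render darker.
row = [thickness_to_grey(t) for t in (0.0, 1.0, 5.0, 20.0)]
```

    The paper's calibration works in the opposite direction — fitting this exponential mapping against a real phantom image — while the forward direction above is all that is needed to render a simulated pixel.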

  20. 2nd international conference on advanced nanomaterials and nanotechnology

    CERN Document Server

    Goswami, D; Perumal, A

    2013-01-01

    Nanoscale science and technology have occupied centre stage globally in modern scientific research and discourses in the early twenty first century. The enabling nature of the technology makes it important in modern electronics, computing, materials, healthcare, energy and the environment. This volume contains selected articles presented (as Invited/Oral/Poster presentations) at the 2nd international conference on advanced materials and nanotechnology (ICANN-2011) held recently at the Indian Institute of Technology Guwahati, during Dec 8-10, 2011. The topics covered in these proceedings include: synthesis and self-assembly of nanomaterials; nanoscale characterisation; nanophotonics & nanoelectronics; nanobiotechnology; nanocomposites; nanomagnetism; nanomaterials for energy; computational nanotechnology; and commercialization of nanotechnology. The conference was represented by around 400 participants from several countries including delegates invited from USA, Germany, Japan, UK, Taiwan, Italy, Singapor...

  1. 2nd Colombian Congress on Computational Biology and Bioinformatics

    CERN Document Server

    Cristancho, Marco; Isaza, Gustavo; Pinzón, Andrés; Rodríguez, Juan

    2014-01-01

    This volume compiles accepted contributions for the 2nd Edition of the Colombian Computational Biology and Bioinformatics Congress CCBCOL, after a rigorous review process in which 54 papers were accepted for publication from 119 submitted contributions. Bioinformatics and Computational Biology are areas of knowledge that have emerged due to advances that have taken place in the Biological Sciences and its integration with Information Sciences. The expansion of projects involving the study of genomes has led the way in the production of vast amounts of sequence data which needs to be organized, analyzed and stored to understand phenomena associated with living organisms related to their evolution, behavior in different ecosystems, and the development of applications that can be derived from this analysis.

  2. Preliminary Design of 13MHz RF Implanter 2nd Cavity

    International Nuclear Information System (INIS)

    A 13 MHz radio frequency (RF) cavity for an RF implanter has been designed and fabricated at the Proton Engineering Frontier Project (PEFP). The implanter consists of an ion source, a focusing magnet, an RF cavity, a bending magnet and a diagnostic chamber. Nowadays, inductors can be found in almost every electrical and electronic product. These key components are needed to store electrical energy, select frequencies, and protect against overvoltage and overcurrent. For inductors, which usually operate in the radio-frequency range, one of the most important attributes is the quality factor. The 13 MHz RF implanter cavity consists of a coil with a circular cross-section, accelerating electrodes directly coupled to the beginning and the end of the coil, and a ground electrode for the inductor. The purpose of this paper is to explain the preliminary design concept of the 2nd cavity, aimed at achieving higher beam energy.

  3. 2nd International Multidisciplinary Microscopy and Microanalysis Congress

    CERN Document Server

    Oral, Ahmet; Ozer, Mehmet

    2015-01-01

    The 2nd International Multidisciplinary Microscopy and Microanalysis Congress & Exhibition (InterM 2014) was held on 16–19 October 2014 in Oludeniz, Fethiye/ Mugla, Turkey. The aim of the congress was to gather scientists from various branches and discuss the latest improvements in the field of microscopy. The focus of the congress has been widened in an "interdisciplinary" manner, so as to allow all scientists working on several related subjects to participate and present their work. These proceedings include 33 peer-reviewed technical papers, submitted by leading academic and research institutions from over 17 countries and representing some of the most cutting-edge research available. The papers were presented at the congress in the following sessions: ·         Applications of Microscopy in the Physical Sciences ·         Applications of Microscopy in the Biological Sciences.

  4. 2nd International Conference on NeuroRehabilitation

    CERN Document Server

    Andersen, Ole; Akay, Metin

    2014-01-01

    The book is the proceedings of the 2nd International Conference on NeuroRehabilitation (ICNR 2014), held 24th-26th June 2014 in Aalborg, Denmark. The conference featured the latest highlights in the emerging and interdisciplinary field of neural rehabilitation engineering and identified important healthcare challenges the scientific community will be faced with in the coming years. Edited and written by leading experts in the field, the book includes keynote papers, regular conference papers, and contributions to special and innovation sessions, covering the following main topics: neuro-rehabilitation applications and solutions for restoring impaired neurological functions; cutting-edge technologies and methods in neuro-rehabilitation; and translational challenges in neuro-rehabilitation. Thanks to its highly interdisciplinary approach, the book will not only be a  highly relevant reference guide for academic researchers, engineers, neurophysiologists, neuroscientists, physicians and physiotherapists workin...

  5. 2nd CEAS Specialist Conference on Guidance, Navigation and Control

    CERN Document Server

    Mulder, Bob; Choukroun, Daniel; Kampen, Erik-Jan; Visser, Coen; Looye, Gertjan

    2013-01-01

    Following the successful 1st CEAS (Council of European Aerospace Societies) Specialist Conference on Guidance, Navigation and Control (CEAS EuroGNC) held in Munich, Germany in 2011, Delft University of Technology happily accepted the invitation to organize the 2nd CEAS EuroGNC in Delft, The Netherlands in 2013. The goal of the conference is to promote new advances in aerospace GNC theory and technologies for enhancing safety, survivability, efficiency, performance, autonomy and intelligence of aerospace systems using on-board sensing, computing and systems. A great push for new developments in GNC is the ever higher safety and sustainability requirements in aviation. Impressive progress was made in new research fields such as sensor and actuator fault detection and diagnosis, reconfigurable and fault tolerant flight control, online safe flight envelope prediction and protection, online global aerodynamic model identification, online global optimization and flight upset recovery. All of these challenges de...

  6. Software quality assurance project for reactor physics codes at the Point Lepreau Generating Station

    International Nuclear Information System (INIS)

    One of the ongoing challenges faced by the Nuclear Industry is Software Quality Assurance (SQA). In this paper, a project to address SQA issues in the Reactor Physics Group at the Point Lepreau Generating Station (PLGS) will be discussed. The work illustrates a process which could be implemented at any facility to achieve code compliance to CSA Standard N286.7 requirements. (author)

  7. Real-time infrared scene generation software for I2RSS hardware in the loop

    Science.gov (United States)

    Lyles, Patrick V.; Cosby, David S.; Buford, James A., Jr.; Bunfield, Dennis H.

    2005-05-01

    This paper describes the current research and development of advanced scene generation technology for integration into the I2RSS Hardware-in-the-Loop (HWIL) facilities at the US Army AMRDEC at Redstone Arsenal, AL. A real-time dynamic infrared (IR) scene generator has been developed in support of a high altitude scenario, leveraging COTS hardware and open source software. The Multi-Spectral Mode Scene Generator (MMSG) is an extensible software architecture that is powerful yet flexible. The I2RSS scene generator implements dynamic signatures by integrating signature prediction codes with open source software, COTS hardware, and custom-built interfaces. A modular, plug-in framework has been developed that supports rapid reconfiguration to permit the use of a variety of state data input sources, geometric model formats, and signature and material databases. The platform-independent software yields a cost-effective upgrade path to integrate best-of-breed graphics and system architectures.

  8. Real time software for a heat recovery steam generator control system

    Energy Technology Data Exchange (ETDEWEB)

    Valdes, R.; Delgadillo, M.A.; Chavez, R. [Electrical Research Inst., Cuernavaca, Morelos (Mexico)

    1995-12-31

    This paper describes the development and successful implementation of real time software for the Heat Recovery Steam Generator (HRSG) control system of a Combined Cycle Power Plant. The real time software for the HRSG control system physically resides in a Control and Acquisition System (SAC) which is a component of a distributed control system (DCS). The SAC is a programmable controller. The DCS installed at the Gomez Palacio power plant in Mexico accomplishes the functions of logic, analog and supervisory control. The DCS is based on microprocessors and the architecture consists of workstations operating as a Man-Machine Interface (MMI), linked to SAC controllers by means of a communication system. The HRSG real time software is composed of an operating system, drivers, dedicated computer programs and application computer programs. The operating system used for the development of this software was the MultiTasking Operating System (MTOS). The application software developed at IIE for the HRSG control system basically consisted of a set of digital algorithms for the regulation of the main process variables at the HRSG. By using the multitasking feature of MTOS, the algorithms are executed pseudo-concurrently. In this way, the application programs continuously use the resources of the operating system to perform their functions through a uniform service interface. The application software of the HRSG consists of three tasks, each with dedicated responsibilities. The drivers were developed for the handling of hardware resources of the SAC controller, which in turn allows signal acquisition and data communication with an MMI. The dedicated programs were developed for hardware diagnostics, task initialization, access to the data base and fault tolerance. The application software and the dedicated software for the HRSG control system were developed in the C programming language for its compactness, portability and efficiency.
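    The kind of digital regulation algorithm such a periodic task runs can be sketched as a discrete PI step. The gains, setpoint, sample period, and toy first-order plant below are invented for illustration, and the MTOS scheduler of the paper is replaced by a plain loop:

```python
KP, KI, DT = 2.0, 0.5, 0.1  # assumed proportional/integral gains, period (s)

def pi_step(setpoint, measurement, integral):
    """One control cycle: return (actuator output, updated integral state)."""
    error = setpoint - measurement
    integral += error * DT
    return KP * error + KI * integral, integral

# Drive a toy first-order process (e.g. a drum level) toward the setpoint.
level, integral = 0.0, 0.0
for _ in range(500):
    u, integral = pi_step(10.0, level, integral)
    level += DT * (u - level)  # Euler step of the illustrative plant
```

    In the real system each `pi_step`-like cycle would be triggered by the operating system's scheduler at the sample period, which is what makes the pseudo-concurrent execution of several regulation loops possible.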

  9. Minimal Testcase Generation for Object-Oriented Software with State Charts

    Directory of Open Access Journals (Sweden)

    Ranjita Kumari Swain

    2012-08-01

    Full Text Available Today statecharts are a de facto standard in industry for modeling system behavior. Test data generation is one of the key issues in software testing. This paper proposes a reduction approach to test data generation for state-based software testing. First, a state transition graph is derived from the statechart diagram. Then, all the required information is extracted from the statechart diagram and test cases are generated. Lastly, the set of test cases is minimized by calculating the node coverage for each test case; it is also determined which test cases are covered by other test cases. The advantage of our test generation technique is that it optimizes test coverage while minimizing time and cost. The present test data generation scheme generates test cases which satisfy transition path coverage criteria, path coverage criteria and action coverage criteria. A case study on a Railway Ticket Vending Machine (RTVM) has been presented to illustrate our approach.
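    The minimization step — dropping test cases whose covered nodes are subsumed by others — can be sketched as a greedy set cover. The test-case names and node identifiers below are invented for illustration, not taken from the RTVM case study:

```python
# Each test case maps to the set of state-graph nodes it covers (invented).
suite = {
    "tc1": {"idle", "select", "pay", "print"},
    "tc2": {"idle", "select"},         # subsumed by tc1
    "tc3": {"idle", "select", "cancel"},
    "tc4": {"idle", "select", "pay"},  # subsumed by tc1
}

def minimize(suite):
    """Greedily keep test cases until every node is covered at least once."""
    required = set().union(*suite.values())
    kept, covered = [], set()
    while covered != required:
        # Pick the case contributing the most still-uncovered nodes.
        best = max(suite, key=lambda t: len(suite[t] - covered))
        kept.append(best)
        covered |= suite[best]
    return kept

reduced = minimize(suite)
```

    Greedy set cover is a standard heuristic for this problem; it is not guaranteed minimal in general, but it removes exactly the subsumed cases in small suites like this one.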

  10. Software for Generating Troposphere Corrections for InSAR Using GPS and Weather Model Data

    Science.gov (United States)

    Moore, Angelyn W.; Webb, Frank H.; Fishbein, Evan F.; Fielding, Eric J.; Owen, Susan E.; Granger, Stephanie L.; Bjoerndahl, Fredrik; Loefgren, Johan; Fang, Peng; Means, James D.; Bock, Yehuda; Tong, Xiaopeng

    2013-01-01

    Atmospheric errors due to the troposphere are a limiting error source for spaceborne interferometric synthetic aperture radar (InSAR) imaging. This software generates tropospheric delay maps that can be used to correct atmospheric artifacts in InSAR data. The software automatically acquires all needed GPS (Global Positioning System), weather, and Digital Elevation Map data, and generates a tropospheric correction map using a novel algorithm for combining GPS and weather information while accounting for terrain. Existing JPL software was prototypical in nature, required a MATLAB license, required additional steps to acquire and ingest needed GPS and weather data, and did not account for topography in interpolation. Previous software did not achieve a level of automation suitable for integration in a Web portal. This software overcomes these issues. GPS estimates of tropospheric delay are a source of corrections that can be used to form correction maps to be applied to InSAR data, but the spacing of GPS stations is insufficient to remove short-wavelength tropospheric artifacts. This software combines interpolated GPS delay with weather model precipitable water vapor (PWV) and a digital elevation model to account for terrain, increasing the spatial resolution of the tropospheric correction maps and thus removing short wavelength tropospheric artifacts to a greater extent. It will be integrated into a Web portal request system, allowing use in a future L-band SAR Earth radar mission data system. This will be a significant contribution to its technology readiness, building on existing investments in in situ space geodetic networks, and improving timeliness, quality, and science value of the collected data.
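    The combination of sparse GPS delays with terrain can be pictured as interpolating station zenith delays to a target pixel while rescaling each to the pixel's DEM height. The station values, the 2 km wet-delay scale height, and the inverse-distance weighting below are all illustrative assumptions, not the paper's actual algorithm:

```python
import math

STATIONS = [  # (x_km, y_km, height_m, zenith_wet_delay_m) -- invented values
    (0.0, 0.0, 100.0, 0.120),
    (40.0, 0.0, 900.0, 0.080),
    (0.0, 40.0, 300.0, 0.110),
]
H_SCALE = 2000.0  # assumed exponential scale height of the wet delay (m)

def delay_at(x, y, height):
    """Inverse-distance-weighted delay, reduced to the pixel's DEM height."""
    num = den = 0.0
    for sx, sy, sh, zwd in STATIONS:
        # Rescale each station's delay to the target height before weighting,
        # so terrain is accounted for in the interpolation.
        adjusted = zwd * math.exp(-(height - sh) / H_SCALE)
        w = 1.0 / (math.hypot(x - sx, y - sy) + 1e-6)
        num += w * adjusted
        den += w
    return num / den

low = delay_at(20.0, 20.0, 200.0)    # valley pixel: more atmosphere above it
high = delay_at(20.0, 20.0, 1500.0)  # mountain pixel: less delay
```

    Plain horizontal interpolation would give both pixels the same delay; the height term is what lets the correction map resolve terrain-correlated artifacts.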

  11. Measurement of the (5p1/2nd)3 auto-ionizing levels of strontium

    International Nuclear Information System (INIS)

    The (5p1/2nd)3 (n = 11 to 24) auto-ionizing levels of Sr have been measured by the laser multi-step excitation technique. Their effective quantum numbers have been determined. The interaction with the (5p3/2nd)3 series is discussed.

  12. Study on Oxidation State of U in Ba2NdUO6

    International Nuclear Information System (INIS)

    Ba2NdUO6 is one of the important compounds formed during a solidification process for high-level liquid waste using the super-high-temperature method. Ba2NdUO6 has an ordered perovskite structure. The objective of this study is to investigate the oxidation state of U in Ba2NdUO6. The properties of Ba2NdUO6 were observed using a Faraday-type torsion magnetometer and an X-ray Photoelectron Spectrometer (XPS). The magnetic susceptibility, measured in the temperature range of 4 K to room temperature, showed that Ba2NdUO6 is paramagnetic and obeys the Curie-Weiss law. The effective moment of Ba2NdUO6 is 3.04 μB. The XPS spectrum showed that the U 4f peaks for Ba2NdUO6 appeared exactly between the binding energies of UO2 and UO3. It can be concluded that Ba2NdUO6 has binding energy peaks corresponding to pentavalent uranium.

  13. Psychiatric Diagnosis and Concomitant Medical Treatment for 1st and 2nd Grade Children

    Science.gov (United States)

    Cornell-Swanson, La Vonne; Frankenberger, William; Ley, Katie; Bowman, Krista

    2007-01-01

    This study examined the proportion of children in 1st and 2nd grade classes who were currently prescribed medication for psychotropic disorders. The study also examined the attitudes of 1st and 2nd grade teachers toward diagnosis of psychiatric disorders and use of psychiatric medication to treat children. Results of the current study indicate…

  14. Software architecture for control and data acquisition of linear plasma generator Magnum-PSI

    Energy Technology Data Exchange (ETDEWEB)

    Groen, P.W.C., E-mail: p.w.c.groen@differ.nl [FOM Institute DIFFER – Dutch Institute For Fundamental Energy Research, Association EURATOM-FOM, Partner in the Trilateral Euregio Cluster, P.O. Box 1207, 3430 BE, Nieuwegein (Netherlands); Beveren, V. van; Broekema, A.; Busch, P.J.; Genuit, J.W.; Kaas, G.; Poelman, A.J.; Scholten, J.; Zeijlmans van Emmichoven, P.A. [FOM Institute DIFFER – Dutch Institute For Fundamental Energy Research, Association EURATOM-FOM, Partner in the Trilateral Euregio Cluster, P.O. Box 1207, 3430 BE, Nieuwegein (Netherlands)

    2013-10-15

    Highlights: • An architecture based on a modular design. • The design offers flexibility and extendability. • The design covers the overall software architecture. • It also covers its (sub)systems’ internal structure. -- Abstract: The FOM Institute DIFFER – Dutch Institute for Fundamental Energy Research has completed the construction phase of Magnum-PSI, a magnetized, steady-state, large area, high-flux linear plasma beam generator to study plasma surface interactions under ITER divertor conditions. Magnum-PSI consists of several hardware subsystems, and a variety of diagnostic systems. The COntrol, Data Acquisition and Communication (CODAC) system integrates these subsystems and provides a complete interface for the Magnum-PSI users. Integrating it all, from the lowest hardware level of sensors and actuators, via the level of networked PLCs and computer systems, up to functions and classes in programming languages, demands a sound and modular software architecture, which is extendable and scalable for future changes. This paper describes this architecture, and the modular design of the software subsystems. The design is implemented in the CODAC system at the level of services and subsystems (the overall software architecture), as well as internally in the software subsystems.

  15. Software architecture for control and data acquisition of linear plasma generator Magnum-PSI

    International Nuclear Information System (INIS)

    Highlights: • An architecture based on a modular design. • The design offers flexibility and extendability. • The design covers the overall software architecture. • It also covers its (sub)systems’ internal structure. -- Abstract: The FOM Institute DIFFER – Dutch Institute for Fundamental Energy Research has completed the construction phase of Magnum-PSI, a magnetized, steady-state, large area, high-flux linear plasma beam generator to study plasma surface interactions under ITER divertor conditions. Magnum-PSI consists of several hardware subsystems, and a variety of diagnostic systems. The COntrol, Data Acquisition and Communication (CODAC) system integrates these subsystems and provides a complete interface for the Magnum-PSI users. Integrating it all, from the lowest hardware level of sensors and actuators, via the level of networked PLCs and computer systems, up to functions and classes in programming languages, demands a sound and modular software architecture, which is extendable and scalable for future changes. This paper describes this architecture, and the modular design of the software subsystems. The design is implemented in the CODAC system at the level of services and subsystems (the overall software architecture), as well as internally in the software subsystems.

  16. Book review: Psychology in a work context (2nd Ed.)

    Directory of Open Access Journals (Sweden)

    Nanette Tredoux

    2003-10-01

    Full Text Available Bergh, Z. & Theron, A.L. (Eds (2003 Psychology in a work context (2nd Ed.. Cape Town: Oxford University Press.

    This book is an overview and introduction to Industrial and Organisational Psychology. It is a work of ambitious scope, and it is clear that the contributors have invested a great deal of thought and effort in the planning and execution of the book. The current version is the second edition, and it looks set to become one of those standard textbooks that are revised every few years to keep up with changing times. It is a handsome volume, produced to a high standard of editorial care, pleasingly laid out and organised well enough to be useful as an occasional reference source. An English-Afrikaans glossary, tables of contents for every chapter as well as for the entire book, a comprehensive index and extensive bibliography make it easy to retrieve the information relating to a particular topic. Every chapter ends with a conclusion summarising the gist of the material covered. Quality illustrations lighten the tone and help to bring some of the concepts to life. Learning outcomes and self-assessment exercises and questions for every chapter will be useful to the lecturer using the book as a source for a tutored course, and for the student studying by distance learning. If sold at the suggested retail price, the book represents good value compared to imported textbooks that cover similar ground.

  17. Book Review: Digital Forensic Evidence Examination (2nd ed.)

    Directory of Open Access Journals (Sweden)

    Gary Kessler

    2010-06-01

    Full Text Available Cohen, F. (2010). Digital Forensic Evidence Examination (2nd ed.). Livermore, CA: ASP Press. 452 pages, ISBN: 978-1-878109-45-3, US$79. Reviewed by Gary C. Kessler, Gary Kessler Associates & Edith Cowan University (gck@garykessler.net). On the day that I sat down to start to write this review, the following e-mail came across on one of my lists: Person A and Person B write back and forth and create an email thread. Person A then forwards the email to Person C, but changes some wording in the email exchange between A & B. What is the easiest way (and is it even possible) to find out when that earlier email message was altered before it was sent to Person C? Before you try to answer these questions, read Fred Cohen's Digital Forensic Evidence Examination. His book won't actually tell you how to answer these questions, but it will help you understand the difficulty in even trying to answer them with any level of certainty. (See PDF for full review.)

  18. CELEBRATED APRIL 2nd – INTERNATIONAL DAY OF PERSONS WITH AUTISM

    Directory of Open Access Journals (Sweden)

    Manuela KRCHANOSKA

    2014-09-01

    Full Text Available On April 2nd, the Macedonian Scientific Society for Autism, for the fourth time, organized an event on the occasion of the International Day of Persons with Autism. The event, of a cultural and artistic character, was held at the Museum of the Macedonian Struggle under the motto “They are not alone, we are with them”. The huge number of citizens only confirmed the motto. It seemed that the hall of the Museum of the Macedonian Struggle was too small for the warm hearts of the audience. More than 300 guests were present in the hall, among them children with autism and their families, prominent professors, doctors, special educators and rehabilitators, psychologists, students and other citizens who gladly decided to enrich the event with their presence. The event was opened by the violinist Plamenka Trajkovska, who performed one song. After her, the President of the Macedonian Scientific Society for Autism, PhD Vladimir Trajkovski, delivered his speech. The professor told the parents of autistic children, who were present in large numbers, not to lose hope, to fight for their children, and that the Macedonian Scientific Society for Autism will provide tremendous support and assistance in this struggle.

  19. Testing Product Generation in Software Product Lines Using Pairwise for Features Coverage

    Science.gov (United States)

    Pérez Lamancha, Beatriz; Polo Usaola, Macario

    A Software Product Line (SPL) is "a set of software-intensive systems sharing a common, managed set of features that satisfy the specific needs of a particular market segment or mission and that are developed from a common set of core assets in a prescribed way". Variability is a central concept that permits the generation of different products of the family by reusing core assets. It is captured through features which, for an SPL, define its scope. Features are represented in a feature model, which is later used to generate the products from the line. From the testing point of view, testing all the possible combinations in feature models is not practical because: (1) the number of possible combinations (i.e., combinations of features for composing products) may be intractable, and (2) some combinations may contain incompatible features. Thus, this paper addresses the problem by implementing combinatorial testing techniques adapted to the SPL context.
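    Pairwise (2-way) coverage is the usual combinatorial compromise: instead of all 2^n products, keep a small set of configurations in which every pair of feature values appears at least once. A minimal greedy sketch, with invented feature names and no feature-model constraints:

```python
from itertools import combinations, product

FEATURES = ["gps", "camera", "mp3", "bluetooth"]  # illustrative features

def all_pairs(n):
    """Every (feature i, value a, feature j, value b) pair to be covered."""
    pairs = set()
    for i, j in combinations(range(n), 2):
        for a, b in product([0, 1], repeat=2):
            pairs.add((i, a, j, b))
    return pairs

def pairwise_suite(n):
    """Greedily pick products until all pairs of feature values are covered."""
    uncovered, suite = all_pairs(n), []
    while uncovered:
        # Brute-force greedy over the 2**n candidates (fine for small n;
        # real pairwise tools use scalable heuristics instead).
        best = max(product([0, 1], repeat=n),
                   key=lambda cfg: sum((i, cfg[i], j, cfg[j]) in uncovered
                                       for i, j in combinations(range(n), 2)))
        suite.append(best)
        uncovered -= {(i, best[i], j, best[j])
                      for i, j in combinations(range(n), 2)}
    return suite

suite = pairwise_suite(len(FEATURES))
```

    For four boolean features this yields a handful of products instead of all sixteen; in a real SPL the candidate set would also be filtered against the feature model to exclude incompatible combinations.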

  20. POLITO- A new open-source, platform independent software for generating high-quality lithostratigraphic columns

    Directory of Open Access Journals (Sweden)

    Cipran C. Stremtan

    2010-08-01

    Full Text Available POLITO is free, open-source, platform-independent software which can automatically generate lithostratigraphic columns from field data. Its simple and easy-to-use interface allows users to manipulate large datasets and create high-quality graphical outputs, either in editable vector or raster format, or as PDF files. POLITO uses USGS standard lithology patterns and can be downloaded from its Sourceforge project page (http://sourceforge.net/projects/polito/).

  1. RMAWGEN: A software project for a daily Multi-Site Weather Generator with R

    Science.gov (United States)

    Cordano, E.; Eccel, E.

    2012-04-01

    Modeling in climate change applications for agricultural or hydrological purposes often requires daily time series of precipitation and temperature. This is the case for series downscaled from monthly or seasonal predictions of Global Climate Models (GCMs). This poster presents a software project, the R package RMAWGEN (R Multi-Sites Auto-regressive Weather GENerator), which generates daily temperature and precipitation time series at several sites using the theory of vector auto-regressive (VAR) models. The VAR model is used because it is able to maintain the temporal and spatial correlations among the several series. In particular, observed time series of daily maximum and minimum temperature and precipitation are used to calibrate the parameters of a VAR model (saved as "GPCAvarest2" or "varest2" classes, which inherit the "varest" S3 class defined in the package vars [Pfaff, 2008]). The VAR model, coupled with monthly mean weather variables downscaled from GCM predictions, then allows the generation of several stochastic daily scenarios. The structure of the package consists of functions that transform precipitation and temperature time series into Gaussian-distributed random variables through deseasonalization and Principal Component Analysis. A VAR model is then calibrated on the transformed time series, and the series generated by the VAR are inversely re-transformed into precipitation and/or temperature series. An application is included in the software package as an example; it uses a dataset of daily weather time series recorded at 59 different sites in Trentino (Italy) and its neighborhood for the period 1958-2007. The software is distributed as Free Software under the General Public License (GPL) and is available on the CRAN website (http://cran.r-project.org/web/packages/RMAWGEN/index.html).
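    The core VAR idea — tomorrow's Gaussianized anomalies are a linear function of today's across all sites, plus noise — can be sketched as follows. RMAWGEN itself is an R package; this is a toy two-site VAR(1) in Python with invented coefficients, not calibrated values:

```python
import random

# Cross-site coefficient matrix: each site's anomaly depends on both sites'
# previous day, which is what preserves spatial correlation (values invented).
A = [[0.6, 0.2],
     [0.2, 0.6]]
NOISE_SD = 1.0

def simulate(n_days, seed=0):
    """Generate daily anomaly pairs (site 1, site 2) from the VAR(1) model."""
    rng = random.Random(seed)
    x, series = [0.0, 0.0], []
    for _ in range(n_days):
        x = [A[i][0] * x[0] + A[i][1] * x[1] + rng.gauss(0, NOISE_SD)
             for i in range(2)]
        series.append(x)
    return series

series = simulate(5000)
```

    With these coefficients the two generated series remain positively correlated, mimicking the spatial-correlation property for which the VAR model is chosen; the full package adds the Gaussianization, deseasonalization, and inverse transform around this core.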

  2. Simulation of photovoltaic systems electricity generation using homer software in specific locations in Serbia

    OpenAIRE

    Pavlović Tomislav M.; Milosavljević Dragana D.; Pirsl Danica S.

    2013-01-01

    In this paper, basic information is given on the Homer software for PV system electricity generation and on the NASA Surface meteorology and solar energy, RETScreen, PVGIS and HMIRS (Hydrometeorological Institute of Republic of Serbia) solar databases. The comparison of the monthly average values for daily solar radiation per square meter received by the horizontal surface taken from the NASA, RETScreen, PVGIS and HMIRS solar databases for three locations in Serbia (Belgrade, Negotin and Zlati...

  3. SEMANTIC WEB-BASED SOFTWARE ENGINEERING BY AUTOMATED REQUIREMENTS ONTOLOGY GENERATION IN SOA

    Directory of Open Access Journals (Sweden)

    Vahid Rastgoo

    2014-04-01

    Full Text Available This paper presents an approach for the automated generation of a requirements ontology from UML diagrams in service-oriented architecture (SOA). The goal is to facilitate software engineering processes such as software design, software reuse, and service discovery. The proposed method is based on four conceptual layers. The first layer contains the requirements obtained from stakeholders; the second designs service-oriented diagrams from the data in the first layer and extracts their XMI code. The third layer comprises a requirements ontology and a protocol ontology that semantically describe the behavior of services and the relationships between them. Finally, the fourth layer standardizes the concepts present in the ontologies of the previous layer. The generated ontology goes beyond a pure domain ontology because it captures the behavior of services as well as their hierarchical relationships. Experimental results on a set of UML4Soa diagrams from different scopes demonstrate the improvement of the proposed approach from different points of view, such as completeness of the requirements ontology, automatic generation, and consideration of SOA.
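    As a toy illustration of the second and third layers (extracting XMI from service diagrams and deriving ontology relations), the Python sketch below parses a hypothetical, much-simplified XMI fragment into subject-predicate-object triples. The element and attribute names are invented for the example and do not follow the real XMI schema or the paper's tooling.

```python
import xml.etree.ElementTree as ET

# Invented, simplified stand-in for an XMI export of a service diagram.
XMI = """<XMI><Model>
  <Class name="OrderService"/>
  <Class name="PaymentService"/>
  <Association source="OrderService" target="PaymentService"/>
</Model></XMI>"""

def xmi_to_triples(xmi_text):
    """Map classes to ontology concepts and associations to relations."""
    root = ET.fromstring(xmi_text)
    triples = [(c.get("name"), "rdf:type", "owl:Class")
               for c in root.iter("Class")]
    triples += [(a.get("source"), "dependsOn", a.get("target"))
                for a in root.iter("Association")]
    return triples
```

A real pipeline would also capture service behavior (operations, protocols), which is what distinguishes the paper's ontology from a plain domain ontology.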

  4. Wind-US Results for the AIAA 2nd Propulsion Aerodynamics Workshop

    Science.gov (United States)

    Dippold, Vance III; Foster, Lancert; Mankbadi, Mina

    2014-01-01

    This presentation contains Wind-US results presented at the 2nd Propulsion Aerodynamics Workshop. The workshop was organized by the American Institute of Aeronautics and Astronautics, Air Breathing Propulsion Systems Integration Technical Committee with the purpose of assessing the accuracy of computational fluid dynamics for air breathing propulsion applications. Attendees included representatives from government, industry, academia, and commercial software companies. Participants were encouraged to explore and discuss all aspects of the simulation process including the effects of mesh type and refinement, solver numerical schemes, and turbulence modeling. The first set of challenge cases involved computing the thrust and discharge coefficients for a 25deg conical nozzle for a range of nozzle pressure ratios between 1.4 and 7.0. Participants were also asked to simulate two cases in which the 25deg conical nozzle was bifurcated by a solid plate, resulting in vortex shedding (NPR=1.6) and shifted plume shock (NPR=4.0). A second set of nozzle cases involved computing the discharge and thrust coefficients for a convergent dual stream nozzle for a range of subsonic nozzle pressure ratios. The workshop committee also compared the plume mixing of these cases across various codes and models. The final test case was a serpentine inlet diffuser with an outlet to inlet area ratio of 1.52 and an offset of 1.34 times the inlet diameter. Boundary layer profiles, wall static pressure, and total pressure at downstream rake locations were examined.

  5. 2nd International Open and Distance Learning (IODL) Symposium

    Directory of Open Access Journals (Sweden)

    Reviewed by Murat BARKAN

    2006-10-01

    Full Text Available These closing remarks were prepared and presented by Prof. Dr. Murat BARKAN, Anadolu University, Eskisehir, TURKEY. DEAR GUESTS, As the 2nd International Open and Distance Learning (IODL) Symposium is now drawing to a close, I would like to thank you all for your outstanding speeches, distinguished presentations, constructive roundtable and session discussions, and active participation during the last five days. I hope you all share my view that the whole symposium has been a very stimulating and successful experience. Also, on behalf of all the participants, I would like to take this opportunity to thank and congratulate the symposium organization committee for their excellent job in organizing and hosting our 2nd meeting. Throughout the symposium, five workshops, six keynote speeches and 66 papers, prepared by more than 150 academicians and practitioners from 23 different countries, reflected remarkable and varied views and approaches to open and flexible learning. Besides all these academic endeavors, 13 educational films were shown during the symposium. The technology exhibition, hosted by seven companies, was very effective in showcasing the current level of the technology and the success of applying theory in practice. Now I would like to go over what our workshop and keynote presenters shared with us: Prof. Marina McIsaac from Arizona State University dwelled on how to determine research topics worth examining and how to choose appropriate research designs and methods. She gave us clues on how to get articles published in professional journals. Prof. Colin Latchem from Australia and Prof. Dr. Ali Ekrem Ozkul from Anadolu University pointed to the importance of strategic planning for distance and flexible learning. They highlighted the advantages of strategic planning for policy-makers, planners, managers and staff. Dr. Wolfram Laaser from the FernUniversität in Hagen presented different multimedia clips and directed interactive exercises using Flash MX in his workshop. Jack Koumi from the UK presented a workshop on what to teach on video and when to choose other media. He exemplified 27 added-value techniques and teaching functions for TV and video. He later specified the different capabilities and limitations of eight different media used in teaching, emphasizing the importance of optimizing media deployment. Dr. Janet Bohren from the University of Cincinnati and Jennifer McVay-Dyche from United Theological Seminary explained their experience with a course management system used to develop dialogue between K-12 teachers in Turkey and the US on the topics of religion, culture and schools. Their workshop provided an overview of a pilot study. They showed us a good case study of utilizing "Blackboard" as a means of overcoming biases and improving American and Turkish teachers' understanding of one another. We had very remarkable keynotes as well. Dr. Nikitas Kastis, representing the European Distance and E-Learning Network (EDEN), made his speech on distance and e-learning evolutions and trends in Europe. He informed the audience about the application and assessment criteria at the European scale concerning e-learning in education and training systems. Meanwhile, our keynote speakers drew our attention to different applications of virtual learning. Dr. Piet Kommers from the University of Twente exemplified a virtual training environment for acquiring surgical skills. Dr. Timothy Shih from Tamkang University presented their project called Hard SCORM (Sharable Content Object Reference Model) as an asynchronous distance learning specification. In his speech titled "Engaging and Supporting Problem Solving Online", Prof. David Jonassen from the University of Missouri reflected his vision of the future of education and explained why it should embrace problem solving. Then he showed us examples of incorporating this vision into learning environments to make online problem solving possible. Dr. Wolfram Laaser from the FernUniversität talked on applications of ICT in Europe

  6. PREFACE: 2nd National Conference on Nanotechnology 'NANO 2008'

    Science.gov (United States)

    Czuba, P.; Kolodziej, J. J.; Konior, J.; Szymonski, M.

    2009-03-01

    This issue of Journal of Physics: Conference Series contains selected papers presented at the 2nd National Conference on Nanotechnology 'NANO2008', that was held in Kraków, Poland, 25-28 June 2008. It was organized jointly by the Polish Chemical Society, Polish Physical Society, Polish Vacuum Society, and the Centre for Nanometer-scale Science and Advanced Materials (NANOSAM) of the Jagiellonian University. The meeting presentations were categorized into the following topics: 1. Nanomechanics and nanotribology 2. Characterization and manipulation in nanoscale 3. Quantum effects in nanostructures 4. Nanostructures on surfaces 5. Applications of nanotechnology in biology and medicine 6. Nanotechnology in education 7. Industrial applications of nanotechnology, presentations of the companies 8. Nanoengineering and nanomaterials (international sessions shared with the fellows of Maria-Curie Host Fellowships within the 6th FP of the European Community Project 'Nano-Engineering for Expertise and Development, NEED') 9. Nanopowders 10. Carbon nanostructures and nanosystems 11. Nanoelectronics and nanophotonics 12. Nanomaterials in catalysis 13. Nanospintronics 14. Ethical, social, and environmental aspects of nanotechnology The Conference was attended by 334 participants. The presentations were delivered as 7 invited plenary lectures, 25 invited topical lectures, 78 oral and 108 poster contributions. Only 1/6 of the contributions presented during the Conference were submitted for publication in this Proceedings volume. From the submitted material, this volume of Journal of Physics: Conference Series contains 37 articles that were positively evaluated by independent referees. The Organizing Committee gratefully acknowledges all these contributions. We also thank all the referees of the papers submitted for the Proceedings for their timely and thorough work. 
We would like to thank all members of the National Program Committee for their work in the selection process of invited and contributed papers and in setting up the scientific program of the Conference. P Czuba, J J Kolodziej, J Konior, M Szymonski Kraków, 30 October 2008

  7. Simulation of photovoltaic systems electricity generation using homer software in specific locations in Serbia

    Directory of Open Access Journals (Sweden)

    Pavlović Tomislav M.

    2013-01-01

    Full Text Available In this paper, basic information is given on the Homer software for PV system electricity generation and on the NASA Surface meteorology and solar energy, RETScreen, PVGIS and HMIRS (Hydrometeorological Institute of the Republic of Serbia) solar databases. The monthly average values of daily solar radiation per square meter received by a horizontal surface, taken from the NASA, RETScreen, PVGIS and HMIRS solar databases, are compared for three locations in Serbia (Belgrade, Negotin and Zlatibor). It was found that the annual average values of daily solar radiation taken from the RETScreen solar database are the closest to those taken from the HMIRS solar database for Belgrade, Negotin and Zlatibor. Monthly and annual totals of the electricity production of a fixed on-grid PV system of 1 kW with optimally inclined, south-oriented solar modules in Belgrade, Negotin and Zlatibor are calculated with HOMER software simulations based on daily solar radiation data taken from the NASA, RETScreen, PVGIS and HMIRS databases. The relative deviation of the electricity production of the fixed on-grid 1 kW PV system simulated with data from the NASA, RETScreen and PVGIS databases, compared to the production simulated with data from the HMIRS database, is given for Belgrade, Negotin and Zlatibor. [Project of the Ministry of Science of the Republic of Serbia, no. TR 33009]
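    The relative-deviation comparison reduces to a one-line formula. The sketch below (Python; the annual production figures are invented placeholders, not values from the paper) computes the percentage deviation of each database-driven simulation from the HMIRS-based result.

```python
def relative_deviation(value, reference):
    """Percentage deviation of a simulated output from a reference output."""
    return 100.0 * (value - reference) / reference

# Hypothetical annual outputs (kWh/year) of a 1 kW PV system, one per database.
annual_kwh = {"NASA": 1180.0, "RETScreen": 1150.0, "PVGIS": 1120.0}
hmirs = 1140.0  # invented HMIRS-based reference value

deviations = {db: round(relative_deviation(v, hmirs), 2)
              for db, v in annual_kwh.items()}
```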

  8. GENESIS: Agile Generation of Information Management Oriented Software / GENESIS: Generación ágil de software orientado a gestión de información

    Scientific Electronic Library Online (English)

    Claudia, Jiménez Guarín; Juan Erasmo, Gómez.

    2010-05-01

    Full Text Available The specification of an information system can be clear from the beginning: it must acquire, display, query and modify data, using a database. The issue is deciding which information to manage. In the case that gave rise to this work, the information evolves permanently, even [...] up to the end of the project. This implies building a new system each time the information is redefined. This article presents Genesis, an agile infrastructure for the immediate construction of whatever information system is required. Experts describe their information and queries, and Genesis produces the corresponding software, generating the appropriate graphical interfaces and database. Abstract in english The specification for an information system can be clear from the beginning: it must acquire, display, query and modify data, using a database. The main issue is to decide which information to manage. In the case originating this work, information was always evolving, even up to the end of the project [...]. This implies the construction of a new system each time the information is redefined. This article presents Genesis, an agile infrastructure for the immediate construction of required information systems. Experts describe their information needs and queries, and Genesis generates the corresponding application, with the appropriate graphical interfaces and database.

  9. 1st- and 2nd-order motion and texture resolution in central and peripheral vision

    Science.gov (United States)

    Solomon, J. A.; Sperling, G.

    1995-01-01

    STIMULI. The 1st-order stimuli are moving sine gratings. The 2nd-order stimuli are fields of static visual texture, whose contrasts are modulated by moving sine gratings. Neither the spatial slant (orientation) nor the direction of motion of these 2nd-order (microbalanced) stimuli can be detected by a Fourier analysis; they are invisible to Reichardt and motion-energy detectors. METHOD. For these dynamic stimuli, when presented both centrally and in an annular window extending from 8 to 10 deg in eccentricity, we measured the highest spatial frequency for which discrimination between +/- 45 deg texture slants and discrimination between opposite directions of motion were each possible. RESULTS. For sufficiently low spatial frequencies, slant and direction can be discriminated in both central and peripheral vision, for both 1st- and for 2nd-order stimuli. For both 1st- and 2nd-order stimuli, at both retinal locations, slant discrimination is possible at higher spatial frequencies than direction discrimination. For both 1st- and 2nd-order stimuli, motion resolution decreases 2-3 times more rapidly with eccentricity than does texture resolution. CONCLUSIONS. (1) 1st- and 2nd-order motion scale similarly with eccentricity. (2) 1st- and 2nd-order texture scale similarly with eccentricity. (3) The central/peripheral resolution fall-off is 2-3 times greater for motion than for texture.
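    A 2nd-order stimulus of the kind described, a static texture whose contrast is modulated by a sine grating, is easy to sketch numerically. The Python/NumPy fragment below is an illustrative construction (all parameter values are arbitrary, not the paper's): advancing `phase` across frames moves the contrast envelope while the carrier texture stays fixed, so the drifting signal lives in contrast rather than luminance.

```python
import numpy as np

def second_order_frame(n=128, cycles=4.0, phase=0.0, seed=1):
    """One frame of a contrast-modulated noise stimulus (illustrative)."""
    rng = np.random.default_rng(seed)          # fixed seed -> static carrier
    x = np.linspace(0.0, 1.0, n)
    carrier = rng.uniform(-1.0, 1.0, size=(n, n))           # static texture
    envelope = 0.5 * (1.0 + np.sin(2 * np.pi * cycles * x + phase))
    return carrier * envelope[np.newaxis, :]   # contrast modulation, ~zero mean
```

Because the carrier is zero-mean, the modulated frame has (approximately) no net luminance grating, which is the sense in which such microbalanced stimuli are invisible to Fourier motion-energy detectors.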

  10. 2nd Common Symposium of FP6 Ecobuildings Projects, 7-8 April 2008, Stuttgart, Germany

    OpenAIRE

    Erhorn-Kluttig, H.; Erhorn, H.

    2008-01-01

    The programme and the highlights of the presentations, posters and round table discussions at the 2nd Common Ecobuildings Symposium held in Stuttgart at the beginning of April 2008, are summarised in this information paper.

  11. Using Automated Theorem Provers to Certify Auto-Generated Aerospace Software

    Science.gov (United States)

    Denney, Ewen; Fischer, Bernd; Schumann, Johann

    2004-01-01

    We describe a system for the automated certification of safety properties of NASA software. The system uses Hoare-style program verification technology to generate proof obligations which are then processed by an automated first-order theorem prover (ATP). For full automation, however, the obligations must be aggressively preprocessed and simplified. We describe the unique requirements this places on the ATP and demonstrate how the individual simplification stages, which are implemented by rewriting, influence the ability of the ATP to solve the proof tasks. Experiments on more than 25,000 tasks were carried out using Vampire, Spass, and e-setheo.
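    The kind of rewrite-based simplification described can be illustrated with a toy fragment. The Python sketch below (the term encoding and the two rules are invented for illustration; the actual system's rewrite rule set is far larger and domain-specific) applies bottom-up rewriting of propositional simplifications until no rule fires:

```python
def simplify(term):
    """Bottom-up rewriting: simplify subterms first, then apply rules at the root.

    Terms are tuples ("op", arg1, arg2); atoms are strings or booleans."""
    if not isinstance(term, tuple):
        return term
    op, *args = term
    args = [simplify(a) for a in args]
    if op == "and":
        if args[0] is True:         # True AND x  ->  x
            return args[1]
        if args[1] is True:         # x AND True  ->  x
            return args[0]
    if op == "implies" and args[0] is True:
        return args[1]              # True -> x   ->  x
    return (op, *args)
```

Discharging trivially true obligations this way, before they ever reach the prover, is the point of the aggressive preprocessing the abstract mentions.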

  12. A practical comparison of de novo genome assembly software tools for next-generation sequencing technologies.

    Science.gov (United States)

    Zhang, Wenyu; Chen, Jiajia; Yang, Yang; Tang, Yifei; Shang, Jing; Shen, Bairong

    2011-01-01

    The advent of next-generation sequencing technologies has been accompanied by the development of many whole-genome sequence assembly methods and software tools, especially for de novo fragment assembly. Because little is known about the applicability and performance of these software tools, choosing a befitting assembler becomes a tough task. Here, we provide information on the adaptivity of each program and, above all, compare the performance of eight distinct tools against eight groups of simulated datasets from the Solexa sequencing platform. Considering computational time, maximum random access memory (RAM) occupancy, assembly accuracy and integrity, our study indicates that string-based assemblers and overlap-layout-consensus (OLC) assemblers are well suited for very short reads and for longer reads of small genomes, respectively. For large datasets of more than a hundred million short reads, De Bruijn graph-based assemblers would be more appropriate. In terms of software implementation, string-based assemblers are superior to graph-based ones, among which SOAPdenovo requires a complex configuration file to be created. Our comparison study will assist researchers in selecting a well-suited assembler and offers essential information for the improvement of existing assemblers and the development of novel ones. PMID:21423806
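    The De Bruijn graph idea behind the third class of assemblers can be shown in a few lines. The Python sketch below is a deliberately naive illustration (no error correction, no bubble or repeat handling, greedy walk) of how overlapping reads collapse into a contig via (k-1)-mer nodes:

```python
from collections import defaultdict

def de_bruijn(reads, k):
    """Nodes are (k-1)-mers; each k-mer in a read contributes one edge."""
    graph = defaultdict(list)
    for read in reads:
        for i in range(len(read) - k + 1):
            kmer = read[i:i + k]
            graph[kmer[:-1]].append(kmer[1:])
    return graph

def walk(graph, start):
    """Greedy walk consuming edges; each step appends one base to the contig."""
    contig, node = start, start
    while graph[node]:
        node = graph[node].pop(0)
        contig += node[-1]
    return contig
```

Real assemblers differ mainly in how they clean this graph and resolve repeats, which drives the time/RAM/accuracy trade-offs the study measures.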

  13. NeuroPG: open source software for optical pattern generation and data acquisition.

    Science.gov (United States)

    Avants, Benjamin W; Murphy, Daniel B; Dapello, Joel A; Robinson, Jacob T

    2015-01-01

    Patterned illumination using a digital micromirror device (DMD) is a powerful tool for optogenetics. Compared to a scanning laser, DMDs are inexpensive and can easily create complex illumination patterns. Combining these complex spatiotemporal illumination patterns with optogenetics allows DMD-equipped microscopes to probe neural circuits by selectively manipulating the activity of many individual cells or many subcellular regions at the same time. To use DMDs to study neural activity, scientists must develop specialized software to coordinate optical stimulation patterns with the acquisition of electrophysiological and fluorescence data. To meet this growing need we have developed NeuroPG, an open-source optical pattern generation software package for neuroscience that combines DMD control, sample visualization, and data acquisition in one application. Built on a MATLAB platform, NeuroPG can also process, analyze, and visualize data. The software is designed specifically for the Mightex Polygon400; however, as an open-source package, NeuroPG can be modified to incorporate any data acquisition, imaging, or illumination equipment that is compatible with MATLAB's Data Acquisition and Image Acquisition toolboxes. PMID:25784873
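    The core of DMD-based patterned stimulation is just a binary frame sent to the mirror array. The Python/NumPy sketch below (an assumption for illustration, not the NeuroPG or Mightex API, which is MATLAB-based) builds such a frame that illuminates several circular regions of interest simultaneously:

```python
import numpy as np

def dmd_mask(shape, rois):
    """Binary on/off frame for a DMD.

    rois: list of (row, col, radius) circular targets to illuminate."""
    rr, cc = np.ogrid[:shape[0], :shape[1]]
    mask = np.zeros(shape, dtype=bool)
    for r, c, rad in rois:
        mask |= (rr - r) ** 2 + (cc - c) ** 2 <= rad ** 2
    return mask
```

Sequencing such frames in time, synchronized with electrophysiology and imaging acquisition, is exactly the coordination problem the software addresses.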

  14. Evolution of a Reconfigurable Processing Platform for a Next Generation Space Software Defined Radio

    Science.gov (United States)

    Kacpura, Thomas J.; Downey, Joseph A.; Anderson, Keffery R.; Baldwin, Keith

    2014-01-01

    The National Aeronautics and Space Administration (NASA)/Harris Ka-Band Software Defined Radio (SDR) is the first fully reprogrammable, space-qualified SDR operating in the Ka-Band frequency range. Providing exceptionally higher data communication rates than previously possible, this SDR offers in-orbit reconfiguration, multi-waveform operation, and fast deployment due to its highly modular hardware and software architecture. Currently in operation on the International Space Station (ISS), this new paradigm of reconfigurable technology is enabling experimenters to investigate navigation and networking in the space environment. The modular SDR and the NASA-developed Space Telecommunications Radio System (STRS) architecture standard are the basis for Harris' reusable digital signal processing space platform, trademarked as AppSTAR. As a result, two new space radio products are a synthetic aperture radar payload and an Automatic Dependent Surveillance-Broadcast (ADS-B) receiver. In addition, Harris is currently developing many new products similar to the Ka-Band software defined radio for other applications. For NASA's next-generation flight Ka-Band radio development, leveraging these advancements could lead to a more robust and more capable software defined radio. The space environment has special considerations, different from terrestrial applications, that must be taken into account for any system operated in space, and each space mission has unique requirements that can make these systems unique. Such unique requirements can result in products that are expensive and limited in reuse. Space systems put a premium on size, weight and power. A key trade is the amount of reconfigurability in a space system: the more reconfigurable the hardware platform, the easier it is to adapt the platform to the next mission, which reduces non-recurring engineering costs; however, more reconfigurable platforms often use more spacecraft resources. Software has similar considerations to hardware. Having an architecture standard promotes reuse of software and firmware. Space platforms have limited processor capability, which makes the trade on the amount of flexibility paramount.

  15. PREFACE: 2nd Workshop on Germanium Detectors and Technologies

    Science.gov (United States)

    Abt, I.; Majorovits, B.; Keller, C.; Mei, D.; Wang, G.; Wei, W.

    2015-05-01

    The 2nd workshop on Germanium (Ge) detectors and technology was held at the University of South Dakota on September 14-17th 2014, with more than 113 participants from 8 countries, 22 institutions, 15 national laboratories, and 8 companies. The participants represented the following big projects: (1) GERDA and Majorana for the search for neutrinoless double-beta decay (0νββ); (2) SuperCDMS, EDELWEISS, CDEX, and CoGeNT for the search for dark matter; (3) TEXONO for sub-keV neutrino physics; (4) AGATA and GRETINA for gamma tracking; (5) AARM and others for low background radiation counting; and (6) PNNL and LBNL for applications of Ge detectors in homeland security. All participants expressed a strong desire for a better understanding of Ge detector performance and for advancing Ge technology for large-scale applications. The purpose of this workshop was to leverage the unique aspects of the underground laboratories in the world and the germanium (Ge) crystal growing infrastructure at the University of South Dakota (USD) by bringing researchers from several institutions taking part in the Experimental Program to Stimulate Competitive Research (EPSCoR) together with key leaders from international laboratories and prestigious universities working at the forefront of the intensity frontier to advance underground physics, focusing on the searches for dark matter, neutrinoless double-beta decay (0νββ), and neutrino properties. The goal of the workshop was to develop opportunities for EPSCoR institutions to play key roles in the planned world-class research experiments, and to integrate individual talents and existing research capabilities, from multiple disciplines and multiple institutions, into research collaborations, including EPSCoR institutions from South Dakota, North Dakota, Alabama, Iowa, and South Carolina, to support multi-ton scale experiments in the future. The topic areas covered in the workshop were: 1) science related to Ge-based detectors and technology; 2) Ge zone refining and crystal growth; 3) Ge detector development; 4) Ge-oriented business and applications; 5) Ge recycling and recovery; 6) an introduction to underground sciences for young scientists; and 7) an introduction to experimental techniques for low background experiments for young scientists. Sections 1-5 were dedicated to Ge detectors and technology; each topic was complemented by a panel discussion on challenges, critical measures, and R&D activities. Sections 6-7 provided students and postdocs an opportunity to understand fundamental principles of underground sciences and experimental techniques in low background experiments. For these two sections, well-known scientists in the field were invited to give lectures, and young scientists were given the opportunity to present their own research activities. Fifty-six invited talks were delivered during the three-day workshop. Many critical questions were addressed, not only in the specific talks but also in the panel discussions. Details of the panel discussions, as well as conference photos, the list of committees and the workshop website, can be found in the PDF.

  16. A software tool for simulation of surfaces generated by ball nose end milling

    DEFF Research Database (Denmark)

    Bissacco, Giuliano

    2004-01-01

    The number of models available for the prediction of surface topography is very limited. The main reason is that these models cannot be based on engineering principles like those for elastic deformations. Most knowledge about surface roughness and integrity is empirical, and up to now very few mathematical relationships relating surface parameters to cutting conditions are available. Basic models of kinematical roughness, determined by the tool profile and the pattern of relative motions of tool and workpiece, have so far proved unreliable: the actual roughness may be more than five times higher due to error motions, unstable built-up edge and a tool profile that changes with wear [1]. Tool chatter also affects surface roughness, but its effect is normally not included in the prediction of surface roughness, since machining conditions which generate chatter must be avoided in any case. Finally, reproducibility of experimental results concerning surface roughness requires tight control of all influencing factors, which is difficult to maintain in actual machining workshops. This introduces further complications in surface topography modelling. In the light of these considerations, a simple software tool for the prediction of the surface topography of ball nose end milled surfaces was developed. The tool is based on a simplified model of the ideal tool motion and neglects the effects of run-out, static and dynamic deflections and error motions, but it has the merit of generating as output a file in a format readable by a surface processor software (SPIP [2]) for the calculation of a number of surface roughness parameters. In the next paragraph a description of the basic features of ball nose end milled surfaces is given, while in paragraph 3 the model is described.
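    The simplest ingredient of such a kinematical model is the scallop left between adjacent tool paths. The Python sketch below states the textbook geometric relation for a ball nose tool (an illustration consistent with, but not taken from, the paper's model), ignoring run-out, deflection and wear exactly as the simplified model does:

```python
import math

def scallop_height(tool_radius, stepover):
    """Ideal scallop height h = R - sqrt(R^2 - (a/2)^2) for ball radius R, stepover a."""
    return tool_radius - math.sqrt(tool_radius ** 2 - (stepover / 2.0) ** 2)

def surface_profile(tool_radius, stepover, n_points=200):
    """Ideal cross-feed profile: spherical cusps repeating every stepover."""
    xs = [i * 3 * stepover / n_points for i in range(n_points)]
    zs = []
    for x in xs:
        d = min(x % stepover, stepover - x % stepover)   # distance to nearest path
        zs.append(tool_radius - math.sqrt(tool_radius ** 2 - d ** 2))
    return xs, zs
```

Sampling such a profile on a grid and writing it out is essentially what produces the file a surface processor like SPIP can then evaluate for roughness parameters.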

  17. Results of the 2nd regular inspection of Unit 2 in Oi Power Station

    International Nuclear Information System (INIS)

    The 2nd regular inspection of Unit 2 in the Oi Power Station in fiscal 1981 was carried out from June 16, 1981, to January 12, 1982. Inspection covered the reactor proper, the reactor cooling system, the instrumentation and control system, waste management, the reactor containment, etc. Visual, disassembly, leakage, functional, performance and other inspections gave the following results: (1) Leakage was detected in four fuel assemblies, among which damage was detected in two fuel rods; damage to the supporting lattice was also detected in some fuel assemblies. (2) In the core baffle plates, one corner was found to have exceeded the peening standard on momentum flux. Radiation exposure doses during the inspection were all below the permissible level. During the regular inspection, the following improvement works were also carried out: peening of the core baffle plates, plugging of steam generator heating tubes, and improvement of the thermal sleeve on a filling pipe in the chemical and volume control system. (Mori, K.)

  18. Meteosat Second Generation station: processing software and computing architecture; Estacion de Recepcion de Imagenes del Satelite Meteosat Segunda generacion: Arquitectura Informatica y Software de Proceso

    Energy Technology Data Exchange (ETDEWEB)

    Martin, L.; Cony, M.; Navarro, A. A.; Zarzalejo, L. F.; Polo, J.

    2010-05-01

    The Renewable Energy Division of CIEMAT houses a specific station for receiving the Meteosat Second Generation images, which is of interest in the works being carried out concerning the solar radiation derived from satellite images. The complexity, the huge amount of information being received and the particular characteristics of the MSG images encouraged the design and development of a specific computer structure and the associated software as well, for a better and more suitable use of the images. This document describes the mentioned structure and software. (Author) 8 refs.

  19. Development of novel software to generate anthropometric norms at perinatal autopsy.

    Science.gov (United States)

    Cain, Matthew D; Siebert, Joseph R; Iriabho, Egiebade; Gruneberg, Alexander; Almeida, Jonas S; Faye-Petersen, Ona Marie

    2015-01-01

    Fetal and infant autopsy yields information regarding cause of death and the risk of recurrence, and it provides closure for parents. A significant number of perinatal evaluations are performed by general practice pathologists or trainees, who often find them time-consuming and/or intimidating. We sought to create a program that would enable pathologists to conduct these examinations with greater ease and to produce reliable, informative reports. We developed software that automatically generates a set of expected anthropometric and organ weight ranges by gestational age (GA)/postnatal age (PA) and a correlative table with the GA/PA that best matches the observed anthropometry. The program highlights measurement and organ weight discrepancies, enabling users to identify abnormalities. Furthermore, a Web page provides options for exporting and saving the data. Pathology residents utilized the program so that its ease of use and benefits could be assessed. The average time using conventional methods (i.e., reference books and Internet sites) was compared to the average time using our Web page. Average time for novice and experienced residents using conventional methods was 26.7 minutes and 15 minutes, respectively. Using the Web page program, these times were reduced to an average of 3.2 minutes (P < 0.046 and P < 0.02, respectively). Participants found our program simple to use and the corrective features beneficial. This novel application saves time and improves the quality of fetal and infant autopsy reports. The software allows data exportation to reports and data storage for future analysis. Finalization of our software to enable usage by both university and private practice groups is in progress. PMID:25634794
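    The core mechanism described, looking up expected ranges by gestational age and flagging discrepancies, reduces to a table lookup. The Python sketch below is purely illustrative: the table values are invented placeholders, not actual perinatal norms, and the real software interpolates across many measurements and ages.

```python
# Hypothetical norms table: gestational age (weeks) -> (low, high) expected
# heart weight in grams. Values are invented for illustration only.
NORMS = {
    20: (0.9, 1.9),
    30: (5.0, 9.0),
    40: (17.0, 23.0),
}

def flag_abnormal(ga_weeks, observed_g):
    """Compare an observed organ weight against the expected range for its GA."""
    low, high = NORMS[ga_weeks]
    if observed_g < low:
        return "below expected range"
    if observed_g > high:
        return "above expected range"
    return "within expected range"
```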

  20. Quetzalcoatl: Una Herramienta para Generar Contratos de Desarrollo de Software en Entornos de Outsourcing / Quetzalcoatl: A Tool for Generating Software Development Contracts in Outsourcing Environments

    Scientific Electronic Library Online (English)

    Jezreel, Mejía; Sergio D., Ixmatlahua; Alma I., Sánchez.

    2014-03-01

    Full Text Available Nowadays, outsourcing is one of the main forms of work. However, the relationships established between a client and a service provider are not strong enough to meet the expectations of the agreements. The outsourcing contract for software development projects is an alternative to this type of relationship. This article presents the architecture of the Quetzalcoatl tool, as well as the main functions the tool offers, all with the aim of generating and evaluating contracts for software development projects in outsourcing environments to support SMEs. Abstract in english Nowadays, outsourcing is one of the most important work activities for software development companies. However, the relationships between a client and a service provider are not strong enough to meet the expectations of the agreements. The outsourcing contract for software development projects is an [...] alternative to this type of relationship. This paper presents the architecture of the tool named Quetzalcoatl, as well as the main functions that this tool offers in order to generate and evaluate contracts for software development projects in outsourcing environments to support SMEs.

  1. Pragmatics Annotated Coloured Petri Nets for Protocol Software Generation and Verification

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge; Kristensen, Lars Michael

    2014-01-01

    This paper presents the formal definition of Pragmatics Annotated Coloured Petri Nets (PA-CPNs). PA-CPNs represent a class of Coloured Petri Nets (CPNs) that are designed to support automated code generation of protocol software. PA-CPNs restrict the structure of CPN models and allow Petri net elements to be annotated with so-called pragmatics, which are exploited for code generation. The approach and tool for generating code is called PetriCode and has been discussed and evaluated in earlier work already. The contribution of this paper is to give a formal definition for PA-CPNs; in addition, we show how the structural restrictions of PA-CPNs can be exploited for making the verification of the modelled protocols more efficient. This is done by automatically deriving progress measures for the sweep-line method, and by introducing so-called service testers, that can be used to control the part of the state space that is to be explored for verification purposes.

  2. Next generation hyper-scale software and hardware systems for big data analytics

    CERN Document Server

    CERN. Geneva

    2013-01-01

    Building on foundational technologies such as many-core systems, non-volatile memories and photonic interconnects, we describe some current technologies and future research to create real-time, big data analytics, IT infrastructure. We will also briefly describe some of our biologically-inspired software and hardware architecture for creating radically new hyper-scale cognitive computing systems. About the speaker Rich Friedrich is the director of Strategic Innovation and Research Services (SIRS) at HP Labs. In this strategic role, he is responsible for research investments in nano-technology, exascale computing, cyber security, information management, cloud computing, immersive interaction, sustainability, social computing and commercial digital printing. Rich's philosophy is to fuse strategy and inspiration to create compelling capabilities for next generation information devices, systems and services. Using essential insights gained from the metaphysics of innovation, he effectively leads ...

  3. Software methodology to generate code for device description for XMC microcontroller

    Directory of Open Access Journals (Sweden)

    Ravi L. Mavani, Sharvani G. S

    2013-06-01

    Full Text Available Timely release of a product is crucial to satisfying market demand. Microcontrollers are delivered with interfaces that let customers use the chip hassle-free in their respective domains. The product can be delivered more quickly if an effort is made to reduce the time spent developing these interfaces. Adopting automation in the software development life cycle boosts overall productivity. Hardware resource management is one of the key aspects of interface development. This paper explains the generation of a device description (DD) by proposing an automated process which eliminates the manual work involved in the evaluation of different configurations. The developed application interprets the data from the diagram files and reproduces this data in a DD format.

  4. Generating Probability Tables of Dynamic Reliability Graph with General Gates by a Software Tool

    International Nuclear Information System (INIS)

    The fault tree method is the most widely used method to analyze system reliability, but it is not intuitive: as a system becomes complex, the corresponding fault tree becomes much more complex. A reliability graph with general gates (RGGG) is an intuitive method to analyze system reliability; it can make a one-to-one match from the actual structure of a system to the reliability graphs of the system. However, RGGG cannot capture the dynamic behavior of the system associated with time-dependent events. To overcome this shortcoming, the dynamic reliability graph with general gates (DRGGG) was proposed. By using a discrete-time method, we can add dynamic nodes to RGGG. However, as the discretization number (interval number) n becomes larger, it becomes harder to apply this dynamic method to a real system, because the large number leads to complexity in making the probability tables. In this paper, we introduce a software tool which generates probability tables automatically
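    The discrete-time idea behind such probability tables can be illustrated with a minimal sketch (not the authors' tool; the exponential failure distribution and the rate/horizon values are illustrative assumptions): split the mission time into n intervals and tabulate, for each interval, the probability that the failure time falls into it.

```python
import math

def failure_prob_table(rate, horizon, n):
    """Discretize [0, horizon] into n intervals and return, for each
    interval i, the probability that an exponentially distributed
    failure time T ~ Exp(rate) falls into that interval."""
    dt = horizon / n
    table = []
    for i in range(n):
        # P(i*dt < T <= (i+1)*dt) for T ~ Exp(rate)
        p = math.exp(-rate * i * dt) - math.exp(-rate * (i + 1) * dt)
        table.append(p)
    return table

table = failure_prob_table(rate=0.1, horizon=10.0, n=5)
# interval probabilities plus the survival probability beyond the
# horizon must sum to one -- a sanity check that scales with n
survive = math.exp(-0.1 * 10.0)
print(round(sum(table) + survive, 6))  # → 1.0
```

    The interval probabilities telescope to 1 - exp(-rate * horizon), which is why automating table generation matters: the bookkeeping, not the arithmetic, is what becomes unmanageable as n grows.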

  5. Next Generation Astronomical Data Processing using Big Data Technologies from the Apache Software Foundation

    Science.gov (United States)

    Mattmann, Chris

    2014-04-01

    In this era of exascale instruments for astronomy we must naturally develop next generation capabilities for the unprecedented data volume and velocity that will arrive due to the veracity of these ground-based sensors and observatories. Integrating scientific algorithms stewarded by scientific groups unobtrusively and rapidly; intelligently selecting data movement technologies; making use of cloud computing for storage and processing; and automatically extracting text and metadata and science from any type of file are all needed capabilities in this exciting time. Our group at NASA JPL has promoted the use of open source data management technologies available from the Apache Software Foundation (ASF) in pursuit of constructing next generation data management and processing systems for astronomical instruments including the Expanded Very Large Array (EVLA) in Socorro, NM and the Atacama Large Millimetre/Sub-Millimetre Array (ALMA); as well as for the KAT-7 project led by SKA South Africa as a precursor to the full MeerKAT telescope. In addition we are funded currently by the National Science Foundation in the US to work with MIT Haystack Observatory and the University of Cambridge in the UK to construct a Radio Array of Portable Interferometric Devices (RAPID) that will undoubtedly draw from the rich technology advances underway. NASA JPL is investing in a strategic initiative for Big Data that is pulling in these capabilities and technologies for astronomical instruments and also for Earth science remote sensing. In this talk I will describe the above collaborative efforts underway and point to solutions in open source from the Apache Software Foundation that can be deployed and used today and that are already bringing our teams and projects benefits. I will describe how others can take advantage of our experience and point towards future application and contribution of these tools.

  6. Software for generation and analysis of photoelastic fringes in plates with a single hole subjected to in-plane loads

    International Nuclear Information System (INIS)

    A software package for generating and analyzing photoelastic images on infinite rectangular plates subjected to in-plane loads is presented. It allows the user to generate photoelastic images as produced in a polariscope fed by monochromatic light. Both circular and plane polariscopes in conditions of dark or light field can be selected. Tools for obtaining light intensity distributions along horizontal and vertical lines and for extracting the darkest regions of photoelastic fringes are also available. The extraction of such regions can be done by digital image processing (DIP). This process produces thin lines, from which the main stresses and the intensity factor used in fracture mechanics can be obtained. The software was developed to run on the DOS environment in Super VGA mode. The synthetic photoelastic images are generated in 64 gray levels. This software is a useful tool for teaching the fundamentals of photoelasticity and will help researchers in the development of photoelastic experiments. (author). 6 refs., 7 figs
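    The kind of intensity computation such a generator performs can be sketched with the textbook dark-field circular polariscope relation I/I0 = sin²(δ/2), where δ is the relative retardation from the stress-optic law, quantized here to the 64 gray levels the abstract mentions. This is an illustrative sketch under those assumptions, not the authors' DOS software.

```python
import math

def dark_field_intensity(retardation):
    """Dark-field circular polariscope: I/I0 = sin^2(delta/2), where
    delta is the relative retardation in radians."""
    return math.sin(retardation / 2.0) ** 2

def to_gray_levels(intensity, levels=64):
    # quantize a normalized intensity in [0, 1] into discrete gray levels
    return min(levels - 1, int(intensity * levels))

# integer fringe orders (delta = 2*pi*N) are extinction points, i.e. the
# darkest pixels from which fringe skeletons are later extracted
print([to_gray_levels(dark_field_intensity(2 * math.pi * n)) for n in range(3)])
# → [0, 0, 0]
```

    Half-integer fringe orders, by contrast, map to the brightest level, which is why thinning the darkest regions yields the fringe skeleton lines mentioned in the abstract.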

  7. Modeling of wind turbines with doubly fed generator system

    CERN Document Server

    Fortmann, Jens

    2014-01-01

    Jens Fortmann describes the deduction of models for the grid integration of variable speed wind turbines and the reactive power control design of wind plants. The modeling part is intended as background to understand the theory, capabilities and limitations of the generic doubly fed generator and full converter wind turbine models described in the IEC 61400-27-1 and as 2nd generation WECC models that are used as standard library models of wind turbines for grid simulation software. Focus of the reactive power control part is a deduction of the origin and theory behind the reactive current requ

  8. Phase relationship in the TiO2–Nd2O3 pseudo-binary system

    International Nuclear Information System (INIS)

    Highlights: • DSC and XRD measurements for the TiO2–Nd2O3 system. • Nd2Ti2O7, Nd2TiO5, Nd2Ti3O9 and Nd4Ti9O24 exist. • Nd2Ti4O11 and Nd4Ti9O24 were the same compounds. • Thermodynamic calculation on the TiO2–Nd2O3 system. - Abstract: Phase equilibria in the TiO2–Nd2O3 system have been experimentally investigated via X-ray diffraction (XRD) and differential scanning calorimetry (DSC). Four compounds Nd2Ti2O7, Nd2TiO5, Nd2Ti3O9 and Nd4Ti9O24 were confirmed to exist. The literature-reported Nd2Ti4O11 was proved to be the same compound as Nd4Ti9O24, and the reported phase transformation of Nd2Ti4O11 from ? structure to ? at 1373 K was not detected. All the phase diagram data from both the literature and the present work were critically reviewed and taken into account during the thermodynamic optimization of the TiO2–Nd2O3 system. A set of consistent thermodynamic parameters, which can explain most of the experimental data of the TiO2–Nd2O3 system, was achieved. The calculated phase diagram of the TiO2–Nd2O3 system was provided.

  9. A study of Ca 4p1/2,3/2nd (J = 3) autoionizing states

    International Nuclear Information System (INIS)

    The spectral properties of Ca 4pnd (J = 3) autoionizing states have been studied by employing the combination of multichannel quantum defect theory (MQDT) with the K-matrix method. The cross section of 4p3/215d excited from 4s15d and energy levels of 4pnd (J = 3) are calculated by using a theoretical model with more complete channels, and the configuration interaction is analysed in detail between 4p1/2nd, 4p3/2nd3/2,5/2 and 4p1/2,3/2ng of five Rydberg series

  10. 2nd international expert meeting straw power; 2. Internationale Fachtagung Strohenergie

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2012-06-15

    Within the 2nd Guelzow expert discussions, held on 29th to 30th March 2012 in Berlin (Federal Republic of Germany), the following lectures were given: (1) Promotion of the utilisation of straw in Germany (A. Schuette); (2) The significance of straw in the heat and power generation in EU-27 member states in 2020 and in 2030 under consideration of the costs and sustainability criteria (C. Panoutsou); (3) State of the art of the energetic utilization of hay goods in Europe (D. Thraen); (4) Incineration technological characterisation of straw based on analysis data as well as measured data of large-scale installations (I. Obernberger); (5) Energetic utilization of hay goods in Germany (T. Hering); (6) Actual state of the art towards establishing the first German straw thermal power station (R. Knieper); (7) Straw thermal power plants at agricultural sow farms and poultry farms (H. Heilmann); (8) Country report power from straw in Denmark (A. Evald); (9) Country report power from straw in Poland (J. Antonowicz); (10) Country report power from straw in China (J. Zhang); (11) Energetic utilisation of straw in Czechia (D. Andert); (12) Mobile pelletization of straw (S. Auth); (13) Experiences with the straw thermal power plant from Vattenfall (N. Kirkegaard); (14) Available straw potentials in Germany (potential, straw provision costs) (C. Weiser); (15) Standardization of hay good and test fuels - Classification and development of product standards (M. Englisch); (16) Measures of reduction of emissions at hay good incinerators (V. Lenz); (17) Fermentation of straw - State of the art and perspectives (G. Reinhold); (18) Cellulosic ethanol from agricultural residues - Sustainable biofuels (A. Hartmair); (19) Syngas by fermentation of straw (N. Dahmen); (20) Construction using straw (D. Scharmer).

  11. 2nd International Conference on Robot Intelligence Technology and Applications

    CERN Document Server

    Matson, Eric; Myung, Hyun; Xu, Peter; Karray, Fakhri

    2014-01-01

    We are facing a new technological challenge on how to store and retrieve knowledge and manipulate intelligence for autonomous services by intelligent systems which should be capable of carrying out real world tasks autonomously. To address this issue, robot researchers have been developing intelligence technology (InT) for “robots that think”, which is the focus of this book. The book covers all aspects of intelligence from perception at sensor level and reasoning at cognitive level to behavior planning at execution level for each low level segment of the machine. It also presents the technologies for cognitive reasoning, social interaction with humans, behavior generation, ability to cooperate with other robots, ambience awareness, and an artificial genome that can be passed on to other robots. These technologies are to materialize cognitive intelligence, social intelligence, behavioral intelligence, collective intelligence, ambient intelligence and genetic intelligence. The book aims at serving resear...

  12. 2nd Topical Workshop on Laser Technology and Optics Design

    CERN Document Server

    2013-01-01

    Lasers have a variety of applications in particle accelerator operation and will play a key role in the development of future particle accelerators by improving the generation of high brightness electron and exotic ion beams and through increasing the acceleration gradient. Lasers will also make an increasingly important contribution to the characterization of many complex particle beams by means of laser-based beam diagnostics methods. The second LANET topical workshop will address the key aspects of laser technology and optics design relevant to laser application to accelerators. The workshop will cover general optics design, provide an overview of different laser sources and discuss methods to characterize beams in details. Participants will be able to choose from a range of topical areas that go deeper in more specific aspects including tuneable lasers, design of transfer lines, noise sources and their elimination and non-linear optics effects. The format of the workshop will be mainly training-based wit...

  13. The Crest Wing Wave Energy Device : 2nd phase testing

    DEFF Research Database (Denmark)

    Kofoed, Jens Peter; Antonishen, Michael Patrick

    2009-01-01

    This report presents the results of a continuation of an experimental study of the wave energy converting abilities of the Crest Wing wave energy converter (WEC), in the following referred to as 'Phase 2'. The Crest Wing is a WEC that uses its movement in matching the shape of an oncoming wave to generate power. Model tests have been performed using scale models (length scale 1:30), provided by WaveEnergyFyn, in regular and irregular wave states that can be found in Assessment of Wave Energy Devices. Best Practice as used in Denmark (Frigaard et al., 2008). The tests were carried out at Dept. of Civil Engineering, Aalborg University (AAU) in the 3D deep water wave tank. The displacement and force applied to a power take off system, provided by WaveEnergyFyn, were measured and used to calculate mechanical power available to the power take off.

  14. Generation and Optimization of Test cases for Object-Oriented Software Using State Chart Diagram

    OpenAIRE

    Swain, Ranjita Kumari; Behera, Prafulla Kumar; Mohapatra, Durga Prasad

    2012-01-01

    The process of testing any software system is an enormous task which is time consuming and costly. The time and required effort to do sufficient testing grow, as the size and complexity of the software grows, which may cause overrun of the project budget, delay in the development of software system or some test cases may not be covered. During SDLC (software development life cycle), generally the software testing phase takes around 40-70% of the time and cost. State-based te...

  15. Proceedings of the 2nd KUR symposium on hyperfine interactions

    International Nuclear Information System (INIS)

    Hyperfine interactions between a nuclear spin and an electronic spin discovered from hyperfine splitting in atomic optical spectra have been utilized not only for the determination of nuclear parameters in nuclear physics but also for novel experimental techniques in many fields such as solid state physics, chemistry, biology, mineralogy and for diagnostic methods in medical science. Experimental techniques based on hyperfine interactions yield information about microscopic states of matter so that they are important in material science. Probes for material research using hyperfine interactions have been nuclei in the ground state and radioactive isotopes prepared with nuclear reactors or particle accelerators. But utilization of muons generated from accelerators is recently growing. Such wide spread application of hyperfine interaction techniques gives rise to some difficulty in collaboration among various research fields. In these circumstances, the present workshop was planned after four years since the last KUR symposium on the same subject. This report summarizes the contributions to the workshop in order to be available for the studies of hyperfine interactions. (J.P.N.)

  16. Technical Adequacy of the Disruptive Behavior Rating Scale-2nd Edition--Self-Report

    Science.gov (United States)

    Erford, Bradley T.; Miller, Emily M.; Isbister, Katherine

    2015-01-01

    This study provides preliminary analysis of the Disruptive Behavior Rating Scale-2nd Edition--Self-Report, which was designed to screen individuals aged 10 years and older for anxiety and behavior symptoms. Score reliability and internal and external facets of validity were good for a screening-level test.

  17. All in a Day's Work: Careers Using Science, 2nd Edition (e-book)

    Science.gov (United States)

    Megan Sullivan

    2009-06-23

    "Almost all careers in the 21st century require a working knowledge of science and mathematics," says Steve Metz, The Science Teacher field editor, in his introduction to All in a Day's Work, 2nd edition . "The pending retirement of 78 mi

  18. [Continuous education of pharmacists in the Czech Republic. 2nd cycle 2002-2005].

    Science.gov (United States)

    Kolár, J; Nováková, J

    2006-11-01

    The paper deals with continuous education of pharmacists in the Czech Republic in 2002-2005 (2nd cycle). It surveys the seminars organized within the framework of continuous education, their number, topics, and lecturers. A total number of 232 professional seminars took place, which included 339 lectures, mainly on pharmacology (76.1%). PMID:17288064

  19. Proceedings of the 2nd Mediterranean Conference on Information Technology Applications (ITA '97)

    International Nuclear Information System (INIS)

    This is the proceedings of the 2nd Mediterranean Conference on Information Technology Applications, held in Nicosia, Cyprus, between 6-7 November, 1997. It contains 16 papers. Two of these fall within the scope of INIS and are dealing with Telemetry, Radiation Monitoring, Environment Monitoring, Radiation Accidents, Air Pollution Monitoring, Diagnosis, Computers, Radiology and Data Processing

  20. Proceedings of the 2nd symposium on valves for coal conversion and utilization

    Energy Technology Data Exchange (ETDEWEB)

    Maxfield, D.A. (ed.)

    1981-01-01

    The 2nd symposium on valves for coal conversion and utilization was held October 15 to 17, 1980. It was sponsored by the US Department of Energy, Morgantown Energy Technology Center, in cooperation with the Valve Manufacturers Association. Seventeen papers have been entered individually into EDB and ERA. (LTN)

  1. Point classification of 2nd order ODEs: Tresse classification revisited and beyond

    CERN Document Server

    Kruglikov, Boris

    2008-01-01

    In 1896 Tresse gave a complete description of relative differential invariants for the pseudogroup action of point transformations on the 2nd order ODEs. The purpose of this paper is to review, in light of modern geometric approach to PDEs, this classification and also discuss the role of absolute invariants and the equivalence problem.

  2. Introductory statement to the 2nd scientific forum on sustainable development: A role for nuclear power?

    International Nuclear Information System (INIS)

    In his Introductory Statement to the 2nd Scientific Forum on 'Sustainable Development - A Role for Nuclear Power?' (Vienna, 28 September 1999), the Director General of the IAEA focussed on the main aspects concerning the development of nuclear power: safety, competitiveness, and public support

  3. 8. Book Review: ‘Broken Bones: Anthropological Analysis of Blunt Force Trauma’ 2nd edition, 2014

    OpenAIRE

    Gaur, R.

    2014-01-01

    'Broken Bones: Anthropological Analysis of Blunt Force Trauma' 2nd edition, 2014. Editors: Vicki L. Wedel and Alison Galloway; Publisher: Charles C. Thomas, Illinois. pp 479 + xxiii ISBN: 978-0-398-08768-5 (Hard) ISBN: 978-0-398-08769-2 (eBook)

  4. Stem cells and cancer immunotherapy: Arrowhead’s 2nd annual cancer immunotherapy conference

    OpenAIRE

    Bot, Adrian; Chiriva-Internati, Maurizio; Cornforth, Andrew; Czerniecki, Brian J; FERRONE, SOLDANO; Geles, Kenneth; Greenberg, Philip D; Hurt, Elaine; Koya, Richard C; Manjili, Masoud H.; Matsui, William; Morgan, Richard A; Palena, Claudia M.; Powell Jr, Daniel J; Restifo, Nicholas P.

    2014-01-01

    Investigators from academia and industry gathered on April 4 and 5, 2013, in Washington DC at the Arrowhead’s 2nd Annual Cancer Immunotherapy Conference. Two complementary concepts were discussed: cancer “stem cells” as targets and therapeutic platforms based on stem cells.

  5. Software for generating gravity gradients using a geopotential model based on an irregular semivectorization algorithm

    Science.gov (United States)

    Eshagh, Mehdi; Abdollahzadeh, Makan

    2012-02-01

    The spherical harmonic synthesis of second-order derivatives of geopotential is a task of major concern when the spatial resolution of synthesis is high and/or a high-resolution Earth's gravity model is used. Here, a computational technique is presented for such a process. The irregular semivectorization is introduced as a vectorization technique in which one loop is excluded from matrix-vector products of mathematical models in order to speed up the computation and manage the computer memory. The proposed technique has the capability of considering heights of computation points on a regular grid. MATLAB-based software is developed, which can be used for generating gravity gradients on an ordinary personal computer. The numerical results show that irregular semivectorization significantly reduces the computation time to 1 h for synthesis of these data with global coverage and resolution of 5'×5' on an elevation model. In addition, a numerical example is presented for testing satellite gravity gradiometry data of the recent European Space Agency satellite mission, the gravity field and steady-state ocean circulation explorer (GOCE), using an Earth's gravity model.
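    The semivectorization idea the abstract describes, excluding one loop from the matrix-vector products so that only a slice of the coefficient matrix is live at a time, can be sketched as a toy NumPy analogue (not the authors' MATLAB software; the block size and array shapes are arbitrary assumptions):

```python
import numpy as np

def synthesis_full(A, x):
    # fully vectorized: one big matrix-vector product (fast, memory-hungry)
    return A @ x

def synthesis_semivectorized(A, x, block=2):
    # "semivectorization": keep an outer loop over row blocks so only a
    # slice of A participates in each product, trading a little speed
    # for a much smaller peak memory footprint
    out = np.empty(A.shape[0])
    for i in range(0, A.shape[0], block):
        out[i:i + block] = A[i:i + block] @ x
    return out

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 4))
x = rng.standard_normal(4)
print(np.allclose(synthesis_full(A, x), synthesis_semivectorized(A, x)))  # → True
```

    The same trade-off is what lets a 5'×5' global synthesis fit on an ordinary personal computer: the excluded loop bounds the working set without giving up vectorized inner products.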

  6. Towards a "2nd Generation" of Quality Labels: a Proposal for the Evaluation of Territorial Quality Marks / Vers une «2ème génération» de labels de qualité: une proposition pour l'évaluation des marques de qualité territoriale / Hacia una "2" generación" de sellos de calidad: una propuesta para la evaluación de las marcas de calidad territorial

    Scientific Electronic Library Online (English)

    Eduardo, Ramos; Dolores, Garrido.

    2014-12-01

    Full Text Available Recent literature analyses the role of territorial specificities as the core of territorial rural development strategies based on differentiation. Unfortunately, the proliferation of quality assurance schemes is provoking a "labyrinth of labels" which diffuses the local efforts for capitalizing rural specificities. A second generation of labels is currently being developed to simplify the territorial differentiation message. A number of territories in Southern Europe are basing their rural development strategies on joining the so-called European Territorial Quality Mark (ETQM) Project. This paper proposes an original methodology, designed and developed by the authors, for the evaluation of some of these second-generation labels. The methodology has been validated in 15 rural territories, the pioneers of the ETQM in Spain.

  7. Optimizing Software Testing and Test Case Generation by using the concept of Hamiltonian Paths

    OpenAIRE

    Ankita Bihani; Sargam Badyal

    2014-01-01

    Software testing is a trade-off between budget, time and quality. Broadly, software testing can be classified as Unit testing, Integration testing, Validation testing and System testing. By including the concept of Hamiltonian paths we can improve greatly on the facet of software testing of any project. This paper shows how Hamiltonian paths can be used for requirement specification. It can also be used in acceptance testing phase for checking if all the user requirements are met or not. Furt...

  8. Reliable execution of statechart-generated correct embedded software under soft errors

    OpenAIRE

    Ferreira, Ronaldo R.; Klotz, Thomas; Vörtler, Thilo; Rolt, Jean da; Nazar, Gabriel L.; Moreira, Àlvaro F.; Carro, Luigi; Einwich, Karsten

    2014-01-01

    This paper proposes a design methodology for fault-tolerant embedded systems development that starts from software specification and goes down to hardware execution. The proposed design methodology uses formally verified and correct-by-construction software created from high-level UML statechart models for software specification and implementation. On the hardware reliability side, this paper uses the MoMa architecture for reliable embedded computing, which we deploy as a softcore onto an off-th...

  9. OASIS4 – a coupling software for next generation earth system modelling

    OpenAIRE

    Redler, R.; S. Valcke; Ritzdorf, H.

    2010-01-01

    In this article we present a new version of the Ocean Atmosphere Sea Ice Soil coupling software (OASIS4). With this new fully parallel OASIS4 coupler we target the needs of Earth system modelling in its full complexity. The primary focus of this article is to describe the design of the OASIS4 software and how the coupling software drives the whole coupled model system ensuring the synchronization of the different component models. The application programmer interface (API) manages the couplin...

  10. Efficient FPGA implementation of 2nd order digital controllers using Matlab/Simulink

    OpenAIRE

    Vikas Gupta; Khare, K.; Singh, R. P.

    2011-01-01

    This paper explains a method for the design and implementation of a digital controller based on a Field Programmable Gate Array (FPGA) device. It is more compact and power efficient, and provides high-speed capabilities, compared to software-based PID controllers. The proposed method is based on implementing the digital controller as a digital filter using DSP architectures. The PID controller is designed using MATLAB and Simulink to generate a set of coefficients associated with the desired contro...
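    The "PID controller as a 2nd order digital filter" view can be illustrated with the standard velocity-form (incremental) discrete PID under a backward-difference discretization, which is exactly a 2nd-order difference equation in the error. This is a plain-Python sketch, not the paper's MATLAB/Simulink-to-FPGA flow, and the gains and sample time are arbitrary assumptions:

```python
def pid_coefficients(kp, ki, kd, ts):
    """Velocity-form discrete PID (backward difference), a 2nd-order filter:
    u[k] = u[k-1] + b0*e[k] + b1*e[k-1] + b2*e[k-2]."""
    b0 = kp + ki * ts + kd / ts
    b1 = -kp - 2.0 * kd / ts
    b2 = kd / ts
    return b0, b1, b2

def pid_step(coeffs, e, e1, e2, u1):
    # one filter update: new output from previous output and last three errors
    b0, b1, b2 = coeffs
    return u1 + b0 * e + b1 * e1 + b2 * e2

coeffs = pid_coefficients(kp=1.0, ki=0.5, kd=0.1, ts=0.01)
# drive a constant unit error through a few steps
u, e1, e2 = 0.0, 0.0, 0.0
for _ in range(3):
    u = pid_step(coeffs, 1.0, e1, e2, u)
    e1, e2 = 1.0, e1
print(round(u, 4))
```

    Because only three coefficients and three delayed values are needed per update, this form maps naturally onto FPGA multiply-accumulate resources, which is the motivation for treating the controller as a digital filter.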

  11. Library perceptions of using social software as blogs in the idea generation phase of service innovations : Lessons from an experiment

    DEFF Research Database (Denmark)

    Scupola, Ada; Nicolajsen, Hanne Westh

    2013-01-01

    This article investigates the use of social software such as blogs to communicate with and to involve users in the idea generation process of service innovations. After a theoretical discussion of user involvement and, more specifically, user involvement using web-tools with a specific focus on blogs, the article reports findings and lessons from a field experiment at a university library. In the experiment, a blog was established to collect service innovation ideas from the library users. The experiment shows that a blog may engage a limited number of users in the idea generation process and generate a useful but modest number of ideas.

  12. Investigations on layered perovskites: Na2Nd2Ti3O10, H2Nd2Ti3O10 and Nd2Ti3O9?

    International Nuclear Information System (INIS)

    The structure of the layered perovskite Na2Nd2Ti3O10 was investigated by refining X-ray powder diffraction data using the Rietveld method. The Nd3+ cation is located in the intra-slab perovskite site and the Na+ cation in the interlayer space. Acid exchange of the homologous potassium compound leads to the formation of H2Nd2Ti3O10·xH2O. Prior to condensation at 900 °C, the thermolysis of this solid acid yields an intermediate phase which, despite the total removal of water, retains the layered structure from 600 °C to 850 °C. The thermal evolution of the protonated form was studied by thermal analysis and crystallographic techniques and was found to exhibit two main steps. The first corresponds to the removal of water, which is complete at 600 °C, and results in an intermediate phase containing five-coordinated titanium cations and a statistical distribution of Nd3+ in the available intra- and interslab sites. The second step can be considered as a complex condensation reaction leading to a 3D cation-defective perovskite. (orig.)

  13. Methodology and Toolset for Model Verification, Hardware/Software co-simulation, Performance Optimisation and Customisable Source-code generation

    DEFF Research Database (Denmark)

    Berger, Michael Stübert; Soler, José

    2013-01-01

    The MODUS project aims to provide a pragmatic and viable solution that will allow SMEs to substantially improve their positioning in the embedded-systems development market. The MODUS tool will provide a model verification and Hardware/Software co-simulation tool (TRIAL) and a performance optimisation and customisable source-code generation tool (TUNE). The concept is based on automated modelling and optimisation of embedded-systems development. The tool will enable model verification by guiding the selection of existing open-source model verification engines, based on the automated analysis of system properties, and by producing inputs to be fed into these engines; interfacing with standard (SystemC) simulation platforms for HW/SW co-simulation; customisable source-code generation that respects coding standards and conventions; and software performance-tuning optimisation through automated design transformations.

  14. Real-time infrared and semi-active laser scene generation software for AMSTAR hardware in the loop

    Science.gov (United States)

    Cosby, David S.; Lyles, Patrick; Bunfield, Dennis; Trimble, Darian; Rossi, Todd

    2005-05-01

    This paper describes the current research and development of advanced scene generation technology for integration into the Advanced Multispectral Simulation Test and Acceptance Resource (AMSTAR) Hardware-in-the-Loop (HWIL) facilities at the US Army AMRDEC and US Army Redstone Technical Test Center at Redstone Arsenal, AL. A real-time multi-mode (infra-red (IR) and semi-active laser (SAL)) scene generator for a tactical sensor system has been developed leveraging COTS hardware and open source software (OSS). A modular, plug-in architecture has been developed that supports rapid reconfiguration to permit the use of a variety of state data input sources, geometric model formats, and signature and material databases. The platform-independent software yields a cost-effective upgrade path to integrate best-of-breed personal computer (PC) graphics processing unit (GPU) technology.
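A plug-in architecture of the kind described, with interchangeable state data input sources, model formats, and signature databases, is often built around a registry keyed by plug-in type and name. A minimal sketch (all class names and keys are hypothetical, not AMSTAR's actual API):

```python
# Minimal plug-in registry sketch: components register under (kind, name),
# so the scene generator can be reconfigured without touching core code.

PLUGINS = {"input_source": {}, "model_format": {}}

def register(kind, name):
    """Class decorator that records a plug-in implementation in the registry."""
    def wrap(cls):
        PLUGINS[kind][name] = cls
        return cls
    return wrap

@register("input_source", "file_replay")
class FileReplaySource:
    """Hypothetical input source replaying canned sensor state."""
    def next_state(self):
        return {"azimuth": 1.5, "elevation": 0.2}

def create(kind, name):
    """Instantiate a registered plug-in by its type and name."""
    return PLUGINS[kind][name]()

source = create("input_source", "file_replay")
print(source.next_state()["azimuth"])
```

Swapping input sources then amounts to registering another class and changing the name passed to `create`.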

  15. A 2nd order regenerator model including flow dispersion and bypass losses

    Energy Technology Data Exchange (ETDEWEB)

    Kuehl, H.D.; Schulz, S. [Univ. of Dortmund (Germany). Inst. of Thermodynamics

    1996-12-31

    The reliability of 2nd order computer codes for the simulation of regenerative cycles, such as the Stirling or the Vuilleumier cycle, depends strongly on the quality of the models used to describe the various losses. The thermal regenerator losses in particular can severely affect the performance, so accurate modeling of these is essential in order to obtain realistic results. Nevertheless, most present approaches are based on the assumption of a strictly one-dimensional, ideal plug flow. In this paper, this theory is reviewed, and it is suggested to additionally include losses due to both microscopic and macroscopic flow inhomogeneities. For this purpose, simplified equations are derived which can easily be handled in a 2nd order code.
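In the ideal one-dimensional plug-flow limit that such models take as their starting point, the thermal effectiveness of a balanced counterflow regenerator reduces to a simple function of NTU, and the regenerator enthalpy loss scales with what remains. A hedged sketch of that baseline relation only (the paper's dispersion and bypass correction terms are not reproduced here):

```python
def regenerator_effectiveness(ntu):
    """Ideal balanced counterflow regenerator effectiveness in the plug-flow
    limit: eps = NTU / (NTU + 1). The thermal regenerator loss per cycle
    scales with (1 - eps); flow dispersion and bypass lower eps further.
    """
    return ntu / (ntu + 1.0)

for ntu in (10.0, 50.0, 200.0):
    eps = regenerator_effectiveness(ntu)
    print(f"NTU={ntu:6.0f}  effectiveness={eps:.4f}  loss fraction={1 - eps:.4f}")
```

The table shows why regenerator quality dominates 2nd order predictions: the loss fraction falls only hyperbolically with NTU, so small modelling errors translate directly into performance errors.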

  16. Report of the 2nd through 7th conferences of Special Committee on Nuclear Criticality Safety

    International Nuclear Information System (INIS)

    The Special Committee on Nuclear Criticality Safety was established as a public committee of the Atomic Energy Society of Japan in November 1988. The main objective of this committee is to contribute to reasonable criticality safety design and control through extensive discussions among specialists in reactor physics, fuel treatment processes, radiation surveillance techniques, and so on. The conferences were held seven times in total. This report covers the activities of this committee from the 2nd (1989) through the 7th (1992) conferences. (author)

  17. Pb2+, Nd3+ and Eu3+ as local structural probes in sodium borate glasses

    International Nuclear Information System (INIS)

    The structure of sodium borate glasses has been investigated using Pb2+, Nd3+ and Eu3+ as local probes. Materials with low modifier concentrations were found to have an approximately two-dimensional (2D) B-O network. Increasing the proportion of Na2O leads to a gradual 2D → 3D transition of the network former, which is completely achieved in glasses containing 25 mol% Na2O. (Auth.)

  18. Production of artificial ionospheric layers by frequency sweeping near the 2nd gyroharmonic

    OpenAIRE

    Pedersen, T.; Mccarrick, M.; Reinisch, B.; Watkins, B.; Hamel, R.; Paznukhov, V.

    2011-01-01

    Artificial ionospheric plasmas descending from the background F-region have been observed on multiple occasions at the High Frequency Active Auroral Research Program (HAARP) facility since it reached full 3.6 MW power. Proximity of the transmitter frequency to the 2nd harmonic of the electron gyrofrequency (2fce) has been noted as a requirement for their occurrence, and their disappearance after only a few minutes has been attributed to the increasing...

  19. Preface: 2nd Workshop on the State of the Art in Nuclear Cluster Physics

    International Nuclear Information System (INIS)

    The 2nd workshop on the "State of the Art in Nuclear Cluster Physics" (SOTANCP2) took place on May 25-28, 2010, at the Universite Libre de Bruxelles (Brussels, Belgium). The first workshop of this series was held in Strasbourg (France) in 2008. The purpose of SOTANCP2 was to promote the exchange of ideas and to discuss new developments in Clustering Phenomena in Nuclear Physics and Nuclear Astrophysics, both from a theoretical and from an experimental point of view.

  20. Idaho National Laboratory Quarterly Performance Analysis for the 2nd Quarter FY 2015

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, Lisbeth A. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-04-01

    This report is published quarterly by the Idaho National Laboratory (INL) Quality and Performance Management Organization. The Department of Energy (DOE) Occurrence Reporting and Processing System (ORPS), as prescribed in DOE Order 232.2, “Occurrence Reporting and Processing of Operations Information,” requires a quarterly analysis of events, both reportable and not reportable, for the previous 12 months. This report is the analysis of events for the 2nd Qtr FY-15.

  1. GSIMF: a web service based software and database management system for the next generation grids

    International Nuclear Information System (INIS)

    To process the vast amount of data from high energy physics experiments, physicists rely on Computational and Data Grids; yet, the distribution, installation, and updating of a myriad of different versions of different programs over the Grid environment is complicated, time-consuming, and error-prone. Our Grid Software Installation Management Framework (GSIMF) is a set of Grid Services that has been developed for managing versioned and interdependent software applications and file-based databases over the Grid infrastructure. This set of Grid services provides a mechanism to install software packages on distributed Grid computing elements, thus automating the software and database installation management process on behalf of the users. This enables users to remotely install programs and tap into the computing power provided by Grids.
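Installing versioned, interdependent packages on a computing element requires an order in which every dependency precedes its dependents, i.e. a topological sort of the dependency graph. A sketch using a hypothetical package graph (the abstract does not describe GSIMF's actual package metadata format):

```python
from graphlib import TopologicalSorter

# Hypothetical dependency graph: each key maps to the packages it depends on.
deps = {
    "analysis-app": {"physics-lib", "conditions-db"},
    "physics-lib": {"root-io"},
    "conditions-db": {"root-io"},
    "root-io": set(),
}

# static_order() yields nodes so that dependencies always come first;
# a CycleError would be raised if the graph contained a dependency cycle.
install_order = list(TopologicalSorter(deps).static_order())
print(install_order)
```

An installation manager can walk this order and install (or update) each package only after all of its prerequisites are in place.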

  2. System and Component Software Specification, Run-time Verification and Automatic Test Generation Project

    National Aeronautics and Space Administration — This proposal is for the creation of a system-level software specification and verification tool. This proposal suggests a major leap-forward in usability of...

  3. An Evaluation of Software Distributed Shared Memory for Next-Generation Processors and Networks

    OpenAIRE

    Cox, A. L.; Dwarkadas, S.; Keleher, P.; Zwaenepoel, W

    1993-01-01

    We evaluate the effect of processor speed, network characteristics, and software overhead on the performance of release-consistent software distributed shared memory. We examine five different protocols for implementing release consistency: eager update, eager invalidate, lazy update, lazy invalidate, and a new protocol called lazy hybrid. This lazy hybrid protocol combines the benefits of both lazy update and lazy invalidate. Our simulations indicate that with the processors and networks tha...
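The intuition behind the lazy protocols can be seen in a toy message count: an eager invalidate protocol notifies every sharer of each modified page at release time, while a lazy protocol defers consistency actions and piggybacks write notices on the next lock grant. A deliberately simplified model (not the paper's simulator, and ignoring subsequent access misses):

```python
# Toy message-count comparison of eager vs lazy invalidate at a release.
# Assumption: under lazy release consistency, write notices for all pages
# modified in the interval ride on the single lock-grant message to the
# next acquirer, instead of being broadcast to every sharer.

def eager_invalidate_msgs(pages_written, sharers_per_page):
    """One invalidation message per sharer of each written page."""
    return pages_written * sharers_per_page

def lazy_invalidate_msgs(pages_written):
    """Write notices are piggybacked on one lock-grant message."""
    return 1

pages, sharers = 4, 7
print("eager:", eager_invalidate_msgs(pages, sharers))
print("lazy :", lazy_invalidate_msgs(pages))
```

The gap widens with the degree of sharing, which is why, as the abstract notes, protocol choice interacts strongly with network characteristics and software overhead.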

  4. Utilisation of 2nd generation web technologies in master level vocational teacher training

    OpenAIRE

    Péter Tóth

    2009-01-01

    The Masters level Opportunities and Technological Innovation in Vocational Teacher Education project (project site: http://motivate.tmpk.bmf.hu/) aims to develop the use and management of virtual learning environments in the area of vocational teacher training, drawing on a well established international partnership of institutions providing both technical and educational expertise. This paper gives an overall picture of the first results and products of the collaboration. We touch upon the g...

  5. Utilisation of 2nd generation web technologies in master level vocational teacher training

    Directory of Open Access Journals (Sweden)

    Péter Tóth

    2009-03-01

    Full Text Available The Masters level Opportunities and Technological Innovation in Vocational Teacher Education project (project site: http://motivate.tmpk.bmf.hu/ aims to develop the use and management of virtual learning environments in the area of vocational teacher training, drawing on a well established international partnership of institutions providing both technical and educational expertise. This paper gives an overall picture of the first results and products of the collaboration. We touch upon the goals, the assessments and the learning process of using the “Multimedia and e-Learning: e-learning methods and tools” module in detail. The main cooperative and collaborative tools of the virtual learning environment are presented. The communication during collaborative learning, the structured debate on the forum and the benefits of collaborative learning in the VLE are interpreted at the end of this paper.

  6. Healing of rat mouth mucosa after irradiation with CO2, Nd:YAG, and CO2-Nd:YAG combination lasers.

    Science.gov (United States)

    Luomanen, M; Rauhamaa-Mäkinen, R; Meurman, J H; Kosloff, T; Tiitta, O

    1994-08-01

    The healing process of wounds made by a combination laser was studied in 90 rats. The laser system enabled both separate and combined use of CO2 and Nd:YAG laser irradiation. The laser wounds and the control excision wounds made with alligator forceps were placed on both sides of the tongue. Specimens from the wound sites were taken immediately, 6 h, and 1, 2, 4, 7, 11, 21, 28, and 42 days after surgery. The wound-healing process was studied by macroscopic evaluation before preparing the specimens for light microscopy. Some differences were noted in the wound-healing process among the three groups into which the experimental animals were divided. Tissue coagulation damage was most extensive in the Nd:YAG laser sites, where it was observed in its full extent 4 days after surgery. Epithelial cells began to proliferate in all the wounds 6 h after surgery. Re-epithelialization was completed between 7 days (CO2) and 21 days (Nd:YAG) at all the wound sites. The inflammatory cell infiltration was more prominent in the Nd:YAG and the CO2-Nd:YAG combination laser wounds than in the CO2 and excision wounds during healing. Tissue regeneration occurred faster, with less contraction, in the combination CO2-Nd:YAG wounds than in Nd:YAG wounds. The best macroscopic healing result was seen in the CO2 wound sites. The combination laser was effective both at cutting and at coagulating tissue. Combining the CO2 and Nd:YAG laser irradiation into one beam resulted in a greater incision depth than could have been expected from using the two lasers separately. PMID:8091122

  7. Sustainable development - a role for nuclear power? 2nd scientific forum

    International Nuclear Information System (INIS)

    The 2nd Scientific Forum of the International Atomic Energy Agency (IAEA) was held during the 43rd General Conference. This paper summarizes the deliberations of the two-day Forum. The definition of 'sustainable development' of the 1987 Bruntland Commission - 'development that meets the needs of the present without compromising the ability of future generations to meet their own needs' - provided the background for the Forum's debate whether and how nuclear power could contribute to sustainable energy development. The framework for this debate comprises different perspectives on economic, energy, environmental, and political considerations. Nuclear power, along with all energy generating systems, should be judged on these considerations using a common set of criteria (e.g., emission levels, economics, public safety, wastes, and risks). First and foremost, there is a growing political concern over the possible adverse impact of increasing emissions of greenhouse gases from fossil fuel combustion. However, there is debate as to whether this would have any material impact on the predominantly economic criteria currently used to make investment decisions on energy production. According to the views expressed, the level of safety of existing nuclear power plants is no longer a major concern - a view not yet fully shared by the general public. The need to maintain the highest standards of safety in operation remains, especially under the mounting pressure of competitiveness in deregulated and liberalized energy markets. The industry must continuously reinforce a strong safety culture among reactor designers, builders, and operators. Furthermore, a convincing case for safety will have to be made for any new reactor designs. Of greater concern to the public and politicians are the issues of radioactive waste and proliferation of nuclear weapons. 
There is a consensus among technical experts that radioactive wastes from nuclear power can be disposed of safely and economically in deep geologic formations. However, the necessary political decisions to select sites for repositories need public support and understanding about what the industry is doing and what can be done. As to nuclear weapons proliferation, the existing safeguards system must be fully maintained and strengthened and inherently proliferation-resistant fuel cycles should be explored. Overviews of the future global energy demand and of the prospects for nuclear power in various economic regions of the world indicate that, in the case of the OECD countries, the dominant issue is economics in an increasingly free market system for electricity. For the so-called transition economies, countries of the Former Soviet Union and Central and Eastern Europe, the issue is one of managing nuclear power plant operations safely. In the case of developing countries, the dominant concern is effective management of technology, in addition to economics and finance. The prospects for nuclear power depend on the resolution of two cardinal issues. The first is economic competitiveness, and in particular, reduced capital cost. The second is public confidence in the ability of the industry to manage plant operations and its high level waste safely. There is a continuing need for dialogue and communication with all sectors of the public: economists, investors, social scientists, politicians, regulators, unions, and environmentalists. Of help in this dialogue would be nuclear power's relevance to and comparative advantages in addressing environmental issues, such as global climate change, local air quality, and regional acidification. 
Suggestions have been made for a globalized approach to critical nuclear power issues, such as waste management, innovative and proliferation-resistant reactors and fuel cycles, and international standards for new generation nuclear reactor designs. The conclusion seems to be that there is a role for nuclear energy in sustainable development, especially if greenhouse gas emissions are to be limited. Doubts persist in the minds of many energy experts over the pote

  8. OASIS4 – a coupling software for next generation earth system modelling

    Directory of Open Access Journals (Sweden)

    R. Redler

    2009-07-01

    Full Text Available In this article we present a new version of the Ocean Atmosphere Sea Ice Soil coupling software (OASIS4). With this new fully parallel OASIS4 coupler we target the needs of Earth system modelling in its full complexity. The primary focus of this article is to describe the design of the OASIS4 software and how the coupling software drives the whole coupled model system, ensuring the synchronization of the different component models. The application programmer interface (API) manages the coupling exchanges between arbitrary climate component models, as well as the input and output from and to files of each individual component. The OASIS4 Transformer instance performs the parallel interpolation and transfer of the coupling data between source and target model components. As a new core technology of the software, the fully parallel search algorithm of OASIS4 is described in detail. First benchmark results are discussed for simple test configurations to demonstrate the efficiency and scalability of the software when applied to Earth system model components. Typically the compute time needed in order to perform the search is on the order of a few seconds and is only weakly dependent on the grid size.

  9. OASIS4 – a coupling software for next generation earth system modelling

    Directory of Open Access Journals (Sweden)

    R. Redler

    2010-01-01

    Full Text Available In this article we present a new version of the Ocean Atmosphere Sea Ice Soil coupling software (OASIS4). With this new fully parallel OASIS4 coupler we target the needs of Earth system modelling in its full complexity. The primary focus of this article is to describe the design of the OASIS4 software and how the coupling software drives the whole coupled model system, ensuring the synchronization of the different component models. The application programmer interface (API) manages the coupling exchanges between arbitrary climate component models, as well as the input and output from and to files of each individual component. The OASIS4 Transformer instance performs the parallel interpolation and transfer of the coupling data between source and target model components. As a new core technology of the software, the fully parallel search algorithm of OASIS4 is described in detail. First benchmark results are discussed for simple test configurations to demonstrate the efficiency and scalability of the software when applied to Earth system model components. Typically the compute time needed to perform the search is on the order of a few seconds and is only weakly dependent on the grid size.
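The Transformer's task of mapping coupling fields between grids begins with a neighbourhood search: for each target point, locate nearby source points and derive interpolation weights from them. A brute-force sketch of this idea (OASIS4's actual parallel search algorithm is far more elaborate; this only illustrates the step it accelerates):

```python
import math

# For each target point: find the k nearest source points, then build
# inverse-distance interpolation weights that sum to one.

def nearest_neighbours(target, source_points, k=3):
    ranked = sorted(source_points, key=lambda p: math.dist(p, target))
    return ranked[:k]

def idw_weights(target, neighbours, eps=1e-12):
    inv = [1.0 / (math.dist(p, target) + eps) for p in neighbours]
    total = sum(inv)
    return [w / total for w in inv]

src = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]  # toy source grid
tgt = (0.2, 0.1)                                        # one target point
nbrs = nearest_neighbours(tgt, src)
wts = idw_weights(tgt, nbrs)
print(nbrs[0], round(wts[0], 3))
```

The brute-force sort is O(n log n) per target point; a production coupler partitions the search across processes and uses spatial indexing, which is where the seconds-scale timings quoted above come from.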

  10. A first-generation software product line for data acquisition systems in astronomy

    Science.gov (United States)

    López-Ruiz, J. C.; Heradio, Rubén; Cerrada Somolinos, José Antonio; Coz Fernandez, José Ramón; López Ramos, Pablo

    2008-07-01

    This article presents a case study on developing a software product line for data acquisition systems in astronomy, based on the Exemplar Driven Development methodology and the Exemplar Flexibilization Language tool. The main strategies to build the software product line are based on domain commonality and variability, incremental scope, and the use of existing artifacts. It is a lean methodology with little impact on the organization, suitable for small projects, which reduces product line start-up time. Software product lines focus on creating a family of products instead of individual products. This approach has spectacular benefits in reducing the time to market, maintaining the know-how, reducing development costs, and increasing the quality of new products. Maintenance of the products is also enhanced, since all the data acquisition systems share the same product line architecture.

  11. Generating Variable Strength Covering Array for Combinatorial Software Testing with Greedy Strategy

    Directory of Open Access Journals (Sweden)

    Ziyuan Wang

    2013-12-01

    Full Text Available Combinatorial testing is a practical and efficient software testing technique which can detect the faults triggered by interactions among factors in software. Compared to classic fixed-strength combinatorial testing, variable-strength combinatorial testing usually uses fewer test cases to detect more interaction faults, because it considers the actual interaction relationships in the software more fully. For a model of variable-strength combinatorial testing that has been proposed previously, two heuristic algorithms based on the one-test-at-a-time greedy strategy are proposed in this paper to generate variable-strength covering arrays as test suites in software testing. Experimental results show that, compared to some existing algorithms and tools, the two proposed algorithms have advantages in both execution effectiveness and the optimality of the size of the generated test suite.
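The one-test-at-a-time greedy strategy builds a covering array by repeatedly adding the candidate test that covers the most not-yet-covered interaction tuples. A minimal pairwise (fixed strength 2) sketch of that strategy, not the paper's algorithms, which additionally handle variable strength by drawing target tuples from several interaction groups:

```python
from itertools import combinations, product

def pairwise_suite(factors):
    """Greedy pairwise covering array; factors lists values per factor, e.g. [2, 3, 2].

    Exhaustive candidate enumeration keeps the sketch short; real tools
    sample or construct candidates instead of enumerating them all.
    """
    # All (factor_i, value_a, factor_j, value_b) pairs that must be covered.
    uncovered = {(i, a, j, b)
                 for (i, j) in combinations(range(len(factors)), 2)
                 for a in range(factors[i])
                 for b in range(factors[j])}

    def covered_by(test):
        return {(i, test[i], j, test[j])
                for (i, j) in combinations(range(len(test)), 2)}

    suite = []
    while uncovered:
        # One test at a time: pick the candidate covering most uncovered pairs.
        best = max(product(*[range(n) for n in factors]),
                   key=lambda t: len(covered_by(t) & uncovered))
        suite.append(best)
        uncovered -= covered_by(best)
    return suite

suite = pairwise_suite([2, 3, 2])
print(len(suite), "tests")
```

For the [2, 3, 2] example the theoretical lower bound is 6 tests (the product of the two largest factor sizes), against 12 for exhaustive testing; the gap grows rapidly with more factors.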

  12. Software concept of in-service diagnostic systems for nuclear steam generating facilities

    International Nuclear Information System (INIS)

    The concept of software systems of in-service diagnostics is presented for the primary circuits of WWER-440 and WWER-1000 reactors. The basic and supplementary systems and user software are described for the collection, processing and evaluation of diagnostic signals from the primary circuits of the Dukovany and Bohunice nuclear power plants and the design is presented of the hierarchical structure of computers in the diagnostic systems of the Mochovce and Temelin nuclear power plants. The systems are operated using computers of Czechoslovak make of the ADT production series with operating systems RTE-II or DOS IV. (J.B.)

  13. Estructura cristalina del nuevo óxido tipo perovskita compleja Ba2NdZrO5,5

    Directory of Open Access Journals (Sweden)

    D.A. Landínez Téllez

    2007-01-01

    Full Text Available A new complex perovskite material, Ba2NdZrO5.5, has been synthesized for the first time by a conventional solid state reaction process. X-ray diffraction (XRD) measurements and Rietveld analysis revealed an ordered complex cubic structure, characteristic of the A2BB'O6 crystalline structure, with a lattice constant a = 8.40 ± 0.01 Å. Energy Dispersive X-ray (EDX) analysis shows that Ba2NdZrO5.5 is free of impurity traces. Preliminary studies reveal that at a temperature of 820 °C Ba2NdZrO5.5 does not react with YBa2Cu3O7−δ. These favorable characteristics show that Ba2NdZrO5.5 can be used as a potential substrate material for the fabrication of superconducting films.

  14. Radiation protection for repairs of reactor's internals at the 2nd Unit of the Nuclear Power Plant Temelin

    International Nuclear Information System (INIS)

    This paper describes the process and extent of repairs of the 2nd unit of the Nuclear power plant Temelin during the shutdown of the reactor. All works were optimized in terms of radiation protection of workers.

  15. Proceedings of the 2nd Workshop on Robots in Clutter: Preparing robots for the real world (Berlin, 2013)

    OpenAIRE

    Zillich, Michael; Bennewitz, Maren; Fox, Maria; Piater, Justus; Pangercic, Dejan

    2013-01-01

    This volume represents the proceedings of the 2nd Workshop on Robots in Clutter: Preparing robots for the real world, held June 27, 2013, at the Robotics: Science and Systems conference in Berlin, Germany.

  16. A proposed approach for developing next-generation computational electromagnetics software

    Energy Technology Data Exchange (ETDEWEB)

    Miller, E.K.; Kruger, R.P. (Los Alamos National Lab., NM (United States)); Moraites, S. (Simulated Life Systems, Inc., Chambersburg, PA (United States))

    1993-01-01

    Computations have become a tool coequal with mathematics and measurements as a means of performing electromagnetic analysis and design. This is demonstrated by the volume of articles and meeting presentations in which computational electromagnetics (CEM) is routinely employed to address an increasing variety of problems. Yet, in spite of the substantial resources invested in CEM software over the past three decades, little real progress seems to have been made towards providing the EM engineer software tools having a functionality equivalent to that expected of hardware instrumentation. Furthermore, the bulk of CEM software now available is generally of limited applicability to large, complex problems because most modeling codes employ a single field propagator, or analytical form, of Maxwell's Equations. The acknowledged advantages of hybrid models, i.e., those which employ different propagators in differing regions of a problem, are relatively unexploited. The thrust of this discussion is to propose a new approach designed to address both problems outlined above, integrating advances being made in both software and hardware development. After briefly reviewing the evolution of modeling CEM software to date and pointing out the deficiencies thereof, we describe an approach for making CEM tools more truly "user friendly" called EMSES (Electromagnetic Modeling and Simulation Environment for Systems). This will be achieved through two main avenues. One is developing a common problem-description language implemented in a visual programming environment working together with a translator that produces the specific model description needed by various numerical treatments, in order to optimize user efficiency. The other is to employ a new modeling paradigm based on the idea of field propagators to expedite the development of the hybrid models that are needed to optimize computation efficiency.

  17. A proposed approach for developing next-generation computational electromagnetics software

    Energy Technology Data Exchange (ETDEWEB)

    Miller, E.K.; Kruger, R.P. [Los Alamos National Lab., NM (United States); Moraites, S. [Simulated Life Systems, Inc., Chambersburg, PA (United States)

    1993-02-01

    Computations have become a tool coequal with mathematics and measurements as a means of performing electromagnetic analysis and design. This is demonstrated by the volume of articles and meeting presentations in which computational electromagnetics (CEM) is routinely employed to address an increasing variety of problems. Yet, in spite of the substantial resources invested in CEM software over the past three decades, little real progress seems to have been made towards providing the EM engineer software tools having a functionality equivalent to that expected of hardware instrumentation. Furthermore, the bulk of CEM software now available is generally of limited applicability to large, complex problems because most modeling codes employ a single field propagator, or analytical form, of Maxwell's Equations. The acknowledged advantages of hybrid models, i.e., those which employ different propagators in differing regions of a problem, are relatively unexploited. The thrust of this discussion is to propose a new approach designed to address both problems outlined above, integrating advances being made in both software and hardware development. After briefly reviewing the evolution of modeling CEM software to date and pointing out the deficiencies thereof, we describe an approach for making CEM tools more truly "user friendly" called EMSES (Electromagnetic Modeling and Simulation Environment for Systems). This will be achieved through two main avenues. One is developing a common problem-description language implemented in a visual programming environment working together with a translator that produces the specific model description needed by various numerical treatments, in order to optimize user efficiency. The other is to employ a new modeling paradigm based on the idea of field propagators to expedite the development of the hybrid models that are needed to optimize computation efficiency.

  18. Characterization of the 1st and 2nd EF-hands of NADPH oxidase 5 by fluorescence, isothermal titration calorimetry, and circular dichroism

    Directory of Open Access Journals (Sweden)

    Wei Chin-Chuan

    2012-04-01

    Full Text Available Abstract Background Superoxide generated by non-phagocytic NADPH oxidases (NOXs) is of growing importance for physiology and pathobiology. The calcium binding domain (CaBD) of NOX5 contains four EF-hands, each binding one calcium ion. To better understand the metal binding properties of the 1st and 2nd EF-hands, we characterized the N-terminal half of CaBD (NCaBD) and its calcium-binding knockout mutants. Results The isothermal titration calorimetry measurements for NCaBD reveal that the calcium bindings of the two EF-hands are loosely associated with each other and can be treated as independent binding events. However, Ca2+ binding studies on NCaBD(E31Q) and NCaBD(E63Q) showed their binding constants to be 6.5 × 10^5 and 5.0 × 10^2 M^-1, with ΔHs of -14 and -4 kJ/mol, respectively, suggesting that intrinsic calcium binding for the 1st non-canonical EF-hand is largely enhanced by the binding of Ca2+ to the 2nd canonical EF-hand. The fluorescence quenching and CD spectra support a conformational change upon Ca2+ binding, which moves Trp residues toward a more non-polar and exposed environment and also increases the α-helix secondary structure content. All measurements exclude Mg2+ binding in NCaBD. Conclusions We demonstrated that the 1st non-canonical EF-hand of NOX5 has very weak Ca2+ binding affinity compared with the 2nd canonical EF-hand. Both EF-hands interact with each other in a cooperative manner to enhance their Ca2+ binding affinity. Our characterization reveals that the two EF-hands in the N-terminal NOX5 are Ca2+ specific.
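The practical meaning of the two association constants reported above (6.5 × 10^5 vs 5.0 × 10^2 M^-1 for the two mutants) can be seen from the single-site binding isotherm, fraction bound = Ka[L]/(1 + Ka[L]). A small sketch comparing occupancies (the free-Ca2+ concentrations chosen are illustrative, not from the study):

```python
# Single-site binding isotherm: fraction bound = Ka*[L] / (1 + Ka*[L]).
# Ka values are the association constants reported in the abstract for the
# two knockout mutants; each mutant isolates one functional EF-hand.
def fraction_bound(ka, free_ligand):
    x = ka * free_ligand
    return x / (1.0 + x)

ka_e31q = 6.5e5  # M^-1
ka_e63q = 5.0e2  # M^-1
for ca in (1e-7, 1e-6, 1e-5):  # illustrative free Ca2+ levels, mol/L
    print(f"[Ca2+]={ca:.0e} M  "
          f"Ka=6.5e5: {fraction_bound(ka_e31q, ca):.3f}  "
          f"Ka=5.0e2: {fraction_bound(ka_e63q, ca):.6f}")
```

At micromolar calcium the high-affinity site is appreciably occupied while the low-affinity site is essentially empty, which is why cooperative enhancement by the canonical EF-hand matters physiologically.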

  19. A Software Safety Certification Plug-in for Automated Code Generators (Executive Briefing)

    Science.gov (United States)

    Denney, Ewen; Schumann, Johann; Greaves, Doug

    2006-01-01

    A viewgraph presentation describing a certification tool to check the safety of auto-generated codes is shown. The topics include: 1) Auto-generated Code at NASA; 2) Safety of Auto-generated Code; 3) Technical Approach; and 4) Project Plan.

  20. Roles of doping ions in afterglow properties of blue CaAl2O4:Eu2+,Nd3+ phosphors

    International Nuclear Information System (INIS)

    Eu2+ doped and Nd3+ co-doped calcium aluminate (CaAl2O4:Eu2+,Nd3+) phosphor was prepared by a urea-nitrate solution combustion method at furnace temperatures as low as 500 °C. The produced CaAl2O4:Eu2+,Nd3+ powder was investigated in terms of phase composition, morphology and luminescence by X-ray diffraction (XRD), Scanning Electron Microscopy (SEM), Fourier Transform Infrared spectroscopy (FTIR) and Photoluminescence (PL) techniques, respectively. XRD analysis depicts a dominant monoclinic phase, indicating no change in the crystalline structure of the phosphor with varying concentration of Eu2+ and Nd3+. SEM results show agglomerates with non-uniform shapes and sizes and a number of irregular network structures having many voids and pores. The Energy Dispersive X-ray Spectroscopy (EDS) and FTIR spectra confirm the expected chemical components of the phosphor. PL measurements indicated one broad excitation band from 200 to 300 nm, centered around 240 nm, corresponding to the crystal field splitting of the Eu2+ d-orbital, and an emission spectrum in the blue region with a maximum at 440 nm. This is a strong indication that there was dominantly one luminescence center, Eu2+, which represents emission from transitions between the 4f7 ground state and the 4f6–5d1 excited state configuration. High concentrations of Eu2+ and Nd3+ generally reduce both intensity and lifetime of the phosphor powders. The optimized content for the obtained phosphors with excellent optical properties is 1 mol% Eu2+ and 1 mol% Nd3+. The phosphor also emits visible light at around 587 and 616 nm. Such emissions can be ascribed to the 5D0–7F1 and 5D0–7F2 intrinsic transitions of Eu3+, respectively. The decay characteristics exhibit a significant rise in initial intensity with increasing Eu2+ doping concentration, while the decay time increased with Nd3+ co-doping.
The observed afterglow can be ascribed to the generation of suitable traps due to the presence of the Nd3+ ions.
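Afterglow decay curves of the kind measured here are commonly summarised by a double-exponential model with a fast and a slow component, the slow lifetime reflecting deeper traps such as those attributed here to Nd3+ co-doping. A sketch of that model with purely illustrative parameters (the paper's fitted values are not given in the abstract):

```python
import math

# Double-exponential afterglow model:
#   I(t) = A1*exp(-t/tau1) + A2*exp(-t/tau2)
# tau1 is the fast initial decay, tau2 the slow persistent component that
# trap-generating co-dopants lengthen. All parameter values are invented.
def afterglow(t, a1, tau1, a2, tau2):
    return a1 * math.exp(-t / tau1) + a2 * math.exp(-t / tau2)

params = dict(a1=80.0, tau1=2.0, a2=20.0, tau2=60.0)  # arbitrary units, s
for t in (0.0, 5.0, 30.0, 120.0):
    print(f"t={t:5.1f} s  I={afterglow(t, **params):7.3f}")
```

Fitting such a model to measured decay curves yields the lifetimes used to compare doping concentrations: a higher initial amplitude tracks Eu2+ content, a longer slow lifetime tracks Nd3+ co-doping.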

  1. Collection of documents in the 2nd information exchange meeting on radioactive waste disposal research network

    International Nuclear Information System (INIS)

    The 2nd meeting on 'Radioactive Waste Disposal Research Network' was held at the Nagoya University Museum on March 30, 2007. The 'Radioactive Waste Disposal Research Network' was established in Interorganization Atomic Energy Research Program under academic collaborative agreement between Japan Atomic Energy Agency and the University of Tokyo. The objective is to develop both research infrastructures and human expertise in Japan to an adequate performance level, thereby contributing to the development of the fundamental research in the field of radioactive waste disposal. This material is a collection of presentations and discussions during the information exchange meeting. (author)

  2. 2nd workshop on Wendelstein VII-X, Schloss Ringberg, Bavaria, 13-16 June 1988

    International Nuclear Information System (INIS)

    This IPP-Report is based on the 'Summary of the Workshop' by H. Wobig, and contains a number of figures and tables from contributed papers together with short descriptive remarks. About 40 papers were presented at the 2nd Workshop on Wendelstein VII-X. The programme of the workshop is given in appendix 1. There were nearly 50 participants, as listed in appendix 2, several of them attending on a part-time basis. Appendix 3 gives the correspondence between the numbering of the figures and tables here and that in the contributions to the workshop. (orig.)

  3. Book Review: The Communicating Leader: The key to strategic alignment (2nd Ed)

    OpenAIRE

    Birkenbach, X. C.

    2003-01-01

    Title: The Communicating Leader: The key to strategic alignment (2nd Ed) Author: Gustav Puth Publisher: Van Schaik Publishers Reviewer: XC Birkenbach

    The aim of the book according to the author, is "meant to be a usable tool, an instrument in the toolbox of the real leader and leadership student". The book is written in conversational style (as intended by the author) and the 219 pages of the 10 chapters are logically packaged into three parts. While the main emphasis is naturally on...

  4. Group field theory as the 2nd quantization of Loop Quantum Gravity

    CERN Document Server

    Oriti, Daniele

    2013-01-01

    We construct a 2nd quantized reformulation of canonical Loop Quantum Gravity at both kinematical and dynamical level, in terms of a Fock space of spin networks, and show in full generality that it leads directly to the Group Field Theory formalism. In particular, we show the correspondence between canonical LQG dynamics and GFT dynamics leading to a specific GFT model from any definition of quantum canonical dynamics of spin networks. We exemplify the correspondence of dynamics in the specific example of 3d quantum gravity. The correspondence between canonical LQG and covariant spin foam models is obtained via the GFT definition of the latter.

  5. Construction of the 2nd 500kV DC gun at KEK

    International Nuclear Information System (INIS)

    The 2nd 500 kV DC photocathode electron gun for an ERL injector was constructed at KEK. The gun has several features, such as an insulated anode electrode for use as a dark current monitor, a repeller electrode for suppressing backward ions, and extreme-high-vacuum pumps. High-voltage conditioning began this summer. In addition, a new cathode preparation system has been developed; it can prepare three cathodes simultaneously and store many cathodes under good vacuum conditions. The detailed design was finished and the construction of all in-vacuum components is progressing. (author)

  6. Re-fighting the 2nd Anglo-Boer War: historians in the trenches

    OpenAIRE

    Ian Van der Waag

    2012-01-01

    Some one hundred years ago, South Africa was torn apart by the 2nd Anglo-Boer War (1899-1902). The war was a colossal psychological experience fought at great expense: it cost Britain twenty-two thousand men and £223 million. The social, economic and political cost to South Africa was greater than the statistics immediately indicate: at least ten thousand fighting men in addition to the camp deaths, where a combination of indifference and incompetence resulted in the deaths of 27 927 Boers a...

  7. Future Control and Automation : Proceedings of the 2nd International Conference on Future Control and Automation

    CERN Document Server

    2012-01-01

    This volume, Future Control and Automation - Volume 1, includes the best papers selected from the 2012 2nd International Conference on Future Control and Automation (ICFCA 2012), held on July 1-2, 2012, in Changsha, China. Future control and automation is the use of control systems and information technologies to reduce the need for human work in the production of goods and services. Based on the classification of the manuscripts considered, this volume can be divided into five sessions: Identification and Control; Navigation, Guidance and Sensor; Simulation Technology; Future Telecommunications; and Control.

  8. A New Generation of Telecommunications for Mars: The Reconfigurable Software Radio

    Science.gov (United States)

    Adams, J.; Horne, W.

    2000-01-01

    Telecommunications is a critical component for any mission at Mars, as it is an enabling function that provides connectivity back to Earth and a means for conducting science. New developments in telecommunications, specifically in software-configurable radios, expand the possible approaches for science missions at Mars. These radios provide a flexible and re-configurable platform that can evolve with the mission and that provides an integrated approach to communications and science data processing. Deep space telecommunication faces challenges not normally faced by terrestrial and near-Earth communications: radiation, thermal conditions, highly constrained mass, volume, packaging, and reliability are all significant issues. Additionally, once the spacecraft leaves Earth, there is no way to go out and upgrade or replace radio components. The reconfigurable software radio is an effort to provide not only a product that is immediately usable in the harsh space environment but also a radio that will stay current as the years pass and technologies evolve.

  9. Makahiki+WattDepot : An open source software stack for next generation energy research and education

    DEFF Research Database (Denmark)

    Johnson, Philip M.; Xu, Yongwen

    2013-01-01

    The accelerating world-wide growth in demand for energy has led to the conceptualization of a “smart grid”, where a variety of decentralized, intermittent, renewable energy sources (for example, wind, solar, and wave) would provide most or all of the power required by small-scale “micro-grids” servicing hundreds to thousands of consumers. Such a smart grid will require consumers to transition from passive to active participation in order to optimize the efficiency and effectiveness of the grid’s electrical capabilities. This paper presents a software stack comprised of two open source software systems, Makahiki and WattDepot, which together are designed to engage consumers in energy issues through a combination of education, real-time feedback, incentives, and game mechanics. We detail the novel features of Makahiki and WattDepot, along with our initial experiences using them to implement an energy challenge called the Kukui Cup.

  10. 2nd PEGS Annual Symposium on Antibodies for Cancer Therapy: April 30-May 1, 2012, Boston, USA.

    Science.gov (United States)

    Ho, Mitchell; Royston, Ivor; Beck, Alain

    2012-01-01

    The 2nd Annual Antibodies for Cancer Therapy symposium, organized again by Cambridge Healthtech Institute as part of the Protein Engineering Summit, was held in Boston, USA from April 30th to May 1st, 2012. Since the approval of the first cancer antibody therapeutic, rituximab, fifteen years ago, eleven have been approved for cancer therapy, although one, gemtuzumab ozogamicin, was withdrawn from the market. The first day of the symposium started with a historical review of early work for lymphomas and leukemias and the evolution from murine to human antibodies. The symposium discussed the current status and future perspectives of therapeutic antibodies in the biology of immunoglobulin, emerging research on biosimilars and biobetters, and engineering bispecific antibodies and antibody-drug conjugates. The tumor penetration session was focused on the understanding of antibody therapy using ex vivo tumor spheroids and the development of novel agents targeting epithelial junctions in solid tumors. The second day of the symposium discussed the development of new generation recombinant immunotoxins with low immunogenicity, construction of chimeric antigen receptors, and the proof-of-concept of 'photoimmunotherapy'. The preclinical and clinical session presented antibodies targeting Notch signaling and chemokine receptors. Finally, the symposium discussed emerging technologies and platforms for therapeutic antibody discovery. PMID:22864478

  11. Research on Object-oriented Software Testing Cases of Automatic Generation

    Directory of Open Access Journals (Sweden)

    Junli Zhang

    2013-11-01

    Full Text Available In research on the automatic generation of testing cases, different testing cases drive different execution paths, and the probability of these paths being executed also differs. For paths that are easy to execute, many redundant testing cases tend to be generated, while only few testing cases are generated for control paths that are hard to execute. A genetic algorithm can be used to guide the automatic generation of testing cases: for the former paths it restricts the generation of such testing cases, while for the latter it encourages their generation as much as possible. Therefore, building on the technology of path-oriented automatic testing case generation, a genetic algorithm is adopted to construct the generation process. According to the path triggered during the dynamic execution of the program, the generated testing cases are separated into equivalence classes, and the number of testing cases is adjusted dynamically by the fitness corresponding to the paths. The method can create a certain number of testing cases for each execution path to ensure sufficiency, and it also reduces redundant testing cases, so it is an effective method for the automatic generation of testing cases.
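The search-based idea described above can be illustrated with a small, self-contained sketch (illustrative only; the paper's concrete encoding, genetic operators and fitness function are not given in the abstract). A hypothetical program under test has one branch that random inputs rarely reach, and a genetic algorithm evolves input pairs toward it using a branch-distance fitness:

```python
import random

def program_under_test(x, y):
    """Hypothetical program: the second branch is rarely hit by random inputs."""
    if x * x + y * y < 25:
        return "A"
    if x == 2 * y and x > 100:
        return "B"
    return "C"

def branch_distance(x, y):
    """Distance to satisfying the hard predicate `x == 2*y and x > 100`;
    0 means the branch is taken (a common search-based-testing fitness)."""
    return abs(x - 2 * y) + max(0, 101 - x)

def evolve_case_for_hard_branch(seed=3, pop_size=60, generations=3000):
    """Evolve an input pair that drives execution down the hard branch."""
    rng = random.Random(seed)
    pop = [(rng.randint(-500, 500), rng.randint(-500, 500)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda c: branch_distance(*c))   # fittest first
        if branch_distance(*pop[0]) == 0:
            return pop[0]                             # hard branch covered
        parents = pop[: pop_size // 2]                # elitist selection
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            child = (a[0], b[1])                      # one-point crossover
            if rng.random() < 0.5:                    # small mutation
                child = (child[0] + rng.randint(-5, 5),
                         child[1] + rng.randint(-5, 5))
            children.append(child)
        pop = parents + children
    return min(pop, key=lambda c: branch_distance(*c))
```

A full generator would run one such search per uncovered path and keep only one representative test case per triggered path (the equivalence classes mentioned above), discarding redundant cases for the easily executed paths.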

  12. Technical Background Material for the Wave Generation Software AwaSys 5

    DEFF Research Database (Denmark)

    Frigaard, Peter; Andersen, Thomas Lykke

    2010-01-01

    "Les Appareils Générateurs de Houle en Laboratoire", presented by Biésel and Suquet in 1951, discussed and solved the analytical problems concerning a number of different wave generator types. For each wave maker type the paper presented the transfer function between wave maker displacement and wave amplitude in those cases where the analytical problem could be solved. The article therefore represented a giant step in wave generation techniques and formed the basis for today's wave generation in hydraulics laboratories.
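For the piston-type wavemaker, the Biésel transfer function between stroke S and far-field wave height H can be evaluated numerically. The sketch below (assuming linear wavemaker theory; it is not taken from the AwaSys documentation) solves the linear dispersion relation ω² = gk·tanh(kh) by Newton iteration and applies H/S = 2(cosh 2kh − 1)/(sinh 2kh + 2kh):

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def wave_number(T, h, tol=1e-12):
    """Solve the linear dispersion relation w^2 = g*k*tanh(k*h)
    for wave number k (wave period T, water depth h) by Newton iteration."""
    w = 2 * math.pi / T
    k = w * w / G  # deep-water initial guess
    for _ in range(100):
        f = G * k * math.tanh(k * h) - w * w
        df = G * math.tanh(k * h) + G * k * h / math.cosh(k * h) ** 2
        k_new = k - f / df
        if abs(k_new - k) < tol:
            return k_new
        k = k_new
    return k

def piston_transfer_function(T, h):
    """Biesel transfer function for a piston-type wavemaker:
    ratio of far-field wave height H to paddle stroke S."""
    k = wave_number(T, h)
    kh = k * h
    return 2 * (math.cosh(2 * kh) - 1) / (math.sinh(2 * kh) + 2 * kh)
```

A quick sanity check: in the shallow-water limit the ratio tends to kh, and in deep water it tends to 2.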

  13. 2nd Research Coordination Meeting of the IAEA Coordinated Research Project on Analyses of, and Lessons Learned from the Operational Experience with Fast Reactor Equipment and Systems. Working Material

    International Nuclear Information System (INIS)

    The objectives of the 2nd RCM were to: - Discuss the status – and resolve remaining open issues – of forming the teams of participants from Member States for each of the three CRP “Work Domains” i.e. “Steam Generators”, “Fuel and Blanket Subassemblies”, and “Structural Materials”; - Define and agree upon the list of main CRP tasks; - Update the roadmap (work plans, milestones, deadlines) for the CRP; - Define time and venue of the next RCM

  14. Proceedings of the 2nd JAERI symposium on HTGR technologies, October 21-23, 1992, Oarai, Japan

    International Nuclear Information System (INIS)

    The Japan Atomic Energy Research Institute (JAERI) held the 2nd JAERI Symposium on HTGR Technologies from October 21 to 23, 1992, at the Oarai Park Hotel in Oarai-machi, Ibaraki-ken, Japan, with the support of the International Atomic Energy Agency (IAEA), the Science and Technology Agency of Japan and the Atomic Energy Society of Japan, on the occasion that construction of the High Temperature Engineering Test Reactor (HTTR), the first high temperature gas-cooled reactor (HTGR) in Japan, was proceeding smoothly. In this symposium, the present worldwide status of research and development (R and D) of HTGRs and the future perspectives of HTGR development were discussed in 47 papers, including 3 invited lectures, focusing on the present status of HTGR projects and perspectives of HTGR development, safety, operation experience, fuel, and heat utilization. A panel discussion was also organized on how HTGRs can contribute to the preservation of the global environment. About 280 participants attended the symposium from Japan, Bangladesh, Germany, France, Indonesia, the People's Republic of China, Poland, Russia, Switzerland, the United Kingdom, the United States of America, Venezuela and the IAEA. This volume was edited as the proceedings of the 2nd JAERI Symposium on HTGR Technologies, collecting the 47 papers presented in the oral and poster sessions along with 11 panel exhibitions on the results of research and development associated with the HTTR. (author)

  15. Proceedings of the 2nd technical meeting on high temperature gas-cooled reactors

    International Nuclear Information System (INIS)

    With a view to establishing and upgrading the technology basis of HTGRs, the 2nd Technical Meeting on High Temperature Gas-cooled Reactors (HTGRs) was held on March 11 and 12, 1992, at the Tokai Research Establishment in order to review the present status and results of research and development (R and D) on HTGRs, to discuss the items of R and D which should be promoted more actively in the future, and thereby to help determine the strategy for the development of high temperature engineering and examination in JAERI. At the 2nd Technical Meeting, which followed the 1st Technical Meeting held in February 1990 at the Tokai Research Establishment, expectations for the High Temperature Engineering Test Reactor (HTTR), possible contributions of HTGRs to the preservation of the global environment, and the prospects of HTGRs were especially discussed, focusing on the R and D of safety, high temperature components and process heat utilization, by experts from JAERI as well as universities, national institutes, industry and so on. These proceedings summarize the papers presented in the oral sessions and the materials exhibited in the poster session at the meeting, and will be valuable as key materials for promoting the R and D on HTGRs from now on. (author)

  16. All in a Day's Work:Careers Using Science, 2nd Edition

    Science.gov (United States)

    Megan Sullivan

    2008-06-01

    "Almost all careers in the 21st century require a working knowledge of science and mathematics," says Steve Metz, The Science Teacher field editor, in his introduction to All in a Day's Work, 2nd edition. "The pending retirement of 78 million baby boomers can only add to the need for science and mathematics training, as companies begin recruiting replacement workers in science fields, sometimes--believe it or not--as early as middle school!" This expanded second edition will help you give students an exciting look at the vast array of jobs built on a foundation of science, including: the expected--high school science teacher, microbiologist, and radiation therapist; the unexpected--bomb investigator, space architect, and musical acoustics scientist; the adventurous--astronaut, deep-cave explorer, and dinosaur paleontologist; and the offbeat--shark advocate, roller coaster designer, and oyster wrangler. All in a Day's Work, 2nd edition is a compendium of 49 of the popular "Career of the Month" columns from the NSTA high school journal The Science Teacher. Each column profiles a person in a science-related job and can be reproduced and shared with your high school students as they make career and education plans. Each profile includes suggestions about how to find additional career information, including links to websites and relevant professional organizations and interest groups.

  17. Software tool for analysing the family shopping basket without candidate generation

    Directory of Open Access Journals (Sweden)

    Roberto Carlos Naranjo Cuervo

    2010-05-01

    Full Text Available Tools that yield useful knowledge for supporting marketing decisions are currently needed in the e-commerce environment. This requires a process that uses a series of data-processing techniques; data mining is one such technique, enabling automatic information discovery. This work presents association rules as a suitable technique for discovering how customers buy from a company offering business-to-consumer (B2C) e-business, aimed at supporting decision-making in supplying its customers or capturing new ones. Many algorithms such as Apriori, DHP, Partition, FP-Growth and Eclat are available for implementing association rules; the following criteria were defined for selecting the appropriate algorithm: database insert, computational cost, performance and execution time. The development of a software tool is also presented, which followed the CRISP-DM approach; this software tool was formed by the following four sub-modules: data pre-processing, data mining, results analysis and results application. The application design used a three-layer architecture: presentation logic, business logic and service logic. Data warehouse design and algorithm design were included in developing this data-mining software tool. It was tested using a FoodMart company database; the tests covered performance, functionality and the validity of results, thereby allowing association rules to be found. The results led to the conclusion that using association rules as a data-mining technique facilitates analysing volumes of information for B2C e-business services, which represents a competitive advantage for those companies using the Internet as their sales medium.
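The association-rule technique discussed above can be sketched with a minimal level-wise Apriori implementation (an illustration only, not the tool described in the article; the basket data below is invented):

```python
from itertools import combinations

def apriori(transactions, min_support):
    """Level-wise Apriori: build k-itemset candidates only from frequent
    (k-1)-itemsets, pruning by minimum support at each level."""
    n = len(transactions)
    transactions = [frozenset(t) for t in transactions]
    counts = {}
    for t in transactions:
        for item in t:
            s = frozenset([item])
            counts[s] = counts.get(s, 0) + 1
    frequent = {s: c / n for s, c in counts.items() if c / n >= min_support}
    result = dict(frequent)
    k = 2
    while frequent:
        candidates = {a | b for a in frequent for b in frequent if len(a | b) == k}
        counts = {c: sum(1 for t in transactions if c <= t) for c in candidates}
        frequent = {s: c / n for s, c in counts.items() if c / n >= min_support}
        result.update(frequent)
        k += 1
    return result

def association_rules(frequent, min_confidence):
    """Rules X -> Y with confidence = support(X union Y) / support(X)."""
    rules = []
    for itemset, supp in frequent.items():
        for r in range(1, len(itemset)):
            for antecedent in map(frozenset, combinations(itemset, r)):
                conf = supp / frequent[antecedent]  # subsets are frequent too
                if conf >= min_confidence:
                    rules.append((set(antecedent), set(itemset - antecedent), conf))
    return rules

baskets = [{"bread", "milk"},
           {"bread", "diapers", "beer"},
           {"milk", "diapers", "beer"},
           {"bread", "milk", "diapers", "beer"},
           {"bread", "milk", "diapers"}]
freq = apriori(baskets, min_support=0.6)
rules = association_rules(freq, min_confidence=0.8)
# rules contains ({'beer'}, {'diapers'}, 1.0): every beer buyer also bought diapers
```

The Apriori property (every subset of a frequent itemset is itself frequent) is what makes the candidate pruning and the confidence lookup above safe.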

  18. Development of Data Analysis Software for Diagnostic Eddy Current Probe (D-probe) for Steam Generator Tube Inspection

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Myung Sik; Hur, Do Haeng; Kim, Kyung Mo; Han, Jung Ho; Lee, Deok Hyun [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2010-10-15

    Occurrences of stress corrosion cracking in the steam generator tubes of nuclear power plants are closely related to the residual stress existing in regions of geometric change, that is, expansion transitions, u-bends, dings, dents, bulges, etc. Therefore, information on the location, type and quantitative size of a geometric anomaly existing in a tube is a prerequisite for non-destructive inspection activities such as root cause analysis, early detection of cracks, and the prediction of further crack evolution. KAERI developed an innovative eddy current probe, D-probe, equipped with the simultaneous dual functions of crack detection and 3-dimensional quantitative profile measurement. Its excellent performance has been verified through sampling inspections in several domestic nuclear power plants where various types of steam generator tube cracking were observed in operation. Qualified data analysis software must be furnished in order to deploy D-probe in pre- and in-service inspections of commercial power plants. This paper introduces the PC-Windows based eddy current data analysis software being developed for D-probe in cooperation with Zetec Inc.

  19. System and Component Software Specification, Run-time Verification and Automatic Test Generation Project

    National Aeronautics and Space Administration — The following background technology is described in Part 5: Run-time Verification (RV), White Box Automatic Test Generation (WBATG). Part 5 also describes how WBATG...

  20. A Customer Value Creation Framework for Businesses That Generate Revenue with Open Source Software

    OpenAIRE

    Aparna Shanker

    2012-01-01

    Technology entrepreneurs must create value for customers in order to generate revenue. This article examines the dimensions of customer value creation and provides a framework to help entrepreneurs, managers, and leaders of open source projects create value, with an emphasis on businesses that generate revenue from open source assets. The proposed framework focuses on a firm's pre-emptive value offering (also known as a customer value proposition). This is a firm's offering of the value it se...

  1. Assessment of nursing care using indicators generated by software / Evaluación de la asistencia de enfermería utilizando indicadores generados por un software / Avaliação da assistência de enfermagem utilizando indicadores gerados por um software

    Scientific Electronic Library Online (English)

    Ana Paula Souza, Lima; Tânia Couto Machado, Chianca; Meire Chucre, Tannure.

    2015-04-01

    Full Text Available OBJECTIVE: to analyze the efficacy of the Nursing Process in an Intensive Care Unit using indicators generated by software. METHOD: cross-sectional study using data collected over four months. Nurses and students daily registered patients, took history (at admission), performed physical assessments, and established nursing diagnoses, nursing plans/prescriptions, and assessed care delivered to 17 patients using software. Indicators concerning the incidence and prevalence of nursing diagnoses, the rate of risk diagnostic effectiveness, and the rate of effective prevention of complications were computed. RESULTS: Risk for imbalanced body temperature was the most frequent diagnosis (23.53%), while the least frequent was Risk for constipation (0%). Risk for impaired skin integrity was prevalent in 100% of the patients, while Risk for acute confusion was the least prevalent (11.76%). Risk for constipation and Risk for impaired skin integrity obtained a rate of risk diagnostic effectiveness of 100%. The rate of effective prevention of acute confusion and falls was 100%. CONCLUSION: the efficacy of the Nursing Process was analyzed using indicators, since they portray how nurses have identified patients' problems and risks and planned care in a systematized way. (Abstract also available in Portuguese and Spanish.)
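The incidence and prevalence indicators used in the study can be sketched as follows (a minimal illustration with an invented data layout, not the study's actual software; here prevalence counts patients who ever carried a risk diagnosis, and incidence counts patients who acquired it after admission):

```python
from collections import defaultdict

def diagnosis_indicators(admission_dx, later_dx, n_patients):
    """admission_dx / later_dx: dicts mapping patient id -> set of nursing
    diagnoses recorded at admission / arising on later days (hypothetical
    structure). Returns per-diagnosis prevalence and incidence rates."""
    prevalence = defaultdict(int)
    incidence = defaultdict(int)
    for pid, at_admission in admission_dx.items():
        ever = at_admission | later_dx.get(pid, set())
        for d in ever:
            prevalence[d] += 1                 # patient ever had the diagnosis
        for d in later_dx.get(pid, set()) - at_admission:
            incidence[d] += 1                  # acquired after admission
    return ({d: c / n_patients for d, c in prevalence.items()},
            {d: c / n_patients for d, c in incidence.items()})

# Invented example: 4 patients, diagnosis names abbreviated
admission = {1: {"risk_skin"},
             2: {"risk_skin"},
             3: {"risk_skin", "risk_confusion"},
             4: {"risk_skin"}}
later = {2: {"risk_temp"},
         4: {"risk_temp"}}
prev, inc = diagnosis_indicators(admission, later, n_patients=4)
# prev["risk_skin"] == 1.0 (100%), inc["risk_temp"] == 0.5 (50%)
```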

  2. A Customer Value Creation Framework for Businesses That Generate Revenue with Open Source Software

    Directory of Open Access Journals (Sweden)

    Aparna Shanker

    2012-03-01

    Full Text Available Technology entrepreneurs must create value for customers in order to generate revenue. This article examines the dimensions of customer value creation and provides a framework to help entrepreneurs, managers, and leaders of open source projects create value, with an emphasis on businesses that generate revenue from open source assets. The proposed framework focuses on a firm's pre-emptive value offering (also known as a customer value proposition). This is a firm's offering of the value it seeks to create for a customer, in order to meet his or her requirements.

  3. A critical discussion of the 2nd intercomparison on electron paramagnetic resonance dosimetry with tooth enamel

    International Nuclear Information System (INIS)

    Recently, we have participated in 'The 2nd International Intercomparison on EPR Tooth Dosimetry' wherein 18 laboratories had to evaluate low-radiation doses (100-1000 mGy) in intact teeth (Wieser et al., Radiat. Meas., 32 (2000a) 549). The results of this international intercomparison seem to indicate a promising picture of EPR tooth dosimetry. In this paper, the two Belgian EPR participants present a more detailed and critical study of their contribution to this intercomparison. The methods used were maximum likelihood common factor analysis (MLCFA) and spectrum subtraction. Special attention is paid to potential problems with sample preparation, intrinsic dose evaluation, linearity of the dose response, and determination of dose uncertainties

  4. Boundary value problems for the 2nd-order Seiberg-Witten equations

    Directory of Open Access Journals (Sweden)

    Celso Melchiades Doria

    2005-02-01

    Full Text Available It is shown that the nonhomogeneous Dirichlet and Neumann problems for the 2nd-order Seiberg-Witten equation on a compact 4-manifold X admit a regular solution once the nonhomogeneous Palais-Smale condition ℋ is satisfied. The approach consists in applying elliptic techniques to the variational setting of the Seiberg-Witten equation. The gauge invariance of the functional allows the problem to be restricted to the Coulomb subspace of the configuration space. The coercivity of the SWα-functional, when restricted to the Coulomb subspace, implies the existence of a weak solution. The regularity then follows from the boundedness of the L∞-norms of spinor solutions and the gauge fixing lemma.

  5. Preliminary GPS orbit combination results of the IGS 2nd reprocessing campaign

    Science.gov (United States)

    Choi, Kevin

    2015-04-01

    The International GNSS Service (IGS) has contributed to the International Terrestrial Reference Frame by reprocessing historic GPS network data and submitting terrestrial reference frame solutions and Earth rotation parameters. For the 2nd reprocessing campaign, Analysis Centers (ACs) used up to 21 years of GPS observation data with daily integrations. IERS2010 conventions are applied to model the physical effects of the Earth. In total, eight ACs participated (7 global solutions and 2 tide gauge solutions), reprocessing the entire time series in a consistent way using the latest models and methodology. The IGS combined daily SINEX TRF and EOP combinations have already been submitted to the IERS for ITRF2013. This presentation mainly focuses on the preliminary quality assessment of the reprocessed AC orbits. The quality of the orbit products is examined via the repeatability between daily AC satellite ephemerides. Power spectral analysis shows the background noise characteristics of each AC's products and their periodic behavior.

  6. 2nd FP7 Conference and International Summer School Nanotechnology : From Fundamental Research to Innovations

    CERN Document Server

    Yatsenko, Leonid

    2015-01-01

    This book presents some of the latest achievements in nanotechnology and nanomaterials from leading researchers in Ukraine, Europe, and beyond. It features contributions from participants in the 2nd International Summer School “Nanotechnology: From Fundamental Research to Innovations” and International Research and Practice Conference “Nanotechnology and Nanomaterials”, NANO-2013, which were held in Bukovel, Ukraine on August 25-September 1, 2013. These events took place within the framework of the European Commission FP7 project Nanotwinning, and were organized jointly by the Institute of Physics of the National Academy of Sciences of Ukraine, University of Tartu (Estonia), University of Turin (Italy), and Pierre and Marie Curie University (France). Internationally recognized experts from a wide range of universities and research institutions share their knowledge and key results on topics ranging from nanooptics, nanoplasmonics, and interface studies to energy storage and biomedical applications. Pr...

  7. 2nd Canada-China joint workshop on supercritical-water-cooled reactors (CCSC-2010)

    International Nuclear Information System (INIS)

    The 2nd Canada-China Joint Workshop on Supercritical-Water-Cooled Reactors (CCSC-2010) was held in Toronto, Ontario, Canada on April 25-25, 2010. This joint workshop aimed at providing a forum for discussion of advancements and issues, sharing information and technology transfer, and establishing future collaborations on research and developments for supercritical water-cooled reactors (SCWR) between Canadian and Chinese research organizations. Participants were those involved in research and development of SCWR core design, materials, chemistry, corrosion, thermalhydraulics, and safety analysis at organizations in Canada and China. Papers related to the following topics were of interest to the workshop: reactor core and fuel designs; materials, chemistry and corrosion; thermalhydraulics and safety analysis; balance of plant; and other applications.

  8. Summary of the 2nd workshop on ion beam-applied biology

    International Nuclear Information System (INIS)

    Induction of novel plant resources by ion beam irradiation has been investigated at JAERI. To share knowledge of the present status of the field and to identify future plans, the 1st Workshop on Ion Beam-Applied Biology was held last year, titled 'Development of breeding technique for ion beams'. To further improve research cooperation and to exchange useful information in the field, researchers inside JAERI met with researchers outside, such as those from agricultural experiment stations, companies, and universities, at the 2nd workshop on ion beam-applied biology, titled 'Future development of breeding technique for ion beams'. People from RIKEN, the Institute of Radiation Breeding, the Wakasa Wan Energy Research Center, and the National Institute of Radiological Science also participated in this workshop. Twelve of the presented papers are indexed individually. (J.P.N.)

  9. 2nd Symposium on Fluid-Structure-Sound Interactions and Control

    CERN Document Server

    Liu, Yang; Huang, Lixi; Hodges, Dewey

    2014-01-01

    With rapid economic and industrial development in China, India and elsewhere, fluid-related structural vibration and noise problems are widely encountered in many fields, just as they are in the more developed parts of the world, causing increasingly grievous concerns. Turbulence clearly has a significant impact on many such problems. On the other hand, new opportunities are emerging with the advent of various new technologies, such as signal processing, flow visualization and diagnostics, new functional materials, sensors and actuators, etc. These have revitalized interdisciplinary research activities, and it is in this context that the 2nd symposium on fluid-structure-sound interactions and control (FSSIC) was organized. Held in Hong Kong (May 20-21, 2013) and Macau (May 22-23, 2013), the meeting brought together scientists and engineers working in all related branches from both East and West and provided them with a forum to exchange and share the latest progress, ideas and advances and to chart the fronti...

  10. Textile Tectonics : 2nd Ventulett Symposium, Georgia Tech University, School of Architecture, Atlanta, November 2008

    DEFF Research Database (Denmark)

    Mossé, Aurélie

    The meeting of architecture and textiles is a continuous but too often forgotten story of intimate exchange. The 2nd Ventulett Symposium, hosted by the College of Architecture at the Georgia Institute of Technology, Atlanta, GA, was one of those precious moments celebrating such a marriage. Organized by Lars Spuybroeck, principal of Nox, Rotterdam, and current Thomas W. Ventulett III Distinguished Chair of Architectural Design, the event embraced textile tectonics as its core topic, praising textiles as the key component of architecture in line with Gottfried Semper's understanding of the discipline. It was an inspiring gathering of some of the most exciting architects of the moment: Lars Spuybroeck, Mark Burry, Evan Douglis, Michael Hensel and Cecil Balmond were invited to discuss their understanding of tectonics. Full text available at http://textilefutures.co.uk/exchange/bin/view/TextileFutures/TextileTectonics

  11. Book Review: The Communicating Leader: The key to strategic alignment (2nd Ed

    Directory of Open Access Journals (Sweden)

    X. C. Birkenbach

    2003-10-01

    Full Text Available Title: The Communicating Leader: The key to strategic alignment (2nd Ed Author: Gustav Puth Publisher: Van Schaik Publishers Reviewer: XC Birkenbach

    The aim of the book, according to the author, is "meant to be a usable tool, an instrument in the toolbox of the real leader and leadership student". The book is written in a conversational style (as intended by the author), and the 219 pages of the 10 chapters are logically packaged into three parts. While the main emphasis is naturally on leadership and communication, the coverage includes topics typically encountered in Organisational Behaviour or Management texts, e.g. organisational culture, managing change, motivation, conflict management and strategic management.

  12. Proceedings of the 2nd joint seminar on atomic collisions and heavy ion induced nuclear reactions

    International Nuclear Information System (INIS)

    The 2nd joint seminar on atomic collisions and heavy ion induced nuclear reactions was held at the University of Tokyo, May 13 and 14, 1982. The aim of this seminar was not only to recognize the common problems lying between the two research fields, but also to obtain an overview of the theoretical and experimental approaches to resolving the current problems. More than 50 participants gathered at the seminar and presented 16 papers: two general reviews and fourteen comprehensive surveys of topical subjects that have been developed very intensively in recent years. The editors would like to thank all participants for their assistance and cooperation in making the publication of these proceedings possible. (author)

  13. Nonlinear Dynamics of Memristor Based 2nd and 3rd Order Oscillators

    KAUST Repository

    Talukdar, Abdul Hafiz

    2011-05-01

    Exceptional behaviours of the memristor are illustrated in memristor-based second-order (Wien oscillator) and third-order (phase-shift oscillator) systems in this thesis. Conventional concepts about sustained oscillation are challenged by demonstrating the possibility of sustained oscillation with oscillating resistance and dynamic poles. Mathematical models are proposed for analysis, and simulations are presented to support the surprising characteristics of the memristor-based oscillator systems. The thesis also describes a comparative study among the Wien-family oscillators with one memristor. In the case of the phase-shift oscillator, one-memristor and three-memristor systems are illustrated and compared to generalize the nonlinear dynamics observed for both the 2nd-order and the 3rd-order system. Detailed explanations are provided with analytical models to explain the unconventional properties of memristor-based oscillatory systems.

  14. Using integrating spheres with wavelength modulation spectroscopy: effect of pathlength distribution on 2nd harmonic signals

    Science.gov (United States)

    Hodgkinson, J.; Masiyano, D.; Tatam, R. P.

    2013-02-01

    We have studied the effect of using integrating spheres as multipass gas cells on 2nd harmonic wavelength modulation spectroscopy. The gas lineshape becomes distorted at high concentrations, as a consequence of the exponential pathlength distribution of the sphere, introducing nonlinearity beyond that expected from the Beer-Lambert law. We have modelled this numerically for methane absorption at 1.651 µm, with gas concentrations in the range of 0-2.5 %vol in air. The results of this model compare well with experimental measurements. The nonlinearity for the 2f WMS measurements is larger than that for direct scan measurements; if this additional effect were not accounted for, the resulting error would be approximately 20 % of the reading at a concentration of 2.5 %vol methane.
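The extra nonlinearity described above follows from averaging the Beer-Lambert law over the sphere's exponential pathlength distribution; for a mean pathlength L the average transmission integrates to 1/(1 + alpha*L) rather than exp(-alpha*L). A minimal sketch, with illustrative absorption coefficients and mean pathlength rather than the paper's methane parameters:

```python
import math

def transmission_single_path(alpha, length):
    # Beer-Lambert law for a single, fixed pathlength
    return math.exp(-alpha * length)

def transmission_sphere(alpha, l_mean):
    # Mean transmission over an exponential pathlength distribution
    # p(L) = (1/l_mean) exp(-L/l_mean):
    # integral of p(L) * exp(-alpha*L) dL = 1 / (1 + alpha*l_mean)
    return 1.0 / (1.0 + alpha * l_mean)

l_mean = 1.0                      # effective mean pathlength, m (illustrative)
for alpha in (0.01, 0.1, 0.5):    # absorption coefficient, 1/m (illustrative)
    t_bl = transmission_single_path(alpha, l_mean)
    t_sp = transmission_sphere(alpha, l_mean)
    print(f"alpha={alpha}: Beer-Lambert {t_bl:.4f}, sphere {t_sp:.4f}")
```

At small alpha*L both expressions reduce to 1 - alpha*L, so the sphere behaves like a conventional cell; the growing divergence at higher absorbances is the source of the additional nonlinearity the authors report.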

  15. Proceedings of the 2nd seminar of R and D on advanced ORIENT

    International Nuclear Information System (INIS)

    The 2nd Seminar of R and D on advanced ORIENT was held at Ricotte, Japan Atomic Energy Agency, on November 7th, 2008. The first meeting of this seminar was held in Oarai, Ibaraki in May 2008, and more than fifty participants, including related researchers and members of the general public, attended. The second seminar was headed by the Nuclear Science and Engineering Directorate, JAEA, in Tokai, Ibaraki, with 63 participants. Spent nuclear fuel should be recognized not only as a mass of radioactive elements but also as a source of potentially useful materials, including platinum metals and rare earth elements. Taking cooperation with universities, related companies and research institutes into consideration, we aimed at expanding and progressing the basic research. This report records abstracts and figures submitted by the oral speakers at this seminar. (author)

  16. Belief Functions: Theory and Applications - Proceedings of the 2nd International Conference on Belief Functions

    CERN Document Server

    Masson, Marie-Hélène

    2012-01-01

    The theory of belief functions, also known as evidence theory or Dempster-Shafer theory, was first introduced by Arthur P. Dempster in the context of statistical inference, and was later developed by Glenn Shafer as a general framework for modeling epistemic uncertainty. These early contributions have been the starting points of many important developments, including the Transferable Belief Model and the Theory of Hints. The theory of belief functions is now well established as a general framework for reasoning with uncertainty, and has well understood connections to other frameworks such as probability, possibility and imprecise probability theories.   This volume contains the proceedings of the 2nd International Conference on Belief Functions that was held in Compiègne, France on 9-11 May 2012. It gathers 51 contributions describing recent developments both on theoretical issues (including approximation methods, combination rules, continuous belief functions, graphical models and independence concepts) an...

  17. 2nd AINSE Symposium on Small-Angle Scattering and Reflectometry. Program and Abstracts

    International Nuclear Information System (INIS)

    On the 25th and 26th of June 2003, AINSE, ANU, the Bragg Institute and the CRC for Polymers jointly held the 2nd AINSE Symposium on Small-Angle Scattering and Reflectometry at Lucas Heights in Sydney. The symposium encompassed the following techniques: small-angle neutron scattering (SANS); small-angle x-ray scattering (SAXS); wide-angle x-ray scattering (WAXS); neutron reflectometry; and x-ray reflectometry. Emphasis was placed on students presenting their work, and researchers who did not currently employ the above scattering techniques but were interested in using them were encouraged to attend and present a poster on their current work. All presentations were in INIS scope and have been separately indexed

  18. An Evidential Interpretation of the 1st and 2nd Laws of Thermodynamics

    CERN Document Server

    Vieland, V J

    2013-01-01

    I argue here that both the 1st and 2nd laws of thermodynamics, generally understood to be quintessentially physical in nature, can be equally well described as being about the flow dynamics of information without the need to invoke physical manifestations for information. This involves developing two distinct, yet related, forms of bookkeeping: one pertaining to what physicists generally understand as information per se, which I call purely combinatoric information; and the other pertaining to a version of what physicists understand as energy, which I call evidential information, for reasons to be made clear. I illustrate both sets of books with application to a simple coin-tossing (binomial) experiment. I then show that the physical quantity temperature (T) linking those two forms of bookkeeping together in physics has a familiar, but surprising, interpretation in this setting: the direct informational analogue of T turns out to be what we would in ordinary English call the evidence.
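The "purely combinatoric information" of a coin-tossing record can be illustrated by counting sequences: the number of bits needed to single out one sequence among all those with the same head count is log2 C(n,k), which for large n approaches n*H(k/n), the binary Shannon entropy scaled by the number of tosses. A sketch with made-up numbers, not taken from the paper:

```python
import math

def shannon_entropy(p):
    # Binary entropy in bits; H(0) = H(1) = 0 by convention
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

n, k = 1000, 600   # hypothetical record: 600 heads in 1000 tosses
# Exact combinatoric information: bits to single out one sequence
# among all sequences with the same head count
exact_bits = math.log2(math.comb(n, k))
# Asymptotic (Stirling) approximation: n * H(k/n)
approx_bits = n * shannon_entropy(k / n)
print(f"log2 C({n},{k}) = {exact_bits:.1f} bits, n*H(k/n) = {approx_bits:.1f} bits")
```

The exact count is always slightly below n*H(k/n), and the gap grows only logarithmically in n, which is why the entropy-per-toss picture dominates in the thermodynamic limit.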

  19. Performance evaluation of Enhanced 2nd Order Gray Edge Color Constancy Algorithm Using Bilateral Filter

    Directory of Open Access Journals (Sweden)

    Richa Dogra

    2014-04-01

    Full Text Available Color constancy techniques have become an important pre-processing step that reduces the effect of the light source on a given image or scene. The light source strongly affects a scene, so its effect may considerably degrade the performance of applications such as face recognition, object detection and lane detection. Color constancy is the ability to perceive color independently of the light source: a characteristic of the human color perception system which guarantees that the apparent color of objects remains relatively constant under changing illumination conditions. The overall goal of this paper is to propose a new 2nd order gray edge based color constancy algorithm using a bilateral filter, with the aim of further enhancing color constancy. Histogram stretching is also used to improve the results. The comparison shows a significant improvement over the available techniques.

  20. Knowledge grows when shared : The Launch of OpenAIRE, 2nd December in Ghent

    DEFF Research Database (Denmark)

    Elbæk, Mikael Karstensen

    2010-01-01

    Knowledge is one of the few commodities that do not devalue when used. Actually, knowledge grows when shared, and free online access to peer-reviewed scientific publications is a potent ingredient in the process of sharing. The sharing of knowledge is facilitated by the Open Access movement. However, Open Access is much more than downloading the PDF. Vice President of the European Commission and European Digital Agenda Commissioner Neelie Kroes boldly presented this message in the opening session of the OpenAIRE launch. On 2nd December 2010, OpenAIRE, the European infrastructure for Open Access, was officially launched in Ghent, Belgium. This project and initiative is facilitating the success of the Open Access Pilot in FP7, as presented earlier in this journal. In this brief article I will present some of the most interesting issues that were discussed during the first session of the day.

  1. Design and implementation aspects of open source Next Generation networks (NGN) test-bed software toolkits

    OpenAIRE

    Vingarzan, Dragos

    2014-01-01

    Information and communication technologies have long formed the increasingly important backbone of the global economy and of telecommunications, in which telecommunication networks and services in particular play an elementary role. Through the convergence of telecommunication and Internet technologies, the telecommunications landscape has changed drastically over the last decade. Previously closed telecommunication environments have been undergoing a transformation into so-called Next Generation N...

  2. 1st and 2nd Trimester Headsize in Fetuses with Congenital Heart Disease: A Cohort Study

    DEFF Research Database (Denmark)

    Lauridsen, Mette Høj; Petersen, Olav Bjørn

    Background: Congenital heart disease (CHD) is associated with neuro-developmental disorders. The influence of CHD on the brain may be present in the fetus. We hypothesize that fetal cerebral growth is impaired as early as the 2nd trimester. Aim: To investigate if fetal cerebral growth is associated with major and minor CHD. Pregnant women in Denmark (more than 95%) attend two publicly funded ultrasound scans, around 12 and 20 weeks gestational age (GA). During the first scan fetal bi-parietal diameter (BPD) is routinely obtained. During the second scan fetal head circumference (HC) is obtained and screening for fetal malformations is carried out. Our cohort includes all fetuses in Western Denmark (2.9 million inhabitants) screened between January 1st 2012 and December 31st 2013 and diagnosed with any structural, non-syndromic congenital heart disease either during pregnancy or up to 6 months after birth. Results: 276 fetuses with CHD were identified. 114 (41%) were genetically screened, primarily by chromosomal microarray analysis (n=82). Fetuses with identified chromosomal abnormalities were excluded, as were multiple gestation fetuses and fetuses with major extra-cardiac malformations. Data from 208 fetuses (75%) with presumed non-syndromic CHD were included, 85 (41%) with minor and 123 (59%) with major CHD. Z-scores for head size were analysed. Conclusions: Our preliminary results suggest that bi-parietal diameter in children with CHD is within the normal range in the 1st trimester, but fetal cerebral growth may be disrupted as early as the 2nd trimester in major CHD.

  3. Radcalc for windows benchmark study: A comparison of software results with Rocky Flats hydrogen gas generation data

    International Nuclear Information System (INIS)

    Radcalc for Windows Version 2.01 is a user-friendly software program developed by Waste Management Federal Services, Inc., Northwest Operations for the U.S. Department of Energy (McFadden et al. 1998). It is used for transportation and packaging applications in the shipment of radioactive waste materials. Among its applications are the classification of waste per the U.S. Department of Transportation regulations, the calculation of decay heat and daughter products, and the calculation of the radiolytic production of hydrogen gas. The Radcalc program has been extensively tested and validated (Green et al. 1995, McFadden et al. 1998) by comparison of each Radcalc algorithm to hand calculations. An opportunity to benchmark Radcalc hydrogen gas generation calculations against experimental data arose when the Rocky Flats Environmental Technology Site (RFETS) Residue Stabilization Program collected hydrogen gas generation data to determine compliance with requirements for shipment of waste in the TRUPACT-II (Schierloh 1998). The residue/waste drums tested at RFETS contain contaminated, solid, inorganic materials in polyethylene bags. The contamination is predominantly due to plutonium and americium isotopes. The information provided by Schierloh (1998) of RFETS includes decay heat, hydrogen gas generation rates, calculated Geff values, and waste material type, making the experimental data ideal for benchmarking Radcalc. The following sections discuss the RFETS data and the Radcalc cases modeled with the data. Results are tabulated and also provided graphically
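Radiolytic hydrogen production is commonly estimated from a G-value (molecules generated per 100 eV of absorbed energy) and the decay heat. The sketch below shows only this generic textbook relation; it is not Radcalc's internal algorithm, and the numbers are illustrative rather than RFETS data:

```python
# Radiolytic gas generation from a G-value (molecules per 100 eV of
# absorbed decay energy). All numbers below are illustrative.
EV_PER_JOULE = 1.0 / 1.602176634e-19   # eV per joule
AVOGADRO = 6.02214076e23               # molecules per mole

def h2_generation_rate(decay_heat_w, g_value):
    """mol H2 per second for a given decay heat (W) and G(H2)."""
    ev_per_s = decay_heat_w * EV_PER_JOULE
    molecules_per_s = ev_per_s * g_value / 100.0
    return molecules_per_s / AVOGADRO

# Example: 1 W decay heat, hypothetical G(H2) = 1.0 molecules/100 eV
rate = h2_generation_rate(1.0, 1.0)
print(f"{rate:.3e} mol/s  (~{rate * 3600 * 24:.3e} mol/day)")
```

In a benchmark such as the one described, the effective G-value folds in the fraction of decay energy actually absorbed by the hydrogenous material, which is why the measured Geff values are the key experimental input.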

  4. Development of new generation software tools for simulation of electron beam formation in novel high power gyrotrons

    International Nuclear Information System (INIS)

    Computer aided design (CAD) based on numerical experiments performed by using adequate physical models and efficient simulation codes is an indispensable tool for development, investigation, and optimization of gyrotrons used as radiation sources for electron cyclotron resonance heating (ECRH) of fusion plasmas. In this paper, we review briefly the state-of-the-art in the field of modelling and simulation of intense, relativistic, helical electron beams formed in the electron-optical systems (EOS) of powerful gyrotrons. We discuss both the limitations of the known computer codes and the requirements for increasing their capabilities for solution of various design problems that are being envisaged in the development of the next generation gyrotrons for ECRH. Moreover, we present the concept followed by us in an attempt to unite the advantages of the modern programming techniques with self-consistent, first-principles 3D physical models in the creation of a new highly efficient and versatile software package for simulation of powerful gyrotrons

  5. Numerical Simulation of the Francis Turbine and CAD used to Optimized the Runner Design (2nd).

    Science.gov (United States)

    Sutikno, Priyono

    2010-06-01

    Hydro power is the most important renewable energy source on earth. The water is free of charge, and in the generation of electric energy in a hydroelectric power station the production of greenhouse gases (mainly CO2) is negligible. Hydro power generation stations are long-term installations that can be used for 50 years and more, so care must be taken to guarantee smooth and safe operation over the years. Maintenance is necessary, and critical parts of the machines have to be replaced if necessary. Within modern engineering, numerical flow simulation plays an important role in optimizing the hydraulic turbine in conjunction with connected components of the plant. Especially for the rehabilitation and upgrading of existing power plants, important points of concern are to predict the power output of the turbine, to achieve maximum hydraulic efficiency, to avoid or minimize cavitation, and to avoid or minimize vibrations over the whole operating range. Flow simulation can help to solve operational problems and to optimize the turbomachinery of hydroelectric generating stations or their components through intuitive optimization, mathematical optimization, parametric design, the reduction of cavitation through design, prediction of the draft tube vortex, and troubleshooting. The classic graphic-analytical design method is cumbersome and cannot highlight the positive or negative aspects of the design options, so it became necessary to move from the classical design methods to an adequate design method using CAD software. Many options chosen at a specific step of the design calculation can then be verified both as a whole and in detail. The final graphic post-processing is realized only for the optimal solution, through a 3D representation of the runner as a whole for final approval of the geometric shape. In this article the redesign of a hydraulic turbine runner of the medium-head Francis type was investigated, with the rated specific speed ns as the most important parameter.

  6. Economic Load Distribution Among Generating Plant Using Software Based on Articial Neural Network (ANN.

    Directory of Open Access Journals (Sweden)

    Anireh, V.I.E.,

    2014-12-01

    Full Text Available This paper presents an application system that provides the best load distribution for optimal power flow (OPF) with minimal fuel cost using an artificial neural network (ANN). The idea is informed by the application of the principle of an equal incremental cost rate for all the generators. The study also demonstrates a new computational model for the exponential function, which was introduced into the Tanh transfer function; a new version of the Tanh function, called Tansigmod, was therefore developed. It increased the training speed of the feed-forward network by a gain factor of about 3. To test the effectiveness of the proposed system, the case of a five-bus network for the TransAmadi gas turbine power station in Port Harcourt, Nigeria was demonstrated. Results obtained from the system show, for example, a monthly net cost saving of about 108,520 Naira for a service load of 60 MW. This benefit was derived from the optimal distribution of load as against equal distribution of load. The application will assist operators in gas turbine power stations with the task of planning power generation economically.
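The equal-incremental-cost principle mentioned above can be sketched without the ANN: for quadratic fuel-cost curves, each on-line unit is loaded until all units run at the same incremental cost lambda, found here by bisection on the demand balance. The cost coefficients and the 60 MW demand are hypothetical, not the TransAmadi data:

```python
# Equal-incremental-cost dispatch for quadratic fuel-cost curves
# C_i(P) = a_i + b_i*P + c_i*P^2, so dC_i/dP = b_i + 2*c_i*P.
# At the optimum every on-line unit runs at the same incremental
# cost lambda (ignoring upper limits). Coefficients are hypothetical.

units = [  # (b, c) for each generator
    (8.0, 0.010),
    (7.5, 0.015),
    (9.0, 0.008),
]

def dispatch(lmbda):
    # Output of each unit at incremental cost lmbda, clamped at zero
    return [max(0.0, (lmbda - b) / (2 * c)) for b, c in units]

def solve(demand, lo=0.0, hi=100.0, iters=100):
    # Bisection on lambda until total output matches demand
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if sum(dispatch(mid)) > demand:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

demand = 60.0  # MW
lam = solve(demand)
outputs = dispatch(lam)
print(f"lambda = {lam:.3f}, outputs = {[round(p, 2) for p in outputs]}")
```

Under these toy coefficients the most expensive unit stays off (its incremental cost at zero output already exceeds lambda), which is exactly the behaviour that makes optimal dispatch cheaper than equal load sharing.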

  7. A Facilitated Interface to Generate a Combined Textual and Graphical Database System Using Widely Available Software

    Directory of Open Access Journals (Sweden)

    Corey Lawson

    2012-10-01

    Full Text Available A Data-Base Management System (DBMS) is the current standard for storing information. A DBMS organizes and maintains a structure of storage of data. Databases make it possible to store vast amounts of randomly created information and then retrieve items using associative reasoning in search routines. However, the design of databases is cumbersome. If one is to use a database primarily to directly input information, each field must be predefined manually, and the fields must be organized to permit coherent data input. This static requirement is problematic and requires that database table(s) be predefined and customized at the outset, a difficult proposition since current DBMS lack a user-friendly front end to allow flexible design of the input model. Furthermore, databases are primarily text based, making it difficult to process graphical data. We have developed a general and nonproprietary approach to the problem of input modeling designed to make use of the known informational architecture to map data to a database and then retrieve the original document in freely editable form. We create form templates using ordinary word processing software: Microsoft InfoPath 2007. Each field in the form is given a unique name identifier in order to be distinguished in the database. It is possible to export text based documents created initially in Microsoft Word by placing a colon at the beginning of any desired field location. InfoPath then captures the preceding string and uses it as the label for the field. Each form can be structured in a way to include any combination of both textual and graphical fields. We input data into InfoPath templates. We then submit the data through a web service to populate fields in an SQL database. By appropriate indexing, we can then recall the entire document from the SQL database for editing, with a corresponding audit trail.
Graphical data is handled no differently than textual data and is embedded in the database itself permitting direct query approaches. This technique makes it possible for general users to benefit from a combined text-graphical database environment with a flexible non-proprietary interface. Consequently, any template can be effortlessly transformed to a database system and easily recovered in a narrative form.

  8. Comparative analysis of 1st, 2nd, and 4th year MD students' attitudes toward Complementary Alternative Medicine (CAM

    Directory of Open Access Journals (Sweden)

    Skelton Michele

    2008-09-01

    Full Text Available Abstract Background To identify and report the attitudes and beliefs of 1st, 2nd, and 4th year medical students toward complementary alternative medicine (CAM). Methods The previously validated and reliability-tested CHBQ was administered to medical students attending the University of South Florida School of Medicine. Results Significant changes were found between both 1st (46.0 ± 7.7) and 4th (37.8 ± 15.7) year students and 2nd (48.3 ± 7.8) and 4th (37.8 ± 15.7) year students. No significant difference was found between 1st (46.0 ± 7.7) and 2nd (48.3 ± 7.8) year students. When comparing scores based on gender, a significant difference was present between males (41.2 ± 12.2) and females (46.1 ± 11.0). Conclusion CHBQ scores were significantly more positive in both 1st and 2nd year medical students in comparison with 4th year students' scores. These findings suggest that as student exposure to allopathic techniques and procedures increases during the last year of medical school, their attitudes toward CAM decrease. Females were also significantly more likely to have stronger positive attitudes toward CAM than males, though both genders represented an overall positive attitude toward CAM.

  9. GONe: software for estimating effective population size in species with generational overlap.

    Science.gov (United States)

    Coombs, J A; Letcher, B H; Nislow, K H

    2012-01-01

    GONe is a user-friendly, Windows-based program for estimating effective size (N(e)) in populations with overlapping generations. It uses the Jorde-Ryman modification to the temporal method to account for age structure in populations. This method requires estimates of age-specific survival and birth rate and allele frequencies measured in two or more consecutive cohorts. Allele frequencies are acquired by reading in genotypic data from files formatted for either GENEPOP or TEMPOFS. For each interval between consecutive cohorts, N(e) is estimated at each locus and over all loci. Furthermore, N(e) estimates are output for three different genetic drift estimators (F(s), F(c) and F(k)). Confidence intervals are derived from a chi-square distribution with degrees of freedom equal to the number of independent alleles. GONe has been validated over a wide range of N(e) values, and for scenarios where survival and birth rates differ between sexes, sex ratios are unequal and reproductive variances differ. GONe is freely available for download at https://bcrc.bio.umass.edu/pedigreesoftware/. PMID:21827640
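For intuition, the core temporal-method calculation can be sketched in its simplest discrete-generation form, a Nei-Tajima-style F(c) estimator; this is not the Jorde-Ryman age-structured estimator that GONe actually implements, and all allele frequencies and sample sizes below are made up:

```python
# Simplified discrete-generation temporal method (Nei-Tajima style),
# NOT the Jorde-Ryman age-structured estimator used by GONe -- just
# the core idea: allele-frequency drift between two samples taken t
# generations apart yields an estimate of Ne.

def f_c(p0, pt):
    # Standardized variance of allele-frequency change across alleles
    terms = [(x - y) ** 2 / (((x + y) / 2) - x * y)
             for x, y in zip(p0, pt)]
    return sum(terms) / len(terms)

def ne_temporal(p0, pt, t, s0, st):
    # Nei & Tajima (1981) sampling-corrected estimator:
    # Ne = t / (2 * (Fc - 1/(2*S0) - 1/(2*St)))
    fc = f_c(p0, pt)
    denom = 2.0 * (fc - 1.0 / (2 * s0) - 1.0 / (2 * st))
    return t / denom

p0 = [0.60, 0.35, 0.50, 0.20]   # cohort-1 allele frequencies (made up)
pt = [0.55, 0.40, 0.47, 0.25]   # cohort-2 frequencies, t generations later
print(f"Ne ~ {ne_temporal(p0, pt, t=2, s0=200, st=200):.1f}")
```

Note how the sampling corrections 1/(2S) are of the same order as the drift signal itself for these toy values: with small samples the denominator can even turn negative, which is one reason age-structured refinements and large sample sizes matter in practice.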

  10. Generalized evaluation of flow losses in 2nd order analysis of regenerative cycles

    Energy Technology Data Exchange (ETDEWEB)

    Kuehl, H.D.; Schulz, S.; Walther, C. [Univ. of Dortmund (Germany). Inst. of Thermodynamics

    1995-12-31

    Flow losses due to pressure drop not only cause a loss of mechanical work but also affect the heat flows exchanged by a regenerative cycle, as the values of the p,V-integrals for the single cylinder volumes will be different from those obtained for ideal flow conditions, i.e. no pressure drop. As the evaluation of the heat flows in 2nd order analysis is primarily based on such ideal assumptions, additional corrections are required to account for the effects of flow dissipation. Although it is evident that the sum of these corrections must be equal to the loss of mechanical work, additional considerations are required to evaluate them separately. In this paper a generalized algorithm is presented which allows the determination of these corrections for any valveless regenerative cycle, including more complex systems such as the Vuilleumier or the Duplex-Ericsson cycle. For this purpose, a systematic general description technique is introduced, which can handle even branched systems where the arrangement of the components is no longer linear. It is possible to generalize the equations of isothermal analysis, and on this basis a solution for the above problem is derived, assuming flow losses to be composed of a linear (laminar) and a square (turbulent) fraction. An example is used to demonstrate and discuss the results.
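The assumed decomposition of the pressure drop into a linear (laminar) and a square (turbulent) fraction can be illustrated by computing the cycle-averaged dissipated power for a sinusoidal flow; the coefficients are hypothetical and the calculation is a simple numerical check, not the paper's generalized algorithm:

```python
import math

# Pressure drop split into a linear (laminar) and a square (turbulent)
# fraction, dp(q) = a*q + b*q*|q|; the dissipated power over one cycle
# is the time average of dp(q)*q. Coefficients are hypothetical.
a, b = 2.0, 0.5          # laminar and turbulent loss coefficients
q_amp, n = 1.0, 100000   # sinusoidal flow amplitude; integration steps

total = 0.0
for i in range(n):
    q = q_amp * math.sin(2 * math.pi * i / n)
    total += (a * q + b * q * abs(q)) * q
mean_power = total / n

# Analytic check: <a*q^2> = a*q_amp^2/2 and <b*|q|^3> = b*q_amp^3*4/(3*pi)
analytic = a * q_amp**2 / 2 + b * q_amp**3 * 4 / (3 * math.pi)
print(f"numeric {mean_power:.5f}, analytic {analytic:.5f}")
```

The two fractions average differently over the cycle (1/2 for the linear term, 4/(3*pi) for the square term), which is why they must be tracked separately when distributing the dissipation corrections among the heat flows.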

  11. Production of artificial ionospheric layers by frequency sweeping near the 2nd gyroharmonic

    Directory of Open Access Journals (Sweden)

    T. Pedersen

    2011-01-01

    Full Text Available Artificial ionospheric plasmas descending from the background F-region have been observed on multiple occasions at the High Frequency Active Auroral Research Program (HAARP facility since it reached full 3.6 MW power. Proximity of the transmitter frequency to the 2nd harmonic of the electron gyrofrequency (2fce has been noted as a requirement for their occurrence, and their disappearance after only a few minutes has been attributed to the increasing frequency mismatch at lower altitudes. We report new experiments employing frequency sweeps to match 2fce in the artificial plasmas as they descend. In addition to revealing the dependence on the 2fce resonance, this technique reliably produces descending plasmas in multiple transmitter beam positions and appears to increase their stability and lifetime. High-speed ionosonde measurements are used to monitor the altitude and density of the artificial plasmas during both the formation and decay stages.
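The 2fce resonance condition can be made concrete with the electron gyrofrequency formula f_ce = e*B/(2*pi*m_e). The field strength below is an assumed, representative high-latitude ionospheric value, not a measured HAARP parameter:

```python
import math

# Electron gyrofrequency f_ce = e*B / (2*pi*m_e) and its 2nd harmonic.
E_CHARGE = 1.602176634e-19     # elementary charge, C
M_ELECTRON = 9.1093837015e-31  # electron mass, kg

def gyrofrequency(b_tesla):
    return E_CHARGE * b_tesla / (2 * math.pi * M_ELECTRON)

B = 5.0e-5   # T, assumed representative high-latitude field strength
f_ce = gyrofrequency(B)
print(f"f_ce = {f_ce/1e6:.2f} MHz, 2nd harmonic 2*f_ce = {2*f_ce/1e6:.2f} MHz")
```

With B around 50 µT this gives 2*f_ce of roughly 2.8 MHz, in the lower HF band; since B varies with altitude, the resonant frequency shifts as the artificial plasma descends, which is the motivation for the frequency sweeps described above.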

  12. Mesocosm soil ecological risk assessment tool for GMO 2nd tier studies

    DEFF Research Database (Denmark)

    D'Annibale, Alessandra; Maraldo, Kristine

    Ecological Risk Assessment (ERA) of GMOs is basically identical to ERA of chemical substances when it comes to assessing specific effects of the GMO plant material on the soil ecosystem. The tiered approach always includes the option of studying more complex but still realistic ecosystem-level effects in 2nd tier caged experimental systems, cf. the new GMO ERA guidance: EFSA Journal 2010; 8(11):1879. We propose to perform a trophic structure analysis, TSA, and include the trophic structure as an ecological endpoint to gain more direct insight into the change in interactions between species, i.e. the food-web structure, instead of relying only on the indirect evidence from population abundances. The approach was applied for effect assessment in the agro-ecosystem, where we combined factors of elevated CO2, viz. global climate change, and GMO plant effects. A multi-species (Collembola, Acari and Enchytraeidae) mesocosm factorial experiment was set up in a greenhouse at ambient CO2 and 450 ppm CO2 with a GM barley variety and conventional varieties. The GM barley differed in the composition of amino acids in the grain (antisense C-hordein line). The fungicide carbendazim acted as a positive control. After 5 and 11 weeks, data on populations, plants and soil organic matter decomposition were evaluated. Natural abundances of stable isotopes, 13C and 15N, of animals, soil, plants and added organic matter (crushed maize leaves) were used to describe the soil food-web structure.

  13. Proceedings of the 2nd CSNI Specialist Meeting on Simulators and Plant Analysers

    International Nuclear Information System (INIS)

    The safe utilisation of nuclear power plants requires the availability of different computerised tools for analysing the plant behaviour and training the plant personnel. These can be grouped into three categories: accident analysis codes, plant analysers and training simulators. The safety analysis of nuclear power plants has traditionally been limited to the worst accident cases expected for the specific plant design. Many accident analysis codes have been developed for different plant types. The scope of the analyses has continuously expanded. The plant analysers are now emerging tools intended for extensive analysis of the plant behaviour using a best estimate model for the whole plant including the reactor and full thermodynamic process, both combined with automation and electrical systems. The comprehensive model is also supported by good visualisation tools. Training simulators with real time plant model are tools for training the plant operators to run the plant. Modern training simulators have also features supporting visualisation of the important phenomena occurring in the plant during transients. The 2nd CSNI Specialist Meeting on Simulators and Plant Analysers in Espoo attracted some 90 participants from 17 countries. A total of 49 invited papers were presented in the meeting in addition to 7 simulator system demonstrations. Ample time was reserved for the presentations and informal discussions during the four meeting days. (orig.)

  14. Academic Training - 2nd Term: 08.01.2007 - 31.03.2007

    CERN Multimedia

    2006-01-01

    2006 - 2007 ACADEMIC TRAINING PROGRAMME 2nd Term : 08.01.2007 - 31.03.2007 LECTURE SERIES Applied Superconductivity by V. Palmieri, INFN, Padova, It. 17, 18, 19 January 11:00-12:00 - Auditorium, Bldg 500 String Theory for Pedestrians by B. Zwiebach, M.I.T. Cambridge, USA 29, 30, 31 January 11:00-12:00 - Auditorium, Bldg 500 on 29, 30 January TH Auditorium on 31 January Introduction to Supersymmetry by D. Kaplan, Johns Hopkins University, Baltimore, USA 12, 13, 14, 15 February 11:00-12:00 - Auditorium, Bldg 500 The Hunt for the Higgs Particle by F. Zwirner, University of Padova, It 27, 28 February, 1st March 11:00-12:00 - Auditorium, Bldg 500 From Evolution Theory to Parallel and Distributed Genetic Programming by F. Fernandez de Vega 15, 16 March 11:00-12:00 - Auditorium, Bldg 500 The lectures are open to all those interested, without application. The abstract of the lectures, as well as any change to the above information (title, dates, time, place etc.) will be published in the CERN bulletin, the WWW, an...

  16. Transient 2nd Degree AV Block Mobitz Type II: A Rare Finding in Dengue Haemorrhagic Fever

    Science.gov (United States)

    Nigam, Ashwini Kumar; Agarwal, Ayush; Singh, Amit K; Yadav, Subhash

    2015-01-01

    Dengue is a major problem, as epidemics occur almost every year and cause a state of panic due to the lack of proper diagnostic methods and facilities for proper management. Patients presenting with classical symptoms are easy to diagnose; however, as a large number of cases occur every year, a number of patients diagnosed with dengue fever on occasion present with atypical manifestations, which lead to extensive evaluation and unnecessary referral to higher centres irrespective of severity. A rough idea of these manifestations must therefore be kept in mind in order to prevent these problems. Involvement of the cardiovascular system in dengue has been reported in previous studies; such findings are usually benign and self-limited. The study of conduction abnormalities is important because conduction blocks are sometimes the first sign of acute myocarditis in patients with Dengue Haemorrhagic Fever in shock. We present here a case of 2nd degree Mobitz type II atrioventricular (AV) block in a case of Dengue Haemorrhagic Fever, reverting to normal rhythm in the recovery phase with no signs thereafter on follow-up.

  17. THINKLET: ELEMENTO CLAVE EN LA GENERACIÓN DE MÉTODOS COLABORATIVOS PARA EVALUAR USABILIDAD DE SOFTWARE / THINKLET: KEY ELEMENT IN THE GENERATION OF COLLABORATIVE METHODS FOR EVALUATING SOFTWARE USABILITY

    Scientific Electronic Library Online (English)

    Andrés, Solano Alegría; Yenny, Méndez Alegría; César, Collazos Ordóñez.

    2010-07-01

    Full Text Available Usability is currently a fundamental attribute for the success of a software product. Competition among organizations forces them to improve the usability of their products, given the risk of losing customers if a product is not easy to use and/or easy to learn. Although methods have been established to evaluate the usability of software products, most of these methods do not consider the possibility of involving several people working collaboratively in the evaluation process. For this reason, the Methodology for the Design of Collaborative Usability Evaluation Methods should be used, so that methods are designed which allow several people, from a range of knowledge areas, to work collaboratively in the evaluation process. This paper presents that methodology in general terms and places special emphasis on thinklets as key elements for the design of collaborative processes.

  18. PREFACE: 2nd International Conference on Competitive Materials and Technological Processes (IC-CMTP2)

    Science.gov (United States)

    László, Gömze A.

    2013-12-01

    Competitiveness is one of the most important factors in our life and it plays a key role in the efficiency both of organizations and societies. The more scientifically supported and prepared organizations develop more competitive materials with better physical, chemical and biological properties, and the leading companies apply more competitive equipment and technology processes. The aims of the 2nd International Conference on Competitive Materials and Technology Processes (ic-cmtp2) are the following: promote new methods and results of scientific research in the fields of material, biological, environmental and technology sciences; exchange information between the theoretical and applied sciences as well as technical and technological implementations; promote communication between scientists of different nations, countries and continents. Among the major fields of interest are materials with extreme physical, chemical, biological, medical, thermal, mechanical properties and dynamic strength, including their crystalline and nano-structures, phase transformations as well as methods of their technological processes, tests and measurements. Multidisciplinary applications of materials science and technological problems encountered in sectors like ceramics, glasses, thin films, aerospace, automotive and marine industry, electronics, energy, construction materials, medicine, biosciences and environmental sciences are of particular interest. In accordance with the program of the conference ic-cmtp2, more than 250 inquiries and registrations from different organizations were received. Researchers from 36 countries in Asia, Europe, Africa, North and South America arrived at the venue of the conference. Including co-authors, the research work of more than 500 scientists is presented in this volume. Professor Dr Gömze A László Chair, ic-cmtp2 The PDF also contains lists of the boards, session chairs and sponsors.

  19. Re-fighting the 2nd Anglo-Boer War: historians in the trenches

    Directory of Open Access Journals (Sweden)

    Ian Van der Waag

    2012-02-01

    Full Text Available Some one hundred years ago, South Africa was torn apart by the 2nd Anglo-Boer War (1899-1902). The war was a colossal psychological experience fought at great expense: it cost Britain twenty-two thousand men and £223 million. The social, economic and political cost to South Africa was greater than the statistics immediately indicate: at least ten thousand fighting men in addition to the camp deaths, where a combination of indifference and incompetence resulted in the deaths of 27 927 Boers and at least 14 154 Black South Africans. Yet these numbers belie the consequences. It was easy for the British to 'forget' the pain of the war, which seemed so insignificant after the losses sustained in 1914-18. With a long history of far-off battles and foreign wars, the British casualties of the Anglo-Boer War became increasingly insignificant as opposed to the lesser numbers held in the collective Afrikaner mind. This impact may be stated somewhat more candidly in terms of the war participation ratio for the belligerent populations. After all, not all South Africans fought in uniform. For the Australian colonies these varied from 4½ per thousand (New South Wales) to 42.3 per thousand (Tasmania); New Zealand 8 per thousand, Britain 8½ per thousand, and Canada 12.3 per thousand; while in parts of South Africa this was perhaps as high as 900 per thousand. The deaths and the high South African participation ratio, together with the unjustness of the war in the eyes of most Afrikaners, introduced a bitterness, if not a hatred, which has cast long shadows upon twentieth-century South Africa.

  20. Conference Report on the 2nd International Symposium on Lithium Applications for Fusion Devices

    International Nuclear Information System (INIS)

    The 2nd International Symposium on Lithium Applications for Fusion Devices (ISLA-2011) was held on 27–29 April 2011 at the Princeton Plasma Physics Laboratory (PPPL) with broad participation from the community working on aspects of lithium research for fusion energy development. This community is expanding rapidly in many areas including experiments in magnetic confinement devices and a variety of lithium test stands, theory and modeling and developing innovative approaches. Overall, 53 presentations were given representing 26 institutions from 10 countries. The latest experimental results from nine magnetic fusion devices were given in 24 presentations, from NSTX (PPPL, USA), LTX (PPPL, USA), FT-U (ENEA, Italy), T-11M (TRINITY, RF), T-10 (Kurchatov Institute, RF), TJ-II (CIEMAT, Spain), EAST (ASIPP, China), HT-7 (ASIPP, China), and RFX (Padova, Italy). Sessions were devoted to: I. Lithium in magnetic confinement experiments (facility overviews), II. Lithium in magnetic confinement experiments (topical issues), III. Special session on liquid lithium technology, IV. Lithium laboratory test stands, V. Lithium theory/modeling/comments, VI. Innovative lithium applications and VII. Panel discussion on lithium PFC viability in magnetic fusion reactors. There was notable participation from the fusion technology communities, including the IFE, IFMIF and TBM communities, providing productive exchanges with the physics oriented magnetic confinement lithium research groups. It was agreed to continue future exchanges of ideas and data to help develop attractive liquid lithium solutions for very challenging magnetic fusion issues, such as development of a high heat flux steady-state divertor concept and acceptable plasma disruption mitigation techniques while improving plasma performance with lithium. The next workshop will be held at ENEA, Frascati, Italy in 2013. (conference report)

  1. Synthesis, structure and properties of the double polyphosphate of potassium and neodymium, K2Nd(PO3)5

    International Nuclear Information System (INIS)

    Results of IR spectroscopic, X-ray diffraction and luminescent analyses and methods of K2Nd(PO3)5 monocrystal synthesis are presented. The compound crystallizes in the monoclinic crystal system, sp.gr. Cc; elementary cell parameters are a=8.430; b=11.752; c=13.272 Å; β=90.68 deg; V=1294.6 Å3, dx=3.17 g/cm3, Z=4. Crystal-optical characteristics are Ng=1.588; Nm=1.575; Np=1.569 (±0.005). It is found that the shortest Nd-Nd distance and the lifetime of the Nd3+ ion metastable level much exceed those in M1Ln(PO3)4 compounds (where M1=Li, Na, K, Rb, Tl). K2Nd(PO3)5 crystals may be considered promising for mini-laser materials
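    As a consistency check on these crystallographic data, the X-ray density follows from the cell content and cell volume via d = Z·M/(N_A·V). A minimal sketch using standard atomic masses (not code from the original paper):

```python
# X-ray density of K2Nd(PO3)5 from the reported cell volume and Z.
# d = Z * M / (N_A * V); atomic masses are standard tabulated values.

N_A = 6.02214e23  # Avogadro's number, 1/mol

# Molar mass of K2Nd(PO3)5 in g/mol
M = 2 * 39.0983 + 144.242 + 5 * (30.9738 + 3 * 15.999)

V = 1294.6e-24  # cell volume, cm^3 (1294.6 cubic angstroms)
Z = 4           # formula units per cell

d = Z * M / (N_A * V)
print(round(d, 2))  # → 3.17, matching the reported dx
```

    The computed value reproduces the reported dx = 3.17 g/cm3, confirming that V and Z are mutually consistent.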

  2. Report on the 2nd International Consortium on Hallucination Research: Evolving Directions and Top-10 “Hot Spots” in Hallucination Research

    OpenAIRE

    Waters, Flavie; Woods, Angela; Fernyhough, Charles

    2013-01-01

    This article presents a report on the 2nd meeting of the International Consortium on Hallucination Research, held on September 12th and 13th 2013 at Durham University, UK. Twelve working groups involving specialists in each area presented their findings and sought to summarize the available knowledge, inconsistencies in the field, and ways to progress. The 12 working groups reported on the following domains of investigation: cortical organisation of hallucinations, nonclinical hallucinations,...

  3. Influence of socio-cultural environment on development of children's musical talents in the 2nd triennium of primary school

    OpenAIRE

    Antolin, Petra

    2014-01-01

    The thesis examines the impact of the socio-cultural environment on the development of musical talent among pupils in the 2nd three years of primary school. The thesis begins with a closer look at the definition of giftedness, which may be general or specific (partial). Specific giftedness means that children achieve above-average results in one area only, while generally gifted children achieve above-average results in multiple different areas. The characteristics of gifted pupils which distinguish th...

  4. Exploration of performance limitation of 9-cell cavity processed in KEK AR East 2nd experimental hall

    International Nuclear Information System (INIS)

    So far our 9-cell cavity performance has often suffered from field emission. We are investigating our facilities at the KEK AR East 2nd experimental hall from two points of view: post-EP/BCP cleaning and particle contamination. Particle contamination problems have been found in our HPR system, cavity assembly, and vacuum evacuation procedure. We have taken cures against these problems. We report on these problems and the effect of the cures on cavity performance in this paper. (author)

  5. From cell regulation to patient survival: 2nd Cancer immunotherapy and immunomonitoring (CITIM) meeting, Budapest, 2–5 May 2011

    OpenAIRE

    Umansky, Viktor; Malyguine, Anatoli; Kotlan, Beatrix; Aptsiauri, Natalia; Shurin, Michael R.

    2011-01-01

    The 2nd International Conference “Cancer Immunotherapy and Immunomonitoring (CITIM)” took place in Budapest, Hungary, and was organized by the International (Chair—Michael Shurin) and Local (Chair—Beatrix Kotlan) Organizing Committees. The main aim was to bring the world’s best tumor immunologists to Budapest to discuss the mechanisms of immune regulation in the tumor microenvironment, the efficacy of anticancer immunotherapeutic modalities, the results of clinical trials and the me...

  6. 2nd PEGS Annual Symposium on Antibodies for Cancer Therapy: April 30–May 1, 2012, Boston, USA

    OpenAIRE

    Ho, Mitchell; Royston, Ivor; Beck, Alain

    2012-01-01

    The 2nd Annual Antibodies for Cancer Therapy symposium, organized again by Cambridge Healthtech Institute as part of the Protein Engineering Summit, was held in Boston, USA from April 30th to May 1st, 2012. Since the approval of the first cancer antibody therapeutic, rituximab, fifteen years ago, eleven have been approved for cancer therapy, although one, gemtuzumab ozogamicin, was withdrawn from the market.  The first day of the symposium started with a historical review of early work for l...

  7. Effects of B2O3 and SiO2 on the persistent luminescence property of CaAl2O4:Eu2+,Nd3+

    International Nuclear Information System (INIS)

    CaAl2O4:Eu2+,Nd3+, CaAl2O4:Eu2+,Nd3+.0.02B2O3 and CaAl2O4:Eu2+,Nd3+.0.02SiO2 with persistent luminescence were prepared by a high-temperature solid-state reaction in a reductive atmosphere. The effects of B3+ and Si4+ on the luminescence of CaAl2O4:Eu2+,Nd3+ were studied. The emission spectra of these three kinds of phosphors are very similar. CaAl2O4:Eu2+,Nd3+.0.02SiO2 has the highest initial luminance, but it decays very quickly. CaAl2O4:Eu2+,Nd3+.0.02B2O3 has a higher initial luminance and decays more slowly compared to CaAl2O4:Eu2+,Nd3+, and exhibits a much better persistent luminescence property. These different effects can be ascribed to different structural trap levels induced by B3+ and Si4+ doping, which has been confirmed by thermoluminescence (TL) methods

  8. Dependence of cosmic ray solar daily variation (1st, 2nd and 3rd) on heliomagnetic polarity reversals

    International Nuclear Information System (INIS)

    Using 696 station-years of neutron monitor data during the period of 1964-1983 and those from muon telescopes at surface and underground stations, a long term variation of cosmic ray solar daily variations (1st, 2nd and 3rd) has been analysed in order to study its dependence on the heliomagnetic polarity reversals. It is found that the 1st, 2nd and 3rd harmonic variations each show counter-clockwise phase shifts on the harmonic dial for the transition from the Negative to the Positive polarity state. The polarity state is defined as 'Positive' ('Negative') when the interplanetary magnetic field (IMF) is away (toward) in the Northern Hemisphere and toward (away) in the Southern Hemisphere. It is demonstrated that these phase shifts cannot be explained by one-dimensional diffusion of the pitch angle distribution along the IMF-axis, such as the one presented by Bieber and Pomerantz, but can be explained by the three-dimensional treatment of cosmic ray diffusion-convection in space developed by Munakata and Nagashima, taking into account the drift effect in the heliosphere pointed out by Jokipii et al. According to the theory, the rigidity dependence of the observed phase shift of the 2nd harmonic variation suggests that the power exponent of the rigidity spectrum of the cosmic ray mean free path is less than unity at least in a rigidity region of about one to several tens GV. (author)
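    The harmonic variations discussed here are the Fourier components of the daily count-rate curve; a phase shift appears as a rotation of the corresponding vector on the harmonic dial. A minimal sketch of extracting the n-th harmonic from 24 hourly rates (a textbook Fourier fit on synthetic data, not the authors' analysis code):

```python
# Illustrative extraction of the n-th harmonic (amplitude, phase) of a
# daily variation from 24 hourly count rates: the quantities plotted on
# a "harmonic dial". A textbook Fourier fit, not the authors' code.
import math

def harmonic(hourly_rates, n):
    """Return (amplitude, time-of-maximum in hours) of the n-th harmonic."""
    N = len(hourly_rates)
    mean = sum(hourly_rates) / N
    a = sum((r - mean) * math.cos(2 * math.pi * n * i / N)
            for i, r in enumerate(hourly_rates)) * 2 / N
    b = sum((r - mean) * math.sin(2 * math.pi * n * i / N)
            for i, r in enumerate(hourly_rates)) * 2 / N
    amplitude = math.hypot(a, b)
    # time of maximum in hours; one full dial turn is (24 / n) hours
    phase = math.atan2(b, a) * N / (2 * math.pi * n) % (N / n)
    return amplitude, phase

# synthetic example: a pure 2nd harmonic of amplitude 0.5 counts peaking at 3 h
rates = [100 + 0.5 * math.cos(2 * math.pi * 2 * (i - 3) / 24) for i in range(24)]
amp, ph = harmonic(rates, 2)
print(round(amp, 3), round(ph, 1))  # → 0.5 3.0
```

    Tracking (amp, ph) of each harmonic over solar polarity epochs is what reveals the counter-clockwise dial rotations described in the abstract.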

  9. Investigations of near IR photoluminescence properties in TiO2:Nd,Yb materials using hyperspectral imaging methods

    International Nuclear Information System (INIS)

    TiO2 and TiO2:Nd,Yb films were deposited by a doctor blade deposition technique from pastes prepared by a sol–gel process, and characterized by electron microscopy and spectroscopic techniques. Near infrared (NIR) photoluminescence (PL) properties upon 808 nm excitation were also examined. The rutile TiO2:Nd,Yb samples exhibited the strongest NIR PL signal. The relationship between the morphological properties, annealing temperature and the optical behavior of TiO2:Nd,Yb films is discussed. Furthermore, the study showed that hyperspectral imaging spectroscopy can be used as a rapid and nondestructive macroscopic characterization technique for the identification of spectral features and evaluation of luminescent surfaces of oxides. -- Highlights: • Films and powders of Nd and Yb-doped titania have been synthesized. • Three modifications (anatase, rutile and Degussa P25) have been studied. • The NIR photoluminescence properties were studied by hyperspectral imaging. • Emission at 978, 1008, 1029, 1064 and 1339 nm was obtained. • The structural properties and their influence on the optical behavior are discussed

  10. Hanbury Brown and Twiss Interferometry with respect to 2nd and 3rd-order event planes in Au+Au collisions at √sNN = 200 GeV

    Science.gov (United States)

    Niida, Takafumi; Phenix Collaboration

    2014-09-01

    Azimuthal angle dependence of HBT interferometry has been measured with respect to 2nd and 3rd-order event planes in Au+Au collisions at √sNN = 200 GeV at the PHENIX experiment. The 3rd-order dependence of the Gaussian source radii was clearly observed, as well as the 2nd-order dependence. The result for 2nd-order indicates that the initial source eccentricity is diluted but the source still retains the initial shape at freeze-out, while the result for 3rd-order implies that the initial triangularity vanishes during the medium evolution, which is supported by a Gaussian source model and Monte-Carlo simulation.
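    Azimuthally sensitive HBT radii are conventionally characterized by a Fourier oscillation relative to the n-th order event plane, R^2(Φ) = R^2_0 + 2 R^2_n cos(nΦ), and the relative amplitude tracks the source shape at freeze-out. A minimal sketch of extracting these coefficients from binned radii (synthetic numbers, not the PHENIX analysis code):

```python
# Illustrative Fourier decomposition of azimuthally binned HBT radii,
# R^2(Phi) = R2_0 + 2*R2_n*cos(n*Phi), measured relative to the n-th
# order event plane. Coefficient names follow common usage; this is not
# the PHENIX analysis.
import math

def fourier_coeffs(phis, r2_values, n):
    """Average R2_0 and n-th cosine coefficient R2_n of R^2 vs. angle Phi."""
    N = len(phis)
    r2_0 = sum(r2_values) / N
    r2_n = sum(r2 * math.cos(n * phi) for phi, r2 in zip(phis, r2_values)) / N
    return r2_0, r2_n

# synthetic radii oscillating against a 2nd-order event plane
phis = [math.pi * k / 6 for k in range(12)]        # 12 bins over 2*pi
r2 = [25.0 + 2 * 1.5 * math.cos(2 * p) for p in phis]
r2_0, r2_2 = fourier_coeffs(phis, r2, 2)
print(round(r2_0, 2), round(r2_2, 2))  # → 25.0 1.5
```

    A non-zero ratio 2·R2_2/R2_0 at freeze-out is the signature of a retained initial shape; a vanishing 3rd-order coefficient corresponds to the washed-out triangularity reported above.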

  11. Open3DGRID : An open-source software aimed at high-throughput generation of molecular interaction fields (MIFs)

    DEFF Research Database (Denmark)

    Tosco, Paolo; Balle, Thomas

    Description Open3DGRID is an open-source software aimed at high-throughput generation of molecular interaction fields (MIFs). Open3DGRID can generate steric potential, electron density and MM/QM electrostatic potential fields; furthermore, it can import GRIDKONT binary files produced by GRID and CoMFA/CoMSIA fields (exported from SYBYL with the aid of a small SPL script). High computational performance is attained through implementation of parallelized algorithms for MIF generation. Most prominent features in Open3DGRID include: •Seamless integration with OpenBabel, PyMOL, GAUSSIAN, FIREFLY, GAMESS-US, TURBOMOLE, MOLDEN, Molecular Discovery GRID •Multi-threaded computation of MIFs (both MM and QM); support for MMFF94 and GAFF force-fields with automated assignment of atom types to the imported molecular structures •Human and machine-readable text output, integrated with 3D maps in several formats to allow visualization of results in PyMOL, MOE, Maestro and SYBYL •User-friendly interface to all major QM packages (e.g. GAUSSIAN, FIREFLY, GAMESS-US, TURBOMOLE, MOLDEN), allowing calculation of QM electron density and electrostatic potential 3D maps from within Open3DGRID •User-friendly interface to Molecular Discovery GRID to compute GRID MIFs from within Open3DGRID Open3DGRID is controlled through a command line interface; commands can be either entered interactively from a command prompt or read from a batch script. If PyMOL is installed on the system while Open3DGRID is being operated interactively, the setup of 3D grid computations can be followed in real time on PyMOL's viewport, making it easy to tweak grid size and training/test set composition. The main output is arranged as human-readable plain ASCII text, while a number of additional files are generated to store data and to export the results of computations for further analysis and visualization with third party tools.
In particular, Open3DGRID can export 3D maps for visualization in PyMOL, MOE, Maestro and SYBYL. Open3DGRID is written in C; while pre-built binaries are available for mainstream operating systems (Windows 32/64-bit, Linux 32/64-bit, Solaris x86 32/64-bit, FreeBSD 32/64-bit, Intel Mac OS X 32/64-bit), source code is portable and can be compiled under any *NIX platform supporting POSIX threads. The modular nature of the code allows for easy implementation of new features, so that the core application can be customized to meet individual needs. A detailed ChangeLog is kept to keep track of the additions and modifications during Open3DGRID's development.
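    The core operation such a package performs, evaluating a probe-molecule interaction energy at every node of a regular 3D lattice, can be illustrated with a toy Lennard-Jones probe. This is an independent sketch, not Open3DGRID's API; the force-field parameters and the two-atom "molecule" are placeholders:

```python
# Minimal sketch of molecular interaction field (MIF) generation: a probe
# atom is stepped over a 3D grid and its interaction energy with every
# molecule atom is summed at each node. Lennard-Jones only; parameters
# and the toy two-atom "molecule" are illustrative, not Open3DGRID output.
import itertools, math

def lj_energy(r, epsilon=0.1, sigma=3.0):
    """12-6 Lennard-Jones energy at distance r (arbitrary units)."""
    s = (sigma / r) ** 6
    return 4 * epsilon * (s * s - s)

# toy molecule: two atoms on the x axis
atoms = [(0.0, 0.0, 0.0), (1.5, 0.0, 0.0)]

# regular grid: 5x5x5 nodes, 1 unit spacing, centred on the origin
axis = [x * 1.0 - 2.0 for x in range(5)]
mif = {}
for node in itertools.product(axis, axis, axis):
    e = 0.0
    for a in atoms:
        r = math.dist(node, a)
        e += lj_energy(max(r, 0.5))  # clamp to avoid the on-atom singularity
    mif[node] = e

# favourable (negative-energy) nodes form the attractive part of the map
print(len(mif), min(mif.values()) < 0)  # → 125 True
```

    Real MIF engines differ mainly in the probe physics (electrostatics, QM densities) and in parallelizing the grid loop, which is what the multi-threaded implementation above refers to.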

  12. Development of the Monte Carlo event generator tuning software package Lagrange and its application to tune the PYTHIA model to the LHCb data

    CERN Document Server

    Popov, Dmitry; Hofmann, Werner

    One of the general problems of modern high energy physics is the problem of comparing experimental data, measurements of observables in high energy collisions, to theory, which is represented by Monte Carlo simulations. This work is dedicated to the further development of the tuning methodology and the implementation of software tools for tuning the PYTHIA Monte Carlo event generator for the LHCb experiment. The aim of this thesis is to create a fast analytical model of the Monte Carlo event generator and then fit the model to the experimental data recorded by the LHCb detector, considering statistical and computational uncertainties and estimating the best values for the tuned parameters by simultaneously tuning a group of phenomenological parameters in a many-dimensional parameter space. The fitting algorithm is interfaced to the LHCb software framework, which models the response of the LHCb detector. Typically, the tunings are done to measurements which are corrected for detector effects. These correctio...
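    The "fast analytical model" idea can be shown on a toy scale: each observable bin's generator response is replaced by a cheap polynomial in the tuned parameter, built from a few full simulation runs, and the polynomial surrogate is then scanned for the chi-square minimum against data. All numbers below are synthetic; this is a sketch of the general surrogate-tuning approach, not the thesis's software:

```python
# Sketch of surrogate-based generator tuning: the MC response of each
# observable bin is replaced by a quadratic in the tuned parameter,
# interpolated through three full generator runs, and the surrogate is
# scanned for the chi^2 minimum against data. Synthetic numbers only.

def quadratic_through(points):
    """Lagrange interpolation: quadratic through three (p, value) points."""
    (x0, y0), (x1, y1), (x2, y2) = points
    def f(x):
        return (y0 * (x - x1) * (x - x2) / ((x0 - x1) * (x0 - x2))
                + y1 * (x - x0) * (x - x2) / ((x1 - x0) * (x1 - x2))
                + y2 * (x - x0) * (x - x1) / ((x2 - x0) * (x2 - x1)))
    return f

# three full-simulation "anchor" runs per bin, at parameter p = 0, 1, 2
anchors = {"bin1": [(0, 5.0), (1, 4.0), (2, 5.0)],
           "bin2": [(0, 9.0), (1, 8.5), (2, 9.0)]}
surrogates = {b: quadratic_through(pts) for b, pts in anchors.items()}

data = {"bin1": 4.0, "bin2": 8.5}   # measured values, unit errors assumed

def chi2(p):
    return sum((surrogates[b](p) - data[b]) ** 2 for b in data)

# brute-force scan of the surrogate (cheap, unlike the full generator)
best = min((p / 100 for p in range(201)), key=chi2)
print(best)  # → 1.0
```

    In a real tune the surrogate is multi-dimensional and the scan is replaced by a numerical minimizer, but the division of labour (few expensive runs, many cheap surrogate evaluations) is the same.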

  13. White Paper Summary of 2nd ASTM International Workshop on Hydrides in Zirconium Alloy Cladding

    Energy Technology Data Exchange (ETDEWEB)

    Sindelar, R. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Louthan, M. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); PNNL, B. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2015-05-29

    This white paper recommends that ASTM International develop standards to address the potential impact of hydrides on the long term performance of irradiated zirconium alloys. The need for such standards was apparent during the 2nd ASTM International Workshop on Hydrides in Zirconium Alloy Cladding and Assembly Components, sponsored by ASTM International Committee C26.13 and held on June 10-12, 2014, in Jackson, Wyoming. The potentially adverse impacts of hydrogen and hydrides on the long term performance of irradiated zirconium-alloy cladding on used fuel were shown to depend on multiple factors such as alloy chemistry and processing, irradiation and post irradiation history, residual and applied stresses and stress states, and the service environment. These factors determine the hydrogen content and hydride morphology in the alloy, which, in turn, influence the response of the alloy to the thermo-mechanical conditions imposed (and anticipated) during storage, transport and disposal of used nuclear fuel. Workshop presentations and discussions showed that although hydrogen/hydride induced degradation of zirconium alloys may be of concern, the potential for occurrence and the extent of anticipated degradation vary throughout the nuclear industry because of the variations in hydrogen content, hydride morphology, alloy chemistry and irradiation conditions. The tools and techniques used to characterize hydrides and hydride morphologies and their impacts on material performance also vary. Such variations make site-to-site comparisons of test results and observations difficult. There is no consensus that a single material or system characteristic (e.g., reactor type, burnup, hydrogen content, end-of life stress, alloy type, drying temperature, etc.) is an effective predictor of material response during long term storage or of performance after long term storage. 
Multi-variable correlations made for one alloy may not represent the behavior of another alloy exposed to identical conditions and the material responses to thermo-mechanical exposures will be different depending on the materials and systems used. The discussions at the workshop showed several gaps in the standardization of processes and techniques necessary to assess the long term performance of irradiated zirconium alloy cladding during dry storage and transport. The development of, and adherence to, standards to help bridge these gaps will strengthen the technical basis for long term storage and post-storage operations, provide consistency across the nuclear industry, maximize the value of most observations, and enhance the understanding of behavioral differences among alloys. The need for, and potential benefits of, developing the recommended standards are illustrated in the various sections of this report.

  14. GENERACIÓN AUTOMÁTICA DE APLICACIONES SOFTWARE A PARTIR DEL ESTANDAR MDA BASÁNDOSE EN LA METODOLOGÍA DE SISTEMAS EXPERTOS E INTELIGENCIA ARTIFICIAL / AUTOMATIC GENERATION OF SOFTWARE APPLICATIONS FROM THE MDA STANDARD BASED ON THE METHODOLOGY OF EXPERT SYSTEMS AND ARTIFICIAL INTELLIGENCE

    Directory of Open Access Journals (Sweden)

    IVÁN MAURICIO RUEDA CÁCERES

    2011-04-01

    Full Text Available ANALYTICAL SUMMARY Many studies have been presented on the automatic generation of lines of code. This article presents a solution to the limitations of a well-known tool called MDA, making use of technological advances in artificial intelligence and expert systems. It covers the principles of the MDA framework, transforming the models used and adding characteristics to them that make this working methodology more efficient. The proposed model covers the phases of the software life cycle, following the business rules that are an essential part of a real software project. It is with the business rules that the transformation of the MDA standard begins, and the aim is to provide a contribution towards automating the business rules in such a way that they serve to define applications throughout the life cycle that generates them.
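    At its core, MDA-style generation is a model-to-text transformation: a platform-independent model is walked and platform-specific code is emitted from templates. A toy sketch of that single step (illustrative only; the article's pipeline, with business rules and an expert system, is far richer):

```python
# Toy model-to-code transformation in the spirit of MDA: a platform-
# independent model (here a dict) is turned into platform-specific code
# by a template. Illustrative only; model shape and names are invented.

model = {
    "class": "Customer",
    "attributes": ["name", "email"],
}

def generate_class(m):
    """Emit Python source for a simple entity class from the model."""
    args = ", ".join(m["attributes"])
    body = "\n".join(f"        self.{a} = {a}" for a in m["attributes"])
    return (f"class {m['class']}:\n"
            f"    def __init__(self, {args}):\n{body}\n")

source = generate_class(model)
namespace = {}
exec(source, namespace)            # the generated code is valid Python
c = namespace["Customer"]("Ada", "ada@example.com")
print(c.name)  # → Ada
```

    Automating business rules, as the article proposes, amounts to enriching the model (and hence the templates) so that behaviour, not just structure, is generated.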

  15. Herramienta software para el análisis de canasta de mercado sin selección de candidatos / Software tool for analysing the family shopping basket without candidate generation

    Scientific Electronic Library Online (English)

    Roberto Carlos, Naranjo Cuervo; Luz Marina, Sierra Martínez.

    2009-04-01

    Full Text Available Tools for extracting useful knowledge to support marketing decisions are currently needed in the e-commerce environment. This calls for a process that applies a series of data-processing techniques; data mining is one such technique, enabling automatic discovery of information. This work presents association rules as a suitable data-mining technique for discovering how customers buy from a company offering a business-to-consumer (B2C) e-commerce service, with the aim of supporting decisions about making offers to existing customers or attracting new ones. Several algorithms are available for implementing association rules, including Apriori, DHP, Partition, FP-Growth and Eclat; to select the most appropriate one, a set of criteria was defined (Danger and Berlanga, 2001): database insertions, computational cost, execution time and performance, which were analysed for each algorithm. The development of a software tool following the CRISP-DM methodology is also presented, comprising four sub-modules: data pre-processing, data mining, results analysis and results application. The application design uses a three-layer architecture: presentation logic, business logic and service logic; construction of the tool included the design of the data warehouse and of the algorithm that forms part of the data-mining module. The tool was tested with a database from the FoodMart company; the tests covered performance, functionality and reliability of results, and likewise allowed association rules to be found. The results led to the conclusion, among other aspects, that association rules as a data-mining technique make it possible to analyse large volumes of data for B2C e-commerce services, which is a competitive advantage for companies using the Internet as their sales medium.
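    As a generic illustration of the association-rule mining the abstract describes, the following is a minimal Apriori sketch in Python. It is not the tool developed in the paper; the function names and the toy transaction data are invented for illustration.

```python
from itertools import combinations

def apriori(transactions, min_support):
    """Return all frequent itemsets (as frozensets) with their support counts."""
    n = len(transactions)
    sets = [set(t) for t in transactions]
    # frequent 1-itemsets
    counts = {}
    for t in sets:
        for item in t:
            key = frozenset([item])
            counts[key] = counts.get(key, 0) + 1
    frequent = {s: c for s, c in counts.items() if c / n >= min_support}
    result = dict(frequent)
    k = 2
    while frequent:
        # candidate k-itemsets built from items appearing in frequent (k-1)-itemsets
        items = sorted({i for s in frequent for i in s})
        candidates = [frozenset(c) for c in combinations(items, k)]
        counts = {c: sum(1 for t in sets if c <= t) for c in candidates}
        frequent = {s: c for s, c in counts.items() if c / n >= min_support}
        result.update(frequent)
        k += 1
    return result

def association_rules(frequent, min_confidence):
    """Derive rules A -> B with confidence = support(A u B) / support(A)."""
    rules = []
    for itemset, count in frequent.items():
        if len(itemset) < 2:
            continue
        for r in range(1, len(itemset)):
            for antecedent in combinations(itemset, r):
                a = frozenset(antecedent)
                # every subset of a frequent itemset is itself frequent
                confidence = count / frequent[a]
                if confidence >= min_confidence:
                    rules.append((set(a), set(itemset - a), confidence))
    return rules
```

    With transactions [{bread, milk}, {bread, butter}, {bread, milk, butter}, {milk}] and a 50% support threshold, the rule {butter} -> {bread} emerges with confidence 1.0. For large databases the abstract's selection criteria point to FP-Growth or Eclat instead, which avoid repeated candidate counting.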

  16. Free Open Source Software: FOSS Based GIS for Spatial Retrievals of Appropriate Locations for Ocean Energy Utilizing Electric Power Generation Plants

    OpenAIRE

    Kohei Arai

    2012-01-01

    A Free Open Source Software (FOSS) based Geographic Information System (GIS) for spatial retrieval of appropriate locations for ocean wind and tidal-motion electric power generation plants is proposed. Using scatterometers onboard Earth observation satellites, coastal areas with strong wind are retrieved with the FOSS/GIS stack of PostgreSQL/PostGIS. PostGIS has to be modified together with the altimeter and scatterometer database. These modifications and the database creation would be a good reference for users...

  17. Digital avionics systems - Principles and practices (2nd revised and enlarged edition)

    Science.gov (United States)

    Spitzer, Cary R.

    1993-01-01

    The state of the art in digital avionics systems is surveyed. The general topics addressed include: establishing avionics system requirements; avionics systems essentials in data bases, crew interfaces, and power; fault tolerance, maintainability, and reliability; architectures; packaging and fitting the system into the aircraft; hardware assessment and validation; software design, assessment, and validation; determining the costs of avionics.

  18. Controls/CFD Interdisciplinary Research Software Generates Low-Order Linear Models for Control Design From Steady-State CFD Results

    Science.gov (United States)

    Melcher, Kevin J.

    1997-01-01

    The NASA Lewis Research Center is developing analytical methods and software tools to create a bridge between the controls and computational fluid dynamics (CFD) disciplines. Traditionally, control design engineers have used coarse nonlinear simulations to generate information for the design of new propulsion system controls. However, such traditional methods are not adequate for modeling the propulsion systems of complex, high-speed vehicles like the High Speed Civil Transport. To properly model the relevant flow physics of high-speed propulsion systems, one must use simulations based on CFD methods. Such CFD simulations have become useful tools for engineers that are designing propulsion system components. The analysis techniques and software being developed as part of this effort are an attempt to evolve CFD into a useful tool for control design as well. One major aspect of this research is the generation of linear models from steady-state CFD results. CFD simulations, often used during the design of high-speed inlets, yield high resolution operating point data. Under a NASA grant, the University of Akron has developed analytical techniques and software tools that use these data to generate linear models for control design. The resulting linear models have the same number of states as the original CFD simulation, so they are still very large and computationally cumbersome. Model reduction techniques have been successfully applied to reduce these large linear models by several orders of magnitude without significantly changing the dynamic response. The result is an accurate, easy to use, low-order linear model that takes less time to generate than those generated by traditional means. The development of methods for generating low-order linear models from steady-state CFD is most complete at the one-dimensional level, where software is available to generate models with different kinds of input and output variables. 
One-dimensional methods have been extended somewhat so that linear models can also be generated from two- and three-dimensional steady-state results. Standard techniques are adequate for reducing the order of one-dimensional CFD-based linear models. However, reduction of linear models based on two- and three-dimensional CFD results is complicated by very sparse, ill-conditioned matrices. Some novel approaches are being investigated to solve this problem.
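    The core idea above, projecting a large linear model onto a low-dimensional subspace to obtain a small model with similar dynamics, can be sketched generically. This is a minimal SVD-based projection of a state-space model (A, B, C) in Python/NumPy; it is not the University of Akron software, and the function name and toy system are illustrative.

```python
import numpy as np

def reduce_model(A, B, C, r):
    """Project the state-space model x' = Ax + Bu, y = Cx onto the r dominant
    directions of the Krylov (controllability) matrix [B, AB, A^2 B, ...]."""
    n = A.shape[0]
    blocks = [B]
    for _ in range(n - 1):
        blocks.append(A @ blocks[-1])
    K = np.hstack(blocks)
    # orthonormal basis V for the dominant subspace via SVD
    U, _, _ = np.linalg.svd(K, full_matrices=False)
    V = U[:, :r]
    # Galerkin projection of the full model
    return V.T @ A @ V, V.T @ B, C @ V
```

    For a symmetric stable A the projected Ar = V^T A V stays stable; balanced truncation, one of the standard reduction techniques alluded to above, additionally orders states by joint controllability/observability before truncating.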

  19. An open-source software tool for the generation of relaxation time maps in magnetic resonance imaging

    Directory of Open Access Journals (Sweden)

    Kühne Titus

    2010-07-01

    Full Text Available Abstract Background In magnetic resonance (MR) imaging, T1, T2 and T2* relaxation times represent characteristic tissue properties that can be quantified with the help of specific imaging strategies. While there are basic software tools for specific pulse sequences, until now there has been no universal software program available to automate pixel-wise mapping of relaxation times from various types of images or MR systems. Such a software program would allow researchers to test and compare new imaging strategies and thus would significantly facilitate research in the area of quantitative tissue characterization. Results After defining requirements for a universal MR mapping tool, a software program named MRmap was created using a high-level graphics language. Additional features include a manual registration tool for source images with motion artifacts and a tabular DICOM viewer to examine pulse sequence parameters. MRmap was successfully tested on three different computer platforms with image data from three different MR system manufacturers and five different types of pulse sequences: multi-image inversion recovery T1; Look-Locker/TOMROP T1; modified Look-Locker (MOLLI) T1; single-echo T2/T2*; and multi-echo T2/T2*. Computing times varied between 2 and 113 seconds. Estimates of relaxation times compared favorably to those obtained from non-automated curve fitting. Completed maps were exported in DICOM format and could be read in standard software packages used for analysis of clinical and research MR data. Conclusions MRmap is a flexible cross-platform research tool that enables accurate mapping of relaxation times from various pulse sequences. The software allows researchers to optimize quantitative MR strategies in a manufacturer-independent fashion. The program and its source code were made available as open-source software on the internet.
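    The pixel-wise mapping that MRmap automates can be illustrated with a minimal mono-exponential T2/T2* fit. This is a generic log-linear least-squares sketch in Python/NumPy, not MRmap's actual implementation; all names are illustrative.

```python
import numpy as np

def t2_map(echo_images, echo_times):
    """Pixel-wise T2 (or T2*) from multi-echo magnitude images, fitting
    S(TE) = S0 * exp(-TE / T2) by linear least squares on log(S)."""
    te = np.asarray(echo_times, dtype=float)
    S = np.asarray(echo_images, dtype=float)        # shape (n_echoes, ny, nx)
    logS = np.log(np.clip(S, 1e-12, None)).reshape(len(te), -1)
    design = np.vstack([np.ones_like(te), -te]).T   # model: log S = log S0 - TE/T2
    coef, *_ = np.linalg.lstsq(design, logS, rcond=None)
    inv_t2 = coef[1]                                # 1/T2 per pixel
    t2 = np.where(inv_t2 > 0, 1.0 / np.maximum(inv_t2, 1e-12), np.inf)
    return t2.reshape(S.shape[1:])
```

    On noisy clinical data a nonlinear per-pixel fit (e.g. Levenberg-Marquardt) is more robust than this log-linear shortcut, at the cost of longer computing times.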

  20. Using semi-automated photogrammetry software to generate 3D surfaces from oblique and vertical photographs at Mount St. Helens, WA

    Science.gov (United States)

    Schilling, S.; Diefenbach, A. K.

    2012-12-01

    Photogrammetry has been used to generate contours and Digital Elevation Models (DEMs) to monitor change at Mount St. Helens, WA since the 1980 eruption. We continue to improve techniques to monitor topographic changes within the crater. During the 2004-2008 eruption, 26 DEMs were used to track volume and rates of growth of a lava dome and changes of Crater Glacier. These measurements constrained seismogenic extrusion models and were compared with geodetic deflation volume to constrain magma chamber behavior. We used photogrammetric software to collect irregularly spaced 3D points primarily by hand and, in reasonably flat areas, by automated algorithms, from commercial vertical aerial photographs. These models took days to months to complete and the areal extent of each surface was determined by visual inspection. Later in the eruption, we pioneered the use of different software to generate irregularly spaced 3D points manually from oblique images captured by a hand-held digital camera. In each case, the irregularly spaced points and intervening interpolated points formed regular arrays of cells or DEMs. Calculations using DEMs produced from the hand-held images duplicated volumetric and rate results gleaned from the vertical aerial photographs. This manual point capture technique from oblique hand-held photographs required only a few hours to generate a model over a focused area such as the lava dome, but would have taken perhaps days to capture data over the entire crater. Here, we present results from new photogrammetric software that uses robust image-matching algorithms to produce 3D surfaces automatically after inner, relative, and absolute orientations between overlapping photographs are completed. Measurements using scans of vertical aerial photographs taken August 10, 2005 produced dome volume estimates within two percent of those from a surface generated using the vertical aerial photograph manual method. 
The new August 10th orientations took less than 8 hours to complete, surface generation took a couple of minutes, and its coverage extends over the entire crater. In addition, preliminary tests of surfaces generated with this software from hand-held oblique images suggest equal speed and accuracy to that achieved from vertical aerial photographs. Oblique hand-held images offer ease of image capture and unobscured views at times when steam plumes and clouds obstruct vertical aerial photography. The possibility of quick camera calibration, image capture, and daily surface generation shifts photogrammetry into the realm of near real-time monitoring and hazard assessment.
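    Once two DEMs are co-registered on the same grid, the volume and growth-rate measurements described above reduce to summing elevation differences times the cell area. A minimal sketch (not the authors' software; names and numbers are illustrative):

```python
import numpy as np

def volume_change(dem_before, dem_after, cell_size):
    """Net volume change (m^3) between two co-registered DEMs on the same
    grid: sum of per-cell elevation differences times the cell area."""
    dz = np.asarray(dem_after, dtype=float) - np.asarray(dem_before, dtype=float)
    return float(np.nansum(dz) * cell_size ** 2)

def extrusion_rate(dem_before, dem_after, cell_size, days):
    """Mean extrusion rate (m^3/day) between two survey dates."""
    return volume_change(dem_before, dem_after, cell_size) / days
```

    NaN cells (cloud or steam shadow) are ignored by nansum, which mirrors how gaps in photogrammetric coverage are typically handled.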

  1. Development of Hydrologic Characterization Technology of Fault Zones: Phase I, 2nd Report

    International Nuclear Information System (INIS)

    This is the year-end report of the 2nd year of the NUMO-LBNL collaborative project: Development of Hydrologic Characterization Technology of Fault Zones under the NUMO-DOE/LBNL collaboration agreement; the task description can be found in Appendix 3. A literature survey of published information on the relationship between the geologic and hydrologic characteristics of faults was conducted. The survey concluded that it may be possible to classify faults by indicators based on various geometric and geologic attributes that may indirectly relate to the hydrologic properties of faults. Analysis of existing information on the Wildcat Fault and its surrounding geology was performed. The Wildcat Fault is thought to be a strike-slip fault with a thrust component that runs along the eastern boundary of the Lawrence Berkeley National Laboratory. It is believed to be part of the Hayward Fault system but is considered inactive. Three trenches were excavated at locations chosen mainly on the basis of past investigative work inside the LBNL property. At least one fault was encountered in all three trenches. Detailed trench mapping was conducted by CRIEPI (Central Research Institute of Electric Power Industry) and LBNL scientists. Some intriguing and puzzling discoveries were made that may contradict previously published work. Predictions are made regarding the hydrologic properties of the Wildcat Fault based on analysis of the fault structure, and preliminary conceptual models of the fault were proposed. The Wildcat Fault appears to have multiple splays, and some low-angle faults may be part of a flower structure. In parallel, surface geophysical investigations were conducted using electrical resistivity surveys and seismic reflection profiling along three lines to the north and south of the LBNL site.
Because of the steep terrain, it was difficult to find optimum locations for the survey lines, as it is desirable for them to be as straight as possible. One interpretation suggests that the Wildcat Fault is westerly dipping, which could imply that it merges with the Hayward Fault at depth. However, due to the complex geology of the Berkeley Hills, multiple interpretations of the geophysical surveys are possible. An effort to construct a 3D GIS model is under way. The model will be used not so much for visualization of the existing data, because only surface data are available thus far, but to investigate possible abutment relations of the buried formations offset by the fault. A 3D model would also be useful for 'what if' scenario testing to aid the selection of borehole drilling locations and configurations. Based on the information available thus far, a preliminary plan for borehole drilling is outlined. The basic strategy is first to drill boreholes on both sides of the fault without penetrating it. Borehole tests will be conducted in these boreholes to estimate the properties of the fault. A slanted borehole may be drilled later to intersect the fault and confirm the findings from the boreholes that do not intersect it. Finally, the lessons learned from conducting the trenching and geophysical surveys are listed. These lessons are expected to be invaluable for NUMO when it conducts preliminary investigations at yet-to-be-selected candidate sites in Japan.

  2. 2nd Radio and Antenna Days of the Indian Ocean (RADIO 2014)

    Science.gov (United States)

    2014-10-01

    It was an honor and a great pleasure for all those involved in its organization to welcome the participants to the ''Radio and Antenna Days of the Indian Ocean'' (RADIO 2014) international conference, held from 7 to 10 April 2014 at the Sugar Beach Resort, Wolmar, Flic-en-Flac, Mauritius. RADIO 2014 is the second in a series of conferences organized in the Indian Ocean region. The aim of the conference is to discuss recent developments, theories and practical applications covering the whole scope of radio-frequency engineering, including radio waves, antennas, propagation, and electromagnetic compatibility. The RADIO international conference emerged following discussions with engineers and scientists from the countries of the Indian Ocean as well as from other parts of the world, and a need was felt for the organization of such an event in this region. Following numerous requests, the island of Mauritius, known worldwide for its white sandy beaches and pleasant tropical atmosphere, was again chosen for the organization of the 2nd RADIO international conference. The conference was organized by the Radio Society, Mauritius, and the Local Organizing Committee consisted of scientists from SUPELEC, France, the University of Mauritius, and the University of Technology, Mauritius. We would like to take the opportunity to thank all the people, institutions and companies that made the event such a success. We are grateful to our gold sponsors CST and FEKO as well as URSI for their generous support, which enabled us to partially support one PhD student and two scientists to attend the conference. We would also like to thank IEEE-APS and URSI for providing technical co-sponsorship. More than one hundred and thirty abstracts were submitted to the conference. They were peer-reviewed by an international scientific committee and, based on the reviews, either accepted (in some cases after revision) or rejected.
RADIO 2014 brought together participants from twenty countries spanning five continents: Australia, Botswana, Brazil, Canada, China, Denmark, France, India, Italy, Mauritius, Poland, Reunion Island, Russia, South Africa, South Korea, Spain, Switzerland, The Netherlands, United Kingdom, and USA. The conference featured eleven oral sessions and one poster session on state-of-the-art research themes. Three internationally recognized scientists delivered keynote speeches during the conference. Prizes for the first and second Best Student Papers were awarded during the closing ceremony. Following the call for extended contributions for publication as a volume in the IOP Conference Series: Materials Science and Engineering (MSE), both online and in print, we received thirty-two full papers. All submitted contributions were then peer-reviewed, revised whenever necessary, and accepted or rejected based on the recommendations of the reviewers of the editorial board. At the end of this procedure, twenty-five were accepted for publication in this volume.

  3. FOREWORD: 2nd International Workshop on New Computational Methods for Inverse Problems (NCMIP 2012)

    Science.gov (United States)

    Blanc-Féraud, Laure; Joubert, Pierre-Yves

    2012-09-01

    Conference logo This volume of Journal of Physics: Conference Series is dedicated to the scientific contributions presented during the 2nd International Workshop on New Computational Methods for Inverse Problems, (NCMIP 2012). This workshop took place at Ecole Normale Supérieure de Cachan, in Cachan, France, on 15 May 2012, at the initiative of Institut Farman. The first edition of NCMIP also took place in Cachan, France, within the scope of the ValueTools Conference, in May 2011 (http://www.ncmip.org/2011/). The NCMIP Workshop focused on recent advances in the resolution of inverse problems. Indeed inverse problems appear in numerous scientific areas such as geophysics, biological and medical imaging, material and structure characterization, electrical, mechanical and civil engineering, and finance. The resolution of inverse problems consists of estimating the parameters of the observed system or structure from data collected by an instrumental sensing or imaging device. Its success firstly requires the collection of relevant observation data. It also requires accurate models describing the physical interactions between the instrumental device and the observed system, as well as the intrinsic properties of the solution itself. Finally, it requires the design of robust, accurate and efficient inversion algorithms. Advanced sensor arrays and imaging devices provide high rate and high volume data; in this context, the efficient resolution of the inverse problem requires the joint development of new models and inversion methods, taking computational and implementation aspects into account. During this one-day workshop, researchers had the opportunity to bring to light and share new techniques and results in the field of inverse problems. 
The topics of the workshop were: algorithms and computational aspects of inversion, Bayesian estimation, kernel methods, learning methods, convex optimization, free discontinuity problems, metamodels, proper orthogonal decomposition, reduced models for the inversion, non-linear inverse scattering, image reconstruction and restoration, applications (bio-medical imaging, non-destructive evaluation etc). NCMIP 2012 was a one-day workshop. Each of the submitted papers was reviewed by 2 to 4 reviewers. Among the accepted papers, there are 8 oral presentations and 5 posters. Three international speakers were invited for a long talk. This second edition attracted 60 registered attendees in May 2012. NCMIP 2012 was supported by Institut Farman (ENS Cachan) and endorsed by the following French research networks (GDR ISIS, GDR Ondes, GDR MOA, GDR MSPC). The program committee acknowledges the following laboratories CMLA, LMT, LSV, LURPA, SATIE, as well as DIGITEO Network. Laure Blanc-Féraud and Pierre-Yves Joubert Workshop Co-chairs Laure Blanc-Féraud, I3S laboratory, CNRS, France Pierre-Yves Joubert, IEF laboratory, Paris-Sud University, CNRS, France Technical Program Committee Alexandre Baussard, ENSTA Bretagne, Lab-STICC, France Marc Bonnet, ENSTA, ParisTech, France Jerôme Darbon, CMLA, ENS Cachan, CNRS, France Oliver Dorn, School of Mathematics, University of Manchester, UK Mário Figueiredo, Instituto Superior Técnico, Lisbon, Portugal Laurent Fribourg, LSV, ENS Cachan, CNRS, France Marc Lambert, L2S Laboratory, CNRS, SupElec, Paris-Sud University, France Anthony Quinn, Trinity College, Dublin, Ireland Christian Rey, LMT, ENS Cachan, CNRS, France Joachim Weickert, Saarland University, Germany Local Chair Alejandro Mottini, Morpheme group I3S-INRIA Sophie Abriet, SATIE, ENS Cachan, CNRS, France Béatrice Bacquet, SATIE, ENS Cachan, CNRS, France Reviewers Gilles Aubert, J-A Dieudonné Laboratory, CNRS and University of Nice-Sophia Antipolis, France Alexandre Baussard, ENSTA 
Bretagne, Lab-STICC, France Laure Blanc-Féraud, I3S laboratory, CNRS, France Marc Bonnet, ENSTA, ParisTech, France Jerôme Darbon, CMLA, ENS Cachan, CNRS, France Oliver Dorn, School of Mathematics, University of Manchester, UK Gérard Favier, I3S laboratory, CNRS, France Mário Figueiredo, Instituto Superior Técnico, Lisb
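    Many of the inversion methods in the workshop's topic list build on regularized least squares. As a generic illustration (not tied to any particular contribution), a minimal Tikhonov solver in Python/NumPy:

```python
import numpy as np

def tikhonov(A, b, alpha):
    """Solve min_x ||Ax - b||^2 + alpha * ||x||^2 via the regularized
    normal equations (A^T A + alpha I) x = A^T b."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ b)
```

    The regularization weight alpha trades data fit against solution norm; choosing it (for instance by the L-curve or the discrepancy principle) is itself a central part of inverse-problem practice.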

  4. Development of Hydrologic Characterization Technology of Fault Zones -- Phase I, 2nd Report

    Energy Technology Data Exchange (ETDEWEB)

    Karasaki, Kenzi; Onishi, Tiemi; Black, Bill; Biraud, Sebastien

    2009-03-31

    This is the year-end report of the 2nd year of the NUMO-LBNL collaborative project: Development of Hydrologic Characterization Technology of Fault Zones under the NUMO-DOE/LBNL collaboration agreement; the task description can be found in Appendix 3. A literature survey of published information on the relationship between the geologic and hydrologic characteristics of faults was conducted. The survey concluded that it may be possible to classify faults by indicators based on various geometric and geologic attributes that may indirectly relate to the hydrologic properties of faults. Analysis of existing information on the Wildcat Fault and its surrounding geology was performed. The Wildcat Fault is thought to be a strike-slip fault with a thrust component that runs along the eastern boundary of the Lawrence Berkeley National Laboratory. It is believed to be part of the Hayward Fault system but is considered inactive. Three trenches were excavated at locations chosen mainly on the basis of past investigative work inside the LBNL property. At least one fault was encountered in all three trenches. Detailed trench mapping was conducted by CRIEPI (Central Research Institute of Electric Power Industry) and LBNL scientists. Some intriguing and puzzling discoveries were made that may contradict previously published work. Predictions are made regarding the hydrologic properties of the Wildcat Fault based on analysis of the fault structure, and preliminary conceptual models of the fault were proposed. The Wildcat Fault appears to have multiple splays, and some low-angle faults may be part of a flower structure. In parallel, surface geophysical investigations were conducted using electrical resistivity surveys and seismic reflection profiling along three lines to the north and south of the LBNL site.
Because of the steep terrain, it was difficult to find optimum locations for the survey lines, as it is desirable for them to be as straight as possible. One interpretation suggests that the Wildcat Fault is westerly dipping, which could imply that it merges with the Hayward Fault at depth. However, due to the complex geology of the Berkeley Hills, multiple interpretations of the geophysical surveys are possible. An effort to construct a 3D GIS model is under way. The model will be used not so much for visualization of the existing data, because only surface data are available thus far, but to investigate possible abutment relations of the buried formations offset by the fault. A 3D model would also be useful for 'what if' scenario testing to aid the selection of borehole drilling locations and configurations. Based on the information available thus far, a preliminary plan for borehole drilling is outlined. The basic strategy is first to drill boreholes on both sides of the fault without penetrating it. Borehole tests will be conducted in these boreholes to estimate the properties of the fault. A slanted borehole may be drilled later to intersect the fault and confirm the findings from the boreholes that do not intersect it. Finally, the lessons learned from conducting the trenching and geophysical surveys are listed. These lessons are expected to be invaluable for NUMO when it conducts preliminary investigations at yet-to-be-selected candidate sites in Japan.

  5. Experiments and Demonstrations in Physics: Bar-Ilan Physics Laboratory (2nd Edition)

    Science.gov (United States)

    Kraftmakher, Yaakov

    2014-08-01

    The following sections are included: * Data-acquisition systems from PASCO * ScienceWorkshop 750 Interface and DataStudio software * 850 Universal Interface and Capstone software * Mass on spring * Torsional pendulum * Hooke's law * Characteristics of DC source * Digital storage oscilloscope * Charging and discharging a capacitor * Charge and energy stored in a capacitor * Speed of sound in air * Lissajous patterns * I-V characteristics * Light bulb * Short time intervals * Temperature measurements * Oersted's great discovery * Magnetic field measurements * Magnetic force * Magnetic braking * Curie's point I * Electric power in AC circuits * Faraday's law of induction I * Self-inductance and mutual inductance * Electromagnetic screening * LCR circuit I * Coupled LCR circuits * Probability functions * Photometric laws * Kirchhoff's rule for thermal radiation * Malus' law * Infrared radiation * Irradiance and illuminance

  6. The whiteStar development project: Westinghouse's next generation core design simulator and core monitoring software to power the nuclear renaissance

    International Nuclear Information System (INIS)

    The WhiteStar project has undertaken the development of the next generation core analysis and monitoring system for Westinghouse Electric Company. This ongoing project focuses on the development of the ANC core simulator, the BEACON core monitoring system and the NEXUS nuclear data generation system. The system contains many functional upgrades to the ANC core simulator and BEACON core monitoring products, as well as the release of the NEXUS family of codes. The NEXUS family of codes is an automated once-through cross-section generation system designed for use in both PWR and BWR applications. ANC is a multi-dimensional nodal code for all nuclear core design calculations at a given condition; it predicts core reactivity, assembly power, rod power, detector thimble flux, and other relevant core characteristics. BEACON is an advanced core monitoring and support system which uses existing instrumentation data in conjunction with an analytical methodology for on-line generation and evaluation of 3D core power distributions. This new system is needed to design and monitor the Westinghouse AP1000 PWR. This paper provides an overview of the software system and the software development methodologies used, as well as some initial results. (authors)

  7. Roles of doping ions in afterglow properties of blue CaAl{sub 2}O{sub 4}:Eu{sup 2+},Nd{sup 3+} phosphors

    Energy Technology Data Exchange (ETDEWEB)

    Wako, A.H., E-mail: wakoah@qwa.ufs.ac.za [Department of Physics, University of the Free State, QwaQwa Campus, Private Bag X13, Phuthaditjhaba 9866 (South Africa); Dejene, B.F. [Department of Physics, University of the Free State, QwaQwa Campus, Private Bag X13, Phuthaditjhaba 9866 (South Africa); Swart, H.C. [Department of Physics, University of the Free State, P.O. Box 339, Bloemfontein 9300 (South Africa)

    2014-04-15

    Eu{sup 2+} doped and Nd{sup 3+} co-doped calcium aluminate (CaAl{sub 2}O{sub 4}:Eu{sup 2+},Nd{sup 3+}) phosphor was prepared by a urea-nitrate solution combustion method at furnace temperatures as low as 500 °C. The produced CaAl{sub 2}O{sub 4}:Eu{sup 2+},Nd{sup 3+} powder was investigated in terms of phase composition, morphology and luminescence by X-ray diffraction (XRD), scanning electron microscopy (SEM), Fourier transform infrared (FTIR) spectroscopy and photoluminescence (PL) techniques. XRD analysis shows a dominant monoclinic phase, indicating no change in the crystalline structure of the phosphor with varying concentrations of Eu{sup 2+} and Nd{sup 3+}. SEM results show agglomerates with non-uniform shapes and sizes, and a number of irregular network structures containing many voids and pores. The energy-dispersive X-ray spectroscopy (EDS) and FTIR spectra confirm the expected chemical components of the phosphor. PL measurements indicated a single broadband excitation spectrum from 200 to 300 nm, centered around 240 nm, corresponding to the crystal-field splitting of the Eu{sup 2+} d-orbital, and an emission spectrum in the blue region with a maximum at 440 nm. This strongly indicates a single dominant luminescence center, Eu{sup 2+}, whose emission arises from transitions between the 4f{sup 7} ground state and the 4f{sup 6}5d{sup 1} excited-state configuration. High concentrations of Eu{sup 2+} and Nd{sup 3+} generally reduce both the intensity and the lifetime of the phosphor powders. The optimized content is 1 mol% for Eu{sup 2+} and 1 mol% for Nd{sup 3+}, giving phosphors with excellent optical properties. The phosphor also emits visible light at around 587 and 616 nm. Such emissions can be ascribed to the {sup 5}D{sub 0}–{sup 7}F{sub 1} and {sup 5}D{sub 0}–{sup 7}F{sub 2} intrinsic transitions of Eu{sup 3+}, respectively.
The decay characteristics exhibit a significant rise in initial intensity with increasing Eu{sup 2+} doping concentration while the decay time increased with Nd{sup 3+} co-doping. The observed afterglow can be ascribed to the generation of suitable traps due to the presence of the Nd{sup 3+} ions.

  8. RECONSTRUCTING THE IDEA OF PRAYER SPACE: A CRITICAL ANALYSIS OF THE TEMPORARY PRAYING PLATFORM PROJECT OF 2ND YEAR ARCHITECTURE STUDENTS IN THE NATIONAL UNIVERSITY OF MALAYSIA (UKM

    Directory of Open Access Journals (Sweden)

    Nangkula Utaberta

    2013-12-01

    Full Text Available Abstract God created humans as caliphs on this earth. Caliph means leader, caretaker and guardian; humans therefore have an obligation to maintain, preserve and conserve nature for future generations. Today we see much damage on Earth caused by human behavior. Islam regards the whole of nature as a place of prayer whose cleanliness and purity must be maintained; as Muslims, we therefore need to preserve nature as we keep our places of prayer. The main objective of this paper is to re-question and re-interpret the idea of sustainability in Islamic architecture through a critical analysis of the first project of 2nd year architecture students of UKM, the "Temporary Praying Platform". The discussion is divided into three (3) main parts. The first part discusses contemporary issues in Islamic architecture, especially the design of mosques; the second part expands the framework of sustainability in Islamic architecture; and the last part analyses a sample of design submissions by 2nd year students of UKM for the temporary praying platform project. It is expected that this paper can start a further discussion on the inner meaning in Islam and how it is implemented in the design of praying spaces in the future. Keywords: Sustainability, Islamic Architecture, Temporary Praying Platform

  9. Virtual Visit to the ATLAS Control Room by 2nd High School of Eleftherio–Kordelio in Thessaloniki

    CERN Multimedia

    2013-01-01

    Our school is the 2nd High School of Eleftherio – Kordelio. It is located in the west suburbs of Thessaloniki, Greece, and our students are between 15 and 17 years old. Thessaloniki is the second largest city in Greece, with a port that plays a major role in trade in the southern Balkans. Our students have heard a great deal about CERN and the great discoveries that have taken place there, and they are really keen on visiting and learning many things about it.

  10. Interaction between Short-Term Heat Pretreatment and Fipronil on 2nd Instar Larvae of Diamondback Moth, Plutella Xylostella (Linn)

    OpenAIRE

    Gu, Xiaojun; Tian, Sufen; Wang, Dehui; Gao, Fei; Wei, Hui

    2010-01-01

    Based on the cooperative virulence index (c.f.) and LC50 of fipronil, the interaction effect between short-term heat pretreatment and fipronil on 2nd instar larvae of diamondback moth (DBM), Plutella xylostella (Linnaeus), was assessed. The results suggested that pretreatment of the tested insects at 30 °C for 2, 4 and 8h could somewhat decrease the toxicity of fipronil at all set concentrations. The LC50 values of fipronil increased after heat pretreatment and c.f. values in all these treat...

  11. Horizontal transmission of hepatitis B virus amongst British 2nd World War soldiers in South-East Asia.

    OpenAIRE

    Gill, G. V.; Bell, D. R.; Vandervelde, E. M.

    1991-01-01

    Infection with hepatitis B virus (HBV) is much more common in tropical than in temperate countries. Visitors to the tropics are thus at risk from HBV, though the degree of risk and the routes of infection involved are uncertain. We report serological markers of HBV in two groups of 2nd World War soldiers who served in the Thai/Burma jungles. The groups comprised 100 ex-prisoners of war of the Japanese (POW) and 100 Burma Campaign Veterans (BCV). Surface antigen of HBV (HBsAg) was positive in 0% o...

  12. New results on formaldehyde: the 2nd International Formaldehyde Science Conference (Madrid, 19-20 April 2012).

    Science.gov (United States)

    Bolt, Hermann M; Morfeld, Peter

    2013-01-01

    The toxicology and epidemiology of formaldehyde were discussed at the 2nd International Formaldehyde Science Conference in Madrid, 19-20 April 2012. It was noted that a substantial amount of new scientific data has appeared in the years since the 1st conference in 2007. Progress has been made in the characterisation of genotoxicity, toxicokinetics, formation of exogenous and endogenous DNA adducts, controlled human studies and epidemiology. Thus, new research results are now at hand to be incorporated into existing evaluations of formaldehyde by official bodies. PMID:23138381

  13. In-vitro comparison of a 1st and a 2nd generation US contrast agent for reflux diagnosis

    International Nuclear Information System (INIS)

    Purpose: Contrast-enhanced sonographic reflux diagnosis, i.e. voiding urosonography (VUS), is gradually becoming an alternative for diagnostic imaging of vesicoureteric reflux (VUR). A limiting factor for the widespread application of VUS is the cost of the US contrast agents. The development of new US contrast agents and the possibility of reducing the administered dose are expected to lower the cost. The aim of this study was an in-vitro comparison of the new US contrast agent (SonoVue®) and the routinely used contrast agent Levovist®, taking into consideration the physical-chemical properties relevant for reflux diagnosis. Materials and Methods: The in-vitro experiment setup simulated in-vivo VUS. The US modalities fundamental and harmonic imaging (THI/ECI, Sonoline Elegra®, Siemens) were utilized, the latter with both low and high mechanical indices (MI). SonoVue® was tested in concentrations of 0.25%, 0.5% and 1% and Levovist® at 5% volume. The in-vitro contrast duration served as the parameter for comparison. This was defined as the time from the start of the experiment until the time when more than 50% of the image area was free of microbubbles. Results: The use of different concentrations of SonoVue® did not have any impact on the contrast duration. The contrast duration of SonoVue® turned out to be significantly longer when the US modality was switched from low to high MI. In the case of THI with high MI, as is routine with Levovist®, the contrast duration of Levovist® at a concentration of 5% was 1.1 min, whereas that of SonoVue® at a concentration of 1% reached 7.3 min. This means that despite SonoVue® being administered at a dose five times lower than that of Levovist®, the in-vitro contrast duration increased by more than 80%. Moreover, a freshly prepared suspension of SonoVue® did not show any change in the contrast duration for nearly 6 hours. In the case of Levovist® there was a significant reduction in the contrast duration after only half an hour. Conclusion: The in-vivo use of SonoVue® is expected to yield a significant dose reduction, so that one vial can be used for more than one examination. A measurable cost reduction can consequently be achieved. (orig.)

  14. 2nd Generation RLV Risk Reduction Definition Program: Pratt & Whitney Propulsion Risk Reduction Requirements Program (TA-3 & TA-4)

    Science.gov (United States)

    Matlock, Steve

    2001-01-01

    This is the final report and addresses all of the work performed on this program. Specifically, it covers vehicle architecture background, definition of six baseline engine cycles, reliability baseline (space shuttle main engine QRAS), and component level reliability/performance/cost for the six baseline cycles, and selection of 3 cycles for further study. This report further addresses technology improvement selection and component level reliability/performance/cost for the three cycles selected for further study, as well as risk reduction plans, and recommendation for future studies.

  15. Comparison of elution efficiency of 99Mo/99mTc generator using theoretical and a free web based software method

    International Nuclear Information System (INIS)

    Full text: A generator is constructed on the principle of the decay-growth relationship between a long-lived parent radionuclide and a short-lived daughter radionuclide. The difference in chemical properties of the daughter and parent radionuclides allows efficient separation of the two. Aim and Objectives: The present study was designed to calculate the elution efficiency of the generator using the traditional formula-based method and a free web based software method. Materials and Methods: A 99Mo/99mTc MON.TEK (Monrol, Gebze) generator and a sterile 0.9% NaCl vial and vacuum vial in a lead shield were used for the elution. A new 99Mo/99mTc generator (calibrated activity 30 GBq), calibrated for Thursday, was received on Monday morning in our department. The generator was placed behind lead bricks in a fume hood. The rubber plugs of both the vacuum and 0.9% NaCl vials were wiped with 70% isopropyl alcohol swabs. The vacuum vial placed inside the lead shield was inserted in the vacuum position; simultaneously the 10 ml NaCl vial was inserted in the second slot. After 1-2 min the vacuum vial was removed without moving the emptied 0.9% NaCl vial. The vacuum slot was covered with another sterile vial to maintain sterility. The RAC was measured in the calibrated dose calibrator (Capintec, 15 CRC). The elution efficiency was calculated theoretically and using free web based software (Apache web server (www.apache.org) and PHP (www.php.net)) on the web site of the Italian Association of Nuclear Medicine and Molecular Imaging (www.aimn.it). Results: The mean elution efficiency calculated by the theoretical method was 93.95% ± 0.61. The mean elution efficiency as calculated by the software was 92.85% ± 0.89. There was no statistical difference between the two methods. Conclusion: The free web based software provides precise and reproducible results and thus saves time and mathematical calculation steps. This enables a rational use of the available activity and also enables a selection of the type and number of procedures to perform in a busy nuclear medicine department.
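
The theoretical method compares the measured eluate activity with the 99mTc activity predicted by the parent-daughter decay-growth (Bateman) relationship. A minimal sketch under stated assumptions: the half-lives and the 87.6% branching fraction are standard nuclear data, but the activity values and elution interval below are hypothetical, and the paper's exact formula may differ.

```python
import math

T_HALF_MO99 = 65.94   # h, Mo-99 half-life (standard nuclear data)
T_HALF_TC99M = 6.01   # h, Tc-99m half-life
BRANCHING = 0.876     # fraction of Mo-99 decays that feed Tc-99m

def expected_tc99m_activity(a_mo_initial, hours_since_last_elution):
    """Tc-99m activity grown in from Mo-99 decay (Bateman equation),
    assuming the previous elution removed all Tc-99m."""
    lam_mo = math.log(2) / T_HALF_MO99
    lam_tc = math.log(2) / T_HALF_TC99M
    return (a_mo_initial * BRANCHING * lam_tc / (lam_tc - lam_mo)
            * (math.exp(-lam_mo * hours_since_last_elution)
               - math.exp(-lam_tc * hours_since_last_elution)))

def elution_efficiency(measured_activity, a_mo_initial, hours):
    """Measured eluate activity as a percentage of the theoretical yield."""
    return 100.0 * measured_activity / expected_tc99m_activity(a_mo_initial, hours)

# Hypothetical example: 30 GBq Mo-99, eluted 24 h after the previous elution,
# with 19 GBq of Tc-99m measured in the dose calibrator.
theoretical = expected_tc99m_activity(30.0, 24.0)
print("expected Tc-99m: %.2f GBq" % theoretical)
print("elution efficiency: %.1f %%" % elution_efficiency(19.0, 30.0, 24.0))
```

With these hypothetical numbers the efficiency comes out in the low nineties, the same range the study reports; the web tool automates exactly this arithmetic.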

  16. Synthetic CO, H2 and HI surveys of the Galactic 2nd Quadrant, and the properties of molecular gas

    CERN Document Server

    Duarte-Cabral, A; Dobbs, C L; Mottram, J C; Gibson, S J; Brunt, C M; Douglas, K A

    2014-01-01

    We present CO, H2, HI and HISA distributions from a set of simulations of grand design spirals including stellar feedback, self-gravity, heating and cooling. We replicate the emission of the 2nd Galactic Quadrant by placing the observer inside the modelled galaxies and post process the simulations using a radiative transfer code, so as to create synthetic observations. We compare the synthetic datacubes to observations of the 2nd Quadrant of the Milky Way to test the ability of the current models to reproduce the basic chemistry of the Galactic ISM, as well as to test how sensitive such galaxy models are to different recipes of chemistry and/or feedback. We find that models which include feedback and self-gravity can reproduce the production of CO with respect to H2 as observed in our Galaxy, as well as the distribution of the material perpendicular to the Galactic plane. While changes in the chemistry/feedback recipes do not have a huge impact on the statistical properties of the chemistry in the simulated g...

  17. Brain order disorder 2nd group report of f-EEG

    Science.gov (United States)

    Lalonde, Francois; Gogtay, Nitin; Giedd, Jay; Vydelingum, Nadarajen; Brown, David; Tran, Binh Q.; Hsu, Charles; Hsu, Ming-Kai; Cha, Jae; Jenkins, Jeffrey; Ma, Lien; Willey, Jefferson; Wu, Jerry; Oh, Kenneth; Landa, Joseph; Lin, C. T.; Jung, T. P.; Makeig, Scott; Morabito, Carlo Francesco; Moon, Qyu; Yamakawa, Takeshi; Lee, Soo-Young; Lee, Jong-Hwan; Szu, Harold H.; Kaur, Balvinder; Byrd, Kenneth; Dang, Karen; Krzywicki, Alan; Familoni, Babajide O.; Larson, Louis; Harkrider, Susan; Krapels, Keith A.; Dai, Liyi

    2014-05-01

    Since the Brain Order Disorder (BOD) group reported on a high-density electroencephalogram (EEG) to capture neuronal information and wirelessly interface with a smartphone [1,2], a larger BOD group has been assembled, including the Obama BRAIN program, the CUA Brain Computer Interface Lab and the UCSD Swartz Computational Neuroscience Center. We can implement the pair-electrode correlation functions in order to operate in a real-time daily environment, at a computational complexity of O(N3) for N=102~3, known as functional f-EEG. The daily monitoring requires two areas of focus. Area #1: quantify the neuronal information flow under arbitrary daily stimulus-response sources. Approach to #1: (i) We have asserted that the sources contained in the EEG signals may be discovered by an unsupervised learning neural network called blind source separation (BSS) of independent entropy components, based on irreversible Boltzmann cellular thermodynamics (ΔS < 0), where the entropy is a degree of uniformity. What is the entropy? Loosely speaking, sand on a beach is more uniform, at a higher entropy value, than the rocks composing a mountain; the internal binding energy tells the paleontologists of the existence of information. To a politician, a landslide voting result carries only the winning information but more entropy, while a non-uniform voting distribution record carries more information. For the human's effortless brain at constant temperature, we can solve for the minimum of the Helmholtz free energy (H = E - TS) by computing BSS, and then their pairwise-entropy source correlation function. (ii) Although the entropy itself is not the information per se, the concurrence of the entropy sources is the information flow as a functional EEG, sketched in this 2nd BOD report. Area #2: applying EEG biofeedback will improve collective decision making (TBD).
Approach to #2: We introduce a novel performance quality metric in terms of the throughput rate of faster (Δt) and more accurate (ΔA) decision making, which applies to individual as well as team brain dynamics. Following Nobel laureate Daniel Kahneman's "Thinking, Fast and Slow", through brainwave biofeedback we can first identify an individual's "anchored cognitive bias sources". This is done in order to remove the biases by means of individually tailored pre-processing. Then the training effectiveness can be maximized by the collective product Δt * ΔA. For Area #1, we compute a spatiotemporally windowed EEG in vitro average using adaptive time-window sampling. The sampling rate depends on the type of neuronal responses, which is what we seek. The averaged traditional EEG measurements are further improved by BSS decomposition into a finer stimulus-response source mixing matrix [A] having finer and faster spatial grids with rapid temporal updates. Then, the functional EEG is the second-order covariance matrix defined as the electrode-pair fluctuation correlation function C(s, s') of independent thermodynamic source components. (1) We define a 1-D space-filling curve as a spiral curve without origin. This pattern is historically known as the Peano-Hilbert arc length a. By taking the most significant bits of the Cartesian product a = O(x * y * z), it represents the arc length numerically, with values that map the 3-D neighborhood proximity into a 1-D neighborhood arc-length representation. (2) The 1-D Fourier coefficient spectrum has no spurious high-frequency content, which typically arises from lexicographical (zig-zag scanning) discontinuity [Hsu & Szu, "Peano-Hilbert curve," SPIE 2014]. A simple Fourier spectrum histogram fits nicely with compressive sensing CRDT mathematics.
(3) A stationary power spectral density is a reasonable approximation of EEG responses in striate layers in resonance feedback loops capable of producing a 100,000-neuron collective impulse response function (IRF). The striate brain-layer architecture represents an ensemble
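
The report defines the functional EEG as a second-order covariance matrix over electrode-pair fluctuations. A minimal pure-Python sketch of such a pairwise correlation matrix on toy signals (the three "channels" below are hypothetical, not real EEG data, and this illustrates only the correlation step, not the BSS decomposition):

```python
import math
import random

def pairwise_correlation(channels):
    """Electrode-pair fluctuation correlation matrix C(s, s'):
    Pearson correlation between every pair of channel time series."""
    n = len(channels)
    means = [sum(ch) / len(ch) for ch in channels]
    devs = [[x - m for x in ch] for ch, m in zip(channels, means)]
    norms = [math.sqrt(sum(d * d for d in dev)) for dev in devs]
    return [[sum(a * b for a, b in zip(devs[i], devs[j])) / (norms[i] * norms[j])
             for j in range(n)] for i in range(n)]

# Hypothetical 3-channel toy "EEG": ch2 is a noisy copy of ch0, ch1 is noise.
random.seed(0)
ch0 = [math.sin(0.1 * t) for t in range(500)]
ch1 = [random.gauss(0, 1) for _ in range(500)]
ch2 = [x + random.gauss(0, 0.1) for x in ch0]

C = pairwise_correlation([ch0, ch1, ch2])
print("corr(ch0, ch2) = %.2f" % C[0][2])   # strongly correlated pair
print("corr(ch0, ch1) = %.2f" % C[0][1])   # independent pair, near zero
```

In the report's scheme the same construction is applied to the independent source components recovered by BSS rather than to raw electrodes.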

  18. Free Open Source Software: FOSS Based GIS for Spatial Retrievals of Appropriate Locations for Ocean Energy Utilizing Electric Power Generation Plants

    Directory of Open Access Journals (Sweden)

    Kohei Arai

    2012-09-01

    Full Text Available Free Open Source Software (FOSS) based Geographic Information System (GIS) for spatial retrieval of appropriate locations for electric power generation plants utilizing ocean wind and tidal motion is proposed. Using scatterometers onboard earth observation satellites, strong-wind coastal areas are retrieved with the FOSS GIS PostgreSQL/PostGIS. PostGIS has to be modified together with the altimeter and scatterometer database. These modifications and the database creation would be a good reference for users who would like to create a GIS system together with a database using FOSS.

  19. Numerical Study of Entropy Generation in the Flameless Oxidation Using Large Eddy Simulation Model and OpenFOAM Software

    OpenAIRE

    Mousavi, Seyed Mahmood

    2014-01-01

    In this paper, in order to investigate non-premixed flameless oxidation in 3D, a large eddy simulation model using OpenFOAM software is applied. In this context, a finite volume discrete ordinates model and a partially stirred reactor are applied to model radiation and combustion, respectively, and the full GRI-2.11 mechanism is used to represent the chemical reactions precisely. The flow field is discretized using the finite volume method, and the PISO algorithm couples the pressure and velocity field...

  20. 2nd International Conference on INformation Systems Design and Intelligent Applications

    CERN Document Server

    Satapathy, Suresh; Sanyal, Manas; Sarkar, Partha; Mukhopadhyay, Anirban

    2015-01-01

    The second international conference on INformation Systems Design and Intelligent Applications (INDIA – 2015) was held in Kalyani, India during January 8-9, 2015. The book covers all aspects of information system design, computer science and technology, general sciences, and educational research. After a double-blind review process, a number of high-quality papers were selected and collected in the book, which is composed of two volumes and covers a variety of topics, including natural language processing, artificial intelligence, security and privacy, communications, wireless and sensor networks, microelectronics, circuits and systems, machine learning, soft computing, mobile computing and applications, cloud computing, software engineering, graphics and image processing, rural engineering, e-commerce, e-governance, business computing, molecular computing, nano computing, chemical computing, intelligent computing for GIS and remote sensing, bio-informatics and bio-computing. These fields are not only ...

  1. Software Testing

    OpenAIRE

    Sarbjeet Singh; Sukhvinder singh; Gurpreet Singh,

    2010-01-01

    Software goes through a cycle of development stages: it is envisioned, created, evaluated, fixed and then put to use. To run any software consistently without failures, bugs or errors, the most important step is to test it. This paper surveys various types of software testing (manual and automated), various software testing techniques such as black-box, white-box, gray-box, sanity and functional testing, and software test life cycle models (the V-model and W-model). This pape...
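
The black-box/white-box distinction the paper surveys can be illustrated on a classic toy function (the triangle classifier is a standard textbook example, not taken from this paper): black-box cases are derived from the specification alone, while white-box cases are chosen to exercise each branch of the implementation.

```python
def classify_triangle(a, b, c):
    """Toy function under test (hypothetical example)."""
    if a + b <= c or a + c <= b or b + c <= a:
        return "invalid"       # sides cannot form a triangle
    if a == b == c:
        return "equilateral"
    if a == b or b == c or a == c:
        return "isosceles"
    return "scalene"

# Black-box cases: derived from the specification only
assert classify_triangle(3, 4, 5) == "scalene"
assert classify_triangle(1, 2, 3) == "invalid"      # degenerate triangle
# White-box cases: chosen to cover each branch of the implementation
assert classify_triangle(2, 2, 2) == "equilateral"
assert classify_triangle(2, 2, 3) == "isosceles"
assert classify_triangle(5, 2, 2) == "invalid"      # b + c <= a branch
print("all cases pass")
```

Gray-box testing, also surveyed in the paper, mixes the two: test inputs come from the specification, but their selection is informed by partial knowledge of the internal structure.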

  2. The MeqTrees software system and its use for third-generation calibration of radio interferometers

    CERN Document Server

    Noordam, Jan E; 10.1051/0004-6361/201015013

    2011-01-01

    The formulation of the radio interferometer measurement equation (RIME) by Hamaker et al. has provided us with an elegant mathematical apparatus for better understanding, simulation and calibration of existing and future instruments. The calibration of the new radio telescopes (LOFAR, SKA) would be unthinkable without the RIME formalism, and new software to exploit it. MeqTrees is designed to implement numerical models such as the RIME, and to solve for arbitrary subsets of their parameters. The technical goal of MeqTrees is to provide a tool for rapid implementation of such models, while offering performance comparable to hand-written code. We are also pursuing the wider goal of increasing the rate of evolution of radio astronomical software, by offering a tool for rapid experimentation and exchange of ideas. MeqTrees is implemented as a Python-based front-end called the meqbrowser, and an efficient (C++-based) computational back-end called the meqserver. Numerical models are defined on the front-end via a P...
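
In its simplest per-baseline form, the RIME that MeqTrees implements reduces to V_pq = J_p B J_q^H, where J_p and J_q are 2x2 Jones matrices for the two antennas and B is the source brightness matrix. A toy sketch of that single equation with hypothetical diagonal gain terms (this is not MeqTrees code, which defines such models as expression trees evaluated by the meqserver):

```python
def matmul2(A, B):
    """2x2 complex matrix product."""
    return [[A[0][0]*B[0][0] + A[0][1]*B[1][0], A[0][0]*B[0][1] + A[0][1]*B[1][1]],
            [A[1][0]*B[0][0] + A[1][1]*B[1][0], A[1][0]*B[0][1] + A[1][1]*B[1][1]]]

def hermitian(A):
    """Conjugate transpose of a 2x2 complex matrix."""
    return [[A[0][0].conjugate(), A[1][0].conjugate()],
            [A[0][1].conjugate(), A[1][1].conjugate()]]

def rime_visibility(J_p, B, J_q):
    """Per-baseline RIME: V_pq = J_p * B * J_q^H."""
    return matmul2(matmul2(J_p, B), hermitian(J_q))

# Hypothetical values: unpolarized 1 Jy point source, per-antenna complex gains.
B = [[1.0 + 0j, 0j], [0j, 1.0 + 0j]]           # source brightness matrix
J_p = [[0.9 + 0.1j, 0j], [0j, 0.8 - 0.05j]]    # antenna p gain Jones term
J_q = [[1.1 - 0.2j, 0j], [0j, 0.95 + 0.1j]]    # antenna q gain Jones term

V = rime_visibility(J_p, B, J_q)
print(V[0][0])   # XX visibility, corrupted by both antenna gains
```

Calibration is the inverse problem: given observed V_pq for many baselines, solve for the Jones terms, which is what MeqTrees does for arbitrary subsets of model parameters.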

  3. Generating statements at whole-body imaging with a workflow-optimized software tool - first experiences with multireader analysis

    International Nuclear Information System (INIS)

    Introduction: Due to technical innovations in sectional imaging methods, whole-body imaging has increased in importance for clinical radiology, particularly for the diagnosis of systemic tumor disease. Large numbers of images have to be evaluated in increasingly shorter time periods. The aim was to create and evaluate a new software tool to assist and automate the process of diagnosing whole-body datasets. Material and Methods: Thirteen whole-body datasets were evaluated by 3 readers using the conventional system and the new software tool. The times for loading the datasets, examining 5 different regions (head, neck, thorax, abdomen and pelvis/skeletal system) and retrieving a relevant finding for demonstration were acquired, and a Student's t-test was performed. For qualitative analysis the 3 readers used a scale from 0 - 4 (0 = bad, 4 = very good) to assess dataset loading convenience, lesion location assistance, and ease of use, and a kappa value was calculated. Results: The average loading time was 39.7 s (± 5.5) with the conventional system and 6.5 s (± 1.4) (p 0.9). The qualitative analysis showed a significant advantage with respect to convenience (p 0.9). (orig.)
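
The timing comparison above rests on a paired Student's t-test, since each dataset is read with both systems. A minimal sketch with hypothetical timings (the study's measured values differ):

```python
import math

def paired_t(xs, ys):
    """Paired Student's t statistic for per-case timings measured under
    two conditions; returns (t value, degrees of freedom)."""
    diffs = [x - y for x, y in zip(xs, ys)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)   # sample variance
    return mean / math.sqrt(var / n), n - 1

# Hypothetical loading times (s): conventional system vs. new tool, 5 datasets
conventional = [41.2, 38.5, 44.0, 37.9, 40.1]
new_tool = [6.1, 7.2, 5.8, 6.9, 6.4]

t, dof = paired_t(conventional, new_tool)
print("t = %.1f with %d degrees of freedom" % (t, dof))
```

A t value this large against so few degrees of freedom corresponds to a vanishingly small p, which is the shape of result the study reports for loading times.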

  4. A summary of the 2nd workshop on Human Resources Development (HRD) in the nuclear field in Asia. FY2000

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-06-01

    The Human Resources Development (HRD) Project was added in 1999 as a cooperation activity of 'the Forum for Nuclear Cooperation in Asia (FNCA)', which is organized by the Nuclear Committee. The HRD Project supports solidifying the foundation of nuclear development and utilization in Asia by promoting human resources development in Asian countries. The principal activity of the HRD Project is to hold the Workshop on Human Resources Development in the Nuclear Field in Asia once a year. The objective of the Workshop is to clarify the problems and needs of human resources development in each country and to provide mutual support by exchanging information. The report consists of a summary of the 2nd Workshop on Human Resources Development in the Nuclear Field in Asia, held on November 27 and 28, 2000 at the Tokai Research Establishment of JAERI. (author)

  5. CaF2:Mn thermoluminescence: a single glow peak not described by 1st or 2nd order kinetics

    International Nuclear Information System (INIS)

    The thermoluminescence (TL) of CaF2:Mn has been studied using photon counting and digital recording. For doses of 10 rad or less the TL glow curves appear to consist of a single glow peak. However, there are indications - which are pronounced at larger doses - that one additional low intensity peak (area less than or equal to one percent) is superimposed on each side of the central peak. The intense peak is not described by 1st or 2nd order kinetics but is well described by the more general kinetics from which these kinetics are derived. These observations, and the results of additional kinetic analysis, demonstrate that retrapping is not negligible and may include all three peaks. In such systems, which are likely to include other dosimeter materials and minerals, peak height will not increase linearly with dose; an important factor for dosimetry and dating applications

  6. CaF2:Mn thermoluminescence: a single glow peak not described by 1st or 2nd order kinetics

    International Nuclear Information System (INIS)

    The thermoluminescence (TL) of CaF2:Mn has been studied using photon counting and digital recording. For doses of 10 rad or less the TL glow curves appear to consist of a single glow peak. However, there are indications - which are pronounced at larger doses - that one additional low intensity peak (area <= 1%) is superimposed on each side of the central peak. The intense peak is not described by 1st or 2nd order kinetics but is well described by the more general kinetics from which these kinetics are derived. These observations, and the results of additional kinetic analysis, demonstrate that retrapping is not negligible and may include all three peaks. In such systems, which are likely to include other dosimeter materials and minerals, peak height will not increase linearly with dose: an important factor for dosimetry and dating applications. (author)

  7. Methods for designing algorithms of electric equipment protection in nuclear power plants at the 2nd level of protection

    International Nuclear Information System (INIS)

    The questions of equipment protection systems in nuclear power plants are discussed as part of the automated process control systems in these plants. The electrical equipment protection system is discussed with respect to protection algorithms at the 2nd protection level. Three methods can be applied in designing the algorithms or in investigating the operating condition of the equipment in a two-value state space. The characteristics, benefits and constraints of the methods are analyzed in detail. The methods are the decision table method, the failure weight method and the failure tree method. A comparison shows that for the above purpose, the decision table method and the failure weight method are most suitable; the decision table method is also most suitable for designing the power part of the protection algorithm. (Z.M.). 3 figs., 1 tab., 2 refs

  8. A summary of the 2nd workshop on Human Resources Development (HRD) in the nuclear field in Asia. FY2000

    International Nuclear Information System (INIS)

    The Human Resources Development (HRD) Project was added in 1999 as a cooperation activity of 'the Forum for Nuclear Cooperation in Asia (FNCA)', which is organized by the Nuclear Committee. The HRD Project supports solidifying the foundation of nuclear development and utilization in Asia by promoting human resources development in Asian countries. The principal activity of the HRD Project is to hold the Workshop on Human Resources Development in the Nuclear Field in Asia once a year. The objective of the Workshop is to clarify the problems and needs of human resources development in each country and to provide mutual support by exchanging information. The report consists of a summary of the 2nd Workshop on Human Resources Development in the Nuclear Field in Asia, held on November 27 and 28, 2000 at the Tokai Research Establishment of JAERI. (author)

  9. Post-flight BET products for the 2nd discovery entry, STS-19 (51-A)

    Science.gov (United States)

    Kelly, G. M.; Mcconnell, J. G.; Heck, M. L.; Troutman, P. A.; Waters, L. A.; Findlay, J. T.

    1985-01-01

    The post-flight products for the second Discovery flight, STS-19 (51-A), are summarized. The inertial best estimate trajectory (BET), BT19D19/UN=169750N, was developed using spacecraft dynamic measurements from Inertial Measurement Unit 2 (IMU2) in conjunction with the best tracking coverage available for any of the earlier Shuttle entries. As a consequence of the latter, an anchor epoch was selected which conforms to an initial altitude of greater than a million feet. The Extended BET, ST19BET/UN=274885C, incorporated the previously mentioned inertial reconstructed state information and the Langley Atmospheric Information Retrieval System (LAIRS) atmosphere, ST19MET/UN=712662N, with some minor exceptions. Primary and back-up AEROBET reels are NK0165 and NK0201, respectively. This product was only developed over the lowermost 360 kft altitude range due to atmosphere problems but this relates to altitudes well above meaningful signal in the IMUs. Summary results generated from the AEROBET for this flight are presented with meaningful configuration and statistical comparisons from the previous thirteen flights. Modified maximum likelihood estimation (MMLE) files were generated based on IMU2 and the Rate Gyro Assembly/Accelerometer Assembly (RGA/AA), respectively. Appendices attached define spacecraft and physical constants utilized, show plots of the final tracking data residuals from the post-flight fit, list relevant parameters from the BET at a two second spacing, and retain for archival purpose all relevant input and output tapes and files generated.

  10. Fractally Generated Microstrip Bandpass Filter Designs Based on Dual-Mode Square Ring Resonator for Wireless Communication Systems

    Directory of Open Access Journals (Sweden)

    Jawad K. Ali

    2008-01-01

    Full Text Available A novel fractal design scheme is introduced in this paper to generate microstrip bandpass filter designs with miniaturized sizes for wireless applications. The presented fractal scheme is based on a Minkowski-like prefractal geometry. The space-filling property and self-similarity of this fractal geometry have been found to produce reduced-size symmetrical structures corresponding to the successive iteration levels. The resulting filter designs have sizes suitable for use in modern wireless communication systems. The performance of each of the generated bandpass filter structures up to the 2nd iteration has been analyzed using the method of moments (MoM) based software IE3D, which is widely adopted in microwave research and industry. Results show that these filters possess good transmission and return loss characteristics, besides miniaturized sizes, meeting the design specifications of most wireless communication systems.
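
The paper does not reproduce its exact generator, but the classic Minkowski-sausage prefractal illustrates the space-filling property it exploits: each iteration doubles the conductor length while the occupied span stays fixed, which is what lets a resonator of a given electrical length shrink physically. A sketch (the L-system rule and lengths are illustrative assumptions, not the paper's geometry):

```python
import math

RULE = "F+F-F-FF+F+F-F"   # Minkowski sausage generator, 90-degree turns

def expand(axiom, depth):
    """Apply the segment-replacement rule 'depth' times."""
    s = axiom
    for _ in range(depth):
        s = "".join(RULE if c == "F" else c for c in s)
    return s

def trace(s, depth):
    """Turtle-trace the L-system string; '+' = left 90, '-' = right 90.
    Segment length shrinks 4x per iteration so the end-to-end span is fixed."""
    step = 0.25 ** depth
    x = y = heading = length = 0.0
    for c in s:
        if c == "F":
            x += step * math.cos(heading)
            y += step * math.sin(heading)
            length += step
        elif c == "+":
            heading += math.pi / 2
        elif c == "-":
            heading -= math.pi / 2
    return (x, y), length

for depth in range(3):
    end, length = trace(expand("F", depth), depth)
    print("iteration %d: endpoint (%.2f, %.2f), perimeter %.2f"
          % (depth, end[0], end[1], length))
```

The endpoint stays at (1, 0) at every iteration while the perimeter doubles each time, mirroring how successive prefractal iterations pack more resonator length into the same footprint.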

  11. Operation of Finnish nuclear power plants. Quarterly report, 2nd quarter 1998

    International Nuclear Information System (INIS)

    Quarterly reports on the operation of Finnish NPPs describe events and observations relating to nuclear and radiation safety that the Radiation and Nuclear Safety Authority (STUK) considers safety significant. Safety improvements at the plants are also described. The report includes a summary of the radiation safety of plant personnel and the environment and tabulated data on the plants' production and load factors. The Loviisa plant units were in power operation for the whole second quarter of 1998. The Olkiluoto units discontinued electricity generation for annual maintenance and also briefly for tests pertaining to the power upratings of the units. In addition, there were breaks in power generation at Olkiluoto 2 due to a low electricity demand in Midsummer and turbine balancing. The Olkiluoto units were in power operation in this quarter with the exception of the aforementioned breaks. The load factor average of the four plant units was 87.7%. The events in this quarter had no bearing on the nuclear or radiation safety. Occupational doses and radioactive releases off-site were below authorised limits. Radioactive substances were measurable in samples collected around the plants in such quantities only as have no bearing on the radiation exposure of the population. (orig.)

  12. Operation of Finnish nuclear power plants. Quarterly report, 2nd quarter 1998

    Energy Technology Data Exchange (ETDEWEB)

    Tossavainen, K. [ed.]

    1999-01-01

    Quarterly reports on the operation of Finnish NPPs describe events and observations relating to nuclear and radiation safety that the Radiation and Nuclear Safety Authority (STUK) considers safety significant. Safety improvements at the plants are also described. The report includes a summary of the radiation safety of plant personnel and the environment and tabulated data on the plants' production and load factors. The Loviisa plant units were in power operation for the whole second quarter of 1998. The Olkiluoto units discontinued electricity generation for annual maintenance and also briefly for tests pertaining to the power upratings of the units. In addition, there were breaks in power generation at Olkiluoto 2 due to a low electricity demand in Midsummer and turbine balancing. The Olkiluoto units were in power operation in this quarter with the exception of the aforementioned breaks. The load factor average of the four plant units was 87.7%. The events in this quarter had no bearing on the nuclear or radiation safety. Occupational doses and radioactive releases off-site were below authorised limits. Radioactive substances were measurable in samples collected around the plants in such quantities only as have no bearing on the radiation exposure of the population. (orig.)

  13. Space Software

    Science.gov (United States)

    1990-01-01

    Xontech, Inc.'s software package, XonVu, simulates the missions of Voyager 1 at Jupiter and Saturn, Voyager 2 at Jupiter, Saturn, Uranus and Neptune, and Giotto in close encounter with Comet Halley. With the program, the user can generate scenes of the planets, moons, stars or Halley's nucleus and tail as seen by Giotto, all graphically reproduced with high accuracy in wireframe representation. Program can be used on a wide range of computers, including PCs. User friendly and interactive, with many options, XonVu can be used by a space novice or a professional astronomer. With a companion user's manual, it sells for $79.

  14. The 2nd DBCLS BioHackathon: interoperable bioinformatics Web services for integrated applications

    Directory of Open Access Journals (Sweden)

    Katayama Toshiaki

    2011-08-01

    Full Text Available Abstract Background The interaction between biological researchers and the bioinformatics tools they use is still hampered by incomplete interoperability between such tools. To ensure interoperability initiatives are effectively deployed, end-user applications need to be aware of, and support, best practices and standards. Here, we report on an initiative in which software developers and genome biologists came together to explore and raise awareness of these issues: BioHackathon 2009. Results Developers in attendance came from diverse backgrounds, with experts in Web services, workflow tools, text mining and visualization. Genome biologists provided expertise and exemplar data from the domains of sequence and pathway analysis and glyco-informatics. One goal of the meeting was to evaluate the ability to address real world use cases in these domains using the tools that the developers represented. This resulted in (i) a workflow to annotate 100,000 sequences from an invertebrate species; (ii) an integrated system for analysis of the transcription factor binding sites (TFBSs) enriched based on differential gene expression data obtained from a microarray experiment; (iii) a workflow to enumerate putative physical protein interactions among enzymes in a metabolic pathway using protein structure data; (iv) a workflow to analyze glyco-gene-related diseases by searching for human homologs of glyco-genes in other species, such as fruit flies, and retrieving their phenotype-annotated SNPs. Conclusions Beyond deriving prototype solutions for each use-case, a second major purpose of the BioHackathon was to highlight areas of insufficiency. We discuss the issues raised by our exploration of the problem/solution space, concluding that there are still problems with the way Web services are modeled and annotated, including: (i) the absence of several useful data or analysis functions in the Web service "space"; (ii) the lack of documentation of methods; (iii) lack of compliance with the SOAP/WSDL specification among and between various programming-language libraries; and (iv) incompatibility between various bioinformatics data formats. Although it was still difficult to solve real world problems posed to the developers by the biological researchers in attendance because of these problems, we note the promise of addressing these issues within a semantic framework.

  15. Operation of Finnish nuclear power plants. Quarterly report, 2nd quarter 1996

    Energy Technology Data Exchange (ETDEWEB)

    Sillanpaeae, T. [ed.]

    1996-11-01

    Quarterly Reports on the operation of Finnish nuclear power plants describe events and observations relating to nuclear and radiation safety which the Finnish Centre for Radiation and Nuclear Safety (STUK) considers safety significant. Safety improvements at the plants are also described. The report also includes a summary of the radiation safety of plant personnel and of the environment and tabulated data on the plants' production and load factors. In the second quarter of 1996, the Finnish nuclear power plant units were in power operation except for the annual maintenance outages of TVO plant units and the Midsummer shutdown at TVO II which was due to low electricity demand, a turbine generator inspection and repairs. The load factor average of all plant units was 88.9%. Events in the second quarter of 1996 were classified level 0 on the International Nuclear Event Scale (INES).

  16. Development of a radioactive waste treatment equipment utilizing microwave heating, 2nd report

    International Nuclear Information System (INIS)

    The objective of the present study is to establish an incineration technique utilizing microwave heating which enables a high volume reduction of spent ion-exchange resins and filtering media generated at nuclear facilities. The three years from 1982 to 1985, with financial aid from the Agency of Science and Technology, brought great and rapid progress to this project when the heating technique was switched from direct microwave heating to indirect heating by employing a bed of silicon carbide beads. This material was also used to build a secondary furnace, walls and roster bars, to treat the obnoxious gases and soot arising in the primary incineration process by the radiating heat of this material heated above 1000 deg C, again by microwave energy rather than by the originally applied direct plasma torch combustion. The incinerator and the secondary furnace were integrated into one unit as the principal treating equipment. This novel approach made possible a well stabilized continuous incineration operation. Further, developmental efforts toward industrial applications were made by setting up a pilot plant with microwave generators, 2 sets of 5 kW at 2450 MHz and 1 set of 25 kW at 915 MHz, and tests were carried out which proved a remarkably high volume reduction capability, well above roughly 200 on a weight basis. For hot test runs, a one-tenth scale pilot test setup was installed at the Tokai Laboratory of the Japan Atomic Energy Research Institute and tested with materials spiked with radioisotopes and also with spent ion-exchange resins stored there. Very satisfactory results were obtained in these proving tests, showing the efficient capability of high volume reduction treatment of otherwise stable radioactive waste materials such as spent ion-exchange resins. (author)

  17. Software testing concepts and operations

    CERN Document Server

    Mili, Ali

    2015-01-01

    Explores and identifies the main issues, concepts, principles and evolution of software testing, including software quality engineering and testing concepts, test data generation, test deployment analysis, and software test management. This book examines the principles, concepts, and processes that are fundamental to the software testing function. This book is divided into five broad parts. Part I introduces software testing in the broader context of software engineering and explores the qualities that testing aims to achieve or ascertain, as well as the lifecycle of software testing. Part II c

  18. Software Engineering

    Science.gov (United States)

    Dr Gene Tagliarini

    CSC 450. Software Engineering (3) Prerequisite: CSC 332 and senior standing. Study of the design and production of large and small software systems. Topics include systems engineering, software life-cycle and characterization; use of software tools. Substantial software project required.

  19. Optimal Planning of an Off-grid Electricity Generation with Renewable Energy Resources using the HOMER Software

    Directory of Open Access Journals (Sweden)

    Hossein Shahinzadeh

    2015-03-01

    Full Text Available In recent years, several factors, such as the environmental pollution caused by fossil fuels and the diseases it produces on the one hand, and concerns about dwindling fossil fuel supplies and the price fluctuations of their products and the resulting effects on the economy on the other, have led most countries to seek alternative energy sources to supplement fossil fuels. By 2006, about 18% of the energy consumed in the world was obtained from renewable energies. Iran is among the countries that are geographically located in hot and dry areas and receives substantial sun exposure in different months of the year; except on the coasts of the Caspian Sea, the percentage of sunny days throughout the year is between 63 and 98 percent. On the other hand, there are dispersed and remote areas with loads far from the national grid, for which it is impossible to provide electrical energy through transmission from the national grid; for such cases, renewable energy technologies can be used to supply the energy. In this paper, a technical and economic feasibility study of using renewable energies in a grid-independent system for a dispersed load on the outskirts of Isfahan (Sepahan), with a maximum energy consumption of 3 kWh per day, is presented. The HOMER simulation software is used as the optimization tool.
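
The optimization HOMER performs is, at its core, a search over component sizes ranked by lifetime cost. A minimal stand-in for that search (daily energy balance only, with invented cost and yield figures; HOMER itself simulates hourly dispatch and discounts cash flows) might look like:

```python
# Minimal sketch of the search HOMER automates: enumerate candidate
# PV-array / battery-bank sizes, reject combinations that cannot serve
# the ~3 kWh/day load, and rank the rest by net present cost (NPC).
# All component costs and yields below are hypothetical placeholders.

DAILY_LOAD_KWH = 3.0   # peak daily consumption from the study
PV_YIELD = 5.0         # kWh produced per kW of PV per day (sunny site)
AUTONOMY_DAYS = 2      # battery must cover two sunless days
LIFETIME_YEARS = 20

def npc(pv_kw, batt_kwh,
        pv_cost=1200.0, batt_cost=300.0, om_per_year=50.0):
    """Net present cost: capital plus (undiscounted, for simplicity) O&M."""
    capital = pv_kw * pv_cost + batt_kwh * batt_cost
    return capital + om_per_year * LIFETIME_YEARS

def feasible(pv_kw, batt_kwh):
    """Can this configuration meet the load and the autonomy requirement?"""
    return (pv_kw * PV_YIELD >= DAILY_LOAD_KWH and
            batt_kwh >= DAILY_LOAD_KWH * AUTONOMY_DAYS)

def best_configuration():
    """Brute-force scan of a small sizing grid, cheapest feasible pair wins."""
    candidates = [(0.1 * i, 1.0 * j) for i in range(1, 31) for j in range(1, 31)]
    viable = [c for c in candidates if feasible(*c)]
    return min(viable, key=lambda c: npc(*c))

pv, batt = best_configuration()
```

Here the cheapest feasible pair is simply the smallest PV array and battery bank satisfying both constraints, since cost grows monotonically with size; real HOMER runs add diesel backup, fuel prices and hourly resource data to the same kind of search.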

  20. Report of 2nd workshop on particle process. A report of the Yayoi study meeting

    International Nuclear Information System (INIS)

    In the Nuclear Engineering Research Laboratory, Faculty of Engineering, University of Tokyo, a short term research programme named the Yayoi Research Group, a joint application research work of the nuclear reactor (Yayoi) and the electron linac in Japan, has been held more than 10 times a year. This report collects the summaries of 'Research on Particle Methods', one of these meetings, held on August 7, 1996. The 'particle methods' discussed here, which describe and calculate fluids and powders as a group of particles, are more suitable than conventional lattice-based methods for treating problems with boundary surfaces and large deformations of the fluids, and further development is expected. This report contains the following studies: 1) Stress analysis without the need for element breakdown, 2) Local interpolation differential operator method and unstructured lattices, 3) Self-organized simulation of dynamical construction, 4) A lattice BGK solution of laminar flow over a backward-facing step, 5) Numerical analysis of solid-gas two-phase flow using the discrete element method, 6) Application of flow analysis techniques to power generation plant equipment, 7) Collision wave capturing flow calculation using the particle method, and 8) Analysis of complex thermal flow problems using the particle (MPS) method. (G.K.)
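
Items 5 and 8 in the list describe particle-based discretizations. As a hedged illustration of the idea (a textbook one-dimensional discrete element model with invented parameters, not code from the workshop), two particles interacting through a linear spring contact can be advanced by explicit time stepping:

```python
# Minimal 1-D discrete element method (DEM) sketch: two particles
# approach, collide through a linear spring contact force, and separate.
# Parameters (stiffness, mass, radius) are illustrative, not from the report.

K = 1000.0   # contact stiffness [N/m]
M = 1.0      # particle mass [kg]
R = 0.5      # particle radius [m]
DT = 1e-4    # explicit time step [s]

def step(x, v):
    """One explicit-Euler DEM step for two particles on a line."""
    overlap = 2 * R - (x[1] - x[0])          # positive while in contact
    f = K * overlap if overlap > 0 else 0.0  # repulsive spring force
    a = [-f / M, f / M]                      # equal and opposite
    v = [v[i] + a[i] * DT for i in range(2)]
    x = [x[i] + v[i] * DT for i in range(2)]
    return x, v

# Head-on approach at +/-1 m/s; for equal masses and an elastic contact
# the velocities should (approximately) exchange after the collision.
x, v = [0.0, 2.0], [1.0, -1.0]
for _ in range(20000):
    x, v = step(x, v)
```

A production DEM code adds damping, friction and neighbour search over many particles, but the update loop keeps this shape.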

  1. Promoting concrete algorithm for implementation in computer system and data movement in terms of software reuse to generate actual values suitable for different access

    Directory of Open Access Journals (Sweden)

    Nderim Zeqiri

    2013-04-01

    Full Text Available The construction of functional algorithms through well-structured programming opens new routes and at the same time increases their applicability in mechatronics systems with specific reliability requirements for any practical implementation; it is also justified in the economic context and in terms of maintenance, making the system more stable. This flexibility is a real opportunity for a new approach, and it makes the program code easy to update. In many cases a quick access method is needed, which is specified in the context of generating appropriate values for digital systems. This opens a new space for better management of the respective values of a program code and for software reuse, because such a solution reduces costs and has a positive effect in terms of the digital economy.

  2. GENERATION OF A VECTOR OF NODAL FORCES PRODUCED BY LOADS PRE-SET BY THE ARBITRARY SCULPTED SURFACE DESIGNATED FOR UNIVERSAL STRESS ANALYSIS SOFTWARE

    Directory of Open Access Journals (Sweden)

    Shaposhnikov Nikolay Nikolaevich

    2012-10-01

    A user may select the surface accommodating any simulated arbitrary load; further, the point of the pre-set load intensity, specified in the Distributed Load Q field of the interface window Distributed Loads, and the point of zero load intensity are to be specified. The above source data are used to calculate the scale coefficient of transition from linear distances to the real value of the load intensity generated within the coordinate surface. The point of zero load intensity represents a virtual plane of zero distributed load values. The proposed software, designated for the conversion of arbitrary distributed loads into nodal loads, is compact; therefore, it may be integrated into modules capable of exporting the nodal load into other systems of strength analysis, while functioning as a problem-oriented geometrical utility of AutoCAD.
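
The core of such a conversion is the consistent-load integral F_i = ∫ N_i q dA over each element. As a hedged illustration (a generic finite element identity, not the AutoCAD utility itself), for a 3-node triangle with linear shape functions and nodal pressure values q_i this integral reduces to F_i = (A/12)(2q_i + q_j + q_k):

```python
# Consistent nodal forces for a linearly varying pressure over a 3-node
# triangle: F_i = integral(N_i * q) dA = (A/12) * (2*q_i + q_j + q_k).
# Generic FE identity used for distributed-to-nodal load conversion.

def triangle_area(p0, p1, p2):
    """Area of a triangle given 2-D vertex coordinates."""
    (x0, y0), (x1, y1), (x2, y2) = p0, p1, p2
    return abs((x1 - x0) * (y2 - y0) - (x2 - x0) * (y1 - y0)) / 2.0

def nodal_forces(pts, q):
    """Equivalent nodal forces for nodal pressure values q = (q0, q1, q2)."""
    a = triangle_area(*pts)
    return [a / 12.0 * (2 * q[i] + q[(i + 1) % 3] + q[(i + 2) % 3])
            for i in range(3)]

# Unit right triangle under uniform unit pressure: each node carries A/3.
f = nodal_forces([(0, 0), (1, 0), (0, 1)], (1.0, 1.0, 1.0))
```

Summing the nodal forces always recovers the total applied load A times the mean pressure, which is a convenient sanity check for any such exporter.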

  3. Design of control system for the 2nd and 3rd charge exchange system in J-PARC 3GeV RCS

    International Nuclear Information System (INIS)

    The J-PARC 3 GeV Synchrotron uses a charge exchange injection method with three carbon foils. To achieve this injection, three charge exchange devices are installed in the facility, controlled by a single control system. The 2nd and 3rd charge exchange devices are being upgraded to increase the maintainability and the pumping capability of the vacuum unit, and the control system has been reconsidered. The basic policy of the redesign is to separate these devices from the centralized control system of the three devices and to reconstruct a control system that is independent of it. On this basis, we are upgrading the 2nd and 3rd charge exchange devices. Because the control becomes stand-alone, it is necessary to redesign the interlock unit for safety. At present, the error signals of the three devices are consolidated into one signal that operates the Machine Protection System (MPS); therefore, a long time was needed to find where an error originated. Since the MPS will now be operated by the error signal of each individual unit, we expect the origin of an error to be found more easily. The 2nd and 3rd charge exchange units adopt a simple control system using the Yokogawa Electric PLC FA-M3. We are designing a control system with safety that integrates the drive unit and the vacuum unit. This report describes the design of the 2nd and 3rd charge exchange unit control system with the reconstructed hardware. (author)
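
The point of the redesign, per-device latched error signals instead of one consolidated line, can be sketched as a tiny interlock model (device names and the signal model below are illustrative only, not the actual PLC logic):

```python
# Sketch of the interlock scheme described above: each charge-exchange
# unit raises its own latched MPS request, so the MPS still trips on any
# fault but the faulting device remains identifiable afterwards.

class UnitInterlock:
    def __init__(self, name):
        self.name = name
        self.latched = False          # stays set until explicitly reset

    def update(self, error_signal):
        """Latch on a rising error signal; clearing the signal keeps the latch."""
        if error_signal:
            self.latched = True

    def reset(self):
        self.latched = False

def mps_trip(units):
    """MPS fires if any unit is latched; also report which ones."""
    faulted = [u.name for u in units if u.latched]
    return bool(faulted), faulted

units = [UnitInterlock(n) for n in ("CE1", "CE2", "CE3")]
units[1].update(True)    # transient fault on the 2nd unit
units[1].update(False)   # signal clears, but the latch holds
tripped, faulted = mps_trip(units)
```

With the consolidated scheme only the boolean `tripped` would survive; keeping per-unit latches preserves the `faulted` list for diagnosis.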

  4. The 2nd International Conference on Nuclear Physics in Astrophysics Refereed and selected contributions Debrecen, Hungary May 16–20, 2005

    CERN Document Server

    Fülöp, Zsolt; Somorjai, Endre

    2006-01-01

    Launched in 2004, "Nuclear Physics in Astrophysics" has established itself as a successful topical conference series addressing the forefront of research in the field. This volume contains the selected and refereed papers of the 2nd conference, held in Debrecen in 2005 and reprinted from "The European Physical Journal A - Hadrons and Nuclei".

  5. The 2008—2013 Crisis as Metastasis. A Preview of the 2nd edition of The Cancer Stage of Capitalism by Pluto Press

    OpenAIRE

    John McMurtry

    2013-01-01

    By means of selection of relevant excerpts, a preview is offered hereby of the 2nd edition of John McMurtry's prophetic 1999 book "The Cancer Stage of Capitalism", published by Pluto Press, and entitled "The Cancer Stage of Capitalism and Its Cure"

  6. The 2008—2013 Crisis as Metastasis. A Preview of the 2nd edition of The Cancer Stage of Capitalism by Pluto Press

    Directory of Open Access Journals (Sweden)

    John McMurtry

    2013-03-01

    Full Text Available By means of selection of relevant excerpts, a preview is offered hereby of the 2nd edition of John McMurtry's prophetic 1999 book "The Cancer Stage of Capitalism", published by Pluto Press, and entitled "The Cancer Stage of Capitalism and Its Cure"

  7. Incretinas, incretinomiméticos, inhibidores de DPP IV: (2ª parte) / Incretins, Incretinmimetics, Inhibitors (2nd part)

    Directory of Open Access Journals (Sweden)

    Claudia Bayón

    2010-09-01

    Full Text Available En los últimos años se reconoce un nuevo mecanismo involucrado en la fisiopatología de la Diabetes Mellitus tipo 2: el déficit de producción y/o acción de las incretinas. Las incretinas son enterohormonas que estimulan la secreción de insulina en respuesta a la ingesta de nutrientes. Glucagon like peptide-1 (GLP-1) y Polipéptido insulinotrópico glucosa dependiente (GIP) son las principales incretinas descubiertas hasta hoy. Ambas presentan también efecto trófico sobre las células beta de los islotes pancreáticos. GLP-1 presenta otras acciones como son la inhibición de la secreción de glucagón, enlentecimiento del vaciamiento gástrico e inhibición del apetito. Ambas incretinas son rápidamente clivadas por la enzima dipeptidil peptidasa 4 (DPP-4). Nuevas drogas como los incretinomiméticos, análogos y los inhibidores de DPP-4 se presentan como una terapéutica prometedora para los pacientes con diabetes tipo 2. Conflicto de intereses: Dr. León Litwak - Miembro del Board Latinoamericano de Eli Lilly y Sanofi Aventis - Miembro del Board Nacional de los laboratorios Novo Nordisk, Novartis, GlaxoSmithKline, Sanofi Aventis, Boehringer Ingelheim, Bristol Myers, Astra Zeneca - Investigador principal de protocolos pertenecientes a Eli Lilly, Novo Nordisk, Novartis, GlaxoSmithKline, Takeda, PPDF, Pfizer, Merck Sharp and Dôhme, Amgen, Roche, Minimed, Quintiles - Conferencista de los laboratorios mencionados. Two main pathophysiological mechanisms are currently involved in Type 2 Diabetes (T2DM): insulin resistance and impairment of beta cell function. However, in recent years a new mechanism was reported: a significant decrease in incretin production and/or action. Incretins are gastrointestinal hormones whose main action is stimulating insulin secretion in response to nutrients. The best known incretins are glucagon like peptide-1 (GLP-1) and Gastric insulinotropic peptide (GIP). GLP-1 and GIP not only increase insulin secretion, but also decrease glucagon secretion, slow gastric emptying and reduce appetite, generating weight loss. Both incretins are rapidly cleaved by the enzyme dipeptidyl peptidase 4 (DPP-4). In order to emulate incretin action, several drugs were developed: GLP-1 receptor agonists, GLP-1 mimetics, and DPP-4 inhibitors. All of them seem to be a very promising tool for the treatment of T2DM. Financial Interests: Dr. León Litwak - Member of the Latin American Board of Eli Lilly and Sanofi Aventis - Member of the National Board of the following laboratories: Novo Nordisk, Novartis, GlaxoSmithKline, Sanofi Aventis, Boehringer Ingelheim, Bristol Myers, Astra Zeneca - Principal Investigator of Protocols from: Eli Lilly, Novo Nordisk, Novartis, GlaxoSmithKline, Takeda, PPDF, Pfizer, Merck Sharp and Dôhme, Amgen, Roche, Minimed, Quintiles - Lecturer for the former laboratories.

  8. Incretinas, incretinomiméticos, inhibidores de DPP IV: (2ª parte) / Incretins, Incretinmimetics, Inhibitors (2nd part)

    Scientific Electronic Library Online (English)

    Claudia, Bayón; Mercedes Araceli, Barriga; León, Litwak.

    2010-09-01

    Full Text Available En los últimos años se reconoce un nuevo mecanismo involucrado en la fisiopatología de la Diabetes Mellitus tipo 2: el déficit de producción y/o acción de las incretinas. Las incretinas son enterohormonas que estimulan la secreción de insulina en respuesta a la ingesta de nutrientes. Glucagon like p [...] eptide-1 (GLP1) y Polipéptido insulinotrópico glucosa dependiente (GIP) son las principales incretinas descubiertas hasta hoy. Ambas presentan también efecto trófico sobre las células beta de los islotes pancreáticos. GLP-1 presenta otras acciones como son la inhibición de la secreción de glucagón, enlentecimiento del vaciamiento gástrico e inhibición del apetito. Ambas incretinas son rápidamente clivadas por la enzima dipeptidil peptidasa 4 (DPP-4). Nuevas drogas como los incretinomiméticos, análogos y los inhibidores de DPP-4 se presentan como una terapéutica prometedora para los pacientes con diabetes tipo 2. Conflicto de intereses: Dr. León Litwak - Miembro del Board Latinoamericano de Eli Lilly y Sanofi Aventis - Miembro del Board Nacional de los laboratorios Novo Nordisk, Novartis, GlaxoSmithKline, Sanofi Aventis, Boheringer Ingelheim, Bristol Myers, Astra Zeneca - Investigador principal de protocolos pertenecientes a Eli Lilly, Novo Nordisk, Novartis, GlaxoSmithKline, Takeda, PPDF, Pfizer, Merck Sharp and Dôhme, Amger, Roche, Minimed, Quintiles - Conferencista de los laboratorios mencionados. Abstract in english Two main pathophysiological mechanisms are currently involved in Type 2 Diabetes (T2DM), insulin resistance and impairment of beta cell function. However, in recent years a new mechanism was reported: a significant decrease in incretins production and/or action. Incretins are gastrointestinal hormon [...] es whose main action is stimulating insulin secretion in response to nutrients. The best known incretins are glucagon like peptide-1 (GLP-1) and Gastric insulinotropic peptide (GIP). 
GLP-1 and GIP not only increase insulin secretion, but also decrease glucagon secretion, slow gastric emptying and reduce appetite, generating weight loss. Both incretins are rapidly cleaved by the enzyme dipeptidyl peptidase 4 (DPP-4). In order to emulate incretin action, several drugs were developed: GLP-1 receptor agonists, GLP-1 mimetics, and DPP-4 inhibitors. All of them seem to be a very promising tool for the treatment of T2DM. Financial Interests: Dr. León Litwak - Member of the Latin American Board of Eli Lilly and Sanofi Aventis - Member of the National Board of the following laboratories: Novo Nordisk, Novartis, GlaxoSmithKline, Sanofi Aventis, Boehringer Ingelheim, Bristol Myers, Astra Zeneca - Principal Investigator of Protocols from: Eli Lilly, Novo Nordisk, Novartis, GlaxoSmithKline, Takeda, PPDF, Pfizer, Merck Sharp and Dôhme, Amgen, Roche, Minimed, Quintiles - Lecturer for the former laboratories.

  9. MYOB software for dummies

    CERN Document Server

    Curtis, Veechi

    2012-01-01

    Your complete guide to MYOB® AccountRight software. Now in its seventh edition, MYOB® Software For Dummies walks you through everything you need to know, from starting your MYOB® file from scratch and recording payments and receipts, to tracking profit and analysing sales. This new edition includes all the information you need on the new generation of MYOB® AccountRight software, including the new cloud computing features. Set up MYOB® software - understand how to make it work the first time Keep track of purchases and sales - monitor customer accounts and ensure you get pai

  10. GENII (Generation II): The Hanford Environmental Radiation Dosimetry Software System: Volume 3, Code maintenance manual: Hanford Environmental Dosimetry Upgrade Project

    Energy Technology Data Exchange (ETDEWEB)

    Napier, B.A.; Peloquin, R.A.; Strenge, D.L.; Ramsdell, J.V.

    1988-09-01

    The Hanford Environmental Dosimetry Upgrade Project was undertaken to incorporate the internal dosimetry models recommended by the International Commission on Radiological Protection (ICRP) in updated versions of the environmental pathway analysis models used at Hanford. The resulting second generation of Hanford environmental dosimetry computer codes is compiled in the Hanford Environmental Dosimetry System (Generation II, or GENII). This coupled system of computer codes is intended for analysis of environmental contamination resulting from acute or chronic releases to, or initial contamination of, air, water, or soil, on through the calculation of radiation doses to individuals or populations. GENII is described in three volumes of documentation. This volume is a Code Maintenance Manual for the serious user, including code logic diagrams, global dictionary, worksheets to assist with hand calculations, and listings of the code and its associated data libraries. The first volume describes the theoretical considerations of the system. The second volume is a Users' Manual, providing code structure, users' instructions, required system configurations, and QA-related topics. 7 figs., 5 tabs.

  11. GENII [Generation II]: The Hanford Environmental Radiation Dosimetry Software System: Volume 3, Code maintenance manual: Hanford Environmental Dosimetry Upgrade Project

    International Nuclear Information System (INIS)

    The Hanford Environmental Dosimetry Upgrade Project was undertaken to incorporate the internal dosimetry models recommended by the International Commission on Radiological Protection (ICRP) in updated versions of the environmental pathway analysis models used at Hanford. The resulting second generation of Hanford environmental dosimetry computer codes is compiled in the Hanford Environmental Dosimetry System (Generation II, or GENII). This coupled system of computer codes is intended for analysis of environmental contamination resulting from acute or chronic releases to, or initial contamination of, air, water, or soil, on through the calculation of radiation doses to individuals or populations. GENII is described in three volumes of documentation. This volume is a Code Maintenance Manual for the serious user, including code logic diagrams, global dictionary, worksheets to assist with hand calculations, and listings of the code and its associated data libraries. The first volume describes the theoretical considerations of the system. The second volume is a Users' Manual, providing code structure, users' instructions, required system configurations, and QA-related topics. 7 figs., 5 tabs

  12. Software Engineering Program: Software Process Improvement Guidebook

    Science.gov (United States)

    1996-01-01

    The purpose of this document is to provide experience-based guidance in implementing a software process improvement program in any NASA software development or maintenance community. This guidebook details how to define, operate, and implement a working software process improvement program. It describes the concept of the software process improvement program and its basic organizational components. It then describes the structure, organization, and operation of the software process improvement program, illustrating all these concepts with specific NASA examples. The information presented in the document is derived from the experiences of several NASA software organizations, including the SEL, the SEAL, and the SORCE. Their experiences reflect many of the elements of software process improvement within NASA. This guidebook presents lessons learned in a form usable by anyone considering establishing a software process improvement program within his or her own environment. This guidebook attempts to balance general and detailed information. It provides material general enough to be usable by NASA organizations whose characteristics do not directly match those of the sources of the information and models presented herein. It also keeps the ideas sufficiently close to the sources of the practical experiences that have generated the models and information.

  13. ECH power deposition at 3rd harmonic in high elongation TCV discharges sustained by 2nd harmonic current profile broadening

    International Nuclear Information System (INIS)

    This paper summarises the present effort aimed at developing high elongation heated discharges and testing their confinement properties at normalised currents for which the highest ideal MHD β-limits are predicted. 2nd harmonic (X2) far off-axis ECH/CD is used to stabilise the plasma vertically at high elongation by broadening the current profile in stationary conditions (during the current flat top and over several current diffusion times). Current broadening is maximal for a power deposition in a narrow region (∼a/5), for a finite toroidal injection angle and for high plasma density using upper lateral launchers to minimise refraction. In these discharges which are twice X2 overdense in the centre, 3rd harmonic (X3) is injected from a top launcher to deposit power in the centre and increase the central pressure, simultaneously with far off-axis X2. Using modulated X3, full absorption is measured by the diamagnetic probe. Absorption higher than calculated by thermal ray tracing is occasionally found, indicating absorption on the electron bulk as well as in the suprathermal electron population, sometimes with a hollow deposition profile. The high sensitivity of the power coupling to the beam angle stresses the need for developing a mirror feedback scheme to increase the coupling efficiency in transient heating scenarios. (author)

  14. Contractions of 2D 2nd Order Quantum Superintegrable Systems and the Askey Scheme for Hypergeometric Orthogonal Polynomials

    Directory of Open Access Journals (Sweden)

    Ernest G. Kalnins

    2013-10-01

    Full Text Available We show explicitly that all 2nd order superintegrable systems in 2 dimensions are limiting cases of a single system: the generic 3-parameter potential on the 2-sphere, S9 in our listing. We extend the Wigner-Inönü method of Lie algebra contractions to contractions of quadratic algebras and show that all of the quadratic symmetry algebras of these systems are contractions of that of S9. Amazingly, all of the relevant contractions of these superintegrable systems on flat space and the sphere are uniquely induced by the well known Lie algebra contractions of e(2) and so(3). By contracting function space realizations of irreducible representations of the S9 algebra (which give the structure equations for Racah/Wilson polynomials) to the other superintegrable systems, and using Wigner's idea of ''saving'' a representation, we obtain the full Askey scheme of hypergeometric orthogonal polynomials. This relationship directly ties the polynomials and their structure equations to physical phenomena. It is more general because it applies to all special functions that arise from these systems via separation of variables, not just those of hypergeometric type, and it extends to higher dimensions.
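
The contraction mentioned above can be made concrete with the textbook Inönü-Wigner contraction of so(3) to e(2) (a standard example, written in my notation rather than the paper's):

```latex
% so(3) with generators J_1, J_2, J_3:
%   [J_1, J_2] = J_3, \quad [J_2, J_3] = J_1, \quad [J_3, J_1] = J_2.
% Rescale P_1 = \epsilon J_1, P_2 = \epsilon J_2, keep J_3, let \epsilon \to 0:
\begin{aligned}
  [P_1, P_2] &= \epsilon^2 J_3 \;\to\; 0, \\
  [J_3, P_1] &= \epsilon J_2 = P_2, \\
  [J_3, P_2] &= -\epsilon J_1 = -P_1,
\end{aligned}
% which are the commutation relations of e(2): two commuting
% translations P_1, P_2 and one rotation J_3.
```

The contractions of the quadratic symmetry algebras used in the paper act in the same spirit: generators are rescaled by powers of a parameter ε and only the terms surviving the ε → 0 limit are kept.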

  15. Summary of the 2nd international symposium on arthrogryposis, St. Petersburg, Russia, September 17-19, 2014.

    Science.gov (United States)

    Hall, Judith G; Agranovich, Olga; Pontén, Eva; van Bosse, Harold J P

    2015-06-01

    Enormous progress has been made in understanding the etiology and therapies for arthrogryposis (multiple congenital contractures). A 2nd International Symposium on Arthrogryposis was sponsored by the Turner Institute in St. Petersburg, Russia. Olga Agranovich, Head of the Arthrogryposis Department of the Turner Institute, organized this special meeting. Care providers from multiple disciplines from all over the world representing 18 nations attended. Participants included: Pediatric orthopedic specialists, rehabilitation physicians, occupational therapists, physical therapists, medical geneticists, neurologists, craniofacial physicians, psychologists, developmental biologists, as well as representatives from parent support groups. The 1st symposium established the need for a collaborative and interdisciplinary approach to the treatment of arthrogryposis, engagement of parent support organizations, and the aim for more research. The Second Symposium highlighted the continuing need for more research on various therapies, identification of different types of arthrogryposis, standardized descriptions of severity, development of new orthotics, improved prenatal diagnosis, and studying adult outcome. Major progress has been made on both upper and lower limb treatments. © 2015 Wiley Periodicals, Inc. PMID:25847824

  16. Increasing the water temperature of a 2nd order stream reach: Hydraulic aspects of a whole-stream manipulative experiment

    Science.gov (United States)

    de Lima, João L. M. P.; Canhoto, Cristina

    2015-04-01

    What will happen when the water temperature of streams increases due to climate change or rapidly changing human systems? To answer this question, a whole-stream manipulative experiment was undertaken in which an increase in water temperature was artificially induced on a 2nd order stream reach. The main objective of this poster is to describe this experiment, focusing on the design of the hydraulic system. The system maintained a steady flow while allowing natural variation in abiotic factors and was successfully used to evaluate the effects of warming on a stream ecosystem at several levels of biological organization. A constant flow of stream water was controlled by a hydraulic setup (~22 m long; ~1.5 m wide) subdivided into two independent channels. One channel of the study reach received heated water (~3°C above the other), while the other received water at stream ambient temperature. The warming system maintained a steady, gravity-controlled flow making use of weirs and valves.

  17. Explicit formulas for 2nd-order driving terms due to sextupoles and chromatic effects of quadrupoles

    International Nuclear Information System (INIS)

    Optimization of nonlinear driving terms has become a useful tool for designing storage rings, especially modern light sources where the strong nonlinearity is dominated by the large chromatic effects of quadrupoles and strong sextupoles for chromaticity control. The Lie algebraic method is well known for computing such driving terms. However, it appears that there was a lack of explicit formulas in the public domain for such computation, resulting in uncertainty and/or inconsistency in widely used codes. This note presents explicit formulas for driving terms due to sextupoles and chromatic effects of quadrupoles, which can be considered as thin elements. The computation is accurate to 4th order in the Hamiltonian and 2nd order in magnet parameters. The results given here are the same as the APS internal note AOP-TN-2009-020. This internal note has been revised and published here as a Light Source Note in order to get this information into the public domain, since both ELEGANT and OPA are using these formulas.

  18. Multi-atom resonant photoemission and the development of next-generation software and high-speed detectors for electron spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Kay, Alexander William

    2000-09-01

    This dissertation has involved the exploration of a new effect in photoelectron emission, multi-atom resonant photoemission (MARPE), as well as the development of new software, data analysis techniques, and detectors of general use in such research. We present experimental and theoretical results related to MARPE, in which the photoelectron intensity from a core level on one atom is influenced by a core-level absorption resonance on another. We point out that some of our and others' prior experimental data has been strongly influenced by detector non-linearity and that the effects seen in new corrected data are smaller and of different form. Corrected data for the MnO(001) system with resonance between the O 1s and Mn 2p energy levels are found to be well described by an extension of well-known intraatomic resonant photoemission theory to the interatomic case, provided that interactions beyond the usual second-order Kramers-Heisenberg treatment are included. This theory is also found to simplify under certain conditions so as to yield results equivalent to a classical x-ray optical approach, with the latter providing an accurate and alternative, although less detailed and general, physical picture of these effects. Possible future applications of MARPE as a new probe of near-neighbor identities and bonding and its relationship to other known effects are also discussed. We also consider in detail specially written data acquisition software that has been used for most of the measurements reported here. This software has been used with an existing experimental system to develop the method of detector characterization and then data correction required for the work described above. The development of a next-generation one-dimensional, high-speed, electron detector is also discussed.
Our goal has been to design, build and test a prototype high-performance, one-dimensional pulse-counting detector that represents a significant advancement in detector technology and is well matched to modern high-brightness synchrotron radiation sources and high-transmission electron-energy analyzers as typically used in photoelectron spectroscopy experiments. The general design of the detector and the results of initial tests are discussed, and the acquisition of photoelectron spectra with the first test detector is described.

  19. Multi-atom resonant photoemission and the development of next-generation software and high-speed detectors for electron spectroscopy

    International Nuclear Information System (INIS)

    This dissertation has involved the exploration of a new effect in photoelectron emission, multi-atom resonant photoemission (MARPE), as well as the development of new software, data analysis techniques, and detectors of general use in such research. We present experimental and theoretical results related to MARPE, in which the photoelectron intensity from a core level on one atom is influenced by a core-level absorption resonance on another. We point out that some of our and others' prior experimental data has been strongly influenced by detector non-linearity and that the effects seen in new corrected data are smaller and of different form. Corrected data for the MnO(001) system with resonance between the O 1s and Mn 2p energy levels are found to be well described by an extension of well-known intraatomic resonant photoemission theory to the interatomic case, provided that interactions beyond the usual second-order Kramers-Heisenberg treatment are included. This theory is also found to simplify under certain conditions so as to yield results equivalent to a classical x-ray optical approach, with the latter providing an accurate and alternative, although less detailed and general, physical picture of these effects. Possible future applications of MARPE as a new probe of near-neighbor identities and bonding and its relationship to other known effects are also discussed. We also consider in detail specially written data acquisition software that has been used for most of the measurements reported here. This software has been used with an existing experimental system to develop the method of detector characterization and then data correction required for the work described above. The development of a next-generation one-dimensional, high-speed, electron detector is also discussed.
Our goal has been to design, build and test a prototype high-performance, one-dimensional pulse-counting detector that represents a significant advancement in detector technology and is well matched to modern high-brightness synchrotron radiation sources and high-transmission electron-energy analyzers as typically used in photoelectron spectroscopy experiments. The general design of the detector and the results of initial tests are discussed, and the acquisition of photoelectron spectra with the first test detector is described.

  20. PREFACE: Proceedings of the 2nd International Conference on Quantum Simulators and Design (Tokyo, Japan, 31 May-3 June 2008)

    Science.gov (United States)

    Akai, Hisazumi; Tsuneyuki, Shinji

    2009-02-01

    This special issue of Journal of Physics: Condensed Matter comprises selected papers from the proceedings of the 2nd International Conference on Quantum Simulators and Design (QSD2008) held in Tokyo, Japan, between 31 May and 3 June 2008. This conference was organized under the auspices of the Development of New Quantum Simulators and Quantum Design Grant-in-Aid for Scientific Research on Priority Areas, Ministry of Education, Culture, Sports, Science and Technology of Japan (MEXT). The conference focused on the development of first principles electronic structure calculations and their applications. The aim was to provide an opportunity for discussion on the progress in computational materials design and, in particular, the development of quantum simulators and quantum design. Computational materials design is a computational approach to the development of new materials. The essential ingredient is the use of quantum simulators to design a material that meets a given specification of properties and functionalities. For this to be successful, the quantum simulator should be very reliable and be applicable to systems of realistic size. During the conference, new methods of quantum simulation and quantum design were discussed including methods beyond the local density approximation of density functional theory, order-N methods, methods dealing with excitations and reactions, and the application of these methods to the design of novel materials, devices and systems. The conference provided an international forum for experimental and theoretical researchers to exchange ideas. A total of 220 delegates from eight countries participated in the conference. There were 13 invited talks, ten oral presentations and 120 posters. The 3rd International Conference on Quantum Simulators and Design will be held in Germany in the autumn of 2011.

  1. What is Your Software Worth?

    OpenAIRE

    Wiederhold, Gio

    2005-01-01

    This article presents a method for valuing software based on the income that use of that software is expected to generate in the future. Well-known principles of intellectual property (IP) valuation, sales expectations, discounting to present value, and the like, are applied, always focusing on the benefits and costs of software. A major issue, not dealt with in the literature of valuing intangibles, is that software is continually upgraded. Applying depreciation schedules is the simple solu...

  2. 2nd Abel Symposium

    CERN Document Server

    Nunno, Giulia; Lindstrøm, Tom; Øksendal, Bernt; Zhang, Tusheng

    2007-01-01

    Kiyosi Ito, the founder of stochastic calculus, is one of the few central figures of twentieth-century mathematics who reshaped the mathematical world. Today stochastic calculus is a central research field with applications in several other disciplines, for example physics, engineering, biology, economics and finance. The Abel Symposium 2005 was organized as a tribute to the work of Kiyosi Ito on the occasion of his 90th birthday. Distinguished researchers from all over the world were invited to present the newest developments within the exciting and fast growing field of stochastic analysis. The present volume combines both papers from the invited speakers and contributions by the presenting lecturers. A special feature is the Memoirs that Kiyosi Ito wrote for this occasion. These are valuable pages for both young and established researchers in the field.

  3. 2nd ISAAC Congress

    CERN Document Server

    Gilbert, Robert; Kajiwara, Joji

    2000-01-01

    This book is the Proceedings of the Second ISAAC Congress. ISAAC is the acronym of the International Society for Analysis, its Applications and Computation. The president of ISAAC is Professor Robert P. Gilbert, the second named editor of this book, e-mail: gilbert@math.udel.edu. The Congress is valued so highly worldwide that an application for a grant was selected, and this project has been executed with Grant No. 11-56 from *the Commemorative Association for the Japan World Exposition (1970). The finance of the publication of this book is exclusively the said Grant No. 11-56 from *. Thus, one copy of each of the two volumes of this book will be sent to all contributors who registered at the Second ISAAC Congress in Fukuoka, free of charge, by Kluwer Academic Publishers. Analysis is understood here in the broad sense of the word, including differential equations, integral equations, functional analysis, and function theory. It is the purpose of ISAAC to promote analysis, its applications, and...

  4. 2nd Bozeman Conference

    CERN Document Server

    Lund, John

    1991-01-01

    This volume contains a collection of papers delivered by the participants at the second Conference on Computation and Control held at Montana State University in Bozeman, Montana from August 1-7, 1990. The conference, as well as this proceedings, attests to the vitality and cohesion between the control theorist and the numerical analyst that was advertised by the first Conference on Computation and Control in 1988. The proceedings of that initial conference was published by Birkhäuser Boston as the first volume of this same series entitled Computation and Control, Proceedings of the Bozeman Conference, Bozeman, Montana, 1988. Control theory and numerical analysis are both, by their very nature, interdisciplinary subjects as evidenced by their interaction with other fields of mathematics and engineering. While it is clear that new control or estimation algorithms and new feedback design methodologies will need to be implemented computationally, it is likewise clear that new problems in computation...

  5. 2nd INTERA Conference

    CERN Document Server

    2014-01-01

    This book presents the latest scientific research related to the field of Robotics. It covers topics such as biomedicine, energy efficiency, home automation, and robotics. The book is written by technical experts and researchers from academia and industry working on robotics applications. The book could be used as supplementary material for courses related to Robotics and Domotics.

  6. 2nd SUMO Conference

    CERN Document Server

    Weber, Melanie

    2015-01-01

    This contributed volume contains the conference proceedings of the Simulation of Urban Mobility (SUMO) conference 2014, Berlin. The included research papers cover a wide range of topics in traffic planning and simulation, including open data, vehicular communication, e-mobility, urban mobility, multimodal traffic as well as usage approaches. The target audience primarily comprises researchers and experts in the field, but the book may also be beneficial for graduate students.  

  7. Design and development of a prototypical software for semi-automatic generation of test methodologies and security checklists for IT vulnerability assessment in small- and medium-sized enterprises (SME)

    Science.gov (United States)

    Möller, Thomas; Bellin, Knut; Creutzburg, Reiner

    2015-03-01

    The aim of this paper is to show the recent progress in the design and prototypical development of a software suite Copra Breeder* for semi-automatic generation of test methodologies and security checklists for IT vulnerability assessment in small and medium-sized enterprises.

  8. Diffusion of suprathermal electrons measured by means of ECRH and 2nd harmonic ECE O-mode

    International Nuclear Information System (INIS)

    In the study of anomalous transport in thermonuclear plasmas, the diffusion of suprathermal electrons deserves special attention. From certain energies onward, electrons are effectively collisionless and therefore follow the field lines. Thus, they can be used to probe the stochasticity of the magnetic field structure. At high energy, electrons eventually become insensitive to magnetic stochasticity as their curvature B-drift becomes larger than the radial correlation length of the turbulence. Hence, by studying the confinement of collisionless electrons in different energy ranges, both the level of magnetic turbulence and the radial correlation length can be established. A study of the confinement of suprathermal electrons has been reported by Kwon et al., who used measurements of hard X-rays in ASDEX. That study focussed on runaway electrons in the MeV range, created in the start-up phase of the discharge. In this paper, we concentrate on the transport of suprathermal electrons with an energy of a few times Te. The advantages of this approach are that a) the curvature B-drift of these electrons is small, so that the transport is sensitive to small-scale magnetic turbulence, and b) as we shall show, a local study of the diffusion of these electrons can be made using ECE spectroscopy. We describe experiments performed in the RTP tokamak, in which ECRH O-mode was launched from the low-field side. In this way, a population of suprathermals in the center of the plasma is almost instantaneously raised in perpendicular energy. This population is diagnosed by ECE with a grating polychromator in the optically thin 2nd harmonic O-mode. (author) 2 refs., 4 figs
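    The decoupling argument above rests on the standard guiding-center drift expression (textbook material, not taken from the abstract): the combined curvature and grad-B drift of an electron in a field with curvature radius R_c scales as

```latex
v_{\mathrm{drift}} \;\simeq\; \frac{m\left(v_\parallel^{2} + \tfrac{1}{2}v_\perp^{2}\right)}{q\,B\,R_c},
```

    so the drift grows linearly with electron energy. Once the radial excursion it produces over a decorrelation time exceeds the radial correlation length of the magnetic turbulence, the electron averages over the perturbed field lines and its transport is no longer set by field-line stochasticity.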

  9. A verification of the high density after contrast enhancement in the 2nd week in cerebroischemic lesion

    International Nuclear Information System (INIS)

    To determine the indication, it is necessary to clarify the relation among the Stage (time and course), the Strength, the Pathogenesis, and the Effects of the operation in these diseases (SSPE relation). In this report, we focused on the High Density of CT after contrast enhancement in cases of ischemic lesions (the High Density was named ''Ribbon H. D.''). Seventeen cases of Ribbon H. D. in fresh infarctions were verified concerning the time of appearance of the H. D., the features of its location and nature, and the histological findings. The results were as follows: The Ribbon H. D. appeared in the early stage of infarctions and had its peak density at the end of the 2nd week after onset. The Ribbon H. D. was mostly located along the cortical line, showing a ribbon-like band. The Ribbon H. D. did not appear in the sharply demarcated coagulation necrosis in the early stage or in the defined Low Density (L. D.) in the late stage of infarctions. Although the Ribbon H. D. shows the extravasation of contrast media, it does not necessarily show the existence of hemorrhagic infarction. Some part of the Ribbon H. D. changes to a well-defined L. D. and the rest becomes relatively isodense in the late stage. This change corresponds to the change in the incomplete necrosis, which is afterwards divided into resolution with a cystic cavity and glial replacement in the late stage. In conclusion, it is possible to understand that the Ribbon H. D. corresponds to the lesion of an incomplete necrosis, with neovascularization, in the early stage of infarctions. Therefore, in addition to the present indication of a by-pass operation (TIA, RIND), this incomplete necrosis (Ribbon H. D.), its surrounding area, and the period just before the appearance of the Ribbon H. D. might be another indication for the operation. (author)

  10. ENABLE -- A systolic 2nd level trigger processor for track finding and e/π discrimination for ATLAS/LHC

    International Nuclear Information System (INIS)

    The Enable Machine is a systolic 2nd level trigger processor for the transition radiation detector (TRD) of ATLAS/LHC. It is developed within the EAST/RD-11 collaboration at CERN. The task of the processor is to find electron tracks and to reject pion tracks according to the EAST benchmark algorithm in less than 10 µs. Tracks are identified by template matching in a (φ,z) region of interest (RoI) selected by a 1st level trigger. In the (φ,z) plane, tracks of constant curvature are straight lines. The relevant lines form mask templates. Track identification is done by histogramming the coincidences of the templates and the RoI data for each possible track. The Enable Machine is an array processor that handles tracks of the same slope in parallel, and tracks of different slope in a pipeline. It is composed of two units, the Enable histogrammer unit and the Enable z/φ-board. The interface daughter board is equipped with a HIPPI interface developed at JINR/Dubna, and Xilinx 'corner turning' data converter chips. Enable uses programmable gate arrays (Xilinx) for histogramming and synchronous SRAMs for pattern storage. With a clock rate of 40 MHz the trigger decision time is 6.5 µs and the latency 7.0 µs. The Enable Machine is scalable in the RoI size as well as in the number of tracks processed. It can be adapted to different recognition tasks and detector setups. The prototype of the Enable Machine has been tested in a beam time of the RD6 collaboration at CERN in October 1993.
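    The template-matching-by-histogramming idea can be sketched in software. The following is a hypothetical illustration of coincidence counting between straight-line mask templates and a binary (φ,z) RoI hit pattern; the function names, toy geometry, and threshold are invented and bear no relation to the systolic hardware implementation:

```python
import numpy as np

def build_templates(n_phi, n_z, slopes):
    """For each candidate slope, precompute the (phi, z) cells a straight
    track of that slope through the origin would cross (toy discretization)."""
    templates = []
    for m in slopes:
        cells = [(p, int(round(m * p))) for p in range(n_phi)]
        templates.append([(p, z) for p, z in cells if 0 <= z < n_z])
    return templates

def histogram_tracks(roi, templates, threshold):
    """Histogram coincidences between each template and the hit pattern;
    templates whose count reaches the threshold flag track candidates."""
    counts = [sum(roi[p, z] for p, z in t) for t in templates]
    return [i for i, c in enumerate(counts) if c >= threshold]

# Toy RoI: a single straight track of slope 1, no background hits.
n_phi, n_z = 16, 16
roi = np.zeros((n_phi, n_z), dtype=int)
for p in range(n_phi):
    roi[p, p] = 1  # hits along the slope-1 line

templates = build_templates(n_phi, n_z, slopes=[0.5, 1.0, 2.0])
found = histogram_tracks(roi, templates, threshold=12)  # → [1], the slope-1 template
```

    The hardware analogue processes all slopes of one value in parallel and pipelines the different slopes, but the arithmetic per template is exactly this coincidence count.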

  11. The ratios of 2nd to 4th digit may be a predictor of schizophrenia in male patients.

    Science.gov (United States)

    Bolu, Abdullah; Oznur, Taner; Develi, Sedat; Gulsun, Murat; Aydemir, Emre; Alper, Mustafa; Toygar, Mehmet

    2015-05-01

    The production of androgens (mostly testosterone) during the early fetal stage is essential for the differentiation of the male brain. Some authors have suggested a relationship between androgen exposure during the prenatal period and schizophrenia. The 2nd-to-4th digit length ratio (2D:4D), in turn, is widely regarded as a marker of prenatal androgen exposure. These two separate relationships suggest that digit length ratios are associated with schizophrenia in males. The study was performed in a university hospital between October 2012 and May 2013. One hundred and three male patients diagnosed with schizophrenia according to DSM-IV using SCID-I, and 100 matched healthy males, were admitted to the study. The Scale for the Assessment of Positive Symptoms (SAPS), Scale for the Assessment of Negative Symptoms (SANS) and Brief Psychiatric Rating Scale (BPRS) were used to assess schizophrenia symptoms. The second digit (2D) and fourth digit (4D) asymmetry index (AI), and the right- and left-hand 2D:4D ratios were calculated. All parametric data in the groups were compared using an independent t-test. The predictive power of the AI was estimated by receiver operating characteristic (ROC) analysis. The 2D:4D AI was statistically significantly lower in the patient group than in the healthy control group. There were significant differences between the schizophrenia and control groups with respect to left 2D:4D and right 2D:4D. There was no correlation between AI, left, or right 2D:4D, BPRS, or SAPS in the schizophrenia group. However, there was a negative correlation between left 2nd digit (L2D):4D and the SANS score. Our findings support the view that the 2D:4D AI can be used as a moderate indicator of schizophrenia. Even more simply, the right or left 2D:4D can be used as an indicator. L2D:4D could indicate the severity of negative symptoms. Clin. Anat. 28:551-556, 2015. © 2015 Wiley Periodicals, Inc. PMID:25779956
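    The quantities compared in the study can be sketched as follows. The abstract does not give the exact asymmetry-index formula, so the directional definition below (right 2D:4D minus left 2D:4D) is an assumption, and the measurements are invented for illustration:

```python
def digit_ratio(d2_mm, d4_mm):
    """2D:4D ratio: index-finger (2D) length divided by ring-finger (4D) length."""
    return d2_mm / d4_mm

def asymmetry_index(right_ratio, left_ratio):
    """Directional asymmetry, right 2D:4D minus left 2D:4D -- one common
    definition; the paper's exact formula is not stated in the abstract."""
    return right_ratio - left_ratio

right = digit_ratio(72.1, 75.0)  # hypothetical measurements in mm
left = digit_ratio(70.4, 74.5)
ai = asymmetry_index(right, left)
```

    With per-hand ratios in hand, the group comparison reduces to an independent t-test on the ratios and the AI, and the ROC analysis treats the AI as a scalar classifier score.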

  12. The Effects of Star Strategy of Computer-Assisted Mathematics Lessons on the Achievement and Problem Solving Skills in 2nd Grade Courses

    Directory of Open Access Journals (Sweden)

    Jale İPEK

    2013-12-01

    Full Text Available The aim of the research is to determine the effect of the STAR strategy on 2nd grade students' academic achievement and problem solving skills in computer-assisted mathematics instruction. Thirty students attending the 2nd grade of a primary school in Aydın in the 2010-2011 academic year formed the study group. The research took place over 7 weeks. Three tests were used as data collection tools: the "Academic Achievement Test", the "Problem Solving Achievement Test" and "The Evaluation Form of Problem Solving Skills". At the end of the research, students' views about computer-assisted mathematics instruction were evaluated. It was examined whether the differences between the scores of the pre-test and post-test are statistically meaningful. According to the results, a positive increase in academic achievement and problem solving skills was determined at the end of the education carried out with the STAR strategy.

  13. Comparative analysis of effectiveness of treatment with anti-TB drugs of the 1st and 2nd lines for children and adolescents with multidrug resistant tuberculosis

    Directory of Open Access Journals (Sweden)

    Tleukhan Abildaev

    2012-05-01

    Full Text Available The paper shows the results of a study on comparative treatment effectiveness in children and adolescents with multidrug-resistant tuberculosis (MDR TB) (2000-2008), treated with anti-TB drugs of the 2nd line (80 patients) and the 1st line (80 patients) in Kazakhstan. In patients with MDR TB, treatment outcomes were successful in 91.2%, but relapse of TB disease occurred in 12.7% of cases, and 5 (6.2%) patients died (P ?0.05). Thus, patients with MDR TB need to be treated with anti-TB drugs of the 2nd line according to their DST.

  14. Process of change in organisations through eHealth: 2nd International eHealth Symposium 2010, Stuttgart, Germany, June 7 - 8, 2010 ; Proceedings edited by Stefan Kirn

    OpenAIRE

    Kirn, Stefan

    2010-01-01

    Foreword: On behalf of the Organizing Committee, it is my pleasure to welcome you to Hohenheim, Stuttgart for the 2nd International eHealth Symposium, themed 'Process of change in organisations through eHealth'. Starting with the inaugural event in 2009, which took place in Turku, Finland, we want to establish a tradition of international eHealth symposia. The presentations and associated papers in these proceedings give a current and representative outline of technical options, applic...

  15. Solid solutions in the HfO2-Nd2O3(Pr2O3, Tb2O3) systems with mixed conductivity

    International Nuclear Information System (INIS)

    X-ray diffraction and electric conductivity methods were used to investigate solid solutions of monoclinic structure, cubic fluorite-type and pyrochlore-type solid solutions in the HfO2-Nd2O3(Pr2O3, Tb2O3) systems. Tetragonal solid solutions based on HfO2 were also revealed at temperatures above 1650 °C.

  16. 2nd Annual Workshop Proceedings of the Collaborative Project "Redox Phenomena Controlling Systems" (7th EC FP CP RECOSY) (KIT Scientific Reports ; 7557)

    OpenAIRE

    Buckau, Gunnar; Kienzler, Bernhard; Duro, Lara; Grivé, Mireia; Montoya, Vanessa

    2010-01-01

    These are proceedings of the 2nd Annual Workshop of the EURATOM FP7 Collaborative Project "Redox Phenomena Controlling System", held in Larnaca (Cyprus) 16th to 19th March 2010. The project deals with the impact of redox processes on the long-term safety of nuclear waste disposal. The proceedings have six workpackage overview contributions, and 21 reviewed scientific-technical short papers. The proceedings document the scientific-technical progress of the second project year.

  17. Proceedings of the 2nd International Workshop on Groundwater Risk Assessment at Contaminated Sites (GRACOS) and Integrated Soil and Water Protection (SOWA)

    OpenAIRE

    Universität / Zentrum für Angewandte Geowissenschaften - Center for Applied Geoscience

    2003-01-01

    The background given for the 2nd International Workshop on Groundwater Risk Assessment at Contaminated Sites and Integrated Soil and Water Protection comprises contaminated sites and large-scale diffuse pollution of soils from disposal of non-regulated waste on land, agricultural activities, atmospheric deposition of pollutants, etc. A major risk at most contaminated sites is that of groundwater pollution by organic and inorganic compounds. Since complete restoration of all these contami...

  18. Results of the 2nd periodical inspection of the asphalt solidification facility and the incinerator in Unit 2 of the Sendai Nuclear Power Station

    International Nuclear Information System (INIS)

    The 2nd periodical inspection was carried out on the asphalt solidification facility and the incinerator in Unit 2 of the Sendai Nuclear Power Station from November 5 to 28, 1985. The inspection covered the radiation control facility and the disposal facility. External appearance, disassembly, function and performance tests revealed no abnormalities. The personnel exposure doses during the inspection were below the permissible level. No improvement or modification work was done during the inspection. (Mori, K.)

  19. All-optical 1st- and 2nd-order differential equation solvers with large tuning ranges using Fabry-Pérot semiconductor optical amplifiers.

    Science.gov (United States)

    Chen, Kaisheng; Hou, Jie; Huang, Zhuyang; Cao, Tong; Zhang, Jihua; Yu, Yuan; Zhang, Xinliang

    2015-02-01

    We experimentally demonstrate an all-optical temporal computation scheme for solving 1st- and 2nd-order linear ordinary differential equations (ODEs) with tunable constant coefficients by using Fabry-Pérot semiconductor optical amplifiers (FP-SOAs). By changing the injection currents of the FP-SOAs, the constant coefficients of the differential equations are practically tuned. A quite large constant-coefficient tuning range from 0.0026/ps to 0.085/ps is achieved for the 1st-order differential equation. Moreover, the constant coefficient p of the 2nd-order ODE solver can be continuously tuned from 0.0216/ps to 0.158/ps, correspondingly with the constant coefficient q varying from 0.0000494/ps² to 0.006205/ps². Additionally, a theoretical model that combines the carrier density rate equation of the semiconductor optical amplifier (SOA) with the transfer function of the Fabry-Pérot (FP) cavity is exploited to analyze the solving processes. For both 1st- and 2nd-order solvers, excellent agreement between the numerical simulations and the experimental results is obtained. The FP-SOA based all-optical differential-equation solvers can be easily integrated with other optical components based on InP/InGaAsP materials, such as lasers, modulators, photodetectors and waveguides, which can motivate the realization of complex optical computing on a single integrated chip. PMID:25836230
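    The mathematics the device computes optically can be checked numerically. The sketch below integrates the 1st-order constant-coefficient ODE dy/dt + k·y = x(t) with a simple forward-Euler step and compares the free decay against the analytic solution y(t) = y₀·exp(−kt); this illustrates only the equation being solved, not the photonic implementation (the step size and drive are invented):

```python
import math

def solve_first_order(k, x, y0, dt, n):
    """Forward-Euler integration of dy/dt + k*y = x(t) over n steps of size dt
    (a numerical sketch of the equation the FP-SOA solver computes optically)."""
    y = y0
    ys = [y]
    for i in range(n):
        y += dt * (x(i * dt) - k * y)
        ys.append(y)
    return ys

# Free decay of an initial excitation with no drive: y(t) = y0 * exp(-k t).
k = 0.085           # 1/ps, the upper end of the reported 1st-order tuning range
dt, n = 0.01, 1000  # 0.01 ps steps, 10 ps total
ys = solve_first_order(k, lambda t: 0.0, y0=1.0, dt=dt, n=n)
analytic = math.exp(-k * dt * n)
```

    Tuning the injection current in the experiment plays the role of changing k here: it rescales the decay (and, for the 2nd-order solver, the coefficients p and q) without altering the structure of the equation.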

  20. A free software for pore-scale modelling: solving Stokes equation for velocity fields and permeability values in 3D pore geometries

    Science.gov (United States)

    Gerke, Kirill; Vasilyev, Roman; Khirevich, Siarhei; Karsanina, Marina; Collins, Daniel; Korost, Dmitry; Mallants, Dirk

    2015-04-01

    In this contribution we introduce a novel free software package which solves the Stokes equation to obtain velocity fields for low Reynolds-number flows within externally generated 3D pore geometries. Provided with velocity fields, one can calculate permeability for known pressure-gradient boundary conditions via Darcy's equation. Finite-difference schemes of 2nd and 4th order of accuracy are used together with an artificial compressibility method to iteratively converge to a steady-state solution of Stokes' equation. This numerical approach is much faster and less computationally demanding than the majority of open-source or commercial packages employing other algorithms (finite elements/volumes, lattice Boltzmann, etc.). The software consists of two parts: 1) a pre- and post-processing graphical interface, and 2) a solver. The latter is efficiently parallelized to use any number of available cores (the speedup on 16 threads was up to 10-12 depending on hardware). Due to parallelization and memory optimization our software can be used to obtain solutions for 300x300x300 voxel geometries on modern desktop PCs. The software was successfully verified by testing it against lattice Boltzmann simulations and analytical solutions. To illustrate the software's applicability to numerous problems in Earth Sciences, a number of case studies have been developed: 1) identifying the representative elementary volume for permeability determination within a sandstone sample, 2) derivation of permeability/hydraulic conductivity values for rock and soil samples and comparing those with experimentally obtained values, 3) revealing the influence of the amount of fine-textured material such as clay on the filtration properties of sandy soil. This work was partially supported by RSF grant 14-17-00658 (pore-scale modelling) and RFBR grants 13-04-00409-a and 13-05-01176-a.
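    The Stokes-then-Darcy pipeline described above can be illustrated on the simplest geometry with a known answer. The sketch below is not the authors' 3D solver: it solves plane Poiseuille flow (Stokes flow in a 2D channel reduces to μ·u'' = dP/dx) with 2nd-order finite differences, then applies Darcy's law to the mean velocity, recovering the analytic channel permeability k = H²/12 (all parameter values are invented):

```python
import numpy as np

mu = 1.0      # dynamic viscosity (arbitrary units)
dpdx = -1.0   # imposed pressure gradient
H = 1.0       # channel width
n = 101       # grid nodes across the channel
y = np.linspace(0.0, H, n)
dy = y[1] - y[0]

# 2nd-order central differences for mu * u'' = dpdx, with no-slip walls u = 0.
A = np.zeros((n, n))
b = np.full(n, dpdx / mu)
A[0, 0] = A[-1, -1] = 1.0   # Dirichlet rows for the two walls
b[0] = b[-1] = 0.0
for i in range(1, n - 1):
    A[i, i - 1] = A[i, i + 1] = 1.0 / dy**2
    A[i, i] = -2.0 / dy**2
u = np.linalg.solve(A, b)

# Darcy's law: <u> = -(k/mu) * dP/dx  =>  k = -mu * <u> / (dP/dx).
u_mean = (dy * 0.5 * (u[:-1] + u[1:])).sum() / H  # trapezoid-rule average
k = -mu * u_mean / dpdx
# Analytic permeability of a plane channel: k = H**2 / 12.
```

    The real code does the same two steps on arbitrary voxelized 3D geometries, using artificial compressibility iterations instead of a direct linear solve, and verifies against cases like this one where the permeability is known in closed form.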

  1. Software Metrics and Software Metrology

    CERN Document Server

    Abran, Alain

    2010-01-01

    Most of the software measures currently proposed to the industry bring few real benefits to either software managers or developers. This book looks at the classical metrology concepts from science and engineering, using them as criteria to propose an approach to analyze the design of current software measures and then design new software measures (illustrated with the design of a software measure that has been adopted as an ISO measurement standard). The book includes several case studies analyzing strengths and weaknesses of some of the software measures most often quoted. It is meant for sof

  2. PREFACE: The 2nd International Conference on Geological, Geographical, Aerospace and Earth Sciences 2014 (AeroEarth 2014)

    Science.gov (United States)

    Lumban Gaol, Ford; Soewito, Benfano

    2015-01-01

    The 2nd International Conference on Geological, Geographical, Aerospace and Earth Sciences 2014 (AeroEarth 2014) was held at Discovery Kartika Plaza Hotel, Kuta, Bali, Indonesia during 11 - 12 October 2014. The AeroEarth 2014 conference aimed to bring together researchers and engineers from around the world. Earth provides resources and the exact conditions that make life possible. However, with the advent of technology and industrialization, the Earth's resources are being pushed to the brink of depletion. Non-sustainable industrial practices are not only endangering the supply of the Earth's natural resources, but are also putting burden on life itself by bringing about pollution and climate change. A major role of earth science scholars is to examine the delicate balance between the Earth's resources and the growing demands of industrialization. Through research and development, earth scientists have the power to help preserve the planet's different resource domains by providing expert opinion and information about the forces which make life possible on Earth. We would like to express our sincere gratitude to all in the Technical Program Committee who reviewed the papers and developed a very interesting conference program, as well as to the invited and plenary speakers. This year, we received 98 papers, and after rigorous review 17 papers were accepted. The participants came from eight countries. There were four parallel sessions and two invited speakers. It is an honour to present this volume of IOP Conference Series: Earth and Environmental Science (EES) and we deeply thank the authors for their enthusiastic and high-grade contributions.
Finally, we would like to thank the conference chairmen, the members of the steering committee, the organizing committee, the organizing secretariat and the financial support from the conference sponsors that allowed the success of AeroEarth 2014. The Editors of the AeroEarth 2014 Proceedings Dr. Ford Lumban Gaol Dr. Benfano Soewito

  3. Cloud Occurrence Measurements Over Sea during the 2nd 7 Southeast Asian Studies (7SEAS) Field Campaign in Palawan Archipelago

    Science.gov (United States)

    Antioquia, C. T.; Uy, S. N.; Caballa, K.; Lagrosas, N.

    2014-12-01

    Ground-based sky-imaging cameras have been used to measure cloud cover over an area to aid radiation budget models. During daytime, certain clouds tend to decrease atmospheric temperature by obstructing sunrays in the atmosphere. Thus, the detection of clouds plays an important role in the formulation of the radiation budget of the atmosphere. In this study, a wide-angled sky imager (GoPro Hero 2) was brought on board M/Y Vasco to detect and quantify cloud occurrence over sea during the 2nd 7SEAS field campaign. The camera was just one of a number of scientific instruments used to measure weather, aerosol chemistry and solar radiation, among others. Data collection started at the departure from Manila Bay on 05 September 2012 and continued until the end of the cruise (29 September 2012). The camera was placed in a weather-proof box affixed to a steel mast where other instruments were also attached during the cruise. The data have a temporal resolution of 1 minute, and each image is 500x666 pixels in size. Fig. 1a shows the track of the ship during the cruise. The red, blue, hue, saturation, and value components of the pixels are analysed for cloud occurrence. A pixel is considered to "contain" thick cloud if it passes all four threshold parameters (R-B, R/B, (R-B)/(R+B), and HSV, where R is the red pixel color value, B is the blue pixel color value, and HSV is the hue-saturation-value of the pixel) and thin cloud if it passes two or three parameters. Fig. 1b shows the daily analysis of cloud occurrence. Cloud occurrence here is quantified as the ratio of the number of pixels with cloud to the total number of pixels in the image. The average cloud cover for the days included in this dataset is 87%. These measurements show a big contrast when compared to cloud cover over land (Manila Observatory), which is usually around 67%.
    During the cruise, only one day (September 6) had an average cloud occurrence below 50%; the rest of the days had averages of 66% or higher, 98% being the highest. This result gives a general indication of how cloud occurrences over land and over sea differ in the Southeast Asian region. In this study, these cloud occurrences come from local convection and clouds brought about by Southwest Monsoon winds.
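
    The four-test voting scheme described above can be sketched as follows. The threshold values here are illustrative placeholders chosen so the example behaves sensibly, not the values used in the study:

```python
import colorsys

# Illustrative thresholds -- the study's actual values are not given in the abstract.
THRESHOLDS = {"r_minus_b": -20, "r_over_b": 0.9, "norm_diff": -0.05, "sat_max": 0.25}

def classify_pixel(r, g, b):
    """Count how many of the four tests (R-B, R/B, (R-B)/(R+B), HSV saturation)
    flag an 8-bit RGB pixel as cloud: 4 passes -> thick, 2-3 -> thin, else sky."""
    _, s, _ = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    passes = 0
    if r - b > THRESHOLDS["r_minus_b"]:          # clouds are less blue, so R-B is high
        passes += 1
    if b and r / b > THRESHOLDS["r_over_b"]:     # R/B ratio near or above 1 for cloud
        passes += 1
    if (r + b) and (r - b) / (r + b) > THRESHOLDS["norm_diff"]:  # normalised difference
        passes += 1
    if s < THRESHOLDS["sat_max"]:                # clouds are desaturated (white/grey)
        passes += 1
    if passes == 4:
        return "thick"
    return "thin" if passes >= 2 else "sky"

print(classify_pixel(200, 200, 200))  # grey pixel -> "thick"
print(classify_pixel(40, 80, 200))    # deep blue pixel -> "sky"
```

    Cloud occurrence for an image is then the count of "thick"/"thin" pixels divided by the total pixel count (500x666 per image in this dataset).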

  4. 2nd State of the Onion: Larry Wall's Keynote Address at the Second Annual O'Reilly Perl Conference

    Science.gov (United States)

    This page, part of publisher O'Reilly & Associates' Website devoted to the Perl language, contains a transcript of Larry Wall's keynote address at the second annual O'Reilly Perl Conference, which was held August 17-20, 1998, in San Jose, California. In his keynote address, Larry Wall, the original author of the Perl programming language, provides a thought-provoking (and entertaining) mix of philosophy and technology. Wall's talk touches on the future of the Perl language, the relationship of the free software community to commercial software developers, chaos, complexity, and human symbology. The page also includes copies of graphics used during the keynote.

  5. UWB Tracking Software Development

    Science.gov (United States)

    Gross, Julia; Arndt, Dickey; Ngo, Phong; Phan, Chau; Dusl, John; Ni, Jianjun; Rafford, Melinda

    2006-01-01

    An Ultra-Wideband (UWB) two-cluster Angle of Arrival (AOA) tracking prototype system is currently being developed and tested at NASA Johnson Space Center for space exploration applications. This talk discusses the software development efforts for this UWB two-cluster AOA tracking system. The role the software plays in this system is to take waveform data from two UWB radio receivers as input, feed this input into an AOA tracking algorithm, and generate the target position as output. The architecture of the software (Input/Output Interface and Algorithm Core) will be introduced in this talk. The development of this software has three phases. In Phase I, the software is mostly Matlab driven and calls C++ socket functions to provide the communication links to the radios. This is beneficial in the early stage, when it is necessary to frequently test changes in the algorithm. In Phase II, the software is mostly C++ driven and calls a Matlab function for the AOA tracking algorithm. This is beneficial for sending the tracking results to other systems and also for improving the tracking update rate of the system. The third phase is part of future work and is to have the software completely C++ driven with a graphical user interface. This software design enables the fine-resolution tracking of the UWB two-cluster AOA tracking system.

  6. Software reliability

    CERN Document Server

    Bendell, A

    1986-01-01

    Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. This book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion on failure rate in software reliability growth mo

  7. Software Complexity Methodologies & Software Security

    OpenAIRE

    Masoud Rafighi; Nasser Modiri

    2011-01-01

    It is broadly clear that complexity is one of software's natural features. Software's natural complexity and software requirement functionality are two inseparable parts, each with its own particular range. In this paper, complexity measurement is explained using the McCabe and Halstead models, and software complexity is discussed through an example. The Henry-Kafura information flow metric, the Agresti-Card-Glass system complexity metric, and item-level design metrics are compared and examined, then ca...

  8. TESTING FOR OBJECT ORIENTED SOFTWARE

    Directory of Open Access Journals (Sweden)

    Jitendra S. Kushwah

    2011-02-01

    Full Text Available This paper deals with the design and development of an automated testing tool for object-oriented software. By an automated testing tool, we mean a tool that automates a part of the testing process. It can include one or more of the following processes: test strategy generation, test case generation, test case execution, test data generation, and reporting and logging of results. By object-oriented software we mean software designed using an OO approach and implemented using an OO language. Testing of OO software is different from testing software created using procedural languages, and poses several new challenges. In the past, most methods for testing OO software were simple extensions of existing methods for conventional software; however, these have been shown to be not very appropriate, and new techniques have been developed. This thesis work has mainly focused on testing design specifications for OO software. As described later, there is a lack of specification-based testing tools for OO software. An advantage of testing software specifications as compared to program code is that specifications are generally correct whereas code is flawed. Moreover, with software engineering principles firmly established in the industry, most software developed nowadays follows all the steps of the Software Development Life Cycle (SDLC). For this work, UML specifications created in Rational Rose are taken. UML has become the de-facto standard for analysis and design of OO software. Testing is conducted at three levels: unit, integration and system. At the system level there is no difference between the testing techniques used for OO software and software created using a procedural language, and hence conventional techniques can be used. This tool provides features for testing at the unit (class) level as well as the integration level. Further, a maintenance-level component has also been incorporated.
    Results of applying this tool to sample Rational Rose files have been incorporated, and have been found to be satisfactory.

  9. New glycoproteomics software, GlycoPep Evaluator, generates decoy glycopeptides de novo and enables accurate false discovery rate analysis for small data sets.

    Science.gov (United States)

    Zhu, Zhikai; Su, Xiaomeng; Go, Eden P; Desaire, Heather

    2014-09-16

    Glycoproteins are biologically significant large molecules that participate in numerous cellular activities. In order to obtain site-specific protein glycosylation information, intact glycopeptides, with the glycan attached to the peptide sequence, are characterized by tandem mass spectrometry (MS/MS) methods such as collision-induced dissociation (CID) and electron transfer dissociation (ETD). While several automated tools are emerging, no consensus is present in the field about the best way to determine the reliability of the tools and/or provide the false discovery rate (FDR). A common approach to calculate FDRs for glycopeptide analysis, adopted from the target-decoy strategy in proteomics, employs a decoy database that is created based on the target protein sequence database. Nonetheless, this approach is not optimal in measuring the confidence of N-linked glycopeptide matches, because the glycopeptide data set is considerably smaller compared to that of peptides, and the requirement of a consensus sequence for N-glycosylation further limits the number of possible decoy glycopeptides tested in a database search. To address the need to accurately determine FDRs for automated glycopeptide assignments, we developed GlycoPep Evaluator (GPE), a tool that helps to measure FDRs in identifying glycopeptides without using a decoy database. GPE generates decoy glycopeptides de novo for every target glycopeptide, in a 1:20 target-to-decoy ratio. The decoys, along with target glycopeptides, are scored against the ETD data, from which FDRs can be calculated accurately based on the number of decoy matches and the ratio of the number of targets to decoys, for small data sets. GPE is freely accessible for download and can work with any search engine that interprets ETD data of N-linked glycopeptides. The software is provided at https://desairegroup.ku.edu/research. PMID:25137014
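
    The ratio-corrected FDR estimate that the abstract describes can be sketched as follows; the function name and the sample counts are illustrative, not taken from GPE itself:

```python
def estimate_fdr(n_target_hits, n_decoy_hits, decoys_per_target=20):
    """Estimate the FDR when every target glycopeptide is paired with
    `decoys_per_target` de novo decoys (the 1:20 scheme described above).
    Decoy hits are scaled down by the ratio to estimate how many false
    positives are hiding among the accepted target hits."""
    if n_target_hits == 0:
        return 0.0
    expected_false_positives = n_decoy_hits / decoys_per_target
    return expected_false_positives / n_target_hits

# 100 accepted target matches, with 40 decoy matches among their 2000 decoys:
print(estimate_fdr(100, 40))  # 0.02, i.e. an estimated 2% FDR
```

    Generating many decoys per target is what makes the estimate usable on small data sets: with only one decoy per target, a handful of chance decoy matches would swing the FDR wildly.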

  10. PREFACE: 2nd International Conference on Particle Physics in memoriam Engin Arık and her Colleagues

    Science.gov (United States)

    Çetin, Serkant Ali; Jenni, Peter; Erkcan Özcan, Veysi; Nefer Şenoğuz, Vedat

    2012-02-01

    The 2nd International Conference on Particle Physics in memoriam Engin Arık and her Colleagues: Fatma Şenel Boydağ, İskender Hikmet, Mustafa Fidan, Berkol Doğan and Engin Abat was held at Doğuş University, İstanbul, Turkey on 20-25 June 2011. The conference was organized jointly by the Doğuş and Boğaziçi Universities, with support from CERN and the Turkish Academy of Sciences. This was the second International Conference on Particle Physics (ICPP) organized in memory of Engin Arık and her colleagues, who lost their lives in the tragic plane accident on 30 November 2007, on their way to the workshop of the Turkish Accelerator Center (TAC) Project. The first of this conference series was held on 27-31 October 2008 at Boğaziçi University, İstanbul, Turkey. The conference is intended to be repeated every two years in Istanbul as a conference series under the name 'ICPP-Istanbul'. Professor Engin Arık had a pioneering role in experimental particle physics in Turkey, and was an inspiring teacher to many colleagues. She led the Turkish participation in experiments at CERN such as CHARM II, SMC, CHORUS, ATLAS and CAST. One of her latest involvements was in the national project to design the Turkish Accelerator Center with the collaboration of 10 Turkish universities including Doğuş and Boğaziçi. Our dear colleagues not only participated in the TAC project but also collaborated on the ATLAS (E Arık, E Abat and B Doğan) and CAST (E Arık, F Şenel Boydağ, İ Hikmet and B Doğan) experiments. We believe that the ICPP-Istanbul conference series has been, and will always be, a way to commemorate them in a most appropriate context. The topics covered in ICPP-Istanbul-II were 'LHC Physics and Tevatron Results', 'Neutrinos and Dark Matter', 'Particle Factories' and 'Accelerator Physics and Future TeV Scale Colliders'. The main emphasis was on recent experimental results in high-energy physics, with discussions on expectations from existing or future experiments.
    There were 20 plenary and 35 contributed talks at the conference, and a majority of these presentations are included in these proceedings. We are grateful to all speakers, the collaborations represented, and all members of the advisory and organizing committees for their invaluable contributions, which enabled the conference to reach such a high scientific quality with many exciting results and discussions, making it a big success. Serkant Ali Çetin, Chair of the Organizing Committee; Peter Jenni, Chair of the Scientific Advisory Committee.
    Scientific Advisory Committee: Ovsat Abdinov (ANAS, Azerbaijan), Metin Arık (Boğaziçi U., Turkey), Albert De Roeck (CERN, Switzerland), Daniel Denegri (CEA, France), Samim Erhan (UCLA, USA), Dan Green (FNAL, USA), Erhan Gülmez (Boğaziçi U., Turkey), Rolf Heuer (CERN, Switzerland), Peter Jenni* (CERN, Switzerland), Max Klein (Liverpool U., UK), Livio Mapelli (CERN, Switzerland), Tatsuya Nakada (EPFL, Switzerland), Yaşar Önel (Iowa U., USA), Gülsen Önengüt (Çukurova U., Turkey), Ken Peach (Oxford U., UK), Christoph Rembser (CERN, Switzerland), Leonid Rivkin (PSI, Switzerland), Yannis Semertzidis (BNL, USA), Saleh Sultansoy (TOBB ETU, Turkey), Gökhan Ünel (UCI, USA), Konstantin Zioutas (Patras U., Greece).
    Organizing Committee: Kazem Azizi (Doğuş U.), Serkant Ali Çetin* (Doğuş U.), Zuhal Kaplan (Boğaziçi U.), Özgül Kurtuluş (Doğuş U.), Erkcan Özcan (Boğaziçi U.), Nefer Şenoğuz (Doğuş U.), İsmail Uman (Doğuş U.). *Committee Chairs.
    Organizing Institutions: Doğuş University; Boğaziçi University. Supporting Institutions: CERN - European Organization for Nuclear Research; TÜBA - The Turkish Academy of Sciences.

  11. GREEN SOFTWARE ENGINEERING PROCESS : MOVING TOWARDS SUSTAINABLE SOFTWARE PRODUCT DESIGN

    OpenAIRE

    Shantanu Ray

    2013-01-01

    The software development lifecycle (SDLC) currently focuses on systematic execution and maintenance of software by dividing the software development process into various phases that include requirements gathering, design, implementation, testing, deployment and maintenance. The problem here is that certain important decisions taken in these phases, like the use of paper, generation of e-waste, power consumption and an increased carbon footprint from travel, air-conditioning, etc., may harm the e...

  12. Software Radio

    Directory of Open Access Journals (Sweden)

    Varun Sharma

    2010-05-01

    Full Text Available This paper aims to provide an overview of a rapidly growing technology in the radio domain which overcomes the drawbacks of conventional analog radio. This is the age of software radio: the technology which transforms hardware radio transceivers into smart programmable devices that can fit into the various devices available in today's rapidly evolving wireless communication industry. This new technology has some or all of the physical-layer functions defined in software. All of the waveform processing of a wireless device, including the physical layer, moves into software. An ideal software radio provides improved device flexibility, software portability, and reduced development costs. This paper goes into the details of all this: it takes one through a brief history of conventional radios, analyzes their drawbacks, and then focuses on how software radio overcomes these shortcomings.

  13. Software engineering

    CERN Document Server

    Zielinski, K

    2005-01-01

    The capability to design quality software and implement modern information systems is at the core of economic growth in the 21st century. Nevertheless, exploiting this potential is only possible when adequate human resources are available and when modern software engineering methods and tools are used. The recent years have witnessed rapid evolution of software engineering methodologies, including the creation of new platforms and tools which aim to shorten the software design process, raise its quality and cut down its costs. This evolution is made possible through ever-increasing knowledge of software design strategies as well as through improvements in system design and code testing procedures. At the same time, the need for broad access to high-performance and high-throughput computing resources necessitates the creation of large-scale, interactive information systems, capable of processing millions of transactions per seconds. These systems, in turn, call for new, innovative distributed software design a...

  14. Software Economies

    OpenAIRE

    Bacon, David F.; Bokelberg, Eric; Chen, Yiling; Kash, Ian; Parkes, David C.; Rao, Malvika; Sridharan, Manu

    2010-01-01

    Software construction has typically drawn on engineering metaphors like building bridges or cathedrals, which emphasize architecture, specification, central planning, and determinism. Approaches to correctness have drawn on metaphors from mathematics, like formal proofs. However, these approaches have failed to scale to modern software systems, and the problem keeps getting worse. We believe that the time has come to completely re-imagine the creation of complex software, drawing on systems i...

  15. The comparison of salivary level of estrogen and progesterone in 1st , 2nd and 3rd trimester in pregnant women with and without geographic tongue

    Science.gov (United States)

    Ghalayani, Parichehr; Tavangar, Atefeh; Nilchian, Firoozeh; Khalighinejad, Navid

    2013-01-01

    Background: Geographic tongue (GT) was first reported as a wandering rash of the tongue in 1831; however, its etiopathogenesis remains unclear. An increased prevalence of GT has been documented in pregnancy. The aim of this study was to compare the levels of salivary estrogen and progesterone in pregnant women with and without GT. Materials and Methods: This analytical-descriptive study consisted of 26 pregnant women (13 with GT, 13 without GT) aged between 18 and 45 years. The estrogen and progesterone levels were measured during the 1st, 2nd and 3rd trimesters of pregnancy. Saliva sampling was performed to determine the level of sex hormones. The samples were stored at -80°C and assayed by the ELISA method. The results were analyzed by t-test and repeated-measures ANOVA (α = 0.05). Results: The mean level of estrogen for the control and case groups was 49.4 and 52.33 in the 1st, 71.05 and 74.12 in the 2nd, and 109.1 and 112.16 in the 3rd trimester, respectively. The mean level of progesterone was 0.72 and 0.72 in the 1st, 1.14 and 1.21 in the 2nd, and 1.3 and 1.28 in the 3rd trimester of pregnancy for the control and case groups, respectively. Although there was no significant difference in the level of sex hormones between the case and control groups (P < 0.05), the difference between the levels of these hormones across the 3 trimesters of pregnancy was significant in each group (P = 0.001). Conclusion: The level of sex hormones is not the only etiologic factor of GT in pregnant women; other factors such as genetic potential, human leukocyte antigen markers and stress may aggravate the incidence of this lesion. PMID:24348617

  16. Analysis of Polish writing on the history of physical education and sports in the North-Eastern borderlands of the 2nd republic

    Directory of Open Access Journals (Sweden)

    Eligiusz Małolepszy

    2013-05-01

    Full Text Available The aim of this paper is to present the current state of research on physical education and sports in the North-Eastern Borderlands of the 2nd Republic, based on an analysis of the Polish literature on the subject. In terms of territorial scope, the paper covers the areas of the Polesie, Novogrodek and Vilnius voivodeships. As for the scope of studies on the history of physical education and sports in the North-Eastern Borderlands of the 2nd Republic, the most cognitively significant is the work by Laskiewicz, „Kultura fizyczna na Wileńszczyźnie w latach 1900-1939. Zarys monograficzny dziejów" (Physical Culture in the Region of Vilnius in the Years 1900-1939. An Outline of Monographic History). The history of physical culture in rural areas has been fairly well documented. In terms of historiography, there are publications presenting physical education and sports in urban areas. These publications mainly refer to physical activity in larger towns and cities, e.g. in Baranowicze, Brest-on-Bug, Lida, Novogrodek and Vilnius. In terms of voivodeships, papers on physical education and sports in the Region of Vilnius significantly predominate. The presented analysis of the state of research, with reference to Polish writings, shows the necessity of supplementary preliminary archival research of the sources in order to prepare a monograph on „Dziejów wychowania fizycznego i sportu na Kresach Północno-Wschodnich II Rzeczypospolitej" (the History of Physical Education and Sports in the North-Eastern Borderlands of the 2nd Republic). Preliminary archival research should also be conducted in the archives kept by Byelorussia and Lithuania.

  17. BioTfueL Project: Targeting the Development of Second-Generation Biodiesel and Biojet Fuels Le projet BioTfueL : un projet de développement de biogazole et biokérosène de 2 génération

    OpenAIRE

    Viguié J.-C.; Ullrich N.; Porot P.; Bournay L.; Hecquet M.; Rousseau J.

    2013-01-01

    2nd generation biofuels will have an important part to play in the energy transition as far as fuels are concerned. Using non-edible biomass, they avoid any direct competition with food usage. Within 2nd generation biofuels, the BTL route consists in the production of middle distillates (diesel and jet fuel) via gasification and Fischer-Tropsch (FT) synthesis. These fuels are called "drop-in" fuels; this means that to be used they technically do not request any modification in t...

  18. Inventing software

    CERN Document Server

    Nichols, Kenneth

    1998-01-01

    Since the introduction of personal computers, software has emerged as a driving force in the global economy and a major industry in its own right. During this time, the U.S. government has reversed its prior policy against software patents and is now issuing thousands of such patents each year, provoking heated controversy among programmers, lawyers, scholars, and software companies. This book is the first to step outside of the highly-polarized debate and examine the current state of the law, its suitability to the realities of software development, and its implications for day-to-day softwa

  19. Proceedings of the 2nd NUCEF international symposium NUCEF'98. Safety research and development of base technology on nuclear fuel cycle

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-03-01

    This volume contains 68 papers presented at the 2nd NUCEF International Symposium NUCEF'98 held on 16-17 November 1998, in Hitachinaka, Japan, following the 1st symposium NUCEF'95 (Proceeding: JAERI-Conf 96-003). The theme of this symposium was 'Safety Research and Development of Base Technology on Nuclear Fuel Cycle'. The papers were presented in oral and poster sessions on following research fields: (1) Criticality Safety, (2) Reprocessing and Partitioning, (3) Radioactive Waste Management. The 68 papers are indexed individually. (J.P.N.)

  20. 2nd (final) IAEA research co-ordination meeting on 'charge exchange cross section data for fusion plasma studies'. Summary report

    International Nuclear Information System (INIS)

    The proceedings and conclusions of the 2nd Research Co-ordination Meeting on 'Charge Exchange Cross Section Data for Fusion Plasma Studies', held on September 25 and 26, 2000 at the IAEA Headquarters in Vienna, are briefly described. This report includes a summary of the presentations made by the meeting participants and a review of the accomplishments of the Co-ordinated Research Project (CRP). In addition, short summaries from the participants are included indicating the specific research completed in support of this CRP. (author)

  1. 2nd (final) IAEA research co-ordination meeting on 'plasma-material interaction data for mixed plasma facing materials in fusion reactors'. Summary report

    International Nuclear Information System (INIS)

    The proceedings and conclusions of the 2nd Research Co-ordination Meeting on 'Plasma-Material Interaction Data for Mixed Plasma Facing Materials in Fusion Reactors', held on October 16 and 17, 2000 at the IAEA Headquarters in Vienna, are briefly described. This report includes a summary of the presentations made by the meeting participants and a review of the accomplishments of the Co-ordinated Research Project (CRP). In addition, short summaries from the participants are included indicating the specific research completed in support of this CRP. (author)

  2. Observation of a single thermoluminescence glow peak described by kinetics more general than the usual 1st and 2nd order kinetics

    Energy Technology Data Exchange (ETDEWEB)

    Hornyak, W.F.; Levy, P.W.; Kierstead, J.A.

    1984-01-01

    It has been shown that at least one thermoluminescence (TL) system, CaF₂:Mn, exhibits a single glow peak that is not described by the well-known 1st- or 2nd-order TL kinetics. However, the glow peak is described by the more general kinetic expression from which the well-known kinetics are derived. In addition to determining the usual kinetic parameters, E and s, it has been shown that the retrapping-to-recombination cross-section ratio in the CaF₂:Mn TL system is roughly 10⁻⁵.
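
    For context, the "more general kinetic expression" from which first- and second-order TL kinetics are derived is commonly written, in the standard one-trap, one-recombination-centre treatment (a textbook sketch, not an equation quoted from the paper), as:

```latex
I(T) = -\frac{dn}{dt}
     = s\,n\,\exp\!\left(-\frac{E}{kT}\right)
       \frac{A_m m}{A_m m + A_n\,(N - n)}
```

    where n and m are the trapped-electron and recombination-centre concentrations, N is the total trap concentration, and A_n/A_m is the retrapping-to-recombination cross-section ratio (the quantity estimated as roughly 10⁻⁵ above). First-order kinetics is recovered when A_n(N-n) ≪ A_m m, and second-order kinetics when A_n = A_m and n = m ≪ N.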

  3. Phase equilibria and crystal chemistry of the CaO–1/2 Nd2O3–CoOz system at 885 °C in air

    International Nuclear Information System (INIS)

    The phase diagram of the CaO–1/2 Nd2O3–CoOz system at 885 °C in air has been determined. The system consists of two calcium cobaltate compounds that have promising thermoelectric properties, namely, the 2D thermoelectric oxide solid solution (Ca3−xNdx)Co4O9−z (0≤x≤0.5), which has a misfit layered structure, and Ca3Co2O6, which consists of 1D chains of alternating CoO6 trigonal prisms and CoO6 octahedra. Ca3Co2O6 was found to be a point compound without substitution of Nd on the Ca site. The reported Nd2CoO4 phase was not observed at 885 °C. A ternary (Ca1−xNd1+x)CoO4−z (x=0) phase, or (CaNdCo)O4−z, was found to be stable at this temperature. A solid solution region of distorted perovskite (Nd1−xCax)CoO3−z (0≤x≤0.25, space group Pnma) was established. In the peripheral binary systems, while a solid solution region was identified for (Nd1−xCax)2O3−z (0≤x≤0.2), Nd was not found to substitute on the Ca site of CaO. Six solid-solution tie-line regions and six three-phase regions were determined in the CaO–Nd2O3–CoOz system in air. - Graphical abstract: Phase diagram of the 1/2 Nd2O3–CaO–CoOx system at 885 °C, showing the limits of various solid solutions and the tie-line relationships of various phases. - Highlights: • Phase diagram of the CaO–1/2 Nd2O3–CoOz system constructed. • System consists of the thermoelectric oxide (Ca3−xNdx)Co4O9−z (0≤x≤0.5). • Structures of (Nd1−xCax)CoO3−z and (CaNdCo)O4−z determined

  4. Proceedings of the 2nd NUCEF international symposium NUCEF'98. Safety research and development of base technology on nuclear fuel cycle

    International Nuclear Information System (INIS)

    This volume contains 68 papers presented at the 2nd NUCEF International Symposium NUCEF'98 held on 16-17 November 1998, in Hitachinaka, Japan, following the 1st symposium NUCEF'95 (Proceeding: JAERI-Conf 96-003). The theme of this symposium was 'Safety Research and Development of Base Technology on Nuclear Fuel Cycle'. The papers were presented in oral and poster sessions on following research fields: (1) Criticality Safety, (2) Reprocessing and Partitioning, (3) Radioactive Waste Management. The 68 papers are indexed individually. (J.P.N.)

  5. Preseismic oscillating electric field "strange attractor like" precursor, of T = 6 months, triggered by Ssa tidal wave. Application on large (Ms > 6.0R) EQs in Greece (October 1st, 2006 - December 2nd, 2008)

    CERN Document Server

    Thanassoulas, C; Verveniotis, G; Zymaris, N

    2009-01-01

    In this work the preseismic "strange attractor like" precursor is studied, in the domain of the Earth's oscillating electric field for T = 6 months. It is assumed that the specific oscillating electric field is generated by the corresponding lithospheric oscillation, triggered by the Ssa tidal wave of the same wavelength (6 months) under excess strain load conditions met in the focal area of a future large earthquake. The analysis of the Earth's oscillating electric field recorded by the two distant monitoring sites of PYR and HIO over a period of 26 months (October 1st, 2006 - December 2nd, 2008) suggests that the specific precursor can successfully resolve the predictive time window in terms of months and for a "swarm" of large EQs (Ms > 6.0R), in contrast to the resolution obtained by the use of electric fields of shorter (T = 1, 14 days, single EQ identification) wavelength. Moreover, the fractal character of the "strange attractor like" precursor in the frequency domain is pointed out. Fina...

  6. Software management issues

    International Nuclear Information System (INIS)

    The difficulty of managing the software in large HEP collaborations appears to become progressively worse with each new generation of detector. If one were to extrapolate to the SSC, it would become a major problem. This paper explores the possible causes of the difficulty and makes suggestions on what corrective actions should be taken

  7. A feasible study of docetaxel/nedaplatin combined chemotherapy for relapsed or refractory esophageal cancer patients as a 2nd-line chemotherapy

    International Nuclear Information System (INIS)

    As a 2nd-line treatment for relapsed or refractory esophageal cancer patients after chemoradiotherapy, we performed combination chemotherapy with docetaxel (DOC)/nedaplatin (CDGP) in 11 patients. Intravenous drip infusion of DOC 30 mg/m2 and CDGP 30 mg/m2 was given on days 1, 8 and 15, with 4 weeks of treatment constituting one cycle. Eight of the 11 patients received more than 2 cycles, and 4 of those 8 were also treated with radiation therapy (RT). Response assessed by the Response Evaluation Criteria In Solid Tumors (RECIST) revealed partial response (PR) in 2 patients (50%), stable disease (SD) in 1 patient and progressive disease (PD) in 1 patient without RT, and PR in 3 patients and no response in 1 patient with RT, respectively. There were no treatment-related deaths and no grade 4 adverse events. Grade 3 leukopenia was observed in 3 patients; no non-hematological toxicities of grade 3 or higher were observed. The combination chemotherapy of DOC/CDGP was concluded to be safe and effective for relapsed or refractory esophageal cancer patients as a 2nd-line treatment. (author)

  8. A study on trait anger – anger expression and friendship commitment levels of primary school 2nd stage students who play – do not play sports

    Directory of Open Access Journals (Sweden)

    Hüseyin Kırımoğlu

    2010-09-01

    Full Text Available The aim of this research was to investigate trait anger-anger expression and friendship commitment levels depending on whether 2nd-stage primary school students played sports or not. A Personal Information Form, the Trait Anger-Anger Expression Scales and the Inventory of Peer Attachment were used to collect the data. The population of the research consisted of the students studying in the 2nd stage of 40 state primary schools belonging to the National Education Directorate of Hatay Province in the 2009-2010 academic year. The sample group was made up of 853 students from 21 primary schools selected from the population (262 boys and 149 girls who played sports as registered players; 233 boys and 209 girls who did not play sports). To sum up, the comparison of the trait anger and external anger scores of the participant students who played sports yielded a statistically significant difference in terms of the sex variable (p<0.05). As for the sedentary group, boys had higher scores of internal anger and external anger than girls. In the comparison of the friendship commitment scores of sedentary students in terms of the sex variable, a statistically significant difference was found between girls and boys, in favour of boys (p<0.05).

  9. Silverlight 4 Business Intelligence Software

    CERN Document Server

    Czernicki, Bart

    2010-01-01

    Business Intelligence (BI) software allows you to view different components of a business using a single visual platform, which makes comprehending mountains of data easier. BI is everywhere. Applications that include reports, analytics, statistics, and historical and predictive modeling are all examples of BI. Currently, we are in the second generation of BI software - called BI 2.0 - which is focused on writing BI software that is predictive, adaptive, simple, and interactive. As computers and software have evolved, more data can be presented to end users with increasingly visually rich tech

  10. Software requirements

    CERN Document Server

    Wiegers, Karl E

    2003-01-01

    Without formal, verifiable software requirements-and an effective system for managing them-the programs that developers think they've agreed to build often will not be the same products their customers are expecting. In SOFTWARE REQUIREMENTS, Second Edition, requirements engineering authority Karl Wiegers amplifies the best practices presented in his original award-winning text, now a mainstay for anyone participating in the software development process. In this book, you'll discover effective techniques for managing the requirements engineering process all the way through the development cy

  11. Software review

    OpenAIRE

    Modak, Jayant M

    2000-01-01

    Extend from Imagine That Inc. is simulation software which the company advertises as software for the next millennium. I had not seen this software before, and therefore, was not sure of what to expect from it. But I was pleasantly surprised with its abilities after working with it for a few days. Extend is supplied on a CD, accompanied by a Users Manual which covers various topics such as building a model, enhancing the model and running the model with the blocks provided with the model. It ...

  12. Nuclear application software package

    International Nuclear Information System (INIS)

    The Nuclear Application Software Package generates a full-core distribution and power peaking analysis every six minutes during reactor operation. Information for these calculations is provided by a set of fixed incore, self-powered rhodium detectors whose signals are monitored and averaged to obtain input for the software. Following the calculation of a power distribution and its normalization to a core heat balance, the maximum power peaks in the core and minimum DNBR are calculated. Additional routines are provided to calculate the core reactivity, future xenon concentrations, critical rod positions, and assembly isotopic concentrations
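The normalization step described above can be sketched in a few lines. This is a hypothetical illustration of scaling detector-derived relative powers to an independent core heat balance and extracting a power peaking factor; the numbers and function names are invented, not taken from the plant software:

```python
# Hypothetical sketch of the normalization step: raw rhodium-detector
# readings are scaled so the core total matches an independent heat
# balance, then the power peaking factor is reported.

def normalize_to_heat_balance(detector_powers, core_thermal_power_mw):
    """Scale relative detector powers so they sum to the measured heat balance."""
    scale = core_thermal_power_mw / sum(detector_powers)
    return [p * scale for p in detector_powers]

def peaking_factor(assembly_powers_mw):
    """Ratio of the hottest assembly to the core-average assembly power."""
    avg = sum(assembly_powers_mw) / len(assembly_powers_mw)
    return max(assembly_powers_mw) / avg

raw = [0.9, 1.1, 1.3, 0.7]                  # relative detector signals (made up)
powers = normalize_to_heat_balance(raw, 2900.0)
print(round(sum(powers), 1))                # 2900.0 -> matches the heat balance
print(round(peaking_factor(powers), 2))     # 1.3 (hottest / average)
```

The real package then compares the peak against DNBR limits; the sketch only shows the scaling arithmetic.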

  13. An overview of second generation biofuel technologies.

    Science.gov (United States)

    Sims, Ralph E H; Mabee, Warren; Saddler, Jack N; Taylor, Michael

    2010-03-01

    The recently identified limitations of 1st-generation biofuels produced from food crops (with perhaps the exception of sugarcane ethanol) have caused greater emphasis to be placed on 2nd-generation biofuels produced from ligno-cellulosic feedstocks. Although significant progress continues to be made to overcome the technical and economic challenges, 2nd-generation biofuels production will continue to face major constraints to full commercial deployment. The logistics of providing a competitive, all-year-round, supply of biomass feedstock to a commercial-scale plant is challenging, as is improving the performance of the conversion process to reduce costs. The biochemical route, being less mature, probably has a greater cost reduction potential than the thermo-chemical route, but here a wider range of synthetic fuels can be produced to better suit heavy truck, aviation and marine applications. Continued investment in research and demonstration by both public and private sectors, coupled with appropriate policy support mechanisms, are essential if full commercialisation is to be achieved within the next decade. After that, the biofuel industry will grow only at a steady rate and encompass both 1st- and 2nd-generation technologies that meet agreed environmental, sustainability and economic policy goals. PMID:19963372

  14. Software Reviews.

    Science.gov (United States)

    Mathematics and Computer Education, 1988

    1988-01-01

    Presents reviews of six software packages. Includes (1) "Plain Vanilla Statistics"; (2) "MathCAD 2.0"; (3) "GrFx"; (4) "Trigonometry"; (5) "Algebra II"; (6) "Algebra Drill and Practice I, II, and III." (PK)

  15. Software Radio

    OpenAIRE

    Varun Sharma; Yadvinder Singh Mann

    2010-01-01

    This paper aims to provide an overview of a rapidly growing technology in the radio domain which overcomes the drawbacks suffered by conventional analog radio. This is the age of Software radio – the technology which tries to transform hardware radio transceivers into smart programmable devices which can fit into the various devices available in today's rapidly evolving wireless communication industry. With this new technology, some or all of the physical-layer functions are software defined...

  16. A neutron diffraction study of structural distortion and magnetic ordering in the cation-ordered perovskites Ba2Nd1−xYxMoO6

    International Nuclear Information System (INIS)

    The cation-ordered perovskites Ba2Nd1−xYxMoO6 (0.04≤x≤0.35) have been synthesised by solid-state techniques under reducing conditions at temperatures up to 1350 °C. Rietveld analyses of X-ray and neutron powder diffraction data show that these compounds adopt a tetragonally distorted perovskite structure. The tetragonal distortion is driven by the bonding requirements of the Ba2+ cation that occupies the central interstice of the perovskite; this cation would be underbonded if these compounds retained the cubic symmetry exhibited by the prototypical structure. The size and charge difference between the lanthanides and Mo5+ lead to complete ordering of the cations to give a rock-salt ordering of Nd3+/Y3+O6 and MoO6 octahedra. The I4/m space group symmetry is retained on cooling the x=0.1, 0.2 and 0.35 samples to low temperature, ca. 2 K. Ba2Nd0.90Y0.10MoO6 undergoes a gradual distortion of the MoO6 units on cooling from room temperature to give two long trans bonds (2.001(2) Å) along the z-direction and four shorter apical bonds (1.9563(13) Å) in the xy-plane. This distortion of the MoO6 units stabilises the 4d1 electron in the dxz and dyz orbitals whilst the dxy orbital is increased in energy due to the contraction of the Mo–O bonds in the xy-plane. This bond extension along z is propagated through the structure and gives a negative thermal expansion of −13×10−6 K−1 along c. The overall volumetric thermal expansion is positive due to conventional expansion along the other two crystallographic axes. With increasing Y3+ content this distortion is reduced in x=0.2 and eliminated in x=0.35, which contains largely regular MoO6 octahedra. The x=0.1 and x=0.2 samples show small peaks in the neutron diffraction profile due to long-range antiferromagnetic order arising from ordered moments of ca. 2 μB. - Graphical Abstract: The distortion in the molybdenum crystal field is continuously adjusted by chemical composition of the perovskite. Highlights: • Introducing Y3+ into Ba2NdMoO6 stabilises tetragonal symmetry to 2 K. • A distortion of the ligand field around the Mo5+ 4d1 cation confers electronic stabilisation. • The size of the distortion is progressively reduced with increasing Y3+ content. • Distortion gives negative thermal expansion along z and antiferromagnetic order at T≈15 K
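The quoted coefficient follows directly from the definition of the linear thermal expansion coefficient, alpha_c = (1/c)(Δc/ΔT). The lattice parameters below are invented to illustrate the arithmetic only; they are not the paper's refined values:

```python
# Numerical illustration (invented lattice parameters) of a negative
# thermal expansion coefficient along c: alpha_c = (1/c) * (dc/dT).

def linear_expansion_coeff(c_low, c_high, t_low, t_high):
    """Mean linear expansion coefficient between two temperatures (per K)."""
    c_mean = 0.5 * (c_low + c_high)
    return (c_high - c_low) / (c_mean * (t_high - t_low))

# c contracts on warming (negative expansion): larger at 2 K than at 300 K
alpha_c = linear_expansion_coeff(c_low=8.4325, c_high=8.4000,
                                 t_low=2.0, t_high=300.0)
print(alpha_c < 0)               # True
print(round(alpha_c * 1e6, 1))   # -13.0, i.e. -13 x 10^-6 per K
```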

  17. PREFACE: 2nd Russia-Japan-USA Symposium on the Fundamental and Applied Problems of Terahertz Devices and Technologies (RJUS TeraTech - 2013)

    Science.gov (United States)

    Karasik, Valeriy; Ryzhii, Viktor; Yurchenko, Stanislav

    2014-03-01

    The 2nd Russia-Japan-USA Symposium 'The Fundamental & Applied Problems of Terahertz Devices & Technologies' (RJUS TeraTech - 2013) Bauman Moscow State Technical University Moscow, Russia, 3-6 June, 2013 The 2nd Russia-Japan-USA Symposium 'The Fundamental & Applied Problems of Terahertz Devices & Technologies' (RJUS TeraTech - 2013) was held at Bauman Moscow State Technical University on 3-6 June 2013 and was devoted to modern problems of terahertz optical technologies. RJUS TeraTech 2013 was organized by Bauman Moscow State Technical University in cooperation with Tohoku University (Sendai, Japan) and the University at Buffalo (The State University of New York, USA). The Symposium was supported by Bauman Moscow State Technical University (Moscow, Russia) and the Russian Foundation for Basic Research (grant number 13-08-06100-g). RJUS TeraTech - 2013 became a forum for sharing and discussing modern and promising achievements in fundamental and applied problems of terahertz optical technologies, devices based on graphene and graphene structures, and condensed matter of different kinds. More than 100 researchers and students from different countries participated in RJUS TeraTech - 2013. This volume contains the proceedings of the 2nd Russia-Japan-USA Symposium 'The Fundamental & Applied Problems of Terahertz Devices & Technologies'. 
Valeriy Karasik, Viktor Ryzhii and Stanislav Yurchenko Bauman Moscow State Technical University Symposium chair Anatoliy A Aleksandrov, Rector of BMSTU Symposium co-chair Valeriy E Karasik, Head of the Research and Educational Center 'PHOTONICS AND INFRARED TECHNOLOGY' (Russia) Invited Speakers Taiichi Otsuji, Research Institute of Electrical Communication, Tohoku University, Sendai, Japan Akira Satou, Research Institute of Electrical Communication, Tohoku University, Sendai, Japan Michael Shur, Electrical, Computer and System Engineering and Physics, Applied Physics, and Astronomy, Rensselaer Polytechnic Institute, NY, USA Natasha Kirova, University Paris-Sud, France Andrei Sergeev, Department of Electrical Engineering, The University at Buffalo, The State University of New York, Buffalo, NY, USA Magnus Willander, Linköping University (LIU), Department of Science and Technology, Linköping, Sweden Dmitry R Khohlov, Physical Faculty, Lomonosov Moscow State University, Russia Vladimir L Vaks, Institute for Physics of Microstructures of Russian Academy of Sciences, Russia

  18. ISE-SPL: uma abordagem baseada em linha de produtos de software aplicada à geração automática de sistemas para educação médica na plataforma E-learning / ISE-SPL: a software product line approach applied to automatic generation of systems for medical education in E-learning platform

    Scientific Electronic Library Online (English)

    Túlio de Paiva Marques, Carvalho; Bruno Gomes de, Araújo; Ricardo Alexsandro de Medeiros, Valentim; Jose, Diniz Junior; Francis Solange Vieira, Tourinho; Rosiane Viana Zuza, Diniz.

    2013-12-01

    Full Text Available INTRODUCTION: E-learning, which refers to the use of Internet-related technologies to improve knowledge and learning, has emerged as a complementary form of education, bringing advantages such as increased accessibility to information, personalized learning, democratization of education and ease of update, distribution and standardization of content. In this sense, this paper presents a tool, named ISE-SPL, whose purpose is the automatic generation of e-learning systems for medical education, making use of ISE (Interactive Spaced-Education) systems and concepts of Software Product Lines. METHODS: The tool consists of an innovative methodology for medical education that aims to assist professors of healthcare in their teaching through the use of educational technologies, all based on computing applied to healthcare (Informatics in Health). RESULTS: The tests performed to validate ISE-SPL were divided into two stages: the first used S.P.L.O.T, a software analysis tool for product lines similar to ISE-SPL; the second applied usability questionnaires to healthcare professors who used ISE-SPL. CONCLUSION: Both tests showed positive results, allowing the conclusion that ISE-SPL is an efficient tool for the generation of e-learning software and useful for teachers in healthcare.

  19. Parametric Estimation of Software Systems

    Directory of Open Access Journals (Sweden)

    Kavita Choudhary

    2011-05-01

    Full Text Available Software is characterised by software metrics, and effort estimation is one such metric. Software effort estimation plays a vital role in the development of software. In recent years, software has become the most expensive component of computer system projects. The major part of the cost of software development is due to human effort, and most cost estimation methods focus on this aspect and give estimates in terms of person-months. In this paper, the effort required for the development of a software project is estimated using a genetic algorithm approach. Software systems are becoming complex, and they call for new, effective and optimized techniques under limited resources. A solution to this problem lies in nature, where complex species have evolved from simple organisms and constantly adapt to changes in the environment. For natural species this takes hundreds of generations over many years, which is impractical in the field of software engineering. With a genetic algorithm, it can be done almost instantly by simulating the evolution with standard tools.
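The genetic-algorithm idea can be sketched in a few lines. Everything below is illustrative, not from the paper: a COCOMO-style model effort = a * size**b is fitted to invented (KLOC, person-month) project data by evolving the coefficients a and b:

```python
import random

# Illustrative sketch of GA-based effort estimation: evolve (a, b) in the
# model effort = a * size**b to minimise the mean magnitude of relative
# error (MMRE) against made-up historical project data.
random.seed(1)
DATA = [(10, 26), (46, 96), (91, 158), (24, 60)]  # (KLOC, person-months)

def fitness(ind):
    """Mean magnitude of relative error; lower is better."""
    a, b = ind
    return sum(abs(a * s ** b - e) / e for s, e in DATA) / len(DATA)

def mutate(ind):
    """Gaussian perturbation, keeping both genes positive."""
    return tuple(max(0.01, g + random.gauss(0, 0.1)) for g in ind)

pop = [(random.uniform(1, 5), random.uniform(0.5, 1.5)) for _ in range(30)]
init_best = min(fitness(ind) for ind in pop)
for _ in range(200):                      # generations
    pop.sort(key=fitness)
    parents = pop[:10]                    # truncation selection (elitist)
    pop = parents + [mutate(random.choice(parents)) for _ in range(20)]
final_best = min(fitness(ind) for ind in pop)

print(final_best <= init_best)  # True: elitism never loses the best found
```

Real GA-based estimators add crossover and richer cost drivers; the sketch keeps only selection and mutation to show the loop structure.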

  20. Proceedings of the Fifth Triennial Software Quality Forum 2000, Software for the Next Millennium, Software Quality Forum

    Energy Technology Data Exchange (ETDEWEB)

    Scientific Software Engineering Group, CIC-12

    2000-04-01

    The Software Quality Forum is a triennial conference held by the Software Quality Assurance Subcommittee for the Department of Energy's Quality Managers. The forum centers on key issues, information, and technology important in software development for the Nuclear Weapons Complex. This year it will be opened up to include local information technology companies and software vendors presenting their solutions, ideas, and lessons learned. The Software Quality Forum 2000 will take on a more hands-on, instructional tone than those previously held. There will be an emphasis on providing information, tools, and resources to assist developers in their goal of producing next generation software.

  1. Design and manufacture of a D-shape coil-based toroid-type HTS DC reactor using 2nd generation HTS wire

    Science.gov (United States)

    Kim, Kwangmin; Go, Byeong-Soo; Sung, Hae-Jin; Park, Hea-chul; Kim, Seokho; Lee, Sangjin; Jin, Yoon-Su; Oh, Yunsang; Park, Minwon; Yu, In-Keun

    2014-09-01

    This paper describes the design specifications and performance of a real toroid-type high temperature superconducting (HTS) DC reactor. The HTS DC reactor was designed using 2G HTS wires. The HTS coils of the toroid-type DC reactor magnet were made in the form of a D-shape. The target inductance of the HTS DC reactor was 400 mH. The expected operating temperature was under 20 K. The electromagnetic performance of the toroid-type HTS DC reactor magnet was analyzed using a finite element method program. A conduction cooling method was adopted for reactor magnet cooling. The performance of the toroid-type HTS DC reactor was analyzed through experiments conducted under steady-state and charging conditions. The fundamental design specifications and the data obtained from this research will be applied to the design of a commercial-type HTS DC reactor.
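As a rough plausibility check of a 400 mH target, one can use the ideal air-core toroid formula L = mu0 * N^2 * A / (2 * pi * R). The geometry below (turn count, cross-section, major radius) is invented for illustration; the paper's D-shaped coils require finite-element analysis, as the abstract notes:

```python
import math

# Back-of-the-envelope inductance of an ideal air-core toroid.
# All geometry numbers are invented; this is not the paper's design.

MU0 = 4e-7 * math.pi  # vacuum permeability, H/m

def toroid_inductance(n_turns, area_m2, major_radius_m):
    """Ideal air-core toroid inductance in henries."""
    return MU0 * n_turns ** 2 * area_m2 / (2 * math.pi * major_radius_m)

L = toroid_inductance(n_turns=3000, area_m2=0.01, major_radius_m=0.3)
print(round(L * 1e3, 1), "mH")   # 60.0 mH for this assumed geometry

# Turns needed to reach 0.4 H with the same assumed cross-section and radius:
# L = 2e-7 * N^2 * A / R  =>  N = sqrt(L * R / (2e-7 * A))
needed = math.sqrt(0.4 * 0.3 / (2e-7 * 0.01))
print(round(needed))             # 7746
```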

  2. Recent Observations of Clouds and Precipitation by the Airborne Precipitation Radar 2nd Generation in Support of the GPM and ACE Missions

    Science.gov (United States)

    Durden, Stephen L.; Tanelli, Simone; Im, Eastwood

    2012-01-01

    In this paper we illustrate the unique dataset collected during the Global Precipitation Measurement Cold-season Precipitation Experiment (GCPEx, US/Canada Jan/Feb 2012). We will focus on the significance of these observations for the development of algorithms for GPM and ACE, with particular attention to classification and retrievals of frozen and mixed phase hydrometeors.

  3. The optical, mechanical, and thermal design and performance of the 2nd generation redshift (z) and early universe spectrometer, ZEUS-2

    Science.gov (United States)

    Parshley, Stephen C.; Ferkinhoff, Carl; Nikola, Thomas; Stacey, Gordon J.; Ade, Peter A.; Tucker, Carole E.

    2012-09-01

    We have built a new long-slit grating spectrometer (ZEUS-2) for observations in the submillimeter wavelength regime (200–650 μm). ZEUS-2 is optimized for observations of redshifted far-infrared spectral lines from galaxies in the early Universe. The spectrometer employs three transition-edge sensed bolometer arrays, allowing for simultaneous observations of multiple lines in several telluric windows. Here we will discuss the optical, mechanical, and thermal requirements of ZEUS-2 and their subsequent design and performance. The entire instrument is cooled using a pulse tube cryocooler and an adiabatic demagnetization refrigerator. The cryogen-free approach enables remote control of the cooling system and allows for deployment of ZEUS-2 to telescope sites where access is limited. The compact and lightweight design is also within the size and weight constraints of several submm telescopes, making ZEUS-2 deployable at a variety of sites. ZEUS-2 completed a successful engineering run at the CSO on Mauna Kea in May 2012, and we plan to have our science-grade array system deployed on the APEX telescope in Chile for a science run in the fall of 2012.
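The connection between redshift and the instrument's band is just lambda_obs = lambda_rest * (1 + z). A minimal sketch, where the [CII] 157.7 μm fine-structure line and z = 1.3 are illustrative values, not observations from the paper:

```python
# A redshifted far-infrared line at rest wavelength lambda_rest appears
# at lambda_obs = lambda_rest * (1 + z); ZEUS-2 covers 200-650 um.

def observed_wavelength_um(rest_um, z):
    """Observed wavelength of a line emitted at rest_um by a source at redshift z."""
    return rest_um * (1.0 + z)

def in_band(obs_um, band=(200.0, 650.0)):
    """Does the line fall in ZEUS-2's 200-650 um regime?"""
    return band[0] <= obs_um <= band[1]

obs = observed_wavelength_um(157.7, 1.3)   # [CII] from a hypothetical z = 1.3 galaxy
print(round(obs, 1))   # 362.7 (um)
print(in_band(obs))    # True
```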

  4. Computing and software

    Directory of Open Access Journals (Sweden)

    White, G. C.

    2004-06-01

    Full Text Available The reality is that the statistical methods used for analysis of data depend upon the availability of software. Analysis of marked animal data is no different than the rest of the statistical field. The methods used for analysis are those that are available in reliable software packages. Thus, the critical importance of having reliable, up-to-date software available to biologists is obvious. Statisticians have continued to develop more robust models, ever expanding the suite of potential analysis methods available. But without software to implement these newer methods, they will languish in the abstract, and not be applied to the problems deserving them. In the Computers and Software Session, two new software packages are described, a comparison of implementations of methods for the estimation of nest survival is provided, and a more speculative paper about how the next generation of software might be structured is presented. Rotella et al. (2004) compare nest survival estimation with different software packages: SAS logistic regression, SAS non-linear mixed models, and Program MARK. Nests are assumed to be visited at various, possibly infrequent, intervals. All of the approaches described compute nest survival with the same likelihood, and require that the age of the nest is known to account for nests that eventually hatch. However, each approach offers advantages and disadvantages, explored by Rotella et al. (2004). Efford et al. (2004) present a new software package called DENSITY. The package computes population abundance and density from trapping arrays and other detection methods with a new and unique approach. DENSITY represents the first major addition to the analysis of trapping arrays in 20 years. Barker & White (2004) discuss how existing software such as Program MARK requires that each new model's likelihood must be programmed specifically for that model. 
    They wishfully think that future software might allow the user to combine pieces of likelihood functions together to generate estimates. The idea is interesting, and maybe some bright young statistician can work out the specifics to implement the procedure. Choquet et al. (2004) describe MSURGE, a software package that implements multistate capture-recapture models. The unique feature of MSURGE is that the design matrix is constructed with an interpreted language called GEMACO. Because MSURGE is limited to just multistate models, the special requirements of these likelihoods can be provided. The software and methods presented in these papers give biologists and wildlife managers an expanding range of possibilities for data analysis. Although ease-of-use is generally getting better, it does not replace the need for understanding of the requirements and structure of the models being computed. The internet provides access to many free software packages as well as user-discussion groups to share knowledge and ideas. (A starting point for wildlife-related applications is http://www.phidot.org.)
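The shared likelihood behind the nest-survival comparison can be sketched directly. This is a hedged illustration of the standard interval likelihood (a nest surviving t days contributes DSR**t; one failing during the interval contributes 1 − DSR**t), with invented observations; it is not code from any of the packages compared:

```python
import math

# Toy nest-survival likelihood: (interval_days, survived) observations,
# maximised over the daily survival rate (DSR) by a crude grid search.

OBS = [(3, True), (5, True), (4, False), (7, True), (2, False)]  # invented data

def log_lik(dsr):
    """Log-likelihood of the observations for a given daily survival rate."""
    ll = 0.0
    for t, survived in OBS:
        p = dsr ** t                      # probability of surviving t days
        ll += math.log(p if survived else 1.0 - p)
    return ll

grid = [i / 1000 for i in range(500, 1000)]   # candidate DSR values
dsr_hat = max(grid, key=log_lik)              # grid-search MLE
print(0.85 < dsr_hat < 0.95)                  # True for this toy data
```

Program MARK and the SAS approaches maximise this same likelihood with proper optimisers and covariates; the grid search is only to keep the example self-contained.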

  5. Comparison of Strong Gravitational Lens Model Software II. HydraLens: Computer-Assisted Strong Gravitational Lens Model Generation and Translation

    CERN Document Server

    Lefor, Alan T

    2015-01-01

    The behavior of strong gravitational lens model software in the analysis of lens models is not necessarily consistent among the various software packages available, suggesting that the use of several models may enhance the understanding of the system being studied. Among the publicly available codes, the model input files are heterogeneous, making the creation of multiple models tedious. An enhanced method of creating model files, and a method to easily create multiple models, may increase the number of comparison studies. HydraLens simplifies the creation of model files for four strong gravitational lens model software packages, including Lenstool, Gravlens/Lensmodel, glafic and PixeLens, using a custom designed GUI for each of the four codes that simplifies the entry of the model, obviating the need to consult user manuals to set the values of the many flags and data fields. HydraLens is designed in a modular fashion, which simplifies the addition of other strong gravitational lens codes in th...

  6. The Use of the 2nd Law as a Potential Design Tool for Aircraft Air Frame Subsystems

    Directory of Open Access Journals (Sweden)

    David J. Moorhouse

    2006-12-01

    Full Text Available This paper presents the modeling of the irreversible thermodynamics of the Air Frame Subsystem as a component of integrated aircraft design/synthesis. Entropy calculation procedures for complicated geometries in curvilinear coordinates are described, including the effects of turbulence. Both inviscid and viscous calculations are reported and the contributions of the various terms in the entropy equation are investigated. The procedure is validated and then extended to the calculation of entropy generation associated with flow over the B747-200 aircraft. Results show that most of the entropy generation is due to turbulence. The viscous dissipation term in the entropy equation dominates compared to the heat transfer term. The implications of the results for design improvement are briefly discussed.
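The two contributions being compared can be made explicit with the standard volumetric entropy-generation rate for a Newtonian fluid (a textbook form, not the paper's exact notation):

```latex
\dot{S}'''_{\mathrm{gen}}
  = \underbrace{\frac{k}{T^{2}}\,\lvert\nabla T\rvert^{2}}_{\text{heat transfer}}
  + \underbrace{\frac{\mu\,\Phi}{T}}_{\text{viscous dissipation}}
```

where k is the thermal conductivity, μ the dynamic viscosity and Φ the viscous dissipation function; the finding above is that the μΦ/T term, dominated by turbulence, far exceeds the heat-transfer term for the aircraft flow studied.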

  7. Final Report of the 2nd Ex-Vessel Neutron Dosimetry Installation And Evaluations for Kori Unit 3 Reactor Pressure Vessel

    International Nuclear Information System (INIS)

    This report describes a neutron fluence assessment performed for the Kori Unit 3 pressure vessel beltline region based on the guidance specified in Regulatory Guide 1.190. In this assessment, maximum fast neutron exposures expressed in terms of fast neutron fluence (E>1 MeV) and iron atom displacements (dpa) were established for the beltline region of the pressure vessel. After Cycle 16 of reactor operation, 2nd Ex-Vessel Neutron Dosimetry Program was instituted at Kori Unit 3 to provide continuous monitoring of the beltline region of the reactor vessel. The use of the Ex-Vessel Neutron Dosimetry Program coupled with available surveillance capsule measurements provides a plant specific data base that enables the evaluation of the vessel exposure and the uncertainty associated with that exposure over the service life of the unit. Ex-Vessel Neutron Dosimetry has been evaluated at the conclusion of Cycle 17

  8. Double NASICON-type cell: ordered Nd3+ distribution in Li0.2Nd0.8/3Zr2(PO4)3.

    Science.gov (United States)

    Barré, Maud; Crosnier-Lopez, Marie-Pierre; Le Berre, Françoise; Bohnké, Odile; Suard, Emmanuelle; Fourquet, Jean-Louis

    2008-06-21

    The NASICON compound Li0.2Nd0.8/3Zr2(PO4)3, synthesized by a sol-gel process, has been structurally characterized by TEM and powder diffraction (neutron and X-ray). It crystallizes in the space group R3̄ (No. 148): at room temperature, the Nd3+ ions present an ordered distribution in the [Zr2(PO4)3]− network which leads to a doubling of the classical c parameter (a = 8.7160(3) Å, c = 46.105(1) Å). Above 600 °C, Nd3+ diffusion occurs, leading at 1000 °C to the loss of the supercell. This reversible cationic diffusion in a preserved 3D [Zr2(PO4)3]− network is followed through thermal X-ray diffraction. Ionic conductivity measurements have been undertaken by impedance spectroscopy, while some results concerning the sintering of the NASICON compound are given. PMID:18521448

  9. Final Report of the 2nd Ex-Vessel Neutron Dosimetry Installation And Evaluations for Yonggwang Unit 1 Reactor Pressure Vessel

    International Nuclear Information System (INIS)

    This report describes a neutron fluence assessment performed for the Yonggwang Unit 1 pressure vessel belt line region based on the guidance specified in Regulatory Guide 1.190. In this assessment, maximum fast neutron exposures expressed in terms of fast neutron fluence (E>1 MeV) and iron atom displacements (dpa) were established for the belt line region of the pressure vessel. During Cycle 16 of reactor operation, 2nd Ex-Vessel Neutron Dosimetry Program was instituted at Yonggwang Unit 1 to provide continuous monitoring of the belt line region of the reactor vessel. The use of the Ex-Vessel Neutron Dosimetry Program coupled with available surveillance capsule measurements provides a plant specific data base that enables the evaluation of the vessel exposure and the uncertainty associated with that exposure over the service life of the unit. Ex-Vessel Neutron Dosimetry has been evaluated at the conclusion of Cycle 16

  10. Final Report of the 2nd Ex-Vessel Neutron Dosimetry Installation And Evaluations for Yonggwang Unit 1 Reactor Pressure Vessel

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Byoung Chul; Yoo, Choon Sung; Lee, Sam Lai; Chang, Kee Ok; Gong, Un Sik; Choi, Kwon Jae; Chang, Jong Hwa; Li, Nam Jin; Hong, Joon Wha

    2007-01-15

    This report describes a neutron fluence assessment performed for the Yonggwang Unit 1 pressure vessel belt line region based on the guidance specified in Regulatory Guide 1.190. In this assessment, maximum fast neutron exposures expressed in terms of fast neutron fluence (E>1 MeV) and iron atom displacements (dpa) were established for the belt line region of the pressure vessel. During Cycle 16 of reactor operation, 2nd Ex-Vessel Neutron Dosimetry Program was instituted at Yonggwang Unit 1 to provide continuous monitoring of the belt line region of the reactor vessel. The use of the Ex-Vessel Neutron Dosimetry Program coupled with available surveillance capsule measurements provides a plant specific data base that enables the evaluation of the vessel exposure and the uncertainty associated with that exposure over the service life of the unit. Ex-Vessel Neutron Dosimetry has been evaluated at the conclusion of Cycle 16.

  11. Final Report of the 2nd Ex-Vessel Neutron Dosimetry Installation And Evaluations for Kori Unit 1 Reactor Pressure Vessel

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Byoung Chul; Yoo, Choon Sung; Lee, Sam Lai; Chang, Kee Ok; Gong, Un Sik; Choi, Kwon Jae; Chang, Jong Hwa; Kim, Kwan Hyun; Hong, Joon Wha

    2007-02-15

    This report describes a neutron fluence assessment performed for the Kori Unit 1 pressure vessel beltline region based on the guidance specified in Regulatory Guide 1.190. In this assessment, maximum fast neutron exposures expressed in terms of fast neutron fluence (E>1 MeV) and iron atom displacements (dpa) were established for the beltline region of the pressure vessel. After Cycle 22 of reactor operation, 2nd Ex-Vessel Neutron Dosimetry Program was instituted at Kori Unit 1 to provide continuous monitoring of the beltline region of the reactor vessel. The use of the Ex-Vessel Neutron Dosimetry Program coupled with available surveillance capsule measurements provides a plant specific data base that enables the evaluation of the vessel exposure and the uncertainty associated with that exposure over the service life of the unit. Ex-Vessel Neutron Dosimetry has been evaluated at the conclusion of Cycle 23.

  12. 2nd ESMO Consensus Conference on Lung Cancer: non-small-cell lung cancer first-line/second and further lines of treatment in advanced disease

    DEFF Research Database (Denmark)

    Besse, B; Adjei, A

    2014-01-01

    To complement the existing treatment guidelines for all tumour types, ESMO organises consensus conferences to focus on specific issues in each type of tumour. The 2nd ESMO Consensus Conference on Lung Cancer was held on 11-12 May 2013 in Lugano. A total of 35 experts met to address several questions on non-small-cell lung cancer (NSCLC) in each of four areas: pathology and molecular biomarkers, first-line/second and further lines of treatment in advanced disease, early-stage disease and locally advanced disease. For each question, recommendations were made including reference to the grade of recommendation and level of evidence. This consensus paper focuses on first line/second and further lines of treatment in advanced disease.

  13. CaF/sub 2/:Mn thermoluminescence: a single glow peak not described by 1st or 2nd order kinetics

    Energy Technology Data Exchange (ETDEWEB)

    Hornyak, W.F.; Levy, P.W.; Kierstead, J.A.

    1984-12-01

    The thermoluminescence (TL) of CaF/sub 2/:Mn has been studied using photon counting and digital recording. For doses of 10 rad or less the TL glow curves appear to consist of a single glow peak. However, there are indications - which are pronounced at larger doses - that one additional low intensity peak (area less than or equal to one percent) is superimposed on each side of the central peak. The intense peak is not described by 1st or 2nd order kinetics but is well described by the more general kinetics from which these kinetics are derived. These observations, and the results of additional kinetic analysis, demonstrate that retrapping is not negligible and may include all three peaks. In such systems, which are likely to include other dosimeter materials and minerals, peak height will not increase linearly with dose; an important factor for dosimetry and dating applications.
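
The general-order kinetics the abstract invokes can be illustrated numerically: detrapping follows dn/dt = -s·n^b·exp(-E/kT) along a linear heating ramp, with b = 1 and b = 2 recovering first- and second-order behaviour. The sketch below integrates this equation with placeholder trap parameters (E, s, b are illustrative values, not the CaF2:Mn ones):

```python
import math

K_B = 8.617e-5  # Boltzmann constant, eV/K

def glow_curve(E=1.0, s=1e12, b=1.5, n0=1.0, beta=1.0,
               T0=300.0, T1=600.0, dT=0.01):
    """Integrate general-order detrapping dn/dt = -s * n**b * exp(-E/kT)
    along a linear heating ramp T = T0 + beta * t (so dt = dT / beta).
    b = 1 gives first-order kinetics, b = 2 second-order; intermediate
    b covers the 'more general kinetics' the abstract refers to."""
    n = n0
    temps, intensity = [], []
    T = T0
    while T < T1 and n > 0.0:
        rate = s * n**b * math.exp(-E / (K_B * T))  # emitted TL intensity
        temps.append(T)
        intensity.append(rate)
        n = max(n - rate * dT / beta, 0.0)  # deplete the trap population
        T += dT
    return temps, intensity
```

Varying b between 1 and 2 interpolates between the narrow first-order peak and the broader, retrapping-dominated second-order shape, which is the behaviour the kinetic analysis above distinguishes.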

  14. Dielectric studies in a layered Ba-based Bi-2222 cuprate Bi2Ba2Nd1.6Ce0.4Cu2O10+δ

    International Nuclear Information System (INIS)

    The ceramic sample of a layered cuprate Bi2Ba2Nd1.6Ce0.4Cu2O10+δ, the so-called Ba-based Bi-2222 compound, was investigated by measuring the dielectric permittivity and the dissipation factor as a function of temperature (80-300 K) and frequency (20-10^6 Hz). The dielectric constant was measured to be as high as ~1000 at 1 kHz and 300 K, with a relatively low dissipation factor. However, it decreases systematically with decreasing temperature or with increasing frequency due to the dipolar relaxation process. This thermally activated relaxation process plays a dominant role in the low-frequency dielectric response. The associated relaxation time obeys Arrhenius' relation with an activation energy of 0.60 eV. We propose that this dipolar polarization may mainly originate from the hopping of charge carriers between localized sites over the potential barriers.
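
The Arrhenius relation quoted above (activation energy 0.60 eV) can be made concrete in a few lines; the prefactor tau0 below is a placeholder, since the abstract does not report it:

```python
import math

K_B = 8.617e-5  # Boltzmann constant, eV/K

def arrhenius_tau(T_kelvin, Ea_eV=0.60, tau0=1e-13):
    """Thermally activated relaxation time, tau = tau0 * exp(Ea / kT).
    Ea = 0.60 eV is the activation energy reported in the abstract;
    tau0 (the attempt-period prefactor) is an assumed placeholder."""
    return tau0 * math.exp(Ea_eV / (K_B * T_kelvin))
```

Between 300 K and 200 K the relaxation time grows by roughly five orders of magnitude, which is why the dipolar response freezes out on cooling and the permittivity drops with decreasing temperature.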

  15. [Novel conformational peptide antigen, which simulates an immunodominant epitope of the 2nd extracellular loop of β1-adrenoceptor. Computer simulation, synthesis, spatial structure].

    Science.gov (United States)

    Bibilashvili, R Sh; Sidorova, M V; Molokoedov, A S; Bespalova, Zh D; Bocharov, E B; Efremov, E E; Sharf, T V; Rogova, M M; Mironova, N A; Zykov, K A; Golitsyn, S P

    2013-01-01

    By means of computer simulation, a conformational polypeptide antigen structure was built that imitates the immunodominant epitope of the 2nd extracellular loop of the β1-adrenoceptor. A linear 25-membered peptide corresponding to the calculated sequence was synthesized by the solid-phase method using Fmoc technology; then, by directed closure of disulfide bridges, the original bicyclic polypeptide corresponding to the proposed structure of the conformational antigen was obtained. The 3D structure of the synthetic conformational antigen was investigated with the help of high-resolution NMR spectroscopy. It was shown that the structure of the bicyclic polypeptide is similar to that of the computer model. The bicyclic conformational antigen was suitable for the detection of autoantibodies in the blood serum of patients with rhythm and conduction disturbances without evidence of organic disease of the cardiovascular system. PMID:25696928

  16. Wound healing and soft tissue effects of CO2, contact Nd:YAG and combined CO2-Nd:YAG laser beams on rabbit trachea.

    Science.gov (United States)

    Laranne, J; Lagerstedt, A; Pukander, J; Rantala, I

    1997-11-01

    Rabbit trachea was used as an experimental model to study tissue effects and healing of full-thickness tracheal lesions produced by CO2, contact Nd:YAG and combined, coaxial CO2-Nd:YAG (Combo) laser beams. Two power settings (10 W and 16 W) were used with the CO2 and contact Nd:YAG lasers. Three different CO2/Nd:YAG power ratios (1:1, 1:2 and 1:4) and power settings (12 W, 15 W and 16 W) were used with the Combolaser. Histological specimens for light and transmission electron microscopy were prepared immediately and 1, 3, 5, 7, 14 and 21 days postoperatively. The wound with the most precise and fastest healing was produced by the contact Nd:YAG laser. The CO2 laser produced a moderate amount of charring and the largest amount of coagulated tissue, with a slightly prolonged healing period. In the acute phase, tissue defects produced by the Combolaser with power ratios 1:1 and 1:2 resembled the CO2 laser lesions but with slightly less charring. The power ratio 1:4 diminished the cutting properties of the beam considerably. During the healing period the Combolaser produced the most intensive inflammation and granulation tissue formation, resulting in delayed regeneration of the lesion. In transmission electron micrographs the most severe damage to chondrocytes was seen after using the Combolaser. These findings indicate that the Combolaser produces deeper tissue damage than the CO2 or contact Nd:YAG laser. However, the Combolaser appears to be suitable for tracheobronchial operations, owing to its good simultaneous cutting and haemostatic properties. PMID:9442836

  17. Verification-based Software-fault Detection

    OpenAIRE

    Gladisch, Christoph David

    2011-01-01

    Software is used in many safety- and security-critical systems. Software development is, however, an error-prone task. In this dissertation new techniques for the detection of software faults (or software "bugs") are described which are based on a formal deductive verification technology. The described techniques take advantage of information obtained during verification and combine verification technology with deductive fault detection and test generation in a very unified way.

  18. Software Geometry in Simulations

    Science.gov (United States)

    Alion, Tyler; Viren, Brett; Junk, Tom

    2015-04-01

    The Long Baseline Neutrino Experiment (LBNE) involves many detectors. The experiment's near detector (ND) facility may ultimately involve several detectors. The far detector (FD) will be significantly larger than any other Liquid Argon (LAr) detector yet constructed; many prototype detectors are being constructed and studied to motivate a plethora of proposed FD designs. Whether it be a constructed prototype or a proposed ND/FD design, every design must be simulated and analyzed. This presents a considerable challenge to LBNE software experts; each detector geometry must be described to the simulation software in an efficient way which allows multiple authors to collaborate easily. Furthermore, different geometry versions must be tracked throughout their use. We present a framework called General Geometry Description (GGD), written and developed by LBNE software collaborators for managing software to generate geometries. Though GGD is flexible enough to be used by any experiment working with detectors, we present its first use in generating Geometry Description Markup Language (GDML) files to interface with LArSoft, a framework of detector simulations, event reconstruction, and data analyses written for all LAr technology users at Fermilab. Brett Viren is the author of the framework discussed here, the General Geometry Description (GGD).

  19. Development of a Risk-Based Probabilistic Performance-Assessment Method for Long-Term Cover Systems - 2nd Edition

    International Nuclear Information System (INIS)

    A probabilistic, risk-based performance-assessment methodology has been developed to assist designers, regulators, and stakeholders in the selection, design, and monitoring of long-term covers for contaminated subsurface sites. This report describes the method, the software tools that were developed, and an example that illustrates the probabilistic performance-assessment method using a repository site in Monticello, Utah. At the Monticello site, a long-term cover system is being used to isolate long-lived uranium mill tailings from the biosphere. Computer models were developed to simulate relevant features, events, and processes that include water flux through the cover, source-term release, vadose-zone transport, saturated-zone transport, gas transport, and exposure pathways. The component models were then integrated into a total-system performance-assessment model, and uncertainty distributions of important input parameters were constructed and sampled in a stochastic Monte Carlo analysis. Multiple realizations were simulated using the integrated model to produce cumulative distribution functions of the performance metrics, which were used to assess cover performance for both present- and long-term future conditions. Performance metrics for this study included the water percolation reaching the uranium mill tailings, radon gas flux at the surface, groundwater concentrations, and dose. Results from uncertainty analyses, sensitivity analyses, and alternative design comparisons are presented for each of the performance metrics. The benefits from this methodology include a quantification of uncertainty, the identification of parameters most important to performance (to prioritize site characterization and monitoring activities), and the ability to compare alternative designs using probabilistic evaluations of performance (for cost savings).
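
The stochastic Monte Carlo workflow described here — sample uncertain inputs, run the integrated model, and build cumulative distribution functions of a performance metric — can be sketched in miniature. The toy percolation model and the input distributions below are purely illustrative placeholders, not those of the Monticello assessment:

```python
import bisect
import random

random.seed(42)  # reproducible sampling

def percolation_flux(precip_mm, infiltration_frac):
    """Toy cover-performance model: annual percolation (mm/yr) reaching
    the tailings; stands in for the report's integrated system model."""
    return precip_mm * infiltration_frac

# Propagate input uncertainty by Monte Carlo sampling (10,000 realizations).
samples = sorted(
    percolation_flux(random.uniform(300.0, 500.0), random.betavariate(2, 50))
    for _ in range(10_000)
)

def empirical_cdf(sorted_values, x):
    """P(metric <= x) estimated from the sorted realizations."""
    return bisect.bisect_right(sorted_values, x) / len(sorted_values)

# A regulator-style readout: the 95th-percentile percolation flux.
p95 = samples[int(0.95 * len(samples)) - 1]
```

Reading percentiles off the empirical CDF is how stacked realizations become a single defensible performance statement, and re-running the loop with an alternative design's distributions supports the probabilistic design comparisons the report describes.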

  20. Software engineering

    CERN Document Server

    Thorin, Marc

    2013-01-01

    Software Engineering describes the conceptual bases as well as the main methods and rules on computer programming. This book presents software engineering as a coherent and logically built synthesis and makes it possible to properly carry out an application of small or medium difficulty that can later be developed and adapted to more complex cases. This text is comprised of six chapters and begins by introducing the reader to the fundamental notions of entities, actions, and programming. The next two chapters elaborate on the concepts of information and consistency domains and show that a proc

  1. CROSS-SECTION GENERATION OF VARIOUS GEO-SCIENTIFIC FEATURES WITHOUT CONTOUR DIGITIZATION USING A VISUAL C++ BASED SOFTWARE APPLICATION ‘VIGAT 2005’

    OpenAIRE

    Dasgupta A. R.; Solanki Ajay M.; Rathod Brijesh G.; Srivastava Naveenchandra N.; Patel Vivek R.; Machhar Suresh P.

    2007-01-01

    Cross-section can be described as a two-dimensional dataset where the horizontal distances are represented on the x-axis and the depth on the y-axis. A cross-section is a window into the subsurface. This work presents the construction of cross sections with the help of 'Vigat 2005' - a Visual C++ based software application. Its main purpose is to provide cross-section views of geoscientific features and to interpret their variation within the area of study. In geologica...

  2. Software for Better Documentation of Other Software

    Science.gov (United States)

    Pinedo, John

    2003-01-01

    The Literate Programming Extraction Engine is a Practical Extraction and Reporting Language- (PERL-)based computer program that facilitates and simplifies the implementation of a concept of self-documented literate programming in a fashion tailored to the typical needs of scientists. The advantage for the programmer is that documentation and source code are written side-by-side in the same file, reducing the likelihood that the documentation will be inconsistent with the code and improving the verification that the code performs its intended functions. The advantage for the user is the knowledge that the documentation matches the software because they come from the same file. This program unifies the documentation process for a variety of programming languages, including C, C++, and several versions of FORTRAN. This program can process the documentation in any markup language, and incorporates the LaTeX typesetting software. The program includes sample Makefile scripts for automating both the code-compilation (when appropriate) and documentation-generation processes into a single command-line statement. Also included are macro instructions for the Emacs display-editor software, making it easy for a programmer to toggle between editing in a code or a documentation mode.
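
The side-by-side extraction idea can be sketched in a few lines: one pass over the source splits documentation lines from code lines, so both artifacts come from the same file. This is a Python illustration with an invented `#:` marker, not the actual syntax of the PERL engine described above:

```python
def extract(source_lines, doc_marker="#:"):
    """Separate a literate source file into documentation and code
    streams. Lines starting with doc_marker feed the documentation;
    everything else is code. The marker is an illustrative convention
    only; the real tool handles several languages and markup formats."""
    docs, code = [], []
    for line in source_lines:
        stripped = line.lstrip()
        if stripped.startswith(doc_marker):
            docs.append(stripped[len(doc_marker):].strip())
        else:
            code.append(line)
    return docs, code
```

Because both streams originate from one file, the documentation cannot silently drift away from the code it describes, which is the core guarantee the record emphasizes.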

  3. Software reengineering

    Science.gov (United States)

    Fridge, Ernest M., III

    1991-01-01

    Programs in use today generally have all of the function and information processing capabilities required to do their specified job. However, older programs usually use obsolete technology, are not integrated properly with other programs, and are difficult to maintain. Reengineering is becoming a prominent discipline as organizations try to move their systems to more modern and maintainable technologies. The Johnson Space Center (JSC) Software Technology Branch (STB) is researching and developing a system to support reengineering older FORTRAN programs into more maintainable forms that can also be more readily translated to a modern languages such as FORTRAN 8x, Ada, or C. This activity has led to the development of maintenance strategies for design recovery and reengineering. These strategies include a set of standards, methodologies, and the concepts for a software environment to support design recovery and reengineering. A brief description of the problem being addressed and the approach that is being taken by the STB toward providing an economic solution to the problem is provided. A statement of the maintenance problems, the benefits and drawbacks of three alternative solutions, and a brief history of the STB experience in software reengineering are followed by the STB new FORTRAN standards, methodology, and the concepts for a software environment.

  4. Free Software

    Science.gov (United States)

    McClain, John

    This collection, created by John McClain of Cornell University, of macros for Microsoft Excel addresses numerous topics like decision trees, the Central Limit Theorem, queueing, critical path analysis, regression with prediction intervals, and recognizing departures from normality. There are eighteen different macros to choose from. This software requires the user to possess a copy of Microsoft Excel for proper use.

  5. Software summaries

    International Nuclear Information System (INIS)

    A listing containing 37 brief summaries of computer-aided control system design software packages, as compiled by Professor Dean K. Frederick of Rensselaer Polytechnic Institute and Dr. Charles J Herget and Fan McFarland of Lawrence Livermore National Laboratory is presented. Updated versions of these summaries will be made available in the future

  6. Software Update.

    Science.gov (United States)

    Currents, 2000

    2000-01-01

    A chart of 40 alumni-development database systems provides information on vendor/Web site, address, contact/phone, software name, price range, minimum suggested workstation/suggested server, standard reports/reporting tools, minimum/maximum record capacity, and number of installed sites/client type. (DB)

  7. Report of the 2nd RCM on nanoscale radiation engineering of advanced materials for potential biomedical applications

    International Nuclear Information System (INIS)

    There are critical needs for advanced materials in the area of biomaterial engineering, primarily in generating biomaterials of enhanced specific functionalities, improved biocompatibility, and minimal natural rejection but with enhanced interfacial adhesion. These can be achieved by introduction of proper functionalities at the nanoscale dimensions for which, due to their characteristics, radiation techniques are uniquely suited. Accordingly, many of the IAEA Member States (MS) have interest in creating advanced materials for various health-care applications using a wide array of radiation sources and their broad expertise. In seeking new knowledge to advance the field and tackle this specific problem, to collaborate to enhance the quality of the scientific research and improve their efficiency and effectiveness, MS had requested the support of the IAEA for such collaboration. Based on these requests, and the conclusions and recommendations of the Consultant's meeting on Advanced Materials on the Nano-scale Synthesized by Radiation-Induced Processes, held on 10-14 December 2007, the present CRP was formulated and started in 2009. The first RCM was held in 30 March – 3 April 2009, in Vienna, where the work plan for both individual participants and collaborations were discussed and accepted, as reported in the Meeting Report published as IAEA Working Material (http://www-naweb.iaea.org/napc/iachem/working_materials.html). The second RCM was held on 15-19 November 2010, Paris, France, and was attended by 17 participants (chief scientific investigators or team members) and one cost-free observer from Brazil. The participants presented their research achievements since the first RCM, centred on the main expected outputs of this CRP: a. Methodologies to prepare and characterize nanogels; nanoparticles and nanoporous membranes, as well as to synthesize and modify nanoparticle surfaces by attaching organic ligands by radiation; b. 
Methodologies to radiation synthesize polymeric, inorganic and hybrid nanocarriers, providing a controlled loading and improved releasing rate of drugs; and c. Demonstration of novel functional surfaces for cell-sheet engineering fabricated by utilizing advanced radiation technology, towards improved cell-matrix interactions and cell function control. This meeting report presents in its first part the summaries of the achievements, the conclusions reached and recommendations given, the various collaborations realized among the participants, as well as the list of scientific publications. The second part of the report consists of the full reports of the participants' work during the past year.

  8. Anticipating and blocking HIV-1 escape by second generation antiviral shRNAs

    Directory of Open Access Journals (Sweden)

    Berkhout Ben

    2010-06-01

    Full Text Available Abstract Background RNA interference (RNAi) is an evolutionary conserved gene silencing mechanism that mediates the sequence-specific breakdown of target mRNAs. RNAi can be used to inhibit HIV-1 replication by targeting the viral RNA genome. However, the error-prone replication machinery of HIV-1 can generate RNAi-resistant variants with specific mutations in the target sequence. For durable inhibition of HIV-1 replication the emergence of such escape viruses must be controlled. Here we present a strategy that anticipates HIV-1 escape by designing 2nd generation short hairpin RNAs (shRNAs) that form a complete match with the viral escape sequences. Results To block the two favorite viral escape routes observed when the HIV-1 integrase gene sequence is targeted, the original shRNA inhibitor was combined with two 2nd generation shRNAs in a single lentiviral expression vector. We demonstrate in long-term viral challenge experiments that the two dominant viral escape routes were effectively blocked. Eventually, virus breakthrough did however occur, but HIV-1 evolution was skewed and forced to use new escape routes. Conclusion These results demonstrate the power of the 2nd generation RNAi concept. Popular viral escape routes are blocked by the 2nd generation RNAi strategy. As a consequence, viral evolution was skewed, leading to new escape routes. These results are of importance for a deeper understanding of HIV-1 evolution under RNAi pressure.
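
The design step behind a 2nd generation shRNA — matching the inhibitor to an observed escape sequence instead of the wild-type target — amounts to taking the reverse complement of the escaped target RNA. A minimal sketch with invented 19-nt sequences (not the actual HIV-1 integrase targets used in the study):

```python
# Map each RNA base to its Watson-Crick complement.
COMP = str.maketrans("ACGU", "UGCA")

def guide_for(target_rna):
    """Antisense guide strand fully complementary to a target RNA
    (its reverse complement). A 2nd generation shRNA in the strategy
    above is simply one designed against the observed escape sequence
    rather than the wild-type target."""
    return target_rna.translate(COMP)[::-1]

# Illustrative 19-nt targets; the single A->G change mimics an escape
# mutation that would break pairing with the original guide.
wild_type = "GAGUACUGCAUAACUGGCU"
escape    = "GAGUACUGCGUAACUGGCU"
guides = [guide_for(t) for t in (wild_type, escape)]
```

Expressing both guides from one vector, as the study does, means either sequence variant of the target is met by a fully complementary inhibitor, closing the anticipated escape route.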

  9. Instrumentation-software system of controllable well logging neutron generator AINK-36-3Ts and its application in petroleum geology

    International Nuclear Information System (INIS)

    Science and Technical Department of JSC Tatneftegeofizika has developed in 1999 an instrumentation-software system of pulsed neutron logging, AINK-36-3Ts, which offers advanced capabilities in logging oil and gas wells. This system provides as follows: (i) multispaced (3 spacings) pulsed neutron gamma log; (ii) multispaced neutron activation oxygen log; (iii) natural gamma-ray log. Specific features of this system are as follows: use of symmetrical spacings (60 cm); use of PIC-processors in the downhole unit; and use of realistic adaptive models of a PNGL response. Basic instrumentation specifications of the system are as follows: yield of 14-MeV neutrons: 6×10^7 n/s; modulation frequency of neutron bursts: 20 Hz; guaranteed serviceability period of the neutron tube: 100 h; outer diameter of the logging tool: 36 mm; total length of the logging tool: 3.1 m; number of spacings: 3 (30, 60, -60 cm); telemetry type: Manchester II. Preprocessing of the tool response is made in the downhole unit; next, the digital information is transmitted to the surface and processed on a surface-based workstation. High stability of the neutron yield and the two-exponent model applied in data processing allow one to produce parameter ?af with an accuracy of up to 5% and the fluid production rate in the well (from the neutron-activation oxygen log) with an accuracy of up to 5%. This instrumentation-software system can be applied on oil-and-gas fields to solve the following problems: lithological analysis of productive formations and estimating their flow properties; controlling the position of WOC and GFC in pay zones under production; determining the residual oil-saturation in completely watered, originally oil-prone formations; identifying the hydrodynamic communication between perforated productive formations and above- and below-lying water-bearing beds.
The use of symmetrical spacings provides a separate estimation of upward and downward water (or watered product) flow rates. The AINK-36-3Ts instrumentation-software system is presently subject to testing on the oil fields of Tatarstan. (author)

  10. ADESSO : Scientific software development environment

    OpenAIRE

    Machado, Rubens C.; Saude, Andre V.; Lotufo, Roberto

    2003-01-01

    This paper presents the Adesso, a computational environment for the development of scientific software. The Adesso environment leverages the reusable software component programming model to support the development and integration of components to several scientific programming platforms. The Adesso system is based on an XML component database and a set of XML document transformation tools for the automatic generation of component code, documentation and packaging. An authoring tool, built wit...

  11. Simulating A Factory Via Software

    Science.gov (United States)

    Schroer, Bernard J.; Zhang, Shou X.; Tseng, Fan T.

    1990-01-01

    Software system generates simulation program from user's responses to questions. AMPS/PC system is simulation software tool designed to aid user in defining specifications of manufacturing environment and then automatically writing code for target simulation language, GPSS/PC. Domain of problems AMPS/PC simulates is that of manufacturing assembly lines with subassembly lines and manufacturing cells. Written in Turbo Pascal Version 4.
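
The generation step AMPS/PC performs — turning questionnaire answers into GPSS/PC code — can be caricatured as template filling over a declarative spec. The spec keys and the single-station template below are illustrative, not AMPS/PC's actual output format:

```python
def generate_gpss(spec):
    """Emit a minimal single-station GPSS-style model from a spec dict,
    mimicking how AMPS/PC writes target-language code from a user's
    answers. Block names follow GPSS conventions (GENERATE, SEIZE,
    ADVANCE, ...), but the template itself is a sketch."""
    return "\n".join([
        f"GENERATE {spec['interarrival']}",  # parts enter the line
        "QUEUE Line",                        # wait for the station
        f"SEIZE {spec['station']}",          # capture the machine
        "DEPART Line",                       # leave the waiting line
        f"ADVANCE {spec['service']}",        # processing time
        f"RELEASE {spec['station']}",        # free the machine
        "TERMINATE 1",                       # part leaves the model
    ])
```

A subassembly line or manufacturing cell, in this picture, is just a larger spec expanded through more templates; the user answers questions, and the generator writes the simulation.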

  12. SOFTWARE TOOL FOR LEARNING THE GENERATION OF THE CARDIOID CURVE IN AN AUTOCAD ENVIRONMENT / HERRAMIENTA SOFTWARE PARA EL APRENDIZAJE DE LA GENERACIÓN DE LA CARDIODE EN UN ENTORNO AUTOCAD

    Scientific Electronic Library Online (English)

    MIGUEL ÁNGEL, GÓMEZ-ELVIRA-GONZÁLEZ; JOSÉ IGNACIO, ROJAS-SOLA; MARÍA DEL PILAR, CARRANZA-CAÑADAS.

    2012-02-01

    Full Text Available Este artículo presenta una novedosa aplicación desarrollada en Visual LISP para el entorno AutoCAD, que presenta de forma rápida e intuitiva la generación de la cardiode de cinco formas diferentes, siendo dicha curva cíclica, la que presenta una amplia gama de aplicaciones artísticas y técnicas, entre ellas, el perfil de algunas levas. Abstract in english This article presents a novel application which has been developed in Visual LISP for an AutoCAD environment, and which shows the generation of the cardioid curve intuitively and quickly in five different ways (using the conchoid of a circumference, pedal curve of a circumference, inverse of a parabola, orthoptic curve of a circumference, and epicycloid of a circumference). This cyclic curve has a large number of artistic and technical applications, among them the profile of some cams.
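
One of the five constructions named in the abstract, the cardioid as the epicycloid of a circumference, reduces to a short parametric loop. The sketch is in Python rather than Visual LISP, with an arbitrary generating radius r:

```python
import math

def cardioid_points(r=1.0, n=360):
    """Cardioid traced as the epicycloid of a circle of radius r rolling
    on an equal fixed circle: x = 2r cos t - r cos 2t,
    y = 2r sin t - r sin 2t, for t in [0, 2*pi]."""
    pts = []
    for k in range(n + 1):
        t = 2.0 * math.pi * k / n
        x = 2.0 * r * math.cos(t) - r * math.cos(2.0 * t)
        y = 2.0 * r * math.sin(t) - r * math.sin(2.0 * t)
        pts.append((x, y))
    return pts
```

About the cusp at (r, 0) these points satisfy the polar form ρ = 2r(1 − cos t), which makes the construction easy to verify and explains the cam-profile applications: the radius grows smoothly from 0 at the cusp to 4r on the far side.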

  13. 8th World Congress of Music Therapy, 2nd International Congress of the World Federation of Music Therapy, Hamburg, Germany, July 14-20, 1996 - Interview with Prof. Dr. Hans-Helmut Decker-Voigt

    OpenAIRE

    Barbara L Wheeler

    2011-01-01

    An interview with Prof. Dr. Hans-Helmut Decker-Voigt about his efforts in organising the 8th World Congress of Music Therapy, 2nd International Congress of the World Federation of Music Therapy, held in Hamburg, Germany, July 14-20, 1996.

  14. Avaliação do Acesso em Saúde na 2ª Microrregião de Saúde, CE Evaluation of the Access to Health in the 2nd Microregion of Health-CE

    Directory of Open Access Journals (Sweden)

    Maria Verônica Sales da Silva

    2012-05-01

Full Text Available The objective of this research is to evaluate the indicators of the Regulation Center of the 2nd Health Microregion. METHODOLOGY: a documental, descriptive, evaluative study carried out in the 2nd Health Microregion of Ceará (CE), developed in the appointment-scheduling centers (CMC) of the municipalities and involving 16 professionals linked to microregional and regional regulation. Data were collected from documental sources between February and August 2007. The project was submitted to the Ethics Committee of the Federal University of Ceará (UFC). RESULTS: applying the coverage parameter showed that the municipalities had low coverage of specialized consultations for the microregional population. Only two municipalities have more than 90% coverage; in the others the parameter falls short of the estimate, with two municipalities at 28% coverage and only one at 13%, the lowest percentage. Supervision of the Agreed and Integrated Programming (PPI) in the municipalities is still an incipient activity within the scope of control actions. The CMC of the municipalities of the 2nd microregion have strangled supply-and-demand indicators, and the municipal managers do not program enough capacity to meet the population's demand. The regulation structured in this microregion does not fulfill its role of optimizing the use of referral services in supra-municipal spaces according to the health needs of the population.
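The coverage parameter behind these results is simple arithmetic: scheduled specialist consultations divided by the estimated need for them. A minimal sketch, with invented district names and figures (the abstract reports only the resulting percentages):

```python
# Hypothetical sketch of the coverage parameter used in the study: the share of
# the estimated need for specialist consultations that a district actually
# schedules. All district names and counts below are invented for illustration.

def coverage_pct(scheduled: int, estimated_need: int) -> float:
    """Coverage (%) = scheduled consultations / estimated need * 100."""
    return 100.0 * scheduled / estimated_need

districts = {
    "District A": (4_600, 5_000),   # above the 90% mark
    "District B": (1_400, 5_000),   # 28% coverage
    "District C": (650, 5_000),     # 13% coverage, the lowest reported
}

for name, (scheduled, need) in districts.items():
    pct = coverage_pct(scheduled, need)
    flag = "meets 90% mark" if pct >= 90 else "below estimate"
    print(f"{name}: {pct:.0f}% ({flag})")
```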

  15. Avaliação do Acesso em Saúde na 2ª Microrregião de Saúde, CE / Evaluation of the Access to Health in the 2nd Microregion of Health-CE

    Scientific Electronic Library Online (English)

    Maria Verônica Sales da, Silva; Maria Josefina da, Silva; Lucilane Maria Sales da, Silva; Adail Afrânio Marcelino do, Nascimento; Ana Kelve Castro, Damasceno.

    2012-05-01

Full Text Available The objective of this research is to evaluate the indicators of the Regulation Center of the 2nd Health Microregion. METHODOLOGY: a documental, descriptive, evaluative study carried out in the 2nd Health Microregion of Ceará (CE), developed in the appointment-scheduling centers (CMC) of the municipalities and involving 16 professionals linked to microregional and regional regulation. Data were collected from documental sources between February and August 2007. The project was submitted to the Ethics Committee of the Federal University of Ceará (UFC). RESULTS: applying the coverage parameter showed that the municipalities had low coverage of specialized consultations for the microregional population. Only two municipalities have more than 90% coverage; in the others the parameter falls short of the estimate, with two municipalities at 28% coverage and only one at 13%, the lowest percentage. Supervision of the Agreed and Integrated Programming (PPI) in the municipalities is still an incipient activity within the scope of control actions. The CMC of the municipalities of the 2nd microregion have strangled supply-and-demand indicators, and the municipal managers do not program enough capacity to meet the population's demand. The regulation structured in this microregion does not fulfill its role of optimizing the use of referral services in supra-municipal spaces according to the health needs of the population.

  16. Desempenho ortográfico de escolares do 2º ao 5º ano do ensino particular Spelling performance of students of 2nd to 5th grade from private teaching

    Directory of Open Access Journals (Sweden)

    Simone Aparecida Capellini

    2012-04-01

Full Text Available PURPOSE: to characterize, compare and classify the performance of students from the 2nd to 5th grades of private schooling according to the semiology of errors. METHOD: 115 students from the 2nd to 5th grades were evaluated: 27 from the 2nd grade, 30 each from the 3rd and 4th grades, and 28 from the 5th grade, divided into four groups (GI, GII, GIII and GIV, respectively). The tests of the Spelling Evaluation Protocol (Pró-Ortografia) were divided into a collective version (writing the letters of the alphabet, randomized dictation of the letters of the alphabet, word dictation, non-word dictation, dictation with pictures, picture-induced thematic writing) and an individual version (dictation of sentences, purposeful error, spelled dictation, orthographic lexical memory). RESULTS: there was a statistically significant difference in the inter-group comparison: the average number of correct answers increased in all tests of both the collective and the individual version, and with increasing grade level the groups decreased their average number of writing errors classified by the semiology of errors. Natural spelling errors were the most frequent. CONCLUSION: the data from this study showed that the increase in average accuracy with grade level may indicate normal development of children's writing in this population. The higher frequency of natural spelling errors suggests that formal instruction on phoneme-grapheme correspondence may not be occurring, since such errors depend directly on learning the rule of direct phoneme-grapheme correspondence.
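The inter-group comparison reported above reduces to comparing mean error counts across grade-level groups. A hedged sketch with invented error counts (the abstract gives no raw data), mirroring the expected monotone decrease with grade:

```python
# Hypothetical sketch of the inter-group comparison: mean writing errors per
# group (GI..GIV = 2nd..5th grade). All error counts below are invented purely
# to illustrate the reported trend, not taken from the study.
from statistics import mean

errors_by_group = {
    "GI (2nd grade)":   [14, 12, 15, 11, 13],
    "GII (3rd grade)":  [10, 9, 11, 8, 10],
    "GIII (4th grade)": [7, 6, 8, 5, 7],
    "GIV (5th grade)":  [4, 3, 5, 2, 4],
}

means = {group: mean(errs) for group, errs in errors_by_group.items()}
for group, m in means.items():
    print(f"{group}: mean errors = {m:.1f}")

# The developmental trend reported: each successive grade averages fewer errors.
ordered = list(means.values())
assert all(a > b for a, b in zip(ordered, ordered[1:]))
```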

  17. Progress report of the Research Group. 1st part: Tore Supra. 2nd part: Fontenay-aux-Roses

    International Nuclear Information System (INIS)

Three major events dominated the activities of the EURATOM/CEA association during 1980: the decision to launch the realization of the TORE SUPRA project, the progressive recognition of high-frequency heating as a solution for the future, and the increasing support given to the development of heating methods and diagnostics in the JET project. It is estimated that project studies are sufficiently advanced, and that industrial fabrication problems have been sufficiently covered, for the realization of Tore Supra to begin in 1981. One of the successes of the work carried out is the complete validation of the superfluid-helium cooling system. The satisfactory development of high-frequency heating, and the increasing credibility of this form of heating for future work, are very important factors. In this context, the decision of the JET team to envisage a large amount of ion-cyclotron heating is particularly important. The results obtained in 1980 are in fact very encouraging: the maximum power of the 500 kW T.F.R. generator was coupled to the plasma, and it was possible to establish an energy Q-value. Even though the injection of neutral particles can now be considered a proved heating method, studies of the accompanying physical phenomena remain important; the T.F.R. experiments carried out in this field in 1980 were very useful. The importance of the realization and development activities conducted during 1980 should not mask the enormous effort made, both experimentally and theoretically, to understand key physical phenomena in the plasma. The main preoccupation concerned small and large disruptions and all aspects of the associated instabilities. A detailed analysis of the experimental results using numerical models has improved empirical knowledge of the elementary transport phenomena taking place.
Increasingly detailed studies of microinstabilities were also fruitful and have even led to a complete reversal of some of the ideas held about universal instabilities

  18. System software

    International Nuclear Information System (INIS)

The general system software of the BESM-6, consisting of the ''Dubna'' operating system (OS) and the monitoring programming system (MPS), is described. The ''Dubna'' OS provides for the simultaneous execution, in a multiprogramming regime, of up to 16 programs, with automatic dynamic redistribution of main memory and machine time between them, and includes well-developed buffering of input/output data. The MPS comprises a monitor that interprets control punch cards and ensures their execution. The MPS includes translators from the algorithmic languages FORTRAN and ALGOL, an Assembler, and a translator from the Madlen autocode. A substantial part of the system is a loader that performs assembly from libraries of standard routines and distributes main memory. On the CDC-6500, the NOS/BE operating system is used. Analogous program variants have been developed for the M-6000, TRA, Electronika-100 and EC-1060 computers

  19. Calculation Software

    Science.gov (United States)

    1994-01-01

    MathSoft Plus 5.0 is a calculation software package for electrical engineers and computer scientists who need advanced math functionality. It incorporates SmartMath, an expert system that determines a strategy for solving difficult mathematical problems. SmartMath was the result of the integration into Mathcad of CLIPS, a NASA-developed shell for creating expert systems. By using CLIPS, MathSoft, Inc. was able to save the time and money involved in writing the original program.

  20. Software preservation

    Directory of Open Access Journals (Sweden)

    Tadej Vodopivec

    2011-01-01

Full Text Available Comtrade Ltd. covers a wide range of activities related to information and communication technologies; its deliverables include web applications, locally installed programs, system software, drivers, and embedded software (used e.g. in medical devices, auto parts, and communication switchboards). The company has also acquired extensive knowledge and practical experience of digital long-term preservation technologies. This wide spectrum of activities puts us in a position to discuss an often overlooked aspect of digital preservation: the preservation of software programs. There are many resources dedicated to the digital preservation of digital data, documents and multimedia records, but far fewer about how to preserve the functionalities and features of computer programs. Exactly these functionalities - the dynamic response to inputs - make computer programs rich compared to documents or linear multimedia. The article opens the questions that stand at the beginning of the road to permanent digital preservation. The purpose is to set off in the right direction, in which all relevant aspects will be covered in proper balance. The following questions are asked: why preserve computer programs permanently at all, who should do this and for whom, when should we think about permanent program preservation, what should be preserved (such as source code, screenshots, documentation, and the social context of the program - e.g. media response to it ...), and where and how? To illustrate the theoretical concepts, the idea of a virtual national museum of electronic banking is also presented.

  1. Excellence and evidence in staffing: a data-driven model for excellence in staffing (2nd edition).

    Science.gov (United States)

    Baggett, Margarita; Batcheller, Joyce; Blouin, Ann Scott; Behrens, Elizabeth; Bradley, Carol; Brown, Mary J; Brown, Diane Storer; Bolton, Linda Burnes; Borromeo, Annabelle R; Burtson, Paige; Caramanica, Laura; Caspers, Barbara A; Chow, Marilyn; Christopher, Mary Ann; Clarke, Sean P; Delucas, Christine; Dent, Robert L; Disser, Tony; Eliopoulos, Charlotte; Everett, Linda Q; Garcia, Amy; Glassman, Kimberly; Goodwin, Susan; Haagenson, Deb; Harper, Ellen; Harris, Kathy; Hoying, Cheryl L; Hughes-Rease, Marsha; Kelly, Lesly; Kiger, Anna J; Kobs-Abbott, Ann; Krueger, Janelle; Larson, Jackie; March, Connie; Martin, Deborah Maust; Mazyck, Donna; Meenan, Penny; McGaffigan, Patricia; Myers, Karen K; Nell, Kate; Newcomer, Britta; Cathy, Rick; O'Rourke, Maria; Rosa, Billy; Rose, Robert; Rudisill, Pamela; Sanford, Kathy; Simpson, Roy L; Snowden, Tami; Strickland, Bob; Strohecker, Sharon; Weems, Roger B; Welton, John; Weston, Marla; Valentine, Nancy M; Vento, Laura; Yendro, Susan

    2014-01-01

    The Patient Protection and Affordable Care Act (PPACA, 2010) and the Institute of Medicine's (IOM, 2011) Future of Nursing report have prompted changes in the U.S. health care system. This has also stimulated a new direction of thinking for the profession of nursing. New payment and priority structures, where value is placed ahead of volume in care, will start to define our health system in new and unknown ways for years. One thing we all know for sure: we cannot afford the same inefficient models and systems of care of yesterday any longer. The Data-Driven Model for Excellence in Staffing was created as the organizing framework to lead the development of best practices for nurse staffing across the continuum through research and innovation. Regardless of the setting, nurses must integrate multiple concepts with the value of professional nursing to create new care and staffing models. Traditional models demonstrate that nurses are a commodity. If the profession is to make any significant changes in nurse staffing, it is through the articulation of the value of our professional practice within the overall health care environment. This position paper is organized around the concepts from the Data-Driven Model for Excellence in Staffing. The main concepts are: Core Concept 1: Users and Patients of Health Care, Core Concept 2: Providers of Health Care, Core Concept 3: Environment of Care, Core Concept 4: Delivery of Care, Core Concept 5: Quality, Safety, and Outcomes of Care. This position paper provides a comprehensive view of those concepts and components, why those concepts and components are important in this new era of nurse staffing, and a 3-year challenge that will push the nursing profession forward in all settings across the care continuum. There are decades of research supporting various changes to nurse staffing. Yet little has been done to move that research into practice and operations. 
While the primary goal of this position paper is to generate research and innovative thinking about nurse staffing across all health care settings, a second goal is to stimulate additional publications. This includes a goal of at least 20 articles in Nursing Economic$ on best practices in staffing and care models from across the continuum over the next 3 years. PMID:25144948

  2. Report of the 2nd RCM on nanoscale radiation engineering of advanced materials for potential biomedical applications

    International Nuclear Information System (INIS)

There are critical needs for advanced materials in the area of biomaterial engineering, primarily in generating biomaterials of enhanced specific functionalities, improved biocompatibility, and minimal natural rejection but with enhanced interfacial adhesion. These can be achieved by introducing proper functionalities at nanoscale dimensions, for which, due to their characteristics, radiation techniques are uniquely suited. Accordingly, many of the IAEA Member States (MS) have an interest in creating advanced materials for various health-care applications using a wide array of radiation sources and their broad expertise. In seeking new knowledge to advance the field and tackle this specific problem, and in order to collaborate to enhance the quality, efficiency and effectiveness of the scientific research, MS requested the support of the IAEA for such collaboration. Based on these requests, and on the conclusions and recommendations of the Consultants' Meeting on Advanced Materials on the Nano-scale Synthesized by Radiation-Induced Processes, held on 10-14 December 2007, the present CRP was formulated and started in 2009. The first RCM was held on 30 March - 3 April 2009 in Vienna, where the work plans for both individual participants and collaborations were discussed and accepted, as reported in the Meeting Report published as IAEA Working Material (http://www-naweb.iaea.org/napc/iachem/working_materials.html). The second RCM was held on 15-19 November 2010 in Paris, France, and was attended by 17 participants (chief scientific investigators or team members) and one cost-free observer from Brazil. The participants presented their research achievements since the first RCM, centred on the main expected outputs of this CRP: a.
Methodologies to prepare and characterize nanogels, nanoparticles and nanoporous membranes, as well as to synthesize and modify nanoparticle surfaces by attaching organic ligands by radiation; b. Methodologies for the radiation synthesis of polymeric, inorganic and hybrid nanocarriers, providing controlled loading and an improved release rate of drugs; and c. Demonstration of novel functional surfaces for cell-sheet engineering fabricated using advanced radiation technology, towards improved cell-matrix interactions and cell-function control. This meeting report presents in its first part the summaries of the achievements, the conclusions reached and recommendations given, the various collaborations realized among the participants, as well as the list of scientific publications. The second part of the report consists of the full reports of the participants' work during the past year

  3. Honeywell Modular Automation System Computer Software Documentation

    International Nuclear Information System (INIS)

The purpose of this Computer Software Document (CSWD) is to provide configuration control of the Honeywell Modular Automation System (MAS) in use at the Plutonium Finishing Plant (PFP). This CSWD describes hardware and PFP-developed software for control of stabilization furnaces. The Honeywell software can generate configuration reports for the developed control software; these reports are described in the following section and are attached as addenda. This plan applies to the PFP Engineering Manager, Thermal Stabilization Cognizant Engineers, and the Shift Technical Advisors responsible for the Honeywell MAS software/hardware and administration of the Honeywell System

  4. Squeezed-light generation in a nonlinear planar waveguide with a periodic corrugation.

    Czech Academy of Sciences Publication Activity Database

    Peřina ml., Jan; Haderka, Ondřej; Sibilia, C.; Bertolotti, M.; Scalora, M.

    2007-01-01

    Roč. 76, č. 3 (2007), 033813/1-033813/14. ISSN 1050-2947 Grant ostatní: GA ČR(CZ) GA202/05/0498 Institutional research plan: CEZ:AV0Z10100522 Keywords : matched 2nd-harmonic generation * photonic-bandgap structures * lithium-niobate * oscillations * enhancement * states Subject RIV: BH - Optics, Masers, Lasers Impact factor: 2.893, year: 2007

  5. SOFTWARE METRICS VALIDATION METHODOLOGIES IN SOFTWARE ENGINEERING

    Directory of Open Access Journals (Sweden)

    K.P. Srinivasan

    2014-12-01

    Full Text Available In the software measurement validations, assessing the validation of software metrics in software engineering is a very difficult task due to lack of theoretical methodology and empirical methodology [41, 44, 45]. During recent years, there have been a number of researchers addressing the issue of validating software metrics. At present, software metrics are validated theoretically using properties of measures. Further, software measurement plays an important role in understanding and controlling software development practices and products. The major requirement in software measurement is that the measures must represent accurately those attributes they purport to quantify and validation is critical to the success of software measurement. Normally, validation is a collection of analysis and testing activities across the full life cycle and complements the efforts of other quality engineering functions and validation is a critical task in any engineering project. Further, validation objective is to discover defects in a system and assess whether or not the system is useful and usable in operational situation. In the case of software engineering, validation is one of the software engineering disciplines that help build quality into software. The major objective of software validation process is to determine that the software performs its intended functions correctly and provides information about its quality and reliability. This paper discusses the validation methodology, techniques and different properties of measures that are used for software metrics validation. In most cases, theoretical and empirical validations are conducted for software metrics validations in software engineering [1-50].
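As a toy illustration of the theoretical side the abstract describes - validating a metric against properties of measures before any empirical study - the sketch below checks a line-count size metric against three sample axioms. The metric and the axioms are illustrative assumptions, not the paper's own formalism:

```python
# Minimal sketch of theoretical metric validation: check that a candidate
# measure satisfies axioms expected of the attribute it claims to quantify.
# The LOC-style metric and the three properties below are invented for
# illustration only.

def loc(program: str) -> int:
    """A toy size metric: count of non-blank lines."""
    return sum(1 for line in program.splitlines() if line.strip())

def check_nonnegativity(m, programs):
    # A size can never be negative.
    return all(m(p) >= 0 for p in programs)

def check_null_value(m):
    # The empty program has zero size.
    return m("") == 0

def check_monotonicity(m, p1, p2):
    # Concatenating code must never shrink a size measure.
    return m(p1 + "\n" + p2) >= max(m(p1), m(p2))

samples = ["x = 1", "x = 1\ny = 2", ""]
assert check_nonnegativity(loc, samples)
assert check_null_value(loc)
assert check_monotonicity(loc, samples[0], samples[1])
print("toy size metric passes the three sample axioms")
```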

  6. Participation in EU SARNET2 Project for an Enhancement of Severe Accident Evaluation Capability in Domestic NPPs: 2nd Year R and D Activities

    International Nuclear Information System (INIS)

The following results were obtained from the 2nd-year SARNET2 research activities. · WP4-2: Preliminary analysis of the RPV lower-head corium behavior and heat transfer to the RPV wall for the reference plant (APR1400) using ASTEC 2.0. · WP6-4: Preliminary analysis of the effect of metal components on the cavity concrete erosion process and cooling of the relevant corium with the OECD CCI-4 experimental data using the CORQUENCH3.3 code, and a MAAP4-based MCCI uncertainty analysis for the reference plant (OPR1000). · WP7-1: Analysis of the solidification process of molten debris for different material compositions, and analysis of the physico-chemical characteristics of the material components as a preliminary step toward applying them to the reactor case. · WP7-2: Benchmark blind/open calculations for code validation against the IRSN ENACCEF experiments and incorporation of the results into the SARNET2 WP7-2 reports. · WP8-3: Analysis of FP behavior for the reference plant (OPR1000) using the MIDAS code and preparation of a TMI-2-based common input deck for ST analysis in containment. The foregoing research results, and the experimental database for the main SA issues obtained by this research, are expected to be used for resolving SA issues remaining in domestic NPPs (operating, to be constructed, and future) and for enhancing Level-2 PSA evaluation capability

  7. Structural and magnetic study of order-disorder behavior in the double perovskites Ba2Nd1-xMnxMoO6.

    Science.gov (United States)

    Coomer, Fiona C; Cussen, Edmund J

    2014-01-21

The synthesis and the structural and magnetic characterization of the site-ordered double perovskites Ba2Nd1-xMnxMoO6 are reported. For x > 0.3, no deviation from the ideal cubic Fm-3m symmetry is observed. Furthermore, dc-susceptibility measurements confirm that Mn(2+) is doped onto the Nd(3+) site, with the associated oxidation of Mo(5+) to Mo(6+). For all compositions, the Curie-Weiss paramagnetic behavior above 150 K gives negative Weiss constants ranging from -24(2) to -85(2) K. This net antiferromagnetic interaction is weakest when x ≈ 0.5, where the disorder in cation site occupancy and the competition with ferromagnetic interactions are greatest. Despite these strong antiferromagnetic interactions, there is no evidence in the dc-susceptibility of a bulk cancellation of spins for x > 0.05. Low-temperature neutron diffraction measurements indicate that there is no long-range magnetic order for 0.1 ≤ x < 0.9, whereas Ba2Nd0.10Mn0.90MoO6 exhibits additional Bragg scattering at 2 K, indicative of long-range antiferromagnetic ordering of the Mn(2+) cations with a propagation vector k = (1/2, 1/2, 1/2). The scattering intensities can be modeled using a noncollinear magnetic structure with the Mn(2+) moments oriented antiferromagnetically along the four different <111> directions. PMID:24392887
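A hedged sketch of how Weiss constants like those reported above are typically extracted: in the paramagnetic regime the susceptibility follows the Curie-Weiss law chi(T) = C / (T - theta), so 1/chi is linear in T and a straight-line fit recovers theta as the intercept on the temperature axis. The theta = -50 K and C = 4.0 values below are invented for illustration, not the paper's data:

```python
import numpy as np

# Synthetic Curie-Weiss data: chi(T) = C / (T - theta) above 150 K, with
# assumed theta = -50 K (net antiferromagnetic) and C = 4.0 emu K / mol.
theta_true, C_true = -50.0, 4.0
T = np.linspace(150.0, 300.0, 31)              # paramagnetic regime only
chi = C_true / (T - theta_true)

# 1/chi = (T - theta) / C, so a degree-1 fit gives slope = 1/C and
# intercept = -theta/C.
slope, intercept = np.polyfit(T, 1.0 / chi, 1)
C_fit = 1.0 / slope
theta_fit = -intercept * C_fit
print(f"C = {C_fit:.2f}, theta = {theta_fit:.1f} K (negative => net AFM)")
```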

  8. Thermoluminescent and structural properties of BaAl2O4:Eu2+,Nd3+,Gd3+ phosphors prepared by combustion method

    International Nuclear Information System (INIS)

BaAl2O4:Eu2+,Nd3+,Gd3+ phosphors were prepared by a combustion method at different initiating temperatures (400–1200 °C), using urea as a comburent. The powders were annealed at different temperatures in the range of 400–1100 °C for 3 h. X-ray diffraction data show that the crystallinity of the BaAl2O4 structure greatly improved with increasing annealing temperature. Blue-green photoluminescence, with a persistent (long) afterglow, was observed at 498 nm. This emission was attributed to the 4f65d1-4f7 transitions of Eu2+ ions. The phosphorescence decay curves were obtained by irradiating the samples with 365 nm UV light. The glow curves of the as-prepared and the annealed samples were investigated in this study. The thermoluminescent (TL) glow peaks of the samples prepared at 600 °C and 1200 °C were both stable at ~72 °C, suggesting that the traps responsible for these bands are similar and that their position is fixed irrespective of annealing temperature. The rate of decay of the sample annealed at 600 °C was faster than that of the sample prepared at 1200 °C.
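The decay-rate comparison above can be illustrated with a simple exponential model; persistent luminescence is commonly summarized as I(t) = I0·exp(-t/tau), often as a sum of such terms. The lifetimes below are invented to mirror the qualitative finding that the 600 °C sample decays faster; they are not the paper's fitted values:

```python
import numpy as np

# Illustrative comparison of two afterglow decays with a single-exponential
# model I(t) = exp(-t / tau). Lifetimes are assumptions, not measured values.
t = np.linspace(0.0, 60.0, 601)        # seconds after the UV source is removed
tau_600, tau_1200 = 8.0, 15.0          # assumed decay lifetimes (s)
I_600 = np.exp(-t / tau_600)
I_1200 = np.exp(-t / tau_1200)

# A shorter lifetime means the intensity is lower at every later instant.
assert I_600[-1] < I_1200[-1]

# Time for each curve to fall to half its initial intensity.
half_600 = t[np.argmax(I_600 <= 0.5)]
half_1200 = t[np.argmax(I_1200 <= 0.5)]
print(f"time to half intensity: {half_600:.1f} s (600 C) vs {half_1200:.1f} s (1200 C)")
```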

  9. Ba2NdZrO5.5 as a potential substrate material for YBa2Cu3O7-δ superconducting films

    International Nuclear Information System (INIS)

    The new oxide Ba2NdZrO5.5 (BNZO) has been produced by the standard solid state reaction method. X-ray diffraction analysis (XRD) revealed that this synthesized material has an ordered complex cubic perovskite structure characteristic of the A2BB'O6 crystalline structure, with a lattice parameter of a = 8.40 Å. It was established through EDX analysis that there is no trace of impurities. Chemical stability of BNZO with YBa2Cu3O7-δ (YBCO) has been studied by means of Rietveld analysis of experimental XRD data on several samples of BNZO-YBCO composites. Quantitative analysis of phases on the XRD patterns shows that all peaks have been indexed for both BNZO and YBCO, and no extra peak is detectable. YBCO and BNZO remain as two separate phases in the composites, with no chemical reaction. Electrical measurements also revealed that the superconducting transition temperature of both pure YBCO and the BNZO-YBCO composites is 90 K. These favorable characteristics show that BNZO can be used as a potential substrate material for the deposition of YBCO superconducting films. (copyright 2007 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim) (orig.)

  10. Creative scientific research international session of 2nd meeting on advanced pulsed-neutron research on quantum functions in nano-scale materials

    International Nuclear Information System (INIS)

    1 MW-class pulsed-neutron sources will be constructed in Japan, the United States and the United Kingdom within a few years. Now is the time to take up the challenge of innovating in neutron science and opening up new scientific fields. Toward this new era, we are developing new pulsed-neutron technologies as well as new neutron devices under international collaborations with existing pulsed-neutron facilities, such as the UK-Japan collaboration program on neutron scattering. At the same time, the new era will bring international competition to neutron researchers. We aim to create new neutron science for the coming pulsed-neutron era by introducing the new technologies developed here. For this purpose, we started the research project 'Advanced pulsed-neutron research on quantum functions in nano-scale materials' for the period from JFY2004 to JFY2008. The 2nd meeting of this project was held on 22-24 February 2005 to summarize the activities of FY2004 and to propose research projects for the coming fiscal year. In this international session, held as part of the meeting, the scientific results and research plans of the UK-Japan collaboration program, the plans for collaboration between IPNS (Intense Pulsed Neutron Source, Argonne National Laboratory) and KENS (Neutron Science Laboratory, KEK), and the recent scientific results arising from this project were presented. (author)

  11. The 2000 activities and the 2nd Workshop on Human Resources Development in the Nuclear Field as part of Asian regional cooperation

    International Nuclear Information System (INIS)

    In 1999, the Project for Human Resources Development (HRD) was initiated within the framework of the Forum for Nuclear Cooperation in Asia (FNCA), organized by the Atomic Energy Commission of Japan. The objective of the HRD Project is to solidify the foundation of technologies for nuclear development and utilization in Asia by promoting human resources development in Asian countries. The Project comprises two kinds of activities: in-workshop activities and outside-of-workshop activities. As an in-workshop activity, the 2nd Workshop on Human Resources Development in the Nuclear Field was held on November 27 and 28, 2000, at the Tokai Research Institute of JAERI. As an outside-of-workshop activity, 'The presentation of the present state of international training and education in the nuclear field in Japan' was held on November 29, 2000, after the workshop. The participating countries were China, Indonesia, South Korea, Japan, Malaysia, the Philippines, Thailand, and Vietnam. The secretariat for the Human Resources Development Project is provided by the Nuclear Technology and Education Center of the Japan Atomic Energy Research Institute. This report consists of the papers and materials presented at the Workshop, the presentation documents of 'The present state of international training and education in the nuclear field in Japan', a letter of proposal sent after the Workshop from the Project Leader of Japan to the project leaders of the participating countries, and a paper on Human Resources Development presented at the 3rd Coordinators Meeting of FNCA in Tokyo on March 14-16, 2001. (author)

  12. Proceedings of the 2nd international workshop on electromagnetic forces and related effects on blankets and other structures surrounding the fusion plasma torus

    International Nuclear Information System (INIS)

    This publication is the collection of the papers presented at the title meeting. The papers were categorized into six parts, all contained in this volume. In the first part, valuable experiences are presented concerning electromagnetic phenomena in existing large devices or those under construction. The papers in the 2nd part mainly concern the evaluation of electromagnetic fields and forces for the next experimental reactors. In the 3rd part, electromagnetomechanical coupling problems are treated by numerical and experimental approaches. In the 4th part, numerical and experimental approaches for ferromagnetic structures are presented. The 5th part contains papers related to structural integrity evaluation. The 6th part is devoted to a proposal for an intelligent material system. A summary of the panel discussion held at the final session of the workshop is also included at the end of this volume. Twenty-two of the presented papers are indexed individually. (J.P.N.)

  13. Microstructure and Mechanical Properties of Friction Stir-Welded Mg-2Nd-0.3Zn-0.4Zr Magnesium Alloy

    Science.gov (United States)

    Zhao, Yong; Wang, Qingzhao; He, Xudan; Huang, Jian; Yan, Keng; Dong, Jie

    2014-11-01

    Plates of 2 mm thick Mg-2Nd-0.3Zn-0.4Zr (NZ20K) and AZ31 were friction stir welded. The microstructures of the joints were compared, and the tensile properties at room temperature and 200 °C were measured. The fracture features and the microhardness of the joints were investigated, and the effect of the strengthening phases in the NZ20K joint was discussed in comparison with the AZ31 joint. The results indicate that the NZ20K joint shows better properties, especially at high temperature. The grains of NZ20K in the nugget zone (NZ) are markedly refined, with a uniform distribution of strengthening-phase particles, and a clear boundary is observed between the NZ and the thermo-mechanically affected zone (TMAZ). The grains of the TMAZ are elongated by the stirring action of the tool pin. The heat-affected zone is narrow, with coarse grains. XRD analysis shows that Mg12Nd is the main strengthening phase in the NZ20K joint. The ultimate tensile strength of the NZ20K joint decreases only slightly from room temperature to 200 °C, because its main strengthening-phase particle, Mg12Nd, remains stable as the temperature rises. In contrast, the ultimate tensile strength of the AZ31 joint decreases considerably at 200 °C, because its strengthening phases soften or dissolve at high temperature. The hardness of the NZ20K joint is higher than that of the AZ31 joint, and the lowest hardness of both joints occurs on the advancing side, where fracture occurred.

  14. Report on 2nd Royan Institute International Summer School on developmental biology and stem cells Tehran, Iran, 17-22nd July 2011.

    Science.gov (United States)

    Newgreen, Donald; Grounds, Miranda; Jesuthasan, Suresh; Rashidi, Hassan; Familari, Mary

    2012-03-01

    The 2nd Royan Institute International Summer School was built around the topic of stem cells, grounded in the discipline of developmental biology. The meeting provided not only direct transfer of technical and intellectual information, the normal business of scientific meetings, but also a forum for the exchange of personal ideas of science as a creative pursuit. This summer school introduced aspiring young Iranian scientists to international researchers and exposed the latter to a rich culture that highly values learning and education, as attested by the confident, intelligent young men and women who asked probing questions and who were eager to participate in the workshops. Hossein Baharvand's dedication and passion for science have led to an impressive record of national and international peer-reviewed publications and an increasing number of students who pursue science in Iran, and show how the right people can create an environment where good science, good science education and motivation will flourish. This report summarizes some of the activities of the workshop at the Royan Institute and the impressions of the visiting scientists in the wider context of the scientific and cultural heritage of Iran. PMID:22364877

  15. U-Pb and K-Ar geochronology from the Cerro Empexa Formation, 1st and 2nd Regions, Precordillera, northern Chile

    International Nuclear Information System (INIS)

    The Cerro Empexa Formation (Galli, 1957) is a regionally distributed andesitic volcanic and continental sedimentary unit exposed in the Precordillera of the 1st and 2nd Regions of northern Chile. The formation has generally been considered to lie within the Lower or 'mid' Cretaceous; however, this assignment is based on scant, unreliable geochronologic data. Furthermore, there are conflicting interpretations as to whether the unit predates or postdates the first major Mesozoic shortening event affecting northern Chile. Because of the formation's presumed mid-Cretaceous age and its stratigraphic position over older back-arc sedimentary successions, the unit has been interpreted to represent products of the first eastward jump in the Andean magmatic arc from the arc's initial position in the Cordillera de la Costa (Scheuber and Reutter, 1992). In this paper we present the results of mapping and field observations indicating that exposures previously assigned to the Cerro Empexa Formation include two andesitic volcanic units separated by a major unconformity; the Cerro Empexa Formation proper lies above this unconformity. We also present U-Pb zircon and K-Ar geochronology indicating that the Cerro Empexa Formation is latest Cretaceous in its lower levels, and we integrate our data with previously reported 40Ar/39Ar and fission-track data from the Cerros de Montecristo area (Maksaev, 1990; Maksaev and Zentilli, 1999) to show that 1800±600 m of rocks were deposited within ca. 2.5 m.y. (au)

  16. Report on the 2nd Florence International Symposium on Advances in Cardiomyopathies: 9th meeting of the European Myocardial and Pericardial Diseases WG of the ESC

    Directory of Open Access Journals (Sweden)

    Franco Cecchi

    2012-12-01

    Full Text Available A bridge between clinical and basic science aiming at cross fertilization, with leading experts presenting alongside junior investigators, is the key feature of the "2nd Florence International Symposium on Advances in Cardiomyopathies", 9th Meeting of the Myocardial and Pericardial Diseases Working Group of the European Society of Cardiology, which was held in Florence, Italy on 26-28th September 2012. Patients with cardiomyopathies, with an estimated 3 per thousand prevalence in the general population, constitute an increasingly large proportion of patients seen by most cardiologists. This class of diseases, which are mostly genetically determined with different transmission modalities, can cause important and often unsolved management problems, despite rapid advances in the field. On the other hand, few other areas of cardiology have seen such an impressive contribution from basic science and translational research to the understanding of their pathophysiology and clinical management. The course was designed to constantly promote close interaction between basic science and clinical practice and highlight the top scientific and translational discoveries in this field in 10 scientific sessions. It was preceded by two mini-courses, which included the basic concepts of cardiomyocyte mechanical and electrophysiological properties and mechanisms, how-to sessions for clinical diagnosis and management, and illustrative case study presentations of different cardiomyopathies.

  17. Results of on-line tests of the ENABLE prototype, a 2nd level trigger processor for the TRT of ATLAS/LHC

    International Nuclear Information System (INIS)

    The Enable Machine is a systolic 2nd-level trigger processor for the transition radiation tracker (TRT) of ATLAS/LHC. The task of the processor is to find the best candidate for a lepton track in a high background of pions, according to the EAST benchmark algorithm, in less than 10 μs. As described earlier, this is done in three steps. First, all interesting tracks are histogrammed by accumulating, for each track, the coincidences between the track mask and the region-of-interest (RoI). Next, the best-defined track is identified. Finally, this track is classified as e or π. A prototype has been developed and tested within the EAST/RD-11 collaboration at CERN. It operates at 50 MHz and finds up to 400 tracks in less than 10 μs. It is assembled from an interface board and one or more histogrammer boards. The modular design makes the Enable Machine easily scalable. The histogrammer units are systolic arrays consisting of a matrix of 36 field-programmable gate arrays. This makes it possible to optimize the trigger algorithm, to adapt it to a changed detector setup, and even to implement completely new algorithms. In the beam tests at CERN in autumn 1993, the overall functionality within the detector environment was demonstrated. The authors successfully linked the Enable prototype to the detector raw-data stream as well as to the data acquisition
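    The three-step scheme described in the abstract can be sketched in software. This is an illustrative toy model, not the ENABLE firmware: the channel sets, mask contents, and the simple hit-count threshold standing in for the e/π decision are all hypothetical.

    ```python
    def histogram_tracks(track_masks, roi_hits):
        """Step 1: for each candidate track, count the coincidences between
        its mask (a set of detector channels) and the hit pattern in the RoI."""
        return [len(mask & roi_hits) for mask in track_masks]

    def best_track(histogram):
        """Step 2: the best-defined track is the one with the most coincidences."""
        return max(range(len(histogram)), key=lambda i: histogram[i])

    def classify(coincidences, threshold=8):
        """Step 3: stand-in for the e/pi decision, here a simple cut on the
        coincidence count (the real algorithm uses transition-radiation hits)."""
        return "e" if coincidences >= threshold else "pi"

    roi_hits = {1, 2, 3, 5, 8, 13, 21, 34, 55, 89}    # hits inside the RoI
    track_masks = [{1, 2, 3, 5, 8, 13, 21, 34},       # candidate track A
                   {2, 4, 6, 8, 10}]                  # candidate track B
    counts = histogram_tracks(track_masks, roi_hits)  # coincidence histogram
    winner = best_track(counts)                       # index of best track
    label = classify(counts[winner])                  # "e" or "pi"
    ```

    In the hardware, step 1 is what the systolic FPGA arrays parallelize: every mask is correlated with the RoI data stream simultaneously rather than in a Python loop.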

  18. Giant dielectric permittivity caused by carrier hopping in a layered cuprate Bi2Ba2Nd1.6Ce0.4Cu2O10+δ

    International Nuclear Information System (INIS)

    A ceramic sample of the layered cuprate Bi2Ba2Nd1.6Ce0.4Cu2O10+δ, the so-called Ba-based Bi-2222 compound, was studied by measuring the temperature (80-300 K) and frequency (20-10^6 Hz) dependence of the complex dielectric permittivity. The dielectric constant was measured to be as high as ≈1000 at 1 kHz and 300 K, with a relatively low dissipation factor. However, it decreases systematically with decreasing temperature or with increasing frequency due to a dipolar relaxation process. This thermally activated relaxation process plays a dominant role in the low-frequency dielectric response. Furthermore, the frequency-dependent ac conductivity was found to obey the power law σ = Aω^s. The results were interpreted in terms of Pike's model of hopping transport of localized charge carriers, which explicitly yields the ω^s behavior and the temperature dependence of s. We calculated the ionization energy of the localized carriers to be Wm = 0.35 eV for the present sample
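    In practice the exponent s of the power law σ = Aω^s is extracted by a straight-line fit in log-log space, since log σ = log A + s·log ω. A minimal sketch of that fit, using synthetic data with hypothetical values A = 2×10⁻⁶ and s = 0.8 (not the paper's measurements):

    ```python
    import math

    def fit_power_law(omega, sigma):
        """Least-squares fit of log(sigma) = log(A) + s*log(omega); returns (A, s)."""
        xs = [math.log(w) for w in omega]
        ys = [math.log(sg) for sg in sigma]
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        # slope of the log-log line is the exponent s
        s = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
            / sum((x - mx) ** 2 for x in xs)
        a = math.exp(my - s * mx)  # intercept gives the prefactor A
        return a, s

    # Synthetic conductivity data spanning 10^2 .. 10^6 rad/s (illustrative).
    omega = [10.0 ** k for k in range(2, 7)]
    sigma = [2e-6 * w ** 0.8 for w in omega]
    A, s = fit_power_law(omega, sigma)
    ```

    Repeating such fits at each measurement temperature is what yields the temperature dependence of s that Pike's hopping model predicts.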

  19. Software for parallel processing applications

    International Nuclear Information System (INIS)

    Parallel computing has been used to solve large computing problems in high-energy physics. Typical problems include offline event reconstruction, Monte Carlo event generation and reconstruction, and lattice QCD calculations. Fermilab has extensive experience in parallel computing, using CPS (cooperative processes software) on networked UNIX workstations for the loosely-coupled problems of event reconstruction and Monte Carlo generation, and CANOPY and ACPMAPS for lattice QCD. Both systems will be discussed. Parallel software has also been developed by many other groups, both commercial and research-oriented. Examples include PVM, Express and network-Linda for workstation clusters, and PCN and STRAND88 for more tightly-coupled machines
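    The loosely-coupled pattern the abstract describes works because events are independent: a master process farms them out to workers, much as CPS distributed reconstruction jobs across networked workstations. A hedged modern sketch of the same pattern using Python's standard multiprocessing pool; the `reconstruct` function and event layout are stand-ins, not Fermilab code.

    ```python
    from multiprocessing import Pool

    def reconstruct(event):
        """Stand-in for an expensive per-event reconstruction step."""
        return {"id": event["id"], "energy": sum(event["hits"])}

    def main():
        # Each event is self-contained, so they can be processed in any order
        # on any worker: the embarrassingly-parallel case the abstract names.
        events = [{"id": i, "hits": [i, i + 1, i + 2]} for i in range(8)]
        with Pool(processes=4) as pool:
            results = pool.map(reconstruct, events)  # order-preserving map
        return results

    if __name__ == "__main__":
        print(main())
    ```

    The tightly-coupled lattice QCD case is different in kind: neighboring lattice sites must exchange data every iteration, which is why it needed dedicated systems such as CANOPY/ACPMAPS rather than a workstation farm.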

  20. Simulation Software

    Science.gov (United States)

    1996-01-01

    Various NASA Small Business Innovation Research grants from Marshall Space Flight Center, Langley Research Center and Ames Research Center were used to develop the 'kernel' of COMCO's modeling and simulation software, the PHLEX finite element code. NASA needed it to model designs of flight vehicles; one of many customized commercial applications is UNISIM, a PHLEX-based code for analyzing underground flows in oil reservoirs for Texaco, Inc. COMCO's products simulate a computational mechanics problem, estimate the solution's error and produce the optimal hp-adapted mesh for the accuracy the user chooses. The system is also used as a research or training tool in universities and in mechanical design in industrial corporations.