WorldWideScience

Sample records for 2nd generation software

  1. STARS 2.0: 2nd-generation open-source archiving and query software

    Science.gov (United States)

    Winegar, Tom

    2008-07-01

The Subaru Telescope is in the process of developing an open-source alternative to the 1st-generation software and databases (STARS 1) used for archiving and query. For STARS 2, we have chosen PHP and Python for scripting and MySQL as the database software. We have collected feedback from staff and observers, and used this feedback to significantly improve the design and functionality of our future archiving and query software. Archiving - We identified two weaknesses in the 1st-generation STARS archiving software: a complex and inflexible table structure, and uncoordinated system administration for our business model of taking pictures at the summit and archiving them in both Hawaii and Japan. We adopted a simplified and normalized table structure with passive keyword collection, and we are designing an archive-to-archive file transfer system that automatically reports real-time status and error conditions and permits error recovery. Query - We identified several weaknesses in the 1st-generation STARS query software: inflexible query tools, poor sharing of calibration data, and no automatic file transfer mechanisms to observers. We are developing improved query tools, better sharing of calibration data, and multi-protocol unassisted file transfer mechanisms for observers. In the process, we have redefined a 'query': from an invisible search result that can be transferred only once, immediately and in-house, with little status and error reporting and no error recovery - to a stored search result that can be monitored and transferred to different locations with multiple protocols, reporting status and error conditions and permitting recovery from errors.
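
The redefined 'query' above - a stored, monitorable, re-transferable search result - can be sketched as a small data structure. This is a minimal illustration only; all names (StoredQuery, TransferState, start_transfer, retry_failed) are hypothetical, not taken from STARS 2:

```python
from dataclasses import dataclass, field
from enum import Enum

class TransferState(Enum):
    PENDING = "pending"
    RUNNING = "running"
    FAILED = "failed"
    DONE = "done"

@dataclass
class StoredQuery:
    """A persisted search result that can be monitored and re-transferred."""
    query_id: str
    frame_ids: list
    transfers: list = field(default_factory=list)

    def start_transfer(self, destination: str, protocol: str) -> dict:
        # Each transfer records its destination, protocol, state and last error,
        # so status can be reported at any time.
        transfer = {"destination": destination, "protocol": protocol,
                    "state": TransferState.PENDING, "error": None}
        self.transfers.append(transfer)
        return transfer

    def retry_failed(self) -> list:
        """Error recovery: reset failed transfers so they can run again."""
        retried = [t for t in self.transfers if t["state"] is TransferState.FAILED]
        for t in retried:
            t["state"] = TransferState.PENDING
            t["error"] = None
        return retried
```

Because the result is stored rather than invisible, the same query can later be re-sent to a different destination with a different protocol, which is the key difference from the 1st-generation behaviour.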

  2. 2nd Generation Alkaline Electrolysis : Final report

    DEFF Research Database (Denmark)

    Yde, Lars; Kjartansdóttir, Cecilia Kristin

    2013-01-01

    This report provides the results of the 2nd Generation Alkaline Electrolysis project which was initiated in 2008. The project has been conducted from 2009-2012 by a consortium comprising Århus University Business and Social Science – Centre for Energy Technologies (CET (former HIRC)), Technical University of Denmark – Mechanical Engineering (DTU-ME), Technical University of Denmark – Energy Conversion (DTU-EC), FORCE Technology and GreenHydrogen.dk. The project has been supported by EUDP.

  3. 2nd Generation QUATARA Flight Computer Project

    Science.gov (United States)

    Falker, Jay; Keys, Andrew; Fraticelli, Jose Molina; Capo-Iugo, Pedro; Peeples, Steven

    2015-01-01

Single-core flight computer boards have been designed, developed, and tested (DD&T) for flight in small satellites over the last few years. In this project, a prototype flight computer will be designed as a distributed multi-core system containing four microprocessors running code in parallel. This flight computer will be capable of performing multiple computationally intensive tasks such as processing digital and/or analog data, controlling actuator systems, managing cameras, operating robotic manipulators and transmitting to/receiving from a ground station. In addition, this flight computer will be designed to be fault tolerant, both by creating a robust physical hardware connection and by using a software voting scheme to assess each processor's performance. This voting scheme will leverage the work done for the Space Launch System (SLS) flight software. The prototype flight computer will be constructed with Commercial Off-The-Shelf (COTS) components which are estimated to survive for two years in a low-Earth orbit.
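
A software voting scheme over redundant processors, as described above, can be sketched generically. This is an illustration of majority voting in general, not the SLS-derived algorithm, and the function names are hypothetical:

```python
from collections import Counter

def majority_vote(outputs):
    """Vote over the outputs of redundant processors.

    Returns (value, agreeing_count) for the most common output; raises if
    no strict majority exists, which a supervisor would treat as a fault.
    """
    counts = Counter(outputs)
    value, agreeing = counts.most_common(1)[0]
    if agreeing <= len(outputs) // 2:
        raise RuntimeError("no majority among processor outputs: %r" % counts)
    return value, agreeing

def faulty_processors(outputs, voted):
    """Indices of processors that disagreed with the voted value."""
    return [i for i, out in enumerate(outputs) if out != voted]
```

With four processors, a single faulty unit is outvoted 3-to-1 and can be identified for monitoring or isolation.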

  4. 2nd Generation alkaline electrolysis. Final report

    Energy Technology Data Exchange (ETDEWEB)

Yde, L. [Aarhus Univ. Business and Social Science - Centre for Energy Technologies (CET), Aarhus (Denmark)]; Kjartansdottir, C.K. [Technical Univ. of Denmark. DTU Mechanical Engineering, Kgs. Lyngby (Denmark)]; Allebrod, F. [Technical Univ. of Denmark. DTU Energy Conversion, DTU Risoe Campus, Roskilde (Denmark)] [and others]

    2013-03-15

The overall purpose of this project has been to contribute to this load management by developing a 2nd generation alkaline electrolysis system characterized by being compact, reliable, inexpensive and energy efficient. The specific targets for the project have been to: 1) increase cell efficiency to more than 88% (relative to the higher heating value (HHV)) at a current density of 200 mA/cm²; 2) increase the operation temperature to more than 100 °C to make the cooling energy more valuable; 3) obtain an operation pressure of more than 30 bar, thereby minimizing the need for further compression of hydrogen for storage; 4) improve the stack architecture, decreasing the price of the stack by at least 50%; 5) develop a modular design making it easy to customize plants in sizes from 20 to 200 kW; 6) demonstrate a 20 kW 2nd generation stack in H2College at the Herning campus of Aarhus University. The project has included research and development on three different electrode technology tracks: electrochemical plating, atmospheric plasma spray (APS), and a high temperature and pressure (HTP) track with operating temperature around 250 °C and pressure around 40 bar. The results show that all three electrode tracks have reached high energy efficiencies. In the electrochemical plating track, a stack efficiency of 86.5% at a current density of 177 mA/cm² and a temperature of 74.4 °C has been shown. The APS track showed cell efficiencies of 97%; however, coatings for the anode side still need to be developed. The HTP cell has reached 100% electric efficiency operating at 1.5 V (the thermoneutral voltage) with a current density of 1.1 A/cm². This track only tested small cells in an externally heated laboratory set-up, and thus the thermal loss to the surroundings cannot be given. The goal set for the 2nd generation electrolyser system has been to generate 30 bar pressure in the cell stack.
An obstacle to be overcome has been to ensure equalisation of the H₂ and O₂ pressures to prevent mixing of the gases. To solve this problem, a special equilibrium valve has been developed to mechanically ensure that the pressure on the H₂ side at all times equals that on the O₂ side. The developments have resulted in a stack design which is a cylindrical pressure vessel, with each cell having a cell 'wall' sufficiently thick to resist the high pressure, sealed with O-rings for perfect sealing at high pressures. In tests the stack has proved to resist a pressure of 45 bar, though some adjustment is still needed to optimize the pressure resistance and efficiency. When deciding on the new stack design, both a 'zero gap' and a 'non-zero gap' design were considered. The zero gap design is more efficient than non-zero gap, but it is more complex and very costly, primarily because of the additional materials and production costs for zero gap electrodes. From these considerations, the concept of a 'low gap', low diameter, high pressure, high cell number electrolyser stack was born, which could offer improved electrolyser efficiency without incurring the same high material and production costs as a zero gap solution. As a result, the low gap design and pressurized stack have reduced the price of the total system by 60%, as well as its footprint. The progress of the project required a special focus on corrosion testing and examination of polymers in order to find alternative durable membrane and gasket materials. The initial literature survey and the first tests indicated that the chemical resistance of polymers presented a greater challenge than anticipated, and that test data from commercial suppliers were insufficient to model the conditions in the electrolyser. The alkali-resistant polymers (e.g. Teflon) are costly, and the search for cheaper alternatives turned into a major aim.
A number of different tests were run under accelerated conditions and the degradation mechanisms
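
The efficiency figures quoted above follow from the relation between cell voltage and the thermoneutral voltage: on an HHV basis, efficiency is the thermoneutral voltage (about 1.48 V, which the report rounds to 1.5 V) divided by the actual cell voltage. A minimal sketch of this arithmetic:

```python
# Thermoneutral voltage of water electrolysis on an HHV basis (~1.48 V);
# the report's "1.5 V" is this value rounded.
V_TN_HHV = 1.48  # volts

def cell_efficiency_hhv(cell_voltage):
    """HHV efficiency of an electrolysis cell = thermoneutral / actual voltage."""
    return V_TN_HHV / cell_voltage
```

This makes the HTP result transparent: at the thermoneutral voltage the electric efficiency is exactly 100%. The 86.5% stack efficiency of the plating track would correspond to an average cell voltage of roughly 1.48 / 0.865 ≈ 1.71 V.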

  5. The 2nd Generation Real Time Mission Monitor (RTMM) Development

    Science.gov (United States)

    Blakeslee, Richard; Goodman, Michael; Meyer, Paul; Hardin, Danny; Hall, John; He, Yubin; Regner, Kathryn; Conover, Helen; Smith, Tammy; Lu, Jessica; Garrett, Michelle

    2009-01-01

The NASA Real Time Mission Monitor (RTMM) is a visualization and information system that fuses multiple Earth science data sources to enable real-time decision-making for airborne and ground validation experiments. Developed at the National Aeronautics and Space Administration (NASA) Marshall Space Flight Center, RTMM is a situational awareness, decision-support system that integrates satellite imagery and orbit data, radar and other surface observations (e.g., lightning location network data), airborne navigation and instrument data sets, model output parameters, and other applicable Earth science data sets. The integration and delivery of this information is made possible using data acquisition systems, network communication links, network server resources, and visualizations through the Google Earth virtual globe application. In order to improve the usefulness and efficiency of the RTMM system, capabilities are being developed to allow the end-user to easily configure RTMM applications based on their mission-specific requirements and objectives. This second generation RTMM is being redesigned to take advantage of the Google plug-in capabilities to run multiple applications in a web browser rather than the original single-application Google Earth approach. Currently RTMM employs a limited Service Oriented Architecture approach to enable discovery of mission-specific resources. We are expanding the RTMM architecture such that it will more effectively utilize the Open Geospatial Consortium Sensor Web Enablement services and other new technology software tools and components. These modifications and extensions will result in a robust, versatile RTMM system that will greatly increase the user's flexibility to choose which science data sets and support applications to view and/or use.
The improvements brought about by the RTMM 2nd generation system will provide mission planners and airborne scientists with enhanced decision-making tools and capabilities to more efficiently plan, prepare and execute missions, as well as to play back and review past mission data. To paraphrase the old television commercial: RTMM doesn't make the airborne science, it makes the airborne science easier.

  6. Super Boiler 2nd Generation Technology for Watertube Boilers

    Energy Technology Data Exchange (ETDEWEB)

    Mr. David Cygan; Dr. Joseph Rabovitser

    2012-03-31

This report describes Phase I of a proposed two-phase project to develop and demonstrate an advanced industrial watertube boiler system capable of reaching 94% (HHV) fuel-to-steam efficiency and emissions below 2 ppmv NOx, 2 ppmv CO, and 1 ppmv VOC on natural gas fuel. The boiler design would be capable of producing >1500 °F, >1500 psig superheated steam, would burn multiple fuels, and would be 50% smaller/lighter than currently available watertube boilers of similar capacity. This project builds upon the successful Super Boiler project at GTI, which employed a unique two-staged intercooled combustion system and an innovative heat recovery system to reduce NOx to below 5 ppmv and demonstrated fuel-to-steam efficiency of 94% (HHV). The project was carried out under the leadership of GTI with project partners Cleaver-Brooks, Inc., Nebraska Boiler, a Division of Cleaver-Brooks, and Media and Process Technology Inc., and project advisors Georgia Institute of Technology, Alstom Power Inc., Pacific Northwest National Laboratory and Oak Ridge National Laboratory. Phase I efforts focused on developing 2nd generation boiler concepts and performance modeling; incorporating multi-fuel (natural gas and oil) capabilities; assessing heat recovery, heat transfer and steam superheating approaches; and developing the overall conceptual engineering boiler design. Based on our analysis, the 2nd generation industrial watertube boiler, when developed and commercialized, could potentially save 265 trillion Btu and $1.6 billion in fuel costs across U.S. industry through increased efficiency. Its ultra-clean combustion could eliminate 57,000 tons of NOx, 460,000 tons of CO, and 8.8 million tons of CO2 annually from the atmosphere. The reduction in boiler size will bring cost-effective package boilers into a size range previously dominated by more expensive field-erected boilers, benefiting manufacturers and end users through lower capital costs.

  7. A ZeroDimensional Model of a 2nd Generation Planar SOFC Using Calibrated Parameters

    Directory of Open Access Journals (Sweden)

    Brian Elmegaard

    2006-12-01

This paper presents a zero-dimensional mathematical model of a planar 2nd generation co-flow SOFC, developed for simulation of power systems. The model accounts for the electrochemical oxidation of hydrogen as well as the methane reforming reaction and the water-gas shift reaction. An important part of the paper is the electrochemical sub-model, in which experimental data were used to calibrate specific parameters. The SOFC model was implemented in the DNA simulation software, which is designed for energy system simulation. The result is an accurate and flexible tool suitable for simulation of many different SOFC-based power systems.
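
The zero-dimensional approach described - a Nernst potential minus lumped loss terms whose parameters are calibrated against experiment - can be illustrated as follows. The standard potential E0 and area-specific resistance ASR used here are placeholder assumptions for the sketch, not the paper's calibrated parameters:

```python
import math

F = 96485.0  # Faraday constant, C/mol
R = 8.314    # gas constant, J/(mol K)

def nernst_voltage(T, p_h2, p_o2, p_h2o, E0=0.95):
    """Nernst potential for H2 oxidation (T in K, partial pressures in bar).

    E0 is a placeholder standard potential; a real model evaluates it
    from thermodynamic data at the operating temperature.
    """
    return E0 + (R * T) / (2 * F) * math.log(p_h2 * math.sqrt(p_o2) / p_h2o)

def cell_voltage(T, i, p_h2, p_o2, p_h2o, ASR=0.3):
    """Zero-dimensional cell voltage: Nernst potential minus an ohmic-style
    loss term i * ASR (A/cm^2 times ohm cm^2) lumping all overpotentials.

    In a calibrated model, ASR (or a richer loss expression) is the quantity
    fitted to experimental polarization data.
    """
    return nernst_voltage(T, p_h2, p_o2, p_h2o) - i * ASR
```

The point of the zero-dimensional form is that a single lumped relation like this, once calibrated, is cheap enough to embed in whole-plant system simulations.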

  8. From 1st- to 2nd-Generation Biofuel Technologies: Extended Executive Summary

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2008-07-01

    This report looks at the technical challenges facing 2nd-generation biofuels, evaluates their costs and examines related current policies to support their development and deployment. The potential for production of more advanced biofuels is also discussed. Although significant progress continues to be made to overcome the technical and economic challenges, 2nd-generation biofuels still face major constraints to their commercial deployment.

  9. Colchicine treatment of jute seedlings in the 1st and 2nd generation after irradiation

    International Nuclear Information System (INIS)

    Colchicine treatment (0.05% for 12 h) to 15 day old seedlings in the 1st generation after X-ray or gamma-ray exposure was lethal. In contrast the same colchicine treatment to 15 day old seedlings in the 2nd generation was effective in inducing polyploids. (author)

  10. Sustainable Production of Fuel : A Study for Customer Adoption of 2nd Generation of Biofuel

    OpenAIRE

    Jin, Ying

    2010-01-01

Finding a new fuel to substitute for gasoline, whose supply shrinks rapidly every year, is an urgent problem worldwide. In this situation, biofuel is considered a new kind of fuel that causes no pollution. Nowadays, 1st generation biofuel is familiar to people and adopted by customers, which gives it a stable market share. Since it also brings new problems, 2nd generation biofuel has appeared to solve these problems. In the thesis, I compared the pros and cons between the 1st and 2n...

  11. Effects of Thermal Cycling on Control and Irradiated EPC 2nd Generation GaN FETs

    Science.gov (United States)

    Patterson, Richard L.; Scheick, Leif; Lauenstein, Jean-Marie; Casey, Megan; Hammoud, Ahmad

    2013-01-01

    The power systems for use in NASA space missions must work reliably under harsh conditions including radiation, thermal cycling, and exposure to extreme temperatures. Gallium nitride semiconductors show great promise, but information pertaining to their performance is scarce. Gallium nitride N-channel enhancement-mode field effect transistors made by EPC Corporation in a 2nd generation of manufacturing were exposed to radiation followed by long-term thermal cycling in order to address their reliability for use in space missions. Results of the experimental work are presented and discussed.

  12. The Planar Optics Phase Sensor: a study for the VLTI 2nd Generation Fringe Tracker

    CERN Document Server

Blind, Nicolas; Absil, Olivier; Alamir, Mazen; Berger, Jean-Philippe; Defrère, Denis; Feautrier, Philippe; Hénault, François; Jocou, Laurent; Kern, Pierre; Laurent, Thomas; Malbet, Fabien; Mourard, Denis; Rousselet-Perrault, Karine; Sarlette, Alain; Surdej, Jean; Tarmoul, Nassima; Tatulli, Eric; Vincent, Lionel; 10.1117/12.857114

    2010-01-01

In a few years, the second generation instruments of the Very Large Telescope Interferometer (VLTI) will routinely provide observations with 4 to 6 telescopes simultaneously. To reach their ultimate performance, they will need a fringe sensor capable of measuring in real time the randomly varying optical path differences. A collaboration between LAOG (PI institute), IAGL, OCA and GIPSA-Lab has proposed the Planar Optics Phase Sensor concept to ESO for the 2nd Generation Fringe Tracker. This concept is based on integrated optics technologies, enabling the design of extremely compact interferometric instruments that naturally provide single-mode spatial filtering. It allows operation with 4 and 6 telescopes by measuring the fringe position with a spectrally dispersed ABCD method. We present here the main analyses that led to the current concept, as well as the expected on-sky performance and the proposed design.
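
The ABCD method estimates the fringe phase from four intensity samples spaced a quarter of a fringe apart. A minimal sketch under one common sign convention (not necessarily the instrument's), with hypothetical function names:

```python
import math

def abcd_samples(phi, i0=1.0, visibility=1.0):
    """Four fringe-intensity samples a quarter fringe apart:
    I_k = I0 * (1 + V * cos(phi + k*pi/2)) for k = 0..3 (A, B, C, D)."""
    return [i0 * (1 + visibility * math.cos(phi + k * math.pi / 2))
            for k in range(4)]

def abcd_phase(a, b, c, d):
    """Recover the fringe phase: A - C is proportional to cos(phi)
    and D - B to sin(phi), so phi = atan2(D - B, A - C)."""
    return math.atan2(d - b, a - c)
```

Dispersing the fringes spectrally, as the instrument does, amounts to applying this estimate independently in each spectral channel, which also resolves the 2π phase ambiguity across channels.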

  13. The New 2nd-Generation SRF R&D Facility at Jefferson Lab: TEDF

    Energy Technology Data Exchange (ETDEWEB)

    Reece, Charles E.; Reilly, Anthony V.

    2012-09-01

The US Department of Energy has funded a near-complete renovation of the SRF-based accelerator research and development facilities at Jefferson Lab. The project to accomplish this, the Technical and Engineering Development Facility (TEDF) Project, has completed the first of two phases. An entirely new 3,100 m² purpose-built SRF technical work facility has been constructed and was occupied in the summer of 2012. All SRF work processes, with the exception of cryogenic testing, have been relocated into the new building. All cavity fabrication, processing, thermal treatment, chemistry, cleaning, and assembly work is collected conveniently into a new LEED-certified building. An innovatively designed 800 m² cleanroom/chemroom suite provides long-term flexibility for support of multiple R&D and construction projects as well as continued process evolution. The characteristics of this first 2nd-generation SRF facility are described.

  14. Boosting biogas yield of anaerobic digesters by utilizing concentrated molasses from 2nd generation bioethanol plant

    Energy Technology Data Exchange (ETDEWEB)

    Sarker, Shiplu [Department of Renewable Energy, Faculty of Engineering and Science, University of Agder, Grimstad-4879 (Norway); Moeller, Henrik Bjarne [Department of Biosystems Engineering, Faculty of Science and Technology, Aarhus University, Research center Foulum, Blichers Alle, Post Box 50, Tjele-8830 (Denmark)

    2013-07-01

Concentrated molasses (C5 molasses) from a 2nd generation bioethanol plant has been investigated for enhancing the productivity of manure-based digesters. A batch study under mesophilic conditions (35 ± 1 °C) showed a maximum methane yield from molasses of 286 L CH4/kg VS, approximately 63% of the calculated theoretical yield. In addition to the batch study, co-digestion of molasses with cattle manure in a semi-continuously stirred reactor at thermophilic temperature (50 ± 1 °C) was also performed with a stepwise increase in molasses concentration. The results from this experiment revealed a maximum average biogas yield of 1.89 L/L/day when 23% VS from molasses was co-digested with cattle manure. However, digesters fed with more than 32% VS from molasses and with a short adaptation period showed VFA accumulation and reduced methane productivity, indicating that this level should not be exceeded when using molasses as a biogas booster.
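
The batch result implies a simple back-calculation of the theoretical yield the authors compared against; a small sketch of the arithmetic:

```python
def percent_of_theoretical(measured, theoretical):
    """Measured yield expressed as a percentage of the theoretical yield."""
    return 100.0 * measured / theoretical

# From the abstract: 286 L CH4/kg VS was ~63% of the calculated theoretical
# yield, implying a theoretical yield of roughly 286 / 0.63 ≈ 454 L CH4/kg VS.
theoretical = 286 / 0.63
```

This back-of-the-envelope figure (~454 L CH4/kg VS) is only an inference from the two numbers the abstract gives, not a value stated in the report itself.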

  15. Multi-objective Optimization of a Solar Assisted 1st and 2nd Generation Sugarcane Ethanol Production Plant

    OpenAIRE

    Zevenhoven, Ron; Wallerand, Anna Sophia; Queiroz Albarelli, Juliana; Viana Ensinas, Adriano; Ambrosetti, Gianluca; Mian, Alberto

    2014-01-01

    Ethanol production sites utilizing sugarcane as feedstock are usually located in regions with high land availability and decent solar radiation. This offers the opportunity to cover parts of the process energy demand with concentrated solar power (CSP) and thereby increase the fuel production and carbon conversion efficiency. A plant is examined that produces 1st and 2nd generation ethanol by fermentation of sugars (from sugarcane) and enzymatic hydrolysis of the lignocellulosic residues (bag...

  16. The 2nd Generation Street Children (SGSC) in Accra: Developing Teaching Strategies To Enhance Positive Learning Outcomes in Schools

    OpenAIRE

    Alhassan Abdul-Razak Kuyini; Okechuwu Abosi

    2011-01-01

Ghana is witnessing an increasing number of 2nd generation street children (SGSC) living on the streets of Accra, the capital city, as a result of many factors including teenage pregnancy among street girls, ethnic conflicts and rural-urban migration. The street presents enormous risks to street children; they are excluded from a safe family environment, basic services like health and education, and protection against exploitation. This article explored the inclusion of 27 SGSC in regular schools in ...

  17. Generation of higher order Gauss-Laguerre modes in single-pass 2nd harmonic generation

    DEFF Research Database (Denmark)

    Buchhave, Preben; Tidemand-Lichtenberg, Peter

    2008-01-01

    We present a realistic method for dynamic simulation of the development of higher order modes in second harmonic generation. The deformation of the wave fronts due to the nonlinear interaction is expressed by expansion in higher order Gauss-Laguerre modes.

  18. Next Generation Millimeter/Submillimeter Array to Search for 2nd Earth

    OpenAIRE

    Saito, Masao; Iguchi, Satoru

    2011-01-01

ALMA is a revolutionary radio telescope at present, and its full operation will start in 2012. It is expected that ALMA will resolve several cosmic questions and show us a new view of the cosmos. Our passion for astronomy naturally goes beyond ALMA, because we believe that 21st-century astronomy should pursue the new scientific frontier. In this conference, we propose a project for a future radio telescope to search for habitable planets and finally detect a 2nd Earth as...

  19. White paper on perspectives of biofuels in Denmark - with focus on 2nd generation bioethanol; Hvidbog om perspektiver for biobraendstoffer i Danmark - med fokus paa 2. generations bioethanol

    Energy Technology Data Exchange (ETDEWEB)

    Larsen, Gy.; Foghmar, J.

    2009-11-15

    The white paper presents the perspectives - both options and barriers - for a Danish focus on production and use of biomass, including sustainable 2nd generation bioethanol, for transport. The white paper presents the current knowledge of biofuels and bioethanol and recommendations for a Danish strategy. (ln)

  20. Time resolved 2nd harmonic generation at LaAlO3/SrTiO3 Interfaces

    Science.gov (United States)

    Adhikari, Sanjay; Eom, Chang-Beom; Ryu, Sangwoo; Cen, Cheng

    2014-03-01

Ultrafast spectroscopy can produce information on carrier/lattice dynamics, which is especially valuable for understanding phase transitions at LaAlO3/SrTiO3 interfaces. LaAlO3 (LAO) and SrTiO3 (STO) both have wide band gaps, which allow deep penetration of commonly used laser wavelengths and therefore usually lead to an overwhelming bulk signal background. Here we report a time-resolved study of a 2nd harmonic generation (SHG) signal resulting from impulsive below-the-band-gap optical pumping. The nonlinear nature of the signal enables us to probe the interface directly. The output of a home-built Ti:Sapphire laser and a BBO crystal were used to generate 30 fs pulses of two colors (405 nm and 810 nm). The 405 nm pulse was used to pump the LAO/STO interfaces, while the 2nd harmonic of the 810 nm pulse generated at the interfaces was probed as a function of the time delay. Signals from samples with varying LAO thicknesses clearly correlate with the metal-insulator transition. Distinct time-dependent signals were observed at LAO/STO interfaces grown on different substrates. Experiments performed at different optical polarization geometries, interface electric fields and temperatures allow us to paint a clearer picture of the novel oxide heterostructures under investigation.

  1. Clinical evaluation of the 2nd generation radio-receptor assay for anti-thyrotropin receptor antibodies (TRAb) in Graves' disease

    International Nuclear Information System (INIS)

Detection of autoantibodies to the TSH receptor by radioreceptor assays (RRA) is widely requested in clinical practice for the diagnosis of Graves' disease and its differentiation from diffuse thyroid autonomy. Additionally, TRAb measurement during antithyroid drug treatment can be useful to evaluate the risk of disease relapse after therapy discontinuation. Nevertheless, some patients affected by Graves' disease are TRAb-negative when the 1st generation assay is used. Recently, a new RRA method for the TRAb assay was developed using a human recombinant TSH receptor and a solid-phase technique. The aim of our work was to compare the 1st and 2nd generation TRAb assays in Graves' disease patients and, particularly, to evaluate the 2nd generation test in a sub-group of patients affected by Graves' disease but negative on the 1st generation TRAb assay. We evaluated the diagnostic performance of a newly developed 2nd generation TRAb assay (DYNOtest® TRAK human, BRAHMS Diagnostica GmbH, Germany) in 46 patients affected by Graves' disease with a negative 1st generation TRAb assay (TRAK Assay®, BRAHMS Diagnostica GmbH, Germany). Control groups of 50 Graves' disease patients with a positive 1st generation TRAb assay, 50 patients affected by Hashimoto's thyroiditis and 50 patients affected by nodular goiter were also examined. 41 out of 46 patients affected by Graves' disease with a negative 1st generation TRAb assay showed a positive 2nd generation test. The overall sensitivity of the 2nd generation test was significantly improved with respect to the 1st generation assay in Graves' disease patients (χ² = 22.5, p < 0.0001). 1 and 3 out of 50 patients affected by Hashimoto's thyroiditis were positive by the 1st and 2nd generation TRAb assays, respectively. All these patients showed primary hypothyroidism. No differences resulted in the euthyroid Hashimoto's thyroiditis sub-group or in the nodular goiter control group.
The 2nd generation TRAb assay is clearly more sensitive than the 1st generation test and should be used in clinical practice to minimize the incidence of TRAb-negative Graves' disease. Long-term prospective studies are needed to evaluate the prognostic role of the 2nd generation TRAb assay in Graves' disease treated with antithyroid drugs. (author)
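
The sensitivity gain can be illustrated with the numbers in the abstract. One assumption is needed and is flagged in the comments; the χ² value itself is not reproduced here, since the abstract does not state which test statistic was used:

```python
def sensitivity(true_positives, total_with_disease):
    """Fraction of diseased patients the assay flags as positive."""
    return true_positives / total_with_disease

# 96 Graves' disease patients in total: 50 positive on the 1st generation
# assay plus 46 negative on it, of whom 41 were positive on the 2nd
# generation assay.
# Assumption (not stated in the abstract): the 50 patients positive on the
# 1st generation assay also test positive on the 2nd generation assay.
first_gen = sensitivity(50, 96)        # ~52% sensitivity
second_gen = sensitivity(50 + 41, 96)  # ~95% sensitivity
```

Under that assumption, the 2nd generation assay misses only 5 of 96 Graves' patients instead of 46, which is the improvement the χ² statistic quantifies.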

  2. Cogeneration and production of 2nd generation bio fuels using biomass gasification; Cogeneracion y produccion de biocombustibles de 2 generacion mediante gasificacion de biomasa

    Energy Technology Data Exchange (ETDEWEB)

    Uruena Leal, A.; Diez Rodriguez, D.; Antolin Giraldo, G.

    2011-07-01

Gasification is a thermochemical decomposition process in which a carbonaceous fuel, under certain conditions of temperature and oxygen deficiency, undergoes a series of reactions that produce gaseous products. It is now widely used because of the high energetic performance and versatility of these gaseous products for energy and 2nd generation bio fuels, and because it reduces the emission of greenhouse gases. (Author)

  3. Self-assembling software generator

    Science.gov (United States)

    Bouchard, Ann M. (Albuquerque, NM); Osbourn, Gordon C. (Albuquerque, NM)

    2011-11-25

    A technique to generate an executable task includes inspecting a task specification data structure to determine what software entities are to be generated to create the executable task, inspecting the task specification data structure to determine how the software entities will be linked after generating the software entities, inspecting the task specification data structure to determine logic to be executed by the software entities, and generating the software entities to create the executable task.
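
The three inspection steps described above (which entities to generate, how they are linked, what logic they execute) can be sketched as follows. The task-specification layout and all names here are hypothetical, and a simple linear pipeline of entities is assumed:

```python
# Hypothetical task specification: entities to generate, links between them,
# and the logic each entity executes.
task_spec = {
    "entities": ["reader", "transformer", "writer"],
    "links": [("reader", "transformer"), ("transformer", "writer")],
    "logic": {
        "reader": lambda data: data,
        "transformer": lambda data: [x * 2 for x in data],
        "writer": lambda data: data,
    },
}

def generate_task(spec):
    """Inspect the spec for entities, links, and logic, then assemble an
    executable task (here: a callable pipeline over a linear chain of links)."""
    # Step 1 + 3: generate the software entities with their logic.
    entities = {name: spec["logic"][name] for name in spec["entities"]}
    # Step 2: determine how the entities are linked (linear chain assumed).
    order = [src for src, _ in spec["links"]] + [spec["links"][-1][1]]

    def run(data):
        for name in order:
            data = entities[name](data)
        return data
    return run
```

A real self-assembling generator would of course emit and link far richer entities than closures, but the shape is the same: the executable task is derived entirely by inspecting the specification data structure.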

  4. Strategies for 2nd generation biofuels in EU - Co-firing to stimulate feedstock supply development and process integration to improve energy efficiency and economic competitiveness

    International Nuclear Information System (INIS)

The present biofuel policies in the European Union primarily stimulate 1st generation biofuels that are produced from conventional food crops. They may be a distraction from lignocellulose-based 2nd generation biofuels - and also from biomass use for heat and electricity - by keeping farmers' attention and significant investments focused on first generation biofuels and the cultivation of conventional food crops as feedstocks. This article presents two strategies that can contribute to the development of 2nd generation biofuels based on lignocellulosic feedstocks. The integration of gasification-based biofuel plants in district heating systems is one option for increasing the energy efficiency and improving the economic competitiveness of such biofuels. Another option, biomass co-firing with coal, generates high-efficiency biomass electricity and reduces CO2 emissions by replacing coal. It also offers a near-term market for lignocellulosic biomass, which can stimulate the development of supply systems for biomass also suitable as feedstock for 2nd generation biofuels. Regardless of the long-term priorities of biomass use for energy, the stimulation of lignocellulosic biomass production by the development of near-term and cost-effective markets is judged to be a no-regrets strategy for Europe. Strategies that induce a relevant development and exploit existing energy infrastructures in order to reduce risk and reach lower costs are proposed as an attractive complement to the present and prospective biofuel policies.

  6. Anaerobic digestion in combination with 2nd generation ethanol production for maximizing biofuels yield from lignocellulosic biomass – testing in an integrated pilot-scale biorefinery plant

    DEFF Research Database (Denmark)

    Uellendahl, Hinrich; Ahring, Birgitte Kiær

    2010-01-01

    An integrated biorefinery concept for 2nd generation bioethanol production together with biogas production from the fermentation effluent was tested in pilot-scale. The pilot plant comprised pretreatment, enzymatic hydrolysis, hexose and pentose fermentation into ethanol and anaerobic digestion of the fermentation effluent in a UASB (upflow anaerobic sludge blanket) reactor. Operation of the 770 liter UASB reactor was tested under both mesophilic (38°C) and thermophilic (53°C) conditions with in...

  7. Experimental Investigation of 2nd Generation Bioethanol Derived from Empty-Fruit-Bunch (EFB) of Oil-Palmon Performance and Exhaust Emission of SI Engine

    OpenAIRE

    Yanuandri Putrasari; Haznan Abimanyu; Achmad Praptijanto; Arifin Nur; Yan Irawan; Sabar Pangihutan

    2014-01-01

    The experimental investigation of 2nd generation bioethanol derived from EFB of oil-palm blended with gasoline for 10, 20, 25% by volume and pure gasoline were conducted on performance and exhaust emission tests of SI engine. A four stroke, four cylinders, programmed fuel injection (PGMFI), 16 valves variable valve timing and electronic lift control (VTEC), single overhead camshaft (SOHC), and 1497 cm3 SI engine (Honda/L15A) was used in this investigation. Engine performance test was carried ...

  8. Development of WWER-440 fuel. Use of fuel assemblies of 2-nd and 3-rd generations with increased enrichment

    International Nuclear Information System (INIS)

    The problem of increasing the power of units at NPPs with WWER-440 reactors is of current importance, and all the necessary prerequisites for solving it exist as a result of updates to the fuel assembly designs and codes. The decrease of the power peaking factor in the core is achieved by using profiled fuel assemblies, integrated burnable absorber in the fuel, FAs with a modernized docking unit, and modern codes, which allow the conservatism of the RP safety substantiation to be decreased. A wide range of experimental studies of fuel behaviour in transition and emergency conditions has been performed on fuel that has reached burn-ups of (50-60) MW·day/kgU, and post-reactor studies of fuel assemblies, fuel rods and fuel pellets with a 5-year operating period have been performed. These prove the high reliability of the fuel and the presence of a large margin in the fuel column, which supports reactor operation at increased power. The results of the work performed on the introduction of 5-6 fuel cycles show that the ultimate operability limit of fuel in WWER-440 reactors is far from being reached. The neutron-physical and thermal-hydraulic characteristics of the cores of operating power units with RP V-213 are such that the actual (design and measured) power peaking factors for fuel assemblies and fuel rods are, as a rule, smaller than the maximum design values. This factor is a real reserve for power uprating. There is experience of operating Units 1, 2 and 4 of the Kola NPP and Unit 2 of the Rovno NPP at increased power, and the units of the Loviisa NPP are operated at 109% power. During transfer to operation at increased power it is reasonable to use fuel assemblies with increased height of the fuel column, which allows the average linear power to be decreased.
Further development of the 2nd generation fuel assembly design, and subsequent transition to 3rd generation working fuel assemblies, provides significant improvement of fuel utilization under conditions of WWER-440 reactor operation with longer fuel cycles and increased power

  9. Open pit mine planning and design. Vol. 1. Fundamentals; Vol. 2. CSMine software package and orebody case examples. 2nd ed.

    Energy Technology Data Exchange (ETDEWEB)

    Hustrulid, W.; Kuchta, M. [University of Utah, Salt Lake City, UT (United States)

    2006-04-15

    This book is designed to be both a textbook and a reference book describing the principles involved in the planning and design of open pit mines. Volume 1 deals with the fundamental concepts involved in the planning and design of an open pit mine. The eight chapters cover mine planning, mining revenues and costs, orebody description, geometrical considerations, pit limits, and production planning, mineral resources and ore reserves, and responsible mining. There is an extensive coverage of environmental considerations and basic economic principles. A large number of examples have been included to illustrate the applications. A second volume is devoted to a mine design and software package, CSMine. CSMine is user-friendly mine planning and design software developed specifically to illustrate the practical application of the involved principles. It also comprises the CSMine tutorial, the CSMine user's manual and eight orebody case examples, including drillhole data sets for performing a complete open pit mine evaluation. 545 ills., 211 tabs.

  10. Immobilized High Level Waste (HLW) Interim Storage Alternative Generation and analysis and Decision Report 2nd Generation Implementing Architecture

    Energy Technology Data Exchange (ETDEWEB)

    CALMUS, R.B.

    2000-09-14

    Two alternative approaches were previously identified to provide second-generation interim storage of Immobilized High-Level Waste (IHLW). One approach was retrofit modification of the Fuel and Materials Examination Facility (FMEF) to accommodate IHLW. The results of the evaluation of the FMEF as the second-generation IHLW interim storage facility and subsequent decision process are provided in this document.

  11. BASE - 2nd generation software for microarray data management and analysis

    OpenAIRE

    Nordborg Nicklas; Vallon-Christersson Johan; Svensson Martin; Häkkinen Jari

    2009-01-01

    Abstract Background Microarray experiments are increasing in size and samples are collected asynchronously over long time. Available data are re-analysed as more samples are hybridized. Systematic use of collected data requires tracking of biomaterials, array information, raw data, and assembly of annotations. To meet the information tracking and data analysis challenges in microarray experiments we reimplemented and improved BASE version 1.2. Results The new BASE presented in this report is ...

  12. Advances with the new AIMS fab 193 2nd generation: a system for the 65 nm node including immersion

    Science.gov (United States)

    Zibold, Axel M.; Poortinga, E.; Doornmalen, H. v.; Schmid, R.; Scherubl, T.; Harnisch, W.

    2005-06-01

    The Aerial Image Measurement System, AIMS, for 193nm lithography emulation is established as a standard for the rapid prediction of wafer printability for critical structures, including dense patterns and defects or repairs on masks. The main benefit of AIMS is to save expensive image qualification consisting of test wafer exposures followed by wafer CD-SEM resist or wafer analysis. By adjustment of numerical aperture (NA), illumination type and partial coherence (σ) to match any given stepper/scanner, AIMS predicts the printability of 193nm reticles such as binary masks, with or without OPC, and phase-shifting masks. A new AIMS fab 193 second-generation system with a maximum NA of 0.93 is now available. Improvements in field uniformity, stability over time, measurement automation and higher throughput meet the challenging requirements of the 65nm node. A new function, "Global CD Map", can be applied to automatically measure and analyse the global CD uniformity of repeating structures across a reticle. With the options of extended depth-of-focus (EDOF) software and the upcoming linear polarisation capability in the illumination, the new AIMS fab 193 second-generation system is able to cover both dry and immersion requirements for NA < 1. Rigorous simulations have been performed to study the effects of polarisation on imaging by comparing the aerial image of the AIMS to the resist image of the scanner.
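    The connection between the quoted optics settings and the 65 nm node can be illustrated with the Rayleigh resolution relation. A minimal sketch, using the wavelength (193 nm) and maximum NA (0.93) from the record; the k1 process factor is an assumed typical low-k1 value, not a figure from the abstract.

```python
# Rayleigh criterion: minimum printable half-pitch R = k1 * lambda / NA.
# lambda = 193 nm and NA = 0.93 come from the record; k1 = 0.31 is assumed.

def rayleigh_half_pitch(wavelength_nm: float, na: float, k1: float) -> float:
    """Minimum printable half-pitch in nanometres."""
    return k1 * wavelength_nm / na

if __name__ == "__main__":
    r = rayleigh_half_pitch(193.0, 0.93, 0.31)
    print(f"half-pitch ~ {r:.1f} nm")  # ~64 nm, consistent with the 65 nm node
```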

  13. Methodology for measuring the impact of mobile technology change from 2nd to 3rd generation perceived by users of SMEs in Barranquilla

    Directory of Open Access Journals (Sweden)

    Jairo Polo

    2011-06-01

    Full Text Available This article presents the results of a research project undertaken to obtain a Master's in Business Administration from the Business School at the Universidad del Norte, whose purpose was to identify and test a methodology to measure the impact exerted by the change from 2nd to 3rd generation mobile technology, based on the perception of users belonging to Barranquilla SMEs. The work was motivated by the influence of technological changes on behavior and knowledge creation among the members of society, and by the importance that the adoption of applications for process automation and web-based applications for voice, data and video has taken on for the survival of organizations, since these allow the development of competitive advantages based on information and creativity for new and better products or services.

  14. FT-IR Investigation of Hoveyda-Grubbs' 2nd Generation Catalyst in Self-Healing Epoxy Mixtures

    International Nuclear Information System (INIS)

    The development of smart composites capable of self-repair on aeronautical structures is still at the planning stage owing to complex issues to overcome. A very important issue concerns the stability of the proposed composites' components, which is compromised at the cure temperatures necessary for good performance of the composite. In this work we analyzed the possibility of applying Hoveyda-Grubbs' second generation catalyst (HG2) to develop self-healing systems. Our experimental results have shown critical issues in the use of epoxy precursors in conjunction with the Hoveyda-Grubbs II metathesis catalyst. However, an appropriate curing cycle of the self-healing mixture makes it possible to overcome these issues, allowing high curing temperatures without deactivating the self-repair activity.

  15. Control system for the 2nd generation Berkeley automounters (BAM2) at GM/CA-CAT macromolecular crystallography beamlines

    Energy Technology Data Exchange (ETDEWEB)

    Makarov, O., E-mail: makarov@anl.gov [GM/CA-CAT, Biosciences Division, Argonne National Laboratory, Argonne, IL 60439 (United States); Hilgart, M.; Ogata, C.; Pothineni, S. [GM/CA-CAT, Biosciences Division, Argonne National Laboratory, Argonne, IL 60439 (United States); Cork, C. [Physical Biosciences Division, Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States)

    2011-09-01

    GM/CA-CAT at Sector 23 of the Advanced Photon Source (APS) is an NIH funded facility for crystallographic structure determination of biological macromolecules by X-ray diffraction. A second-generation Berkeley automounter is being integrated into the beamline control system at the 23BM experimental station. This new device replaces the previous all-pneumatic gripper motions with a combination of pneumatics and XYZ motorized linear stages. The latter adds a higher degree of flexibility to the robot including auto-alignment capability, accommodation of a larger capacity sample Dewar of arbitrary shape, and support for advanced operations such as crystal washing, while preserving the overall simplicity and efficiency of the Berkeley automounter design.

  16. Anaerobic digestion in combination with 2nd generation ethanol production for maximizing biofuels yield from lignocellulosic biomass – testing in an integrated pilot-scale biorefinery plant.

    DEFF Research Database (Denmark)

    Uellendahl, Hinrich; Ahring, Birgitte Kiær

    An integrated biorefinery concept for 2nd generation bioethanol production together with biogas production from the fermentation effluent was tested in pilot-scale. The pilot plant comprised pretreatment, enzymatic hydrolysis, hexose and pentose fermentation into ethanol and anaerobic digestion of the fermentation effluent in a UASB (upflow anaerobic sludge blanket) reactor. Operation of the 770 liter UASB reactor was tested under both mesophilic (38°C) and thermophilic (53°C) conditions with increasing loading rates of the liquid fraction of the effluent from ethanol fermentation. At an OLR of 3.5 kg-VS/(m3·d) a methane yield of 340 L/kg-VS was achieved for thermophilic operation while 270 L/kg-VS was obtained under mesophilic conditions. Thermophilic operation was, however, less robust towards further increase of the loading rate and for loading rates higher than 5 kg-VS/(m3·d) the yield was higher for mesophilic than for thermophilic operation. The effluent from the ethanol fermentation showed no signs of toxicity to the anaerobic microorganisms. Implementation of the biogas production from the fermentation effluent accounted for about 30% higher biofuels yield in the biorefinery compared to a system with only bioethanol production.
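    The daily methane production implied by the figures above follows from load times specific yield. A minimal arithmetic sketch using only the numbers quoted in the abstract (770 L reactor, OLR 3.5 kg-VS/(m3·d), yields of 340 and 270 L/kg-VS):

```python
# Daily methane production = reactor volume * OLR * specific methane yield.
# All figures are taken from the abstract above; nothing new is assumed.

REACTOR_M3 = 0.770          # 770 litre UASB reactor
OLR = 3.5                   # organic loading rate, kg-VS per m3 per day
YIELDS = {"thermophilic": 340.0, "mesophilic": 270.0}  # L CH4 per kg-VS

vs_load = REACTOR_M3 * OLR  # kg-VS fed per day (~2.7 kg/d)
for mode, y in YIELDS.items():
    print(f"{mode}: {vs_load * y:.0f} L CH4/day")
# thermophilic: ~916 L/day, mesophilic: ~728 L/day
```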

  17. Experimental Investigation of 2nd Generation Bioethanol Derived from Empty-fruit-bunch (EFB of Oil-palm on Performance and Exhaust Emission of SI Engine

    Directory of Open Access Journals (Sweden)

    Yanuandri Putrasari

    2014-07-01

    Full Text Available The experimental investigation of 2nd generation bioethanol derived from EFB of oil-palm blended with gasoline at 10, 20 and 25% by volume, and of pure gasoline, was conducted through performance and exhaust emission tests on an SI engine. A four-stroke, four-cylinder, programmed fuel injection (PGMFI), 16-valve variable valve timing and electronic lift control (VTEC), single overhead camshaft (SOHC), 1,497 cm3 SI engine (Honda/L15A) was used in this investigation. The engine performance test covered brake torque, power, and fuel consumption. The exhaust emission was analyzed for carbon monoxide (CO) and hydrocarbons (HC). The engine was operated over a speed range from 1,500 to 4,500 rev/min at 85% throttle opening. The results showed that the highest brake torque of the bioethanol blends was achieved by the 10% bioethanol blend at 3,000 to 4,500 rpm; the brake power was greater than that of pure gasoline at 3,500 to 4,500 rpm for 10% bioethanol; and the 10 and 20% bioethanol-gasoline blends resulted in greater bsfc than pure gasoline at low speeds from 1,500 to 3,500 rpm. CO and HC emissions tended to decrease as engine speed increased.
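    The brake torque and brake power figures discussed above are linked by the standard relation P = 2πNT/60. A minimal sketch of that conversion; the torque value used is an assumed illustration, not a measurement from the record.

```python
import math

# Brake power (kW) from torque (N*m) and engine speed (rev/min):
# P = 2 * pi * N * T / 60, divided by 1000 for kW.
# The 120 N*m / 4500 rpm operating point below is assumed for illustration.

def brake_power_kw(torque_nm: float, rpm: float) -> float:
    """Brake power in kW from torque and engine speed."""
    return 2.0 * math.pi * rpm * torque_nm / 60_000.0

print(f"{brake_power_kw(120.0, 4500):.1f} kW")  # assumed 120 N*m at 4500 rpm
```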

  18. Sistema especialista de 2ª geração para diagnose técnica: modelo e procedimento 2nd generation expert system for technical diagnosis: a model and a procedure

    Directory of Open Access Journals (Sweden)

    Néocles Alves Pereira

    1994-04-01

    Full Text Available This paper deals with the diagnosis of industrial equipment through the use of Expert Systems. Intending to develop procedures that result in diagnosis knowledge bases for Industrial Maintenance, we have considered 2nd Generation Expert Systems. We propose a modified model and a diagnosis procedure. The diagnosis strategy uses a top-down best-first search that combines two types of uncertainty treatment: (i) entropy, to find the best path through the knowledge structures, and (ii) belief in the symptoms, to validate the resulting diagnoses. This proposal has the following advantages: a more complete knowledge base, and better explanation and presentation of the final diagnoses. We have developed a prototype based on real information about centrifugal pumps.

  19. Power plant intake quantification of wheat straw composition for 2nd generation bioethanol optimization : A Near Infrared Spectroscopy (NIRS) feasibility study

    DEFF Research Database (Denmark)

    Lomborg, Carina J.; Thomsen, Mette Hedegaard

    2010-01-01

    Optimization of 2nd generation bioethanol production from wheat straw requires comprehensive knowledge of plant intake feedstock composition. Near Infrared Spectroscopy is evaluated as a potential method for instantaneous quantification of the salient fermentation wheat straw components: cellulose (glucan), hemicelluloses (xylan, arabinan), and lignin. Aiming at chemometric multivariate calibration, 44 pre-selected samples were subjected to spectroscopy and reference analysis. For glucan and xylan prediction accuracies (slope: 0.89, 0.94) and precisions (r2: 0.87) were obtained, corresponding to error of prediction levels at 8–9%. Models for arabinan and lignin were marginally less good, and especially for lignin a further expansion of the feasibility dataset was deemed necessary. The results are related to significant influences from sub-sampling/mass reduction errors in the laboratory regimen. A relative high proportion of outliers excluded from the present models (10–20%) may indicate that comminution sample preparation is most likely always needed. Different solutions to these issues are suggested.
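    The calibration statistics quoted above (slope of predicted-vs-reference and r²) can be sketched as follows. The data points are invented placeholders standing in for glucan reference values; only the metric definitions are illustrated, not the study's data.

```python
# Least-squares slope and coefficient of determination (r^2) for a
# predicted-vs-reference calibration, as used in chemometric validation.

def calibration_stats(reference, predicted):
    n = len(reference)
    mx = sum(reference) / n
    my = sum(predicted) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(reference, predicted))
    sxx = sum((x - mx) ** 2 for x in reference)
    syy = sum((y - my) ** 2 for y in predicted)
    slope = sxy / sxx               # slope of predicted vs reference
    r2 = sxy * sxy / (sxx * syy)    # squared Pearson correlation
    return slope, r2

ref = [35.2, 38.1, 40.5, 42.0, 44.3]   # invented reference values, % dry matter
pred = [34.8, 38.9, 40.1, 42.6, 43.9]  # invented NIR predictions
slope, r2 = calibration_stats(ref, pred)
print(f"slope={slope:.2f}, r2={r2:.2f}")
```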

  20. Using Software Categories for the Development of Generative Software

    OpenAIRE

    Nazari, Pedram Mir Seyed; Rumpe, Bernhard

    2015-01-01

    In model-driven development (MDD) software emerges by systematically transforming abstract models to concrete source code. Ideally, performing those transformations is to a large extent the task of code generators. One approach for developing a new code generator is to write a reference implementation and separate it into handwritten and generatable code. Typically, the generator developer manually performs this separation a process that is often time-consuming, labor-intens...

  1. Waveform generator for Software Defined Radio

    OpenAIRE

    Martins, Francisco Arrabaça

    2012-01-01

    This dissertation falls within the area of radio frequency electronics, specifically signal generation to characterize systems with a Software Defined Radio (SDR) architecture. This architecture is based on the concept of a radio that is completely adjustable by software, achieved by moving blocks from the analog domain to the digital domain.

  2. TOWARDS TEST CASES GENERATION FROM SOFTWARE SPECIFICATIONS

    Directory of Open Access Journals (Sweden)

    R. Jeevarathinam,

    2010-11-01

    Full Text Available Verification and Validation of software systems often consumes up to 70% of the development resources. Testing is one of the most frequently used Verification and Validation techniques for verifying systems. Many agencies that certify software systems for use require that the software be tested to certain specified levels of coverage. Currently, developing test cases to meet these requirements takes a major portion of the resources. Automating this task would result in significant time and cost savings. This testing research is aimed at the generation of such test cases. In the proposed approach a formal model of the required software behavior (a formal specification) is used for test-case generation and as an oracle to determine whether the implementation produced the correct output during testing. This is referred to as Specification Based Testing. Specification based testing offers several advantages over traditional code based testing. The formal specification can be used as the source artifact to generate functional tests for the final product, and since the test cases are produced at an earlier stage in software development, they are available before the implementation is completed. Central to this approach is the use of model checkers as test-case generation engines. Model checking is a technique for exploring the reachable state-space of a system model to verify properties of interest. There are several research challenges that must be addressed to realize this test generation approach.

  3. Generating and Evaluating Software Product Ideas.

    Science.gov (United States)

    Coyne, John P.

    1989-01-01

    Ten ways to evaluate new software product ideas are presented, such as talking with computer user groups and advertising the product before development to determine consumer interest. Ten methods for generating new product ideas are also offered, including reading material on the fringe of one's work and soliciting opinions of potential clients.…

  4. Monte Carlo generators in ATLAS software

    International Nuclear Information System (INIS)

    This document describes how Monte Carlo (MC) generators can be used in the ATLAS software framework (Athena). The framework is written in C++ using Python scripts for job configuration. Monte Carlo generators that provide the four-vectors describing the results of LHC collisions are written in general by third parties and are not part of Athena. These libraries are linked from the LCG Generator Services (GENSER) distribution. Generators are run from within Athena and the generated event output is put into a transient store, in HepMC format, using StoreGate. A common interface, implemented via inheritance of a GeneratorModule class, guarantees common functionality for the basic generation steps. The generator information can be accessed and manipulated by helper packages like TruthHelper. The ATLAS detector simulation can also access the truth information from StoreGate. Steering is done through specific interfaces to allow for flexible configuration using ATLAS Python scripts. Interfaces to most general purpose generators, including Pythia6, Pythia8, Herwig, Herwig++ and Sherpa, are provided, as well as to more specialized packages, for example Phojet and Cascade. A second type of interface exists for the so-called Matrix Element generators that only generate the particles produced in the hard scattering process and write events in the Les Houches event format. A generic interface to pass these events to Pythia6 and Herwig for parton showering and hadronisation has been written.

  5. Monte Carlo generators in ATLAS software

    Energy Technology Data Exchange (ETDEWEB)

    Ay, C [University Goettingen, II physics institutes of Physics (Germany); Buckley, A [Institute for Particle Physics Phenomenology, Durham University (United Kingdom); Butterworth, J [Department of Physics and Astronomy, University College London (United Kingdom); Ferland, J [University of Montreal, Montreal (Canada); Hinchliffe, I [Lawrence Berkeley National Laboratory, Berkeley, CA, 94720 (United States); Jinnouchi, O [KEK, High Energy Accelerator Research Organization, 1-1 Oho, Tsukuba 305-0801 (Japan); Katzy, J; Lobodzinska, E; Qin, Z [DESY, Hamburg (Germany); Kersevan, B [Faculty of Mathematics and Physics, University of Ljubljana, Jadranska 19, SI-1000 Ljubljana (Slovenia); Monk, J [University College London, Department of Physics and Astronomy, Gower Street, London WC1E6BT (United Kingdom); Savinov, V [University of Pittsburgh, Department of Physics and Astronomy 3941 O' Hara Street, Pittsburgh, PA 15260 (United States); Schumacher, J [TU Dresden, Institute fuer Kern- und Teilchenphysik, D-01069 Dresden (Germany)

    2010-04-01

    This document describes how Monte Carlo (MC) generators can be used in the ATLAS software framework (Athena). The framework is written in C++ using Python scripts for job configuration. Monte Carlo generators that provide the four-vectors describing the results of LHC collisions are written in general by third parties and are not part of Athena. These libraries are linked from the LCG Generator Services (GENSER) distribution. Generators are run from within Athena and the generated event output is put into a transient store, in HepMC format, using StoreGate. A common interface, implemented via inheritance of a GeneratorModule class, guarantees common functionality for the basic generation steps. The generator information can be accessed and manipulated by helper packages like TruthHelper. The ATLAS detector simulation can also access the truth information from StoreGate. Steering is done through specific interfaces to allow for flexible configuration using ATLAS Python scripts. Interfaces to most general purpose generators, including Pythia6, Pythia8, Herwig, Herwig++ and Sherpa, are provided, as well as to more specialized packages, for example Phojet and Cascade. A second type of interface exists for the so-called Matrix Element generators that only generate the particles produced in the hard scattering process and write events in the Les Houches event format. A generic interface to pass these events to Pythia6 and Herwig for parton showering and hadronisation has been written.
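    The "common interface via inheritance of a GeneratorModule class" pattern described in the two records above can be sketched as follows. Class and method names here are illustrative stand-ins, not the actual Athena API.

```python
# A base class fixes the common generation steps; concrete generator
# interfaces inherit from it and implement event generation. The names
# GeneratorModule, ToyPythiaLike, and generate_event are assumed for
# illustration only.

class GeneratorModule:
    """Common interface guaranteeing the basic generation steps."""
    def initialize(self): ...
    def generate_event(self):
        raise NotImplementedError
    def finalize(self): ...

class ToyPythiaLike(GeneratorModule):
    """Stand-in for a concrete generator interface (e.g. a Pythia wrapper)."""
    def __init__(self):
        self.n = 0
    def generate_event(self):
        self.n += 1
        return {"event_number": self.n, "particles": []}  # HepMC-like stub

gen = ToyPythiaLike()
events = [gen.generate_event() for _ in range(3)]
print(len(events), events[-1]["event_number"])  # 3 3
```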

  6. 2nd Historic Mortars Conference

    CERN Document Server

    Hughes, John; Groot, Caspar; Historic Mortars : Characterisation, Assessment and Repair

    2012-01-01

    This volume focuses on research and practical issues connected with mortars on historic structures. The book is divided into four sections: Characterisation of Historic Mortars, Repair Mortars and Design Issues, Experimental Research into Properties of Repair Mortars, and Assessment and Testing. The papers present the latest work of researchers in their field. The individual contributions were selected from the contributions to the 2nd Historic Mortars Conference, which took place in Prague, September 22-24, 2010. All papers were reviewed and improved as necessary before publication. This peer review process by the editors resulted in the 34 individual contributions included here. One extra paper reviewing and summarising state-of-the-art knowledge covered by this publication was added as a starting and navigational point for the reader. The editors believe that having these papers in print is important and they hope that it will stimulate further research into historic mortars and related subjects.

  7. Generating Protocol Software from CPN Models Annotated with Pragmatics

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge; Kristensen, Lars M.; Kindler, Ekkart

    2013-01-01

    Model-driven software engineering (MDSE) provides a foundation for automatically generating software based on models that focus on the problem domain while abstracting from the details of underlying implementation platforms. Coloured Petri Nets (CPNs) have been widely used to formally model and verify protocol software, but limited work exists on using CPN models of protocols as a basis for automated code generation. The contribution of this paper is a method for generating protocol software fro...

  8. Automated Environment Generation for Software Model Checking

    Science.gov (United States)

    Tkachuk, Oksana; Dwyer, Matthew B.; Pasareanu, Corina S.

    2003-01-01

    A key problem in model checking open systems is environment modeling (i.e., representing the behavior of the execution context of the system under analysis). Software systems are fundamentally open since their behavior is dependent on patterns of invocation of system components and values defined outside the system but referenced within the system. Whether reasoning about the behavior of whole programs or about program components, an abstract model of the environment can be essential in enabling sufficiently precise yet tractable verification. In this paper, we describe an approach to generating environments of Java program fragments. This approach integrates formally specified assumptions about environment behavior with sound abstractions of environment implementations to form a model of the environment. The approach is implemented in the Bandera Environment Generator (BEG) which we describe along with our experience using BEG to reason about properties of several non-trivial concurrent Java programs.
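    The environment-generation idea above, driving an open component with the possible invocation patterns of its context, can be sketched by exhaustively exercising a component with all call sequences up to a bound. The Counter component is an invented stand-in, not the BEG tool itself.

```python
from itertools import product

# An "environment" for an open component is, at its simplest, the set of
# call sequences the context may perform. Enumerating all sequences up to a
# depth bound gives an exhaustive (bounded) environment for analysis.

class Counter:
    def __init__(self):
        self.value = 0
    def inc(self):   self.value += 1
    def reset(self): self.value = 0

ACTIONS = ["inc", "reset"]

def explore(depth):
    """Return {call-sequence: final value} for all sequences up to `depth`."""
    results = {}
    for k in range(1, depth + 1):
        for seq in product(ACTIONS, repeat=k):
            c = Counter()
            for name in seq:
                getattr(c, name)()
            results[seq] = c.value
    return results

res = explore(2)
print(res[("inc", "inc")], res[("inc", "reset")])  # 2 0
```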

  9. Techno-economic evaluation of 2nd generation bioethanol production from sugar cane bagasse and leaves integrated with the sugar-based ethanol process

    Directory of Open Access Journals (Sweden)

    Macrelli Stefano

    2012-04-01

    Full Text Available Abstract Background Bioethanol produced from the lignocellulosic fractions of sugar cane (bagasse and leaves, i.e. second generation (2G bioethanol, has a promising market potential as an automotive fuel; however, the process is still under investigation on pilot/demonstration scale. From a process perspective, improvements in plant design can lower the production cost, providing better profitability and competitiveness if the conversion of the whole sugar cane is considered. Simulations have been performed with AspenPlus to investigate how process integration can affect the minimum ethanol selling price of this 2G process (MESP-2G, as well as improve the plant energy efficiency. This is achieved by integrating the well-established sucrose-to-bioethanol process with the enzymatic process for lignocellulosic materials. Bagasse and leaves were steam pretreated using H3PO4 as catalyst and separately hydrolysed and fermented. Results The addition of a steam dryer, doubling of the enzyme dosage in enzymatic hydrolysis, including leaves as raw material in the 2G process, heat integration and the use of more energy-efficient equipment led to a 37 % reduction in MESP-2G compared to the Base case. Modelling showed that the MESP for 2G ethanol was 0.97 US$/L, while in the future it could be reduced to 0.78 US$/L. In this case the overall production cost of 1G + 2G ethanol would be about 0.40 US$/L with an output of 102 L/ton dry sugar cane including 50 % leaves. Sensitivity analysis of the future scenario showed that a 50 % decrease in the cost of enzymes, electricity or leaves would lower the MESP-2G by about 20%, 10% and 4.5%, respectively. Conclusions According to the simulations, the production of 2G bioethanol from sugar cane bagasse and leaves in Brazil is already competitive (without subsidies with 1G starch-based bioethanol production in Europe. 
Moreover 2G bioethanol could be produced at a lower cost if subsidies were used to compensate for the opportunity cost from the sale of excess electricity and if the cost of enzymes continues to fall.
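    The sensitivity figures quoted above can be restated as simple arithmetic: starting from the future-scenario MESP-2G of 0.78 US$/L, a 50% cost cut in each input lowers the MESP by the stated percentage. This uses only the abstract's numbers, with no new data.

```python
# MESP-2G sensitivity from the abstract: 50% cheaper enzymes, electricity or
# leaves lower the MESP by about 20%, 10% and 4.5% respectively.

MESP_2G = 0.78  # US$/L, future scenario
SENSITIVITY = {"enzymes": 0.20, "electricity": 0.10, "leaves": 0.045}

for item, drop in SENSITIVITY.items():
    print(f"50% cheaper {item}: MESP ~ {MESP_2G * (1 - drop):.3f} US$/L")
# enzymes -> ~0.624, electricity -> ~0.702, leaves -> ~0.745 US$/L
```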

  10. A Practical GLR Parser Generator for Software Reverse Engineering

    OpenAIRE

    Teng Geng; Fu Xu; Han Mei; Wei Meng; Zhibo Chen; Changqing Lai

    2014-01-01

    Traditional parser generators use deterministic parsing methods. These methods cannot meet the parsing requirements of software reverse engineering effectively. A new parser generator is presented which can generate a GLR parser with automatic error recovery. The generated GLR parser has parsing speed comparable to that of a traditional LALR(1) parser and can be used for parsing in software reverse engineering.

  11. Minimal Unroll Factor for Code Generation of Software Pipelining

    OpenAIRE

    Bachir M.; Touati S.-A.-A.; Brault F.; Gregg D.; Cohen A

    2012-01-01

    We address the problem of generating compact code from software pipelined loops. Although software pipelining is a powerful technique to extract fine-grain parallelism, it generates lifetime intervals spanning multiple loop iterations. These intervals require periodic register allocation (also called variable expansion), which in turn yields a code generation challenge. We are looking for the minimal unrolling factor enabling the periodic register allocation of software pipelined kernels. Th...
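    For context, the classic modulo-variable-expansion bound that such work improves on can be sketched: a value live for `lifetime` cycles in a kernel with initiation interval II needs ceil(lifetime/II) copies, and unrolling by the least common multiple of these counts lets every copy recur periodically. This is the textbook bound, not the paper's minimal algorithm.

```python
from math import gcd, ceil
from functools import reduce

# Textbook modulo variable expansion: each value needs ceil(lifetime/II)
# rotating copies; unrolling the kernel by the lcm of these copy counts makes
# the periodic register allocation valid.

def mve_unroll(lifetimes, ii):
    copies = [max(1, ceil(l / ii)) for l in lifetimes]
    return reduce(lambda a, b: a * b // gcd(a, b), copies, 1)

print(mve_unroll([3, 5, 8], ii=2))  # copies 2, 3, 4 -> lcm = 12
```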

  12. Automatic Testcase Generation for Flight Software

    Science.gov (United States)

    Bushnell, David Henry; Pasareanu, Corina; Mackey, Ryan M.

    2008-01-01

    The TacSat3 project is applying Integrated Systems Health Management (ISHM) technologies to an Air Force spacecraft for operational evaluation in space. The experiment will demonstrate the effectiveness and cost of ISHM and vehicle systems management (VSM) technologies through onboard operation for extended periods. We present two approaches to automatic testcase generation for ISHM: 1) A blackbox approach that views the system as a blackbox, and uses a grammar-based specification of the system's inputs to automatically generate *all* inputs that satisfy the specifications (up to prespecified limits); these inputs are then used to exercise the system. 2) A whitebox approach that performs analysis and testcase generation directly on a representation of the internal behaviour of the system under test. The enabling technologies for both these approaches are model checking and symbolic execution, as implemented in the Ames' Java PathFinder (JPF) tool suite. Model checking is an automated technique for software verification. Unlike simulation and testing which check only some of the system executions and therefore may miss errors, model checking exhaustively explores all possible executions. Symbolic execution evaluates programs with symbolic rather than concrete values and represents variable values as symbolic expressions. We are applying the blackbox approach to generating input scripts for the Spacecraft Command Language (SCL) from Interface and Control Systems. SCL is an embedded interpreter for controlling spacecraft systems. TacSat3 will be using SCL as the controller for its ISHM systems. We translated the SCL grammar into a program that outputs scripts conforming to the grammars. Running JPF on this program generates all legal input scripts up to a prespecified size. Script generation can also be targeted to specific parts of the grammar of interest to the developers. These scripts are then fed to the SCL Executive. 
ICS's in-house coverage tools will be run to measure code coverage. Because the scripts exercise all parts of the grammar, we expect them to provide high code coverage. This blackbox approach is suitable for systems for which we do not have access to the source code. We are applying whitebox test generation to the Spacecraft Health INference Engine (SHINE) that is part of the ISHM system. In TacSat3, SHINE will execute an on-board knowledge base for fault detection and diagnosis. SHINE converts its knowledge base into optimized C code which runs onboard TacSat3. SHINE can translate its rules into an intermediate representation (Java) suitable for analysis with JPF. JPF will analyze SHINE's Java output using symbolic execution, producing testcases that can provide either complete or directed coverage of the code. Automatically generated test suites can provide full code coverage and be quickly regenerated when code changes. Because our tools analyze executable code, they fully cover the delivered code, not just models of the code. This approach also provides a way to generate tests that exercise specific sections of code under specific preconditions. This capability gives us more focused testing of specific sections of code.
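The blackbox approach described above, generating all inputs that satisfy a grammar up to a size bound, can be sketched with a toy grammar. Everything below is hypothetical, standing in for the real SCL grammar; the actual pipeline uses JPF rather than this direct enumeration:

```python
def generate(symbol, grammar, depth):
    """Enumerate all strings derivable from `symbol` within `depth`
    expansion steps. Terminals are plain strings; nonterminals map to
    lists of right-hand sides (tuples of symbols)."""
    if symbol not in grammar:          # terminal: yields itself
        return [symbol]
    if depth == 0:                     # size bound reached: prune
        return []
    results = []
    for rhs in grammar[symbol]:
        parts = [[""]]
        for sym in rhs:                # cross-product of sub-derivations
            parts = [p + [s] for p in parts
                     for s in generate(sym, grammar, depth - 1)]
        results += ["".join(p) for p in parts]
    return results

# Toy command grammar (invented for illustration).
toy = {
    "cmd":  [("verb", " ", "obj")],
    "verb": [("set",), ("get",)],
    "obj":  [("power",), ("mode",)],
}
print(sorted(generate("cmd", toy, 3)))
# → ['get mode', 'get power', 'set mode', 'set power']
```

Raising the depth bound enumerates longer scripts; restricting `grammar` to a sub-grammar corresponds to the targeted generation mentioned in the abstract.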

  13. Automatic program generation: future of software engineering

    Energy Technology Data Exchange (ETDEWEB)

    Robinson, J.H.

    1979-01-01

    At this moment software development is still more of an art than an engineering discipline. Each piece of software is lovingly engineered, nurtured, and presented to the world as a tribute to the writer's skill. When will this change? When will the craftsmanship be removed and the programs be turned out like so many automobiles from an assembly line? Sooner or later it will happen: economic necessities will demand it. With the advent of cheap microcomputers and ever more powerful supercomputers doubling capacity, much more software must be produced. The choices are to double the number of programmers, double the efficiency of each programmer, or find a way to produce the needed software automatically. Producing software automatically is the only logical choice. How will automatic programming come about? Some of the preliminary actions which need to be done, and are being done, are to encourage programmer plagiarism of existing software through public library mechanisms, produce well-understood packages such as compilers automatically, develop languages capable of producing software as output, and learn enough about the whole process of programming to be able to automate it. Clearly, the emphasis must not be on efficiency or size, since ever larger and faster hardware is coming.

  14. Next-generation business intelligence software with Silverlight 3

    CERN Document Server

    Czernicki, Bart

    2010-01-01

    Business Intelligence (BI) software is the code and tools that allow you to view different components of a business using a single visual platform, making comprehending mountains of data easier. Applications that include reports, analytics, statistics, and historical and predictive modeling are all examples of BI applications. Currently, we are in the second generation of BI software, called BI 2.0. This generation is focused on writing BI software that is predictive, adaptive, simple, and interactive. As computers and software have evolved, more data can be presented to end users with increas

  15. Abstracts: 2nd interventional MRI symposium

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1997-09-01

    Main topics of the 2nd interventional MRI symposium were: MR compatibility and pulse sequences; MR thermometry, biopsy, musculoskeletal system; laser-induced interstitial thermotherapy, radiofrequency ablations; intraoperative MR; vascular applications, breast, endoscopy; focused ultrasound, cryotherapy, perspectives; poster session with 34 posters described. (AJ)

  16. Using DSL for Automatic Generation of Software Connectors.

    Czech Academy of Sciences Publication Activity Database

    Bureš, Tomáš; Malohlava, M.; Hnětynka, P.

    Los Alamitos: IEEE Computer Society, 2008, s. 138-147. ISBN 978-0-7695-3091-8. [ICCBSS 2008. International Conference on Composition-Based Software Systems /7./. Madrid (ES), 25.02.2008-29.02.2008] R&D Projects: GA AV ČR 1ET400300504 Institutional research plan: CEZ:AV0Z10300504 Keywords: component based systems * software connectors * code generation * domain-specific languages Subject RIV: JC - Computer Hardware; Software

  17. Creating the next generation control system software

    International Nuclear Information System (INIS)

    A new 1980's style support package for future accelerator control systems is proposed. It provides a way to create accelerator applications software without traditional programming. Visual Interactive Applications (VIA) is designed to meet the needs of expanded accelerator complexes in a more cost effective way than past experience with procedural languages by using technology from the personal computer and artificial intelligence communities. 4 refs

  18. 2nd International Arctic Ungulate Conference

    OpenAIRE

    Anonymous, A.

    1996-01-01

    The 2nd International Arctic Ungulate Conference was held 13-17 August 1995 on the University of Alaska Fairbanks campus. The Institute of Arctic Biology and the Alaska Cooperative Fish and Wildlife Research Unit were responsible for organizing the conference with assistance from biologists with state and federal agencies and commercial organizations. David R. Klein was chair of the conference organizing committee. Over 200 people attended the conference, coming from 10 different countries. T...

  19. Software Stream Cipher based on pGSSG Generator

    Directory of Open Access Journals (Sweden)

    Antoniya Tasheva

    2015-05-01

    Full Text Available The secrecy of a software stream cipher based on the p-ary Generalized Self-Shrinking Generator (pGSSG) is examined in this paper. Background information on the generator's algorithm is provided. The software architecture and key management for the cipher initialization are explained. The Galois field GF(257^32) and feedback polynomials are chosen for initialization of the generator. In order to examine the secrecy, a mathematical model of the software system is constructed. It is proved that the cipher is not perfect, but the empirical tests result in less than 0.0125% deviation of the encrypted files' entropy from perfect secrecy. Finally, the proposed cipher is compared to four eSTREAM finalists by key length and period.
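The entropy-deviation figure quoted above can be reproduced for any file: perfect secrecy at the byte level corresponds to 8 bits of entropy per byte, and the deviation is measured against that ideal. A small Python sketch (a generic measurement, not the pGSSG cipher itself; the ciphertext below is simulated with a seeded random stream):

```python
import random
from collections import Counter
from math import log2

def entropy_bits_per_byte(data: bytes) -> float:
    """Shannon entropy of the empirical byte distribution, in bits/byte."""
    counts = Counter(data)
    n = len(data)
    return -sum(c / n * log2(c / n) for c in counts.values())

def deviation_from_perfect(data: bytes) -> float:
    """Percent deviation of measured entropy from the ideal 8 bits/byte
    (the paper reports < 0.0125 % for its encrypted files)."""
    return (8.0 - entropy_bits_per_byte(data)) / 8.0 * 100.0

# Stand-in ciphertext: 64 KiB of uniformly random bytes.
rng = random.Random(1)
ciphertext = bytes(rng.randrange(256) for _ in range(1 << 16))
print(f"{deviation_from_perfect(ciphertext):.4f} %")
```

Larger samples push the empirical distribution closer to uniform, so the deviation shrinks with file size; comparing files of similar length keeps the figure meaningful.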

  20. 2nd International Conference on Mobile and Wireless Technology

    CERN Document Server

    Wattanapongsakorn, Naruemon

    2015-01-01

    This book provides a snapshot of the current state-of-the-art in the fields of mobile and wireless technology, security and applications.  The proceedings of the 2nd International Conference on Mobile and Wireless Technology (ICMWT2015), it represents the outcome of a unique platform for researchers and practitioners from academia and industry to share cutting-edge developments in the field of mobile and wireless science technology, including those working on data management and mobile security.   The contributions presented here describe the latest academic and industrial research from the international mobile and wireless community.  The scope covers four major topical areas: mobile and wireless networks and applications; security in mobile and wireless technology; mobile data management and applications; and mobile software.  The book will be a valuable reference for current researchers in academia and industry, and a useful resource for graduate-level students working on mobile and wireless technology...

  1. Development of the software generation method using model driven software engineering tool

    International Nuclear Information System (INIS)

    Methodologies to generate automated software design specifications and source code for nuclear I and C systems software using a model driven language are developed in this work. For qualitative analysis of the algorithm, the activity diagram is modeled and generated using the Unified Modeling Language (UML), and then the sequence diagram is designed for automated source code generation. For validation of the generated code, code audits and module tests are performed using a Test and QA tool. The code coverage and complexity of the example code are examined in this stage. The low pressure pressurizer reactor trip module of the Plant Protection System was programmed as the subject for this task. The test results show that errors in the generated source code were easily detected using the test tool. The accuracy of input/output processing by the execution modules was clearly identified

  2. Computer Software Configuration Item-Specific Flight Software Image Transfer Script Generator

    Science.gov (United States)

    Bolen, Kenny; Greenlaw, Ronald

    2010-01-01

    A K-shell UNIX script enables the International Space Station (ISS) Flight Control Team (FCT) operators in NASA's Mission Control Center (MCC) in Houston to transfer an entire or partial computer software configuration item (CSCI) from a flight software compact disk (CD) to the onboard Portable Computer System (PCS). The tool is designed to read the content stored on a flight software CD and generate individual CSCI transfer scripts that are capable of transferring the flight software content in a given subdirectory on the CD to the scratch directory on the PCS. The flight control team can then transfer the flight software from the PCS scratch directory to the Electronically Erasable Programmable Read Only Memory (EEPROM) of an ISS Multiplexer/Demultiplexer (MDM) via the Indirect File Transfer capability. The individual CSCI scripts and the CSCI-Specific Flight Software Image Transfer Script Generator (CFITSG), when executed a second time, will remove all components from their original execution. The tool will identify errors in the transfer process and create logs of the transferred software for the purposes of configuration management.

  3. Model Based Analysis and Test Generation for Flight Software

    Science.gov (United States)

    Pasareanu, Corina S.; Schumann, Johann M.; Mehlitz, Peter C.; Lowry, Mike R.; Karsai, Gabor; Nine, Harmon; Neema, Sandeep

    2009-01-01

    We describe a framework for model-based analysis and test case generation in the context of a heterogeneous model-based development paradigm that uses and combines Math- Works and UML 2.0 models and the associated code generation tools. This paradigm poses novel challenges to analysis and test case generation that, to the best of our knowledge, have not been addressed before. The framework is based on a common intermediate representation for different modeling formalisms and leverages and extends model checking and symbolic execution tools for model analysis and test case generation, respectively. We discuss the application of our framework to software models for a NASA flight mission.

  4. Improved ant algorithms for software testing cases generation.

    Science.gov (United States)

    Yang, Shunkun; Man, Tianlong; Xu, Jiaqi

    2014-01-01

    Ant colony optimization (ACO) for software test case generation is a very popular domain in software testing engineering. However, traditional ACO has flaws: early-search pheromone is relatively scarce, search efficiency is low, the search model is too simple, and the positive feedback mechanism easily produces stagnation and precocity. This paper introduces improved ACO algorithms for software test case generation: an improved local pheromone update strategy for ant colony optimization, an improved pheromone volatilization coefficient for ant colony optimization (IPVACO), and an improved global path pheromone update strategy for ant colony optimization (IGPACO). Finally, we put forward a comprehensively improved ant colony optimization (ACIACO), which is based on all three of the above methods. The proposed technique is compared with a random algorithm (RND) and a genetic algorithm (GA) in terms of both efficiency and coverage. The results indicate that the improved method can effectively improve search efficiency, restrain precocity, promote case coverage, and reduce the number of iterations. PMID:24883391
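The pheromone mechanics discussed above (a volatilization coefficient plus deposits on good paths) can be illustrated with a toy ant colony search for a high-coverage test input. The program under test, its branches, and all parameters below are invented for illustration; this is a generic ACO sketch, not the paper's ACIACO algorithm:

```python
import random

def aco_tests(branches, values, n_ants=10, n_iter=30, rho=0.3, seed=0):
    """Toy ant colony search for a test input (x, y) covering many branches.
    `branches` maps branch name -> predicate over (x, y); pheromone biases
    the per-parameter value choice, and rho is the volatilization
    (evaporation) coefficient."""
    rng = random.Random(seed)
    tau = {p: {v: 1.0 for v in values} for p in ("x", "y")}
    best, best_cov = None, -1
    for _ in range(n_iter):
        for _ in range(n_ants):
            case = {}
            for p in ("x", "y"):       # pheromone-weighted value choice
                vals, weights = zip(*tau[p].items())
                case[p] = rng.choices(vals, weights)[0]
            cov = sum(pred(case["x"], case["y"]) for pred in branches.values())
            if cov > best_cov:
                best, best_cov = case, cov
            for p in ("x", "y"):       # evaporate, then deposit
                for v in tau[p]:
                    tau[p][v] *= (1 - rho)
                tau[p][case[p]] += cov
    return best, best_cov

branches = {  # hypothetical program under test with three branches
    "b1": lambda x, y: x > 5,
    "b2": lambda x, y: y % 2 == 0,
    "b3": lambda x, y: x + y == 10,
}
best, cov = aco_tests(branches, values=range(11))
print(best, cov)
```

The evaporation factor `(1 - rho)` is what the IPVACO variant above tunes: too little evaporation and the colony stagnates on early paths (precocity), too much and accumulated knowledge is lost.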

  5. Manual for wave generation and analysis : software in Matlab

    DEFF Research Database (Denmark)

    Jakobsen, Morten MØller

    2015-01-01

    This manual accompanies the included wave generation and analysis software and its graphical user interface. The package is made for Matlab and is meant for educational purposes. The code is free to use under the GNU General Public License (GPL). It is still in development and should be considered as such. If you have questions, suggestions, or additions to the code, you can contact the author.

  6. Model-driven Generative Development of Measurement Software

    OpenAIRE

    Monperrus, Martin; Jézéquel, Jean-Marc; Baudry, Benoit; Champeau, Joël; Hoeltzener, Brigitte

    2011-01-01

    Metrics offer a practical approach to evaluate non-functional properties of domain-specific models. However, it is tedious and costly to develop and maintain a measurement software for each domain specific modeling language (DSML). In this paper, we present the principles of a domain-independent, metamodel-independent and generative approach to measuring models. The approach is operationalized through a prototype that synthesizes a measurement infrastructure for a DSML. This model-driven meas...

  7. Software Test Case Automated Generation Algorithm with Extended EDPN Model

    OpenAIRE

    Jinlong Tao; Lirong Chen

    2013-01-01

    To improve the sufficiency of software testing and the performance of testing algorithms, an improved event-driven Petri net model using a combination method is proposed, abbreviated as the OEDPN model. It is then applied to the OATS method to extend the implementation of OATS. On the basis of the OEDPN model, the marked associate recursive method of state combination on category is presented to solve problems of combined conflict. It also addresses the test case explosion generated by redundant test cases a...

  8. Curvature of 2nd type induced on plane distribution

    Directory of Open Access Journals (Sweden)

    Omelyan O.

    2014-11-01

    Full Text Available In a many-dimensional projective space, a plane distribution is considered. The curvature of the group connection of the 2nd type, induced by the composite clothing of the plane distribution, is constructed. It is proved that the immovability of Cartan's plane and Bortolotti's hyperplane in the case of a holonomic distribution implies the vanishing of the curvature tensor of the 2nd type.

  9. 2nd International Arctic Ungulate Conference

    Directory of Open Access Journals (Sweden)

    A. Anonymous

    1996-01-01

    Full Text Available The 2nd International Arctic Ungulate Conference was held 13-17 August 1995 on the University of Alaska Fairbanks campus. The Institute of Arctic Biology and the Alaska Cooperative Fish and Wildlife Research Unit were responsible for organizing the conference with assistance from biologists with state and federal agencies and commercial organizations. David R. Klein was chair of the conference organizing committee. Over 200 people attended the conference, coming from 10 different countries. The United States, Canada, and Norway had the largest representation. The conference included invited lectures, panel discussions, and about 125 contributed papers. There were five technical sessions on Physiology and Body Condition; Habitat Relationships; Population Dynamics and Management; Behavior, Genetics and Evolution; and Reindeer and Muskox Husbandry. Three panel sessions discussed Comparative caribou management strategies; Management of introduced, reestablished, and expanding muskox populations; and Health risks in translocation of arctic ungulates. Invited lectures focused on the physiology and population dynamics of arctic ungulates; contaminants in food chains of arctic ungulates and lessons learned from the Chernobyl accident; and ecosystem level relationships of the Porcupine Caribou Herd.

  10. Advanced Chemistry Collection, 2nd Edition

    Science.gov (United States)

    2001-11-01

    Software requirements are given in Table 3. Some programs have additional special requirements. Please see the individual program abstracts at JCE Online or the documentation included on the CD-ROM for more specific information. Table 3. General software requirements for the Advanced Chemistry Collection. Computer: Mac OS compatible. System: System 7.6.1 or higher. Other software (required by one or more programs): Acrobat Reader (included); Mathcad; Mathematica; MacMolecule2; QuickTime 4; HyperCard Player. Computer: Windows compatible. System: Windows 2000, 98, 95, NT 4. Other software (required by one or more programs): Acrobat Reader (included); Mathcad; Mathematica; PCMolecule2; QuickTime 4; HyperChem; Excel. Literature Cited: General Chemistry Collection, 5th ed.; J. Chem. Educ. Software, 2001, SP16. Advanced Chemistry Collection; J. Chem. Educ. Software, 2001, SP28.

  11. Next Generation of ECT Software for Data Analysis of Steam Generator Tubes

    International Nuclear Information System (INIS)

    Improvements to the existing EddyOne eddy current analysis software are presented. The improvements are geared towards better interaction between the software and the ECT analyst through an improved, more featured user interface, while keeping industry-standard signal display norms intact to preserve familiarity and ease the transition to the next generation of EddyOne. The improvements presented in this paper thus ease the transition to the new software by reducing training requirements for existing analysts and for new analysts coming to the industry. Furthermore, by utilizing modern technologies, the next generation of the software is able to further reduce maintenance and deployment costs of the whole system for years to come. (author)

  12. High-Quality Random Number Generation Software for High-Performance Computing Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Random number (RN) generation is the key software component that permits random sampling. Software for parallel RN generation (RNG) should be based on RNGs that are...

  13. Optimized generation of high resolution breast anthropomorphic software phantoms

    Energy Technology Data Exchange (ETDEWEB)

    Pokrajac, David D.; Maidment, Andrew D. A.; Bakic, Predrag R. [Computer and Information Sciences Department, Delaware State University, Dover, Delaware 19901 (United States); Department of Radiology, University of Pennsylvania, Philadelphia, Pennsylvania 19104 (United States)

    2012-04-15

    Purpose: The authors present an efficient method for generating anthropomorphic software breast phantoms with high spatial resolution. Employing the same region growing principles as in their previous algorithm for breast anatomy simulation, the present method has been optimized for computational complexity to allow for fast generation of the large number of phantoms required in virtual clinical trials of breast imaging. Methods: The new breast anatomy simulation method performs a direct calculation of the Cooper's ligaments (i.e., the borders between simulated adipose compartments). The calculation corresponds to quadratic decision boundaries of a maximum a posteriori classifier. The method is multiscale due to the use of octree-based recursive partitioning of the phantom volume. The method also provides user-control of the thickness of the simulated Cooper's ligaments and skin. Results: Using the proposed method, the authors have generated phantoms with voxel size in the range of (25-1000 µm)³/voxel. The power regression of the simulation time as a function of the reciprocal voxel size yielded a log-log slope of 1.95 (compared to a slope of 4.53 of our previous region growing algorithm). Conclusions: A new algorithm for computer simulation of breast anatomy has been proposed that allows for fast generation of high resolution anthropomorphic software phantoms.
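The log-log slope reported above comes from a power regression, time ≈ a·(1/voxel size)^k, where k is the least-squares slope of log(time) against log(1/voxel size). A minimal sketch with synthetic timings generated from an exact power law (the numbers are illustrative, not the authors' measurements):

```python
from math import log

def loglog_slope(x, y):
    """Least-squares slope of log(y) vs log(x): the exponent k in y ≈ a·x^k."""
    lx = [log(v) for v in x]
    ly = [log(v) for v in y]
    n = len(lx)
    mx, my = sum(lx) / n, sum(ly) / n
    num = sum((a - mx) * (b - my) for a, b in zip(lx, ly))
    den = sum((a - mx) ** 2 for a in lx)
    return num / den

# Hypothetical data: reciprocal voxel size vs simulation seconds, drawn
# from an exact t = 0.5 * r**1.95 power law for illustration.
recip = [1, 2, 4, 8, 16]
times = [0.5 * r ** 1.95 for r in recip]
print(round(loglog_slope(recip, times), 2))  # → 1.95
```

A slope near 2 indicates cost growing roughly with the square of the linear resolution, versus the previous algorithm's slope of 4.53.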

  14. Anti-Random Test Generation In Software Testing

    OpenAIRE

    Seema Rani; Kulvinder Singh

    2011-01-01

    The main purpose of software testing is to find errors and then correct them. Random testing selects test cases randomly, but it does not exploit previous information. Anti-random testing is a variation of random testing in which each new test is chosen so that its total distance from all previously applied tests is maximal. Hamming distance and Cartesian distance are used as measures of difference.
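The selection rule described above can be made concrete: each new test is the candidate whose total distance from all previously applied tests is maximal. A small Python sketch using Hamming distance over 4-bit inputs (the candidate pool, width, and greedy tie-breaking are illustrative choices):

```python
def hamming(a: int, b: int, width: int = 4) -> int:
    """Number of differing bits between two width-bit inputs."""
    return bin((a ^ b) & ((1 << width) - 1)).count("1")

def anti_random_tests(candidates, n_tests, width=4):
    """Greedy anti-random selection: each new test maximizes its total
    Hamming distance from all previously chosen tests."""
    chosen = [candidates[0]]           # seed with an arbitrary first test
    while len(chosen) < n_tests:
        best = max(
            (c for c in candidates if c not in chosen),
            key=lambda c: sum(hamming(c, t, width) for t in chosen),
        )
        chosen.append(best)
    return chosen

print(anti_random_tests(list(range(16)), 4))  # → [0, 15, 1, 14]
```

Starting from 0000, the farthest input is its bitwise complement 1111; on ties the greedy rule keeps the first candidate encountered, which is why the complement pair (1, 14) follows.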

  15. GWAS: Hypertension JSNP 2nd stage (GeMDBJ) [gwas

    Lifescience Database Archive (English)

    Full Text Available Hypertension JSNP GeMDBJ : Hypertension JSNP 2nd stage (GeMDBJ) Study details Disease Name : Hyp ... se Comments : This data is originated from GeMDBJ (Genome ... Medicine Database of Japan) , which has been creat ...

  16. 2nd Death Reported in Nationwide Salmonella Outbreak

    Science.gov (United States)

    ... nih.gov/medlineplus/news/fullstory_154561.html 2nd Death Reported in Nationwide Salmonella Outbreak Tainted cucumbers also ... THURSDAY, Sept. 10, 2015 (HealthDay News) -- A second death has been reported in a salmonella outbreak that ...

  17. Software para la Evaluación y Selección de Generadores Independientes / Independent Generator Evaluation and Selection Software

    Scientific Electronic Library Online (English)

    Marcos Alberto, de Armas Teyra; Miguel, Castro Fernández.

    2014-04-01

    Full Text Available Abstract (translated from Spanish): In many industries, buildings, enterprises and services it is necessary, for production, emergency or reliability reasons, to generate electric power independently of the National Electric System. In other cases, power generation plants are needed that can operate in islanded mode, feeding a given circuit. To make the most economical and efficient selection of [...] the capacity to be installed, one must take into consideration not only the behaviour of the system into which the unit will be inserted, but also the operational characteristics of the generator under load fluctuations, its operational limits and the resulting stability. This work presents software that performs the most adequate selection considering the capability curve and the stability of the generator given the particularities of the system. With its application it is possible to reduce the expenses and economic losses due to an inappropriate selection. As a case, its application to a plant of a food packaging factory is presented. Abstract in English: In many industries, buildings and services it is necessary to employ independent power plants to supply electric power to a particular system. In other cases, islanded operation is desired due to specific situations. In order to make the most efficient, economical and adequate selection of [...] the generator capacity, it is necessary to consider not only the system's load characteristics, but also the generator's capability limits and the resulting stability. This paper presents software that allows selection of an adequate generator, exposing the operating limits and the resulting stability under fluctuating load conditions. With its application it is possible to reduce the economic losses and costs of an inappropriate generator selection with an oversized or underutilized machine. As a case study, a 2500 kVA power plant is analyzed.

  18. Molecular motors and the 2nd law of thermodynamics

    Science.gov (United States)

    Wang, Zhisong

    2014-03-01

    Molecular motors from biology and nanotechnology often operate on the chemical energy of fuel molecules in an isothermal environment, unlike macroscopic heat engines that draw energy from a heat flow between two temperatures. Nevertheless, isothermal molecular motors are still subject to the 2nd law of thermodynamics in a fundamental way: their directional motion must cost a finite amount of energy other than the environmental heat even though no work is done; otherwise the 2nd law would be violated. Hence the 2nd law requires a finite energy price for pure direction of molecular motors. But what is the lowest price of direction allowed by the 2nd law? And how does the 2nd law-decreed price of direction limit performance of molecular motors? In the talk, I shall present our theoretical study of the link between the 2nd law and molecular motors on the basis of the accumulated biomotor phenomenology, and also introduce our experimental effort to develop biomimetic DNA bipedal nanomotors following the mechanistic guidelines from the theoretical study. [Main contents of this talk are from references:] This work is partially supported by FRC grants R-144-000-259-112, R-144-000-290-112 and R-144-000-320-112.

  19. Proceedings of the 2nd Educators' Symposium

    OpenAIRE

    '

    2007-01-01

    Preface: Putting into practice the vision of model-driven development (MDD) approaches and technologies for software-based systems, in which development is centered around the manipulation of models, requires not only sophisticated modeling approaches and tools, but also considerable training and education efforts. To make people ready for MDD, its principles and applications need to be taught to practitioners in industry, incorporated in university curricula, and probab...

  20. A NEO population generation and observation simulation software tool

    Science.gov (United States)

    Müller, Sven; Gelhaus, Johannes; Hahn, Gerhard; Franco, Raffaella

    One of the main targets of ESA's Space Situational Awareness (SSA) program is to build a wide knowledge base about objects that can potentially harm Earth (Near-Earth Objects, NEOs). An important part of this effort is to create the Small Bodies Data Centre (SBDC), which is going to aggregate measurement data from a fully-integrated NEO observation sensor network. Until this network is developed, artificial NEO measurement data is needed in order to validate SBDC algorithms. Moreover, to establish a functioning NEO observation sensor network, it has to be determined where to place sensors, what technical requirements have to be met in order to be able to detect NEOs and which observation strategies work best. Because of this, a sensor simulation software tool was needed. This paper presents a software tool which allows users to create and analyse NEO populations and to simulate and analyse population observations. It is a console program written in Fortran and comes with a Graphical User Interface (GUI) written in Java and C. The tool can be divided into the components "Population Generator" and "Observation Simulator". The Population Generator component is responsible for generating and analysing a NEO population. Users can choose between creating fictitious (random) and synthetic populations. The latter are based on one of two models describing the orbital and size distribution of observed NEOs: the existing so-called "Bottke Model" (Bottke et al. 2000, 2002) and the new "Granvik Model" (Granvik et al. 2014, in preparation) which has been developed in parallel to the tool. Generated populations can be analysed by defining 2D, 3D and scatter plots using various NEO attributes. As a result, the tool creates the appropriate files for the plotting tool "gnuplot". The tool's Observation Simulator component provides the Observation Simulation and Observation Analysis functions. 
Users can define sensor systems using ground- or space-based locations as well as optical or radar sensors and simulate observation campaigns. The tool outputs field-of-view crossings and actual detections of the selected NEO population objects. Using the Observation Analysis users are able to process and plot the results of the Observation Simulation. In order to enable end-users to handle the tool in a user-intuitive and comfortable way, a GUI has been created based on the modular Eclipse Rich Client Platform (RCP) technology. Through the GUI users can easily enter input data for the tool, execute it and view its output data in a clear way. Additionally, the GUI runs gnuplot to create plot pictures and presents them to the user. Furthermore, users can create projects to organise executions of the tool.

  1. The 2nd Colloquium on Process Simulation. Computational Fluid Dynamics Coupled With Chemical Kinetics, Combustion and Thermodynamics

    Science.gov (United States)

    Jokilaakso, Ari

    The articles collected in this volume were presented at the 2nd Colloquium on Process Simulation held at Helsinki University of Technology, Espoo, Finland, 6-8 June 1995. The processes for producing chemicals, energy, and materials face environmental concerns and laws which challenge engineers to develop the processes towards more efficient, economical and safe operation. A more thorough understanding of the processes and phenomena involved is necessary. Formerly, the development of the processes was largely based on trial and error, whereas today, the growth of computer performance together with the diversification of modelling software enables simulation of the processes. The increased capacity and possibilities for modelling the processes, brought by improved hardware and software, have generated a strong demand for more accurate mathematical descriptions of the processes. In particular, the coupling of computational fluid dynamics with chemical kinetics, combustion, and thermodynamics is of current interest in process-oriented technology. This colloquium attempts to give examples of modelling efforts in operation in different universities, research institutes and companies.

  2. 2nd ISPRA nuclear electronics symposium, Stresa, Italy May 20-23, 1975

    International Nuclear Information System (INIS)

    Two round tables were annexed to the 2nd Ispra Nuclear Electronics Symposium. The first was concerned with software support for the implementation of microprocessors, MOS and bipolar microprocessors, environmental data systems, and the use of microprocessors and minicomputers in the nuclear, biomedical and environmental fields. The future of nuclear electronics and its diversification, gravitational waves and electronics, and environmental measurements of air and water quality were discussed during the second round table, and relevant views were brought out during the discussion on the extension of nuclear electronics techniques to other fields.

  3. JNCI 92#3/2nd pages

    Science.gov (United States)

    New Guidelines to Evaluate the Response to Treatment in Solid Tumors Patrick Therasse, Susan G. Arbuck, Elizabeth A. Eisenhauer, Jantien Wanders, Richard S. Kaplan, Larry Rubinstein, Jaap Verweij, Martine Van Glabbeke, Allan T. van Oosterom, Michaele C. Christian, Steve G. Gwyther Anticancer cytotoxic agents go through a process by which their antitumor activity (on the basis of the amount of tumor shrinkage they could generate) has been investigated.

  4. The 2nd reactor core of the NS Otto Hahn

    International Nuclear Information System (INIS)

    Details of the design of the 2nd reactor core are given, followed by a brief report summarising the operating experience gained with this 2nd core, as well as by an evaluation of measured data and statements concerning the usefulness of the knowledge gained for the development of future reactor cores. Quite a number of these data have been used to improve the concept and thus the specifications for the fuel elements of the 3rd core of the reactor of the NS Otto Hahn. (orig./HP)

  5. Thermoluminescent characteristics of ZrO2:Nd films

    International Nuclear Information System (INIS)

    This work presents the results obtained from analysing the photoluminescent and thermoluminescent characteristics of neodymium-activated zirconium oxide (ZrO2:Nd) and its possible application in UV radiation dosimetry. The experiments were aimed at studying characteristics such as the optimum thermal erasing treatment, the influence of light on the response, the response as a function of wavelength, fading of the stored information, the effect of temperature, the response as a function of time, and the reproducibility of the response. The results show that ZrO2:Nd is a promising material for use as a TL dosemeter for UV radiation. (Author)

  6. Advanced Virgo: a 2nd generation interferometric gravitational wave detector

    CERN Document Server

    2014-01-01

    Advanced Virgo is the project to upgrade the Virgo interferometric detector of gravitational waves, with the aim of increasing the number of observable galaxies (and thus the detection rate) by three orders of magnitude. The project is now in an advanced construction phase and the assembly and integration will be completed by the end of 2015. Advanced Virgo will be part of a network with the two Advanced LIGO detectors in the US and GEO HF in Germany, with the goal of contributing to the early detections of gravitational waves and to opening a new observation window on the universe. In this paper we describe the main features of the Advanced Virgo detector and outline the status of the construction.

  7. Advanced Virgo: a 2nd generation interferometric gravitational wave detector

    OpenAIRE

    Acernese, F.; Agathos, M; Agatsuma, K; Aisa, D.; Amarni, J.; Ballardin, G.; Barsuglia, M; Basti, A.; Basti, F.; Bavigadda, V; Bejger, M.; Beker, M. G.; Belczynski, C; Bertolini, A; Bloemen, S

    2014-01-01

    Advanced Virgo is the project to upgrade the Virgo interferometric detector of gravitational waves, with the aim of increasing the number of observable galaxies (and thus the detection rate) by three orders of magnitude. The project is now in an advanced construction phase and the assembly and integration will be completed by the end of 2015. Advanced Virgo will be part of a network with the two Advanced LIGO detectors in the US and GEO HF in Germany, with the goal of contri...

  8. Safety profile of bilastine: 2nd generation H1-antihistamines.

    Science.gov (United States)

    Scaglione, F

    2012-12-01

    Bilastine is a new H1 antagonist with no sedative side effects, no cardiotoxic effects, and no hepatic metabolism. In addition, bilastine has proved effective for the symptomatic treatment of allergic rhinoconjunctivitis and urticaria. Pharmacological studies have shown that bilastine is highly selective for the H1 receptor both in vivo and in vitro, with no apparent affinity for other receptors. The absorption of bilastine is fast, linear and dose-proportional, and it appears to be safe and well tolerated at all dose levels in the healthy population. Multiple-dose administration of bilastine has confirmed the linearity of its kinetic parameters. Its distribution into the brain is undetectable. The safety profile in terms of adverse effects was very similar to placebo in all Phase I, II and III clinical trials. Bilastine (20 mg), unlike cetirizine, does not potentiate the effects of alcohol on the CNS, does not increase the CNS depressant effect of lorazepam, and is similar to placebo in the driving test. It therefore meets the current criteria for medications used in the treatment of allergic rhinitis and urticaria. PMID:23242729

  9. GENESIS: Agile Generation of Information Management Oriented Software

    Directory of Open Access Journals (Sweden)

    Juan Erasmo Gómez

    2010-06-01

    The specification for an information system can be clear from the beginning: it must acquire, display, query and modify data, using a database. The main issue is deciding which information to manage. In the case that motivated this work, the information was constantly evolving, even up to the end of the project, implying the construction of a new system each time the information was redefined. This article presents Genesis, an agile development infrastructure, and proposes an approach for the immediate construction of required information systems. Experts describe their information needs and queries, and Genesis generates the corresponding application, with the appropriate graphical interfaces and database.
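    The kind of generation Genesis performs, producing an application from an expert's information description, can be illustrated with a minimal sketch that emits SQL DDL from a field specification; the spec format and type names are invented for illustration and are not Genesis's actual input language:

```python
# Hypothetical field-specification -> DDL generator, in the spirit of
# Genesis-style application generation. The spec format and type names
# are assumptions for illustration.
TYPE_MAP = {"text": "VARCHAR(255)", "int": "INTEGER", "date": "DATE"}

def generate_ddl(table, fields):
    cols = ["id INTEGER PRIMARY KEY"]
    cols += ["{} {}".format(name, TYPE_MAP[ftype]) for name, ftype in fields]
    return "CREATE TABLE {} (\n  {}\n);".format(table, ",\n  ".join(cols))

spec = [("name", "text"), ("born", "date"), ("visits", "int")]
print(generate_ddl("patient", spec))
```

    Redefining the information means editing only the spec; the schema (and, in a full system, the forms and queries) is regenerated rather than recoded.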

  10. 2nd Quarter Transportation Report FY 2014

    Energy Technology Data Exchange (ETDEWEB)

    Gregory, L.

    2014-07-30

    This report satisfies the U.S. Department of Energy (DOE), National Nuclear Security Administration Nevada Field Office (NNSA/NFO) commitment to prepare a quarterly summary report of radioactive waste shipments to the Nevada National Security Site (NNSS) Radioactive Waste Management Complex (RWMC) at Area 5. There were no shipments sent for offsite treatment and returned to the NNSS this quarter. This report summarizes the second quarter of fiscal year (FY) 2014 low-level radioactive waste (LLW) and mixed low-level radioactive waste (MLLW) shipments. This report also includes annual summaries for FY 2014 in Tables 4 and 5. Tabular summaries are provided which include the following: Sources of and carriers for LLW and MLLW shipments to and from the NNSS; Number and external volume of LLW and MLLW shipments; Highway routes used by carriers; and Incident/accident data applicable to LLW and MLLW shipments. In this report shipments are accounted for upon arrival at the NNSS, while disposal volumes are accounted for upon waste burial. The disposal volumes presented in this report do not include minor volumes of non-radioactive materials that were approved for disposal. Volume reports showing cubic feet (ft3) generated using the Low-Level Waste Information System may vary slightly due to differing rounding conventions.

  11. Software Defined Radio Architecture Contributions to Next Generation Space Communications

    Science.gov (United States)

    Kacpura, Thomas J.; Eddy, Wesley M.; Smith, Carl R.; Liebetreu, John

    2015-01-01

    Space communications architecture concepts, comprising the elements of the system, the interactions among them, and the principles that govern their development, are essential factors in developing National Aeronautics and Space Administration (NASA) future exploration and science missions. Accordingly, vital architectural attributes encompass flexibility, the extensibility to insert future capabilities, and to enable evolution to provide interoperability with other current and future systems. Space communications architectures and technologies for this century must satisfy a growing set of requirements, including those for Earth sensing, collaborative observation missions, robotic scientific missions, human missions for exploration of the Moon and Mars where surface activities require supporting communications, and in-space observatories for observing the earth, as well as other star systems and the universe. An advanced, integrated, communications infrastructure will enable the reliable, multipoint, high-data-rate capabilities needed on demand to provide continuous, maximum coverage for areas of concentrated activity. Importantly, the cost/value proposition of the future architecture must be an integral part of its design; an affordable and sustainable architecture is indispensable within anticipated future budget environments. Effective architecture design informs decision makers with insight into the capabilities needed to efficiently satisfy the demanding space-communication requirements of future missions and formulate appropriate requirements. A driving requirement for the architecture is the extensibility to address new requirements and provide low-cost on-ramps for new capabilities insertion, ensuring graceful growth as new functionality and new technologies are infused into the network infrastructure. 
In addition to extensibility, another key architectural attribute is the interoperability of the space communication equipment with other NASA communications systems, as well as with communications and navigation systems operated by international space agencies and by civilian and government agencies. In this paper, we review the philosophies, technologies, architectural attributes, mission services, and communications capabilities that form the structure of candidate next-generation integrated communication architectures for space communications and navigation. A key area this paper explores is the development and operation of the software defined radio for the NASA Space Communications and Navigation (SCaN) Testbed currently on the International Space Station (ISS). Lessons learned from development and operation feed back into the communications architecture. Reconfigurability changes the way operations are done and must be considered. Quantifying the impact on the NASA Space Telecommunications Radio System (STRS) software defined radio architecture provides feedback to keep the standard useful and up to date. NASA is not the only customer of these radios. Software defined radios are developed for other applications, and taking advantage of these developments promotes an architecture that is cost effective and sustainable. Developments in areas such as an updated operating environment, higher data rates, networking, and security can be leveraged. The ability to sustain an architecture that uses radios for multiple markets can lower costs and keep new technology infused.

  12. Stability of 2nd Hilbert points of canonical curves

    CERN Document Server

    Fedorchuk, Maksym

    2011-01-01

    We establish GIT semistability of the 2nd Hilbert point of every Gieseker-Petri general canonical curve by a simple geometric argument. As a consequence, we obtain an upper bound on slopes of general families of Gorenstein curves. We also explore the question of what replaces hyperelliptic curves in the GIT quotients of the Hilbert scheme of canonical curves.

  13. 2nd International Conference on Nuclear Physics in Astrophysics

    CERN Document Server

    Fülöp, Zsolt; Somorjai, Endre; The European Physical Journal A : Volume 27, Supplement 1, 2006

    2006-01-01

    Launched in 2004, "Nuclear Physics in Astrophysics" has established itself in a successful topical conference series addressing the forefront of research in the field. This volume contains the selected and refereed papers of the 2nd conference, held in Debrecen in 2005 and reprinted from "The European Physical Journal A - Hadrons and Nuclei".

  14. 2nd International Conference on Data Management Technologies and Applications

    CERN Document Server

    2013-01-01

    The 2nd International Conference on Data Management Technologies and Applications (DATA) aims to bring together researchers, engineers and practitioners interested on databases, data warehousing, data mining, data management, data security and other aspects of information systems and technology involving advanced applications of data.

  15. Book Review: Bioassays with Arthropods: 2nd Edition

    Science.gov (United States)

    The technical book "Bioassays with Arthropods: 2nd Edition" (2007. Jacqueline L. Robertson, Robert M. Russell, Haiganoush K, Preisler and N. E. Nevin, Eds. CRC Press, Boca Raton, FL, 224 pp.) was reviewed for the scientific readership of the peer-reviewed publication Journal of Economic Entomology. ...

  16. Application of a path sensitizing method on automated generation of test specifications for control software

    International Nuclear Information System (INIS)

    An automated generation method for test specifications has been developed for sequential control software in plant control equipment. Sequential control software can be represented as sequential circuits, and the control software implemented in a control equipment is designed from these circuit diagrams. In VLSI logic testing, path sensitizing methods are widely used to generate test specifications, but they generate test specifications for a single point in time only and cannot be directly applied to sequential control software. The basic idea of the proposed method is as follows. Specifications of each logic operator in the diagrams are defined in the software design process. Therefore, test specifications of each operator in the control software can be determined from these specifications, and the validity of the software can be judged by inspecting all of the operators in the logic circuit diagrams. Candidates for sensitized paths, along which test data for each operator propagate, can be generated by the path sensitizing method. To confirm the feasibility of the method, it was experimentally applied to control software in digital control equipment. The program generated test specifications exactly, and the feasibility of the method was confirmed. (orig.) (3 refs., 7 figs.)
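    The core idea of path sensitization, holding every off-path gate input at its non-controlling value so a test value propagates along the chosen path, can be sketched as follows; the circuit encoding is an assumption for illustration:

```python
# Non-controlling values: holding a side input at this value lets the
# on-path input alone determine the gate output (AND/NAND: 1, OR/NOR: 0).
NON_CONTROLLING = {"AND": 1, "NAND": 1, "OR": 0, "NOR": 0}

def sensitize(path_gates):
    """path_gates: list of (gate_type, [off-path input names]) for each
    gate along the chosen path, in order. Returns the side-input
    assignment that sensitizes the whole path."""
    assignment = {}
    for gate_type, side_inputs in path_gates:
        for name in side_inputs:
            assignment[name] = NON_CONTROLLING[gate_type]
    return assignment

# A path through an AND gate (side input b) then an OR gate (side input c):
print(sensitize([("AND", ["b"]), ("OR", ["c"])]))  # {'b': 1, 'c': 0}
```

    With the side inputs fixed this way, toggling the on-path input is guaranteed to toggle the output, which is what makes the assignment a usable test vector.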

  17. Development of a Prototype Automation Simulation Scenario Generator for Air Traffic Management Software Simulations

    Science.gov (United States)

    Khambatta, Cyrus F.

    2007-01-01

    A technique for automated development of scenarios for use in Multi-Center Traffic Management Advisor (McTMA) software simulations is described. The resulting software is designed and implemented to automate the generation of simulation scenarios, with the intent of reducing the time this currently takes using an observational approach. The software program is effective in achieving this goal. The scenarios created for use in the McTMA simulations are based on data files from the McTMA system and were manually edited before incorporation into the simulations to ensure accuracy. Despite the software's overall favorable performance, several key software issues are identified. Proposed solutions to these issues are discussed. Future enhancements to the scenario generator software may address the limitations identified in this paper.

  18. 2nd International Conference on Green Communications and Networks 2012

    CERN Document Server

    Ma, Maode; GCN 2012

    2013-01-01

    The objective of the 2nd International Conference on Green Communications and Networks 2012 (GCN 2012) is to facilitate an exchange of information on best practices for the latest research advances in the area of communications, networks and intelligence applications. These mainly involve computer science and engineering, informatics, communications and control, electrical engineering, information computing, and business intelligence and management. Proceedings of the 2nd International Conference on Green Communications and Networks 2012 (GCN 2012) will focus on green information technology and applications, which will provide in-depth insights for engineers and scientists in academia, industry, and government. The book addresses the most innovative research developments including technical challenges, social and economic issues, and presents and discusses the authors’ ideas, experiences, findings, and current projects on all aspects of advanced green information technology and applications. Yuhang Yang is ...

  19. 2nd Interdiciplinary Conference on Production, Logistics and Traffic 2015

    CERN Document Server

    Friedrich, Hanno; Thaller, Carina; Geiger, Christiane

    2016-01-01

    This contributed volume contains the selected and reviewed papers of the 2nd Interdisciplinary Conference on Production, Logistics and Traffic (ICPLT) 2015, Dortmund, Germany. The topical focus lies on economic, ecological and societal issues related to commercial transport. The authors are international experts and the paper collection presents the state-of-the-art in the field, thus making this book a valuable read for both practitioners and researchers.

  20. 2nd International Conference on Intelligent Technologies and Engineering Systems

    CERN Document Server

    Chen, Cheng-Yi; Yang, Cheng-Fu

    2014-01-01

    This book includes the original, peer reviewed research papers from the 2nd International Conference on Intelligent Technologies and Engineering Systems (ICITES2013), which took place on December 12-14, 2013 at Cheng Shiu University in Kaohsiung, Taiwan. Topics covered include: laser technology, wireless and mobile networking, lean and agile manufacturing, speech processing, microwave dielectrics, intelligent circuits and systems, 3D graphics, communications, and structure dynamics and control.

  1. 2nd International Open and Distance Learning (IODL) Symposium

    OpenAIRE

    Reviewed by Murat BARKAN

    2006-01-01

    These closing remarks were prepared and presented by Prof. Dr. Murat BARKAN, Anadolu University, Eskisehir, TURKEY. DEAR GUESTS, as the 2nd International Open and Distance Learning (IODL) Symposium is now drawing to an end, I would like to thank you all for your outstanding speeches, distinguished presentations, constructive roundtable and session discussions, and active participation during the last five days. I hope you all share my view that the whole symposium has been...

  2. 2nd International Conference on Electric and Electronics (EEIC 2012)

    CERN Document Server

    Advances in Electric and Electronics

    2012-01-01

    This volume contains 108 full length papers presented at the 2nd International Conference on Electric and Electronics (EEIC 2012), held on April 21-22 in Sanya, China, which brings together researchers working in many different areas of education and learning to foster international collaborations and exchange of new ideas. This volume can be divided into two sections on the basis of the classification of manuscripts considered: the first section deals with Electric and the second section with Electronics.

  3. Open-source software for generating electrocardiogram signals

    CERN Document Server

    McSharry, Patrick E.; Clifford, Gari D.

    2004-01-01

    ECGSYN, a dynamical model that faithfully reproduces the main features of the human electrocardiogram (ECG), including heart rate variability, RR intervals and QT intervals, is presented. Details of the underlying algorithm and an open-source software implementation in Matlab, C and Java are described. An example of how this model will facilitate comparisons of signal processing techniques is provided.
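    The structure of such a dynamical ECG model can be sketched with a minimal Euler integration in which Gaussian event terms centred at fixed angles (P, Q, R, S, T) on a limit cycle drive the trace; the parameter values below are illustrative, not the fitted values from the paper:

```python
import math

# P, Q, R, S, T event angles on the limit cycle, amplitudes, and widths
# (illustrative values only).
THETA = [-math.pi / 3, -math.pi / 12, 0.0, math.pi / 12, math.pi / 2]
A = [1.2, -5.0, 30.0, -7.5, 0.75]
B = [0.25, 0.1, 0.1, 0.1, 0.4]

def synth_ecg(beats=2, fs=250, rate_hz=1.0):
    dt = 1.0 / fs
    omega = 2.0 * math.pi * rate_hz      # angular speed around the cycle
    theta, z, samples = -math.pi, 0.0, []
    for _ in range(int(beats * fs / rate_hz)):
        dz = 0.0
        for a, b, th in zip(A, B, THETA):
            dth = ((theta - th + math.pi) % (2.0 * math.pi)) - math.pi
            dz -= a * dth * math.exp(-dth * dth / (2.0 * b * b))
        z += (dz - z) * dt               # Gaussian drive plus baseline pull
        theta += omega * dt
        samples.append(z)
    return samples

ecg = synth_ecg()
print(len(ecg))  # 500 samples: two beats at 250 Hz
```

    Each event term produces one wave of the PQRST complex as the phase passes its angle; varying `rate_hz` per beat is how the full model injects heart rate variability.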

  4. Using Automatic Code Generation in the Attitude Control Flight Software Engineering Process

    Science.gov (United States)

    McComas, David; O'Donnell, James R., Jr.; Andrews, Stephen F.

    1999-01-01

    This paper presents an overview of the attitude control subsystem flight software development process, identifies how the process has changed due to automatic code generation, analyzes each software development phase in detail, and concludes with a summary of our lessons learned.

  5. An application generator for rapid prototyping of Ada real-time control software

    Science.gov (United States)

    Johnson, Jim; Biglari, Haik; Lehman, Larry

    1990-01-01

    The need to increase engineering productivity and decrease software life cycle costs in real-time system development establishes a motivation for a method of rapid prototyping. The design by iterative rapid prototyping technique is described. A tool which facilitates such a design methodology for the generation of embedded control software is described.

  6. Specification and Generation of Environment for Model Checking of Software Components.

    Czech Academy of Sciences Publication Activity Database

    Pařízek, P.; Plášil, František

    2007-01-01

    Roč. 176, - (2007), s. 143-154. ISSN 1571-0661 R&D Projects: GA AV ČR 1ET400300504 Institutional research plan: CEZ:AV0Z10300504 Keywords : software components * behavior protocols * model checking * automated generation of environment Subject RIV: JC - Computer Hardware ; Software

  7. Mapping and industrial IT project to a 2nd semester design-build project

    DEFF Research Database (Denmark)

    Nyborg, Mads; HØgh, Stig

    2010-01-01

    CDIO means bringing the engineer's daily life and working practice into the educational system. In our opinion this is best done by selecting an appropriate project from industry. In this paper we describe how we have mapped an industrial IT project to a 2nd semester design-build project in the Diploma IT program at the Technical University of Denmark. The system in question is a weighing system operating in a LAN environment; it is used in the medical industry for producing tablets. We present the design of a curriculum to support the development of major components of the weighing system. A simple teaching model for software engineering is presented which combines technical disciplines with disciplines from sections 2-4 of the CDIO syllabus. The implementation of a joint project involving several courses supports the CDIO perspective. For decades the traditional IT diploma education has included many of the essential features of CDIO (for example, a focus on teamwork, the development of social skills, and the open nature of design problems). The specific project had previously been conducted in the 5th semester and has now been brought forward to the 2nd semester of study. A successful implementation at this level requires careful planning of activities throughout the semester, and the principles of CDIO have been of great help in this regard. Finally we draw conclusions and give our recommendations based on those.

  8. Rapid Prototyping Software for Developing Next-Generation Air Traffic Management Algorithms Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Research on next-generation air traffic control systems are being conducted at several laboratories. Most of this work is being carried out using custom software....

  9. Rapid Prototyping Software for Developing Next-Generation Air Traffic Management Algorithms Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Research on next-generation air traffic management systems is being conducted at several laboratories using custom software. In order to provide a more uniform...

  10. Evaluation of Finite Element Method Based Software for Simulation of Hydropower Generator - Power Grid Interaction

    OpenAIRE

    Persarvet, Gustav

    2011-01-01

    The accuracy, ease of use, and execution time of the finite element method based software Maxwell, coupled to the system simulation software Simplorer, were evaluated for simulation of hydropower generator - power grid interaction. A generator test rig was modelled in Maxwell and coupled to Simplorer with a strong circuit coupling as a single machine infinite bus system. The accuracy of the model was measured by comparing the simulated output power oscillation frequency and damping characterist...
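    The single-machine-infinite-bus behaviour being validated, a rotor-angle oscillation with a characteristic frequency and damping, can be illustrated with a toy swing-equation integration; all per-unit parameters below are illustrative assumptions, not values from the thesis:

```python
import math

def simulate_swing(H=3.0, D=0.05, Pm=0.8, Pmax=1.5, f0=50.0, t_end=5.0, dt=1e-3):
    """Semi-implicit Euler integration of the classical swing equation
    M * d(omega)/dt = Pm - Pmax*sin(delta) - D*omega."""
    M = 2.0 * H / (2.0 * math.pi * f0)      # inertia in pu * s^2 / rad
    delta = math.asin(Pm / Pmax) + 0.3      # start displaced from equilibrium
    omega = 0.0                             # rotor speed deviation, rad/s
    trace = []
    for _ in range(int(t_end / dt)):
        Pe = Pmax * math.sin(delta)         # electrical power to the grid
        omega += (Pm - Pe - D * omega) / M * dt
        delta += omega * dt
        trace.append(delta)
    return trace

angles = simulate_swing()
print(max(angles) - min(angles))  # swing amplitude around the equilibrium
```

    Extracting the oscillation frequency and decay rate from such a rotor-angle trace is the kind of comparison the evaluation makes between the circuit model and the FEM-coupled model.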

  11. Generation of embedded Hardware/Software from SystemC

    OpenAIRE

    Ouadjaout Salim; Houzet Dominique

    2006-01-01

    Designers increasingly rely on reusing intellectual property (IP) and on raising the level of abstraction to respect system-on-chip (SoC) market characteristics. However, most hardware and embedded software code is recoded manually from the system level. This recoding step often introduces new coding errors that must be identified and debugged. Thus, shorter time-to-market requires automation of the system synthesis from high-level specifications. In this paper, we propose a design flow intend...

  12. A Code Generator for Software Component Services in Smart Devices

    OpenAIRE

    Ahmad, Manzoor

    2010-01-01

    A component is built to be reused, and reusability has a significant impact on component generality and flexibility requirements. A component model plays a critical role in the reusability of software components and defines a set of standards for component implementation, evolution, composition, deployment, and standardization of the run-time environment for the execution of components. In component based development (CBD), standardization of the runtime environment includes specification of component’s int...

  13. Software Service Composition in Next Generation Networking Environments

    OpenAIRE

    Bether, Carsten

    2008-01-01

    Recent and continuing changes in the telecommunication market, driven by lowered barriers to market entry, affordable flat-rate broadband access, and freely available, mature software environments, have inspired researchers and industry to search for new capabilities to provide telecom services in a more flexible manner. A survey conducted in the context of this thesis found that inadequacy, inflexibility and immense costs are the issues most often mentioned in association with existin...

  14. A Novel Scheme to Design Software-Controllable Vector Microwave Signal Generator and its Application

    Directory of Open Access Journals (Sweden)

    L. Meng

    2010-01-01

    With the rapid development of wireless communications, many communication standards will coexist in the future, and buying a dedicated vector microwave signal generator for each of them may be costly. Hence, this study investigated a novel vector microwave signal generation method, which models the vector baseband signal in CAD software (Agilent ADS) and then controls conventional microwave signal generation hardware to output vector microwave signals. Compared with the specialized vector microwave signal generators offered by Agilent, Anritsu, etc., our software-controllable microwave signal source is cheaper, more flexible and more convenient. Moreover, as an application of the method, we model the TD-SCDMA baseband signal corrupted by a multipath channel and Additive White Gaussian Noise (AWGN) in ADS and then control the hardware (Agilent E4432B) to generate the TD-SCDMA microwave signals. Measurements of the TD-SCDMA microwave signals confirm the validity of the method.
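    The signal chain described, a modelled baseband passed through a multipath channel plus AWGN before driving the hardware, can be sketched with standard-library Python; QPSK stands in for the TD-SCDMA modulation here, and the tap values and SNR are illustrative assumptions:

```python
import math
import random

random.seed(1)  # reproducible noise for the example

def qpsk_symbols(bits):
    """Map bit pairs to Gray-coded, unit-energy QPSK constellation points."""
    table = {(0, 0): 1 + 1j, (0, 1): -1 + 1j, (1, 1): -1 - 1j, (1, 0): 1 - 1j}
    return [table[(bits[i], bits[i + 1])] / math.sqrt(2)
            for i in range(0, len(bits), 2)]

def channel(symbols, taps=(1.0, 0.3 + 0.2j), snr_db=20.0):
    """Two-tap FIR multipath followed by complex AWGN at the given SNR."""
    sigma = math.sqrt(10 ** (-snr_db / 10.0) / 2.0)  # per-component std dev
    out = []
    for n in range(len(symbols)):
        y = sum(h * symbols[n - k] for k, h in enumerate(taps) if n - k >= 0)
        y += complex(random.gauss(0.0, sigma), random.gauss(0.0, sigma))
        out.append(y)
    return out

bits = [random.randint(0, 1) for _ in range(64)]
rx = channel(qpsk_symbols(bits))
print(len(rx))  # 32 received samples
```

    In the paper's setup, samples like `rx` are produced in ADS rather than by hand, then downloaded into the signal generator's arbitrary-waveform memory for upconversion.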

  15. 2nd International Conference on Computer Science, Applied Mathematics and Applications

    CERN Document Server

    Thi, Hoai; Nguyen, Ngoc

    2014-01-01

    The proceedings consists of 30 papers which have been selected and invited from the submissions to the 2nd International Conference on Computer Science, Applied Mathematics and Applications (ICCSAMA 2014) held on 8-9 May, 2014 in Budapest, Hungary. The conference is organized into 7 sessions: Advanced Optimization Methods and Their Applications, Queueing Models and Performance Evaluation, Software Development and Testing, Computational Methods for Mobile and Wireless Networks, Computational Methods for Knowledge Engineering, Logic Based Methods for Decision Making and Data Mining, and Nonlinear Systems and Applications, respectively. All chapters in the book discuss theoretical and practical issues connected with computational methods and optimization methods for knowledge engineering. The editors hope that this volume can be useful for graduate and Ph.D. students and researchers in Computer Science and Applied Mathematics. It is the hope of the editors that readers of this volume can find many inspiring idea...

  16. 2nd Meeting of DRDO Library Information Officers

    Directory of Open Access Journals (Sweden)

    Director Director

    1981-01-01

    The 2nd Meeting of the Officers-in-Charge of Libraries/TICs of the DRDO was held at DESIDOC on 2-3 December 1980. The main aim of the Meeting was to identify the common problems among the Libraries/TICs in the dissemination of information to scientists and to suggest remedial measures to improve efficiency. The Meeting was inaugurated by Prof M Krishnamurthi, CCR&D(K). Forty-six delegates from R&D HQrs/Labs/Estts attended the Meeting.

  17. CELEBRATED APRIL 2nd – INTERNATIONAL DAY OF PERSONS WITH AUTISM

    OpenAIRE

    Manuela KRCHANOSKA

    2014-01-01

    On April 2nd, the Macedonian Scientific Society for Autism organized, for the fourth time, an event on the occasion of the International Day of Persons with Autism. The event, of a cultural and artistic character, was held at the Museum of the Macedonian Struggle under the motto “They are not alone, we are with them”. The huge number of citizens attending only confirmed the motto. It seemed that the hall of the Museum of the Macedonian Struggle was too small for the warm hearts of the audience. More than 3...

  18. AMON: A Software System for Automatic Generation of Ontology Mappings

    OpenAIRE

    Sánchez-Alberca, A.; R García-García; Sorzano, C.O.S.; Gutiérrez-Cossío, Celia; Chagoyen, Mónica; Fernández López, Mariano

    2005-01-01

    Some of the most outstanding problems in Computer Science (e.g. access to heterogeneous information sources, use of different e-commerce standards, ontology translation, etc.) are often approached through the identification of ontology mappings. Manual mapping generation slows down, or even makes infeasible, the solution of particular cases of the aforementioned problems via ontology mappings. Some algorithms and formal models for partial tasks of automatic generation of mappings have been ...
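    A minimal flavour of automatic mapping generation is label matching between two term lists; AMON's actual algorithms and formal models are richer, and the term lists and threshold below are invented for illustration:

```python
import difflib

def generate_mappings(onto_a, onto_b, threshold=0.8):
    """Pair each term of onto_a with its most similarly named term in
    onto_b, keeping pairs whose similarity clears the threshold."""
    def sim(a, b):
        return difflib.SequenceMatcher(None, a.lower(), b.lower()).ratio()
    mappings = []
    for a in onto_a:
        best = max(onto_b, key=lambda b: sim(a, b))
        if sim(a, best) >= threshold:
            mappings.append((a, best, round(sim(a, best), 2)))
    return mappings

print(generate_mappings(["Author", "PaperTitle"], ["author", "title", "venue"]))
# [('Author', 'author', 1.0)]
```

    Real mapping systems combine such lexical scores with structural and semantic evidence, which is why purely label-based matching is only a starting point.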

  19. Scoping analysis of the Advanced Test Reactor using SN2ND

    Energy Technology Data Exchange (ETDEWEB)

    Wolters, E.; Smith, M. (NE NEAMS PROGRAM); ( SC)

    2012-07-26

    A detailed set of calculations was carried out for the Advanced Test Reactor (ATR) using the SN2ND solver of the UNIC code, which is part of the SHARP multi-physics code being developed under the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program in DOE-NE. The primary motivation of this work is to assess whether high fidelity deterministic transport codes can tackle coupled dynamics simulations of the ATR. The successful use of such codes in a coupled dynamics simulation can impact what experiments are performed and what power levels are permitted during those experiments at the ATR. The advantages of the SN2ND solver over comparable neutronics tools are its superior parallel performance and demonstrated accuracy on large scale homogeneous and heterogeneous reactor geometries. However, it should be noted that virtually no effort from this project was spent constructing a proper cross section generation methodology for the ATR usable in the SN2ND solver. While attempts were made to use cross section data derived from SCALE, only a minimal number of compositional cross section sets was generated, to be consistent with the reference Monte Carlo input specification. The accuracy of any deterministic transport solver is impacted by such an approach, and this clearly causes substantial errors in this work. The reasoning behind this decision is justified given the overall funding dedicated to the task (two months) and the real focus of the work: can modern deterministic tools actually treat complex facilities like the ATR with heterogeneous geometry modeling? SN2ND has been demonstrated to solve problems with upwards of one trillion degrees of freedom, which translates to tens of millions of finite elements, hundreds of angles, and hundreds of energy groups, resulting in a very high-fidelity model of the system unachievable by most deterministic transport codes today.
A space-angle convergence study was conducted to determine the meshing and angular cubature requirements for the ATR, and also to demonstrate the feasibility of performing this analysis with a deterministic transport code capable of modeling heterogeneous geometries. The work performed indicates that a minimum of 260,000 linear finite elements combined with an L3T11 cubature (96 angles on the sphere) is required for both eigenvalue and flux convergence of the ATR. A critical finding was that the fuel meat and water channels must each be meshed with at least 3 'radial zones' for accurate flux convergence. A small number of 3D calculations were also performed to show axial mesh and eigenvalue convergence for a full core problem. Finally, a brief analysis was performed with different cross section sets generated from DRAGON and SCALE, and the findings show that more effort will be required to improve the multigroup cross section generation process. The total number of degrees of freedom for a converged 27 group, 2D ATR problem is ~340 million. This number increases to ~25 billion for a 3D ATR problem. This scoping study shows that both 2D and 3D calculations are well within the capabilities of the current SN2ND solver, given the availability of a large-scale computing center such as BlueGene/P. However, dynamics calculations are not realistic without the implementation of improvements in the solver.

  20. TagGD: fast and accurate software for DNA Tag generation and demultiplexing.

    Science.gov (United States)

    Costea, Paul Igor; Lundeberg, Joakim; Akan, Pelin

    2013-01-01

    Multiplexing is of vital importance for utilizing the full potential of next generation sequencing technologies. We here report TagGD (DNA-based Tag Generator and Demultiplexor), a fully-customisable, fast and accurate software package that can generate thousands of barcodes satisfying user-defined constraints and can guarantee full demultiplexing accuracy. The barcodes are designed to minimise their interference with the experiment. Insertion, deletion and substitution events are considered when designing and demultiplexing barcodes. 20,000 barcodes of length 18 were designed in 5 minutes and 2 million barcoded Illumina HiSeq-like reads generated with an error rate of 2% were demultiplexed with full accuracy in 5 minutes. We believe that our software meets a central demand in the current high-throughput biology and can be utilised in any field with ample sample abundance. The software is available on GitHub (https://github.com/pelinakan/UBD.git). PMID:23469199
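
    The error-tolerant matching TagGD performs can be illustrated with a minimal sketch (this is not the TagGD implementation; the function names and tie-breaking rule are invented for illustration): each read prefix is assigned to the unique barcode within a bounded edit distance, so insertions, deletions and substitutions are all tolerated.

    ```python
    def edit_distance(a, b):
        # Levenshtein distance: counts the substitutions, insertions and
        # deletions separating two strings -- the same event types TagGD
        # considers when designing and demultiplexing barcodes.
        prev = list(range(len(b) + 1))
        for i, ca in enumerate(a, 1):
            cur = [i]
            for j, cb in enumerate(b, 1):
                cur.append(min(prev[j] + 1,               # deletion
                               cur[j - 1] + 1,            # insertion
                               prev[j - 1] + (ca != cb))) # substitution
            prev = cur
        return prev[-1]

    def demultiplex(read_prefix, barcodes, max_dist=2):
        # Assign the read to the single barcode within max_dist edits;
        # ties and non-matches return None (the read stays unassigned).
        scored = sorted((edit_distance(read_prefix, bc), bc) for bc in barcodes)
        if scored and scored[0][0] <= max_dist:
            if len(scored) == 1 or scored[1][0] > scored[0][0]:
                return scored[0][1]
        return None
    ```

    Roughly speaking, guaranteeing full demultiplexing accuracy then amounts to designing the barcode set so that pairwise edit distances exceed twice the tolerated error count, which is the kind of user-defined constraint the generator enforces.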

  1. Conference report: 2nd Medicon Valley Inhalation Symposium.

    Science.gov (United States)

    Lastow, Orest

    2014-02-01

    2nd Medicon Valley Inhalation Symposium 16 October 2013, Lund, Sweden The 2nd Medicon Valley Inhalation Symposium was arranged by the Medicon Valley Inhalation Consortium. It was held at the Medicon Village, which is the former AstraZeneca site in Lund, Sweden. It was a 1 day symposium focused on inhaled drug delivery and inhalation product development. 120 delegates listened to 11 speakers. The program was organized to follow the value chain of an inhalation product development. This year there was a focus on inhaled biomolecules. The inhaled delivery of insulin was covered by two presentations and a panel discussion. The future of inhaled drug delivery was discussed together with an overview of the current market situation. Two of the inhalation platforms, capsule inhalers and metered-dose inhalers, were discussed in terms of the present situation and the future opportunities. Much focus was on the regulatory and intellectual aspects of developing inhalation products. The manufacturing of a dry powder inhaler requires precision filling of powder, and the various techniques were presented. The benefits of nebulization and nasal delivery were illustrated with some case studies and examples. The eternal challenge of poor compliance was addressed from an industrial design perspective and some new approaches were introduced. PMID:24483190

  2. Afs password expiration starts Feb 2nd 2004

    CERN Document Server

    2004-01-01

    Due to security reasons, and in agreement with CERN management, afs/lxplus passwords will fall into line with Nice/Mail passwords on February 2nd and expire annually. As of the above date afs account holders who have not changed their passwords for over a year will have a 60 day grace period to make a change. Following this date their passwords will become invalid. What does this mean to you? If you have changed your afs password in the past 10 months the only difference is that 60 days before expiration you will receive a warning message. Similar warnings will also appear nearer the time of expiration. If you have not changed your password for more than 10 months, then, as of February 2nd you will have 60 days to change it using the command 'kpasswd'. Help to choose a good password can be found at: http://security.web.cern.ch/security/passwords/ If you have been given a temporary password at any time by the Helpdesk or registration team this will automatically fall into the expiration category ...

  3. FACTORS GENERATING RISKS DURING REQUIREMENT ENGINEERING PROCESS IN GLOBAL SOFTWARE DEVELOPMENT ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    Huma Hayat Khan

    2014-03-01

    Full Text Available The challenges of Requirements Engineering become more acute when it is performed in the global software development paradigm. There can be many reasons behind this challenging nature. "Risks" can be one of them, as there is more risk exposure in the global development paradigm, and this may be one of the main reasons Requirement Engineering becomes more challenging. For this, there is first a need to identify the factors which actually generate these risks. This paper therefore identifies not only the factors, but also the risks which these factors may generate. A systematic literature review is done to identify these factors and the risks which may occur during the requirement engineering process in the global software development paradigm. The resulting list leads to progressive enhancement for assisting requirement engineering activities in the global software development paradigm. This work is especially useful for less experienced people working in global software development.

  4. Software

    Energy Technology Data Exchange (ETDEWEB)

    Macedo, R.; Budd, G.; Ross, E.; Wells, P.

    2010-07-15

    The software section of this journal presented new software programs that have been developed to help in the exploration and development of hydrocarbon resources. Software provider IHS Inc. has made additions to its geological and engineering analysis software tool, IHS PETRA, a product used by geoscientists and engineers to visualize, analyze and manage well production, well log, drilling, reservoir, seismic and other related information. IHS PETRA also includes a directional well module and a decline curve analysis module to improve analysis capabilities in unconventional reservoirs. Petris Technology Inc. has developed software to help manage the large volumes of data. PetrisWinds Enterprise (PWE) helps users find and manage wellbore data, including conventional wireline and MWD core data; core photo and image analyses; waveforms and NMR; and external file documentation. Ottawa-based Ambercore Software Inc. has been collaborating with Nexen on the Petroleum iQ software for steam assisted gravity drainage (SAGD) producers. Petroleum iQ integrates geology and geophysics data with engineering data in 3D and 4D. Calgary-based Envirosoft Corporation has developed software that reduces the costly and time-consuming effort required to comply with Directive 39 of the Alberta Energy Resources Conservation Board. The product includes emissions modelling software. Houston-based Seismic Micro-Technology (SMT) has developed the Kingdom software, which features the latest in seismic interpretation. Holland-based Joa Oil and Gas and Calgary-based Computer Modelling Group have both supplied the petroleum industry with advanced reservoir simulation software that enables reservoir interpretation. The 2010 software survey included a guide to new software applications designed to facilitate petroleum exploration, drilling and production activities. Oil and gas producers can use the products for a range of functions, including reservoir characterization and accounting.
In addition to a description of the software application, the name of software providers were listed along with the new features available in each product. The survey included products developed by ADP Inc.; Energy Navigator Inc.; Enersight Canada; Entero Corporation; Envirosoft Corporation; Geologic Systems Ltd.; IHS; Jedex Equipment Ltd.; MJ Systems; OpenSpirit; Petro Management Group Ltd.; P2 Energy Solutions; Risk Advisory, A division of SAS; Seisware International Inc.; Sustainet Software International Inc.; and 3ESI.

  5. Psychosocial Risks Generated By Assets Specific Design Software

    Science.gov (United States)

    Remus, Furtună; Angela, Domnariu; Petru, Lazăr

    2015-07-01

    The human activity concerning an occupation results from the interaction between psycho-biological, socio-cultural and organizational-occupational factors. Technological development, automation and computerization, which are found in all branches of activity, the speed at which things develop, and their growing complexity require fewer and fewer physical aptitudes and more cognitive qualifications. The person included in the work process is in most cases bound to come in line with the organizational-occupational situations that are specific to the demands of the job. The role of the programmer is essential in the process of executing ordered software, since truly brilliant ideas can only come from well-rested minds concentrated on their tasks. The actual requirements of the jobs, besides a high number of benefits and opportunities, also create a series of psycho-social risks, which can increase the level of stress during work activity, especially for those who work under pressure.

  6. Towards Generating a Rulebase to Provide Feedback at Design Level for Improving Early Software Design

    OpenAIRE

    B. Bharathi; Kulanthaivel, G.

    2011-01-01

    Performance analysis plays an important role in the software development process. Performance predictions have often resulted in a collection of performance indices, which are complex to interpret. The proper interpretation of results and the generation of suitable feedback are very important for a good performance analysis process. The aim is to drive decisions based on the results generated for a performance diagnosis and to generate rule bases that improve the performance at ...

  7. Skel: Generative Software for Producing Skeletal I/O Applications

    Energy Technology Data Exchange (ETDEWEB)

    Logan, J.; Klasky, S.; Lofstead, J.; Abbasi, H.; Ethier, S.; Grout, R.; Ku, S. H.; Liu, Q.; Ma, X.; Parashar, M.; Podhorszki, N.; Schwan, K.; Wolf, M.

    2011-01-01

    Massively parallel computations consist of a mixture of computation, communication, and I/O. As part of the co-design for the inevitable progress towards exascale computing, we must apply lessons learned from past work to succeed in this new age of computing. Of the three components listed above, implementing an effective parallel I/O solution has often been overlooked by application scientists and was usually added to large scale simulations only when existing serial techniques had failed. As scientist teams scaled their codes to run on hundreds of processors, it was common to call on an I/O expert to implement a set of more scalable I/O routines. These routines were easily separated from the calculations and communication, and in many cases, an I/O kernel was derived from the application, which could be used for testing I/O performance independent of the application. These I/O kernels developed a life of their own, used as a broad measure for comparing different I/O techniques. Unfortunately, as years passed and computation and communication changes required changes to the I/O, the separate I/O kernel used for benchmarking remained static, no longer providing an accurate indicator of the I/O performance of the simulation and making I/O research less relevant for the application scientists. In this paper we describe a new approach to this problem, in which I/O kernels are replaced with skeletal I/O applications automatically generated from an abstract set of simulation I/O parameters. We realize this abstraction by leveraging the ADIOS middleware's XML I/O specification with additional runtime parameters. Skeletal applications offer all of the benefits of I/O kernels, including allowing I/O optimizations to focus on useful I/O patterns. Moreover, since they are automatically generated, it is easy to produce an updated I/O skeleton whenever the simulation's I/O changes.
In this paper we analyze the performance of automatically generated I/O skeletal applications for the S3D and GTS codes. We show that these skeletal applications achieve performance comparable to that of the production applications. We wrap up the paper with a discussion of future changes to make the skeletal application better approximate the actual I/O performed in the simulation.
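
    The generative idea behind such skeletal applications — turning an abstract description of a simulation's output into a runnable I/O program — can be sketched as follows. This is a toy illustration, not ADIOS or Skel itself: the `spec` dictionary stands in for the XML I/O specification, and the emitted skeleton writes NumPy archives rather than ADIOS output.

    ```python
    def generate_skeleton(spec):
        # Emit a minimal, runnable Python I/O skeleton from an abstract
        # description of the simulation's output. `spec` is an invented
        # stand-in for the real I/O specification: a step count plus
        # (variable name, element count) pairs.
        lines = ["import numpy as np", "", "def run(path):"]
        lines.append(f"    for step in range({spec['steps']}):")
        for name, count in spec["variables"]:
            # Dummy payloads sized like the real variables, so the skeleton
            # reproduces the volume (not the values) of the production I/O.
            lines.append(f"        {name} = np.zeros({count})")
        kwargs = ", ".join(f"{n}={n}" for n, _ in spec["variables"])
        lines.append("        np.savez(f'{path}_{step}.npz', " + kwargs + ")")
        return "\n".join(lines)
    ```

    Because the skeleton is regenerated from the spec, it stays in sync whenever the simulation's I/O description changes, which is the key advantage over a hand-maintained I/O kernel.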

  8. Software module for geometric product modeling and NC tool path generation

    International Nuclear Information System (INIS)

    The intelligent CAD/CAM system named VIRTUAL MANUFACTURE has been created. It consists of four intelligent software modules: the module for virtual NC machine creation, the module for geometric product modeling and automatic NC path generation, the module for virtual NC machining and the module for virtual product evaluation. In this paper the second intelligent software module is presented. This module enables feature-based product modeling, carried out via automatic saving of the designed product's geometric features as knowledge data. The knowledge data are afterwards applied for automatic NC program generation for NC machining of the designed product. (Author)

  9. THR Simulator – the software for generating radiographs of THR prosthesis

    Directory of Open Access Journals (Sweden)

    Hou Sheng-Mou

    2009-01-01

    Full Text Available Abstract Background Measuring the orientation of the acetabular cup after total hip arthroplasty is important for prognosis. The verification of these measurement methods will be easier and more feasible if we can synthesize prosthesis radiographs under each simulated condition. One reported method used an expensive mechanical device with an indeterminable precision. We thus developed a program, THR Simulator, to directly synthesize digital radiographs of prostheses for further analysis. Under the Windows platform and using the Borland C++ Builder programming tool, we developed THR Simulator. We first built a mathematical model of the acetabulum and femoral head. Data on the real dimensions of the prosthesis were adopted to generate the radiograph of the hip prosthesis. Then, with a ray tracing algorithm, we calculated the thickness each X-ray beam passed through, and transformed this to grey scale by a mapping function derived by fitting an exponential function to the phantom image. Finally we could generate a simulated radiograph for further analysis. Results Using THR Simulator, users can incorporate many parameters together for radiograph synthesis. These parameters include thickness, film size, tube distance, film distance, anteversion, abduction, upper wear, medial wear, and posterior wear, and are adequate for any radiographic measurement research. THR Simulator has been used in two studies, and the errors are within 2° for anteversion and 0.2 mm for wear measurement. Conclusion We designed a program, THR Simulator, that can synthesize prosthesis radiographs. Such a program can be applied in future studies for further analysis and validation of measurements of various parameters of the pelvis after total hip arthroplasty.
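
    The thickness-to-grey-scale mapping described above follows exponential X-ray attenuation (Beer-Lambert). A minimal sketch, with invented constants standing in for the values the authors fitted from their phantom image:

    ```python
    import math

    def thickness_to_grey(t_mm, i0=255.0, mu=0.08):
        # Transmitted intensity decays exponentially with the material
        # thickness the beam passes through. i0 (unattenuated grey level)
        # and mu (effective attenuation coefficient per mm) are illustrative
        # placeholders; the paper derives its mapping by fitting an
        # exponential function to a phantom radiograph.
        return i0 * math.exp(-mu * t_mm)
    ```

    Ray tracing supplies the thickness per pixel; applying such a mapping pixel by pixel yields the synthetic radiograph.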

  10. 2nd International Multidisciplinary Microscopy and Microanalysis Congress

    CERN Document Server

    Oral, Ahmet; Ozer, Mehmet

    2015-01-01

    The 2nd International Multidisciplinary Microscopy and Microanalysis Congress & Exhibition (InterM 2014) was held on 16–19 October 2014 in Oludeniz, Fethiye/ Mugla, Turkey. The aim of the congress was to gather scientists from various branches and discuss the latest improvements in the field of microscopy. The focus of the congress has been widened in an "interdisciplinary" manner, so as to allow all scientists working on several related subjects to participate and present their work. These proceedings include 33 peer-reviewed technical papers, submitted by leading academic and research institutions from over 17 countries and representing some of the most cutting-edge research available. The papers were presented at the congress in the following sessions: ·         Applications of Microscopy in the Physical Sciences ·         Applications of Microscopy in the Biological Sciences.

  11. 2nd Colombian Congress on Computational Biology and Bioinformatics

    CERN Document Server

    Cristancho, Marco; Isaza, Gustavo; Pinzón, Andrés; Rodríguez, Juan

    2014-01-01

    This volume compiles accepted contributions for the 2nd Edition of the Colombian Computational Biology and Bioinformatics Congress CCBCOL, after a rigorous review process in which 54 papers were accepted for publication from 119 submitted contributions. Bioinformatics and Computational Biology are areas of knowledge that have emerged due to advances that have taken place in the Biological Sciences and its integration with Information Sciences. The expansion of projects involving the study of genomes has led the way in the production of vast amounts of sequence data which needs to be organized, analyzed and stored to understand phenomena associated with living organisms related to their evolution, behavior in different ecosystems, and the development of applications that can be derived from this analysis.

  12. 2nd International Conference on Harmony Search Algorithm

    CERN Document Server

    Geem, Zong

    2016-01-01

    The Harmony Search Algorithm (HSA) is one of the most well-known techniques in the field of soft computing, an important paradigm in the science and engineering community.  This volume, the proceedings of the 2nd International Conference on Harmony Search Algorithm 2015 (ICHSA 2015), brings together contributions describing the latest developments in the field of soft computing with a special focus on HSA techniques. It includes coverage of new methods that have potentially immense application in various fields. Contributed articles cover aspects of the following topics related to the Harmony Search Algorithm: analytical studies; improved, hybrid and multi-objective variants; parameter tuning; and large-scale applications.  The book also contains papers discussing recent advances on the following topics: genetic algorithms; evolutionary strategies; the firefly algorithm and cuckoo search; particle swarm optimization and ant colony optimization; simulated annealing; and local search techniques.   This book ...

  13. 2nd CEAS Specialist Conference on Guidance, Navigation and Control

    CERN Document Server

    Mulder, Bob; Choukroun, Daniel; Kampen, Erik-Jan; Visser, Coen; Looye, Gertjan

    2013-01-01

    Following the successful 1st CEAS (Council of European Aerospace Societies) Specialist Conference on Guidance, Navigation and Control (CEAS EuroGNC) held in Munich, Germany in 2011, Delft University of Technology happily accepted the invitation to organize the 2nd CEAS EuroGNC in Delft, The Netherlands in 2013. The goal of the conference is to promote new advances in aerospace GNC theory and technologies for enhancing safety, survivability, efficiency, performance, autonomy and intelligence of aerospace systems using on-board sensing, computing and systems. A great push for new developments in GNC is provided by the ever higher safety and sustainability requirements in aviation. Impressive progress was made in new research fields such as sensor and actuator fault detection and diagnosis, reconfigurable and fault tolerant flight control, online safe flight envelope prediction and protection, online global aerodynamic model identification, online global optimization and flight upset recovery. All of these challenges de...

  14. 2nd international conference on advanced nanomaterials and nanotechnology

    CERN Document Server

    Goswami, D; Perumal, A

    2013-01-01

    Nanoscale science and technology have occupied centre stage globally in modern scientific research and discourses in the early twenty-first century. The enabling nature of the technology makes it important in modern electronics, computing, materials, healthcare, energy and the environment. This volume contains selected articles presented (as Invited/Oral/Poster presentations) at the 2nd international conference on advanced materials and nanotechnology (ICANN-2011), held recently at the Indian Institute of Technology Guwahati during Dec 8-10, 2011. The topics covered in these proceedings include: synthesis and self-assembly of nanomaterials; nanoscale characterisation; nanophotonics & nanoelectronics; nanobiotechnology; nanocomposites; nanomagnetism; nanomaterials for energy; computational nanotechnology; and commercialization of nanotechnology. The conference was represented by around 400 participants from several countries including delegates invited from USA, Germany, Japan, UK, Taiwan, Italy, Singapor...

  15. 2nd International Conference on Construction and Building Research

    CERN Document Server

    Fernández-Plazaola, Igor; Hidalgo-Delgado, Francisco; Martínez-Valenzuela, María; Medina-Ramón, Francisco; Oliver-Faubel, Inmaculada; Rodríguez-Abad, Isabel; Salandin, Andrea; Sánchez-Grandia, Rafael; Tort-Ausina, Isabel; Construction and Building Research

    2014-01-01

    Many areas of knowledge converge in the building industry and therefore research in this field necessarily involves an interdisciplinary approach. Effective research requires strong relations between a broad variety of scientific and technological domains and more conventional construction or craft processes, while also considering advanced management processes, where all the main actors permanently interact. This publication takes an interdisciplinary approach grouping various studies on the building industry chosen from among the works presented for the 2nd International Conference on Construction and Building Research. The papers examine aspects of materials and building systems; construction technology; energy and sustainability; construction management; heritage, refurbishment and conservation. The information contained within these pages may be of interest to researchers and practitioners in construction and building activities from the academic sphere, as well as public and private sectors.

  16. 2nd International Conference on NeuroRehabilitation

    CERN Document Server

    Andersen, Ole; Akay, Metin

    2014-01-01

    The book is the proceedings of the 2nd International Conference on NeuroRehabilitation (ICNR 2014), held 24th-26th June 2014 in Aalborg, Denmark. The conference featured the latest highlights in the emerging and interdisciplinary field of neural rehabilitation engineering and identified important healthcare challenges the scientific community will be faced with in the coming years. Edited and written by leading experts in the field, the book includes keynote papers, regular conference papers, and contributions to special and innovation sessions, covering the following main topics: neuro-rehabilitation applications and solutions for restoring impaired neurological functions; cutting-edge technologies and methods in neuro-rehabilitation; and translational challenges in neuro-rehabilitation. Thanks to its highly interdisciplinary approach, the book will not only be a  highly relevant reference guide for academic researchers, engineers, neurophysiologists, neuroscientists, physicians and physiotherapists workin...

  17. Ontology-based Software for Generating Scenarios for Characterizing Searches for Nuclear Materials

    Energy Technology Data Exchange (ETDEWEB)

    Ward, Richard C [ORNL; Sorokine, Alexandre [ORNL; Schlicher, Bob G [ORNL; Wright, Michael C [ORNL; Kruse, Kara L [ORNL

    2011-01-01

    A software environment was created in which ontologies are used to significantly expand the number and variety of scenarios for special nuclear materials (SNM) detection based on a set of simple generalized initial descriptions. A framework was built that combined advanced reasoning from ontologies with geographical and other data sources to generate a much larger list of specific detailed descriptions from a simple initial set of user-input variables. This presentation shows how basing the scenario generation on a process of inferencing from multiple ontologies, including a new SNM Detection Ontology (DO) combined with data extraction from geodatabases, provided the desired significant variability of scenarios for testing search algorithms, including unique combinations of variables not previously expected. The various components of the software environment and the resulting scenarios generated will be discussed.
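
    The combinatorial core of such scenario expansion — growing a small set of user-input variables into many concrete scenarios — can be sketched as below. This shows only the cross-product step; the reported system additionally uses ontology inferencing (the SNM Detection Ontology) and geodatabase extraction to constrain and enrich the combinations. The variable names here are invented examples.

    ```python
    from itertools import product

    def expand_scenarios(variables):
        # Expand a dict of {variable: possible values} into the full
        # cross-product of concrete scenario descriptions, one dict per
        # scenario. Sorting the keys makes the output order deterministic.
        keys = sorted(variables)
        return [dict(zip(keys, values))
                for values in product(*(variables[k] for k in keys))]
    ```

    Even two variables with a handful of values each already multiply into dozens of distinct scenarios, which is how a simple initial description yields the desired variability for testing search algorithms.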

  18. Software R&D for Next Generation of HEP Experiments, Inspired by Theano

    CERN Document Server

    CERN. Geneva

    2015-01-01

    In the next decade, the frontiers of High Energy Physics (HEP) will be explored by three machines: the High Luminosity Large Hadron Collider (HL-LHC) in Europe, the Long Base Neutrino Facility (LBNF) in the US, and the International Linear Collider (ILC) in Japan. These next generation experiments must address two fundamental problems in the current generation of HEP experimental software: the inability to take advantage of and adapt to the rapidly evolving processor landscape, and the difficulty in developing and maintaining increasingly complex software systems by physicists. I will propose a strategy, inspired by the automatic optimization and code generation in Theano, to simultaneously address both problems. I will describe three R&D projects with short-term physics deliverables aimed at developing this strategy. The first project is to develop a maximally sensitive General Search for New Physics at the LHC by applying the Matrix Element Method running on GPUs of HPCs. The second is to classify and reconstru...

  19. Development of a Hydrologic Characterization Technology for Fault Zones Phase II 2nd Report

    Energy Technology Data Exchange (ETDEWEB)

    Karasaki, Kenzi [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Doughty, Christine [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Gasperikova, Erika [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Peterson, John [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Conrad, Mark [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Cook, Paul [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Tiemi, Onishi [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2011-03-31

    This is the 2nd report on the three-year program of the 2nd phase of the NUMO-LBNL collaborative project: Development of Hydrologic Characterization Technology for Fault Zones under NUMO-DOE/LBNL collaboration agreement. As such, this report is a compendium of the results by Kiho et al. (2011) and those by LBNL.

  20. CELEBRATED APRIL 2nd – INTERNATIONAL DAY OF PERSONS WITH AUTISM

    Directory of Open Access Journals (Sweden)

    Manuela KRCHANOSKA

    2014-09-01

    Full Text Available On April 2nd, the Macedonian Scientific Society for Autism, for the fourth time organized an event on the occasion of the International Day of Persons with Autism. The event, with a cultural and artistic character, was held at the Museum of the Macedonian Struggle under the motto “They are not alone, we are with them”. The huge number of citizens only confirmed the motto. It seemed that the hall of the Museum of the Macedonian Struggle was too small for the warm hearts of the audience. More than 300 guests were present in the hall, including children with autism and their families, prominent professors, doctors, special educators and rehabilitators, psychologists, students and other citizens who gladly decided to enrich the event with their presence. The event was opened by the violinist Plamenka Trajkovska, who performed one song. After her, the President of the Macedonian Scientific Society for Autism, PhD Vladimir Trajkovski, delivered his speech. The professor told the parents of autistic children, who were present in large numbers, not to lose hope, to fight for their children, and that the Macedonian Scientific Society for Autism will provide tremendous support and assistance in this struggle.

  1. Book Review: Digital Forensic Evidence Examination (2nd ed.)

    Directory of Open Access Journals (Sweden)

    Gary Kessler

    2010-06-01

    Full Text Available Cohen, F. (2010). Digital Forensic Evidence Examination (2nd ed.). Livermore, CA: ASP Press. 452 pages, ISBN: 978-1-878109-45-3, US$79. Reviewed by Gary C. Kessler, Gary Kessler Associates & Edith Cowan University (gck@garykessler.net). On the day that I sat down to start to write this review, the following e-mail came across on one of my lists: Person A and Person B write back and forth and create an email thread. Person A then forwards the email to Person C, but changes some wording in the email exchange between A & B. What is the easiest way (and is it even possible) to find out when that earlier email message was altered before it was sent to Person C? Before you try to answer these questions, read Fred Cohen's Digital Forensic Evidence Examination. His book won't actually tell you how to answer these questions, but it will help you understand the difficulty in even trying to answer them with any level of certainty. (See PDF for full review.)

  2. Software for Generating Troposphere Corrections for InSAR Using GPS and Weather Model Data

    Science.gov (United States)

    Moore, Angelyn W.; Webb, Frank H.; Fishbein, Evan F.; Fielding, Eric J.; Owen, Susan E.; Granger, Stephanie L.; Bjoerndahl, Fredrik; Loefgren, Johan; Fang, Peng; Means, James D.; Bock, Yehuda; Tong, Xiaopeng

    2013-01-01

    Atmospheric errors due to the troposphere are a limiting error source for spaceborne interferometric synthetic aperture radar (InSAR) imaging. This software generates tropospheric delay maps that can be used to correct atmospheric artifacts in InSAR data. The software automatically acquires all needed GPS (Global Positioning System), weather, and Digital Elevation Model data, and generates a tropospheric correction map using a novel algorithm for combining GPS and weather information while accounting for terrain. Existing JPL software was prototypical in nature, required a MATLAB license, required additional steps to acquire and ingest needed GPS and weather data, and did not account for topography in interpolation. Previous software did not achieve a level of automation suitable for integration in a Web portal. This software overcomes these issues. GPS estimates of tropospheric delay are a source of corrections that can be used to form correction maps to be applied to InSAR data, but the spacing of GPS stations is insufficient to remove short-wavelength tropospheric artifacts. This software combines interpolated GPS delay with weather model precipitable water vapor (PWV) and a digital elevation model to account for terrain, increasing the spatial resolution of the tropospheric correction maps and thus removing short-wavelength tropospheric artifacts to a greater extent. It will be integrated into a Web portal request system, allowing use in a future L-band SAR Earth radar mission data system. This will be a significant contribution to its technology readiness, building on existing investments in in situ space geodetic networks, and improving the timeliness, quality, and science value of the collected data.
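The combination step described above can be illustrated with a small sketch. This is not the JPL software's actual algorithm; it is a hedged, minimal example of the general idea: refer GPS zenith wet delays to a common level with an exponential height scaling, interpolate horizontally, then rescale to the terrain height from a DEM. The 2 km scale height, the inverse-distance weighting, and all function names are illustrative assumptions.

```python
import numpy as np

def tropo_correction_map(gps_xy, gps_zwd, gps_h, grid_xy, grid_h, scale_h=2000.0):
    """Sketch: interpolate GPS zenith wet delays (in metres) to grid points,
    then rescale each value to the DEM terrain height with an exponential
    decay model. Inverse-distance weighting stands in for a proper
    interpolator; scale_h is an assumed wet-delay scale height in metres."""
    # Refer all GPS delays to sea level using exponential height scaling
    zwd_sea_level = gps_zwd * np.exp(gps_h / scale_h)
    # Inverse-distance-weighted interpolation onto the grid points
    d = np.linalg.norm(grid_xy[:, None, :] - gps_xy[None, :, :], axis=2)
    w = 1.0 / np.maximum(d, 1e-6) ** 2
    zwd_grid_sl = (w * zwd_sea_level).sum(axis=1) / w.sum(axis=1)
    # Rescale down to the actual terrain height taken from the DEM
    return zwd_grid_sl * np.exp(-grid_h / scale_h)
```

With this kind of height-aware interpolation, a grid point on a ridge correctly receives a smaller delay than one in a valley, which is the terrain effect the abstract says plain horizontal interpolation misses.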

  3. Software architecture for control and data acquisition of linear plasma generator Magnum-PSI

    International Nuclear Information System (INIS)

    Highlights: • An architecture based on a modular design. • The design offers flexibility and extendability. • The design covers the overall software architecture. • It also covers its (sub)systems’ internal structure. -- Abstract: The FOM Institute DIFFER – Dutch Institute for Fundamental Energy Research has completed the construction phase of Magnum-PSI, a magnetized, steady-state, large area, high-flux linear plasma beam generator to study plasma surface interactions under ITER divertor conditions. Magnum-PSI consists of several hardware subsystems, and a variety of diagnostic systems. The COntrol, Data Acquisition and Communication (CODAC) system integrates these subsystems and provides a complete interface for the Magnum-PSI users. Integrating it all, from the lowest hardware level of sensors and actuators, via the level of networked PLCs and computer systems, up to functions and classes in programming languages, demands a sound and modular software architecture, which is extendable and scalable for future changes. This paper describes this architecture, and the modular design of the software subsystems. The design is implemented in the CODAC system at the level of services and subsystems (the overall software architecture), as well as internally in the software subsystems.

  4. Software architecture for control and data acquisition of linear plasma generator Magnum-PSI

    Energy Technology Data Exchange (ETDEWEB)

    Groen, P.W.C., E-mail: p.w.c.groen@differ.nl [FOM Institute DIFFER – Dutch Institute For Fundamental Energy Research, Association EURATOM-FOM, Partner in the Trilateral Euregio Cluster, P.O. Box 1207, 3430 BE, Nieuwegein (Netherlands); Beveren, V. van; Broekema, A.; Busch, P.J.; Genuit, J.W.; Kaas, G.; Poelman, A.J.; Scholten, J.; Zeijlmans van Emmichoven, P.A. [FOM Institute DIFFER – Dutch Institute For Fundamental Energy Research, Association EURATOM-FOM, Partner in the Trilateral Euregio Cluster, P.O. Box 1207, 3430 BE, Nieuwegein (Netherlands)

    2013-10-15

    Highlights: • An architecture based on a modular design. • The design offers flexibility and extendability. • The design covers the overall software architecture. • It also covers its (sub)systems’ internal structure. -- Abstract: The FOM Institute DIFFER – Dutch Institute for Fundamental Energy Research has completed the construction phase of Magnum-PSI, a magnetized, steady-state, large area, high-flux linear plasma beam generator to study plasma surface interactions under ITER divertor conditions. Magnum-PSI consists of several hardware subsystems, and a variety of diagnostic systems. The COntrol, Data Acquisition and Communication (CODAC) system integrates these subsystems and provides a complete interface for the Magnum-PSI users. Integrating it all, from the lowest hardware level of sensors and actuators, via the level of networked PLCs and computer systems, up to functions and classes in programming languages, demands a sound and modular software architecture, which is extendable and scalable for future changes. This paper describes this architecture, and the modular design of the software subsystems. The design is implemented in the CODAC system at the level of services and subsystems (the overall software architecture), as well as internally in the software subsystems.

  5. 2nd International Open and Distance Learning (IODL) Symposium

    Directory of Open Access Journals (Sweden)

    Reviewed by Murat BARKAN

    2006-10-01

    Full Text Available These closing remarks were prepared and presented by Prof. Dr. Murat BARKAN, Anadolu University, Eskisehir, TURKEY. DEAR GUESTS, As the 2nd International Open and Distance Learning (IODL) Symposium is now drawing to an end, I would like to thank you all for your outstanding speeches, distinguished presentations, constructive roundtable and session discussions, and active participation during the last five days. I hope you all share my view that the whole symposium has been a very stimulating and successful experience. Also, on behalf of all the participants, I would like to take this opportunity to thank and congratulate the symposium organization committee for their excellent job in organizing and hosting our 2nd meeting. Throughout the symposium, five workshops, six keynote speeches and 66 papers, which were prepared by more than 150 academicians and practitioners from 23 different countries, reflected remarkable and varied views and approaches to open and flexible learning. Besides all these academic endeavors, 13 educational films were displayed during the symposium. The technology exhibition, hosted by seven companies, was very effective in showcasing the current level of the technology and the success of applications of theory into practice. Now I would like to go over what our workshop and keynote presenters shared with us: Prof. Marina McIsaac from Arizona State University dwelled on how to determine research topics worthwhile to be examined and how to choose appropriate research designs and methods. She gave us clues on how to get articles published in professional journals. Prof. Colin Latchem from Australia and Prof. Dr. Ali Ekrem Ozkul from Anadolu University pointed to the importance of strategic planning for distance and flexible learning. They highlighted the advantages of strategic planning for policy-makers, planners, managers and staff. Dr. Wolfram Laaser from Fern University of Hagen presented different multimedia clips and directed interactive exercises using Flash MX in his workshop. Jack Koumi from the UK presented a workshop about what to teach on video and when to choose other media. He exemplified 27 added-value techniques and teaching functions for TV and video. He later specified different capabilities and limitations of eight different media used in teaching, emphasizing the importance of optimizing media deployment. Dr. Janet Bohren from the University of Cincinnati and Jennifer McVay-Dyche from United Theological Seminary explained their experience with a course management system used to develop dialogue between K-12 teachers in Turkey and the US on the topics of religion, culture and schools. Their workshop provided an overview of a pilot study. They showed us a good case study of utilizing “Blackboard” as a means of getting rid of biases and improving the mutual understanding of the American and Turkish teachers. We had very remarkable keynotes as well. Dr. Nikitas Kastis, representing the European Distance and E-Learning Network (EDEN), made his speech on distance and e-Learning evolutions and trends in Europe. He informed the audience about the application and assessment criteria at the European scale concerning e-Learning in the education and training systems. Meanwhile, our keynote speakers drew our attention to different applications of virtual learning. Dr. Piet Kommers from the University of Twente exemplified a virtual training environment for acquiring surgical skills. Dr. Timothy Shih from Tamkang University presented their project called Hard SCORM (Sharable Content Object Reference Model) as an asynchronous distance learning specification. In his speech titled “Engaging and Supporting Problem Solving Online”, Prof. David Jonassen from the University of Missouri reflected his vision of the future of education and explained why it should embrace problem solving. Then he showed us examples of incorporating this vision with learning environments for making online problem solving possible. Dr. Wolfram Laaser from Fern University talked on applications of ICT in Europe

  6. PREFACE: 2nd National Conference on Nanotechnology 'NANO 2008'

    Science.gov (United States)

    Czuba, P.; Kolodziej, J. J.; Konior, J.; Szymonski, M.

    2009-03-01

    This issue of Journal of Physics: Conference Series contains selected papers presented at the 2nd National Conference on Nanotechnology 'NANO2008', that was held in Kraków, Poland, 25-28 June 2008. It was organized jointly by the Polish Chemical Society, Polish Physical Society, Polish Vacuum Society, and the Centre for Nanometer-scale Science and Advanced Materials (NANOSAM) of the Jagiellonian University. The meeting presentations were categorized into the following topics: 1. Nanomechanics and nanotribology 2. Characterization and manipulation in nanoscale 3. Quantum effects in nanostructures 4. Nanostructures on surfaces 5. Applications of nanotechnology in biology and medicine 6. Nanotechnology in education 7. Industrial applications of nanotechnology, presentations of the companies 8. Nanoengineering and nanomaterials (international sessions shared with the fellows of Maria-Curie Host Fellowships within the 6th FP of the European Community Project 'Nano-Engineering for Expertise and Development, NEED') 9. Nanopowders 10. Carbon nanostructures and nanosystems 11. Nanoelectronics and nanophotonics 12. Nanomaterials in catalysis 13. Nanospintronics 14. Ethical, social, and environmental aspects of nanotechnology The Conference was attended by 334 participants. The presentations were delivered as 7 invited plenary lectures, 25 invited topical lectures, 78 oral and 108 poster contributions. Only 1/6 of the contributions presented during the Conference were submitted for publication in this Proceedings volume. From the submitted material, this volume of Journal of Physics: Conference Series contains 37 articles that were positively evaluated by independent referees. The Organizing Committee gratefully acknowledges all these contributions. We also thank all the referees of the papers submitted for the Proceedings for their timely and thorough work. 
We would like to thank all members of the National Program Committee for their work in the selection process of invited and contributed papers and in setting up the scientific program of the Conference. P Czuba, J J Kolodziej, J Konior, M Szymonski Kraków, 30 October 2008

  7. POLITO - A new open-source, platform-independent software for generating high-quality lithostratigraphic columns

    Directory of Open Access Journals (Sweden)

    Cipran C. Stremtan

    2010-08-01

    Full Text Available POLITO is a free, open-source, and platform-independent software package which can automatically generate lithostratigraphic columns from field data. Its simple, easy-to-use interface allows users to manipulate large datasets and create high-quality graphical outputs, either in editable vector or raster format, or as PDF files. POLITO uses USGS standard lithology patterns and can be downloaded from its SourceForge project page (http://sourceforge.net/projects/polito/).

  8. RMAWGEN: A software project for a daily Multi-Site Weather Generator with R

    Science.gov (United States)

    Cordano, E.; Eccel, E.

    2012-04-01

    Modeling in climate change applications for agricultural or hydrological purposes often requires daily time series of precipitation and temperature. This is the case for downscaled series from monthly or seasonal predictions of Global Climate Models (GCMs). This poster presents a software project, the R package RMAWGEN (R Multi-Site Auto-regressive Weather GENerator), to generate daily temperature and precipitation time series at several sites by using the theory of vector auto-regressive (VAR) models. The VAR model is used because it is able to maintain the temporal and spatial correlations among the several series. In particular, observed time series of daily maximum and minimum temperature and precipitation are used to calibrate the parameters of a VAR model (saved as "GPCAvarest2" or "varest2" classes, which inherit the "varest" S3 class defined in the package vars [Pfaff, 2008]). The VAR model, coupled with monthly mean weather variables downscaled from GCM predictions, then allows the generation of several stochastic daily scenarios. The package is structured as functions that transform precipitation and temperature time series into Gaussian-distributed random variables through deseasonalization and Principal Component Analysis; a VAR model is then calibrated on the transformed time series, and the series generated by the VAR are inversely re-transformed into precipitation and/or temperature series. An application is included in the software package as an example; it is presented using a dataset of daily weather time series recorded at 59 different sites in Trentino (Italy) and its surroundings for the period 1958-2007. The software is distributed as Free Software under the General Public License (GPL) and is available on the CRAN website (http://cran.r-project.org/web/packages/RMAWGEN/index.html).
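The core VAR mechanism the package relies on can be sketched in a few lines. This is a hedged illustration of the general technique (a least-squares VAR(1) fit and simulation, here in Python with numpy rather than R), not RMAWGEN's actual implementation, which wraps deseasonalization, PCA and Gaussianization around this step; all function names are illustrative.

```python
import numpy as np

def fit_var1(X):
    """Least-squares fit of X[t] = A @ X[t-1] + e[t] for multi-site series.
    X has shape (T, n_sites); returns A and the residual covariance,
    which together capture the temporal and cross-site correlations."""
    Y, Z = X[1:], X[:-1]
    A, *_ = np.linalg.lstsq(Z, Y, rcond=None)   # solves Z @ A ~= Y
    resid = Y - Z @ A
    return A.T, np.cov(resid.T)

def simulate_var1(A, cov, steps, rng):
    """Generate a synthetic multi-site series reproducing the fitted
    lag-1 dynamics and the cross-site noise correlation."""
    n = A.shape[0]
    L = np.linalg.cholesky(cov + 1e-9 * np.eye(n))  # correlated innovations
    x, out = np.zeros(n), []
    for _ in range(steps):
        x = A @ x + L @ rng.standard_normal(n)
        out.append(x.copy())
    return np.array(out)
```

Because the innovations are drawn with the residual covariance, simultaneous anomalies at neighbouring sites stay correlated in the generated scenarios, which is the property the abstract highlights.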

  9. FACTORS GENERATING RISKS DURING REQUIREMENT ENGINEERING PROCESS IN GLOBAL SOFTWARE DEVELOPMENT ENVIRONMENT

    OpenAIRE

    Huma Hayat Khan; Mohd. Naz’ri bin Mahrin; Suriayati bt Chuprat

    2014-01-01

    Challenges of Requirements Engineering become more acute when it is performed in the global software development paradigm. There can be many reasons behind this challenging nature; “risks” can be one of them, as there is more risk exposure in the global development paradigm, which may be one of the main reasons Requirements Engineering becomes more challenging. First, therefore, there is a need to identify the factors which actually generate these risks. This paper therefore not only identifies the fa...

  10. An automation of design and modelling tasks in NX Siemens environment with original software - generator module

    Science.gov (United States)

    Zbiciak, M.; Grabowik, C.; Janik, W.

    2015-11-01

    Nowadays the design constructional process is almost exclusively aided with CAD/CAE/CAM systems. It is estimated that nearly 80% of design activities have a routine nature. These routine design tasks are highly susceptible to automation. Design automation is usually achieved with API tools which allow building original software responsible for automating different engineering activities. In this paper, original software worked out in order to automate engineering tasks at the stage of a product's geometrical shape design is presented. The elaborated software works exclusively in the NX Siemens CAD/CAM/CAE environment and was prepared in Microsoft Visual Studio with application of the .NET technology and the NX SNAP library. The software functionality allows designing and modelling of spur and helical involute gears. Moreover, it is possible to estimate relative manufacturing costs. With the Generator module it is possible to design and model both standard and non-standard gear wheels. The main advantage of a model generated in such a way is its better representation of the involute curve in comparison to those which are drawn in specialized standard CAD system tools. This comes from the fact that usually in CAD systems an involute curve is drawn through 3 points, which correspond to points located on the addendum circle, the reference diameter of the gear and the base circle respectively. In the Generator module the involute curve is drawn through 11 involute points, which are located on and above the base circle up to the addendum circle; therefore the 3D gear wheel models are highly accurate. Application of the Generator module makes the modelling process very rapid, so that the gear wheel modelling time is reduced to several seconds. During the conducted research an analysis of the differences between standard 3-point and 11-point involutes was made. The results and conclusions drawn from the analysis are shown in detail.
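The difference between a 3-point and an 11-point involute comes down to how densely the parametric involute curve is sampled between the base circle and the addendum circle. A minimal sketch of that sampling (illustrative names; not the Generator module's code):

```python
import math

def involute_points(r_base, r_tip, n):
    """Sample n points on the involute of a circle of radius r_base,
    from the base circle out to radius r_tip (e.g. the addendum circle).
    Parametric form: x = r_b(cos t + t sin t), y = r_b(sin t - t cos t);
    the radius at parameter t is r_b * sqrt(1 + t^2)."""
    t_max = math.sqrt((r_tip / r_base) ** 2 - 1.0)
    pts = []
    for i in range(n):
        t = t_max * i / (n - 1)
        x = r_base * (math.cos(t) + t * math.sin(t))
        y = r_base * (math.sin(t) - t * math.cos(t))
        pts.append((x, y))
    return pts
```

A spline fitted through `involute_points(r_b, r_a, 11)` follows the true flank far more closely than one through only 3 points, which is the accuracy gain the abstract attributes to the Generator module.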

  11. Analysis of quality raw data of second generation sequencers with Quality Assessment Software

    Directory of Open Access Journals (Sweden)

    Schneider Maria PC

    2011-04-01

    Full Text Available Abstract Background Second generation technologies have advantages over Sanger; however, they have resulted in new challenges for the genome construction process, especially because of the small size of the reads, despite the high degree of coverage. Independent of the program chosen for the construction process, DNA sequences are superimposed, based on identity, to extend the reads, generating contigs; mismatches indicate a lack of homology and are not included. This process improves our confidence in the sequences that are generated. Findings We developed Quality Assessment Software, with which one can review graphs showing the distribution of quality values from the sequencing reads. This software allows us to adopt more stringent quality standards for sequence data, based on quality-graph analysis and estimated coverage after applying the quality filter, providing acceptable sequence coverage for genome construction from short reads. Conclusions Quality filtering is a fundamental step in the process of constructing genomes, as it reduces the frequency of incorrect alignments that are caused by measuring errors, which can occur during the construction process due to the size of the reads, provoking misassemblies. Application of quality filters to sequence data, using the software Quality Assessment, along with graphing analyses, provided greater precision in the definition of cutoff parameters, which increased the accuracy of genome construction.
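The kind of quality filtering described above can be sketched as follows. This is a hedged illustration of the general approach (mean-Phred-score filtering followed by a coverage estimate), not the Quality Assessment software itself; the cutoff value and function names are illustrative.

```python
def phred_scores(qual_line, offset=33):
    """Decode an ASCII quality string (Sanger / Illumina 1.8+ use offset 33)."""
    return [ord(c) - offset for c in qual_line]

def filter_reads(reads, min_mean_q=20):
    """Keep (sequence, quality) pairs whose mean Phred quality meets the cutoff."""
    kept = []
    for seq, qual in reads:
        scores = phred_scores(qual)
        if sum(scores) / len(scores) >= min_mean_q:
            kept.append((seq, qual))
    return kept

def estimated_coverage(reads, genome_length):
    """Coverage = total retained bases / genome length, checked after filtering
    to confirm the surviving reads still cover the genome acceptably."""
    return sum(len(seq) for seq, _ in reads) / genome_length
```

Checking `estimated_coverage` after filtering is the step the abstract emphasizes: a cutoff is only acceptable if the reads that survive it still provide enough coverage for assembly.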

  12. GENESIS: Agile Generation of Information Management Oriented Software / GENESIS: Generación ágil de software orientado a gestión de información

    Scientific Electronic Library Online (English)

    Claudia, Jiménez Guarín; Juan Erasmo, Gómez.

    2010-05-01

    Full Text Available The specification of an information system can be clear from the beginning: it must acquire, display, query and modify data, using a database. The issue is deciding which information to manage. In the case that gave rise to this work, the information evolved permanently, even [...] up to the end of the project. This implies the construction of a new system each time the information is redefined. This article presents Genesis, an agile infrastructure for the immediate construction of whatever information system is required. Experts describe their information and queries, and Genesis produces the corresponding software, generating the appropriate graphical interfaces and database. Abstract in english The specification for an information system can be clear from the beginning: it must acquire, display, query and modify data, using a database. The main issue is to decide which information to manage. In the case originating this work, information was always evolving, even up to the end of the project. This implies the construction of a new system each time the information is redefined. This article presents Genesis, an agile infrastructure for the immediate construction of required information systems. Experts describe their information needs and queries, and Genesis generates the corresponding application, with the appropriate graphical interfaces and database.

  13. NgsRelate : a software tool for estimating pairwise relatedness from next-generation sequencing data

    DEFF Research Database (Denmark)

    Korneliussen, Thorfinn Sand; Moltke, Ida

    2015-01-01

    MOTIVATION: Pairwise relatedness estimation is important in many contexts such as disease mapping and population genetics. However, all existing estimation methods are based on called genotypes, which is not ideal for next-generation sequencing (NGS) data of low depth from which genotypes cannot be called with high certainty. RESULTS: We present a software tool, NgsRelate, for estimating pairwise relatedness from NGS data. It provides maximum likelihood estimates that are based on genotype likelihoods instead of genotypes and thereby takes the inherent uncertainty of the genotypes into account. Using both simulated and real data, we show that NgsRelate provides markedly better estimates for low-depth NGS data than two state-of-the-art genotype-based methods. AVAILABILITY: NgsRelate is implemented in C++ and is available under the GNU license at www.popgen.dk/software. CONTACT: ida@binf.ku.dk. Supplementary information: Supplementary data are available at Bioinformatics online.

  14. NgsRelate: a software tool for estimating pairwise relatedness from next-generation sequencing data

    Science.gov (United States)

    Korneliussen, Thorfinn Sand; Moltke, Ida

    2015-01-01

    Motivation: Pairwise relatedness estimation is important in many contexts such as disease mapping and population genetics. However, all existing estimation methods are based on called genotypes, which is not ideal for next-generation sequencing (NGS) data of low depth from which genotypes cannot be called with high certainty. Results: We present a software tool, NgsRelate, for estimating pairwise relatedness from NGS data. It provides maximum likelihood estimates that are based on genotype likelihoods instead of genotypes and thereby takes the inherent uncertainty of the genotypes into account. Using both simulated and real data, we show that NgsRelate provides markedly better estimates for low-depth NGS data than two state-of-the-art genotype-based methods. Availability: NgsRelate is implemented in C++ and is available under the GNU license at www.popgen.dk/software. Contact: ida@binf.ku.dk Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26323718
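The genotype likelihoods that NgsRelate consumes can be illustrated with a standard diallelic model. This sketch (not NgsRelate's own code) computes P(read data | genotype) for genotypes RR, RA and AA under a symmetric per-read error rate, showing how low-depth data leaves genuine uncertainty rather than a single called genotype:

```python
from math import comb

def genotype_likelihoods(n_ref, n_alt, eps=0.01):
    """Likelihood of observing n_ref reference and n_alt alternate reads
    under genotypes RR, RA, AA, assuming independent reads with error
    rate eps: P(ref read | RR) = 1-eps, P(ref read | RA) = 0.5,
    P(ref read | AA) = eps. Returns [L(RR), L(RA), L(AA)]."""
    n = n_ref + n_alt
    likes = []
    for p_ref in (1.0 - eps, 0.5, eps):
        likes.append(comb(n, n_ref) * p_ref**n_ref * (1 - p_ref)**n_alt)
    return likes
```

At depth 2, even two reference reads leave a non-negligible heterozygote likelihood; a likelihood-based method carries that uncertainty into the relatedness estimate instead of committing to a hard RR call.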

  15. Combustion synthesis and characterization of Ba2NdSbO6 nanocrystals

    Indian Academy of Sciences (India)

    V T Kavitha; R Jose; S Ramakrishna; P R S Wariar; J Koshy

    2011-07-01

    Nanocrystalline powders of Ba2NdSbO6, a complex cubic perovskite metal oxide, were synthesized by a self-sustained combustion method employing citric acid. The product was characterized by X-ray diffraction, differential thermal analysis, thermogravimetric analysis, Fourier transform infrared spectroscopy, transmission electron microscopy and scanning electron microscopy. The as-prepared powders were single-phase Ba2NdSbO6 and a mixture of polycrystalline spheroidal particles and single-crystalline nanorods. The Ba2NdSbO6 sample sintered at 1500°C for 4 h has high density (∼95% of theoretical density). Sintered nanocrystalline Ba2NdSbO6 had a dielectric constant of ∼21 and a dielectric loss of 8 × 10⁻³ at 5 MHz.

  16. DCT and Eigenvectors of Covariance of 1st and 2nd order Discrete fractional Brownian motion

    OpenAIRE

    Gupta, Anubha; Joshi, ShivDutt

    2013-01-01

    This paper establishes a connection between the discrete cosine transform (DCT) and 1st and 2nd order discrete-time fractional Brownian motion processes. It is proved that the eigenvectors of the auto-covariance matrix of a 1st or 2nd order discrete-time fractional Brownian motion can be approximated by DCT basis vectors in the asymptotic sense. Perturbation of the eigenvectors from the DCT basis vectors is modeled using the analytic perturbation theory of linear operators.
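The claim can be probed numerically for the 1st-order case: the increments of fBm form fractional Gaussian noise (fGn), whose autocovariance matrix is symmetric Toeplitz, and its eigenvectors can be compared against the orthonormal DCT-II basis. A hedged sketch (the Hurst exponent, matrix size and function names are illustrative choices, not the paper's setup):

```python
import numpy as np

def dct2_basis(n):
    """Orthonormal DCT-II basis vectors as the rows of an (n, n) matrix."""
    k = np.arange(n)
    B = np.cos(np.pi * np.outer(k, 2 * k + 1) / (2 * n))
    B[0] *= np.sqrt(1.0 / n)
    B[1:] *= np.sqrt(2.0 / n)
    return B

def fgn_covariance(n, H=0.7):
    """Autocovariance matrix of fractional Gaussian noise (the 1st-order
    increments of fBm with Hurst exponent H), gamma(k) placed Toeplitz-wise."""
    k = np.arange(n, dtype=float)
    gamma = 0.5 * (np.abs(k + 1)**(2 * H) - 2 * np.abs(k)**(2 * H)
                   + np.abs(k - 1)**(2 * H))
    i = np.arange(n)
    return gamma[np.abs(i[:, None] - i[None, :])]

def dct_alignment(n=32, H=0.7):
    """For each covariance eigenvector, |cosine| with its best DCT match."""
    B = dct2_basis(n)
    _, vecs = np.linalg.eigh(fgn_covariance(n, H))
    return np.max(np.abs(B @ vecs), axis=0)
```

The paper's result is asymptotic, so for small n the alignment is only approximate; increasing n in `dct_alignment` shows the best-match cosines drifting toward 1.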

  17. Generation of a Database of Laboratory Laser-Induced Breakdown Spectroscopy (LIBS) Spectra and Associated Analysis Software

    Science.gov (United States)

    Anderson, R. B.; Clegg, S. M.; Graff, T.; Morris, R. V.; Laura, J.

    2015-06-01

    We describe plans to generate a database of LIBS spectra of planetary analog materials and develop free, open-source software to enable the planetary community to analyze LIBS (and other spectral) data.

  18. PREFACE: 2nd Workshop on Germanium Detectors and Technologies

    Science.gov (United States)

    Abt, I.; Majorovits, B.; Keller, C.; Mei, D.; Wang, G.; Wei, W.

    2015-05-01

    The 2nd workshop on Germanium (Ge) detectors and technology was held at the University of South Dakota on September 14-17, 2014, with more than 113 participants from 8 countries, 22 institutions, 15 national laboratories, and 8 companies. The participants represented the following big projects: (1) GERDA and Majorana for the search for neutrinoless double-beta decay (0νββ); (2) SuperCDMS, EDELWEISS, CDEX, and CoGeNT for the search for dark matter; (3) TEXONO for sub-keV neutrino physics; (4) AGATA and GRETINA for gamma tracking; (5) AARM and others for low background radiation counting; and (6) PNNL and LBNL for applications of Ge detectors in homeland security. All participants expressed a strong desire for a better understanding of Ge detector performance and for advancing Ge technology for large-scale applications. The purpose of this workshop was to leverage the unique aspects of the underground laboratories in the world and the germanium (Ge) crystal growing infrastructure at the University of South Dakota (USD) by bringing researchers from several institutions taking part in the Experimental Program to Stimulate Competitive Research (EPSCoR) together with key leaders from international laboratories and prestigious universities working at the forefront of the intensity frontier to advance underground physics, focusing on the searches for dark matter, neutrinoless double-beta decay (0νββ), and neutrino properties. The goal of the workshop was to develop opportunities for EPSCoR institutions to play key roles in the planned world-class research experiments, and to integrate individual talents and existing research capabilities, from multiple disciplines and multiple institutions, to develop research collaborations, including EPSCoR institutions from South Dakota, North Dakota, Alabama, Iowa, and South Carolina, to support multi-ton scale experiments in the future.
The topic areas covered in the workshop were: 1) science related to Ge-based detectors and technology; 2) Ge zone refining and crystal growth; 3) Ge detector development; 4) Ge-oriented business and applications; 5) Ge recycling and recovery; 6) introduction to underground sciences for young scientists; and 7) introduction of experimental techniques for low background experiments to young scientists. Sections 1-5 were dedicated to Ge detectors and technology. Each topic was complemented with a panel discussion on challenges, critical measures, and R&D activities. Sections 6-7 provided students and postdocs an opportunity to understand fundamental principles of underground sciences and experimental techniques in low background experiments. For these two sections, well-known scientists in the field were invited to give lectures, and young scientists were given the opportunity to make presentations on their own research activities. Fifty-six invited talks were delivered during the three-day workshop. Many critical questions were addressed, not only in the specific talks but also in the panel discussions. Details of the panel discussions, as well as conference photos, the list of committees and the workshop website, can be found in the PDF.

  19. EVENT GENERATION OF STANDARD MODEL HIGGS DECAY TO DIMUON PAIRS USING PYTHIA SOFTWARE

    CERN Document Server

    Yusof, Adib

    2015-01-01

    My project for the CERN Summer Student Programme 2015 is on Event Generation of Standard Model Higgs Decay to Dimuon Pairs using the Pythia Software. Briefly, Pythia (specifically, Pythia 8.1) is a program for the generation of high-energy physics events, able to describe collisions at any given energy between elementary particles such as electrons, positrons, protons and antiprotons. It contains theory and models for a number of physics aspects, including hard and soft interactions, parton distributions, initial-state and final-state parton showers, multiparton interactions, fragmentation and decay. All programming code is written in C++ for this version (the previous version used FORTRAN) and can be linked to the ROOT software for displaying output in the form of histograms. For my project, I need to generate events for the Standard Model Higgs boson decaying into muon and anti-muon pairs (H → μ+μ−) to study the expected significance value for this particular process at a centre-of-mass energy of 13 TeV...

  20. A proposed metamodel for the implementation of object oriented software through the automatic generation of source code

    Directory of Open Access Journals (Sweden)

    CARVALHO, J. S. C.

    2008-12-01

    Full Text Available During the development of software, one of the most visible risks and perhaps the biggest implementation obstacle relates to time management. All delivery deadlines for software versions must be met, but this is not always possible, sometimes due to delays in coding. This paper presents a metamodel for software implementation, which will give rise to a development tool for automatic generation of source code, in order to make any development pattern transparent to the programmer, significantly reducing the time spent coding the artifacts that make up the software.
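The idea of generating source code from a metamodel can be sketched in miniature. This is a hedged, hypothetical example (the entity format and template are invented for illustration; they are not the paper's metamodel): a declarative description of an entity is rendered to class source and compiled at run time, so the programmer never hand-writes the boilerplate.

```python
TEMPLATE = '''class {name}:
    """Generated from the metamodel entry for {name}."""
    def __init__(self, {args}):
{assigns}
'''

def generate_class(entity):
    """Render Python source for one metamodel entity, e.g.
    {"name": "Customer", "fields": ["id", "email"]}."""
    fields = entity["fields"]
    args = ", ".join(fields)
    assigns = "\n".join(f"        self.{f} = {f}" for f in fields)
    return TEMPLATE.format(name=entity["name"], args=args, assigns=assigns)

def load_generated(entity):
    """Compile the generated source and return the new class object."""
    ns = {}
    exec(generate_class(entity), ns)
    return ns[entity["name"]]
```

A real generator of this kind emits whole artifacts (persistence code, interfaces) from richer model entries, but the pipeline is the same: model in, source out, no hand-written boilerplate.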

  1. Proceedings 2nd Workshop on GRAPH Inspection and Traversal Engineering

    OpenAIRE

    Wijs, Anton; Bošnački, Dragan; Edelkamp, Stefan

    2013-01-01

    These are the proceedings of the Second Workshop on GRAPH Inspection and Traversal Engineering (GRAPHITE 2013), which took place on March 24, 2013 in Rome, Italy, as a satellite event of the 16th European Joint Conferences on Theory and Practice of Software (ETAPS 2013). The topic of the GRAPHITE workshop is graph analysis in all its forms in computer science. Graphs are used to represent data in many application areas, and they are subjected to various computational algor...

  2. Severe weather phenomena: SQUALL LINES The case of July 2nd 2009

    Science.gov (United States)

    Paraschivescu, Mihnea; Tanase, Adrian

    2010-05-01

    Among dangerous meteorological phenomena, wind intensity plays an important role in producing negative effects on the economy and social activities, particularly when the wind is about to turn into a storm. During the past years one can notice an increase in wind frequency and intensity due to climate change and, consequently, in extreme meteorological phenomena, not only at the planetary level but also at the regional one. Although dangerous meteorological phenomena cannot be avoided, since they are natural, they can nevertheless be anticipated, and decision-making institutions and mass media can be informed. This is the reason why, in this paper, we set out to identify the synoptic conditions that led to the occurrence of the severe storm in Bucharest on July 2nd, 2009, as well as the patterns that generate such cases. At the same time, we sought to identify indications, especially from radar data, that could improve the time interval between the nowcasting warning and the actual occurrence of the phenomenon.

  3. Evolution of a Reconfigurable Processing Platform for a Next Generation Space Software Defined Radio

    Science.gov (United States)

    Kacpura, Thomas J.; Downey, Joseph A.; Anderson, Keffery R.; Baldwin, Keith

    2014-01-01

    The National Aeronautics and Space Administration (NASA)/Harris Ka-Band Software Defined Radio (SDR) is the first fully reprogrammable, space-qualified SDR operating in the Ka-Band frequency range. Providing considerably higher data communication rates than previously possible, this SDR offers in-orbit reconfiguration, multi-waveform operation, and fast deployment due to its highly modular hardware and software architecture. Currently in operation on the International Space Station (ISS), this new paradigm of reconfigurable technology is enabling experimenters to investigate navigation and networking in the space environment. The modular SDR and the NASA-developed Space Telecommunications Radio System (STRS) architecture standard are the basis for Harris' reusable digital signal processing space platform, trademarked as AppSTAR. As a result, two new space radio products are a synthetic aperture radar payload and an Automatic Dependent Surveillance-Broadcast (ADS-B) receiver. In addition, Harris is currently developing many new products similar to the Ka-Band software defined radio for other applications. For NASA's next-generation flight Ka-Band radio development, leveraging these advancements could lead to a more robust and more capable software defined radio. The space environment imposes special considerations, different from terrestrial applications, that must be taken into account for any system operated in space. Each space mission has unique requirements that can make these systems one of a kind, and such requirements can yield products that are expensive and see limited reuse. Space systems put a premium on size, weight and power. A key trade is the amount of reconfigurability in a space system: the more reconfigurable the hardware platform, the easier it is to adapt the platform to the next mission, which reduces non-recurring engineering costs. However, more reconfigurable platforms often use more spacecraft resources. Software has similar considerations to hardware. Having an architecture standard promotes reuse of software and firmware. Space platforms have limited processor capability, which makes the trade on the amount of flexibility paramount.

  4. ANT: Software for Generating and Evaluating Degenerate Codons for Natural and Expanded Genetic Codes.

    Science.gov (United States)

    Engqvist, Martin K M; Nielsen, Jens

    2015-08-21

    The Ambiguous Nucleotide Tool (ANT) is a desktop application that generates and evaluates degenerate codons. Degenerate codons are used to represent DNA positions that have multiple possible nucleotide alternatives. This is useful for protein engineering and directed evolution, where primers specified with degenerate codons are used as a basis for generating libraries of protein sequences. ANT is intuitive and can be used in a graphical user interface or by interacting with the code through a defined application programming interface. ANT comes with full support for nonstandard, user-defined, or expanded genetic codes (translation tables), which is important because synthetic biology is being applied to an ever widening range of natural and engineered organisms. The Python source code for ANT is freely distributed so that it may be used without restriction, modified, and incorporated in other software or custom data pipelines. PMID:25901796
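
    The core operation ANT builds on can be sketched briefly: a degenerate codon is a three-letter pattern over IUPAC ambiguity codes, and evaluating it means enumerating the concrete codons it denotes. The sketch below uses the standard IUPAC table; it is not ANT's actual API:

    ```python
    from itertools import product

    # Standard IUPAC nucleotide ambiguity codes.
    IUPAC = {
        "A": "A", "C": "C", "G": "G", "T": "T",
        "R": "AG", "Y": "CT", "S": "CG", "W": "AT",
        "K": "GT", "M": "AC", "B": "CGT", "D": "AGT",
        "H": "ACT", "V": "ACG", "N": "ACGT",
    }

    def expand_degenerate_codon(codon: str) -> list[str]:
        """List every concrete codon a degenerate codon stands for."""
        return ["".join(p) for p in product(*(IUPAC[b] for b in codon.upper()))]

    # NNK is a common library-design codon: 4 * 4 * 2 = 32 variants.
    variants = expand_degenerate_codon("NNK")
    print(len(variants))  # 32
    ```

    Evaluating a codon for library design then reduces to translating each expanded variant with the chosen translation table and scoring the resulting amino-acid coverage and redundancy.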

  5. Meteosat Second Generation station: processing software and computing architecture; Estacion de Recepcion de Imagenes del Satelite Meteosat Segunda generacion: Arquitectura Informatica y Software de Proceso

    Energy Technology Data Exchange (ETDEWEB)

    Martin, L.; Cony, M.; Navarro, A. A.; Zarzalejo, L. F.; Polo, J.

    2010-05-01

    The Renewable Energy Division of CIEMAT houses a dedicated station for receiving Meteosat Second Generation images, which is of interest for ongoing work on solar radiation derived from satellite images. The complexity, the huge amount of information received, and the particular characteristics of the MSG images motivated the design and development of a specific computing structure and associated software for better and more suitable use of the images. This document describes that structure and software. (Author) 8 refs.

  6. Analysis and software development for controlling RF signal generator proton cyclotron Decy-13 using DDS Technique

    International Nuclear Information System (INIS)

    Analysis and development of computer programs for controlling the radio-frequency (RF) signal generator of the Decy-13 proton cyclotron have been carried out. The signal generator uses a Direct Digital Synthesiser (DDS) technique, whose settings must be made through software. The signal generator consists of the following electronic modules: a DDS, an ATmega16 microcontroller, an RF amplifier, and a ±12 Vdc power supply. The function of the programs is to configure the DDS module (output frequency, frequency step, and phase settings) and to display the operating parameters of the DDS and the RF amplifier on the monitor screen. The computer programs were created with Visual Basic and have been tested to control the RF signal generator by sending data serially to the ATmega16 module and receiving data to be displayed on the monitor screen. Sending and receiving of data was tested at baud rates from 1200 bps to 19200 bps with perfect results. The programs include a Human Machine Interface for entering parameter values for DDS operation. (author)
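
    The abstract does not name the DDS chip or reference clock, but for a typical phase-accumulator DDS the "output frequency" setting sent over the serial link is a frequency tuning word (FTW). A hedged sketch of that computation (the 32-bit accumulator and 125 MHz clock below are assumptions for illustration, not the Decy-13 hardware):

    ```python
    def dds_tuning_word(f_out_hz: float, f_clk_hz: float, bits: int = 32) -> int:
        """Frequency tuning word for a phase-accumulator DDS.

        f_out = FTW * f_clk / 2**bits  =>  FTW = f_out * 2**bits / f_clk
        """
        ftw = round(f_out_hz * (1 << bits) / f_clk_hz)
        if not 0 <= ftw < (1 << bits):
            raise ValueError("requested frequency out of range")
        return ftw

    # Illustrative: a 13 MHz output from an assumed 125 MHz reference clock.
    ftw = dds_tuning_word(13e6, 125e6)
    print(ftw, ftw.to_bytes(4, "big").hex())
    ```

    A controller program like the one described would serialize this word (plus phase and step settings) into the byte frames the microcontroller forwards to the DDS module.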

  7. 2nd International Conference on Computer and Communication Technologies

    CERN Document Server

    Raju, K; Mandal, Jyotsna; Bhateja, Vikrant

    2016-01-01

    The book covers all aspects of computing, communication, general sciences and educational research presented at the Second International Conference on Computer & Communication Technologies, held during 24-26 July 2015 at Hyderabad. It was hosted by CMR Technical Campus in association with Division-V (Education & Research) CSI, India. After a rigorous review, only quality papers were selected and included in this book. The entire work is divided into three volumes covering a variety of topics, including medical imaging, networks, data mining, intelligent computing, software design, image processing, mobile computing, digital signals and speech processing, video surveillance and processing, web mining, wireless sensor networks, circuit analysis, fuzzy systems, antenna and communication systems, biomedical signal processing and applications, cloud computing, embedded systems applications and cyber security and digital forensics. The readers of these volumes will be highly benefited from the te...

  8. Thermoluminescent characteristics of ZrO{sub 2}:Nd films; Caracteristicas termoluminiscentes de peliculas de ZrO{sub 2}:Nd

    Energy Technology Data Exchange (ETDEWEB)

    Vera B, G.; Rivera M, T. [Escuela Superior de Ingenieria Mecanica y Electrica-IPN, 04430 Mexico D.F. (Mexico); Azorin N, J. [Departamento de Fisica, Universidad Autonoma Metropolitana-Iztapalapa, 09340 Mexico D.F. (Mexico); Falcony G, C. [Centro de Investigacion y Estudios Avanzados-IPN, 07000 Mexico D.F. (Mexico); Garcia H, M.; Martinez S, E. [Instituto de Investigaciones en Materiales-UNAM, C.P. 04510 Mexico D.F. (Mexico)

    2002-07-01

    This work presents the results obtained after analysing the photoluminescent and thermoluminescent characteristics of zirconium oxide activated with neodymium (ZrO{sub 2}:Nd) and its possible application in UV radiation dosimetry. The experiments studied characteristics such as the optimum thermal erasing treatment, the influence of light on the response, the response as a function of wavelength, fading of the stored information, the effect of temperature, the response as a function of time, and the reproducibility of the response. The results show that ZrO{sub 2}:Nd is a promising material for use as a TL dosemeter for UV radiation. (Author)

  9. Quetzalcoatl: Una Herramienta para Generar Contratos de Desarrollo de Software en Entornos de Outsourcing / Quetzalcoatl: A Tool for Generating Software Development Contracts in Outsourcing Environments

    Scientific Electronic Library Online (English)

    Jezreel, Mejía; Sergio D., Ixmatlahua; Alma I., Sánchez.

    2014-03-01

    Full Text Available Nowadays, outsourcing is one of the most important work activities for software development companies. However, the relationships between a client and a service provider are often not strong enough to meet the expectations of the agreements. The outsourcing contract for software development projects is an alternative to this type of relationship. This paper presents the architecture of the tool named Quetzalcoatl, as well as the main functions the tool offers, with the objective of generating and evaluating contracts for software development projects in outsourcing environments to support SMEs.

  10. Pragmatics Annotated Coloured Petri Nets for Protocol Software Generation and Verification

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge; Kristensen, Lars Michael

    2014-01-01

    This paper presents the formal definition of Pragmatics Annotated Coloured Petri Nets (PA-CPNs). PA-CPNs represent a class of Coloured Petri Nets (CPNs) that are designed to support automated code generation of protocol software. PA-CPNs restrict the structure of CPN models and allow Petri net elements to be annotated with so-called pragmatics, which are exploited for code generation. The approach and tool for generating code is called PetriCode and has already been discussed and evaluated in earlier work. The contribution of this paper is to give a formal definition for PA-CPNs; in addition, we show how the structural restrictions of PA-CPNs can be exploited to make the verification of the modelled protocols more efficient. This is done by automatically deriving progress measures for the sweep-line method, and by introducing so-called service testers, which can be used to control the part of the state space that is to be explored for verification purposes.

  11. Next generation hyper-scale software and hardware systems for big data analytics

    CERN Document Server

    CERN. Geneva

    2013-01-01

    Building on foundational technologies such as many-core systems, non-volatile memories and photonic interconnects, we describe some current technologies and future research to create real-time, big data analytics, IT infrastructure. We will also briefly describe some of our biologically-inspired software and hardware architecture for creating radically new hyper-scale cognitive computing systems. About the speaker: Rich Friedrich is the director of Strategic Innovation and Research Services (SIRS) at HP Labs. In this strategic role, he is responsible for research investments in nano-technology, exascale computing, cyber security, information management, cloud computing, immersive interaction, sustainability, social computing and commercial digital printing. Rich's philosophy is to fuse strategy and inspiration to create compelling capabilities for next generation information devices, systems and services. Using essential insights gained from the metaphysics of innovation, he effectively leads ...

  12. Next Generation Astronomical Data Processing using Big Data Technologies from the Apache Software Foundation

    Science.gov (United States)

    Mattmann, Chris

    2014-04-01

    In this era of exascale instruments for astronomy we must naturally develop next generation capabilities for the unprecedented data volume and velocity that will arrive from these ground-based sensors and observatories. Integrating scientific algorithms stewarded by scientific groups unobtrusively and rapidly; intelligently selecting data movement technologies; making use of cloud computing for storage and processing; and automatically extracting text, metadata and science from any type of file are all needed capabilities in this exciting time. Our group at NASA JPL has promoted the use of open source data management technologies available from the Apache Software Foundation (ASF) in pursuit of constructing next generation data management and processing systems for astronomical instruments including the Expanded Very Large Array (EVLA) in Socorro, NM and the Atacama Large Millimetre/Submillimetre Array (ALMA); as well as for the KAT-7 project led by SKA South Africa as a precursor to the full MeerKAT telescope. In addition we are funded currently by the National Science Foundation in the US to work with MIT Haystack Observatory and the University of Cambridge in the UK to construct a Radio Array of Portable Interferometric Devices (RAPID) that will undoubtedly draw from the rich technology advances underway. NASA JPL is investing in a strategic initiative for Big Data that is pulling in these capabilities and technologies for astronomical instruments and also for Earth science remote sensing. In this talk I will describe the above collaborative efforts underway and point to solutions in open source from the Apache Software Foundation that can be deployed and used today and that are already bringing our teams and projects benefits. I will describe how others can take advantage of our experience and point towards future application and contribution of these tools.

  13. 2nd international expert meeting straw power; 2. Internationale Fachtagung Strohenergie

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2012-06-15

    Within the 2nd Guelzow expert discussions at 29th to 30th March, 2012 in Berlin (Federal Republic of Germany), the following lectures were held: (1) Promotion of the utilisation of straw in Germany (A. Schuette); (2) The significance of straw in the heat and power generation in EU-27 member states in 2020 and in 2030 under consideration of the costs and sustainability criteria (C. Panoutsou); (3) State of the art of the energetic utilization of hay goods in Europe (D. Thraen); (4) Incineration technological characterisation of straw based on analysis data as well as measured data of large-scale installations (I. Obernberger); (5) Energetic utilization of hay goods in Germany (T. Hering); (6) Actual state of the art towards establishing the first German straw thermal power station (R. Knieper); (7) Straw thermal power plants at agricultural sow farms and poultry farms (H. Heilmann); (8) Country report power from straw in Denmark (A. Evald); (9) Country report power from straw in Poland (J. Antonowicz); (10) Country report power from straw in China (J. Zhang); (11) Energetic utilisation of straw in Czechia (D. Andert); (12) Mobile pelletization of straw (S. Auth); (13) Experiences with the straw thermal power plant from Vattenfall (N. Kirkegaard); (14) Available straw potentials in Germany (potential, straw provision costs) (C. Weiser); (15) Standardization of hay good and test fuels - Classification and development of product standards (M. Englisch); (16) Measures of reduction of emissions at hay good incinerators (V. Lenz); (17) Fermentation of straw - State of the art and perspectives (G. Reinhold); (18) Cellulosis - Ethanol from agricultural residues - Sustainable biofuels (A. Hartmair); (19) Syngas by fermentation of straw (N. Dahmen); (20) Construction using straw (D. Scharmer).

  14. Modeling of wind turbines with doubly fed generator system

    CERN Document Server

    Fortmann, Jens

    2014-01-01

    Jens Fortmann describes the deduction of models for the grid integration of variable speed wind turbines and the reactive power control design of wind plants. The modeling part is intended as background to understand the theory, capabilities and limitations of the generic doubly fed generator and full converter wind turbine models described in the IEC 61400-27-1 and as 2nd generation WECC models that are used as standard library models of wind turbines for grid simulation software. Focus of the reactive power control part is a deduction of the origin and theory behind the reactive current requ

  15. 2nd International Conference on Robot Intelligence Technology and Applications

    CERN Document Server

    Matson, Eric; Myung, Hyun; Xu, Peter; Karray, Fakhri

    2014-01-01

    We are facing a new technological challenge on how to store and retrieve knowledge and manipulate intelligence for autonomous services by intelligent systems which should be capable of carrying out real world tasks autonomously. To address this issue, robot researchers have been developing intelligence technology (InT) for “robots that think” which is in the focus of this book. The book covers all aspects of intelligence from perception at sensor level and reasoning at cognitive level to behavior planning at execution level for each low level segment of the machine. It also presents the technologies for cognitive reasoning, social interaction with humans, behavior generation, ability to cooperate with other robots, ambience awareness, and an artificial genome that can be passed on to other robots. These technologies are to materialize cognitive intelligence, social intelligence, behavioral intelligence, collective intelligence, ambient intelligence and genetic intelligence. The book aims at serving resear...

  16. Understanding environmental pollution: a primer. 2nd ed.

    Energy Technology Data Exchange (ETDEWEB)

    Marquita K. Hill [University of Maine, Orono, MN (United States)

    2004-08-15

    The book moves from the definition of pollution and how pollutants behave, to air and water pollution basics, pollution and global change, solid waste, and pollution in the home. It also discusses persistent and bioaccumulative chemicals, and pesticides, and it places greater stress on global pollutants. The relationship between energy generation and use, and pollution is stressed, as well as the importance of going beyond pollution control, to pollution prevention. Impacts on human and environmental health are emphasized throughout. Contents are: 1. Understanding pollution; 2. Reducing pollution; 3. Chemical toxicity; 4. Chemical exposures and risk assessment; 5. Ambient air pollution; 6. Acid deposition; 7. Global climate change; 8. Stratospheric ozone depletion; 9. Water pollution; 10. Drinking water; 11. Solid waste; 12. Hazardous waste; 13. Energy; 14. Persistent, bioaccumulative and toxic; 15. Metals; 16. Pesticides; 17. Pollution at home; and 18. Zero waste, zero emissions. 69 figs., 42 tabs.

  17. 2nd Topical Workshop on Laser Technology and Optics Design

    CERN Document Server

    2013-01-01

    Lasers have a variety of applications in particle accelerator operation and will play a key role in the development of future particle accelerators by improving the generation of high brightness electron and exotic ion beams and through increasing the acceleration gradient. Lasers will also make an increasingly important contribution to the characterization of many complex particle beams by means of laser-based beam diagnostics methods. The second LANET topical workshop will address the key aspects of laser technology and optics design relevant to laser application to accelerators. The workshop will cover general optics design, provide an overview of different laser sources and discuss methods to characterize beams in details. Participants will be able to choose from a range of topical areas that go deeper in more specific aspects including tuneable lasers, design of transfer lines, noise sources and their elimination and non-linear optics effects. The format of the workshop will be mainly training-based wit...

  18. The Crest Wing Wave Energy Device : 2nd phase testing

    DEFF Research Database (Denmark)

    Kofoed, Jens Peter; Antonishen, Michael Patrick

    2009-01-01

    This report presents the results of a continuation of an experimental study of the wave energy converting abilities of the Crest Wing wave energy converter (WEC), in the following referred to as 'Phase 2'. The Crest Wing is a WEC that uses its movement in matching the shape of an oncoming wave to generate power. Model tests have been performed using scale models (length scale 1:30), provided by WaveEnergyFyn, in regular and irregular wave states that can be found in Assessment of Wave Energy Devices. Best Practice as used in Denmark (Frigaard et al., 2008). The tests were carried out at Dept. of Civil Engineering, Aalborg University (AAU) in the 3D deep water wave tank. The displacement and force applied to a power take off system, provided by WaveEnergyFyn, were measured and used to calculate mechanical power available to the power take off.

  19. Software for evaluating magnetic induction field generated by power lines: implementation of a new algorithm

    International Nuclear Information System (INIS)

    The Regional Environment Protection Agency of Friuli Venezia Giulia (A.R.P.A. F.V.G., Italy) has analysed existing software designed to calculate the magnetic induction field generated by power lines. As far as the agency's requirements are concerned, the tested programs display some difficulties in the immediate processing of electrical and geometrical data supplied by plant owners, and in certain cases turn out to be inadequate for representing complex configurations of power lines. Furthermore, none of them is preset for cyclic calculation to determine the time evolution of induction in a given exposure area. Finally, the output data are not immediately importable into ArcView, the G.I.S. used by A.R.P.A. F.V.G., and it is not always possible to incorporate the terrain orography to determine the field at specified heights above the ground. P.h.i.d.e.l., an innovative software package, tackles and works out all the above-mentioned problems. The power line wires are represented by polylines, and the field is calculated analytically, with no further approximation, even when several power lines are involved. Therefore the results obtained, when compared with those of other programs, are the closest to experimental measurements. The output data can be used both in G.I.S. and Excel environments, allowing immediate overlaying on digital cartography and determination of the 3 and 10 μT bands, in compliance with the Italian Decree of the President of the Council of Ministers of 8 July 2003. (authors)
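
    For context on the quantities involved (this is not P.h.i.d.e.l.'s analytical algorithm, which handles full 3-D polyline geometries and multi-line superposition): in the simplest textbook case of a single long straight conductor, the induction at perpendicular distance r is B = μ0·I/(2πr), which already shows how compliance bands such as 3 and 10 μT translate into distances from the line. The current and distance below are illustrative values:

    ```python
    import math

    MU0 = 4e-7 * math.pi  # vacuum permeability, T*m/A

    def b_field_long_wire(current_a: float, distance_m: float) -> float:
        """Magnitude of B (tesla) at perpendicular distance from an
        infinitely long straight conductor: B = mu0 * I / (2*pi*r)."""
        return MU0 * current_a / (2 * math.pi * distance_m)

    # Illustrative: a 500 A conductor produces 10 uT at 10 m.
    b_ut = b_field_long_wire(500, 10) * 1e6
    print(round(b_ut, 2))  # 10.0
    ```

    A real evaluation tool integrates contributions of this kind along every conductor segment (catenary sag included) and sums them vectorially over all phases and circuits.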

  20. Proceedings of the 2nd Annual Beaufort Marine Biodiscovery Research Workshop

    OpenAIRE

    Nardello, I.

    2010-01-01

    This publication presents the background and aims of the 2nd Annual Marine Biodiscovery Workshop 2009. Presentations relating to progress achieved in the marine biodiscovery research area through the Irish Beaufort Marine Biodiscovery Research Awards have been captured in extended abstracts.

  1. GWAS: Diabetes mellitus JSNP 2nd stage (GeMDBJ) [gwas

    Lifescience Database Archive (English)

    Full Text Available Diabetes JSNP GeMDBJ : Diabetes mellitus JSNP 2nd stage (GeMDBJ) Study details Disease Name : Ty ... se Comments : This data is originated from GeMDBJ (Genome ... Medicine Database of Japan) , which has been creat ...

  2. GWAS: Gastric cancer JSNP 2nd stage (GeMDBJ) [gwas

    Lifescience Database Archive (English)

    Full Text Available Gastric_cancer JSNP GeMDBJ : Gastric cancer JSNP 2nd stage (GeMDBJ) Study details Disease Name : ... se Comments : This data is originated from GeMDBJ (Genome ... Medicine Database of Japan) , which has been creat ...

  3. GWAS: Bronchial asthma JSNP 2nd stage (GeMDBJ) [gwas

    Lifescience Database Archive (English)

    Full Text Available Bronchial_asthma JSNP GeMDBJ : Bronchial asthma JSNP 2nd stage (GeMDBJ) Study details Disease Na ... se Comments : This data is originated from GeMDBJ (Genome ... Medicine Database of Japan) , which has been creat ...

  4. Methods for the Determination of Chemical Substances in Marine and Estuarine Environmental Matrices - 2nd Edition

    Science.gov (United States)

    This NERL-Cincinnati publication, “Methods for the Determination of Chemical Substances in Marine and Estuarine Environmental Matrices - 2nd Edition” was prepared as the continuation of an initiative to gather together under a single cover a compendium of standardized laborato...

  5. Proceedings of the 2nd Mediterranean Conference on Information Technology Applications (ITA '97)

    International Nuclear Information System (INIS)

    This is the proceedings of the 2nd Mediterranean Conference on Information Technology Applications, held in Nicosia, Cyprus, between 6-7 November, 1997. It contains 16 papers. Two of these fall within the scope of INIS and are dealing with Telemetry, Radiation Monitoring, Environment Monitoring, Radiation Accidents, Air Pollution Monitoring, Diagnosis, Computers, Radiology and Data Processing

  6. 8. Book Review: ‘Broken Bones: Anthropological Analysis of Blunt Force Trauma’ 2 nd edition, 2014

    Directory of Open Access Journals (Sweden)

    R. Gaur

    2014-04-01

    Full Text Available 'Broken Bones: Anthropological Analysis of Blunt Force Trauma' 2nd edition, 2014. Editors: Vicki L. Wedel and Alison Galloway; Publisher: Charles C. Thomas, Illinois. pp 479 + xxiii ISBN: 978-0-398-08768-5 (Hard) ISBN: 978-0-398-08769-2 (eBook)

  7. 8. Book Review: ‘Broken Bones: Anthropological Analysis of Blunt Force Trauma’ 2 nd edition, 2014

    OpenAIRE

    Gaur, R.

    2014-01-01

    'Broken Bones: Anthropological Analysis of Blunt Force Trauma' 2nd edition, 2014. Editors: Vicki L. Wedel and Alison Galloway; Publisher: Charles C. Thomas, Illinois. pp 479 + xxiii ISBN: 978-0-398-08768-5 (Hard) ISBN: 978-0-398-08769-2 (eBook)

  8. Proceedings of the 2nd symposium on valves for coal conversion and utilization

    Energy Technology Data Exchange (ETDEWEB)

    Maxfield, D.A. (ed.)

    1981-01-01

    The 2nd symposium on valves for coal conversion and utilization was held October 15 to 17, 1980. It was sponsored by the US Department of Energy, Morgantown Energy Technology Center, in cooperation with the Valve Manufacturers Association. Seventeen papers have been entered individually into EDB and ERA. (LTN)

  9. Benchmarks on automated system and software generation higher flexibility increased productivity and shorter time-to-market by ScaPable software

    Science.gov (United States)

    Gerlich, Rainer

    2002-07-01

    "ScaPable" is an acronym derived from "scalable" and "portable". The attribute "scalable" indicates that specific application software can be built automatically from scratch and verified without writing any statement in a programming language like C, thereby covering a large variety of embedded and/or distributed applications. The term "portable" addresses the capability to automatically port parts of such an application from one physical node to another (the processor and operating system type may change), requiring only the names of the nodes, their processor type and operating system. In this way the infrastructure of an embedded/distributed system can be built just by providing the literals and figures that define the system interaction, communication, topology and performance. Moreover, dedicated application software, such as that needed for on-board command handling, data acquisition and processing, and telemetry handling, can be built from generic templates. Generation times range from less than one second up to about twenty minutes on a PC/Linux platform (800 MHz). Thanks to this extremely short generation time, risks can be identified early, because the executable application is immediately available for validation. A rough estimation shows that one hour of automated system and software generation is equivalent to about 5 to 50 man-years. Currently, about 50% of a typical space embedded system can be covered by the available automated approach; however, the more it is applied, the more can be covered by automation. A system is constructed by applying a formal transformation to the small amount of information delivered by the user. This approach is not limited to the space domain, although the first industrial application was a space project; quite different domains can take advantage of such principles of system construction. This paper explains the approach, compares it with other approaches, and provides figures on productivity, duration of system generation and reliability.

  10. Towards a "2nd Generation" of Quality Labels: a Proposal for the Evaluation of Territorial Quality Marks / Vers une «2ème génération» de labels de qualité: une proposition pour l'évaluation des marques de qualité territoriale / Hacia una «2ª generación» de sellos de calidad: una propuesta para la evaluación de las marcas de calidad territorial

    Scientific Electronic Library Online (English)

    Eduardo, Ramos; Dolores, Garrido.

    2014-12-01

    Full Text Available Recent literature analyses the role of territorial specificities as the core of territorial rural development strategies based on differentiation. Unfortunately, the proliferation of quality assurance schemes is provoking a "labyrinth of labels" which diffuses the local efforts to capitalize on rural specificities. A second generation of labels is currently being developed to simplify the territorial differentiation message. A number of territories in Southern Europe are basing their rural development strategies on joining the so-called European Territorial Quality Mark (ETQM) Project. This paper proposes an original methodology, designed and developed by the authors, for the evaluation of some of these second-generation labels. The methodology has been validated in 15 rural territories, the pioneers of the ETQM in Spain.

  11. The 2nd to 4th Digit Length Difference and Ratio as Predictors of Hyperandrogenism and Metabolic Syndrome in Females

    Directory of Open Access Journals (Sweden)

    Pınar Yıldız

    2015-03-01

    Full Text Available Objective: In this study we evaluated the usefulness of the 2nd-to-4th (2nd:4th) digit length difference and ratio in determining hyperandrogenism in females, and their relationship with metabolic syndrome. Methods: We designed a cross-sectional clinical study and examined 150 females who visited our clinic; 137 completed the study. We measured blood pressure and anthropometric values. Biochemical parameters associated with metabolic syndrome were also measured. Results: The mean age of our patients was 46.1 years. The 2nd:4th digit length difference and ratio were slightly correlated with total testosterone levels and positively correlated with free testosterone levels (p=0.028, p=0.016, p=0.003, p=0.016). Sex hormone-binding globulin levels and the 2nd:4th digit length difference and ratio were mildly negatively correlated (p=0.011, p=0.016). No statistically significant differences were found between the 2nd:4th digit length difference and ratio and metabolic syndrome parameters. Thus, the 2nd:4th digit length difference and ratio are significantly correlated with androgens. Conclusion: The 2nd:4th digit length difference and ratio, which are easily measurable values, can be used as important predictors of hyperandrogenism in females. In the present study, the 2nd:4th digit length difference and ratio were not statistically significantly correlated with metabolic syndrome; however, additional studies with a larger group of patients are necessary.

  12. Development of novel software to generate anthropometric norms at perinatal autopsy.

    Science.gov (United States)

    Cain, Matthew D; Siebert, Joseph R; Iriabho, Egiebade; Gruneberg, Alexander; Almeida, Jonas S; Faye-Petersen, Ona Marie

    2015-01-01

    Fetal and infant autopsy yields information regarding cause of death and the risk of recurrence, and it provides closure for parents. A significant number of perinatal evaluations are performed by general practice pathologists or trainees, who often find them time-consuming and/or intimidating. We sought to create a program that would enable pathologists to conduct these examinations with greater ease and to produce reliable, informative reports. We developed software that automatically generates a set of expected anthropometric and organ weight ranges by gestational age (GA)/postnatal age (PA) and a correlative table with the GA/PA that best matches the observed anthropometry. The program highlights measurement and organ weight discrepancies, enabling users to identify abnormalities. Furthermore, a Web page provides options for exporting and saving the data. Pathology residents utilized the program to determine ease of usage and benefits. The average time using conventional methods (ie, reference books and Internet sites) was compared to the average time using our Web page. Average time for novice and experienced residents using conventional methods was 26.7 minutes and 15 minutes, respectively. Using the Web page program, these times were reduced to an average of 3.2 minutes, a statistically significant reduction. Usage by both university and private practice groups is in progress. PMID:25634794
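    The best-match lookup described above can be illustrated by scoring each candidate gestational age by mean absolute z-score over the observed measurements and returning the closest age. All norm values and measurement names below are hypothetical placeholders for illustration, not the program's actual perinatal reference data:

```python
def best_match_age(observed, norms):
    """Return the age whose norm table best matches the observed values.

    observed: {measurement: value}
    norms:    {age: {measurement: (mean, sd)}} -- placeholder values,
              not real perinatal reference data.
    """
    def mean_abs_z(age):
        table = norms[age]
        zs = [abs((observed[m] - mu) / sd)
              for m, (mu, sd) in table.items() if m in observed]
        return sum(zs) / len(zs)
    return min(norms, key=mean_abs_z)

# Hypothetical toy norms for two gestational ages (weeks).
NORMS = {
    24: {"foot_length_mm": (45.0, 3.0), "crown_heel_mm": (320.0, 15.0)},
    28: {"foot_length_mm": (55.0, 3.0), "crown_heel_mm": (380.0, 15.0)},
}
```

    With these toy tables, a case measuring 54.0 mm foot length and 375.0 mm crown-heel length scores far lower z-scores at 28 weeks than at 24, so 28 is returned as the best match.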

  13. Efficient FPGA implementation of 2nd order digital controllers using Matlab/Simulink

    OpenAIRE

    Gupta, Vikas; Khare, K.; Singh, R. P.

    2011-01-01

    This paper explains a method for the design and implementation of digital controller based on Field Programmable Gate Array (FPGA) device. It is more compact, power efficient and provides high speed capabilities as compared to software based PID controllers. The proposed method is based on implementation of Digital controller as digital filters using DSP architectures. The PID controller is designed using MATLAB and Simulink to generate a set of coefficients associated with the desired contro...
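    The core idea, realising a PID controller as a 2nd-order digital filter whose coefficients are precomputed, can be sketched as follows. The backward-Euler velocity form below is one standard discretisation; the coefficients generated by the authors' MATLAB/Simulink flow may differ (an assumption), but the filter structure is the same:

```python
def pid_coefficients(kp, ki, kd, ts):
    """Backward-Euler (velocity-form) PID as a 2nd-order difference
    equation: u[n] = u[n-1] + b0*e[n] + b1*e[n-1] + b2*e[n-2],
    for gains kp, ki, kd and sample time ts."""
    b0 = kp + ki * ts + kd / ts
    b1 = -kp - 2.0 * kd / ts
    b2 = kd / ts
    return b0, b1, b2

def pid_step(state, error, coeffs):
    """Advance the controller one sample; state = (u[n-1], e[n-1], e[n-2])."""
    u_prev, e1, e2 = state
    b0, b1, b2 = coeffs
    u = u_prev + b0 * error + b1 * e1 + b2 * e2
    return u, (u, error, e1)
```

    On an FPGA the three multiply-accumulates of this recurrence map directly onto DSP blocks, which is what makes the digital-filter view of the PID attractive for hardware implementation.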

  14. OASIS4 – a coupling software for next generation earth system modelling

    OpenAIRE

    Redler, R.; Valcke, S.; Ritzdorf, H.

    2010-01-01

    In this article we present a new version of the Ocean Atmosphere Sea Ice Soil coupling software (OASIS4). With this new fully parallel OASIS4 coupler we target the needs of Earth system modelling in its full complexity. The primary focus of this article is to describe the design of the OASIS4 software and how the coupling software drives the whole coupled model system ensuring the synchronization of the different component models. The application programmer interface (API) manages the couplin...

  15. Implementation of a Software-Defined Radio Transceiver on High-Speed Digitizer/Generator SDR14

    OpenAIRE

    Björklund, Daniel

    2012-01-01

    This thesis describes the specification, design and implementation of a software-defined radio system on a two-channel 14-bit digitizer/generator. The multi-stage interpolations and decimations which are required to operate two analog-to-digital converters at 800 megasamples per second (MSps) and two digital-to-analog converters at 1600 MSps from a 25 MSps software-side interface, were designed and implemented. Quadrature processing was used throughout the system, and a combination of fine-tu...

  16. Library perceptions of using social software as blogs in the idea generation phase of service innovations : Lessons from an experiment

    DEFF Research Database (Denmark)

    Scupola, Ada; Nicolajsen, Hanne Westh

    2013-01-01

    This article investigates the use of social software such as blogs to communicate with and to involve users in the idea generation process of service innovations. After a theoretical discussion of user involvement and more specifically user involvement using web-tools with specific focus on blogs, the article reports findings and lessons from a field experiment at a university library. In the experiment, a blog was established to collect service innovation ideas from the library users. The experiment shows that a blog may engage a limited number of users in the idea generation process and generate a useful, but modest amount of ideas.

  17. 2nd-Order CESE Results For C1.4: Vortex Transport by Uniform Flow

    Science.gov (United States)

    Friedlander, David J.

    2015-01-01

    The Conservation Element and Solution Element (CESE) method was used as implemented in the NASA research code ez4d. The CESE method is a time-accurate formulation with flux conservation in both space and time. The method treats the discretized derivatives of space and time identically; while higher-order versions exist, the 2nd-order accurate version was used here. The ez4d code is an unstructured Navier-Stokes solver written in C++ with serial and parallel versions available. As part of its architecture, ez4d can utilize multi-threading and the Message Passing Interface (MPI) for parallel runs.

  18. Revised data for 2nd version of nuclear criticality safety handbook/data collection

    International Nuclear Information System (INIS)

    This paper outlines the data prepared for the 2nd version of the Data Collection of the Nuclear Criticality Safety Handbook. These data are discussed in the order of its preliminary table of contents. The nuclear characteristic parameters (k∞, M², D) were derived, and subcriticality judgment graphs were drawn for eleven kinds of fuels which are often encountered in criticality safety evaluation of fuel cycle facilities. For calculation of criticality data, benchmark calculations using the combination of the continuous-energy Monte Carlo criticality code MVP and the Japanese Evaluated Nuclear Data Library JENDL-3.2 were made. The calculation errors were evaluated for this combination. The implementation of the experimental results obtained by using NUCEF facilities into the 2nd version of the Data Collection is under discussion; therefore, related data were just mentioned. A database is being prepared to retrieve revised data easily. (author)

  19. Crystal structures and phase transformation of deuterated lithium imide, Li2ND

    International Nuclear Information System (INIS)

    We have investigated the crystal structure of deuterated lithium imide, Li2ND, by means of neutron and X-ray diffraction. An order-disorder transition occurs near 360 K. Below that temperature Li2ND can be described to the same level of accuracy as a disordered cubic (Fd-3m) structure with partially occupied Li 32e sites or as a fully occupied orthorhombic (Ima2 or Imm2) structure. The high-temperature phase is best characterized as disordered cubic (Fm-3m) with D atoms randomized over the 192l sites. Density functional theory calculations complement and support the diffraction analyses. We compare our findings in detail with previous studies.

  20. Study of Nd2O3-GeO2-NdPO4 system

    International Nuclear Information System (INIS)

    Phase ratios in the system Nd2O3-GeO2-NdPO4 at temperatures of 1300-1400°C were ascertained by the method of X-ray phase analysis. Three compounds are formed in the system: Nd3GePO9, Nd8GeP2O19, and Nd7Ge2PO17. Roentgenometric data on Nd7Ge2PO17 are given.

  1. Idaho National Laboratory Quarterly Performance Analysis for the 2nd Quarter FY 2015

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, Lisbeth A. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-04-01

    This report is published quarterly by the Idaho National Laboratory (INL) Quality and Performance Management Organization. The Department of Energy (DOE) Occurrence Reporting and Processing System (ORPS), as prescribed in DOE Order 232.2, “Occurrence Reporting and Processing of Operations Information,” requires a quarterly analysis of events, both reportable and not reportable, for the previous 12 months. This report is the analysis of events for the 2nd Qtr FY-15.

  2. The 2nd competition on counter measures to 2D face spoofing attacks

    OpenAIRE

    Chingovska, Ivana; Yang, Jinwei; Lei, Zhen; Yi, Dong; Z.Li, Stan; Kähm, Olga; Damer, Naser; Glaser, Christian; Kuijper, Arjan; Nouak, Alexander; Komulainen, Jukka; de Freitas Pereira, Tiago; Gupta, Shubham; Bansal, Shubham; Khandelwal, Shubham

    2013-01-01

    As a crucial security problem, anti-spoofing in biometrics, and particularly for the face modality, has achieved great progress in the recent years. Still, new threats arrive in form of better, more realistic and more sophisticated spoofing attacks. The objective of the 2nd Competition on Counter Measures to 2D Face Spoofing Attacks is to challenge researchers to create counter measures effectively detecting a variety of attacks. The submitted propositions are evaluated on the Replay-Attack d...

  3. Preface: 2nd Workshop on the State of the Art in Nuclear Cluster Physics

    International Nuclear Information System (INIS)

    The 2nd workshop on the "State of the Art in Nuclear Cluster Physics" (SOTANCP2) took place on May 25-28, 2010, at the Université Libre de Bruxelles (Brussels, Belgium). The first workshop of this series was held in Strasbourg (France) in 2008. The purpose of SOTANCP2 was to promote the exchange of ideas and to discuss new developments in Clustering Phenomena in Nuclear Physics and Nuclear Astrophysics, both from a theoretical and from an experimental point of view.

  4. Production of artificial ionospheric layers by frequency sweeping near the 2nd gyroharmonic

    OpenAIRE

    Pedersen, T; McCarrick, M.; Reinisch, B; WATKINS, B; Hamel, R.; V. Paznukhov

    2011-01-01

    Artificial ionospheric plasmas descending from the background F-region have been observed on multiple occasions at the High Frequency Active Auroral Research Program (HAARP) facility since it reached full 3.6 MW power. Proximity of the transmitter frequency to the 2nd harmonic of the electron gyrofrequency (2fce) has been noted as a requirement for their occurrence, and their disappearance after only a few minutes has been attributed to the increasing...

  5. Mesocosm soil ecological risk assessment tool for GMO 2nd tier studies

    DEFF Research Database (Denmark)

    D'Annibale, Alessandra; Maraldo, Kristine; Larsen, Thomas; Strandberg, Beate; Cortet, Jérôme; Vincze, Éva; Audisio, Paolo Aldo; Krogh, Paul Henning

    2011-01-01

    Ecological Risk Assessment (ERA) of GMO is basically identical to ERA of chemical substances, when it comes to assessing specific effects of the GMO plant material on the soil ecosystem. The tiered approach always includes the option of studying more complex but still realistic ecosystem level effects in 2nd tier caged experimental systems, cf. the new GMO ERA guidance: EFSA Journal 2010; 8(11):1879. We propose to perform a trophic structure analysis, TSA, and include the trophic structure as an...

  6. Application research on enhancing near-infrared micro-imaging quality by 2nd derivative

    Science.gov (United States)

    Wang, Dong; Ma, Zhi-hong; Zhao, Liu; Wang, Bei-hong; Han, Ping; Pan, Li-gang; Wang, Ji-hua

    2013-08-01

    Near-infrared micro-imaging provides not only a sample's spatial distribution information but also spectroscopic information for each pixel. This paper takes an artificial sample with a known distribution of wheat flour and formaldehyde sodium sulfoxylate as an example to study data-processing methods for enhancing the quality of near-infrared micro-imaging. After studying the near-infrared spectroscopic features of wheat flour and formaldehyde sodium sulfoxylate, compare-correlation imaging and 2nd-derivative imaging were applied to process the near-infrared micro-image of the artificial sample. Furthermore, the two methods were combined, yielding 2nd-derivative compare-correlation imaging. The results indicated that the difference between the correlation coefficients of the two substances against the reference spectrum increased from 0.001 in the compare-correlation image to 0.796 in the 2nd-derivative compare-correlation image, which enhances imaging quality efficiently. This study provides a useful reference for near-infrared micro-imaging of agricultural products and foods.
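    The processing chain of derivative-then-correlation imaging can be sketched in a few lines. The central-difference 2nd derivative below stands in for whatever derivative filter the authors used (an assumption); the point is that the derivative suppresses additive baseline offsets before each pixel's spectrum is Pearson-correlated against the reference spectrum:

```python
import math

def second_derivative(spectrum):
    """Central-difference 2nd derivative of a 1-D spectrum (length n-2)."""
    return [spectrum[i - 1] - 2 * spectrum[i] + spectrum[i + 1]
            for i in range(1, len(spectrum) - 1)]

def pearson(a, b):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / math.sqrt(va * vb)

def correlation_image(pixels, reference):
    """Map each pixel's spectrum to its correlation with the reference,
    with both sides taken to the 2nd derivative first."""
    d2_ref = second_derivative(reference)
    return [pearson(second_derivative(px), d2_ref) for px in pixels]
```

    A pixel whose spectrum is the reference plus a constant baseline offset still scores a correlation of 1.0 after the derivative step, which is why the derivative sharpens the contrast between the two substances.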

  7. mbs: modifying Hudson's ms software to generate samples of DNA sequences with a biallelic site under selection

    Directory of Open Access Journals (Sweden)

    Innan Hideki

    2009-05-01

    Full Text Available Abstract Background The pattern of single nucleotide polymorphisms, or SNPs, contains a tremendous amount of information with respect to the mechanisms of the micro-evolutionary process of a species. The inference of the roles of these mechanisms, including natural selection, relies heavily on computer simulations. A coalescent simulation is extremely powerful in generating a large number of samples of DNA sequences from a population (species) when all mutations are neutral, and Hudson's ms software is frequently used for this purpose. However, it has been difficult to incorporate natural selection into the coalescent framework. Results We herein present a software application to generate samples of DNA sequences when there is a biallelic site targeted by selection. This software application, referred to as mbs, is developed by modifying Hudson's ms. The mbs software is so flexible that it can incorporate any arbitrary history of population size changes and any mode of selection, as long as selection is operating on a biallelic site. Conclusion mbs provides opportunities to investigate the effect of any mode of selection on the pattern of SNPs under various demographies.
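    The forward ingredient of such a simulation, an allele-frequency trajectory for a biallelic site under selection on which a structured coalescent can then be conditioned, can be sketched with a simple Wright-Fisher model. This illustrates the concept only and is not mbs's actual algorithm:

```python
import random

def selection_update(p, s):
    """Deterministic one-generation change in allele frequency under
    genic selection with coefficient s."""
    return p * (1 + s) / (p * (1 + s) + (1 - p))

def wf_trajectory(n, s, p0, generations, seed=0):
    """Wright-Fisher trajectory: deterministic selection followed by
    binomial drift in a diploid population of size n (2n gene copies)."""
    rng = random.Random(seed)
    p, traj = p0, [p0]
    for _ in range(generations):
        q = selection_update(p, s)
        # Binomial sampling of 2n gene copies implements genetic drift.
        p = sum(rng.random() < q for _ in range(2 * n)) / (2 * n)
        traj.append(p)
    return traj
```

    In a trajectory-conditioned coalescent, lineages carrying the selected allele are then traced backward through a frequency path like the one this function produces.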

  8. Managing Complexity in Next Generation Robotic Spacecraft: From a Software Perspective

    Science.gov (United States)

    Reinholtz, Kirk

    2008-01-01

    This presentation highlights the challenges in the design of software to support robotic spacecraft. Robotic spacecraft offer a higher degree of autonomy, but ever more capability is required, primarily in the software, while providing the same or a higher degree of reliability. The complexity of designing such an autonomous system is great, particularly when attempting to address the needs for increased capability and high reliability without increased demands on time or money. Efforts to develop programming models for the new hardware and to integrate the software architecture are highlighted.

  9. Utilisation of 2nd generation web technologies in master level vocational teacher training

    OpenAIRE

    Péter Tóth

    2009-01-01

    The Masters level Opportunities and Technological Innovation in Vocational Teacher Education project (project site: http://motivate.tmpk.bmf.hu/) aims to develop the use and management of virtual learning environments in the area of vocational teacher training, drawing on a well established international partnership of institutions providing both technical and educational expertise. This paper gives an overall picture of the first results and products of the collaboration. We touch upon the g...

  10. Reed canary grass as a feedstock for 2nd generation bioethanol production.

    Science.gov (United States)

    Kallioinen, Anne; Uusitalo, Jaana; Pahkala, Katri; Kontturi, Markku; Viikari, Liisa; Weymarn, Niklas von; Siika-Aho, Matti

    2012-11-01

    The enzymatic hydrolysis and fermentation of reed canary grass, harvested in the spring or autumn, and barley straw were studied. Steam-pretreated materials were efficiently hydrolysed by commercial enzymes with a dosage of 10-20 FPU/g d.m. Reed canary grass harvested in the spring was hydrolysed more efficiently than the autumn-harvested reed canary grass. Additional β-glucosidase improved the release of glucose and xylose during the hydrolysis reaction. The hydrolysis rate and level of reed canary grass with a commercial Trichoderma reesei cellulase could be improved by supplementation with purified enzymes; the addition of CBH II improved the hydrolysis level by 10% in 48 hours' hydrolysis. Efficient mixing was shown to be important for hydrolysis already at 10% dry matter consistency. The highest ethanol concentration (20 g/l) and yield (82%) were obtained with reed canary grass at 10% d.m. consistency. PMID:22939601

  11. eLEN2 — 2nd generation eLearning Exchange Networks.

    OpenAIRE

    Panckhurst, Rachel; Marsh, Debra

    2009-01-01

    Since May 2007 the authors of this paper have explored and evaluated the use, including relative merits and challenges of social networking within the context of higher education professional development programmes in France and in Britain (Marsh & Panckhurst, 2007; Panckhurst & Marsh, 2008). A social networking tool was adopted for Masters' level courses in order to try and establish an effective collaborative pedagogical environment and sense of community, by placing students at the centre ...

  12. Utilisation of 2nd generation web technologies in master level vocational teacher training

    Directory of Open Access Journals (Sweden)

    Péter Tóth

    2009-03-01

    Full Text Available The Masters level Opportunities and Technological Innovation in Vocational Teacher Education project (project site: http://motivate.tmpk.bmf.hu/) aims to develop the use and management of virtual learning environments in the area of vocational teacher training, drawing on a well established international partnership of institutions providing both technical and educational expertise. This paper gives an overall picture of the first results and products of the collaboration. We touch upon the goals, the assessments and the learning process of the “Multimedia and e-Learning: e-learning methods and tools” module in detail. The main cooperative and collaborative devices are presented in the virtual learning environment. The communication during collaborative learning, the structured debate on the forum, and the benefits of collaborative learning in a VLE are interpreted at the end of this paper.

  13. Performance of 2nd Generation BaBar Resistive Plate Chambers

    Energy Technology Data Exchange (ETDEWEB)

    Anulli, F.; Baldini, R.; Calcaterra, A.; de Sangro, R.; Finocchiaro, G.; Patteri, P.; Piccolo, M.; Zallo, A.; /Frascati; Cheng, C.H.; Lange, D.J.; Wright, D.M.; /LLNL,; Messner, R.; Wisniewski, William J.; /SLAC; Pappagallo, M.; /Bari U. /INFN, Bari; Andreotti, M.; Bettoni, D.; Calabrese, R.; Cibinetto, G.; Luppi, E.; Negrini, M.; /Ferrara; Capra, R.; /Genoa U. /INFN, Genoa /Naples U. /INFN, Naples /Perugia U. /INFN, Perugia /Pisa U. /INFN, Pisa /Rome U. /INFN, Rome /Oregon U. /UC, Riverside

    2005-07-12

    The BaBar detector has operated nearly 200 Resistive Plate Chambers (RPCs), constructed as part of an upgrade of the forward endcap muon detector, for the past two years. The RPCs experience widely different background and luminosity-driven singles rates (0.01-10 Hz/cm²) depending on position within the endcap. Some regions have integrated over 0.3 C/cm². RPC efficiency measured with cosmic rays is high and stable. The average efficiency measured with beam is also high. However, a few of the highest rate RPCs have suffered efficiency losses of 5-15%. Although constructed with improved techniques and minimal use of linseed oil, many of the RPCs, which are operated in streamer mode, have shown increased dark currents and noise rates that are correlated with the direction of the gas flow and the integrated current. Studies of the above aging effects are presented and correlated with detector operating conditions.

  14. PRODUCTION OF 2ND GENERATION BIOETHANOL FROM LUCERNE – OPTIMIZATION OF HYDROTHERMAL PRETREATMENT

    Directory of Open Access Journals (Sweden)

    Sune Tjalfe Thomsen

    2012-02-01

    Full Text Available Lucerne (Medicago sativa) has many qualities associated with sustainable agriculture, such as nitrogen fixation and high biomass yield. Therefore, there is interest in whether lucerne is a suitable biomass substrate for bioethanol production, and whether hydrothermal pretreatment (HTT) of lucerne improves enzymatic convertibility, providing sufficient enzymatic conversion of carbohydrate to simple sugars for ethanol production. The HTT process was optimised for lucerne hay, and the pretreated biomass was assessed by carbohydrate analysis, inhibitor characterisation of the liquid phases, and simultaneous saccharification and fermentation (SSF) of the whole slurry with Cellubrix enzymes and Saccharomyces cerevisiae yeast. The optimal HTT conditions were 205°C for 5 minutes, resulting in pentose recovery of 81% and an enzymatic convertibility of glucan to monomeric glucose of 74%, facilitating a conversion of 6.2% w/w of untreated material into bioethanol in SSF, which is equivalent to 1,100 litres of ethanol per hectare per year.

  15. PRODUCTION OF 2ND GENERATION BIOETHANOL FROM LUCERNE – OPTIMIZATION OF HYDROTHERMAL PRETREATMENT

    OpenAIRE

    Sune Tjalfe Thomsen; Morten Jensen; Jens Ejbye Schmidt

    2012-01-01

    Lucerne (Medicago sativa) has many qualities associated with sustainable agriculture such as nitrogen fixation and high biomass yield. Therefore, there is interest in whether lucerne is a suitable biomass substrate for bioethanol production, and if hydrothermal pretreatment (HTT) of lucerne improves enzymatic convertibility, providing sufficient enzymatic conversion of carbohydrate to simple sugars for ethanol production. The HTT process was optimised for lucerne hay, and the pretreated bioma...

  16. Methodology and Toolset for Model Verification, Hardware/Software co-simulation, Performance Optimisation and Customisable Source-code generation

    DEFF Research Database (Denmark)

    Berger, Michael Stübert; Soler, José

    2013-01-01

    The MODUS project aims to provide a pragmatic and viable solution that will allow SMEs to substantially improve their positioning in the embedded-systems development market. The MODUS toolset will provide a model verification and hardware/software co-simulation tool (TRIAL) and a performance optimisation and customisable source-code generation tool (TUNE), with the overall aim of automating the modelling and optimisation of embedded-systems development. The toolset will enable model verification by guiding the selection of existing open-source model verification engines, based on automated analysis of system properties, and by producing the inputs to be fed into these engines; interfacing with standard (SystemC) simulation platforms for HW/SW co-simulation; customisable source-code generation that respects coding standards and conventions; and software performance-tuning optimisation through automated design transformations.

  17. System and Component Software Specification, Run-time Verification and Automatic Test Generation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This proposal is for the creation of a system-level software specification and verification tool. This proposal suggests a major leap-forward in usability of...

  18. GSIMF: a web service based software and database management system for the next generation grids

    International Nuclear Information System (INIS)

    To process the vast amount of data from high energy physics experiments, physicists rely on Computational and Data Grids; yet, the distribution, installation, and updating of a myriad of different versions of different programs over the Grid environment is complicated, time-consuming, and error-prone. Our Grid Software Installation Management Framework (GSIMF) is a set of Grid Services that has been developed for managing versioned and interdependent software applications and file-based databases over the Grid infrastructure. This set of Grid services provides a mechanism to install software packages on distributed Grid computing elements, thus automating the software and database installation management process on behalf of the users. This enables users to remotely install programs and tap into the computing power provided by Grids.

  19. GSIMF: a web service based software and database management system for the next generation grids

    Science.gov (United States)

    Wang, N.; Ananthan, B.; Gieraltowski, G.; May, E.; Vaniachine, A.

    2008-07-01

    To process the vast amount of data from high energy physics experiments, physicists rely on Computational and Data Grids; yet, the distribution, installation, and updating of a myriad of different versions of different programs over the Grid environment is complicated, time-consuming, and error-prone. Our Grid Software Installation Management Framework (GSIMF) is a set of Grid Services that has been developed for managing versioned and interdependent software applications and file-based databases over the Grid infrastructure. This set of Grid services provides a mechanism to install software packages on distributed Grid computing elements, thus automating the software and database installation management process on behalf of the users. This enables users to remotely install programs and tap into the computing power provided by Grids.

  20. An Evaluation of Software Distributed Shared Memory for Next-Generation Processors and Networks

    OpenAIRE

    Cox, A. L.; Dwarkadas, S.; Keleher, P.; Zwaenepoel, W

    1993-01-01

    We evaluate the effect of processor speed, network characteristics, and software overhead on the performance of release-consistent software distributed shared memory. We examine five different protocols for implementing release consistency: eager update, eager invalidate, lazy update, lazy invalidate, and a new protocol called lazy hybrid. This lazy hybrid protocol combines the benefits of both lazy update and lazy invalidate. Our simulations indicate that with the processors and networks tha...
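    The distinction between the eager and lazy invalidate protocols compared above can be made concrete with a toy model. Under release consistency, an eager protocol pushes invalidations at release time, while the lazy protocol below defers them until the next acquire. The class names and interfaces are illustrative, not the paper's implementation:

```python
# Toy model of a release-consistent DSM with *lazy* invalidation:
# write notices are logged at release time but applied only when a
# processor performs an acquire.
class Proc:
    def __init__(self, pid, pages):
        self.pid = pid
        self.valid = set(pages)   # pages with a valid cached copy
        self.last_seen = 0        # last global interval synchronized with

class ToyDSM:
    def __init__(self):
        self.notices = []  # global history: (interval, writer_pid, page)
        self.interval = 0

    def release(self, proc, written_pages):
        # Lazy protocol: only log write notices; nothing is sent eagerly.
        for page in written_pages:
            self.notices.append((self.interval, proc.pid, page))
        self.interval += 1

    def acquire(self, proc):
        # Invalidate every page another processor wrote since this
        # processor last synchronized -- the "lazy invalidate" choice.
        for interval, pid, page in self.notices:
            if interval >= proc.last_seen and pid != proc.pid:
                proc.valid.discard(page)
        proc.last_seen = self.interval
```

    In this model, a page written by processor 0 stays valid in processor 1's cache after the release and is invalidated only when processor 1 next acquires; an eager protocol would invalidate it at the release itself, which is the traffic the lazy variants avoid.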

  1. XML (eXtensible Mark-up Language) Industrial Standard, Determining Architecture of the Next Generation of the Internet Software

    CERN Document Server

    Galaktionov, V V

    2000-01-01

    The past year, 1999, became the period in which the new Internet technology XML (eXtensible Mark-up Language) was established: a mark-up language adopted by the WWW Consortium (http://www.w3.org) as a new industrial standard determining the architecture of the next generation of Internet software. This communication presents the results of a study of this technology, the basic capabilities of XML, and rules and recommendations for its application.

  2. Development of high-performance algorithms for a new generation of versatile molecular descriptors. The Pentacle software

    OpenAIRE

    Durán Alcaide, Ángel

    2010-01-01

    The work of this thesis was focused on the development of high-performance algorithms for a new generation of molecular descriptors, with many advantages with respect to its predecessors, suitable for diverse applications in the field of drug design, as well as its implementation in commercial grade scientific software (Pentacle). As a first step, we developed a new algorithm (AMANDA) for discretizing molecular interaction fields which allows extracting from them the most interesting regions...

  3. Sustainable development - a role for nuclear power? 2nd scientific forum

    International Nuclear Information System (INIS)

    The 2nd Scientific Forum of the International Atomic Energy Agency (IAEA) was held during the 43rd General Conference. This paper summarizes the deliberations of the two-day Forum. The definition of 'sustainable development' of the 1987 Bruntland Commission - 'development that meets the needs of the present without compromising the ability of future generations to meet their own needs' - provided the background for the Forum's debate whether and how nuclear power could contribute to sustainable energy development. The framework for this debate comprises different perspectives on economic, energy, environmental, and political considerations. Nuclear power, along with all energy generating systems, should be judged on these considerations using a common set of criteria (e.g., emission levels, economics, public safety, wastes, and risks). First and foremost, there is a growing political concern over the possible adverse impact of increasing emissions of greenhouse gases from fossil fuel combustion. However, there is debate as to whether this would have any material impact on the predominantly economic criteria currently used to make investment decisions on energy production. According to the views expressed, the level of safety of existing nuclear power plants is no longer a major concern - a view not yet fully shared by the general public. The need to maintain the highest standards of safety in operation remains, especially under the mounting pressure of competitiveness in deregulated and liberalized energy markets. The industry must continuously reinforce a strong safety culture among reactor designers, builders, and operators. Furthermore, a convincing case for safety will have to be made for any new reactor designs. Of greater concern to the public and politicians are the issues of radioactive waste and proliferation of nuclear weapons. 
There is a consensus among technical experts that radioactive wastes from nuclear power can be disposed of safely and economically in deep geologic formations. However, the necessary political decisions to select sites for repositories need public support and understanding about what the industry is doing and what can be done. As to nuclear weapons proliferation, the existing safeguards system must be fully maintained and strengthened and inherently proliferation-resistant fuel cycles should be explored. Overviews of the future global energy demand and of the prospects for nuclear power in various economic regions of the world indicate that, in the case of the OECD countries, the dominant issue is economics in an increasingly free market system for electricity. For the so-called transition economies, countries of the Former Soviet Union and Central and Eastern Europe, the issue is one of managing nuclear power plant operations safely. In the case of developing countries, the dominant concern is effective management of technology, in addition to economics and finance. The prospects for nuclear power depend on the resolution of two cardinal issues. The first is economic competitiveness, and in particular, reduced capital cost. The second is public confidence in the ability of the industry to manage plant operations and its high level waste safely. There is a continuing need for dialogue and communication with all sectors of the public: economists, investors, social scientists, politicians, regulators, unions, and environmentalists. Of help in this dialogue would be nuclear power's relevance to and comparative advantages in addressing environmental issues, such as global climate change, local air quality, and regional acidification. 
Suggestions have been made for a globalized approach to critical nuclear power issues, such as waste management, innovative and proliferation-resistant reactors and fuel cycles, and international standards for new generation nuclear reactor designs. The conclusion seems to be that there is a role for nuclear energy in sustainable development, especially if greenhouse gas emissions are to be limited. Doubts persist in the minds of many energy experts over the pote

  4. Decision-Support Software for Grid Operators: Transmission Topology Control for Infrastructure Resilience to the Integration of Renewable Generation

    Energy Technology Data Exchange (ETDEWEB)

    None

    2012-03-16

    GENI Project: The CRA team is developing control technology to help grid operators more actively manage power flows and integrate renewables by optimally turning on and off entire power lines in coordination with traditional control of generation and load resources. The control technology being developed would provide grid operators with tools to help manage transmission congestion by identifying the facilities whose on/off status must change to lower generation costs, increase utilization of renewable resources and improve system reliability. The technology is based on fast optimization algorithms for the near to real-time change in the on/off status of transmission facilities and their software implementation.
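As a toy illustration of the topology-control idea described above, the sketch below brute-forces the on/off status of a few switchable lines against a made-up generation-cost proxy. The line names, cost numbers, and guard logic are all invented for illustration; the actual project uses fast optimization algorithms rather than enumeration.

```python
from itertools import product

SWITCHABLE = ["L1", "L2", "L3"]  # hypothetical switchable transmission lines

def dispatch_cost(status):
    """Hypothetical cost proxy: congestion penalties depend on line status."""
    cost = 100.0
    if not status["L1"]:          # pretend L1 relieves congestion when in service
        cost += 25.0
    if status["L3"]:              # pretend L3 causes a costly loop flow
        cost += 10.0
    if not any(status.values()):  # reliability guard: never open every line
        cost += 1e6
    return cost

# enumerate every on/off combination and keep the cheapest one
best = min(
    (dict(zip(SWITCHABLE, bits)) for bits in product([True, False], repeat=3)),
    key=dispatch_cost,
)
print(best, dispatch_cost(best))
```

A production tool would replace the brute-force loop with a mixed-integer optimization over a DC power-flow model, which is what makes near-real-time switching decisions feasible on real networks.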

  5. User's manual for the UNDERDOG [Underground Nuclear Depository Evaluation, Reduction, and Detailed Output Generator] data reduction software

    International Nuclear Information System (INIS)

    UNDERDOG is a computer program that aids experimentalists in the process of data reduction. This software allows a user to reduce, extract, and generate displays of data collected at the WIPP site. UNDERDOG contains three major functional components: a Data Reduction package, a Data Analysis interface, and a Publication-Quality Output generator. It also maintains audit trails of all actions performed for quality assurance purposes and provides mechanisms which control an individual's access to the data. UNDERDOG was designed to run on a Digital Equipment Corporation VAX computer using the VMS operating system. 8 refs., 24 figs., 2 tabs

  6. User's manual for the UNDERDOG (Underground Nuclear Depository Evaluation, Reduction, and Detailed Output Generator) data reduction software

    Energy Technology Data Exchange (ETDEWEB)

    Ball, J.R.; Shepard, L.K.

    1987-12-01

    UNDERDOG is a computer program that aids experimentalists in the process of data reduction. This software allows a user to reduce, extract, and generate displays of data collected at the WIPP site. UNDERDOG contains three major functional components: a Data Reduction package, a Data Analysis interface, and a Publication-Quality Output generator. It also maintains audit trails of all actions performed for quality assurance purposes and provides mechanisms which control an individual's access to the data. UNDERDOG was designed to run on a Digital Equipment Corporation VAX computer using the VMS operating system. 8 refs., 24 figs., 2 tabs.

  7. 2nd International Doctoral Symposium on Applied Computation and Security Systems

    CERN Document Server

    Cortesi, Agostino; Saeed, Khalid; Chaki, Nabendu

    2016-01-01

    The book contains the extended version of the works that have been presented and discussed in the Second International Doctoral Symposium on Applied Computation and Security Systems (ACSS 2015) held during May 23-25, 2015 in Kolkata, India. The symposium has been jointly organized by the AGH University of Science & Technology, Cracow, Poland; Ca’ Foscari University, Venice, Italy and University of Calcutta, India. The book is divided into volumes and presents dissertation works in the areas of Image Processing, Biometrics-based Authentication, Soft Computing, Data Mining, Next Generation Networking and Network Security, Remote Healthcare, Communications, Embedded Systems, Software Engineering and Service Engineering.

  8. OASIS4 – a coupling software for next generation earth system modelling

    Directory of Open Access Journals (Sweden)

    R. Redler

    2009-07-01

    In this article we present a new version of the Ocean Atmosphere Sea Ice Soil coupling software (OASIS4). With this new fully parallel OASIS4 coupler we target the needs of Earth system modelling in its full complexity. The primary focus of this article is to describe the design of the OASIS4 software and how the coupling software drives the whole coupled model system, ensuring the synchronization of the different component models. The application programmer interface (API) manages the coupling exchanges between arbitrary climate component models, as well as the input and output from and to files of each individual component. The OASIS4 Transformer instance performs the parallel interpolation and transfer of the coupling data between source and target model components. As a new core technology for the software, the fully parallel search algorithm of OASIS4 is described in detail. First benchmark results are discussed with simple test configurations to demonstrate the efficiency and scalability of the software when applied to Earth system model components. Typically the compute time needed to perform the search is on the order of a few seconds and is only weakly dependent on the grid size.
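As a loose illustration of what a coupler performs at each coupling step, the sketch below regrids a field from one toy component grid to another using nearest-neighbour lookup. The grids, field values, and the `interpolate` helper are all invented; the real OASIS4 software performs parallel search and interpolation far more efficiently and supports many regridding schemes.

```python
def interpolate(src_x, src_vals, tgt_x):
    """Nearest-neighbour regridding from a source grid to a target grid,
    standing in for the parallel interpolation a coupler's Transformer performs."""
    out = []
    for x in tgt_x:
        # index of the closest source grid point
        i = min(range(len(src_x)), key=lambda j: abs(src_x[j] - x))
        out.append(src_vals[i])
    return out

# Two toy "component models" on different 1-D grids, coupled once per step.
ocean_grid = [0.0, 1.0, 2.0, 3.0]
atmos_grid = [0.5, 2.5]
sst = [280.0, 281.0, 282.0, 283.0]   # ocean surface temperature, invented values

# one coupling exchange: ocean -> atmosphere
sst_on_atmos = interpolate(ocean_grid, sst, atmos_grid)
print(sst_on_atmos)  # [280.0, 282.0] (ties resolve to the lower grid point)
```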

  9. OASIS4 – a coupling software for next generation earth system modelling

    Directory of Open Access Journals (Sweden)

    R. Redler

    2010-01-01

    In this article we present a new version of the Ocean Atmosphere Sea Ice Soil coupling software (OASIS4). With this new fully parallel OASIS4 coupler we target the needs of Earth system modelling in its full complexity. The primary focus of this article is to describe the design of the OASIS4 software and how the coupling software drives the whole coupled model system, ensuring the synchronization of the different component models. The application programmer interface (API) manages the coupling exchanges between arbitrary climate component models, as well as the input and output from and to files of each individual component. The OASIS4 Transformer instance performs the parallel interpolation and transfer of the coupling data between source and target model components. As a new core technology for the software, the fully parallel search algorithm of OASIS4 is described in detail. First benchmark results are discussed with simple test configurations to demonstrate the efficiency and scalability of the software when applied to Earth system model components. Typically the compute time needed to perform the search is on the order of a few seconds and is only weakly dependent on the grid size.

  10. Proceedings of the 2nd Workshop on Robots in Clutter: Preparing robots for the real world (Berlin, 2013)

    OpenAIRE

    Zillich, Michael; Bennewitz, Maren; Fox, Maria; Piater, Justus; Pangercic, Dejan

    2013-01-01

    This volume represents the proceedings of the 2nd Workshop on Robots in Clutter: Preparing robots for the real world, held June 27, 2013, at the Robotics: Science and Systems conference in Berlin, Germany.

  11. Crystal structure of the new complex perovskite-type oxide Ba2NdZrO5.5

    Directory of Open Access Journals (Sweden)

    D.A. Landínez Téllez

    2007-01-01

    A new complex perovskite material Ba2NdZrO5.5 has been synthesized for the first time by a conventional solid state reaction process. X-ray diffraction (XRD) measurements and Rietveld analysis revealed an ordered complex cubic structure, characteristic of the A2BB'O6 crystalline structure, with a lattice constant a = 8.40 ± 0.01 Å. Energy Dispersive X-ray (EDX) analysis shows that Ba2NdZrO5.5 is free of impurity traces. Preliminary studies reveal that at 820 °C Ba2NdZrO5.5 does not react with YBa2Cu3O7−δ. These favorable characteristics show that Ba2NdZrO5.5 can be used as a potential substrate material for the fabrication of superconducting films.

  12. Radiation protection for repairs of reactor's internals at the 2nd Unit of the Nuclear Power Plant Temelin

    International Nuclear Information System (INIS)

    This presentation describes the process and extent of repairs of the 2nd unit of the Nuclear power plant Temelin during the shutdown of the reactor. All works were optimized in terms of radiation protection of workers.

  13. Generating Variable Strength Covering Array for Combinatorial Software Testing with Greedy Strategy

    Directory of Open Access Journals (Sweden)

    Ziyuan Wang

    2013-12-01

    Combinatorial testing is a practical and efficient software testing technique that can detect faults triggered by interactions among factors in software. Compared to classic fixed-strength combinatorial testing, variable-strength combinatorial testing usually uses fewer test cases to detect more interaction faults, because it takes fuller account of the actual interaction relationships in the software. For a model of variable-strength combinatorial testing that has been proposed previously, two heuristic algorithms based on the one-test-at-a-time greedy strategy are proposed in this paper to generate variable strength covering arrays as test suites in software testing. Experimental results show that, compared to some existing algorithms and tools, the two proposed algorithms have advantages in both execution effectiveness and the size of the generated test suite.
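The one-test-at-a-time greedy strategy mentioned in the abstract can be illustrated for the simplest fixed-strength case, a pairwise (strength-2) covering array. This is a generic textbook sketch, not the authors' algorithms, and it enumerates every candidate test, which is only feasible for tiny factor sets.

```python
from itertools import combinations, product

def pairwise_covering_array(factors):
    """Greedy one-test-at-a-time generation of a strength-2 (pairwise)
    covering array. `factors` lists the number of values per factor,
    e.g. [2, 3, 2]."""
    # every (factor, value) pair combination that still needs covering
    uncovered = {
        ((i, vi), (j, vj))
        for i, j in combinations(range(len(factors)), 2)
        for vi in range(factors[i])
        for vj in range(factors[j])
    }
    suite = []
    while uncovered:
        # pick the candidate test that covers the most uncovered pairs
        best = max(
            product(*(range(k) for k in factors)),
            key=lambda t: sum(
                ((i, t[i]), (j, t[j])) in uncovered
                for i, j in combinations(range(len(t)), 2)
            ),
        )
        suite.append(best)
        uncovered -= {
            ((i, best[i]), (j, best[j]))
            for i, j in combinations(range(len(best)), 2)
        }
    return suite

tests = pairwise_covering_array([2, 2, 2])
print(len(tests))  # 4 tests cover all 12 value pairs, versus 8 exhaustive combinations
```

Variable-strength generation follows the same greedy loop but seeds `uncovered` with interaction tuples of different strengths for different factor subsets.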

  14. Automatic Generation of Machine Emulators: Efficient Synthesis of Robust Virtual Machines for Legacy Software Migration

    DEFF Research Database (Denmark)

    Franz, Michael; Gal, Andreas

    2006-01-01

    As older mainframe architectures become obsolete, the corresponding legacy software is increasingly executed via platform emulators running on top of more modern commodity hardware. These emulators are virtual machines that often include a combination of interpreters and just-in-time compilers. Implementing interpreters and compilers for each combination of emulated and target platform independently of each other is a redundant and error-prone task. We describe an alternative approach that automatically synthesizes specialized virtual-machine interpreters and just-in-time compilers, which then execute on top of an existing software portability platform such as Java. The result is a considerably reduced implementation effort.
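A minimal sketch of the synthesis idea: drive a single dispatch loop from a declarative instruction specification, so an interpreter is "generated" from a table rather than hand-written per platform. The opcodes and stack machine below are invented for illustration; the actual system synthesizes interpreters and JIT compilers for legacy instruction sets on top of a portability platform such as Java.

```python
# Declarative instruction specification: opcode -> semantics on an operand stack.
SPEC = {
    "PUSH": lambda st, arg: st.append(arg),
    "ADD":  lambda st, arg: st.append(st.pop() + st.pop()),
    "MUL":  lambda st, arg: st.append(st.pop() * st.pop()),
}

def make_interpreter(spec):
    """'Synthesize' an interpreter: one dispatch loop driven by the spec,
    so supporting a new opcode means adding a table entry, not new code."""
    def run(program):
        stack = []
        for op, arg in program:
            spec[op](stack, arg)
        return stack[-1]
    return run

vm = make_interpreter(SPEC)
print(vm([("PUSH", 2), ("PUSH", 3), ("ADD", None), ("PUSH", 4), ("MUL", None)]))  # 20
```

The same specification table could equally drive a simple compiler that emits target code per opcode, which is the sense in which interpreter and JIT share one machine description.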

  15. Characterization of the 1st and 2nd EF-hands of NADPH oxidase 5 by fluorescence, isothermal titration calorimetry, and circular dichroism

    Directory of Open Access Journals (Sweden)

    Wei Chin-Chuan

    2012-04-01

    Abstract Background: Superoxide generated by non-phagocytic NADPH oxidases (NOXs) is of growing importance for physiology and pathobiology. The calcium binding domain (CaBD) of NOX5 contains four EF-hands, each binding one calcium ion. To better understand the metal binding properties of the 1st and 2nd EF-hands, we characterized the N-terminal half of CaBD (NCaBD) and its calcium-binding knockout mutants. Results: The isothermal titration calorimetry measurements for NCaBD reveal that the calcium binding of the two EF-hands is loosely associated and can be treated as independent binding events. However, Ca2+ binding studies on NCaBD(E31Q) and NCaBD(E63Q) showed their binding constants to be 6.5 × 10^5 and 5.0 × 10^2 M^-1, with ΔH values of -14 and -4 kJ/mol, respectively, suggesting that intrinsic calcium binding for the 1st non-canonical EF-hand is largely enhanced by the binding of Ca2+ to the 2nd canonical EF-hand. The fluorescence quenching and CD spectra support a conformational change upon Ca2+ binding, which moves Trp residues toward a more non-polar and exposed environment and also increases the α-helix secondary structure content. All measurements exclude Mg2+ binding in NCaBD. Conclusions: We demonstrated that the 1st non-canonical EF-hand of NOX5 has very weak Ca2+ binding affinity compared with the 2nd canonical EF-hand. Both EF-hands interact with each other in a cooperative manner to enhance their Ca2+ binding affinity. Our characterization reveals that the two EF-hands in the N-terminal NOX5 are Ca2+ specific.
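The reported association constants can be converted into binding free energies with the standard relation ΔG = −RT ln K. The sketch below assumes a temperature of 298.15 K, which may differ from the titration temperature actually used in the study.

```python
import math

R = 8.314      # gas constant, J/(mol*K)
T = 298.15     # assumed temperature in K; the titration temperature may differ

def dG_kJ_per_mol(K_assoc):
    """Standard binding free energy from an association constant: dG = -RT ln K."""
    return -R * T * math.log(K_assoc) / 1000.0

# association constants reported for the two EF-hand knockout mutants
print(round(dG_kJ_per_mol(6.5e5), 1))  # stronger site: about -33 kJ/mol
print(round(dG_kJ_per_mol(5.0e2), 1))  # weaker site: about -15 kJ/mol
```

The roughly 18 kJ/mol gap between the two sites quantifies how much weaker the intrinsic binding of the non-canonical EF-hand is.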

  16. PREFACE: 2nd International Conference on Innovative Materials, Structures and Technologies

    Science.gov (United States)

    Ručevskis, Sandris

    2015-11-01

    The 2nd International Conference on Innovative Materials, Structures and Technologies (IMST 2015) took place in Riga, Latvia from 30th September - 2nd October, 2015. The first event of the conference series, dedicated to the 150th anniversary of the Faculty of Civil Engineering of Riga Technical University, was held in 2013. Following the established tradition, the aim of the conference was to promote and discuss the latest results of industrial and academic research carried out in the following engineering fields: analysis and design of advanced structures and buildings; innovative, ecological and energy efficient building materials; maintenance, inspection and monitoring methods; construction technologies; structural management; sustainable and safe transport infrastructure; and geomatics and geotechnics. The conference provided an excellent opportunity for leading researchers, representatives of the industrial community, engineers, managers and students to share the latest achievements, discuss recent advances and highlight the current challenges. IMST 2015 attracted over 120 scientists from 24 countries. After rigorous reviewing, over 80 technical papers were accepted for publication in the conference proceedings. On behalf of the organizing committee I would like to thank all the speakers, authors, session chairs and reviewers for their efficient and timely effort. The 2nd International Conference on Innovative Materials, Structures and Technologies was organized by the Faculty of Civil Engineering of Riga Technical University with the support of the Latvia State Research Programme under the grant agreement "INNOVATIVE MATERIALS AND SMART TECHNOLOGIES FOR ENVIRONMENTAL SAFETY, IMATEH". I would like to express sincere gratitude to Juris Smirnovs, Dean of the Faculty of Civil Engineering, and Andris Chate, manager of the Latvia State Research Programme. Finally, I would like to thank all those who helped to make this event happen. 
Special thanks go to Diana Bajare, Laura Sele, Liga Radina and Jana Galilejeva for their major contribution to organizing the conference and to the literary editor Tatjana Smirnova and technical editor Daira Erdmane for their hard work on the conference proceedings.

  17. Mapping and industrial IT project to a 2nd semester design-build project

    DEFF Research Database (Denmark)

    Nyborg, Mads; Høgh, Stig

    2010-01-01

    CDIO means bringing the engineer's daily life and working practice into the educational system. In our opinion this is best done by selecting an appropriate project from industry. In this paper we describe how we have mapped an industrial IT project to a 2nd semester design-build project in the Diploma IT program at the Technical University of Denmark. The system in question is a weighing system operating in a LAN environment. The system is used in the medical industry for producing tablets. We ...

  18. TF insert experiment log book. 2nd Experiment of CS model coil

    International Nuclear Information System (INIS)

    The cool down of the CS model coil and TF insert started on August 20, 2001. It took almost one month, and coil charging began on September 17, 2001. The charge test of the TF insert and CS model coil was completed on October 19, 2001. In this campaign, the total number of shots was 88 and the size of the data file in the DAS (Data Acquisition System) was about 4 GB. This report is a database that consists of the log list and the log sheets of every shot. This is the experiment logbook for the 2nd experiment of the CS model coil and TF insert charge test. (author)

  19. TF insert experiment log book. 2nd Experiment of CS model coil

    Energy Technology Data Exchange (ETDEWEB)

    Sugimoto, Makoto; Isono, Takaaki; Matsui, Kunihiro [Japan Atomic Energy Research Inst., Naka, Ibaraki (Japan). Naka Fusion Research Establishment] [and others

    2001-12-01

    The cool down of the CS model coil and TF insert started on August 20, 2001. It took almost one month, and coil charging began on September 17, 2001. The charge test of the TF insert and CS model coil was completed on October 19, 2001. In this campaign, the total number of shots was 88 and the size of the data file in the DAS (Data Acquisition System) was about 4 GB. This report is a database that consists of the log list and the log sheets of every shot. This is the experiment logbook for the 2nd experiment of the CS model coil and TF insert charge test. (author)

  20. Mapping and industrial IT project to a 2nd semester design-build project

    OpenAIRE

    Nyborg, Mads; Høgh, Stig

    2010-01-01

    CDIO means bringing the engineer's daily life and working practice into the educational system. In our opinion this is best done by selecting an appropriate project from industry. In this paper we describe how we have mapped an industrial IT project to a 2nd semester design-build project in the Diploma IT program at the Technical University of Denmark. The system in question is a weighing system operating in a LAN environment. The system is used in the medical industry for prod...

  1. PREFACE: 2nd International Meeting for Researchers in Materials and Plasma Technology

    Science.gov (United States)

    Niño, Ely Dannier V.

    2013-11-01

    These proceedings present the written contributions of the participants of the 2nd International Meeting for Researchers in Materials and Plasma Technology, 2nd IMRMPT, which was held from February 27 to March 2, 2013 at the Universidad Pontificia Bolivariana (UPB) and the Universidad Industrial de Santander (UIS), Bucaramanga, Colombia, organized by the research groups GINTEP-UPB and FITEK-UIS. The IMRMPT was the second of the biennial meetings that began in 2011. The three-day scientific program of the 2nd IMRMPT consisted of 14 keynote lectures, 42 oral presentations and 48 poster presentations, with the participation of undergraduate and graduate students, professors, researchers and entrepreneurs from Colombia, Russia, France, Venezuela, Brazil, Uruguay, Argentina, Peru, Mexico, the United States, among others. The objective of the IMRMPT was to bring together national and international researchers in order to establish scientific cooperation in the field of materials science and plasma technology; introduce new techniques of surface treatment of materials to improve the properties of metals against deterioration due to corrosion, hydrogen embrittlement, abrasion, hardness loss, among others; and establish cooperation agreements between universities and industry. The topics covered in the 2nd IMRMPT include New Materials, Surface Physics, Laser and Hybrid Processes, Characterization of Materials, Thin Films and Nanomaterials, Surface Hardening Processes, Wear and Corrosion / Oxidation, Modeling, Simulation and Diagnostics, Plasma Applications and Technologies, Biomedical Coatings and Surface Treatments, Non Destructive Evaluation and Online Process Control, Surface Modification (Ion Implantation, Ion Nitriding, PVD, CVD). The editors hope that those interested in the area of materials science and plasma technology enjoy the reading, which reflects a wide range of topics.
It is a pleasure to thank the sponsors and all the participants and contributors for making this international meeting of researchers possible. It should be noted that the event, organized by the UIS and UPB universities through their research groups FITEK and GINTEP, was a very significant contribution to the national and international scientific community, achieving the interaction of different research groups from academia and the business sector. On behalf of the research groups GINTEP-UPB and FITEK-UIS, we greatly appreciate the support provided by the sponsors, who made it possible to continue with the dream of research. Ely Dannier V. Niño The Editor The PDF file also contains a list of committees and sponsors.

  2. Future Control and Automation : Proceedings of the 2nd International Conference on Future Control and Automation

    CERN Document Server

    2012-01-01

    This volume, Future Control and Automation - Volume 1, includes the best papers selected from the 2012 2nd International Conference on Future Control and Automation (ICFCA 2012), held on July 1-2, 2012, in Changsha, China. Future control and automation is the use of control systems and information technologies to reduce the need for human work in the production of goods and services. This volume can be divided into five sessions on the basis of the classification of the manuscripts considered, which is listed as follows: Identification and Control, Navigation, Guidance and Sensor, Simulation Technology, Future Telecommunications and Control

  3. 2nd workshop on Wendelstein VII-X, Schloss Ringberg, Bavaria, 13-16 June 1988

    International Nuclear Information System (INIS)

    This IPP-Report is based on the 'Summary of the Workshop' by H. Wobig, and contains a number of figures and tables from contributed papers with some short descriptive remarks. About 40 papers were presented at the 2nd Workshop on Wendelstein VII-X. The programme of the workshop is given in appendix 1. There were nearly 50 participants as listed in appendix 2, several of them on a part-time basis. Appendix 3 gives the correspondence for the numbers of figures and tables to those contained in the contributions to the workshop. (orig.)

  4. Construction of the 2nd 500kV DC gun at KEK

    International Nuclear Information System (INIS)

    The 2nd 500 kV DC photocathode electron gun for an ERL injector was constructed at KEK. The gun has several features, such as an insulated anode electrode for use as a dark current monitor, a repeller electrode for suppressing backward ions, extreme-high-vacuum pumps, and so on. High-voltage conditioning began this summer. In addition, a new cathode preparation system has been developed. It can prepare three cathodes simultaneously and store many cathodes under good vacuum conditions. The detailed design was finished and the construction of all in-vacuum components is progressing. (author)

  5. Book Review: The Communicating Leader: The key to strategic alignment (2nd Ed)

    OpenAIRE

    X. C. Birkenbach

    2003-01-01

    Title: The Communicating Leader: The key to strategic alignment (2nd Ed) Author: Gustav Puth Publisher: Van Schaik Publishers Reviewer: XC Birkenbach

    The aim of the book according to the author, is "meant to be a usable tool, an instrument in the toolbox of the real leader and leadership student". The book is written in conversational style (as intended by the author) and the 219 pages of the 10 chapters are logically packaged into three parts. While the main emphasis is naturally on...

  6. DNA Data Visualization (DDV): Software for Generating Web-Based Interfaces Supporting Navigation and Analysis of DNA Sequence Data of Entire Genomes

    Science.gov (United States)

    Neugebauer, Tomasz; Bordeleau, Eric; Burrus, Vincent; Brzezinski, Ryszard

    2015-01-01

    Data visualization methods are necessary during the exploration and analysis activities of an increasingly data-intensive scientific process. There are few existing visualization methods for raw nucleotide sequences of a whole genome or chromosome. Software for data visualization should allow the researchers to create accessible data visualization interfaces that can be exported and shared with others on the web. Herein, novel software developed for generating DNA data visualization interfaces is described. The software converts DNA data sets into images that are further processed as multi-scale images to be accessed through a web-based interface that supports zooming, panning and sequence fragment selection. Nucleotide composition frequencies and GC skew of a selected sequence segment can be obtained through the interface. The software was used to generate DNA data visualization of human and bacterial chromosomes. Examples of visually detectable features such as short and long direct repeats, long terminal repeats, mobile genetic elements, heterochromatic segments in microbial and human chromosomes, are presented. The software and its source code are available for download and further development. The visualization interfaces generated with the software allow for the immediate identification and observation of several types of sequence patterns in genomes of various sizes and origins. The visualization interfaces generated with the software are readily accessible through a web browser. This software is a useful research and teaching tool for genetics and structural genomics. PMID:26636979
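GC skew, one of the statistics the abstract says the interface reports for a selected sequence segment, is straightforward to compute per window. The sketch below is a generic implementation, not the published tool's code.

```python
def gc_skew(seq, window=1000):
    """Per-window GC skew, (G - C) / (G + C), over non-overlapping windows.
    Sign changes in the skew often mark replication origin/terminus regions
    in bacterial chromosomes."""
    skews = []
    for start in range(0, len(seq) - window + 1, window):
        w = seq[start:start + window].upper()
        g, c = w.count("G"), w.count("C")
        skews.append((g - c) / (g + c) if g + c else 0.0)
    return skews

print(gc_skew("GGGGCCAA" * 2, window=8))  # two windows, each (4 - 2) / 6 ≈ 0.333
```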

  7. A technical note about Phidel: A new software for evaluating magnetic induction field generated by power lines

    International Nuclear Information System (INIS)

    The Regional Environment Protection Agency of Friuli Venezia Giulia (ARPA FVG, Italy) has performed an analysis of existing software designed to calculate the magnetic induction field generated by power lines. As far as the agency's requirements are concerned, the tested programs display some difficulties in the immediate processing of electrical and geometrical data supplied by plant owners, and in certain cases turn out to be inadequate for representing complex configurations of power lines. Phidel, an innovative software, tackles and works out all the above-mentioned problems. Therefore, the obtained results, when compared with those of other programs, are the closest to experimental measurements. The output data can be employed both in the GIS and Excel environments, allowing the immediate overlaying of digital cartography and the determination of the 3 and 10 μT bands, in compliance with the Italian Decree of the President of the Council of Ministers of 8 July 2003. (authors)
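For orientation on the magnitudes involved, the textbook field of a single infinite straight conductor is B = μ0 I / (2π r). The sketch below evaluates it in μT for a hypothetical 1000 A current; a real line model such as Phidel's must superpose the three phase conductors with their phase angles, so this single-wire case is illustrative only.

```python
import math

MU0 = 4e-7 * math.pi  # vacuum permeability, T*m/A

def b_field_uT(current_A, distance_m):
    """Flux density of a single infinite straight conductor,
    B = mu0 * I / (2 * pi * r), returned in microtesla."""
    return MU0 * current_A / (2 * math.pi * distance_m) * 1e6

# hypothetical 1000 A line: the 3 uT band edge falls near r = 67 m
for r_m in (10, 30, 67, 100):
    print(r_m, round(b_field_uT(1000, r_m), 2))
```

For a real three-phase line the partial cancellation between phases makes the field fall off faster than 1/r, which is one reason dedicated software is needed to locate the regulatory bands.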

  8. A proposed approach for developing next-generation computational electromagnetics software

    Energy Technology Data Exchange (ETDEWEB)

    Miller, E.K.; Kruger, R.P. (Los Alamos National Lab., NM (United States)); Moraites, S. (Simulated Life Systems, Inc., Chambersburg, PA (United States))

    1993-01-01

    Computations have become a tool coequal with mathematics and measurements as a means of performing electromagnetic analysis and design. This is demonstrated by the volume of articles and meeting presentations in which computational electromagnetics (CEM) is routinely employed to address an increasing variety of problems. Yet, in spite of the substantial resources invested in CEM software over the past three decades, little real progress seems to have been made towards providing the EM engineer software tools having a functionality equivalent to that expected of hardware instrumentation. Furthermore, the bulk of CEM software now available is generally of limited applicability to large, complex problems because most modeling codes employ a single field propagator, or analytical form, of Maxwell's Equations. The acknowledged advantages of hybrid models, i.e., those which employ different propagators in differing regions of a problem, are relatively unexploited. The thrust of this discussion is to propose a new approach designed to address both problems outlined above, integrating advances being made in both software and hardware development. After briefly reviewing the evolution of modeling CEM software to date and pointing out the deficiencies thereof, we describe an approach for making CEM tools more truly "user friendly", called EMSES (Electromagnetic Modeling and Simulation Environment for Systems). This will be achieved through two main avenues. One is developing a common problem-description language implemented in a visual programming environment working together with a translator that produces the specific model description needed by various numerical treatments, in order to optimize user efficiency. The other is to employ a new modeling paradigm based on the idea of field propagators to expedite the development of the hybrid models that are needed to optimize computation efficiency.

  9. A proposed approach for developing next-generation computational electromagnetics software

    Energy Technology Data Exchange (ETDEWEB)

    Miller, E.K.; Kruger, R.P. [Los Alamos National Lab., NM (United States); Moraites, S. [Simulated Life Systems, Inc., Chambersburg, PA (United States)

    1993-02-01

    Computations have become a tool coequal with mathematics and measurements as a means of performing electromagnetic analysis and design. This is demonstrated by the volume of articles and meeting presentations in which computational electromagnetics (CEM) is routinely employed to address an increasing variety of problems. Yet, in spite of the substantial resources invested in CEM software over the past three decades, little real progress seems to have been made towards providing the EM engineer software tools having a functionality equivalent to that expected of hardware instrumentation. Furthermore, the bulk of CEM software now available is generally of limited applicability to large, complex problems because most modeling codes employ a single field propagator, or analytical form, of Maxwell's Equations. The acknowledged advantages of hybrid models, i.e., those which employ different propagators in differing regions of a problem, are relatively unexploited. The thrust of this discussion is to propose a new approach designed to address both problems outlined above, integrating advances being made in both software and hardware development. After briefly reviewing the evolution of modeling CEM software to date and pointing out the deficiencies thereof, we describe an approach for making CEM tools more truly "user friendly", called EMSES (Electromagnetic Modeling and Simulation Environment for Systems). This will be achieved through two main avenues. One is developing a common problem-description language implemented in a visual programming environment working together with a translator that produces the specific model description needed by various numerical treatments, in order to optimize user efficiency. The other is to employ a new modeling paradigm based on the idea of field propagators to expedite the development of the hybrid models that are needed to optimize computation efficiency.

  10. Roles of doping ions in afterglow properties of blue CaAl2O4:Eu2+,Nd3+ phosphors

    Science.gov (United States)

    Wako, A. H.; Dejene, B. F.; Swart, H. C.

    2014-04-01

    Eu2+ doped and Nd3+ co-doped calcium aluminate (CaAl2O4:Eu2+,Nd3+) phosphor was prepared by a urea-nitrate solution combustion method at furnace temperatures as low as 500 °C. The produced CaAl2O4:Eu2+,Nd3+ powder was investigated in terms of phase composition, morphology and luminescence by X-Ray diffraction (XRD), Scanning Electron Microscope (SEM), Fourier Transform Infra Red spectroscopy (FTIR) and Photoluminescence (PL) techniques respectively. XRD analysis depicts a dominant monoclinic phase that indicates no change in the crystalline structure of the phosphor with varying concentration of Eu2+ and Nd3+. SEM results show agglomerates with non-uniform shapes and sizes with a number of irregular network structures having lots of voids and pores. The Energy Dispersive X-ray Spectroscopy (EDS) and (FTIR) spectra confirm the expected chemical components of the phosphor. PL measurements indicated a single broad excitation band from 200 to 300 nm centered around 240 nm, corresponding to the crystal field splitting of the Eu2+ d-orbital, and an emission spectrum in the blue region with a maximum at 440 nm. This is a strong indication that there was dominantly one luminescence center, Eu2+, which represents emission from transitions between the 4f7 ground state and the 4f6-5d1 excited state configuration. High concentrations of Eu2+ and Nd3+ generally reduce both intensity and lifetime of the phosphor powders. The optimized content is 1 mol% for Eu2+ and 1 mol% for Nd3+ for the obtained phosphors with excellent optical properties. The phosphor also emits visible light at around 587 and 616 nm. Such emissions can be ascribed to the 5D0-7F1 and 5D0-7F2 intrinsic transitions of Eu3+ respectively. The decay characteristics exhibit a significant rise in initial intensity with increasing Eu2+ doping concentration while the decay time increased with Nd3+ co-doping. 
The observed afterglow can be ascribed to the generation of suitable traps due to the presence of the Nd3+ ions.

  11. Makahiki+WattDepot : An open source software stack for next generation energy research and education

    DEFF Research Database (Denmark)

    Johnson, Philip M.; Xu, Yongwen

    2013-01-01

    The accelerating world-wide growth in demand for energy has led to the conceptualization of a “smart grid”, where a variety of decentralized, intermittent, renewable energy sources (for example, wind, solar, and wave) would provide most or all of the power required by small-scale “micro-grids” servicing hundreds to thousands of consumers. Such a smart grid will require consumers to transition from passive to active participation in order to optimize the efficiency and effectiveness of the grid’s electrical capabilities. This paper presents a software stack comprised of two open source software systems, Makahiki and WattDepot, which together are designed to engage consumers in energy issues through a combination of education, real-time feedback, incentives, and game mechanics. We detail the novel features of Makahiki and WattDepot, along with our initial experiences using them to implement an energy challenge called the Kukui Cup.

  12. Mutation-driven test generation for conflict detection in software integration

    OpenAIRE

    Wilhelm, Ricardo Daniel Sequeira

    2013-01-01

In software development projects, programmers often collaborate in teams in order to increase their productivity. Version control systems (VCS) facilitate this collaboration by merging the changes made by each team member, allowing one or more files to be modified concurrently by several programmers and thereby increasing the team's productivity. However, several conflicts can emerge in this merging process…

  13. Editorial: 2nd Special Issue on behavior change, health, and health disparities.

    Science.gov (United States)

    Higgins, Stephen T

    2015-11-01

    This Special Issue of Preventive Medicine (PM) is the 2nd that we have organized on behavior change, health, and health disparities. This is a topic of fundamental importance to improving population health in the U.S. and other industrialized countries that are trying to more effectively manage chronic health conditions. There is broad scientific consensus that personal behavior patterns such as cigarette smoking, other substance abuse, and physical inactivity/obesity are among the most important modifiable causes of chronic disease and its adverse impacts on population health. As such behavior change needs to be a key component of improving population health. There is also broad agreement that while these problems extend across socioeconomic strata, they are overrepresented among more economically disadvantaged populations and contribute directly to the growing problem of health disparities. Hence, behavior change represents an essential step in curtailing that unsettling problem as well. In this 2nd Special Issue, we devote considerable space to the current U.S. prescription opioid addiction epidemic, a crisis that was not addressed in the prior Special Issue. We also continue to devote attention to the two largest contributors to preventable disease and premature death, cigarette smoking and physical inactivity/obesity as well as risks of co-occurrence of these unhealthy behavior patterns. Across each of these topics we included contributions from highly accomplished policy makers and scientists to acquaint readers with recent accomplishments as well as remaining knowledge gaps and challenges to effectively managing these important chronic health problems. PMID:26257372

  14. Proceedings of the 2nd JAERI symposium on HTGR technologies, October 21-23, 1992, Oarai, Japan

    International Nuclear Information System (INIS)

    The Japan Atomic Energy Research Institute (JAERI) held the 2nd JAERI Symposium on HTGR Technologies on October 21 to 23, 1992, at Oarai Park Hotel in Oarai-machi, Ibaraki-ken, Japan, with the support of the International Atomic Energy Agency (IAEA), the Science and Technology Agency of Japan and the Atomic Energy Society of Japan, on the occasion that the construction of the High Temperature Engineering Test Reactor (HTTR), the first high temperature gas-cooled reactor (HTGR) in Japan, was proceeding smoothly. In this symposium, the worldwide present status of research and development (R and D) of HTGRs and the future perspectives of HTGR development were discussed in 47 papers including 3 invited lectures, focusing on the present status of HTGR projects and perspectives of HTGR Development, Safety, Operation Experience, Fuel and Heat Utilization. A panel discussion was also organized on how HTGRs can contribute to the preservation of the global environment. About 280 participants attended the symposium from Japan, Bangladesh, Germany, France, Indonesia, People's Republic of China, Poland, Russia, Switzerland, United Kingdom, United States of America, Venezuela and the IAEA. This paper was edited as the proceedings of the 2nd JAERI Symposium on HTGR Technologies, collecting the 47 papers presented in the oral and poster sessions along with 11 panel exhibitions on the results of research and development associated with the HTTR. (author)

  15. Proceedings of the 2nd technical meeting on high temperature gas-cooled reactors

    International Nuclear Information System (INIS)

    From the point of view of establishing and upgrading the technology basis of HTGRs, the 2nd Technical Meeting on High Temperature Gas-cooled Reactors (HTGRs) was held on March 11 and 12, 1992, at the Tokai Research Establishment in order to review the present status and the results of Research and Development (R and D) on HTGRs, to discuss the items of R and D which should be promoted more actively in the future, and thereby to help in determining the strategy of development of high temperature engineering and examination in JAERI. At the 2nd Technical Meeting, which followed the 1st Technical Meeting held in February 1990 at the Tokai Research Establishment, expectations for the High Temperature Engineering Test Reactor (HTTR), possible contributions of HTGRs to the preservation of the global environment and the prospects for HTGRs were especially discussed, focusing on the R and D of safety, high temperature components and process heat utilization, by experts from JAERI as well as universities, national institutes, industries and so on. This proceedings summarizes the papers presented in the oral sessions and the materials exhibited in the poster session at the meeting, and will be valuable as key material for promoting the R and D on HTGRs from now on. (author)

  16. A novel 2nd-order bandpass MFSS filter with miniaturized structure

    Science.gov (United States)

    Fang, C. Y.; Gao, J. S.; Feng, X. G.

    2015-08-01

    In order to effectively obtain a miniaturized structure and good filtering properties, we propose a novel 2nd-order bandpass metamaterial frequency selective surface (MFSS) filter which contains two capacitive layers and one inductive layer, with multi-loop metallic patches acting as the shunt capacitor C and planar wire grids acting as the series inductor L, respectively. Unlike the traditional approach, in which the tuned elements of the resonant surface are approximately one wavelength in circumference and the layers are spaced a quarter wavelength apart, the proposed MFSS filter achieves a miniaturized structure with ideal bandpass properties by changing the values of L and C to adjust the LC coupling resonance and by matching the multilayer dielectric to adjust the resonance impedance. Measurement results for the fabricated prototype of the bandpass filter (BPF) indicate that the dimension of the tuned element on the resonant surface is approximately 0.025 wavelength, i.e., 0.025λ. At the same time, the filter has a stable center frequency of f0 = 1.53 GHz, a transmittance of T ≈ 96.3% and a high Q-value for TE/TM wave polarization at various incidence angles. The novel 2nd-order bandpass MFSS filter with miniaturized structure not only decreases the structure dimension but also has a wide range of applications in the microwave and infrared bands.
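The LC coupling resonance described in the abstract follows the standard lumped-element relation f0 = 1/(2π√(LC)). The sketch below illustrates this relation; the component values are assumptions chosen to land near the reported 1.53 GHz center frequency, not values taken from the paper.

```python
import math

# Hypothetical lumped-element values for the MFSS bandpass resonator;
# the actual L of the wire grid and C of the multi-loop patches are
# not given in the abstract.
L = 0.9e-9   # series inductance, henries (assumed)
C = 12e-12   # shunt capacitance, farads (assumed)

# Resonant frequency of the LC coupling: f0 = 1 / (2*pi*sqrt(L*C))
f0 = 1.0 / (2.0 * math.pi * math.sqrt(L * C))
print(f"f0 = {f0 / 1e9:.2f} GHz")
```

Tuning L (grid geometry) and C (patch loops) moves f0 without growing the element toward a full wavelength, which is the miniaturization argument the abstract makes.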

  17. A novel 2nd-order bandpass MFSS filter with miniaturized structure

    Directory of Open Access Journals (Sweden)

    C. Y. Fang

    2015-08-01

    Full Text Available In order to effectively obtain a miniaturized structure and good filtering properties, we propose a novel 2nd-order bandpass metamaterial frequency selective surface (MFSS) filter which contains two capacitive layers and one inductive layer, with multi-loop metallic patches acting as the shunt capacitor C and planar wire grids acting as the series inductor L, respectively. Unlike the traditional approach, in which the tuned elements of the resonant surface are approximately one wavelength in circumference and the layers are spaced a quarter wavelength apart, the proposed MFSS filter achieves a miniaturized structure with ideal bandpass properties by changing the values of L and C to adjust the LC coupling resonance and by matching the multilayer dielectric to adjust the resonance impedance. Measurement results for the fabricated prototype of the bandpass filter (BPF) indicate that the dimension of the tuned element on the resonant surface is approximately 0.025 wavelength, i.e., 0.025λ. At the same time, the filter has a stable center frequency of f0 = 1.53 GHz, a transmittance of T ≈ 96.3% and a high Q-value for TE/TM wave polarization at various incidence angles. The novel 2nd-order bandpass MFSS filter with miniaturized structure not only decreases the structure dimension but also has a wide range of applications in the microwave and infrared bands.

  18. Research on Object-oriented Software Testing Cases of Automatic Generation

    Directory of Open Access Journals (Sweden)

    Junli Zhang

    2013-11-01

    Full Text Available In the research on automatic generation of testing cases, different testing cases drive different execution paths, and the probability of each path being executed also differs. For paths that are easy to execute, many redundant test cases tend to be generated, while only a few test cases are generated for control paths that are hard to execute. A genetic algorithm can be used to guide the automatic generation of test cases: it restricts the generation of test cases for the former kind of path and encourages the generation of test cases for the latter as much as possible. Therefore, based on the study of path-oriented automatic test-case generation, a genetic algorithm is adopted to construct the generation process. According to the path triggered during dynamic execution of the program, the generated test cases are separated into equivalence classes, and the number of test cases is adjusted dynamically by the fitness corresponding to the paths. The method creates a certain number of test cases for each execution path to ensure sufficiency, and it also reduces redundant test cases, so it is an effective method for the automatic generation of test cases.
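The path-balancing idea can be sketched in Python. Everything below is an assumed toy setup, not the paper's instrumentation: a small program under test defines three execution paths, and the fitness rewards inputs that trigger rarely-covered paths, so the search discourages redundant cases on easy paths while pushing the population toward hard ones.

```python
import random

# Toy program under test: its branch structure defines three execution paths.
def program_under_test(x):
    if x < 10:
        return "path_A"        # rarely hit by uniform random inputs
    elif x < 1000:
        return "path_B"
    return "path_C"

def fitness(x, path_counts):
    # Reward inputs that exercise rarely-covered paths, discouraging
    # redundant test cases on already well-covered paths.
    path = program_under_test(x)
    return 1.0 / (1 + path_counts.get(path, 0))

def generate_tests(generations=40, pop_size=20, seed=1):
    rng = random.Random(seed)
    population = [rng.randrange(0, 2000) for _ in range(pop_size)]
    path_counts = {}
    suite = {}                                  # first representative per path
    for _ in range(generations):
        scored = sorted(population, key=lambda x: fitness(x, path_counts),
                        reverse=True)
        for x in scored:
            p = program_under_test(x)
            path_counts[p] = path_counts.get(p, 0) + 1
            suite.setdefault(p, x)
        parents = scored[: pop_size // 2]       # selection
        population = [rng.choice(parents) + rng.randrange(-50, 51)  # mutation
                      for _ in range(pop_size)]
    return suite

print(generate_tests())   # representative inputs for the paths reached
```

The equivalence classing in the abstract corresponds to keying `suite` and `path_counts` by the triggered path; the paper's actual encoding, crossover and fitness are not reproduced here.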

  19. RPS2 Proposal Submission Software: Testing and Distribution of Periodically Updated Software

    Science.gov (United States)

    Douglas, R. E., Jr.

    1997-12-01

    In 1995, the Space Telescope Science Institute (STScI) introduced RPS2 (Remote Proposal Submission 2nd Generation). RPS2 is used by Hubble Space Telescope (HST) proposers to prepare their detailed observation descriptions. It is a client/server system implemented using Tcl/Tk. The client can transparently access servers on the user's machine, at STScI, or on any other machine on the Internet. The servers combine syntax checking, feasibility analysis, orbit packing, and constraint and schedulability analysis of user-specified observations. Prior to the release of RPS2, observers used a system which provided only syntax checking. RPS2 now provides observers with some of the more advanced software that had previously been available only to STScI staff for the preparation of detailed observing plans. The RPS2 system consists of four independent subsystems which are controlled by the client/server mechanism. A problem with a system of this size and complexity is that the software components, which continue to grow and change with HST itself, must continually be tested and distributed to those who need them. In the past, it had been acceptable to release the RPS2 software only once per observing cycle, but it became apparent before the 1997 HST Servicing Mission that multiple releases of RPS2 were going to be required to support the new instruments. This paper discusses how RPS2 and its component systems are maintained, updated, tested, and distributed.

  20. Technical Background Material for the Wave Generation Software AwaSys 5

    DEFF Research Database (Denmark)

    Frigaard, Peter; Andersen, Thomas Lykke

    2010-01-01

    "Les Appareils Generateurs de Houle en Laboratoire", presented by Biésel and Suquet in 1951, discussed and solved the analytical problems concerning a number of different wave generator types. For each wave maker type the paper presented the transfer function between wave maker displacement and wave amplitude in those cases where the analytical problem could be solved. The article therefore represented a giant step in wave generation techniques and formed the basis for today's wave generation in hy...

  1. Software tool for analysing the family shopping basket without candidate generation

    Directory of Open Access Journals (Sweden)

    Roberto Carlos Naranjo Cuervo

    2010-05-01

    Full Text Available Tools leading to useful knowledge for supporting marketing decisions are currently needed in the e-commerce environment. A process is needed for this which uses a series of techniques for data processing; data mining is one such technique, enabling automatic information discovery. This work presents association rules as a suitable technique for discovering how customers buy from a company offering business-to-consumer (B2C) e-business, aimed at supporting decision-making in supplying its customers or capturing new ones. Many algorithms such as Apriori, DHP, Partition, FP-Growth and Eclat are available for implementing association rules; the following criteria were defined for selecting the appropriate algorithm: database insert, computational cost, performance and execution time. The development of a software tool is also presented, which involved the CRISP-DM approach; this software tool was formed by the following four sub-modules: data pre-processing, data mining, results analysis and results application. The application design used three-layer architecture: presentation logic, business logic and service logic. Data warehouse design and algorithm design were included in developing this data-mining software tool. It was tested by using a FoodMart company database; the tests included performance, functionality and results' validity, thereby allowing association rules to be found. The results led to concluding that using association rules as a data-mining technique facilitates analysing volumes of information for B2C e-business services, which represents a competitive advantage for those companies using the Internet as their sales medium.
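As a minimal illustration of the frequent-itemset mining underlying association rules, here is an Apriori-style sketch in Python. Note that the record's title points to a candidate-generation-free method (such as FP-Growth), whereas this sketch uses the simpler candidate-generation approach for brevity; the transactions and support threshold are illustrative, not FoodMart data.

```python
# Minimal Apriori-style frequent-itemset miner (no subset-pruning step,
# for brevity). Transactions below are made-up market baskets.
transactions = [
    {"bread", "milk"},
    {"bread", "beer", "eggs"},
    {"milk", "beer", "cola"},
    {"bread", "milk", "beer"},
    {"bread", "milk", "cola"},
]
min_support = 3  # absolute transaction count

def support(itemset):
    return sum(1 for t in transactions if itemset <= t)

# Level 1: frequent single items
frequent = [frozenset({i}) for i in set().union(*transactions)
            if support(frozenset({i})) >= min_support]
result = list(frequent)

# Level k: join frequent (k-1)-itemsets into candidates, prune by support
k = 2
while frequent:
    candidates = {a | b for a in frequent for b in frequent if len(a | b) == k}
    frequent = [c for c in candidates if support(c) >= min_support]
    result.extend(frequent)
    k += 1

for itemset in result:
    print(sorted(itemset), support(itemset))
```

Association rules are then read off the frequent itemsets, e.g. from {bread, milk} one derives bread → milk with confidence support({bread, milk}) / support({bread}).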

  2. System and Component Software Specification, Run-time Verification and Automatic Test Generation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The following background technology is described in Part 5: Run-time Verification (RV), White Box Automatic Test Generation (WBATG). Part 5 also describes how WBATG...

  3. A Customer Value Creation Framework for Businesses That Generate Revenue with Open Source Software

    OpenAIRE

    Aparna Shanker

    2012-01-01

    Technology entrepreneurs must create value for customers in order to generate revenue. This article examines the dimensions of customer value creation and provides a framework to help entrepreneurs, managers, and leaders of open source projects create value, with an emphasis on businesses that generate revenue from open source assets. The proposed framework focuses on a firm's pre-emptive value offering (also known as a customer value proposition). This is a firm's offering of the value it se...

  4. Belief Functions: Theory and Applications - Proceedings of the 2nd International Conference on Belief Functions

    CERN Document Server

    Masson, Marie-Hélène

    2012-01-01

    The theory of belief functions, also known as evidence theory or Dempster-Shafer theory, was first introduced by Arthur P. Dempster in the context of statistical inference, and was later developed by Glenn Shafer as a general framework for modeling epistemic uncertainty. These early contributions have been the starting points of many important developments, including the Transferable Belief Model and the Theory of Hints. The theory of belief functions is now well established as a general framework for reasoning with uncertainty, and has well understood connections to other frameworks such as probability, possibility and imprecise probability theories.   This volume contains the proceedings of the 2nd International Conference on Belief Functions that was held in Compiègne, France on 9-11 May 2012. It gathers 51 contributions describing recent developments both on theoretical issues (including approximation methods, combination rules, continuous belief functions, graphical models and independence concepts) an...

  5. Estimation of 2nd-order derivative thermodynamic properties using the crossover lattice equation of state

    International Nuclear Information System (INIS)

    We apply the crossover lattice equation of state (xLF EOS) [M.S. Shin, Y. Lee, H. Kim, J. Chem. Thermodyn. 40 (2007) 174-179] to the calculation of thermodynamic 2nd-order derivative properties (isochoric heat capacity, isobaric heat capacity, isothermal compressibility, thermal expansion coefficient, Joule-Thomson coefficient, and sound speed). This equation of state is used to calculate the same properties of pure systems (carbon dioxide, normal alkanes from methane to propane). We show that, over a wide range of states, the equation of state yields properties with better accuracy than the lattice equation of state (LF EOS), and near the critical region, represents singular behavior well
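The 2nd-order derivative properties listed above can be illustrated with a toy model. The sketch below computes the isochoric heat capacity as Cv = -T (∂²A/∂T²)_V by central differences from an assumed ideal-gas Helmholtz energy; this stands in for the xLF EOS, whose functional form is not reproduced here.

```python
import math

R = 8.314462618  # gas constant, J/(mol K)

def helmholtz(T, V, n=1.0):
    # Toy Helmholtz free energy of a monatomic ideal gas (additive
    # constants dropped); a stand-in for the xLF EOS free energy.
    return -n * R * T * (1.5 * math.log(T) + math.log(V / n))

def cv(T, V, h=1e-2):
    # Isochoric heat capacity from a 2nd-order derivative:
    #   Cv = -T * (d^2 A / d T^2) at constant V, via central differences
    d2A = (helmholtz(T + h, V) - 2 * helmholtz(T, V)
           + helmholtz(T - h, V)) / h**2
    return -T * d2A

print(cv(300.0, 0.025))  # ≈ 1.5 * R for this ideal-gas model
```

The other listed properties follow the same pattern with different derivatives, e.g. the isothermal compressibility from (∂²A/∂V²)_T; for a real crossover EOS these derivatives are usually taken analytically rather than numerically.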

  6. Nonlinear Dynamics of Memristor Based 2nd and 3rd Order Oscillators

    KAUST Repository

    Talukdar, Abdul Hafiz

    2011-05-01

    Exceptional behaviours of the memristor are illustrated in memristor-based second order (Wien oscillator) and third order (phase shift oscillator) systems in this thesis. Conventional concepts about sustained oscillation are challenged by demonstrating the possibility of sustained oscillation with oscillating resistance and dynamic poles. Mathematical models are also proposed for analysis, and simulations are presented to support the surprising characteristics of the memristor-based oscillator systems. This thesis also describes a comparative study among the Wien-family oscillators with one memristor. In the case of the phase shift oscillator, one-memristor and three-memristor systems are illustrated and compared to generalize the nonlinear dynamics observed for both the 2nd order and 3rd order systems. Detailed explanations are provided with analytical models to simplify the unconventional properties of memristor-based oscillatory systems.

  7. Knowledge grows when shared : The Launch of OpenAIRE, 2nd December in Ghent

    DEFF Research Database (Denmark)

    Elbæk, Mikael Karstensen

    2010-01-01

    Knowledge is one of the few commodities that do not devalue when used. Actually, knowledge grows when shared, and free online access to peer-reviewed scientific publications is a potent ingredient in the process of sharing. The sharing of knowledge is facilitated by the Open Access movement. However, Open Access is much more than downloading the PDF. Vice President of the European Commission and European Digital Agenda Commissioner Neelie Kroes boldly presented this message in the opening session of the OpenAIRE launch. On 2nd December 2010, OpenAIRE, the European infrastructure for Open Access, was officially launched in Ghent, Belgium. This project and initiative facilitates the success of the Open Access Pilot in FP7, as presented earlier in this journal. In this brief article I present some of the most interesting issues that were discussed during the first session of the day.

  8. 2nd International Conference on Education and Educational Technology (EET 2011)

    CERN Document Server

    Education Management, Education Theory and Education Application

    2012-01-01

    This volume includes extended and revised versions of a set of selected papers from the 2011 2nd International Conference on Education and Educational Technology (EET 2011) held in Chengdu, China, October 1-2, 2011. The mission of EET 2011 Volume 2 is to provide a forum for researchers, educators, engineers, and government officials involved in the general areas of education management, education theory and education application to disseminate their latest research results and exchange views on the future research directions of these fields. 133 related topic papers were selected into this volume. All the papers were reviewed by 2 program committee members and selected by the volume editor Prof. Yuanzhi Wang, from Intelligent Information Technology Application Research Association, Hong Kong. The conference will bring together leading researchers, engineers and scientists in the domain of interest. We hope every participant can have a good opportunity to exchange their research ideas and results and to discus...

  9. Analysis and implementation of reactor protection system circuits - case study Egypt's 2nd research reactor

    International Nuclear Information System (INIS)

    This work presents a way to design and implement the trip unit of a reactor protection system (RPS) using a field programmable gate array (FPGA). Instead of the traditional embedded-microprocessor-based interface design method, a proposed tailor-made FPGA-based circuit is built to substitute for the trip unit (TU) used in Egypt's 2nd research reactor, ETRR-2. The existing embedded system is built around the STD32 field computer bus, which is used in industrial and process control applications. It is modular, rugged, reliable and easy to use, and is able to support a large mix of I/O cards and to easily change its configuration in the future. Therefore, the same bus is still used in the proposed design. The state machine of this bus is designed around its timing diagrams and implemented in VHDL to interface with the designed TU circuit

  10. Proceedings of the 2nd seminar of R and D on advanced ORIENT

    International Nuclear Information System (INIS)

    The 2nd Seminar of R and D on Advanced ORIENT was held at Ricotte, Japan Atomic Energy Agency, on November 7th, 2008. The first meeting of this seminar was held in Oarai, Ibaraki in May 2008, and more than fifty participants, including related researchers and members of the general public, attended. The second seminar was hosted by the Nuclear Science and Engineering Directorate, JAEA, in Tokai, Ibaraki, with 63 participants. Spent nuclear fuel should be recognized not only as a mass of radioactive elements but also as potentially useful material, including platinum metals and rare earth elements. Taking cooperation with universities, related companies and research institutes into consideration, we aimed at expanding and progressing the basic research. This report records abstracts and figures submitted by the oral speakers at this seminar. (author)

  11. Proceedings of the 2nd annual meeting of Japanese Society of Radiation Safety Management 2003 Tsukuba

    International Nuclear Information System (INIS)

    This is the program and the proceedings of the 2nd annual meeting of Japanese Society of Radiation Safety Management held from December 3rd through the 5th of 2003. The sessions held were: (1) Research on Low-level Waste, (2) Topics related to Detector, Measurement, and Instrument, (3) Dose Level and Imaging Plate, (4) Radiation, (5) Safety Education and Safety Evaluation. The poster sessions held were: (1) Safety Education, Safety Evaluation, Shielding, and so on, (2) Control System and Control Technology, (3) Detector and Radiation Measurement, (4) Topics Related to Imaging Plate, (5) Environment and Radiation Measurement, and (6) Radiation Control. Symposia held were: (1) 'Regarding Basic Concept to Incorporate International Exemption Level in Regulation' as the keynote lecture and (2) 'Regarding Correspondence Associated with Legal Revision and Radiation Safety Regulation'. Regarding these topics, after the explanation from each area, panel discussions were held. (S.K.)

  12. Computation of equivalent poles placement for class of 2nd order discrete bilinear systems

    Science.gov (United States)

    Gadek, Lukasz; Koszalka, Leszek; Burnham, Keith

    2015-11-01

    This paper introduces an adaptation of the classical linear control theory representation of zeros, poles and gain to a bilinear approach. The placement of poles in the complex plane is a complete description of a plant's dynamics; hence it is a convenient form from which calculation of various properties, e.g. rise time and settling time, is possible. This technique can be adjusted to the bilinear structure if the poles of a quasi-linear representation (linear with respect to input) are concerned. The research concludes with observations on the equivalent-pole displacement and generalized rules for the input-dependent loci of the equivalent poles of a 2nd-order bilinear system. The proposed approach seems promising, as simplification of design and identification of a bilinear system increases transparency during modelling and control in practical applications, and hence such a structure may be applicable to common industrial problems.
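The quasi-linear reading can be sketched numerically. Assuming the common discrete bilinear form x[k+1] = (A + N·u) x[k] + B·u (the paper's matrices are not given, so A and N below are illustrative), the equivalent poles for a constant input u are the eigenvalues of A + N·u, and sweeping u traces their input-dependent loci.

```python
import numpy as np

# Illustrative 2nd-order discrete bilinear system, quasi-linear form:
#   x[k+1] = (A + N*u) x[k] + B u
# For a constant input u, the "equivalent poles" are the eigenvalues
# of the input-dependent state matrix A + N*u.
A = np.array([[0.0, 1.0],
              [-0.5, 1.2]])
N = np.array([[0.0, 0.0],
              [0.1, -0.05]])

def equivalent_poles(u):
    return np.linalg.eigvals(A + N * u)

# Trace the input-dependent pole loci over a range of operating inputs
for u in (0.0, 1.0, 2.0):
    print(f"u = {u}: poles = {np.round(equivalent_poles(u), 3)}")
```

At u = 0 this reduces to the ordinary linear pole placement; the bilinear term N·u then moves the poles with the operating input, which is the loci behaviour the paper generalizes.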

  13. International symposium on peripheral nerve repair and regeneration and 2nd club Brunelli meeting.

    Science.gov (United States)

    Turgut, Mehmet; Geuna, Stefano

    2010-01-01

    The International Symposium "Peripheral Nerve Repair and Regeneration and 2nd Club Brunelli Meeting" was held on December 4-5, 2009 in Turin, Italy (Organizers: Bruno Battiston, Stefano Geuna, Isabelle Perroteau, Pierluigi Tos). Interest in the study of peripheral nerve regeneration is very much alive because complete recovery of nerve function almost never occurs after nerve reconstruction and, often, the clinical outcome is rather poor. Therefore, there is a need for defining innovative strategies for improving the success of recovery after nerve lesion and repair and this meeting was intended to discuss, from a multidisciplinary point of view, some of today's most important issues in this scientific field, arising from both basic and clinical neurosciences. PMID:20214775

  14. International symposium on peripheral nerve repair and regeneration and 2nd club Brunelli meeting

    Directory of Open Access Journals (Sweden)

    Geuna Stefano

    2010-03-01

    Full Text Available The International Symposium "Peripheral Nerve Repair and Regeneration and 2nd Club Brunelli Meeting" was held on December 4-5, 2009 in Turin, Italy (Organizers: Bruno Battiston, Stefano Geuna, Isabelle Perroteau, Pierluigi Tos). Interest in the study of peripheral nerve regeneration is very much alive because complete recovery of nerve function almost never occurs after nerve reconstruction and, often, the clinical outcome is rather poor. Therefore, there is a need for defining innovative strategies for improving the success of recovery after nerve lesion and repair, and this meeting was intended to discuss, from a multidisciplinary point of view, some of today's most important issues in this scientific field, arising from both basic and clinical neurosciences.

  15. 2nd International Colloquium on Sports Science, Exercise, Engineering and Technology 2015

    CERN Document Server

    Sulaiman, Norasrudin; Adnan, Rahmat

    2016-01-01

    The proceeding is a collection of research papers presented at the 2nd International Colloquium on Sports Science, Exercise, Engineering and Technology (ICoSSEET2015), a conference dedicated to addressing the challenges in the areas of sports science, exercise, sports engineering and technology, including other areas of sports, thereby presenting a consolidated view to interested researchers in the aforesaid fields. The goal of this conference was to bring together researchers and practitioners from academia and industry to focus on the scope of the conference and to establish new collaborations in these areas. The topics of interest are mainly (1) Sports and Exercise Science, (2) Sports Engineering and Technology Application, and (3) Sports Industry and Management.

  16. 2nd symposium on materials research 1991. Papers and posters. Vol. 3

    International Nuclear Information System (INIS)

    With the '2nd symposium on materials research', the technological status of the Federal Republic in the area of new materials was to be documented and assessed. Overview lectures and subject-related lectures presented results ranging from fundamental research to practical material developments. In the first volume, the topics of polymers and metals are discussed; in the second volume, ceramic materials, composites, as well as measurement technology, testing methods and analysis engineering; and in the third volume, thin film technology and tribology. This was followed by a poster presentation (286 posters) on the subjects of ceramic materials, powder metallurgy, high temperature and special materials, composites and new polymers. (MM)

  17. 2nd Symposium on Fluid-Structure-Sound Interactions and Control

    CERN Document Server

    Liu, Yang; Huang, Lixi; Hodges, Dewey

    2014-01-01

    With rapid economic and industrial development in China, India and elsewhere, fluid-related structural vibration and noise problems are widely encountered in many fields, just as they are in the more developed parts of the world, causing increasingly grievous concerns. Turbulence clearly has a significant impact on many such problems. On the other hand, new opportunities are emerging with the advent of various new technologies, such as signal processing, flow visualization and diagnostics, new functional materials, sensors and actuators, etc. These have revitalized interdisciplinary research activities, and it is in this context that the 2nd symposium on fluid-structure-sound interactions and control (FSSIC) was organized. Held in Hong Kong (May 20-21, 2013) and Macau (May 22-23, 2013), the meeting brought together scientists and engineers working in all related branches from both East and West and provided them with a forum to exchange and share the latest progress, ideas and advances and to chart the fronti...

  18. 2nd International Conference on Education and Educational Technology (EET 2011)

    CERN Document Server

    Education and Educational Technology

    2012-01-01

    This volume includes extended and revised versions of a set of selected papers from the 2011 2nd International Conference on Education and Educational Technology (EET 2011) held in Chengdu, China, October 1-2, 2011. The mission of EET 2011 Volume 1 is to provide a forum for researchers, educators, engineers, and government officials involved in the general areas of education and educational technology to disseminate their latest research results and exchange views on the future research directions of these fields. 130 related topic papers were selected into this volume. All the papers were reviewed by 2 program committee members and selected by the volume editor Prof. Yuanzhi Wang, from Intelligent Information Technology Application Research Association, Hong Kong. The conference will bring together leading researchers, engineers and scientists in the domain of interest. We hope every participant can have a good opportunity to exchange their research ideas and results and to discuss the state of the art in th...

  19. 2nd FP7 Conference and International Summer School Nanotechnology : From Fundamental Research to Innovations

    CERN Document Server

    Yatsenko, Leonid

    2015-01-01

    This book presents some of the latest achievements in nanotechnology and nanomaterials from leading researchers in Ukraine, Europe, and beyond. It features contributions from participants in the 2nd International Summer School “Nanotechnology: From Fundamental Research to Innovations” and International Research and Practice Conference “Nanotechnology and Nanomaterials”, NANO-2013, which were held in Bukovel, Ukraine on August 25-September 1, 2013. These events took place within the framework of the European Commission FP7 project Nanotwinning, and were organized jointly by the Institute of Physics of the National Academy of Sciences of Ukraine, University of Tartu (Estonia), University of Turin (Italy), and Pierre and Marie Curie University (France). Internationally recognized experts from a wide range of universities and research institutions share their knowledge and key results on topics ranging from nanooptics, nanoplasmonics, and interface studies to energy storage and biomedical applications. Pr...

  20. A critical discussion of the 2nd intercomparison on electron paramagnetic resonance dosimetry with tooth enamel

    International Nuclear Information System (INIS)

    Recently, we have participated in 'The 2nd International Intercomparison on EPR Tooth Dosimetry' wherein 18 laboratories had to evaluate low-radiation doses (100-1000 mGy) in intact teeth (Wieser et al., Radiat. Meas., 32 (2000a) 549). The results of this international intercomparison seem to indicate a promising picture of EPR tooth dosimetry. In this paper, the two Belgian EPR participants present a more detailed and critical study of their contribution to this intercomparison. The methods used were maximum likelihood common factor analysis (MLCFA) and spectrum subtraction. Special attention is paid to potential problems with sample preparation, intrinsic dose evaluation, linearity of the dose response, and determination of dose uncertainties

  1. Book Review: The Communicating Leader: The key to strategic alignment (2nd Ed.)

    Directory of Open Access Journals (Sweden)

    X. C. Birkenbach

    2003-10-01

    Full Text Available Title: The Communicating Leader: The key to strategic alignment (2nd Ed.) Author: Gustav Puth Publisher: Van Schaik Publishers Reviewer: XC Birkenbach According to the author, the book is "meant to be a usable tool, an instrument in the toolbox of the real leader and leadership student". The book is written in a conversational style (as intended by the author), and the 219 pages of its 10 chapters are logically packaged into three parts. While the main emphasis is naturally on leadership and communication, the coverage includes topics typically encountered in Organisational Behaviour or Management texts, e.g., organizational culture, managing change, motivation, conflict management and strategic management.

  2. An Evidential Interpretation of the 1st and 2nd Laws of Thermodynamics

    CERN Document Server

    Vieland, V J

    2013-01-01

    I argue here that both the 1st and 2nd laws of thermodynamics, generally understood to be quintessentially physical in nature, can be equally well described as being about the flow dynamics of information without the need to invoke physical manifestations for information. This involves developing two distinct, yet related, forms of bookkeeping: one pertaining to what physicists generally understand as information per se, which I call purely combinatoric information; and the other pertaining to a version of what physicists understand as energy, which I call evidential information, for reasons to be made clear. I illustrate both sets of books with application to a simple coin-tossing (binomial) experiment. I then show that the physical quantity temperature (T) linking those two forms of bookkeeping together in physics has a familiar, but surprising, interpretation in this setting: the direct informational analogue of T turns out to be what we would in ordinary English call the evidence.
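    The "purely combinatoric information" bookkeeping the abstract applies to its coin-tossing example can be made concrete with the standard Shannon entropy of a binomial trial. A minimal sketch (the function name and notation are ours, not the paper's):

    ```python
    import math

    def coin_entropy(p):
        """Shannon entropy (bits) of a coin with P(heads) = p - a concrete
        instance of combinatoric-information bookkeeping for a binomial
        experiment (illustrative; not the paper's own notation)."""
        if p in (0.0, 1.0):
            return 0.0   # a certain outcome carries no information
        return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

    print(coin_entropy(0.5))   # 1.0 bit for a fair coin
    ```

    A fair coin maximizes the combinatoric information per toss; any bias reduces it, which is the kind of accounting the two sets of "books" in the abstract keep track of.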

  3. Software tools for automatic generation of finite element mesh and application of biomechanical calculation in medicine

    Directory of Open Access Journals (Sweden)

    Milašinović Danko Z.

    2008-01-01

    Full Text Available Cardiovascular diseases are common, and their diagnosis is a particular difficulty in treating them. Modern medical instruments can provide data much better suited to computer modeling, and computer simulations of blood flow through the cardiovascular organs give scientists powerful advantages today. The motivation for this work is raw data that our Center recently received from a multislice CT scanner at the University Clinical Center in Heidelberg. In this work the raw CT-scanner data was used to create a 3D model of the aorta. In this process we used Gmsh, TetGen (Hang Si) as well as our own software tools, and the result was the 8-node (brick) mesh on which the calculation was run. The results obtained were very satisfactory so...

  4. Software Defined Networking for Next Generation Converged Metro-Access Networks

    Science.gov (United States)

    Ruffini, M.; Slyne, F.; Bluemm, C.; Kitsuwan, N.; McGettrick, S.

    2015-12-01

    While the concept of Software Defined Networking (SDN) has seen a rapid deployment within the data center community, its adoption in telecommunications network has progressed slowly, although the concept has been swiftly adopted by all major telecoms vendors. This paper presents a control plane architecture for SDN-driven converged metro-access networks, developed through the DISCUS European FP7 project. The SDN-based controller architecture was developed in a testbed implementation targeting two main scenarios: fast feeder fiber protection over dual-homed Passive Optical Networks (PONs) and dynamic service provisioning over a multi-wavelength PON. Implementation details and results of the experiment carried out over the second scenario are reported in the paper, showing the potential of SDN in providing assured on-demand services to end-users.

  5. 1st and 2nd Trimester Headsize in Fetuses with Congenital Heart Disease: A Cohort Study

    DEFF Research Database (Denmark)

    Lauridsen, Mette Høj; Petersen, Olav Bjørn

    Background: Congenital heart disease (CHD) is associated with neuro-developmental disorders. The influence of CHD on the brain may already be present in the fetus. We hypothesize that fetal cerebral growth is impaired as early as the 2nd trimester. Aim: To investigate whether fetal cerebral growth is associated with major and minor CHD. Methods: Pregnant women in Denmark (more than 95%) attend two publicly funded ultrasound scans, around 12 and 20 weeks gestational age (GA). During the first scan fetal bi-parietal diameter (BPD) is routinely obtained. During the second scan fetal head circumference (HC) is obtained and screening for fetal malformations is carried out. Our cohort includes all fetuses in Western Denmark (2.9 million inhabitants) screened between January 1st 2012 and December 31st 2013 and diagnosed with any structural, non-syndromic congenital heart disease either during pregnancy or up to 6 months after birth. Results: 276 fetuses with CHD were identified. 114 (41%) were genetically screened, primarily by chromosomal microarray analysis (n=82). Fetuses with identified chromosomal abnormalities were excluded, as were multiple-gestation fetuses and fetuses with major extra-cardiac malformations. Data from 208 fetuses (75%) with presumed non-syndromic CHD were included, 85 (41%) with minor and 123 (59%) with major CHD. Z-scores for head size were analysed. Conclusions: Our preliminary results suggest that bi-parietal diameter in fetuses with CHD is within the normal range in the 1st trimester, but fetal cerebral growth may be disrupted as early as during the 2nd trimester in major CHD.
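    The z-score analysis of head size referred to above is simple arithmetic: each measurement is expressed as its deviation from the gestational-age-specific reference mean, in units of the reference standard deviation. A minimal sketch (the reference mean and SD below are hypothetical, not the study's growth charts):

    ```python
    def biometry_z(measured_mm, ref_mean_mm, ref_sd_mm):
        """Gestational-age-specific z-score for a fetal head measurement
        (BPD or HC). Reference values must come from a population growth
        chart for the same GA; the figures used below are made up."""
        return (measured_mm - ref_mean_mm) / ref_sd_mm

    # A hypothetical 20-week HC of 170 mm against a reference of 175 +/- 10 mm:
    print(biometry_z(170.0, 175.0, 10.0))   # -0.5
    ```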

  6. Assessment of nursing care using indicators generated by software / Evaluación de la asistencia de enfermería utilizando indicadores generados por un software / Avaliação da assistência de enfermagem utilizando indicadores gerados por um software

    Scientific Electronic Library Online (English)

    Ana Paula Souza, Lima; Tânia Couto Machado, Chianca; Meire Chucre, Tannure.

    2015-04-01

    Full Text Available OBJECTIVE: to analyze the efficacy of the Nursing Process in an Intensive Care Unit using indicators generated by software. METHOD: cross-sectional study using data collected over four months. Nurses and students daily registered patients, took history (at admission), performed physical assessments, and established nursing diagnoses, nursing plans/prescriptions, and assessed care delivered to 17 patients using software. Indicators for the incidence and prevalence of nursing diagnoses, the rate of risk-diagnosis effectiveness, and the rate of effective prevention of complications were computed. RESULTS: Risk for imbalanced body temperature was the most incident diagnosis (23.53%), while the least incident was Risk for constipation (0%). Risk for impaired skin integrity was prevalent in 100% of the patients, while Risk for acute confusion was the least prevalent (11.76%). Risk for constipation and Risk for impaired skin integrity obtained a rate of risk-diagnosis effectiveness of 100%. The rate of effective prevention of acute confusion and falls was 100%. CONCLUSION: the efficacy of the Nursing Process was analyzed using indicators, since they portray how nurses have identified the patient's problems and risks and planned care in a systematized manner.
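    The incidence and prevalence indicators named above reduce to simple proportions over the patient sample; the paper does not print its exact formulas, so the following is a plausible sketch that reproduces the abstract's own figures (e.g. 4 of 17 patients giving the 23.53% incidence):

    ```python
    def incidence(new_cases, patients_at_risk):
        """Percentage of at-risk patients who developed the nursing
        diagnosis during the observation period."""
        return 100.0 * new_cases / patients_at_risk

    def prevalence(cases, total_patients):
        """Percentage of all patients carrying the nursing diagnosis."""
        return 100.0 * cases / total_patients

    # Case counts consistent with the abstract's 17-patient sample:
    print(round(incidence(4, 17), 2))    # 23.53 - Risk for imbalanced body temperature
    print(round(prevalence(2, 17), 2))   # 11.76 - Risk for acute confusion
    print(prevalence(17, 17))            # 100.0 - Risk for impaired skin integrity
    ```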

  7. Net Generation at Social Software: Challenging Assumptions, Clarifying Relationships and Raising Implications for Learning

    Science.gov (United States)

    Valtonen, Teemu; Dillon, Patrick; Hacklin, Stina; Vaisanen, Pertti

    2010-01-01

    This paper takes as its starting point assumptions about use of information and communication technology (ICT) by people born after 1983, the so called net generation. The focus of the paper is on social networking. A questionnaire survey was carried out with 1070 students from schools in Eastern Finland. Data are presented on students' ICT-skills…

  8. Massive coordination of dispersed generation using PowerMatcher based software agents

    International Nuclear Information System (INIS)

    One of the outcomes of the EU Fifth Framework CRISP project (http://crisp.ecn.nl/) has been the development of a real-time control strategy based on the application of distributed intelligence (ICT) to coordinate demand and supply in electricity grids. This PowerMatcher approach has been validated in two real-life, real-time field tests. The experiments aimed at controlled coordination of dispersed electricity suppliers (DG-RES) and demanders in distributed grids enabled by ICT networks. Optimization objectives for the technology in the tests were minimization of imbalance in a commercial portfolio and mitigation of strong load variations in a distribution network with residential micro-CHPs. With respect to the number of ICT nodes, the field tests were on a relatively small scale. However, application of the technology yielded some very encouraging results on both occasions. In the present paper, lessons learned from the field experiments are discussed. Furthermore, it contains an account of the roadmap for scaling up these field tests with a larger number of nodes and with more diverse appliance/installation types. Due to its autonomous decision-making agent paradigm, the PowerMatcher software technology is expected to be far more scalable than central coordination approaches. Indeed, it is based on microeconomic theory and is expected to work best if applied on a massive scale in transparent market settings. A set of various types of supply and demand appliances was defined and implemented in a PowerMatcher software simulation environment. A massive number of these PowerMatcher node agents, each representing such a device type, was utilized in a number of scenario calculations. As the production of DG-RES resources and the demand profiles are strongly dependent on the time of year, climate scenarios leading to operational snapshots of the cluster were taken for a number of representative periods.
    The results of these larger-scale simulations, as well as the scalability issues encountered, are discussed. Further issues covered are the stability of the system as reflected by the internal price development pattern that acts as an 'invisible hand' to reach the common optimisation goal. Finally, the effects of scaling up the technology are discussed in terms of possible 'emergent behaviour' of subsets in the cluster and the primary process quality of appliances operating concertedly using the PowerMatcher
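    The microeconomic coordination described above rests on each device agent submitting a demand-versus-price bid and an auctioneer finding the price at which aggregate demand and supply balance. A minimal sketch of that clearing step (the device bids and function names are hypothetical illustrations, not the actual PowerMatcher API):

    ```python
    def clearing_price(bids, prices):
        """Return the candidate price at which aggregate power is closest
        to zero. Each bid maps price -> power (positive = consumption,
        negative = production) - the equilibrium-search idea behind
        market-based coordination."""
        return min(prices, key=lambda p: abs(sum(bid(p) for bid in bids)))

    # Hypothetical device agents: a flexible heater that backs off as the
    # price rises, and a micro-CHP that produces more at higher prices.
    heater = lambda p: max(0.0, 2.0 - p)   # kW consumed
    chp = lambda p: -min(1.5, p)           # kW produced (negative demand)

    prices = [i / 100 for i in range(301)]        # candidate prices 0.00-3.00
    print(clearing_price([heater, chp], prices))  # 1.0 - demand meets supply
    ```

    The internal price that emerges from this clearing acts as the 'invisible hand' mentioned in the abstract: no central controller schedules the devices, yet their aggregate behaviour moves toward the common objective.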

  9. Numerical Simulation of the Francis Turbine and CAD used to Optimized the Runner Design (2nd).

    Science.gov (United States)

    Sutikno, Priyono

    2010-06-01

    Hydro power is the most important renewable energy source on earth. The water is free of charge, and in the generation of electric energy in a hydroelectric power station the production of greenhouse gases (mainly CO2) is negligible. Hydro power generation stations are long-term installations that can be used for 50 years and more, so care must be taken to guarantee smooth and safe operation over the years. Maintenance is necessary, and critical parts of the machines have to be replaced if necessary. Within modern engineering, numerical flow simulation plays an important role in optimizing the hydraulic turbine in conjunction with the connected components of the plant. Especially for the rehabilitation and upgrading of existing power plants, important points of concern are predicting the power output of the turbine, achieving maximum hydraulic efficiency, avoiding or minimizing cavitation, and avoiding or minimizing vibrations over the whole operating range. Flow simulation can help to solve operational problems and to optimize the turbomachinery for hydroelectric generating stations or their components through intuitive optimization, mathematical optimization, parametric design, reduction of cavitation through design, prediction of the draft-tube vortex, and troubleshooting by simulation. Classic design by the graphic-analytical method is cumbersome and cannot make evident the positive or negative aspects of the design options, so replacing the classical design methods with an adequate design method using CAD software became a necessity. The many options chosen during the design calculations at a specific step can then be verified, from both an overall and a detailed point of view. The final graphic post-processing is performed only for the optimal solution, through a 3D representation of the runner as a whole for final approval of the geometric shape.
    In this article the redesign of the runner of a medium-head Francis-type hydraulic turbine was investigated, with given values for the most important parameter, the rated specific speed ns.

  10. Software tool for learning the generation of the cardioid curve in an autocad environment

    OpenAIRE

    Gómez-Elvira-González, Miguel Ángel; Rojas-Sola, José Ignacio; Carranza-Cañadas, María del Pilar

    2012-01-01

    This article presents a novel application developed in Visual LISP for the AutoCAD environment, which shows the generation of the cardioid curve intuitively and quickly in five different ways (as the conchoid of a circle, the pedal curve of a circle, the inverse of a parabola, the orthoptic curve of a circle, and an epicycloid of a circle). This cyclic curve has a large number of artistic and technical applications, among them the profile of some cams.
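    One of the five constructions mentioned, the epicycloid of a circle rolling on a fixed circle of equal radius, can be sketched outside AutoCAD as well. The parametrization below is the standard one (plain Python rather than the article's Visual LISP):

    ```python
    import math

    def cardioid_points(a=1.0, n=360):
        """Points of a cardioid generated as the epicycloid of a circle
        of radius a rolling on a fixed circle of the same radius."""
        pts = []
        for i in range(n + 1):
            t = 2 * math.pi * i / n
            # Rolling-circle parametrization of the epicycloid with equal radii:
            x = 2 * a * math.cos(t) - a * math.cos(2 * t)
            y = 2 * a * math.sin(t) - a * math.sin(2 * t)
            pts.append((x, y))
        return pts

    pts = cardioid_points()
    print(pts[0])   # (1.0, 0.0) - the cusp of the curve at (a, 0)
    ```

    The polyline spans from the cusp at (a, 0) to the far point at (-3a, 0), the familiar heart shape whose offset gives some cam profiles.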

  11. Development of new generation software tools for simulation of electron beam formation in novel high power gyrotrons

    International Nuclear Information System (INIS)

    Computer aided design (CAD) based on numerical experiments performed by using adequate physical models and efficient simulation codes is an indispensable tool for development, investigation, and optimization of gyrotrons used as radiation sources for electron cyclotron resonance heating (ECRH) of fusion plasmas. In this paper, we review briefly the state-of-the-art in the field of modelling and simulation of intense, relativistic, helical electron beams formed in the electron-optical systems (EOS) of powerful gyrotrons. We discuss both the limitations of the known computer codes and the requirements for increasing their capabilities for solution of various design problems that are being envisaged in the development of the next generation gyrotrons for ECRH. Moreover, we present the concept followed by us in an attempt to unite the advantages of the modern programming techniques with self-consistent, first-principles 3D physical models in the creation of a new highly efficient and versatile software package for simulation of powerful gyrotrons

  12. Multigrid preconditioning of steam generator two-phase mixture balance equations in the Genepi software

    OpenAIRE

    Belliard, Michel

    2006-01-01

    Within the framework of averaged two-phase mixture flow simulations of PWR Steam Generators (SG), this paper provides a geometric version of a pseudo-FMG FAS preconditioning of the balance equations used in the CEA Genepi code. The 3D steady-state flow is reached by a transient computation using a fractional step algorithm and a projection method. Our application is based on the PVM package. The difficulties of applying geometric FAS multigrid methods to the balance equations solver are addre...

  13. Report from the 2nd Summer School in Computational Biology organized by the Queen's University of Belfast

    Directory of Open Access Journals (Sweden)

    Frank Emmert-Streib

    2014-12-01

    Full Text Available In this paper, we present a meeting report for the 2nd Summer School in Computational Biology organized by the Queen's University of Belfast. We describe the organization of the summer school, its underlying concept and student feedback we received after the completion of the summer school.

  14. Give It a Shot! Toolkit for Nurses and Other Immunization Champions Working with Secondary Schools. 2nd Edition

    Science.gov (United States)

    Boyer-Chu, Lynda; Wooley, Susan F.

    2008-01-01

    Adolescent immunization saves lives--but promoting immunization takes time and thought, and today's nurses and other health advocates are faced with a host of ever-expanding responsibilities in a time of reduced budgets and staff. This toolkit is thus structured as an easy and reliable resource. This 2nd edition contains: (1) a 64-page manual;…

  15. Observation in a School without Walls: Peer Observation of Teaching in a 2nd-12th Grade Independent School

    Science.gov (United States)

    Salvador, Josephine

    2012-01-01

    What happens when teachers start to observe each other's classes? How do teachers make meaning of observing and being observed? What effects, if any, does requiring peer observation have on the teaching community? This research explores these questions in a qualitative study of peer observation of teaching (POT) in the 2nd-12th grades of an…

  16. Software framework for prognostic health monitoring of ocean-based power generation

    Science.gov (United States)

    Bowren, Mark

    On August 5, 2010, the U.S. Department of Energy (DOE) designated the Center for Ocean Energy Technology (COET) at Florida Atlantic University (FAU) as a national center for ocean energy research and the development of prototypes for open-ocean power generation. Maintenance of ocean-based machinery can be very costly. To avoid unnecessary maintenance it is necessary to monitor the condition of each machine in order to predict problems. This kind of prognostic health monitoring (PHM) requires a condition-based maintenance (CBM) system that supports diagnostic and prognostic analysis of large amounts of data. Research in this field led to the creation of ISO 13374 and the development of a standard open architecture for machine condition monitoring. This thesis explores an implementation of such a system for ocean-based machinery using this framework and current open-standard technologies.
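    The ISO 13374 open architecture referenced above organizes condition monitoring into successive processing blocks: data acquisition (DA), data manipulation (DM), state detection (SD), then health assessment, prognostics, and advisory generation. A minimal sketch of the first three stages (the function names, sample data, and threshold are illustrative, not taken from the COET software):

    ```python
    def data_acquisition():
        """DA: return one raw sensor sample (stubbed; e.g. vibration
        amplitudes from an ocean-current turbine bearing)."""
        return [0.9, 1.1, 1.0, 3.2]

    def data_manipulation(raw):
        """DM: simple feature extraction from the raw sample."""
        return {"mean": sum(raw) / len(raw), "peak": max(raw)}

    def state_detection(features, threshold=2.0):
        """SD: compare a feature against an operating limit."""
        return "alert" if features["peak"] > threshold else "normal"

    print(state_detection(data_manipulation(data_acquisition())))  # alert
    ```

    Keeping the blocks separate, as the standard prescribes, is what lets diagnostic and prognostic modules from different vendors interoperate in one CBM system.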

  17. The 2nd and 3rd lower molars development of in utero gamma irradiated mouse fetus and neonates

    International Nuclear Information System (INIS)

    Pregnant mothers were irradiated with a single dose of gamma rays (0, 2, 4, 6 Gy, cobalt-60) on days 10, 12, 14, 16 and 18 of pregnancy. The heads of the embryos, and those of the neonates, were taken at consecutive intervals after irradiation, starting from 16 days of pregnancy until the 3rd day after delivery. The effect of irradiation on the development of the 2nd and 3rd lower molars was investigated on serial tissue sections, within consecutive periods of their organogenesis. Irradiation led to growth deficiency in the 2nd and 3rd molars and caused delays in their development. This was observed to varying degrees depending on the dose, the time of irradiation, and the time after irradiation. The delayed development was manifested in morphogenesis, histogenesis, and the cytological and functional differentiation of odontoblasts and ameloblasts. The study showed that the delay in the development stages of the 2nd lower molar in controls, about two days behind the same process in the 1st lower molar, does not diminish the later irradiation effect on the 2nd molar when compared with the immediate irradiation effect on the 1st molar (demonstrated in a previous study by Osman and Al-Achkar, 2001). On the contrary, the present study showed that the 2nd lower molar is more radiosensitive to the various doses than the 1st lower molar. It also showed that irradiation with the two doses of 4 and 6 Gy leads to a delay in the formation of the 3rd lower molar's bud, which does not extend deeper beyond the lower molar. (Author)

  18. The Influence of Instructional Climates on Time Spent in Management Tasks and Physical Activity of 2nd-Grade Students during Physical Education

    Science.gov (United States)

    Logan, Samuel W.; Robinson, Leah E.; Webster, E. Kipling; Rudisill, Mary E.

    2015-01-01

    The purpose of this study is to determine the effect of two physical education (PE) instructional climates (mastery, performance) on the percentage of time students spent in a) moderate-to-vigorous physical activity (MVPA) and b) management tasks during PE in 2nd-grade students. Forty-eight 2nd graders (mastery, n = 23; performance, n = 25)…

  19. GONe: Software for estimating effective population size in species with generational overlap

    Science.gov (United States)

    Coombs, J.A.; Letcher, B.H.; Nislow, K.H.

    2012-01-01

    GONe is a user-friendly, Windows-based program for estimating effective size (Ne) in populations with overlapping generations. It uses the Jorde-Ryman modification to the temporal method to account for age structure in populations. This method requires estimates of age-specific survival and birth rate and allele frequencies measured in two or more consecutive cohorts. Allele frequencies are acquired by reading in genotypic data from files formatted for either GENEPOP or TEMPOFS. For each interval between consecutive cohorts, Ne is estimated at each locus and over all loci. Furthermore, Ne estimates are output for three different genetic drift estimators (Fs, Fc and Fk). Confidence intervals are derived from a chi-square distribution with degrees of freedom equal to the number of independent alleles. GONe has been validated over a wide range of Ne values, and for scenarios where survival and birth rates differ between sexes, sex ratios are unequal and reproductive variances differ. GONe is freely available for download at. © 2011 Blackwell Publishing Ltd.
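    The temporal method underlying GONe infers Ne from the drift-driven change in allele frequencies between samples separated in time. A minimal sketch of the classic, non-age-structured version using Nei and Tajima's Fc estimator (GONe's Jorde-Ryman correction for overlapping generations, and its Fs and Fk alternatives, are deliberately omitted here; the example frequencies are made up):

    ```python
    def f_c(x, y):
        """Nei-Tajima standardized variance in allele frequency between
        two temporal samples x and y (lists of allele frequencies for
        one locus)."""
        return sum((xi - yi) ** 2 / ((xi + yi) / 2 - xi * yi)
                   for xi, yi in zip(x, y)) / len(x)

    def ne_temporal(fc, t, s0, st):
        """Classic temporal-method Ne over t generations, correcting Fc
        for the finite sample sizes s0 and st of the two samples."""
        return t / (2 * (fc - 1 / (2 * s0) - 1 / (2 * st)))

    fc = f_c([0.5, 0.5], [0.6, 0.4])   # a biallelic locus sampled twice
    print(round(fc, 4))                 # 0.04
    print(round(ne_temporal(fc, t=1, s0=100, st=100), 1))   # 16.7
    ```

    The sampling-correction terms matter: with small samples, apparent frequency change is dominated by sampling noise rather than drift, which is why uncorrected Fc would badly underestimate Ne.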

  20. Mesocosm soil ecological risk assessment tool for GMO 2nd tier studies

    DEFF Research Database (Denmark)

    D'Annibale, Alessandra; Maraldo, Kristine

    Ecological Risk Assessment (ERA) of GMO is basically identical to ERA of chemical substances, when it comes to assessing specific effects of the GMO plant material on the soil ecosystem. The tiered approach always includes the option of studying more complex but still realistic ecosystem level effects in 2nd tier caged experimental systems, cf. the new GMO ERA guidance: EFSA Journal 2010; 8(11):1879. We propose to perform a trophic structure analysis, TSA, and include the trophic structure as an ecological endpoint to gain more direct insight into the change in interactions between species, i.e. the food-web structure, instead of relying only on the indirect evidence from population abundances. The approach was applied for effect assessment in the agro-ecosystem where we combined factors of elevated CO2, viz. global climate change, and GMO plant effects. A multi-species (Collembola, Acari and Enchytraeidae) mesocosm factorial experiment was set up in a greenhouse at ambient CO2 and 450 ppm CO2 with a GM barley variety and conventional varieties. The GM barley differed concerning the composition of amino acids in the grain (antisense C-hordein line). The fungicide carbendazim acted as a positive control. After 5 and 11 weeks, data on populations, plants and soil organic matter decomposition were evaluated. Natural abundances of stable isotopes, 13C and 15N, of animals, soil, plants and added organic matter (crushed maize leaves) were used to describe the soil food web structure.
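    The trophic structure analysis (TSA) proposed above typically infers each taxon's feeding position from its natural 15N abundance relative to a baseline resource. A generic sketch of that calculation (the ~3.4 per-mil enrichment per trophic step and the baseline level are standard literature assumptions, not values calibrated by this study):

    ```python
    def trophic_level(d15n_consumer, d15n_base, base_level=1.0, enrichment=3.4):
        """Estimate trophic position from delta-15N signatures, assuming a
        fixed per-step nitrogen enrichment (commonly ~3.4 per mil).
        Baseline here is a primary resource at trophic level 1."""
        return base_level + (d15n_consumer - d15n_base) / enrichment

    # Hypothetical signatures: added organic matter at 2.0 per mil,
    # a soil predator at 8.8 per mil -> about two steps above the base.
    print(round(trophic_level(8.8, 2.0), 2))   # 3.0
    ```

    Shifts in these inferred positions between treatments, rather than abundance counts alone, are what would reveal a change in food-web structure under the GMO or CO2 factors.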

  1. Proceedings of the 2nd CSNI Specialist Meeting on Simulators and Plant Analysers

    International Nuclear Information System (INIS)

    The safe utilisation of nuclear power plants requires the availability of different computerised tools for analysing the plant behaviour and training the plant personnel. These can be grouped into three categories: accident analysis codes, plant analysers and training simulators. The safety analysis of nuclear power plants has traditionally been limited to the worst accident cases expected for the specific plant design. Many accident analysis codes have been developed for different plant types. The scope of the analyses has continuously expanded. The plant analysers are now emerging tools intended for extensive analysis of the plant behaviour using a best estimate model for the whole plant including the reactor and full thermodynamic process, both combined with automation and electrical systems. The comprehensive model is also supported by good visualisation tools. Training simulators with real time plant model are tools for training the plant operators to run the plant. Modern training simulators have also features supporting visualisation of the important phenomena occurring in the plant during transients. The 2nd CSNI Specialist Meeting on Simulators and Plant Analysers in Espoo attracted some 90 participants from 17 countries. A total of 49 invited papers were presented in the meeting in addition to 7 simulator system demonstrations. Ample time was reserved for the presentations and informal discussions during the four meeting days. (orig.)

  2. Transient 2(nd) Degree Av Block Mobitz Type II: A Rare Finding in Dengue Haemorrhagic Fever.

    Science.gov (United States)

    Nigam, Ashwini Kumar; Singh, Omkar; Agarwal, Ayush; Singh, Amit K; Yadav, Subhash

    2015-05-01

    Dengue has been a major problem, with epidemics occurring almost every year and causing a state of panic owing to the lack of proper diagnostic methods and facilities for proper management. Patients presenting with classical symptoms are easy to diagnose; however, among the large number of cases occurring every year, some patients diagnosed with dengue fever present with atypical manifestations, which lead to extensive evaluation and unnecessary referral to higher centres irrespective of severity. A rough idea of these manifestations must therefore be kept in mind in order to prevent these problems. Involvement of the cardiovascular system in dengue has been reported in previous studies; such involvement is usually benign and self-limited. The study of conduction abnormalities is important because conduction blocks are sometimes the first sign of acute myocarditis in patients with Dengue Haemorrhagic Fever in shock. We present here a case of 2(nd) degree Mobitz type II atrioventricular (AV) block in a case of Dengue Haemorrhagic Fever, reverting to normal rhythm in the recovery phase with no signs thereafter on follow-up. PMID:26155512

  3. DRS // CUMULUS Oslo 2013. The 2nd International Conference for Design Education Researchers

    Directory of Open Access Journals (Sweden)

    Liv Merete Nielsen

    2013-01-01

    Full Text Available 14-17 May 2013, Oslo, Norway. We have received more than 200 full papers for the 2nd International Conference for Design Education Researchers in Oslo. This international conference is a springboard for sharing ideas and concepts about contemporary design education research. Contributors are invited to submit research that deals with different facets of contemporary approaches to design education research. All papers will be double-blind peer-reviewed. This conference is open to research in any aspect and discipline of design education. Conference theme: Design Learning for Tomorrow - Design Education from Kindergarten to PhD. Designed artefacts and solutions influence our lives and values, both from a personal and societal perspective. Designers, decision makers, investors and consumers hold different positions in the design process, but they all make choices that will influence our future visual and material culture. To promote sustainability and meet global challenges for the future, professional designers are dependent on critical consumers and a design literate general public. For this purpose design education is important for all. We propose that design education in general education represents both a foundation for professional design education and a vital requirement for developing the general public's competence for informed decision making. REGISTRATION AT http://www.hioa.no/DRScumulus

  4. Production of artificial ionospheric layers by frequency sweeping near the 2nd gyroharmonic

    Directory of Open Access Journals (Sweden)

    T. Pedersen

    2011-01-01

    Full Text Available Artificial ionospheric plasmas descending from the background F-region have been observed on multiple occasions at the High Frequency Active Auroral Research Program (HAARP) facility since it reached full 3.6 MW power. Proximity of the transmitter frequency to the 2nd harmonic of the electron gyrofrequency (2fce) has been noted as a requirement for their occurrence, and their disappearance after only a few minutes has been attributed to the increasing frequency mismatch at lower altitudes. We report new experiments employing frequency sweeps to match 2fce in the artificial plasmas as they descend. In addition to revealing the dependence on the 2fce resonance, this technique reliably produces descending plasmas in multiple transmitter beam positions and appears to increase their stability and lifetime. High-speed ionosonde measurements are used to monitor the altitude and density of the artificial plasmas during both the formation and decay stages.
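    The resonance condition behind these experiments is straightforward arithmetic: the electron gyrofrequency is f_ce = eB/(2πm_e), and the heater must sit near its 2nd harmonic. A minimal sketch, assuming a hypothetical high-latitude field strength of B ≈ 5.2×10⁻⁵ T (the real value varies with altitude and location, which is exactly why a fixed heater frequency falls out of resonance as the layer descends):

```python
import math

E_CHARGE = 1.602176634e-19     # electron charge, C
M_ELECTRON = 9.1093837015e-31  # electron mass, kg

def electron_gyrofrequency(b_tesla: float) -> float:
    """Electron cyclotron frequency f_ce = e*B / (2*pi*m_e), in Hz."""
    return E_CHARGE * b_tesla / (2.0 * math.pi * M_ELECTRON)

# Assumed geomagnetic field strength in the heated volume (illustrative only)
B = 5.2e-5  # tesla

f_ce = electron_gyrofrequency(B)
print(f"f_ce  = {f_ce / 1e6:.2f} MHz")      # fundamental
print(f"2f_ce = {2 * f_ce / 1e6:.2f} MHz")  # 2nd harmonic the heater must match
```

    With this assumed field, 2f_ce lands near 2.9 MHz; since the geomagnetic field strengthens at lower altitudes, 2f_ce rises as the artificial layer descends, motivating the frequency sweep described in the abstract.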

  5. Academic Training - 2nd Term: 08.01.2007 - 31.03.2007

    CERN Multimedia

    2006-01-01

    2006 - 2007 ACADEMIC TRAINING PROGRAMME 2nd Term : 08.01.2007 - 31.03.2007 LECTURE SERIES Applied Superconductivity by V. Palmieri, INFN, Padova, It. 17, 18, 19 January 11:00-12:00 - Auditorium, Bldg 500 String Theory for Pedestrians by B. Zwiebach, M.I.T. Cambridge, USA 29, 30, 31 January 11:00-12:00 - Auditorium, Bldg 500 on 29, 30 January TH Auditorium on 31 January Introduction to Supersymmetry by D. Kaplan, Johns Hopkins University, Baltimore, USA 12, 13, 14, 15 February 11:00-12:00 - Auditorium, Bldg 500 The Hunt for the Higgs Particle by F. Zwirner, University of Padova, It 27, 28 February, 1st March 11:00-12:00 - Auditorium, Bldg 500 From Evolution Theory to Parallel and Distributed Genetic Programming by F. Fernandez de Vega 15, 16, March 11:00-12:00 - Auditorium, Bldg 500 The lectures are open to all those interested, without application. The abstract of the lectures, as well as any change to the above information (title, dates, time, place etc.) will be published in the CERN bulletin, the WWW, an...

  7. Study of Application for Excursion Observation Method in Primary School 2nd Grade Social Studies

    Directory of Open Access Journals (Sweden)

    Ahmet Ali GAZEL

    2014-04-01

    Full Text Available This study aims to investigate how field trips are conducted in the 2nd grade of primary schools as part of the social studies course. Data for this research were compiled from 143 permanent Social Studies teachers working during the 2011–2012 academic year in the primary schools of central Kütahya and its districts, using the descriptive survey model. After taking expert opinions, a measuring tool developed by the researcher was used. Data obtained from the research were transferred to computer and analysed. In the analysis of the data, frequency and percentage values were used to determine the distribution; single-factor variance analysis and t-tests for independent samples were used to determine the significance of differences between the variables. As a result of the research, it was found that insufficient importance is given to the field trip method in Social Studies lessons. Most of the teachers using this method apply it in the spring months. Teachers usually make use of field trips independent of unit/topic to increase students’ motivation, and they generally use verbal expression in class after tours. The biggest difficulty teachers encounter while using the tour-observation method is students’ undisciplined behavior.

  8. PREFACE: 2nd International Conference on Competitive Materials and Technological Processes (IC-CMTP2)

    Science.gov (United States)

    László, Gömze A.

    2013-12-01

    Competitiveness is one of the most important factors in our life, and it plays a key role in the efficiency of both organizations and societies. The more scientifically supported and prepared organizations develop more competitive materials with better physical, chemical and biological properties, and the leading companies apply more competitive equipment and technology processes. The aims of the 2nd International Conference on Competitive Materials and Technology Processes (ic-cmtp2) are the following: promote new methods and results of scientific research in the fields of material, biological, environmental and technology sciences; exchange information between the theoretical and applied sciences as well as technical and technological implementations; promote communication between scientists of different nations, countries and continents. Among the major fields of interest are materials with extreme physical, chemical, biological, medical, thermal and mechanical properties and dynamic strength, including their crystalline and nano-structures and phase transformations, as well as methods of their technological processing, tests and measurements. Multidisciplinary applications of materials science and technological problems encountered in sectors like ceramics, glasses, thin films, aerospace, automotive and marine industry, electronics, energy, construction materials, medicine, biosciences and environmental sciences are of particular interest. In accordance with the programme of the conference ic-cmtp2, more than 250 inquiries and registrations from different organizations were received. Researchers from 36 countries in Asia, Europe, Africa, North and South America arrived at the venue of the conference. Including co-authors, the research work of more than 500 scientists is presented in this volume. Professor Dr Gömze A László Chair, ic-cmtp2 The PDF also contains lists of the boards, session chairs and sponsors.

  9. La sorpresiva congruencia democrática del 2 de diciembre / The Surprising Democratic Congruence of December 2nd

    Scientific Electronic Library Online (English)

    Pedro, Nikken.

    2008-08-01

    Full Text Available El artículo comienza por subrayar, y valorar positivamente, que en la Venezuela polarizada de hoy se hayan podido procesar democráticamente y sin violencia los resultados del referendo sobre la reforma constitucional. Pasa luego a evaluar las condiciones políticas imperantes en Venezuela luego de la abrumadora victoria electoral del Presidente en diciembre de 2006. Entre esas condiciones destacan los llamados “cinco motores de la revolución”, siendo uno de ellos el de “la reforma constitucional”. A continuación se señalan los que el autor considera los contenidos más resaltantes de la propuesta de reforma original del Presidente. Evalúa las razones de los resultados electorales desfavorables a la propuesta de reforma, considerando los contenidos mismos de la propuesta, las debilidades del sector oficialista para ese debate y las fortalezas del sector opositor. Concluye el artículo presentando las principales consecuencias de los resultados electorales del 2 de diciembre para la realidad sociopolítica venezolana. Abstract in english The article begins by positively underlining the fact that the results of the referendum on constitutional reform could be processed democratically and without violence in today's polarized Venezuela. It then evaluates the political conditions prevailing in Venezuela after the overwhelming electoral victory of the President in December 2006. Among these conditions, “la reforma constitucional” (the constitutional reform) stands out as one of “los cinco motores de la revolución” (the five engines of the revolution). The author then points out the contents he considers most relevant in the original reform proposal presented by the President. The reasons for the electoral results unfavourable to the reform proposal are evaluated, taking into consideration the contents of the proposal itself, the weaknesses of the government sector in that debate and the strengths of the opposition. The article concludes by presenting the principal consequences of the electoral results of December 2nd for Venezuelan socio-political reality.

  10. Archaeometric study of glass beads from the 2nd century BC cemetery of Numantia

    Directory of Open Access Journals (Sweden)

    García Heras, Manuel

    2003-06-01

    Full Text Available Recent archaeological fieldwork undertaken in the Celtiberian cremation necropolis of Numantia (Soria, Spain) has provided a group of glass beads from the 2nd century BC. The glass beads were part, together with other metallic and ceramic items, of the offerings deposited with the dead. They are ring-shaped in typology and deep blue, amber, or semi-transparent white in colour. This paper reports results derived from the chemical and microstructural characterization carried out on a representative sample set of this group of beads. The main goal of the research was to find out about their production technology and to explore their probable provenance. In addition, corrosion mechanisms were also assessed to determine the influence of cremation on the beads' structure. The resulting data suggest that the blue and amber beads were made using soda-lime silicate glass, whereas the semi-transparent white ones were manufactured from alumino-silicate glass. It was also determined that some transition metal oxides were used as chromophores, as well as lead oxide for decoration.

    La reciente excavación de la necrópolis celtibérica de Numancia (Garray, Soria) ha permitido recuperar un conjunto de cuentas de vidrio del siglo II a.C. Las cuentas, junto con otros objetos de metal y cerámica, formaban parte de las ofrendas depositadas con el difunto, siendo de tipología anular y coloreadas en azul oscuro, ámbar y blanco semitransparente. Este trabajo presenta los resultados obtenidos en la caracterización química y microestructural de una muestra representativa de este conjunto. El objetivo principal de la investigación consistió en recabar información sobre su tecnología de manufactura y evaluar su posible procedencia. Asimismo, también se investigaron sus mecanismos de corrosión para determinar si la cremación había inducido cambios en su estructura. Los resultados indican que las cuentas azules y ámbar se realizaron con vidrio de silicato sódico cálcico y las blancas semitransparentes con vidrio de aluminosilicato, utilizando óxidos de metales de transición como cromóforos y óxido de plomo para la decoración.

  11. THINKLET: ELEMENTO CLAVE EN LA GENERACIÓN DE MÉTODOS COLABORATIVOS PARA EVALUAR USABILIDAD DE SOFTWARE / THINKLET: KEY ELEMENT IN THE COLLABORATIVE METHODS GENERATION FOR EVALUATE SOFTWARE USABILITY

    Scientific Electronic Library Online (English)

    Andrés, Solano Alegría; Yenny, Méndez Alegría; César, Collazos Ordóñez.

    2010-07-01

    Full Text Available En la actualidad, la usabilidad es un atributo fundamental para el éxito de un producto software. La competitividad entre organizaciones obliga a mejorar el nivel de usabilidad de los productos, debido al riesgo que existe de perder clientes si el producto no es fácil de usar y/o fácil de aprender. Aunque se han establecido métodos para evaluar la usabilidad de productos software, la mayoría de estos métodos no consideran la posibilidad de involucrar a varias personas trabajando de forma colaborativa en el proceso de evaluación. Por esta razón, convendría utilizar la Metodología para el Diseño de Métodos de Evaluación de Usabilidad Colaborativos, de tal forma que se diseñen métodos que permitan a varias personas de diversas áreas de conocimiento trabajar de forma colaborativa en el proceso de evaluación. Este artículo presenta de forma general la metodología mencionada y hace especial énfasis en los thinklets como elementos clave para el diseño de procesos colaborativos. Abstract in english Currently, usability is a critical attribute for the success of software. Competition among organizations forces them to improve the usability of their products, due to the risk of losing customers if a product is not easy to use and/or easy to learn. Methods have been established to evaluate the usability of software products; however, most of these methods do not take into account the possibility of involving several people working collaboratively in the evaluation process. Therefore, the Methodology for the Design of Collaborative Usability Evaluation Methods should be used to design methods that allow several people from a range of knowledge areas to work collaboratively in the evaluation process. This paper presents the methodology mentioned, with special emphasis on thinklets as key elements for the design of collaborative processes.

  12. Colourful, Courageous and Community-building : - Reflections from the organizer of the 2nd Nordic STS Conference

    DEFF Research Database (Denmark)

    Jensen, Torben Elgaard

    2015-01-01

    The 2nd Nordic STS conference, held in Copenhagen in 2015, was an occasion to take stock of the current trends and developments of Nordic STS. In this paper, the leading organizer reflects on the event and characterises contemporary Nordic STS as colourful (spanning a wide range of perspectives and empirical topics), courageous (critical, reflexive, but also willing to take on collaborative roles), and community-building (sharing commitment to a number of topics and issues). Speculating on future developments, he suggests that Nordic STS will receive impetus for change and transformation from at least four sources: (1) The steady stream of new controversial scientific and technological developments. (2) The increasing commitment to gather interdisciplinary collaboration and research funding around grand societal challenges. (3) Methodological developments within STS. (4) The increasing number of collaborative roles that STS, as a mature discipline, will be invited to take up.

  13. Proceedings of the 2nd international advisory committee on biomolecular dynamics instrument DNA in MLF at J-PARC

    International Nuclear Information System (INIS)

    The 2nd International Advisory Committee on the 'Biomolecular Dynamics Backscattering Spectrometer DNA' was held on November 12th - 13th, 2008 at the J-PARC Center, Japan Atomic Energy Agency. The IAC was organized with the aim of realizing an innovative neutron backscattering instrument in the Materials and Life Science Experimental Facility (MLF) at J-PARC, and therefore four leading scientists in the field of neutron backscattering instruments were selected as members (Dr. Dan Neumann (Chair); Prof. Ferenc Mezei; Dr. Hannu Mutka; Dr. Philip Tregenna-Piggott). The 1st IAC had been held on February 27th - 29th, 2008. This report includes the executive summary and materials of the presentations at the 2nd IAC. (author)

  14. Report on the 2nd International Consortium on Hallucination Research: Evolving Directions and Top-10 “Hot Spots” in Hallucination Research

    OpenAIRE

    Waters, Flavie; Woods, Angela; Fernyhough, Charles

    2013-01-01

    This article presents a report on the 2nd meeting of the International Consortium on Hallucination Research, held on September 12th and 13th 2013 at Durham University, UK. Twelve working groups involving specialists in each area presented their findings and sought to summarize the available knowledge, inconsistencies in the field, and ways to progress. The 12 working groups reported on the following domains of investigation: cortical organisation of hallucinations, nonclinical hallucinations,...

  15. What do 2nd and 10th Graders Have in Common? Worms and Technology: Using Technology to Collaborate Across Boundaries

    OpenAIRE

    Patti Culver; Angie Culbert; Judy McEntyre; Patrick Clifton; Donna F. Herring; Charles E. Notar

    2009-01-01

    The article is about the collaboration between two classrooms that enabled a second grade class to participate in a high school biology class. Through the use of modern video conferencing equipment, Mrs. Culbert, with the help of the Dalton State College Educational Technology Training Center (ETTC), set up a live, two-way video and audio feed of the lab, across town, to Mrs. Patty Culver’s 2nd grade classroom.

  16. Influence of socio-cultural environment on development of childrens musical talents in 2nd trienium of primary school

    OpenAIRE

    Antolin, Petra

    2014-01-01

    The thesis examines the impact of the socio-cultural environment on the development of musical talent among pupils in the 2nd triennium of primary school. The thesis begins with a closer look at the definition of giftedness, which may be general or specific (partial). Specific giftedness means that children achieve above-average results in one area only, while generally gifted children achieve above-average results in multiple different areas. The characteristics of gifted pupils which distinguish th...

  17. Temperature dependent 2nd derivative absorbance spectroscopy of aromatic amino acids as a probe of protein dynamics

    OpenAIRE

    Esfandiary, Reza; Hunjan, Jagtar S.; Lushington, Gerald H; Joshi, Sangeeta B.; Middaugh, C. Russell

    2009-01-01

    Proteins display a broad peak in the 250–300 nm region of their UV spectrum containing multiple overlapping bands arising from the aromatic rings of phenylalanine, tyrosine, and tryptophan residues. Employing high-resolution 2nd derivative absorbance spectroscopy, these overlapping absorption bands can be highly resolved and therefore provide a very sensitive measure of changes in the local microenvironment of the aromatic side chains. This has traditionally been used to detect both subtle and dr...
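    The band-sharpening effect exploited here can be demonstrated numerically: the 2nd derivative of a smooth absorbance band is sharply negative at the band centre, so bands that overlap in the raw spectrum separate into distinct derivative minima. A toy sketch with a central finite difference (band positions, widths and amplitudes are invented for the illustration, loosely mimicking two aromatic bands near 275 and 283 nm):

```python
import math

def gaussian(x: float, center: float, width: float) -> float:
    """A unit-height Gaussian band, standing in for an absorbance peak."""
    return math.exp(-((x - center) / width) ** 2)

# Synthetic spectrum on a 250-300 nm grid: two heavily overlapping bands
# (hypothetical positions/widths chosen only for this demonstration)
step = 0.1
xs = [250.0 + step * i for i in range(501)]
spectrum = [gaussian(x, 275.0, 4.0) + 0.8 * gaussian(x, 283.0, 4.0) for x in xs]

def second_derivative(ys, h):
    """Central finite-difference 2nd derivative; endpoints left at zero."""
    d2 = [0.0] * len(ys)
    for i in range(1, len(ys) - 1):
        d2[i] = (ys[i - 1] - 2.0 * ys[i] + ys[i + 1]) / (h * h)
    return d2

d2 = second_derivative(spectrum, step)

# Each band centre shows up as a clearly negative local minimum of d2
band_positions = [xs[i] for i in range(1, len(d2) - 1)
                  if d2[i] < d2[i - 1] and d2[i] < d2[i + 1] and d2[i] < -0.01]
print(band_positions)
```

    Printing band_positions yields two values close to 275 and 283 nm, recovering both band centres despite the overlap; real spectra would additionally need smoothing (e.g. Savitzky-Golay) before differentiation to control noise amplification.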

  18. 2nd International Salzburg Conference on Neurorecovery (ISCN 2013) Salzburg/ Austria | November 28th - 29th, 2013

    OpenAIRE

    Brainin, M.; Muresanu, D; Slavoaca, D

    2014-01-01

    The 2nd International Salzburg Conference on Neurorecovery was held on the 28th and 29th of November, 2013, in Salzburg, one of the most beautiful cities in Austria, which is well known for its rich cultural heritage, world-famous music and beautiful surrounding landscapes. The aim of the conference was to discuss the progress in the field of neurorecovery. The conference brought together internationally renowned scientists and clinicians, who described the clinical and therapeutic relevance ...

  19. A Study of Performance and Effort Expectancy Factors among Generational and Gender Groups to Predict Enterprise Social Software Technology Adoption

    Science.gov (United States)

    Patel, Sunil S.

    2013-01-01

    Social software technology has gained considerable popularity over the last decade and has had a great impact on hundreds of millions of people across the globe. Businesses have also expressed their interest in leveraging its use in business contexts. As a result, software vendors and business consumers have invested billions of dollars to use…

  20. Public Health Genomics European Network: Report from the 2nd Network Meeting in Rome

    Directory of Open Access Journals (Sweden)

    Nicole Rosenkötter

    2007-03-01

    Full Text Available

    Dear Sirs,

    The Public Health Genomics European Network (PHGEN) is a mapping exercise for the responsible and effective integration of genome-based knowledge and technologies into public policy and health services for the benefit of population health. In 2005, the European Commission called for a “networking exercise…to lead to an inventory report on genetic determinants relevant for public health”[1]; this led to the funding of the three-year PHGEN project (EC project 2005313). The project started in early 2006 with a kick-off meeting in Bielefeld, Germany. The project work comprises, in line with the public health triad, three one-year periods of assessment, policy development and assurance. At the end of the assessment phase a network meeting was held in Rome from January 31st to February 2nd 2007, with over 90 network members and network observers in attendance. The participants represented different organisations throughout the European Union with expertise in areas such as human genetics and other medical disciplines, epidemiology, public health, law, ethics, political and social sciences. The aim of the meeting was to wrap up the past year’s assessment period and to herald the policy development phase. The assessment period of PHGEN was characterised by several activities: - Contact and cooperation with other European and internationally funded networks and projects on public health genomics or related issues (e.g. EuroGenetest, EUnetHTA, Orphanet, IPTS, PHOEBE, GRaPHInt, P3G) - Identification of key experts in public health genomics in the European member states, applicant countries and EFTA/EEA countries from different disciplines (e.g. human genetics and other medical disciplines, public health, law, philosophy, epidemiology, political and social sciences) - Building up national task forces on public health genomics in the above-mentioned countries - Establishing and working in three working groups: public health genomics definitions, genetic exceptionalism, and public health genomics issues and priorities - Participation in the development process of OECD and European Council documents on genetic testing - Dissemination of results in journals, on websites and at conferences.

  1. White Paper Summary of 2nd ASTM International Workshop on Hydrides in Zirconium Alloy Cladding

    Energy Technology Data Exchange (ETDEWEB)

    Sindelar, R. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Louthan, M. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); PNNL, B. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2015-05-29

    This white paper recommends that ASTM International develop standards to address the potential impact of hydrides on the long term performance of irradiated zirconium alloys. The need for such standards was apparent during the 2nd ASTM International Workshop on Hydrides in Zirconium Alloy Cladding and Assembly Components, sponsored by ASTM International Committee C26.13 and held on June 10-12, 2014, in Jackson, Wyoming. The potentially adverse impacts of hydrogen and hydrides on the long term performance of irradiated zirconium-alloy cladding on used fuel were shown to depend on multiple factors such as alloy chemistry and processing, irradiation and post irradiation history, residual and applied stresses and stress states, and the service environment. These factors determine the hydrogen content and hydride morphology in the alloy, which, in turn, influence the response of the alloy to the thermo-mechanical conditions imposed (and anticipated) during storage, transport and disposal of used nuclear fuel. Workshop presentations and discussions showed that although hydrogen/hydride induced degradation of zirconium alloys may be of concern, the potential for occurrence and the extent of anticipated degradation vary throughout the nuclear industry because of the variations in hydrogen content, hydride morphology, alloy chemistry and irradiation conditions. The tools and techniques used to characterize hydrides and hydride morphologies and their impacts on material performance also vary. Such variations make site-to-site comparisons of test results and observations difficult. There is no consensus that a single material or system characteristic (e.g., reactor type, burnup, hydrogen content, end-of life stress, alloy type, drying temperature, etc.) is an effective predictor of material response during long term storage or of performance after long term storage. 
Multi-variable correlations made for one alloy may not represent the behavior of another alloy exposed to identical conditions and the material responses to thermo-mechanical exposures will be different depending on the materials and systems used. The discussions at the workshop showed several gaps in the standardization of processes and techniques necessary to assess the long term performance of irradiated zirconium alloy cladding during dry storage and transport. The development of, and adherence to, standards to help bridge these gaps will strengthen the technical basis for long term storage and post-storage operations, provide consistency across the nuclear industry, maximize the value of most observations, and enhance the understanding of behavioral differences among alloys. The need for, and potential benefits of, developing the recommended standards are illustrated in the various sections of this report.

  2. Open3DGRID : An open-source software aimed at high-throughput generation of molecular interaction fields (MIFs)

    DEFF Research Database (Denmark)

    Tosco, Paolo; Balle, Thomas

    Description Open3DGRID is open-source software aimed at high-throughput generation of molecular interaction fields (MIFs). Open3DGRID can generate steric potential, electron density and MM/QM electrostatic potential fields; furthermore, it can import GRIDKONT binary files produced by GRID and CoMFA/CoMSIA fields (exported from SYBYL with the aid of a small SPL script). High computational performance is attained through implementation of parallelized algorithms for MIF generation. The most prominent features in Open3DGRID include: •Seamless integration with OpenBabel, PyMOL, GAUSSIAN, FIREFLY, GAMESS-US, TURBOMOLE, MOLDEN, Molecular Discovery GRID •Multi-threaded computation of MIFs (both MM and QM); support for MMFF94 and GAFF force-fields with automated assignment of atom types to the imported molecular structures •Human- and machine-readable text output, integrated with 3D maps in several formats to allow visualization of results in PyMOL, MOE, Maestro and SYBYL •User-friendly interface to all major QM packages (e.g. GAUSSIAN, FIREFLY, GAMESS-US, TURBOMOLE, MOLDEN), which allows calculation of QM electron density and electrostatic potential 3D maps from within Open3DGRID •User-friendly interface to Molecular Discovery GRID to compute GRID MIFs from within Open3DGRID Open3DGRID is controlled through a command-line interface; commands can be either entered interactively from a command prompt or read from a batch script. If PyMOL is installed on the system while Open3DGRID is being operated interactively, the setup of 3D grid computations can be followed in real time on PyMOL's viewport, allowing grid size and training/test set composition to be tweaked very easily. The main output is arranged as human-readable plain ASCII text, while a number of additional files are generated to store data and to export the results of computations for further analysis and visualization with third-party tools.
In particular, Open3DGRID can export 3D maps for visualization in PyMOL, MOE, Maestro and SYBYL. Open3DGRID is written in C; while pre-built binaries are available for mainstream operating systems (Windows 32/64-bit, Linux 32/64-bit, Solaris x86 32/64-bit, FreeBSD 32/64-bit, Intel Mac OS X 32/64-bit), source code is portable and can be compiled under any *NIX platform supporting POSIX threads. The modular nature of the code allows for easy implementation of new features, so that the core application can be customized to meet individual needs. A detailed ChangeLog is kept to keep track of the additions and modifications during Open3DGRID's development.

  3. Herramienta software para el análisis de canasta de mercado sin selección de candidatos / Software tool for analysing the family shopping basket without candidate generation

    Scientific Electronic Library Online (English)

    Roberto Carlos, Naranjo Cuervo; Luz Marina, Sierra Martínez.

    2009-04-01

    Full Text Available Actualmente en el entorno del comercio electrónico es necesario contar con herramientas que permitan obtener conocimiento útil que brinde soporte a la toma de decisiones de marketing; para ello se necesita de un proceso que utiliza una serie de técnicas para el procesamiento de los datos, entre ellas se encuentra la minería de datos, que permite llevar a cabo un proceso de descubrimiento de información automático. Este trabajo tiene como objetivo presentar la técnica de reglas de asociación como la adecuada para descubrir cómo compran los clientes en una empresa que ofrece un servicio de comercio electrónico tipo B2C, con el fin de apoyar la toma de decisiones para desarrollar ofertas hacia sus clientes o cautivar nuevos. Para la implementación de las reglas de asociación existe una variedad de algoritmos como: Apriori, DHP, Partition, FP-Growth y Eclat, y para seleccionar el más adecuado se define una serie de criterios (Danger y Berlanga, 2001), entre los que se encuentran: inserciones a la base de datos, costo computacional, tiempo de ejecución y rendimiento, los cuales se analizaron en cada algoritmo para realizar la selección. Además, se presenta el desarrollo de una herramienta software que contempla la metodología CRISP-DM, constituida por cuatro submódulos: preprocesamiento de datos, minería de datos, análisis de resultados y aplicación de resultados. El diseño de la aplicación utiliza una arquitectura de tres capas: lógica de presentación, lógica del negocio y lógica de servicios; dentro del proceso de construcción de la herramienta se incluye el diseño de la bodega de datos y el diseño del algoritmo como parte de la herramienta de minería de datos. Las pruebas hechas a la herramienta de minería de datos desarrollada se realizaron con una base de datos de la compañía FoodMart3; estas pruebas fueron de rendimiento, funcionalidad y confiabilidad de resultados, las cuales permitieron encontrar reglas de asociación. Los resultados obtenidos facilitaron concluir, entre otros aspectos, que las reglas de asociación como técnica de minería de datos permiten analizar volúmenes de datos para servicios de comercio electrónico tipo B2C, lo cual es una ventaja competitiva para las empresas. Abstract in english Tools that yield useful knowledge to support marketing decisions are currently needed in the e-commerce environment. A process using a series of data-processing techniques is needed for this; data-mining is one such technique, enabling automatic information discovery. This work presents association rules as a suitable technique for discovering how customers buy from a company offering business-to-consumer (B2C) e-business, aimed at supporting decision-making in supplying its customers or capturing new ones. Many algorithms such as Apriori, DHP, Partition, FP-Growth and Eclat are available for implementing association rules; the following criteria were defined for selecting the appropriate algorithm: database inserts, computational cost, performance and execution time. The development of a software tool is also presented, which followed the CRISP-DM approach; the tool was formed by four sub-modules: data pre-processing, data-mining, results analysis and results application. The application design used a three-layer architecture: presentation logic, business logic and service logic. Data warehouse design and algorithm design were included in developing this data-mining tool. It was tested using a FoodMart3 company database; the tests covered performance, functionality and results’ validity, thereby allowing association rules to be found. The results led to the conclusion that using association rules as a data-mining technique facilitates analysing volumes of information for B2C e-business services, which represents a competitive advantage for those companies using the Internet as their sales medium.
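    The support/confidence machinery shared by all the algorithms this record compares (Apriori, DHP, Partition, FP-Growth, Eclat) can be shown on toy data with a brute-force enumeration. The transactions and thresholds below are invented for the example; the exhaustive search over all itemsets is precisely what Apriori's pruning and FP-Growth's candidate-free tree traversal exist to avoid on real B2C volumes:

```python
from itertools import combinations

# Hypothetical shopping baskets (toy data for the example)
transactions = [
    {"bread", "milk"},
    {"bread", "diapers", "beer", "eggs"},
    {"milk", "diapers", "beer", "cola"},
    {"bread", "milk", "diapers", "beer"},
    {"bread", "milk", "diapers", "cola"},
]
MIN_SUPPORT = 0.6     # minimum fraction of baskets containing the itemset
MIN_CONFIDENCE = 0.7  # minimum P(consequent | antecedent)

def support(itemset):
    """Fraction of transactions containing every item of the itemset."""
    return sum(itemset <= t for t in transactions) / len(transactions)

# Frequent itemsets by exhaustive enumeration -- fine for six items,
# exponential in general (hence the specialised algorithms)
items = sorted(set().union(*transactions))
frequent = [frozenset(c)
            for n in range(1, len(items) + 1)
            for c in combinations(items, n)
            if support(frozenset(c)) >= MIN_SUPPORT]

# Derive rules A -> B with confidence = support(A ∪ B) / support(A)
rules = []
for itemset in (f for f in frequent if len(f) > 1):
    for n in range(1, len(itemset)):
        for ante in map(frozenset, combinations(sorted(itemset), n)):
            conf = support(itemset) / support(ante)
            if conf >= MIN_CONFIDENCE:
                rules.append((tuple(sorted(ante)),
                              tuple(sorted(itemset - ante)), conf))

for a, b, conf in rules:
    print(f"{a} -> {b}  (conf {conf:.2f})")
```

    On this basket set the sketch prints rules such as ('beer',) -> ('diapers',) with confidence 1.00; at scale, an FP-Growth implementation (the "without candidate generation" approach of the record's title) replaces the enumeration with a compressed prefix tree.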

  4. GENERACIÓN AUTOMÁTICA DE APLICACIONES SOFTWARE A PARTIR DEL ESTANDAR MDA BASÁNDOSE EN LA METODOLOGÍA DE SISTEMAS EXPERTOS E INTELIGENCIA ARTIFICIAL / AUTOMATIC GENERATION OF SOFTWARE APPLICATIONS FROM THE MDA STANDARD BASED ON THE METHODOLOGY OF EXPERT SYSTEMS AND ARTIFICIAL INTELLIGENCE

    Directory of Open Access Journals (Sweden)

    IVÁN MAURICIO RUEDA CÁCERES

    2011-04-01

    Full Text Available Many studies have been presented on the automatic generation of lines of code. This article presents a solution to the limitations of the well-known MDA standard, making use of technological advances in artificial intelligence and expert systems. It covers the principles of the MDA framework, transforming the models it uses and adding features to them that make this working methodology more efficient. The proposed model covers the phases of the software life cycle, following the business rules that are an essential part of any real software project. It is through the business rules that the transformation of the MDA standard begins, and the contribution aims to automate the business rules so that they serve to define applications throughout the life cycle that generates them.

  5. Free Open Source Software: FOSS Based GIS for Spatial Retrievals of Appropriate Locations for Ocean Energy Utilizing Electric Power Generation Plants

    OpenAIRE

    Kohei Arai

    2012-01-01

    Free Open Source Software: FOSS based Geographic Information System: GIS for spatial retrievals of appropriate locations for ocean wind and tidal motion utilizing electric power generation plants is proposed. Using scatterometers onboard earth observation satellites, strong-wind coastal areas are retrieved with the FOSS/GIS stack of PostgreSQL/PostGIS. PostGIS has to be modified together with the altimeter and scatterometer database. These modifications and the database creation would be a good reference for the user...

  6. FOREWORD: 2nd International Workshop on New Computational Methods for Inverse Problems (NCMIP 2012)

    Science.gov (United States)

    Blanc-Féraud, Laure; Joubert, Pierre-Yves

    2012-09-01

    This volume of Journal of Physics: Conference Series is dedicated to the scientific contributions presented during the 2nd International Workshop on New Computational Methods for Inverse Problems (NCMIP 2012). This workshop took place at Ecole Normale Supérieure de Cachan, in Cachan, France, on 15 May 2012, at the initiative of Institut Farman. The first edition of NCMIP also took place in Cachan, France, within the scope of the ValueTools Conference, in May 2011 (http://www.ncmip.org/2011/). The NCMIP Workshop focused on recent advances in the resolution of inverse problems. Indeed inverse problems appear in numerous scientific areas such as geophysics, biological and medical imaging, material and structure characterization, electrical, mechanical and civil engineering, and finance. The resolution of inverse problems consists of estimating the parameters of the observed system or structure from data collected by an instrumental sensing or imaging device. Its success firstly requires the collection of relevant observation data. It also requires accurate models describing the physical interactions between the instrumental device and the observed system, as well as the intrinsic properties of the solution itself. Finally, it requires the design of robust, accurate and efficient inversion algorithms. Advanced sensor arrays and imaging devices provide high rate and high volume data; in this context, the efficient resolution of the inverse problem requires the joint development of new models and inversion methods, taking computational and implementation aspects into account. During this one-day workshop, researchers had the opportunity to bring to light and share new techniques and results in the field of inverse problems. 
The topics of the workshop were: algorithms and computational aspects of inversion, Bayesian estimation, kernel methods, learning methods, convex optimization, free discontinuity problems, metamodels, proper orthogonal decomposition, reduced models for the inversion, non-linear inverse scattering, image reconstruction and restoration, applications (bio-medical imaging, non-destructive evaluation etc). NCMIP 2012 was a one-day workshop. Each of the submitted papers was reviewed by 2 to 4 reviewers. Among the accepted papers, there are 8 oral presentations and 5 posters. Three international speakers were invited for a long talk. This second edition attracted 60 registered attendees in May 2012. NCMIP 2012 was supported by Institut Farman (ENS Cachan) and endorsed by the following French research networks (GDR ISIS, GDR Ondes, GDR MOA, GDR MSPC). The program committee acknowledges the following laboratories CMLA, LMT, LSV, LURPA, SATIE, as well as DIGITEO Network. Laure Blanc-Féraud and Pierre-Yves Joubert Workshop Co-chairs Laure Blanc-Féraud, I3S laboratory, CNRS, France Pierre-Yves Joubert, IEF laboratory, Paris-Sud University, CNRS, France Technical Program Committee Alexandre Baussard, ENSTA Bretagne, Lab-STICC, France Marc Bonnet, ENSTA, ParisTech, France Jerôme Darbon, CMLA, ENS Cachan, CNRS, France Oliver Dorn, School of Mathematics, University of Manchester, UK Mário Figueiredo, Instituto Superior Técnico, Lisbon, Portugal Laurent Fribourg, LSV, ENS Cachan, CNRS, France Marc Lambert, L2S Laboratory, CNRS, SupElec, Paris-Sud University, France Anthony Quinn, Trinity College, Dublin, Ireland Christian Rey, LMT, ENS Cachan, CNRS, France Joachim Weickert, Saarland University, Germany Local Chair Alejandro Mottini, Morpheme group I3S-INRIA Sophie Abriet, SATIE, ENS Cachan, CNRS, France Béatrice Bacquet, SATIE, ENS Cachan, CNRS, France Reviewers Gilles Aubert, J-A Dieudonné Laboratory, CNRS and University of Nice-Sophia Antipolis, France Alexandre Baussard, ENSTA 
Bretagne, Lab-STICC, France Laure Blanc-Féraud, I3S laboratory, CNRS, France Marc Bonnet, ENSTA, ParisTech, France Jerôme Darbon, CMLA, ENS Cachan, CNRS, France Oliver Dorn, School of Mathematics, University of Manchester, UK Gérard Favier, I3S laboratory, CNRS, France Mário Figueiredo, Instituto Superior Técnico, Lisbon, Portugal

  7. Development of Hydrologic Characterization Technology of Fault Zones -- Phase I, 2nd Report

    Energy Technology Data Exchange (ETDEWEB)

    Karasaki, Kenzi; Onishi, Tiemi; Black, Bill; Biraud, Sebastien

    2009-03-31

    This is the year-end report of the 2nd year of the NUMO-LBNL collaborative project: Development of Hydrologic Characterization Technology of Fault Zones under the NUMO-DOE/LBNL collaboration agreement, the task description of which can be found in Appendix 3. A literature survey of published information on the relationship between geologic and hydrologic characteristics of faults was conducted. The survey concluded that it may be possible to classify faults by indicators based on various geometric and geologic attributes that may indirectly relate to the hydrologic property of faults. Analysis of existing information on the Wildcat Fault and its surrounding geology was performed. The Wildcat Fault is thought to be a strike-slip fault with a thrust component that runs along the eastern boundary of the Lawrence Berkeley National Laboratory. It is believed to be part of the Hayward Fault system but is considered inactive. Three trenches were excavated at carefully selected locations mainly based on the information from the past investigative work inside the LBNL property. At least one fault was encountered in all three trenches. Detailed trench mapping was conducted by CRIEPI (Central Research Institute for Electric Power Industries) and LBNL scientists. Some intriguing and puzzling discoveries were made that may contradict previously published work. Predictions are made regarding the hydrologic property of the Wildcat Fault based on the analysis of fault structure. Preliminary conceptual models of the Wildcat Fault were proposed. The Wildcat Fault appears to have multiple splays and some low angled faults may be part of the flower structure. In parallel, surface geophysical investigations were conducted using electrical resistivity survey and seismic reflection profiling along three lines on the north and south of the LBNL site. 
Because of the steep terrain, it was difficult to find optimum locations for survey lines as it is desirable for them to be as straight as possible. One interpretation suggests that the Wildcat Fault is westerly dipping. This could imply that the Wildcat Fault may merge with the Hayward Fault at depth. However, due to the complex geology of the Berkeley Hills, multiple interpretations of the geophysical surveys are possible. An effort to construct a 3D GIS model is under way. The model will be used not so much for visualization of the existing data, because only surface data are available thus far, but to investigate possible abutment relations of the buried formations offset by the fault. A 3D model would be useful to conduct 'what if' scenario testing to aid the selection of borehole drilling locations and configurations. Based on the information available thus far, a preliminary plan for borehole drilling is outlined. The basic strategy is to first drill boreholes on both sides of the fault without penetrating it. Borehole tests will be conducted in these boreholes to estimate the property of the fault. Possibly a slanted borehole will be drilled later to intersect the fault to confirm the findings from the boreholes that do not intersect the fault. Finally, the lessons learned from conducting the trenching and geophysical surveys are listed. It is believed that these lessons will be invaluable information for NUMO when it conducts preliminary investigations at yet-to-be selected candidate sites in Japan.

  8. 2nd Radio and Antenna Days of the Indian Ocean (RADIO 2014)

    Science.gov (United States)

    2014-10-01

    It was an honor and a great pleasure for all those involved in its organization to welcome the participants to the ''Radio and Antenna Days of the Indian Ocean'' (RADIO 2014) international conference that was held from 7th to 10th April 2014 at the Sugar Beach Resort, Wolmar, Flic-en-Flac, Mauritius. RADIO 2014 is the second of a series of conferences organized in the Indian Ocean region. The aim of the conference is to discuss recent developments, theories and practical applications covering the whole scope of radio-frequency engineering, including radio waves, antennas, propagation, and electromagnetic compatibility. The RADIO international conference emerged following discussions with engineers and scientists from the countries of the Indian Ocean as well as from other parts of the world, and a need was felt for the organization of such an event in this region. Following numerous requests, the Island of Mauritius, known worldwide for its white sandy beaches and pleasant tropical atmosphere, was again chosen for the organization of the 2nd RADIO international conference. The conference was organized by the Radio Society, Mauritius and the Local Organizing Committee consisted of scientists from SUPELEC, France, the University of Mauritius, and the University of Technology, Mauritius. We would like to take the opportunity to thank all people, institutions and companies that made the event such a success. We are grateful to our gold sponsors CST and FEKO as well as URSI for their generous support which enabled us to partially support one PhD student and two scientists to attend the conference. We would also like to thank IEEE-APS and URSI for providing technical co-sponsorship. More than one hundred and thirty abstracts were submitted to the conference. They were peer-reviewed by an international scientific committee and, based on the reviews, either accepted, eventually after revision, or rejected. 
RADIO 2014 brought together participants from twenty countries spanning five continents: Australia, Botswana, Brazil, Canada, China, Denmark, France, India, Italy, Mauritius, Poland, Reunion Island, Russia, South Africa, South Korea, Spain, Switzerland, The Netherlands, United Kingdom, and USA. The conference featured eleven oral sessions and one poster session on state-of-the-art research themes. Three internationally recognized scientists delivered keynote speeches during the conference. Prizes for the first and second Best Student Papers were awarded during the closing ceremony. Following the call for the extended contributions for publication as a volume in the IOP Conference Series: Materials Science and Engineering (MSE), both on-line and in print, we received thirty-two full papers. All submitted contributions were then peer-reviewed, revised whenever necessary, and accepted or rejected based on the recommendations of the reviewers of the editorial board. At the end of the procedure, twenty-five of them have been accepted for publication in this volume.

  9. Development of Hydrologic Characterization Technology of Fault Zones: Phase I, 2nd Report

    International Nuclear Information System (INIS)

    This is the year-end report of the 2nd year of the NUMO-LBNL collaborative project: Development of Hydrologic Characterization Technology of Fault Zones under the NUMO-DOE/LBNL collaboration agreement, the task description of which can be found in Appendix 3. A literature survey of published information on the relationship between geologic and hydrologic characteristics of faults was conducted. The survey concluded that it may be possible to classify faults by indicators based on various geometric and geologic attributes that may indirectly relate to the hydrologic property of faults. Analysis of existing information on the Wildcat Fault and its surrounding geology was performed. The Wildcat Fault is thought to be a strike-slip fault with a thrust component that runs along the eastern boundary of the Lawrence Berkeley National Laboratory. It is believed to be part of the Hayward Fault system but is considered inactive. Three trenches were excavated at carefully selected locations mainly based on the information from the past investigative work inside the LBNL property. At least one fault was encountered in all three trenches. Detailed trench mapping was conducted by CRIEPI (Central Research Institute for Electric Power Industries) and LBNL scientists. Some intriguing and puzzling discoveries were made that may contradict previously published work. Predictions are made regarding the hydrologic property of the Wildcat Fault based on the analysis of fault structure. Preliminary conceptual models of the Wildcat Fault were proposed. The Wildcat Fault appears to have multiple splays and some low angled faults may be part of the flower structure. In parallel, surface geophysical investigations were conducted using electrical resistivity survey and seismic reflection profiling along three lines on the north and south of the LBNL site. 
Because of the steep terrain, it was difficult to find optimum locations for survey lines as it is desirable for them to be as straight as possible. One interpretation suggests that the Wildcat Fault is westerly dipping. This could imply that the Wildcat Fault may merge with the Hayward Fault at depth. However, due to the complex geology of the Berkeley Hills, multiple interpretations of the geophysical surveys are possible. An effort to construct a 3D GIS model is under way. The model will be used not so much for visualization of the existing data, because only surface data are available thus far, but to investigate possible abutment relations of the buried formations offset by the fault. A 3D model would be useful to conduct 'what if' scenario testing to aid the selection of borehole drilling locations and configurations. Based on the information available thus far, a preliminary plan for borehole drilling is outlined. The basic strategy is to first drill boreholes on both sides of the fault without penetrating it. Borehole tests will be conducted in these boreholes to estimate the property of the fault. Possibly a slanted borehole will be drilled later to intersect the fault to confirm the findings from the boreholes that do not intersect the fault. Finally, the lessons learned from conducting the trenching and geophysical surveys are listed. It is believed that these lessons will be invaluable information for NUMO when it conducts preliminary investigations at yet-to-be selected candidate sites in Japan.

  10. The 2nd DBCLS BioHackathon: interoperable bioinformatics Web services for integrated applications

    OpenAIRE

    Katayama Toshiaki; Wilkinson Mark D; Vos Rutger; Kawashima Takeshi; Kawashima Shuichi; Nakao Mitsuteru; Yamamoto Yasunori; Chun Hong-Woo; Yamaguchi Atsuko; Kawano Shin; Aerts Jan; Aoki-Kinoshita Kiyoko F; Arakawa Kazuharu; Aranda Bruno; Bonnal Raoul JP

    2011-01-01

    Abstract Background The interaction between biological researchers and the bioinformatics tools they use is still hampered by incomplete interoperability between such tools. To ensure interoperability initiatives are effectively deployed, end-user applications need to be aware of, and support, best practices and standards. Here, we report on an initiative in which software developers and genome biologists came together to explore and raise awareness of these issues: BioHackathon 2009. Results...

  11. An open-source software tool for the generation of relaxation time maps in magnetic resonance imaging

    International Nuclear Information System (INIS)

    In magnetic resonance (MR) imaging, T1, T2 and T2* relaxation times represent characteristic tissue properties that can be quantified with the help of specific imaging strategies. While there are basic software tools for specific pulse sequences, until now there is no universal software program available to automate pixel-wise mapping of relaxation times from various types of images or MR systems. Such a software program would allow researchers to test and compare new imaging strategies and thus would significantly facilitate research in the area of quantitative tissue characterization. After defining requirements for a universal MR mapping tool, a software program named MRmap was created using a high-level graphics language. Additional features include a manual registration tool for source images with motion artifacts and a tabular DICOM viewer to examine pulse sequence parameters. MRmap was successfully tested on three different computer platforms with image data from three different MR system manufacturers and five different sorts of pulse sequences: multi-image inversion recovery T1; Look-Locker/TOMROP T1; modified Look-Locker (MOLLI) T1; single-echo T2/T2*; and multi-echo T2/T2*. Computing times varied between 2 and 113 seconds. Estimates of relaxation times compared favorably to those obtained from non-automated curve fitting. Completed maps were exported in DICOM format and could be read in standard software packages used for analysis of clinical and research MR data. MRmap is a flexible cross-platform research tool that enables accurate mapping of relaxation times from various pulse sequences. The software allows researchers to optimize quantitative MR strategies in a manufacturer-independent fashion. The program and its source code were made available as open-source software on the internet
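The pixel-wise mapping described above can be sketched for the multi-echo T2 case (a hypothetical log-linear fit, not MRmap's actual implementation): the mono-exponential signal model S(TE) = S0·exp(-TE/T2) is linear in ln S, so a single least-squares solve recovers T2 for every pixel at once.

```python
import numpy as np

def t2_map(echoes, tes):
    """Pixel-wise mono-exponential T2 fit via log-linearization.

    echoes: (n_echoes, ny, nx) signal magnitudes; tes: echo times.
    ln S = ln S0 - TE/T2, so an ordinary least-squares solve per pixel
    recovers the slope 1/T2 (noise-free, positive signals assumed).
    """
    n, ny, nx = echoes.shape
    y = np.log(echoes.reshape(n, -1))                      # (n_echoes, n_pixels)
    A = np.stack([np.ones(n), -np.asarray(tes, float)], axis=1)
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)           # rows: [ln S0, 1/T2]
    return (1.0 / coef[1]).reshape(ny, nx)

# synthetic phantom: every pixel decays with T2 = 50 ms
tes = np.array([10.0, 20.0, 40.0, 80.0])
signal = np.exp(-tes / 50.0)[:, None, None] * np.ones((4, 2, 2))
t2 = t2_map(signal, tes)
```

A production tool would instead fit the non-linearized model (as MRmap's non-automated curve-fitting comparison suggests), since log-transforming reweights the noise; the sketch only shows the per-pixel mapping idea.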

  12. An open-source software tool for the generation of relaxation time maps in magnetic resonance imaging

    Directory of Open Access Journals (Sweden)

    Kühne Titus

    2010-07-01

    Full Text Available Abstract Background In magnetic resonance (MR) imaging, T1, T2 and T2* relaxation times represent characteristic tissue properties that can be quantified with the help of specific imaging strategies. While there are basic software tools for specific pulse sequences, until now there is no universal software program available to automate pixel-wise mapping of relaxation times from various types of images or MR systems. Such a software program would allow researchers to test and compare new imaging strategies and thus would significantly facilitate research in the area of quantitative tissue characterization. Results After defining requirements for a universal MR mapping tool, a software program named MRmap was created using a high-level graphics language. Additional features include a manual registration tool for source images with motion artifacts and a tabular DICOM viewer to examine pulse sequence parameters. MRmap was successfully tested on three different computer platforms with image data from three different MR system manufacturers and five different sorts of pulse sequences: multi-image inversion recovery T1; Look-Locker/TOMROP T1; modified Look-Locker (MOLLI) T1; single-echo T2/T2*; and multi-echo T2/T2*. Computing times varied between 2 and 113 seconds. Estimates of relaxation times compared favorably to those obtained from non-automated curve fitting. Completed maps were exported in DICOM format and could be read in standard software packages used for analysis of clinical and research MR data. Conclusions MRmap is a flexible cross-platform research tool that enables accurate mapping of relaxation times from various pulse sequences. The software allows researchers to optimize quantitative MR strategies in a manufacturer-independent fashion. The program and its source code were made available as open-source software on the internet.

  13. The 2nd to 4th digit ratio (2D:4D) and eating disorder diagnosis in women

    OpenAIRE

    Quinton, Stephanie Jane; Smith, April Rose; Joiner, Thomas

    2011-01-01

    Eating disorders are more common in females than in males and are believed to be caused, in part, by biological and hormonal factors. Digit ratio or 2D:4D (the ratio of the 2nd to the 4th digit) is considered to be a proxy for prenatal testosterone (PT) and prenatal oestrogen (PE) exposure. However, how 2D:4D may be related to type of eating pathology is unknown. The relationship between 2D:4D and eating disorder diagnosis was investigated in recovered and currently eating disordered (n=31) a...

  14. Virtual Visit to the ATLAS Control Room by 2nd High School of Eleftherio–Kordelio in Thessaloniki

    CERN Multimedia

    2013-01-01

    Our school is the 2nd High School of Eleftherio – Kordelio. It is located in the western suburbs of Thessaloniki, Greece, and our students are between 15 and 17 years old. Thessaloniki is the second largest city in Greece, with a port that plays a major role in trade in the southern Balkans. Lately our students have heard so much about CERN and the great discoveries that have taken place there, and they are really keen on visiting and learning many things about it.

  15. 2nd International Salzburg Conference on Neurorecovery (ISCN 2013) Salzburg/Austria|November 28th-29th, 2013.

    Science.gov (United States)

    Brainin, M; Muresanu, D; Slavoaca, D

    2014-01-01

    The 2nd International Salzburg Conference on Neurorecovery was held on the 28th and 29th of November, 2013, in Salzburg, one of the most beautiful cities in Austria, which is well known for its rich cultural heritage, world-famous music and beautiful surrounding landscapes. The aim of the conference was to discuss the progress in the field of neurorecovery. The conference brought together internationally renowned scientists and clinicians, who described the clinical and therapeutic relevance of translational research and its applications in neurorehabilitation. PMID:25713602

  16. RECONSTRUCTING THE IDEA OF PRAYER SPACE: A CRITICAL ANALYSIS OF THE TEMPORARY PRAYING PLATFORM PROJECT OF 2ND YEAR ARCHITECTURE STUDENTS IN THE NATIONAL UNIVERSITY OF MALAYSIA (UKM)

    Directory of Open Access Journals (Sweden)

    Nangkula Utaberta

    2013-12-01

    Full Text Available Abstract God created humans as caliphs on this earth. Caliph means leader, caretaker and guardian. Humans therefore have an obligation to maintain, preserve and conserve nature for future generations. Today we see much damage to the earth caused by human behavior. Islam regards the whole of nature as a place of prayer whose cleanliness and purity must be maintained. As Muslims, we therefore need to preserve nature as we keep our places of prayer. The main objective of this paper is to re-question and re-interpret the idea of sustainability in Islamic architecture through a critical analysis of the first project of 2nd year architecture students of UKM, the "Temporary Praying Platform". The discussion is divided into three (3) main parts. The first part discusses contemporary issues in Islamic architecture, especially the design of mosques; the second part expands the framework of sustainability in Islamic architecture; and the last part analyzes a sample of design submissions by 2nd year students of UKM for the temporary praying platform project. It is expected that this paper can start a further discussion on the inner meaning in Islam and how it is implemented in the design of praying spaces in the future. Keywords: Sustainability, Islamic Architecture, Temporary Praying Platform

  17. [Psychotherapeutic approach to migrant patients of the 1st or 2nd generation: contribution of Tobie Nathan's ethnopsychoanalysis].

    Science.gov (United States)

    Pierre, D

    1993-01-01

    Based on our experience of the ethnopsychiatric consultation directed by Tobie Nathan in Paris, we try to define the adaptations of the therapeutic setting that he considers necessary in a transcultural situation and that can serve as inspiration (group work, participation of a translator, mobilization of traditional etiological interpretations). We then study the clinical case of a young Algerian girl living in Belgium, analyzing her acute delirious psychosis following the conceptions of Tobie Nathan: the particular situation of migrant children would predispose them to a splitting of the ego. Finally, we show how important the restitution of the original cultural framework is in psychotherapy. PMID:8036937

  18. XUV spectra of 2nd transition row elements: identification of 3d–4p and 3d–4f transition arrays

    Science.gov (United States)

    Lokasani, Ragava; Long, Elaine; Maguire, Oisin; Sheridan, Paul; Hayden, Patrick; O’Reilly, Fergal; Dunne, Padraig; Sokell, Emma; Endo, Akira; Limpouch, Jiri; O’Sullivan, Gerry

    2015-12-01

    The use of laser produced plasmas (LPPs) in extreme ultraviolet/soft x-ray lithography and metrology at 13.5 nm has been widely reported and recent research efforts have focused on developing next generation sources for lithography, surface morphology, patterning and microscopy at shorter wavelengths. In this paper, the spectra emitted from LPPs of the 2nd transition row elements from yttrium (Z = 39) to palladium (Z = 46), with the exception of zirconium (Z = 40) and technetium (Z = 43), produced by two Nd:YAG lasers which delivered up to 600 mJ in 7 ns and 230 mJ in 170 ps, respectively, are reported. Intense emission was observed in the 2–8 nm spectral region resulting from unresolved transition arrays (UTAs) due to 3d–4p, 3d–4f and 3p–3d transitions. These transitions in a number of ion stages of yttrium, niobium, ruthenium and rhodium were identified by comparison with results from Cowan code calculations and previous studies. The theoretical data were parameterized using the UTA formalism and the mean wavelength and widths were calculated and compared with experimental results.
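The UTA parameterization mentioned at the end collapses an array of lines into its statistical moments; a small sketch (the line list and gf weights are invented, purely illustrative) computes the gf-weighted mean wavelength and width:

```python
import math

def uta_stats(wavelengths, weights):
    """gf-weighted first and second moments of a transition array:
    the mean wavelength and the width (standard deviation) of the UTA."""
    total = sum(weights)
    mean = sum(w * lam for lam, w in zip(wavelengths, weights)) / total
    var = sum(w * (lam - mean) ** 2 for lam, w in zip(wavelengths, weights)) / total
    return mean, math.sqrt(var)

# hypothetical 3d-4f line list (wavelengths in nm, gf values as weights)
lines = [4.10, 4.15, 4.22, 4.30]
gf = [0.5, 1.0, 2.0, 0.5]
mean, width = uta_stats(lines, gf)
```

In practice the line positions and gf values would come from an atomic-structure calculation such as the Cowan code runs cited above; the two moments are then compared directly with the measured feature.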

  19. The whiteStar development project: Westinghouse's next generation core design simulator and core monitoring software to power the nuclear renaissance

    International Nuclear Information System (INIS)

    The WhiteStar project has undertaken the development of the next generation core analysis and monitoring system for Westinghouse Electric Company. This on-going project focuses on the development of the ANC core simulator, BEACON core monitoring system and NEXUS nuclear data generation system. This system contains many functional upgrades to the ANC core simulator and BEACON core monitoring products as well as the release of the NEXUS family of codes. The NEXUS family of codes is an automated once-through cross section generation system designed for use in both PWR and BWR applications. ANC is a multi-dimensional nodal code for all nuclear core design calculations at a given condition. ANC predicts core reactivity, assembly power, rod power, detector thimble flux, and other relevant core characteristics. BEACON is an advanced core monitoring and support system which uses existing instrumentation data in conjunction with an analytical methodology for on-line generation and evaluation of 3D core power distributions. This new system is needed to design and monitor the Westinghouse AP1000 PWR. This paper provides an overview of the software system and the software development methodologies used, as well as some initial results. (authors)

  20. Influence of Nd dopant amount on microstructure and photoluminescence of TiO2:Nd thin films

    Science.gov (United States)

    Wojcieszak, Damian; Mazur, Michal; Kaczmarek, Danuta; Morgiel, Jerzy; Zatryb, Grzegorz; Domaradzki, Jaroslaw; Misiewicz, Jan

    2015-10-01

    TiO2 and TiO2:Nd thin films were deposited using a reactive magnetron sputtering process from mosaic Ti-Nd targets with various Nd concentrations. The thin films were characterized using X-ray diffraction (XRD), transmission electron microscopy (TEM) and spectroscopic techniques. Photoluminescence (PL) in the near infrared obtained upon 514.5 nm excitation was also examined. The relationship between the Nd concentration and the structural, optical and photoluminescence properties of the prepared thin films was investigated and discussed. XRD and TEM measurements showed that an increase in the Nd concentration in the thin films hinders the crystal growth in the deposited coatings. Depending on the Nd amount in the thin films, TiO2 with the rutile, mixed rutile-amorphous or amorphous phase was obtained. Transmittance measurements revealed that addition of the Nd dopant to the titania matrix did not deteriorate the optical transparency of the coatings; however, it influenced the position of the fundamental absorption edge and therefore the width of the optical band gap. All TiO2:Nd thin films exhibited PL emission that occurred at ca. 0.91, 1.09 and 1.38 μm. Finally, results obtained for the deposited coatings showed that titania with the rutile structure and 1.0 at.% of Nd was the most efficient in VIS to NIR photon conversion.

  1. [Medical support of the 65th Army during the East Prussian offensive operation performed by the 2nd Belorussian Front].

    Science.gov (United States)

    Shelepov, A M; Leonik, S I; Lemeshkin, R N

    2015-02-01

    The activity of the medical service of the 65th Army during the East Prussian offensive operation performed by the 2nd Belorussian Front is a typical example of the medical support of troops during the final stages of World War II. Forms and methods of medical support management developed during the war have not lost their importance in modern conditions. These methods include the establishment of specialized surgical and therapeutic field hospitals, the establishment of medical institutions in the Army that worked on the evacuation directions, a reserve of mobile hospitals and transport, and the timely extension of the first echelons of the front hospital base to relieve hospitals deployed at the army base. Research into the experience of organizing medical support for the offensive operations performed during the last year of World War II provides material for developing the theory of modern medical support of operations and, on this basis, for ensuring the continuity of hospital care, the continuity of qualified and specialized medical care, and improved diagnostic and treatment work. PMID:25920177

  2. Numerical stability of 2nd order Runge-Kutta integration algorithms for use in particle-in-cell codes

    International Nuclear Information System (INIS)

    An essential ingredient of particle-in-cell (PIC) codes is a numerically accurate and stable integration scheme for the particle equations of motion. Such a scheme is the well known time-centered leapfrog (LF) method accurate to 2nd order with respect to the timestep Δt. However, this scheme can only be used for forces independent of velocity unless a simple enough implicit implementation is possible. The LF scheme is therefore inapplicable in Monte-Carlo treatments of particle collisions and/or interactions with radio-frequency fields. We examine here the suitability of the 2nd order Runge-Kutta (RK) method. We find that the basic RK scheme is numerically unstable, but that conditional stability can be attained by an implementation which preserves phase space area. Examples are presented to illustrate the performance of the RK schemes. We compare analytic and computed electron orbits in a traveling nonlinear wave and also show self-consistent PIC simulations describing plasma flow in the vicinity of a lower hybrid antenna. (author)
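    The instability of the basic scheme can be checked numerically: applying a standard (non-area-preserving) midpoint RK2 step to a simple harmonic oscillator shows the slow, systematic energy growth that makes the plain scheme unstable. This is an illustrative sketch, not the paper's implementation; the force law and step size are arbitrary choices.

```python
import math

def rk2_step(x, v, dt, accel):
    # Basic midpoint (2nd-order Runge-Kutta) step for dx/dt = v, dv/dt = a(x)
    xm = x + 0.5 * dt * v
    vm = v + 0.5 * dt * accel(x)
    x_new = x + dt * vm
    v_new = v + dt * accel(xm)
    return x_new, v_new

# Harmonic oscillator a(x) = -x; the exact orbit conserves E = (v^2 + x^2)/2.
x, v = 1.0, 0.0
dt, steps = 0.1, 10000
e0 = 0.5 * (v * v + x * x)
for _ in range(steps):
    x, v = rk2_step(x, v, dt, lambda q: -q)
e1 = 0.5 * (v * v + x * x)
# The amplification factor per step exceeds 1 (by ~dt^4/4 in energy),
# so the energy ratio e1/e0 drifts upward over many steps.
print(e1 / e0)
```

For a linear oscillator the per-step energy amplification of this scheme is 1 + Δt⁴/4, so over 10⁴ steps at Δt = 0.1 the energy grows by roughly 25%, illustrating why an area-preserving variant is needed for long PIC runs.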

  3. Comparison of elution efficiency of 99Mo/99mTc generator using theoretical and a free web based software method

    International Nuclear Information System (INIS)

    Full text: A generator is constructed on the principle of the decay-growth relationship between a long-lived parent radionuclide and a short-lived daughter radionuclide. The difference in chemical properties of the daughter and parent radionuclides allows efficient separation of the two. Aim and Objectives: The present study was designed to calculate the elution efficiency of the generator using the traditional formula-based method and a free web-based software method. Materials and Methods: A 99Mo/99mTc MON.TEK (Monrol, Gebze) generator with a sterile 0.9% NaCl vial and a vacuum vial in a lead shield were used for the elution. A new 99Mo/99mTc generator (calibrated activity 30 GBq, calibrated for Thursday) was received on Monday morning in our department. The generator was placed behind lead bricks in a fume hood. The rubber plugs of both the vacuum and 0.9% NaCl vials were wiped with 70% isopropyl alcohol swabs. The vacuum vial, placed inside the lead shield, was inserted in the vacuum position; simultaneously, the 10 ml NaCl vial was inserted in the second slot. After 1-2 min the vacuum vial was removed without moving the emptied 0.9% NaCl vial. The vacuum slot was covered with another sterile vial to maintain sterility. The RAC was measured in a calibrated dose calibrator (Capintec, 15 CRC). The elution efficiency was calculated theoretically and using free web-based software (an Apache web server, www.apache.org, with PHP, www.php.net, hosted on the web site of the Italian Association of Nuclear Medicine and Molecular Imaging, www.aimn.it). Results: The mean elution efficiency calculated by the theoretical method was 93.95% ± 0.61. The mean elution efficiency calculated by the software was 92.85% ± 0.89. There was no statistically significant difference between the two methods. Conclusion: The free web-based software provides precise and reproducible results and thus saves time and mathematical calculation steps. This enables a rational use of available activity and also a selection of the type and number of procedures to perform in a busy nuclear medicine department
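    The theoretical efficiency described above follows from the standard parent-daughter decay-growth (Bateman) relation. The sketch below is a minimal illustration of that calculation; the half-lives and ~87.5% branching fraction are textbook values, and the measured activity is a hypothetical example, not the study's data.

```python
import math

T_MO, T_TC = 65.94, 6.01           # half-lives of 99Mo and 99mTc in hours
LAM_MO = math.log(2) / T_MO
LAM_TC = math.log(2) / T_TC
BRANCH = 0.875                      # fraction of 99Mo decays feeding 99mTc

def tc99m_activity(a_mo0, t):
    """Theoretical 99mTc activity (same units as a_mo0) t hours after the last elution."""
    growth = LAM_TC / (LAM_TC - LAM_MO) * (math.exp(-LAM_MO * t) - math.exp(-LAM_TC * t))
    return BRANCH * a_mo0 * growth

# 30 GBq generator eluted 24 h after the previous elution (illustrative timing)
a_theoretical = tc99m_activity(30.0, 24.0)
a_measured = 19.0                   # hypothetical dose-calibrator reading, GBq
efficiency = 100.0 * a_measured / a_theoretical
print(round(a_theoretical, 2), round(efficiency, 1))
```

A web tool of the kind described simply automates this decay-growth arithmetic, which is why its results track the formula-based method so closely.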

  4. 2nd International Conference on INformation Systems Design and Intelligent Applications

    CERN Document Server

    Satapathy, Suresh; Sanyal, Manas; Sarkar, Partha; Mukhopadhyay, Anirban

    2015-01-01

    The second international conference on INformation Systems Design and Intelligent Applications (INDIA – 2015) was held in Kalyani, India, during January 8-9, 2015. The book covers all aspects of information system design, computer science and technology, general sciences, and educational research. Following a double-blind review process, a number of high-quality papers were selected and collected in the book, which is composed of two volumes and covers a variety of topics, including natural language processing, artificial intelligence, security and privacy, communications, wireless and sensor networks, microelectronics, circuit and systems, machine learning, soft computing, mobile computing and applications, cloud computing, software engineering, graphics and image processing, rural engineering, e-commerce, e-governance, business computing, molecular computing, nano computing, chemical computing, intelligent computing for GIS and remote sensing, bio-informatics and bio-computing. These fields are not only ...

  5. Numerical Study of Entropy Generation in the Flameless Oxidation Using Large Eddy Simulation Model and OpenFOAM Software

    OpenAIRE

    Mousavi, Seyed Mahmood

    2014-01-01

    In this paper, a large eddy simulation model using OpenFOAM software is applied to a 3D investigation of non-premixed flameless oxidation. In this context, the finite volume discrete ordinate model and a partially stirred reactor are applied to model radiation and combustion, respectively, and the full GRI-2.11 mechanism is used to precisely represent the chemical reactions. The flow field is discretized using the finite volume method, and the PISO algorithm couples the pressure and velocity field...

  6. The MeqTrees software system and its use for third-generation calibration of radio interferometers

    OpenAIRE

    Noordam, Jan E.; Smirnov, Oleg M.

    2011-01-01

    The formulation of the radio interferometer measurement equation (RIME) by Hamaker et al. has provided us with an elegant mathematical apparatus for better understanding, simulation and calibration of existing and future instruments. The calibration of the new radio telescopes (LOFAR, SKA) would be unthinkable without the RIME formalism, and new software to exploit it. MeqTrees is designed to implement numerical models such as the RIME, and to solve for arbitrary subsets of ...

  7. Free Open Source Software: FOSS Based GIS for Spatial Retrievals of Appropriate Locations for Ocean Energy Utilizing Electric Power Generation Plants

    Directory of Open Access Journals (Sweden)

    Kohei Arai

    2012-09-01

    Full Text Available A Free Open Source Software (FOSS) based Geographic Information System (GIS) for spatial retrieval of appropriate locations for ocean wind and tidal motion utilizing electric power generation plants is proposed. Using scatterometers onboard earth observation satellites, coastal areas with strong winds are retrieved with the FOSS/GIS combination of PostgreSQL/PostGIS. PostGIS has to be modified together with the altimeter and scatterometer database. This modification and database creation would be a good reference for users who would like to create a GIS system together with a database using FOSS.

  8. Use of 2nd and 3rd Level Correlation Analysis for Studying Degradation in Polycrystalline Thin-Film Solar Cells

    Energy Technology Data Exchange (ETDEWEB)

    Albin, D. S.; del Cueto, J. A.; Demtsu, S. H.; Bansal, S.

    2011-03-01

    The correlation of stress-induced changes in the performance of laboratory-made CdTe solar cells with various 2nd and 3rd level metrics is discussed. The overall behavior of aggregated data showing how cell efficiency changes as a function of open-circuit voltage (Voc), short-circuit current density (Jsc), and fill factor (FF) is explained using a two-diode, PSpice model in which degradation is simulated by systematically changing model parameters. FF shows the highest correlation with performance during stress, and is subsequently shown to be most affected by shunt resistance, recombination and in some cases voltage-dependent collection. Large decreases in Jsc as well as increasing rates of Voc degradation are related to voltage-dependent collection effects and catastrophic shunting respectively. Large decreases in Voc in the absence of catastrophic shunting are attributed to increased recombination. The relevance of capacitance-derived data correlated with both Voc and FF is discussed.
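    A minimal sketch of the kind of two-diode J-V simulation described above, showing how lowering the shunt resistance degrades fill factor. All parameter values are generic illustrations chosen to give PV-typical numbers, not the paper's fitted model.

```python
import math

VT = 0.02585  # thermal voltage at ~300 K, volts

def jv(v, jph=0.025, j01=1e-15, j02=1e-9, rsh=1000.0):
    """Current density (A/cm^2) of an illuminated two-diode model with shunt resistance.
    jph: photocurrent; j01/j02: ideality-1 and ideality-2 saturation currents."""
    return (jph
            - j01 * (math.exp(v / VT) - 1.0)
            - j02 * (math.exp(v / (2.0 * VT)) - 1.0)
            - v / rsh)

def fill_factor(rsh):
    vs = [i * 0.9 / 2000 for i in range(2001)]
    # open-circuit voltage: first grid point where the current crosses zero
    voc = next(v for v in vs if jv(v, rsh=rsh) <= 0.0)
    pmax = max(v * jv(v, rsh=rsh) for v in vs if v < voc)
    return pmax / (voc * jv(0.0, rsh=rsh))

ff_good = fill_factor(1000.0)   # well-behaved cell
ff_shunted = fill_factor(50.0)  # heavily shunted cell: FF collapses
print(ff_good, ff_shunted)
```

Sweeping model parameters this way (shunt resistance, recombination currents) reproduces the qualitative trend reported above: FF is the performance metric most sensitive to shunting.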

  9. International collaborative study for establishment of the 2nd WHO International Standard for Haemophilus influenzae type b polysaccharide.

    Science.gov (United States)

    Mawas, Fatme; Burkin, Karena; Dougall, Thomas; Saydam, Manolya; Rigsby, Peter; Bolgiano, Barbara

    2015-11-01

    In this report we present the results of a collaborative study for the preparation and calibration of a replacement International Standard (IS) for Haemophilus influenzae type b polysaccharide (polyribosyl ribitol phosphate; 5-d-ribitol-(1→1)-β-d-ribose-3-phosphate; PRP). Two candidate preparations were evaluated. Thirteen laboratories from 9 different countries participated in the collaborative study to assess the suitability and determine the PRP content of the two candidate standards. On the basis of the results from this study, Candidate 2 (NIBSC code 12/306) has been established as the 2nd WHO IS for PRP by the WHO Expert Committee on Biological Standardization, with a content of 4.904 ± 0.185 mg/ampoule, as determined by the ribose assays carried out by 11 of the participating laboratories. PMID:26298195

  10. An Inquiry into Perceived Autonomy Support of Iranian EFL Learners: 2nd, 3rd and 4th Grade University Students

    Directory of Open Access Journals (Sweden)

    Husain Abdulhay

    2015-10-01

    Full Text Available The need to gain insight into the Iranian EFL learning environment is increasingly felt, consonant with the dissociation from the traditional, spoon-feeding rituals of indigenous Iranian teaching. To that end, the study examined grade-level differences among 202 students in their perceived autonomy support in the context of Iranian universities. Exposure to an autonomy-supportive environment was examined at the 2nd, 3rd and 4th grade levels through the administration of the Learning Climate Questionnaire (LCQ; Black & Deci, 2000), a self-report instrument for appraising perceived autonomy support. The data collected were analyzed with respect to mean differences. Significant differences were found between graders in their perceptions of autonomy-supportive environments. Second graders appeared to perceive their learning and teaching environment as more autonomy-supportive than the two other grades. Juniors had a lower perception of their environment as autonomy-supportive than senior students. The results substantiate previous studies by indicating that perceived autonomy support dwindles with grade level.

  11. Colourful, Courageous and Community-Building : Reflections from the Organizer of the 2nd Nordic STS Conference

    DEFF Research Database (Denmark)

    Jensen, Torben Elgaard

    2015-01-01

    The 2nd Nordic STS conference, held in Copenhagen in 2015, was an occasion to take stock of the current trends and developments of Nordic STS. In this paper, the leading organizer reflects on the event and characterises contemporary Nordic STS as colourful (spanning a wide range of perspectives and empirical topics), courageous (critical, reflexive, but also willing to take on collaborative roles), and community-building (sharing commitment to a number of topics and issues). Speculating on future developments, he suggests that Nordic STS will receive impetus for change and transformation from at least four sources: (1) the steady stream of new controversial scientific and technological developments; (2) the increasing commitment to gather interdisciplinary collaboration and research funding around grand societal challenges; (3) methodological developments within STS; (4) the increasing number of collaborative roles that STS, as a mature discipline, will be invited to take up.

  12. Anatomy of a 2nd-order unconformity: stratigraphy and facies of the Bakken formation during basin realignment

    Energy Technology Data Exchange (ETDEWEB)

    Skinner, Orion; Canter, Lyn; Sonnenfeld, Mark; Williams, Mark [Whiting Oil and Gas Corp., Denver, CO (United States)

    2011-07-01

    Because classic Laramide compressional structures are relatively rare, the Williston Basin is often considered structurally simple, but because of the presence of numerous sub-basins, simplistic lithofacies generalization is impossible, and detailed facies mapping is necessary to unravel Middle Bakken paleogeography. The unconformity above the Devonian Three Forks is explained by the infilling and destruction of the Devonian Elk Point basin; it prepares the Bakken system and introduces a Mississippian Williston Basin with a very different configuration. Black shales are too often considered deposits that can only be found in deep water, but a very different conclusion must be drawn after a review of stratigraphic geometry and facies successions. The whole Bakken is a 2nd-order lowstand to transgressive systems tract lying below the basal Lodgepole, which represents an interval of maximal flooding. This lowstand to transgressive stratigraphic context explains why sedimentary process and provenance show high areal variability.

  13. A summary of the 2nd workshop on Human Resources Development (HRD) in the nuclear field in Asia. FY2000

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-06-01

    The Human Resources Development (HRD) Project was added in 1999 as a cooperation activity of 'the Forum for Nuclear Cooperation in Asia (FNCA)', which is organized by the Nuclear Committee. The HRD Project supports solidifying the foundation of nuclear development and utilization in Asia by promoting human resources development in Asian countries. The principal activity of the HRD Project is to hold the Workshop on Human Resources Development in the Nuclear Field in Asia once a year. The objective of the Workshop is to clarify the problems and needs of human resources development in each country and to provide mutual support by exchanging information. The report consists of a summary of the 2nd Workshop on Human Resources Development in the Nuclear Field in Asia, held on November 27 and 28, 2000 at the Tokai Research Establishment of JAERI. (author)

  14. Software Testing

    OpenAIRE

    Sarbjeet Singh; Sukhvinder singh; Gurpreet Singh

    2010-01-01

    Software goes through a cycle of development stages: it is envisioned, created, evaluated, fixed and then put to use. To run any software consistently without failures, bugs or errors, the most important step is to test it. This paper presents various types of software testing (manual and automated), various software testing techniques such as black box, white box, gray box, sanity and functional testing, and software test life cycle models (V-model and W-model). This pape...

  15. CO2 dynamics in nested catchments: a longitudinal perspective from soil to 1st and 2nd order streams

    Science.gov (United States)

    Johnson, M. S.; Lehmann, J.; Riha, S. J.; Couto, E. G.

    2005-12-01

    Fluxes of CO2 from terrestrial to aquatic environments were investigated in a nested catchment study in the seasonally-dry southern Amazon. Dissolved CO2 concentrations in groundwater springs, four 1st order streams and one 2nd order stream were determined via routine sampling and in-situ monitoring. CO2 concentrations were monitored in the soil atmosphere to 8m. Belowground, the seasonal trend in soil CO2 concentrations at depth lagged that of seasonal water table dynamics, with peak concentrations (8.7% CO2 vol/vol at 4m) occurring one month after maximum water table height, indicating a shift in root respiration and plant water uptake to deeper soil layers during the dry season. Peak dissolved CO2 concentrations in springs and streams lagged the soil CO2 maximum by an additional month. During storm events, streamflow CO2 concentrations were found to decrease initially, reflecting the initial contribution of low-CO2 direct precipitation and surface runoff. Streamwater CO2 then increased as the contribution of pre-event water to storm flow increased. Dissolved CO2 in springs was also found to increase during storm events. Diurnal fluctuations in dissolved CO2 of springs were also observed, indicating the connectivity of the biosphere, pedosphere and hydrosphere for headwater catchments. The dissolved CO2 concentration within 1st order streams decreases rapidly downstream from stream sources, with spring CO2 concentration 3.3 times that at headwater catchment outlets. This initial outgassing of CO2 within 1st order streams was found to be accompanied by a corresponding increase in the pH of stream water. However, dissolved CO2 concentrations were not found to be significantly different between 1st and 2nd order streams. This suggests a discontinuity between some processes at the terrestrial-aquatic interface in headwater catchments and those of larger-order watersheds.
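    The coupled downstream decrease in dissolved CO2 and increase in pH described above follows from the first dissociation of carbonic acid. The sketch below assumes a dilute, pure CO2-water system with the standard first dissociation constant at 25 °C; the concentrations are illustrative, not the study's measurements.

```python
import math

K1 = 4.45e-7  # first dissociation constant of carbonic acid at 25 C, mol/L

def ph_from_co2(co2_aq):
    """Approximate pH of pure water in equilibrium with dissolved CO2 (mol/L).
    In a pure CO2-water system [H+] ~ sqrt(K1 * [CO2(aq)])."""
    h = math.sqrt(K1 * co2_aq)
    return -math.log10(h)

ph_spring = ph_from_co2(3.3e-4)   # CO2-rich spring water (3.3x the outlet level)
ph_outlet = ph_from_co2(1.0e-4)   # after downstream outgassing
print(round(ph_spring, 2), round(ph_outlet, 2))
```

Outgassing lowers [CO2(aq)], so [H+] falls and pH rises, which is the coupled trend observed between springs and 1st order stream outlets.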

  16. Licensing safety critical software

    International Nuclear Information System (INIS)

    Licensing difficulties with the shutdown system software at the Darlington Nuclear Generating Station contributed to delays in starting up the station. Even though the station has now been given approval by the Atomic Energy Control Board (AECB) to operate, the software issue has not disappeared - Ontario Hydro has been instructed by the AECB to redesign the software. This article attempts to explain why software based shutdown systems were chosen for Darlington, why there was so much difficulty licensing them, and what the implications are for other safety related software based applications

  17. A study on trait anger – anger expression and friendship commitment levels of primary school 2nd stage students who play – do not play sports

    OpenAIRE

    Hüseyin Kırımoğlu; Yunus Yıldırım; Ali Temiz

    2010-01-01

    The aim of this research was to investigate trait anger-anger expression and friendship commitment levels depending on whether 2nd-stage students in primary schools played sports or not. A Personal Information Form, the Trait Anger-Anger Expression Scales and the Inventory of Peer Attachment were used to collect the data. The population of the research consisted of the students who studied in the 2nd stage of 40 primary state schools that belonged to the National Education Dire...

  18. Analysis of Polish writing on the history of physical education and sports in the North-Eastern borderlands of the 2nd republic

    OpenAIRE

    Eligiusz Ma?olepszy

    2013-01-01

    The aim of this paper is a presentation of the up-to-date state of research on physical education and sports in the North-Eastern Borderlands of the 2nd Republic, based on an analysis of the Polish literature on the subject. In terms of territorial scope, the paper covers the areas of the Polesie, Novogrodek and Vilnius voivodeships. As for the scope of studies on the history of physical education and sports in the North-Eastern Borderlands of the 2nd Republic, the most cognitively significant is the...

  19. VennDIS: a JavaFX-based Venn and Euler diagram software to generate publication quality figures.

    Science.gov (United States)

    Ignatchenko, Vladimir; Ignatchenko, Alexandr; Sinha, Ankit; Boutros, Paul C; Kislinger, Thomas

    2015-04-01

    Venn diagrams are graphical representations of the relationships among multiple sets of objects and are often used to illustrate similarities and differences among genomic and proteomic datasets. All currently existing tools for producing Venn diagrams evince one of two traits; they require expertise in specific statistical software packages (such as R), or lack the flexibility required to produce publication-quality figures. We describe a simple tool that addresses both shortcomings, Venn Diagram Interactive Software (VennDIS), a JavaFX-based solution for producing highly customizable, publication-quality Venn, and Euler diagrams of up to five sets. The strengths of VennDIS are its simple graphical user interface and its large array of customization options, including the ability to modify attributes such as font, style and position of the labels, background color, size of the circle/ellipse, and outline color. It is platform independent and provides real-time visualization of figure modifications. The created figures can be saved as XML files for future modification or exported as high-resolution images for direct use in publications. PMID:25545689
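    The numbers a tool like VennDIS displays are simply the cardinalities of the disjoint regions of the input sets. A minimal sketch with made-up protein identifiers (the sets below are hypothetical, not from any dataset):

```python
def venn3_regions(a, b, c):
    """Sizes of the 7 disjoint regions of a 3-set Venn diagram."""
    return {
        "A only": len(a - b - c),
        "B only": len(b - a - c),
        "C only": len(c - a - b),
        "A&B only": len((a & b) - c),
        "A&C only": len((a & c) - b),
        "B&C only": len((b & c) - a),
        "A&B&C": len(a & b & c),
    }

proteome1 = {"p1", "p2", "p3", "p4"}
proteome2 = {"p3", "p4", "p5"}
proteome3 = {"p4", "p5", "p6"}
regions = venn3_regions(proteome1, proteome2, proteome3)
print(regions)
```

The seven region sizes always sum to the size of the union, which is a useful sanity check before drawing the figure.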

  20. The MeqTrees software system and its use for third-generation calibration of radio interferometers

    CERN Document Server

    Noordam, Jan E; 10.1051/0004-6361/201015013

    2011-01-01

    The formulation of the radio interferometer measurement equation (RIME) by Hamaker et al. has provided us with an elegant mathematical apparatus for better understanding, simulation and calibration of existing and future instruments. The calibration of the new radio telescopes (LOFAR, SKA) would be unthinkable without the RIME formalism, and new software to exploit it. MeqTrees is designed to implement numerical models such as the RIME, and to solve for arbitrary subsets of their parameters. The technical goal of MeqTrees is to provide a tool for rapid implementation of such models, while offering performance comparable to hand-written code. We are also pursuing the wider goal of increasing the rate of evolution of radio astronomical software, by offering a tool for rapid experimentation and exchange of ideas. MeqTrees is implemented as a Python-based front-end called the meqbrowser, and an efficient (C++-based) computational back-end called the meqserver. Numerical models are defined on the front-end via a P...

  1. Operation of Finnish nuclear power plants. Quarterly report, 2nd quarter 1998

    International Nuclear Information System (INIS)

    Quarterly reports on the operation of Finnish NPPs describe events and observations relating to nuclear and radiation safety that the Radiation and Nuclear Safety Authority (STUK) considers safety significant. Safety improvements at the plants are also described. The report includes a summary of the radiation safety of plant personnel and the environment and tabulated data on the plants' production and load factors. The Loviisa plant units were in power operation for the whole second quarter of 1998. The Olkiluoto units discontinued electricity generation for annual maintenance and also briefly for tests pertaining to the power upratings of the units. In addition, there were breaks in power generation at Olkiluoto 2 due to a low electricity demand in Midsummer and turbine balancing. The Olkiluoto units were in power operation in this quarter with the exception of the aforementioned breaks. The load factor average of the four plant units was 87.7%. The events in this quarter had no bearing on the nuclear or radiation safety. Occupational doses and radioactive releases off-site were below authorised limits. Radioactive substances were measurable in samples collected around the plants in such quantities only as have no bearing on the radiation exposure of the population. (orig.)

  2. Operation of Finnish nuclear power plants. Quarterly report, 2nd quarter 1998

    Energy Technology Data Exchange (ETDEWEB)

    Tossavainen, K. [ed.

    1999-01-01

    Quarterly reports on the operation of Finnish NPPs describe events and observations relating to nuclear and radiation safety that the Radiation and Nuclear Safety Authority (STUK) considers safety significant. Safety improvements at the plants are also described. The report includes a summary of the radiation safety of plant personnel and the environment and tabulated data on the plants` production and load factors. The Loviisa plant units were in power operation for the whole second quarter of 1998. The Olkiluoto units discontinued electricity generation for annual maintenance and also briefly for tests pertaining to the power upratings of the units. In addition, there were breaks in power generation at Olkiluoto 2 due to a low electricity demand in Midsummer and turbine balancing. The Olkiluoto units were in power operation in this quarter with the exception of the aforementioned breaks. The load factor average of the four plant units was 87.7%. The events in this quarter had no bearing on the nuclear or radiation safety. Occupational doses and radioactive releases off-site were below authorised limits. Radioactive substances were measurable in samples collected around the plants in such quantities only as have no bearing on the radiation exposure of the population. (orig.)

  3. The 2nd DBCLS BioHackathon: interoperable bioinformatics Web services for integrated applications

    Directory of Open Access Journals (Sweden)

    Katayama Toshiaki

    2011-08-01

    Full Text Available Abstract Background The interaction between biological researchers and the bioinformatics tools they use is still hampered by incomplete interoperability between such tools. To ensure interoperability initiatives are effectively deployed, end-user applications need to be aware of, and support, best practices and standards. Here, we report on an initiative in which software developers and genome biologists came together to explore and raise awareness of these issues: BioHackathon 2009. Results Developers in attendance came from diverse backgrounds, with experts in Web services, workflow tools, text mining and visualization. Genome biologists provided expertise and exemplar data from the domains of sequence and pathway analysis and glyco-informatics. One goal of the meeting was to evaluate the ability to address real world use cases in these domains using the tools that the developers represented. This resulted in (i) a workflow to annotate 100,000 sequences from an invertebrate species; (ii) an integrated system for analysis of the transcription factor binding sites (TFBSs) enriched based on differential gene expression data obtained from a microarray experiment; (iii) a workflow to enumerate putative physical protein interactions among enzymes in a metabolic pathway using protein structure data; (iv) a workflow to analyze glyco-gene-related diseases by searching for human homologs of glyco-genes in other species, such as fruit flies, and retrieving their phenotype-annotated SNPs. Conclusions Beyond deriving prototype solutions for each use-case, a second major purpose of the BioHackathon was to highlight areas of insufficiency. We discuss the issues raised by our exploration of the problem/solution space, concluding that there are still problems with the way Web services are modeled and annotated, including: (i) the absence of several useful data or analysis functions in the Web service "space"; (ii) the lack of documentation of methods; (iii) the lack of compliance with the SOAP/WSDL specification among and between various programming-language libraries; and (iv) incompatibility between various bioinformatics data formats. Although it was still difficult to solve real world problems posed to the developers by the biological researchers in attendance because of these problems, we note the promise of addressing these issues within a semantic framework.

  4. Development of a radioactive waste treatment equipment utilizing microwave heating, 2nd report

    International Nuclear Information System (INIS)

    The objective of the present study is to establish an incineration technique utilizing microwave heating which enables a high volume reduction of spent ion-exchange resins and filtering media generated at nuclear facilities. The three years from 1982 to 1985, with financial aid from the Agency of Science and Technology, brought great and rapid progress to this project when the heating technique was switched from direct microwave heating to indirect heating employing a bed of silicon carbide beads. This material was also used to build a secondary furnace, walls and roster bars, to treat the obnoxious gases and soot arising in the primary incineration process by the radiating heat of this material, heated to above 1000 deg C again by microwave energy rather than by the originally applied direct plasma torch combustion. The incinerator and the secondary furnace were integrated into one unit as the principal treating equipment. This novel approach made possible a well-stabilized continuous incineration operation. Further developmental efforts toward industrial applications were made by setting up a pilot plant with microwave generators (2 sets of 5 kW at 2450 MHz and 1 set of 25 kW at 915 MHz), and tests were carried out that proved a remarkably high volume reduction capability, well above roughly 200 on a weight basis. For hot test runs, a one-tenth scale pilot test setup was installed at the Tokai Laboratory of the Japan Atomic Energy Research Institute and tested with materials spiked with radioisotopes and also with spent ion-exchange resins stored there. Very satisfactory results were obtained in these proving tests, demonstrating the capability for efficient high volume reduction treatment of otherwise stable radioactive waste materials such as spent ion-exchange resins. (author)

  5. Fractally Generated Microstrip Bandpass Filter Designs Based on Dual-Mode Square Ring Resonator for Wireless Communication Systems

    Directory of Open Access Journals (Sweden)

    Jawad K. Ali

    2008-01-01

    Full Text Available A novel fractal design scheme is introduced in this paper to generate microstrip bandpass filter designs with miniaturized sizes for wireless applications. The presented fractal scheme is based on Minkowski-like prefractal geometry. The space-filling property and self-similarity of this fractal geometry have been found to produce reduced-size symmetrical structures corresponding to the successive iteration levels. The resulting filter designs have sizes suitable for use in modern wireless communication systems. The performance of each of the generated bandpass filter structures up to the 2nd iteration has been analyzed using the method of moments (MoM) based software IE3D, which is widely adopted in microwave research and industry. Results show that these filters possess good transmission and return loss characteristics, besides miniaturized sizes, meeting the design specifications of most wireless communication systems.
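    The space-filling property can be quantified as growth in conductor length per iteration at a fixed footprint. The sketch below assumes a generic Minkowski "sausage" generator (each straight segment replaced by 8 segments of 1/4 its length); the paper's Minkowski-like generator may use different ratios, so the numbers are purely illustrative.

```python
def prefractal_length(side, iterations, segments=8, ratio=0.25):
    """Total conductor length of one resonator side after k generator iterations.
    With 8 segments at 1/4 length, electrical length doubles each iteration
    while the outer footprint of the side stays fixed."""
    n_segments = 1
    seg_len = side
    for _ in range(iterations):
        n_segments *= segments
        seg_len *= ratio
    return n_segments * seg_len

side = 10.0  # mm, one side of the square ring resonator (hypothetical value)
lengths = [prefractal_length(side, k) for k in (0, 1, 2)]
print(lengths)
```

Because resonant frequency scales inversely with electrical length, packing more length into the same footprint is what allows the miniaturization reported for the 1st and 2nd iteration filters.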

  6. The 2008—2013 Crisis as Metastasis. A Preview of the 2nd edition of The Cancer Stage of Capitalism by Pluto Press

    Directory of Open Access Journals (Sweden)

    John McMurtry

    2013-03-01

    Full Text Available By means of a selection of relevant excerpts, a preview is offered here of the 2nd edition of John McMurtry's prophetic 1999 book "The Cancer Stage of Capitalism", published by Pluto Press and entitled "The Cancer Stage of Capitalism and Its Cure".

  7. Short rare hTERT-VNTR2-2nd alleles are associated with prostate cancer susceptibility and influence gene expression

    International Nuclear Information System (INIS)

    The hTERT (human telomerase reverse transcriptase) gene contains five variable number tandem repeats (VNTR), and previous studies have described polymorphisms for hTERT-VNTR2-2nd. We investigated how allelic variation in hTERT-VNTR2-2nd may affect susceptibility to prostate cancer. A case-control study was performed using DNA from 421 cancer-free male controls and 329 patients with prostate cancer. In addition, to determine whether the VNTR polymorphisms have a functional consequence, we examined the transcriptional levels of a reporter gene linked to these VNTRs and driven by the hTERT promoter in cell lines. Three new rare alleles were detected in this study, two of which were identified only in cancer subjects. A statistically significant association between rare hTERT-VNTR2-2nd alleles and risk of prostate cancer was observed [OR, 5.17; 95% confidence interval (CI), 1.09-24.43; P = 0.021]. Furthermore, the results indicated that these VNTRs inserted in the enhancer region could influence the expression of hTERT in prostate cancer cell lines. This is the first study to report that rare hTERT VNTRs are associated with prostate cancer predisposition and that the VNTRs can induce enhanced levels of hTERT promoter activity in prostate cancer cell lines. Thus, the hTERT-VNTR2-2nd locus may function as a modifier of prostate cancer risk by affecting gene expression.
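
    The case-control arithmetic behind an odds ratio with a Wald 95% confidence interval is a standard 2x2-table computation. A minimal sketch, with hypothetical allele counts chosen only to illustrate the method (the study's actual cell counts are not given in the abstract):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI for a 2x2 table:
    a/b = carriers/non-carriers among cases, c/d = same among controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# hypothetical counts (NOT from the study): 8 of 329 cases and
# 2 of 421 controls carrying a rare allele give an OR near the reported 5.17
or_, lo, hi = odds_ratio_ci(8, 329 - 8, 2, 421 - 2)
```

    The wide interval (lower bound barely above 1) is typical when the exposure is rare, which matches the reported CI of 1.09-24.43.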

  8. The 2008—2013 crisis as metastasis : a preview of the 2nd edition of The cancer stage of capitalism by Pluto Press

    OpenAIRE

    John McMurtry

    2013-01-01

    By means of a selection of relevant excerpts, a preview is offered here of the 2nd edition of John McMurtry's prophetic 1999 book "The Cancer Stage of Capitalism", published by Pluto Press and entitled "The Cancer Stage of Capitalism and Its Cure".

  9. Software testing concepts and operations

    CERN Document Server

    Mili, Ali

    2015-01-01

    Explores and identifies the main issues, concepts, principles and evolution of software testing, including software quality engineering and testing concepts, test data generation, test deployment analysis, and software test management. This book examines the principles, concepts, and processes that are fundamental to the software testing function. This book is divided into five broad parts. Part I introduces software testing in the broader context of software engineering and explores the qualities that testing aims to achieve or ascertain, as well as the lifecycle of software testing. Part II c

  10. Optimal Planning of an Off-grid Electricity Generation with Renewable Energy Resources using the HOMER Software

    Directory of Open Access Journals (Sweden)

    Hossein Shahinzadeh

    2015-03-01

    Full Text Available In recent years, several factors have led most countries to seek alternative energy sources to fossil fuels: on one hand, the environmental pollution caused by fossil fuels and the various diseases it produces; on the other, concerns about dwindling fossil fuel reserves and the price fluctuations of their products, with the resulting effects on the economy. By 2006, about 18% of the energy consumed worldwide was obtained from renewable sources. Iran is among the countries geographically located in hot and dry areas, with high sun exposure in different months of the year; except on the coasts of the Caspian Sea, the percentage of sunny days throughout the year in Iran is between 63 and 98 percent. On the other hand, there are dispersed, remote areas and loads far from the national grid for which it is impossible to provide electrical energy by transmission from the national grid; for such cases, renewable energy technologies can be used to solve the problem and provide the energy. In this paper, a technical and economic feasibility study of using renewable energies in an off-grid system for a dispersed load on the outskirts of Isfahan (Sepahan), with a maximum energy consumption of 3 kWh per day, is presented. The HOMER simulation software is used as the optimization tool.
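
    What HOMER automates can be illustrated with a toy search over component sizes: enumerate candidate photovoltaic and battery capacities, keep configurations that always meet the load, and pick the cheapest. Everything below (costs, solar profile, the flat 3 kWh/day load) is an illustrative assumption, not data from the study:

```python
PV_COST = 1200.0   # $ per kW of PV, assumed
BATT_COST = 300.0  # $ per kWh of storage, assumed
# hourly PV output per kW installed (kW), a stylized clear-sky day
SUN = [0.0]*6 + [0.1, 0.3, 0.5, 0.7, 0.8, 0.9,
                 0.9, 0.8, 0.7, 0.5, 0.3, 0.1] + [0.0]*6
LOAD = 3.0 / 24.0  # flat hourly load (kWh), 3 kWh per day as in the study

def feasible(pv_kw, batt_kwh, days=3):
    """True if the battery never runs empty over a few repeated days."""
    soc = batt_kwh                       # start with a full battery
    for hour in range(days * 24):
        soc += pv_kw * SUN[hour % 24] - LOAD
        soc = min(soc, batt_kwh)         # surplus beyond capacity is spilled
        if soc < 0:
            return False
    return True

# grid search over candidate sizes, minimizing capital cost
candidates = [(0.25*i, 0.5*j) for i in range(1, 13) for j in range(1, 13)]
best = min((c for c in candidates if feasible(*c)),
           key=lambda c: c[0]*PV_COST + c[1]*BATT_COST)
```

    For this toy profile the search settles on a small array with roughly a day of storage; HOMER additionally models fuel generators, component lifetimes, dispatch strategies, and discounting to a net present cost.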

  11. Incretinas, incretinomiméticos, inhibidores de DPP IV: (2ª parte) / Incretins, Incretinmimetics, Inhibitors (2nd part)

    Scientific Electronic Library Online (English)

    Claudia, Bayón; Mercedes Araceli, Barriga; León, Litwak.

    2010-09-01

    Full Text Available En los últimos años se reconoce un nuevo mecanismo involucrado en la fisiopatología de la Diabetes Mellitus tipo 2: el déficit de producción y/o acción de las incretinas. Las incretinas son enterohormonas que estimulan la secreción de insulina en respuesta a la ingesta de nutrientes. Glucagon like p [...] eptide-1 (GLP1) y Polipéptido insulinotrópico glucosa dependiente (GIP) son las principales incretinas descubiertas hasta hoy. Ambas presentan también efecto trófico sobre las células beta de los islotes pancreáticos. GLP-1 presenta otras acciones como son la inhibición de la secreción de glucagón, enlentecimiento del vaciamiento gástrico e inhibición del apetito. Ambas incretinas son rápidamente clivadas por la enzima dipeptidil peptidasa 4 (DPP-4). Nuevas drogas como los incretinomiméticos, análogos y los inhibidores de DPP-4 se presentan como una terapéutica prometedora para los pacientes con diabetes tipo 2. Conflicto de intereses: Dr. León Litwak - Miembro del Board Latinoamericano de Eli Lilly y Sanofi Aventis - Miembro del Board Nacional de los laboratorios Novo Nordisk, Novartis, GlaxoSmithKline, Sanofi Aventis, Boehringer Ingelheim, Bristol Myers, Astra Zeneca - Investigador principal de protocolos pertenecientes a Eli Lilly, Novo Nordisk, Novartis, GlaxoSmithKline, Takeda, PPDF, Pfizer, Merck Sharp and Dôhme, Amgen, Roche, Minimed, Quintiles - Conferencista de los laboratorios mencionados. Abstract in english Two main pathophysiological mechanisms are currently involved in Type 2 Diabetes (T2DM), insulin resistance and impairment of beta cell function. However, in recent years a new mechanism was reported: a significant decrease in incretins production and/or action. Incretins are gastrointestinal hormon [...] es whose main action is stimulating insulin secretion in response to nutrients. The best known incretins are glucagon like peptide-1 (GLP-1) and Gastric insulinotropic peptide (GIP). 
GLP-1 and GIP not only increase insulin secretion, but also decrease glucagon secretion, slow gastric emptying and reduce appetite, generating weight loss. Both incretins are rapidly cleaved by the enzyme dipeptidyl peptidase 4 (DPP4). In order to emulate incretins' action, several drugs were developed: GLP-1 receptor agonists, GLP-1 mimetics, and DPP4 inhibitors. All of them seem to become a very promising tool for the treatment of T2DM. Financial Interests: Dr. León Litwak - Member of the Latin American Board of Eli Lilly and Sanofi Aventis - Member of the National Board of the following laboratories: Novo Nordisk, Novartis, GlaxoSmithKline, Sanofi Aventis, Boehringer Ingelheim, Bristol Myers, Astra Zeneca - Principal Investigator of Protocols from: Eli Lilly, Novo Nordisk, Novartis, GlaxoSmithKline, Takeda, PPDF, Pfizer, Merck Sharp and Dôhme, Amgen, Roche, Minimed, Quintiles - Lecturer for the former laboratories.

  12. Incretinas, incretinomiméticos, inhibidores de DPP IV: (2ª parte) / Incretins, Incretinmimetics, Inhibitors (2nd part)

    Directory of Open Access Journals (Sweden)

    Claudia Bayón

    2010-09-01

    Full Text Available En los últimos años se reconoce un nuevo mecanismo involucrado en la fisiopatología de la Diabetes Mellitus tipo 2: el déficit de producción y/o acción de las incretinas. Las incretinas son enterohormonas que estimulan la secreción de insulina en respuesta a la ingesta de nutrientes. Glucagon like peptide-1 (GLP1) y Polipéptido insulinotrópico glucosa dependiente (GIP) son las principales incretinas descubiertas hasta hoy. Ambas presentan también efecto trófico sobre las células beta de los islotes pancreáticos. GLP-1 presenta otras acciones como son la inhibición de la secreción de glucagón, enlentecimiento del vaciamiento gástrico e inhibición del apetito. Ambas incretinas son rápidamente clivadas por la enzima dipeptidil peptidasa 4 (DPP-4). Nuevas drogas como los incretinomiméticos, análogos y los inhibidores de DPP-4 se presentan como una terapéutica prometedora para los pacientes con diabetes tipo 2. Conflicto de intereses: Dr. León Litwak - Miembro del Board Latinoamericano de Eli Lilly y Sanofi Aventis - Miembro del Board Nacional de los laboratorios Novo Nordisk, Novartis, GlaxoSmithKline, Sanofi Aventis, Boehringer Ingelheim, Bristol Myers, Astra Zeneca - Investigador principal de protocolos pertenecientes a Eli Lilly, Novo Nordisk, Novartis, GlaxoSmithKline, Takeda, PPDF, Pfizer, Merck Sharp and Dôhme, Amgen, Roche, Minimed, Quintiles - Conferencista de los laboratorios mencionados. Two main pathophysiological mechanisms are currently involved in Type 2 Diabetes (T2DM), insulin resistance and impairment of beta cell function. However, in recent years a new mechanism was reported: a significant decrease in incretins production and/or action. Incretins are gastrointestinal hormones whose main action is stimulating insulin secretion in response to nutrients. The best known incretins are glucagon like peptide-1 (GLP-1) and Gastric insulinotropic peptide (GIP). 
GLP-1 and GIP not only increase insulin secretion, but also decrease glucagon secretion, slow gastric emptying and reduce appetite, generating weight loss. Both incretins are rapidly cleaved by the enzyme dipeptidyl peptidase 4 (DPP4). In order to emulate incretins' action, several drugs were developed: GLP-1 receptor agonists, GLP-1 mimetics, and DPP4 inhibitors. All of them seem to become a very promising tool for the treatment of T2DM. Financial Interests: Dr. León Litwak - Member of the Latin American Board of Eli Lilly and Sanofi Aventis - Member of the National Board of the following laboratories: Novo Nordisk, Novartis, GlaxoSmithKline, Sanofi Aventis, Boehringer Ingelheim, Bristol Myers, Astra Zeneca - Principal Investigator of Protocols from: Eli Lilly, Novo Nordisk, Novartis, GlaxoSmithKline, Takeda, PPDF, Pfizer, Merck Sharp and Dôhme, Amgen, Roche, Minimed, Quintiles - Lecturer for the former laboratories.

  13. User Manual for Beta Version of TURBO-GRD: A Software System for Interactive Two-Dimensional Boundary/ Field Grid Generation, Modification, and Refinement

    Science.gov (United States)

    Choo, Yung K.; Slater, John W.; Henderson, Todd L.; Bidwell, Colin S.; Braun, Donald C.; Chung, Joongkee

    1998-01-01

    TURBO-GRD is a software system for interactive two-dimensional boundary/field grid generation, modification, and refinement. Its features allow users to explicitly control grid quality locally and globally. The grid control can be achieved interactively by using control points that the user picks and moves on the workstation monitor or by direct stretching and refining. The techniques used in the code are the control point form of algebraic grid generation, a damped cubic spline for edge meshing, and parametric mapping between physical and computational domains. It also performs elliptic grid smoothing and free-form boundary control for boundary geometry manipulation. Internal block boundaries are constructed and shaped by using Bézier curves. Because TURBO-GRD is a highly interactive code, users can read in an initial solution, display its solution contour in the background of the grid and control net, and exercise grid modification using the solution contour as a guide. This process can be called interactive solution-adaptive grid generation.
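
    The "control point form of algebraic grid generation" belongs to the transfinite-interpolation family. A minimal sketch of plain 2D transfinite interpolation (a simpler relative of what TURBO-GRD implements, not its actual algorithm), filling the interior of four boundary curves:

```python
import numpy as np

def tfi_grid(bottom, top, left, right):
    """2D transfinite interpolation (Coons patch). bottom/top have shape
    (ni, 2) and run left to right; left/right have shape (nj, 2) and run
    bottom to top; the shared corner points must match."""
    ni, nj = len(bottom), len(left)
    s = np.linspace(0, 1, ni)[:, None, None]   # parameter along bottom/top
    t = np.linspace(0, 1, nj)[None, :, None]   # parameter along left/right
    B, T = bottom[:, None, :], top[:, None, :]
    L, R = left[None, :, :], right[None, :, :]
    # linear blend of opposite edges minus the doubly-counted corners
    return ((1 - t)*B + t*T + (1 - s)*L + s*R
            - (1 - s)*(1 - t)*bottom[0] - s*(1 - t)*bottom[-1]
            - (1 - s)*t*top[0] - s*t*top[-1])   # shape (ni, nj, 2)

# unit square with a curved bottom edge
x = np.linspace(0, 1, 11)
bottom = np.stack([x, 0.1*np.sin(np.pi*x)], axis=1)
top    = np.stack([x, np.ones_like(x)], axis=1)
y = np.linspace(0, 1, 7)
left   = np.stack([np.zeros_like(y), y], axis=1)
right  = np.stack([np.ones_like(y), y], axis=1)
grid = tfi_grid(bottom, top, left, right)   # (11, 7, 2) mesh points
```

    The blend reproduces all four boundary curves exactly; interactive control points, elliptic smoothing, and adaptation are layers that tools like TURBO-GRD add on top of this kind of algebraic core.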

  14. mbs: modifying Hudson's ms software to generate samples of DNA sequences with a biallelic site under selection

    OpenAIRE

    Innan Hideki; Teshima Kosuke M

    2009-01-01

    Abstract Background The pattern of single nucleotide polymorphisms, or SNPs, contains a tremendous amount of information with respect to the mechanisms of the micro-evolutionary process of a species. The inference of the roles of these mechanisms, including natural selection, relies heavily on computer simulations. A coalescent simulation is extremely powerful in generating a large number of samples of DNA sequences from a population (species) when all mutations are neutral, and Hudson's ms s...
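
    The coalescent machinery that ms (and mbs) builds on is compact. A minimal sketch of the neutral single-population case, drawing inter-coalescence times only (no tree topology, mutations, or the selected biallelic site that mbs adds):

```python
import random

def coalescent_times(n, rng):
    """Inter-coalescence times for a sample of n lineages under the
    standard neutral coalescent (time in units of 2N generations):
    while k lineages remain, the next merger is exponentially
    distributed with rate k*(k-1)/2."""
    times = []
    k = n
    while k > 1:
        times.append(rng.expovariate(k * (k - 1) / 2))
        k -= 1
    return times

# Monte Carlo check against theory: E[TMRCA] = 2*(1 - 1/n)
rng = random.Random(42)
n, reps = 10, 20000
avg_tmrca = sum(sum(coalescent_times(n, rng)) for _ in range(reps)) / reps
# for n = 10 the expectation is 1.8
```

    Note that time-scale conventions differ between programs (ms scales time in units of 4N generations), which matters when comparing output across simulators.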

  15. Promoting concrete algorithm for implementation in computer system and data movement in terms of software reuse to generate actual values suitable for different access

    Directory of Open Access Journals (Sweden)

    Nderim Zeqiri

    2013-04-01

    Full Text Available Constructing well-designed, functional algorithms through good programming opens new routes and at the same time increases their usability in mechatronic systems, where a specific and reliable system is required for any practical implementation; this is justified in the economic context and in terms of maintenance, making the system more stable. This flexibility genuinely enables a new approach, making the program code an easy vehicle for updating data; in many cases a quick access method is needed, specified in the context of generating appropriate values for digital systems. This opens new space for better management of the respective values of a program code and for software reuse, because such a solution reduces costs and has a positive effect in terms of a digital economy.

  16. Programed oil generation of the Zubair Formation, Southern Iraq oil fields: Results from Petromod software modeling and geochemical analysis

    Science.gov (United States)

    Al-Ameri, T. K.; Pitman, J.; Naser, M.E.; Zumberge, J.; Al-Haydari, H. A.

    2011-01-01

    1D petroleum system modeling was performed on wells in each of four oil fields in South Iraq: Zubair (well Zb-47), Nahr Umr (well NR-9), West Qurna (wells WQ-15 and 23), and Majnoon (well Mj-8). In each of these fields, deposition of the Zubair Formation was followed by continuous burial, reaching maximum temperatures of 100°C (equivalent to 0.70% Ro) at depths of 3,344-3,750 m in well Zb-47 and 3,081.5-3,420 m in well WQ-15, and 120°C (equivalent to 0.78% Ro) at depths of 3,353-3,645 m in well NR-9 and 3,391-3,691.5 m in well Mj-8. Generation of petroleum in the Zubair Formation began in the late Tertiary, 10 million years ago. At present day, modeled transformation ratios (TR) indicate that 65% of its generation potential has been reached in well Zb-47, 75% in well NR-9, 55-85% in the West Qurna oil field (wells WQ-15 and WQ-23), and up to 95% in well Mj-8. In contrast, younger source rocks are immature to early mature (<20% TR), whereas older source rocks are mature to overmature (100% TR). These basin modeling results for the Basrah region are compared with the Kifle oil field in the Hilla region, west of the Euphrates River, where the Zubair Formation is immature, within a temperature range of 65-70°C (0.50% Ro equivalent) and with up to 12% hydrocarbon generation efficiency (TR = 12%); hence poor generation is assessed at this last location. The Zubair Formation was deposited in a deltaic environment and consists of interbedded shales and porous and permeable sandstones. In the Basrah region, the shales have total organic carbon of 0.5-7.0 wt%, Tmax of 430-470°C, and hydrogen indices of up to 466 with S2 = 0.4-9.4, of kerogen type II and III, and a petroleum potential of 0.4-9.98, indicating good hydrocarbon generation, which is consistent with 55-95% hydrocarbon efficiency.
These generated hydrocarbons charged (in part) the Cretaceous and Tertiary reservoirs, especially the Zubair Formation itself, in the traps formed by the Alpine collision that closed the Tethys Ocean between the Arabian and Eurasian Plates and developed folds in the Mesopotamian Basin 15-10 million years ago. These traps are mainly stratigraphic facies of sandstones with shale that formed during the deposition of the Zubair Formation in transgression and regression phases within the main structural folds of the Zubair, Nahr Umr, West Qurna and Majnoon oil fields. Oil biomarkers of the Zubair Formation reservoirs show source affinity with mixed oil from the Upper Jurassic and Lower Cretaceous strata, including Zubair Formation organic matter, based on presentation of GC and GC-MS results on diagrams of global petroleum systems. © 2010 Saudi Society for Geosciences.

  17. Contractions of 2D 2nd Order Quantum Superintegrable Systems and the Askey Scheme for Hypergeometric Orthogonal Polynomials

    Directory of Open Access Journals (Sweden)

    Ernest G. Kalnins

    2013-10-01

    Full Text Available We show explicitly that all 2nd order superintegrable systems in 2 dimensions are limiting cases of a single system: the generic 3-parameter potential on the 2-sphere, S9 in our listing. We extend the Wigner-Inönü method of Lie algebra contractions to contractions of quadratic algebras and show that all of the quadratic symmetry algebras of these systems are contractions of that of S9. Amazingly, all of the relevant contractions of these superintegrable systems on flat space and the sphere are uniquely induced by the well known Lie algebra contractions of e(2) and so(3). By contracting function space realizations of irreducible representations of the S9 algebra (which give the structure equations for Racah/Wilson polynomials) to the other superintegrable systems, and using Wigner's idea of ''saving'' a representation, we obtain the full Askey scheme of hypergeometric orthogonal polynomials. This relationship directly ties the polynomials and their structure equations to physical phenomena. It is more general because it applies to all special functions that arise from these systems via separation of variables, not just those of hypergeometric type, and it extends to higher dimensions.
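
    The Lie algebra contractions of e(2) and so(3) that the paper leans on can be checked numerically. A small sketch of the standard Inönü-Wigner contraction so(3) → e(2) in the defining matrix representation:

```python
import numpy as np

# so(3) generators in the defining representation: (L_i)_jk = -eps_ijk,
# giving [L1, L2] = L3 and cyclic permutations
L1, L2, L3 = (np.zeros((3, 3)) for _ in range(3))
L1[1, 2], L1[2, 1] = -1.0, 1.0
L2[2, 0], L2[0, 2] = -1.0, 1.0
L3[0, 1], L3[1, 0] = -1.0, 1.0

def comm(a, b):
    return a @ b - b @ a

# Inonu-Wigner contraction so(3) -> e(2): rescale two generators by eps
# (P1 = eps*L1, P2 = eps*L2, J = L3) and let eps -> 0
for eps in (1.0, 0.1, 0.01):
    P1, P2, J = eps * L1, eps * L2, L3
    assert np.allclose(comm(J, P1), P2)            # [J, P1] =  P2 survives
    assert np.allclose(comm(J, P2), -P1)           # [J, P2] = -P1 survives
    assert np.allclose(comm(P1, P2), eps**2 * L3)  # [P1, P2] -> 0
```

    In the limit the rotations P1, P2 become commuting translations, yielding the e(2) relations; the paper's contribution is extending this idea from Lie algebras to the quadratic symmetry algebras of superintegrable systems.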

  18. Report on the 2nd International Consortium on Hallucination Research: evolving directions and top-10 "hot spots" in hallucination research.

    Science.gov (United States)

    Waters, Flavie; Woods, Angela; Fernyhough, Charles

    2014-01-01

    This article presents a report on the 2nd meeting of the International Consortium on Hallucination Research, held on September 12th and 13th 2013 at Durham University, UK. Twelve working groups involving specialists in each area presented their findings and sought to summarize the available knowledge, inconsistencies in the field, and ways to progress. The 12 working groups reported on the following domains of investigation: cortical organisation of hallucinations, nonclinical hallucinations, interdisciplinary approaches to phenomenology, culture and hallucinations, subtypes of auditory verbal hallucinations, a Psychotic Symptoms Rating Scale multisite study, visual hallucinations in the psychosis spectrum, hallucinations in children and adolescents, Research Domain Criteria behavioral constructs and hallucinations, new methods of assessment, psychological therapies, and the Hearing Voices Movement approach to understanding and working with voices. This report presents a summary of this meeting and outlines 10 hot spots for hallucination research, which include the in-depth examination of (1) the social determinants of hallucinations, (2) translation of basic neuroscience into targeted therapies, (3) different modalities of hallucination, (4) domain convergence in cross-diagnostic studies, (5) improved methods for assessing hallucinations in nonclinical samples, (6) using humanities and social science methodologies to recontextualize hallucinatory experiences, (7) developmental approaches to better understand hallucinations, (8) changing the memory or meaning of past trauma to help recovery, (9) hallucinations in the context of sleep and sleep disorders, and (10) subtypes of hallucinations in a therapeutic context. PMID:24282321

  19. Crystal structure of H2[Nd2(H2O)12UMo12O42]·12H2O

    International Nuclear Information System (INIS)

    The crystal structure of a new complex heteropolyacid, H2[Nd2(H2O)12UMo12O42]·12H2O (1), is determined. The crystals are monoclinic: a=24.225(3), b=21.323(3), c=10.982(3) Å, β=95.36(1) deg, sp.gr. B2/n, Z=4. Centrosymmetric anions [Nd2UMo12O42]2-, protons and water molecules are the structural units of 1. The anions are constructed on the base of the uranium-molybdenum anion [UMo12O42]8-, coordinated by two neodymium ions (3). A specific feature of the given structure is the existence of short distances between bridging oxygen atoms of two adjacent anions (2.455 and 2.467 Å). At the same time, structure 1 illustrates two main properties of the [UMo12O42]8- macroligand: the ability to form complexes with metal ions and a tendency toward protonation, producing a network structure through hydrogen bonds.

  20. 2nd order spline interpolation of the Abel transformation for use in cylindrically-symmetric radiative source

    International Nuclear Information System (INIS)

    Inversion of the observed transverse radiance and transmittance, M(z) and N(z), into the radial emission coefficients J(r) in a cylindrically-symmetric radiation source requires solving generalized Abel equations of the form S(z) = 2 ∫_z^R J(r) K(z,r) r dr/√(r² − z²), 0 ≤ z ≤ R. This equation can be solved analytically. In 1981, Young proposed an iteration (YI). After 1990, we proposed successively a piecewise linear interpolation (PLI) and a block-bi-quadric interpolation (BBQI). In this paper, we note that the emission coefficient J(r) is sufficiently smooth and symmetric at r = 0, from which we know that J'(0) = 0. Taking this condition into account, we propose the 2nd-order spline interpolation (2OSI). The Abel equation is separated into a system of linear algebraic equations whose coefficient matrix is an upper Hessenberg matrix, so the equation can be solved easily and rapidly, and the results obtained from the method are smoother than those of the others; experimenters can therefore apply this method easily. We have solved two examples using the present 2OSI. The results show that the computed values converge to the exact solutions with an increase in the number of nodes n. Therefore the method (2OSI) is effective and reasonable.
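
    Although the paper's 2OSI scheme is spline-based, the structure of a discretized Abel inversion (a triangular linear system solved row by row) can be illustrated with the simpler onion-peeling discretization, assuming the standard kernel K ≡ 1:

```python
import numpy as np

def abel_forward_matrix(r):
    """Onion-peeling discretization of S(z_i) = 2*int_{z_i}^{R} J(r) r dr
    / sqrt(r^2 - z_i^2), taking J constant on each annulus [r_j, r_{j+1}]:
    the annulus integral is 2*(sqrt(r_{j+1}^2 - z^2) - sqrt(r_j^2 - z^2)).
    The matrix is upper triangular, so inversion is a back-substitution."""
    n = len(r) - 1
    A = np.zeros((n, n))
    for i in range(n):
        z2 = r[i]**2
        for j in range(i, n):
            A[i, j] = 2.0 * (np.sqrt(r[j + 1]**2 - z2)
                             - np.sqrt(max(r[j]**2 - z2, 0.0)))
    return A

# round trip on a profile with a known transform:
# J = 1 for r < R gives S(z) = 2*sqrt(R^2 - z^2)
R, n = 1.0, 200
r = np.linspace(0.0, R, n + 1)
A = abel_forward_matrix(r)
S = A @ np.ones(n)             # forward projection of a flat profile
J_rec = np.linalg.solve(A, S)  # inversion recovers J = 1
```

    Piecewise-constant shells make the inversion noisy near the axis on real data; higher-order interpolants such as the paper's 2OSI exist precisely to produce smoother reconstructions.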

  1. Explicit formulas for 2nd-order driving terms due to sextupoles and chromatic effects of quadrupoles.

    Energy Technology Data Exchange (ETDEWEB)

    Wang, C-X. (Accelerator Systems Division (APS))

    2012-04-25

    Optimization of nonlinear driving terms has become a useful tool for designing storage rings, especially modern light sources where the strong nonlinearity is dominated by the large chromatic effects of quadrupoles and strong sextupoles for chromaticity control. The Lie algebraic method is well known for computing such driving terms. However, it appears that there was a lack of explicit formulas in the public domain for such computation, resulting in uncertainty and/or inconsistency in widely used codes. This note presents explicit formulas for driving terms due to sextupoles and chromatic effects of quadrupoles, which can be considered as thin elements. The computation is accurate to the 4th-order Hamiltonian and 2nd order in terms of magnet parameters. The results given here are the same as in the APS internal note AOP-TN-2009-020. That internal note has been revised and published here as a Light Source Note in order to get this information into the public domain, since both ELEGANT and OPA are using these formulas.
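
    For flavor, the first-order geometric driving terms from thin sextupoles (the building blocks whose cross products enter 2nd-order formulas like the note's) can be summed directly. The signs and normalizations below follow one common convention and, as the note stresses, conventions differ between codes:

```python
import cmath

def first_order_sextupole_terms(sextupoles):
    """First-order geometric resonance driving terms h21000 and h30000
    from thin-lens sextupoles, summed over the ring. Each element is
    (b3L, beta_x, mu_x): integrated sextupole strength, horizontal beta
    function, and horizontal phase advance (radians) at the magnet."""
    h21000 = -sum(b3L * bx**1.5 * cmath.exp(1j * mu)
                  for b3L, bx, mu in sextupoles) / 8
    h30000 = -sum(b3L * bx**1.5 * cmath.exp(3j * mu)
                  for b3L, bx, mu in sextupoles) / 24
    return h21000, h30000

# the classic -I pair: two equal sextupoles separated by a horizontal
# phase advance of pi cancel both first-order geometric terms
pair = [(1.0, 1.0, 0.0), (1.0, 1.0, cmath.pi)]
h21000, h30000 = first_order_sextupole_terms(pair)
```

    Because each term is a sum of per-magnet phasors, the 2nd-order terms involve double sums over magnet pairs, which is where explicit, convention-consistent formulas become essential.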

  2. Inelastic neutron scattering studies on the 3d-4f heterometallic single-molecule magnet Mn2Nd2

    International Nuclear Information System (INIS)

    The discovery of slow relaxation and quantum tunneling of the magnetization in Mn12ac more than 15 years ago has inspired both physicists and chemists alike. This class of molecules, now called single-molecule magnets (SMMs), has very recently been expanded to heterometallic clusters incorporating transition metal and rare earth ions. The 4f ions were chosen because of their large angular momentum and magnetic anisotropy. Inelastic neutron scattering experiments were performed on the time-of-flight disk-chopper spectrometer IN5 at ILL on the SMM Mn2Nd2. A magnetic model was developed which perfectly describes all data, including the magnetic data. It was found that neither the large anisotropy nor the large angular momentum of the NdIII ions is the main reason for the SMM behavior in this molecule. Our analysis of the data indicates that it is the weak coupling of the NdIII ions to the MnIII ions, usually considered as a drawback of rare earth ions, which enhances the relaxation time and therefore leads to SMM behavior.

  3. [JAN JĘDRZEJEWICZ AND EUROPEAN ASTRONOMY OF THE 2ND HALF OF THE 19TH CENTURY].

    Science.gov (United States)

    Siuda-Bochenek, Magda

    2015-01-01

    Jan Jędrzejewicz was an amateur astronomer who in the 2nd half of the 19th century created an observation centre which, considering the level of research, was comparable to the European ones. Jędrzejewicz settled down in Płońsk in 1862 and worked as a doctor ever since, but his greatest passion was astronomy, to which he dedicated all his free time. In 1875 Jędrzejewicz finished the construction of his observatory. He equipped it with basic astronomical and meteorological instruments, then began his observations and with time became quite skilled at them. Jędrzejewicz focused mainly on binary stars, but he also pointed his telescopes at the planets of the solar system, the comets, the Sun, as well as all the phenomena appearing in the sky at that time. Thanks to the variety of the objects observed and the number of observations he stood out from other observers in Poland and took a very good position in the mainstream of 19th-century astronomy in Europe. Micrometer observations of binary stars made in Płońsk gained recognition in the West and were included in the catalogues of binary stars. Interest in Jędrzejewicz and his observatory was confirmed by numerous references in the English "Nature" magazine. PMID:26455002

  4. MYOB software for dummies

    CERN Document Server

    Curtis, Veechi

    2012-01-01

    Your complete guide to MYOB® AccountRight software. Now in its seventh edition, MYOB® Software For Dummies walks you through everything you need to know, from starting your MYOB® file from scratch and recording payments and receipts, to tracking profit and analysing sales. This new edition includes all the information you need on the new generation of MYOB® AccountRight software, including the new cloud computing features. Set up MYOB® software - understand how to make it work the first time. Keep track of purchases and sales - monitor customer accounts and ensure you get paid

  5. GENII (Generation II): The Hanford Environmental Radiation Dosimetry Software System: Volume 3, Code maintenance manual: Hanford Environmental Dosimetry Upgrade Project

    Energy Technology Data Exchange (ETDEWEB)

    Napier, B.A.; Peloquin, R.A.; Strenge, D.L.; Ramsdell, J.V.

    1988-09-01

    The Hanford Environmental Dosimetry Upgrade Project was undertaken to incorporate the internal dosimetry models recommended by the International Commission on Radiological Protection (ICRP) in updated versions of the environmental pathway analysis models used at Hanford. The resulting second generation of Hanford environmental dosimetry computer codes is compiled in the Hanford Environmental Dosimetry System (Generation II, or GENII). This coupled system of computer codes is intended for analysis of environmental contamination resulting from acute or chronic releases to, or initial contamination of, air, water, or soil, on through the calculation of radiation doses to individuals or populations. GENII is described in three volumes of documentation. This volume is a Code Maintenance Manual for the serious user, including code logic diagrams, global dictionary, worksheets to assist with hand calculations, and listings of the code and its associated data libraries. The first volume describes the theoretical considerations of the system. The second volume is a Users' Manual, providing code structure, users' instructions, required system configurations, and QA-related topics. 7 figs., 5 tabs.

  6. GENII [Generation II]: The Hanford Environmental Radiation Dosimetry Software System: Volume 3, Code maintenance manual: Hanford Environmental Dosimetry Upgrade Project

    International Nuclear Information System (INIS)

    The Hanford Environmental Dosimetry Upgrade Project was undertaken to incorporate the internal dosimetry models recommended by the International Commission on Radiological Protection (ICRP) in updated versions of the environmental pathway analysis models used at Hanford. The resulting second generation of Hanford environmental dosimetry computer codes is compiled in the Hanford Environmental Dosimetry System (Generation II, or GENII). This coupled system of computer codes is intended for analysis of environmental contamination resulting from acute or chronic releases to, or initial contamination of, air, water, or soil, on through the calculation of radiation doses to individuals or populations. GENII is described in three volumes of documentation. This volume is a Code Maintenance Manual for the serious user, including code logic diagrams, global dictionary, worksheets to assist with hand calculations, and listings of the code and its associated data libraries. The first volume describes the theoretical considerations of the system. The second volume is a Users' Manual, providing code structure, users' instructions, required system configurations, and QA-related topics. 7 figs., 5 tabs

  7. Software Engineering Program: Software Process Improvement Guidebook

    Science.gov (United States)

    1996-01-01

    The purpose of this document is to provide experience-based guidance in implementing a software process improvement program in any NASA software development or maintenance community. This guidebook details how to define, operate, and implement a working software process improvement program. It describes the concept of the software process improvement program and its basic organizational components. It then describes the structure, organization, and operation of the software process improvement program, illustrating all these concepts with specific NASA examples. The information presented in the document is derived from the experiences of several NASA software organizations, including the SEL, the SEAL, and the SORCE. Their experiences reflect many of the elements of software process improvement within NASA. This guidebook presents lessons learned in a form usable by anyone considering establishing a software process improvement program within his or her own environment. This guidebook attempts to balance general and detailed information. It provides material general enough to be usable by NASA organizations whose characteristics do not directly match those of the sources of the information and models presented herein. It also keeps the ideas sufficiently close to the sources of the practical experiences that have generated the models and information.

  8. Phase relationship in the TiO2-Nd2O3 pseudo-binary system

    Energy Technology Data Exchange (ETDEWEB)

    Gong, Weiping, E-mail: weiping_gong@csu.edu.cn [State Key Laboratory of Powder Metallurgy, Central South University, Changsha 410083, Hunan (China); Laboratory of Electronic Functional Materials, Huizhou University, Huizhou 516001, Guangdong (China); Zhang, Rui [State Key Laboratory of Powder Metallurgy, Central South University, Changsha 410083, Hunan (China)

    2013-01-25

    Highlights: • DSC and XRD measurements for the TiO{sub 2}-Nd{sub 2}O{sub 3} system. • Nd{sub 2}Ti{sub 2}O{sub 7}, Nd{sub 2}TiO{sub 5}, Nd{sub 2}Ti{sub 3}O{sub 9} and Nd{sub 4}Ti{sub 9}O{sub 24} exist. • Nd{sub 2}Ti{sub 4}O{sub 11} and Nd{sub 4}Ti{sub 9}O{sub 24} were the same compound. • Thermodynamic calculation on the TiO{sub 2}-Nd{sub 2}O{sub 3} system. - Abstract: Phase equilibria in the TiO{sub 2}-Nd{sub 2}O{sub 3} system have been experimentally investigated via X-ray diffraction (XRD) and differential scanning calorimetry (DSC). Four compounds, Nd{sub 2}Ti{sub 2}O{sub 7}, Nd{sub 2}TiO{sub 5}, Nd{sub 2}Ti{sub 3}O{sub 9} and Nd{sub 4}Ti{sub 9}O{sub 24}, were confirmed to exist. The literature-reported Nd{sub 2}Ti{sub 4}O{sub 11} was proved to be the same compound as Nd{sub 4}Ti{sub 9}O{sub 24}, and the reported phase transformation of Nd{sub 2}Ti{sub 4}O{sub 11} from the {alpha} structure to {beta} at 1373 K was not detected. All phase diagram data from both the literature and the present work were critically reviewed and taken into account during the thermodynamic optimization of the TiO{sub 2}-Nd{sub 2}O{sub 3} system. A set of consistent thermodynamic parameters, which can explain most of the experimental data for the TiO{sub 2}-Nd{sub 2}O{sub 3} system, was achieved. The calculated phase diagram of the TiO{sub 2}-Nd{sub 2}O{sub 3} system is provided.

  9. 2nd SUMO Conference

    CERN Document Server

    Weber, Melanie

    2015-01-01

    This contributed volume contains the conference proceedings of the Simulation of Urban Mobility (SUMO) conference 2014, Berlin. The included research papers cover a wide range of topics in traffic planning and simulation, including open data, vehicular communication, e-mobility, urban mobility, multimodal traffic as well as usage approaches. The target audience primarily comprises researchers and experts in the field, but the book may also be beneficial for graduate students.  

  10. 2nd Bozeman Conference

    CERN Document Server

    Lund, John

    1991-01-01

    This volume contains a collection of papers delivered by the participants at the second Conference on Computation and Control held at Montana State University in Bozeman, Montana from August 1-7, 1990. The conference, as well as this proceedings, attests to the vitality and cohesion between the control theorist and the numerical analyst that was advertised by the first Conference on Computation and Control in 1988. The proceedings of that initial conference was published by Birkhäuser Boston as the first volume of this same series, entitled Computation and Control, Proceedings of the Bozeman Conference, Bozeman, Montana, 1988. Control theory and numerical analysis are both, by their very nature, interdisciplinary subjects, as evidenced by their interaction with other fields of mathematics and engineering. While it is clear that new control or estimation algorithms and new feedback design methodologies will need to be implemented computationally, it is likewise clear that new problems in computation...

  11. Universe (2nd edition)

    International Nuclear Information System (INIS)

    A general text on astronomy is presented. The foundations of the science are reviewed, including descriptions of naked-eye observations of eclipses and planetary motions and such basic tools as Kepler's laws, the fundamental properties of light, and the optics of telescopes. The formation of the solar system is addressed, and the planets and their satellites are discussed individually. Solar science is treated in detail. Stellar evolution is described chronologically from birth to death. Molecular clouds, star clusters, nebulae, neutron stars, black holes, and various other phenomena that occur in the life of a star are examined in the sequence in which they naturally occur. A survey of the Milky Way introduces galactic astronomy. Quasars and cosmology are addressed, including the most recent developments in research. 156 references

  12. 2nd INTERA Conference

    CERN Document Server

    2014-01-01

    This book presents the latest scientific research related to the field of Robotics. It covers topics such as biomedicine, energy efficiency, home automation and robotics. The book is written by technical experts and researchers from academia and industry working on robotics applications. The book could be used as supplementary material for courses related to Robotics and Domotics.

  13. 2nd Abel Symposium

    CERN Document Server

    Nunno, Giulia; Lindstrøm, Tom; Øksendal, Bernt; Zhang, Tusheng

    2007-01-01

    Kiyosi Ito, the founder of stochastic calculus, is one of the few central figures of twentieth-century mathematics who reshaped the mathematical world. Today stochastic calculus is a central research field with applications in several other disciplines, for example physics, engineering, biology, economics and finance. The Abel Symposium 2005 was organized as a tribute to the work of Kiyosi Ito on the occasion of his 90th birthday. Distinguished researchers from all over the world were invited to present the newest developments within the exciting and fast-growing field of stochastic analysis. The present volume combines both papers from the invited speakers and contributions by the presenting lecturers. A special feature is the Memoirs that Kiyosi Ito wrote for this occasion. These are valuable pages for both young and established researchers in the field.

  14. 2nd ISAAC Congress

    CERN Document Server

    Gilbert, Robert; Kajiwara, Joji

    2000-01-01

    This book is the Proceedings of the Second ISAAC Congress. ISAAC is the acronym of the International Society for Analysis, its Applications and Computation. The president of ISAAC is Professor Robert P. Gilbert, the second named editor of this book, e-mail: gilbert@math.udel.edu. The Congress is so highly valued worldwide that an application for a grant was selected, and this project has been executed with Grant No. 11-56 from the Commemorative Association for the Japan World Exposition (1970). The publication of this book is financed exclusively by the said Grant No. 11-56. Thus, a pair of copies of the two volumes of this book will be sent to all contributors who registered at the Second ISAAC Congress in Fukuoka, free of charge, by Kluwer Academic Publishers. Analysis is understood here in the broad sense of the word, including differential equations, integral equations, functional analysis, and function theory. It is the purpose of ISAAC to promote analysis, its applications, and...

  15. Multi-atom resonant photoemission and the development of next-generation software and high-speed detectors for electron spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Kay, Alexander William

    2000-09-01

    This dissertation has involved the exploration of a new effect in photoelectron emission, multi-atom resonant photoemission (MARPE), as well as the development of new software, data analysis techniques, and detectors of general use in such research. We present experimental and theoretical results related to MARPE, in which the photoelectron intensity from a core level on one atom is influenced by a core-level absorption resonance on another. We point out that some of our and others prior experimental data has been strongly influenced by detector non-linearity and that the effects seen in new corrected data are smaller and of different form. Corrected data for the MnO(001) system with resonance between the O 1s and Mn 2p energy levels are found to be well described by an extension of well-known intraatomic resonant photoemission theory to the interatomic case, provided that interactions beyond the usual second-order Kramers-Heisenberg treatment are included. This theory is also found to simplify under certain conditions so as to yield results equivalent to a classical x-ray optical approach, with the latter providing an accurate and alternative, although less detailed and general, physical picture of these effects. Possible future applications of MARPE as a new probe of near-neighbor identities and bonding and its relationship to other known effects are also discussed. We also consider in detail specially written data acquisition software that has been used for most of the measurements reported here. This software has been used with an existing experimental system to develop the method of detector characterization and then data correction required for the work described above. The development of a next generation one-dimensional, high-speed, electron detector is also discussed. 
Our goal has been to design, build and test a prototype high-performance, one-dimensional pulse-counting detector that represents a significant advancement in detector technology and is well matched to modern high-brightness synchrotron radiation sources and the high-transmission electron-energy analyzers typically used in photoelectron spectroscopy experiments. The general design of the detector and the results of initial tests are discussed, and the acquisition of photoelectron spectra with the first test detector is described.

  16. Multi-atom resonant photoemission and the development of next-generation software and high-speed detectors for electron spectroscopy

    International Nuclear Information System (INIS)

    This dissertation has involved the exploration of a new effect in photoelectron emission, multi-atom resonant photoemission (MARPE), as well as the development of new software, data analysis techniques, and detectors of general use in such research. We present experimental and theoretical results related to MARPE, in which the photoelectron intensity from a core level on one atom is influenced by a core-level absorption resonance on another. We point out that some of our and others prior experimental data has been strongly influenced by detector non-linearity and that the effects seen in new corrected data are smaller and of different form. Corrected data for the MnO(001) system with resonance between the O 1s and Mn 2p energy levels are found to be well described by an extension of well-known intraatomic resonant photoemission theory to the interatomic case, provided that interactions beyond the usual second-order Kramers-Heisenberg treatment are included. This theory is also found to simplify under certain conditions so as to yield results equivalent to a classical x-ray optical approach, with the latter providing an accurate and alternative, although less detailed and general, physical picture of these effects. Possible future applications of MARPE as a new probe of near-neighbor identities and bonding and its relationship to other known effects are also discussed. We also consider in detail specially written data acquisition software that has been used for most of the measurements reported here. This software has been used with an existing experimental system to develop the method of detector characterization and then data correction required for the work described above. The development of a next generation one-dimensional, high-speed, electron detector is also discussed. 
Our goal has been to design, build and test a prototype high-performance, one-dimensional pulse-counting detector that represents a significant advancement in detector technology and is well matched to modern high-brightness synchrotron radiation sources and the high-transmission electron-energy analyzers typically used in photoelectron spectroscopy experiments. The general design of the detector and the results of initial tests are discussed, and the acquisition of photoelectron spectra with the first test detector is described.

  17. IP Modularity in Software Products and Software Platform Ecosystems

    OpenAIRE

    Waltl, Josef

    2013-01-01

    This dissertation examines the impact of an architecture that is modular with respect to Intellectual Property (IP) on software products and software platform ecosystems. The results extend the existing literature on IP modularity by demonstrating a direct association between an IP-modular product/platform architecture and the business models of software product or software platform developers seeking to generate profits. In addition, the early consideration of IP requirements in the requirement...

  18. ENABLE -- A systolic 2nd level trigger processor for track finding and e/π discrimination for ATLAS/LHC

    International Nuclear Information System (INIS)

    The Enable Machine is a systolic 2nd level trigger processor for the transition radiation detector (TRD) of ATLAS/LHC. It is developed within the EAST/RD-11 collaboration at CERN. The task of the processor is to find electron tracks and to reject pion tracks according to the EAST benchmark algorithm in less than 10 μs. Tracks are identified by template matching in a (φ,z) region of interest (RoI) selected by a 1st level trigger. In the (φ,z) plane tracks of constant curvature are straight lines. The relevant lines form mask templates. Track identification is done by histogramming the coincidences of the templates and the RoI data for each possible track. The Enable Machine is an array processor that handles tracks of the same slope in parallel, and tracks of different slope in a pipeline. It is composed of two units, the Enable histogrammer unit and the Enable z/φ-board. The interface daughter board is equipped with a HIPPI interface developed at JINR/Dubna, and Xilinx 'corner turning' data converter chips. Enable uses programmable gate arrays (Xilinx) for histogramming and synchronous SRAMs for pattern storage. With a clock rate of 40 MHz the trigger decision time is 6.5 μs and the latency 7.0 μs. The Enable Machine is scalable in the RoI size as well as in the number of tracks processed. It can be adapted to different recognition tasks and detector setups. The prototype of the Enable Machine was tested in a beam time of the RD6 collaboration at CERN in October 1993
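The coincidence-histogramming scheme described above can be sketched in software. This is an illustrative model of the idea only, not the Enable Machine's hardware implementation: each candidate track slope defines a straight-line mask template over the (φ,z) region of interest, hits are matched against every template, and the template with the highest coincidence count identifies the track.

```python
# Illustrative sketch of template matching by coincidence histogramming.
# Tracks of constant curvature appear as straight lines in the (phi, z)
# plane, so each candidate slope defines a mask template; RoI hits that
# coincide with a template are histogrammed, and a high count flags a track.

def line_template(slope, intercept, n_cols):
    """Cells (col, row) covered by a straight line across the RoI."""
    return {(col, round(intercept + slope * col)) for col in range(n_cols)}

def histogram_templates(hits, templates):
    """Count hit/template coincidences for every candidate track."""
    return [sum(1 for cell in tpl if cell in hits) for tpl in templates]

# Toy RoI: hits lying on a line of slope 1 through the origin, plus noise.
hits = {(c, c) for c in range(8)} | {(2, 5), (6, 1)}
templates = [line_template(s, 0, 8) for s in (0, 1, 2)]
counts = histogram_templates(hits, templates)
best = max(range(len(counts)), key=counts.__getitem__)
print(counts, best)  # the slope-1 template collects all 8 signal hits
```

The real processor evaluates the templates for one slope in parallel and pipelines the different slopes; the sequential loops here only mirror the arithmetic.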

  19. The Effects of Star Strategy of Computer-Assisted Mathematics Lessons on the Achievement and Problem Solving Skills in 2nd Grade Courses

    Directory of Open Access Journals (Sweden)

    Jale İPEK

    2013-12-01

    Full Text Available The aim of the research is to determine the effect of the STAR strategy on 2nd grade students’ academic achievement and problem solving skills in computer-assisted mathematics instruction. The study group comprised 30 students attending the 2nd grade of a primary school in Aydın in the 2010-2011 academic year. The research took place over 7 weeks. Three data collection instruments were used: an “Academic Achievement Test”, a “Problem Solving Achievement Test” and “The Evaluation Form of Problem Solving Skills”. At the end of the research, students’ views about computer-assisted mathematics instruction were evaluated. It was examined whether the differences between the pre-test and post-test scores are statistically meaningful. According to the results, a positive increase in academic achievement and problem solving skills was determined at the end of the instruction carried out with the STAR strategy.

  20. Comparative analysis of effectiveness of treatment with anti-TB drugs of the 1st and 2nd lines for children and adolescents with multidrug resistant tuberculosis

    Directory of Open Access Journals (Sweden)

    Tleukhan Abildaev

    2012-05-01

    Full Text Available The paper shows the results of a study on comparative treatment effectiveness in children and adolescents with multidrug-resistant tuberculosis (MDR TB) (2000-2008) treated with anti-TB drugs of the 2nd line (80 patients) and 1st line (80 patients) in Kazakhstan. In patients with MDR TB, treatment outcomes were successful in 91.2%, but relapse of TB disease occurred in 12.7% of cases, and 5 (6.2%) patients died (P ≤ 0.05). Thus, patients with MDR TB need to be treated with anti-TB drugs of the 2nd line according to their DST results.

  1. 2nd Annual Workshop Proceedings of the Collaborative Project "Redox Phenomena Controlling Systems" (7th EC FP CP RECOSY) (KIT Scientific Reports ; 7557)

    OpenAIRE

    Buckau, Gunnar [Hrsg.; Kienzler, Bernhard; Duro, Lara; Grivé, Mireia; Montoya, Vanessa [Hrsg.; ,

    2010-01-01

    These are proceedings of the 2nd Annual Workshop of the EURATOM FP7 Collaborative Project "Redox Phenomena Controlling System", held in Larnaca (Cyprus) 16th to 19th March 2010. The project deals with the impact of redox processes on the long-term safety of nuclear waste disposal. The proceedings have six workpackage overview contributions, and 21 reviewed scientific-technical short papers. The proceedings document the scientific-technical progress of the second project year.

  2. Central European Societies of Fortified Settlements in the First Half of the 2nd Millenium BC. Comparative Study of Trial Areas

    OpenAIRE

    Jaeger, Mateusz

    2011-01-01

    The dissertation is aimed at a description and characterization of the central European societies of fortified settlements in the first half of the 2nd millennium BC. Because of the high number of such sites and the different stages of research in particular regions, it was necessary to select the sources. Four trial areas were chosen: the Alpine area, south-western Wielkopolska, the middle Danube basin and the upper Tisa basin. All of them were related to different cultural units: inner-Alpine Bronze Age grou...

  3. A STUDY OF OSSIFICATION OF HEADS OF 2ND TO 5TH METACARPALS IN FORENSIC AGE ESTIMATION IN THE KERALA POPULATION

    Directory of Open Access Journals (Sweden)

    Ajay

    2013-12-01

    Full Text Available The current study aims to determine methods for estimating age from the ossification of the heads of the 2nd to 5th metacarpals in the Kerala population, using 85 wrist X-rays of children aged less than 5 years. ROC curve analysis was the statistical tool employed in the study. KEYWORDS: Age determination in the living, Forensic Age Estimation, Metacarpal, Head of Metacarpals, Kerala Data, Skeletal Change

  4. What is Your Software Worth?

    OpenAIRE

    Wiederhold, Gio

    2005-01-01

    This article presents a method for valuing software based on the income that use of that software is expected to generate in the future. Well-known principles of intellectual property (IP) valuation, sales expectations, discounting to present value, and the like, are applied, always focusing on the benefits and costs of software. A major issue, not dealt with in the literature of valuing intangibles, is that software is continually upgraded. Applying depreciation schedules is the simple solu...
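The income-based valuation the abstract describes reduces, at its core, to discounting an expected future cash-flow stream to present value. A minimal sketch, in which the cash-flow figures and discount rate are illustrative assumptions rather than data from the article:

```python
# Hedged sketch of income-based software valuation: sum the income that
# use of the software is expected to generate in each future year,
# discounted back to today at a chosen rate. All numbers are invented
# for illustration.

def present_value(cash_flows, rate):
    """Sum of future cash flows discounted to today at `rate` per period."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

# Five years of expected income from software use, discounted at 10%.
income = [100_000, 120_000, 120_000, 90_000, 60_000]
value = present_value(income, 0.10)
print(round(value, 2))
```

The upgrade issue the article raises would enter such a model by shortening the income stream of each release and layering the streams of successive versions.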

  5. Software engineering

    CERN Document Server

    Sommerville, Ian

    2010-01-01

    Intended for introductory and advanced courses in software engineering. The ninth edition of Software Engineering presents a broad perspective of software engineering, focusing on the processes and techniques fundamental to the creation of reliable software systems. Increased coverage of agile methods and software reuse, along with coverage of 'traditional' plan-driven software engineering, gives readers the most up-to-date view of the field currently available. Practical case studies, a full set of easy-to-access supplements, and extensive web resources make teaching the course easier than ever. The book is now structured into four parts: 1: Introduction to Software Engineering 2: Dependability and Security 3: Advanced Software Engineering 4: Software Engineering Management

  6. Preparation of nanometric CeO2-ZrO2-Nd2O3 solid solution and its catalytic performances

    International Nuclear Information System (INIS)

    A nanometric CeO2-ZrO2-Nd2O3 (CZN) solid solution, intended as a carrier in automotive three-way catalysts, was synthesized by a coprecipitation method and characterized by means of X-ray diffraction (XRD), X-ray photoelectron spectroscopy (XPS), nitrogen adsorption-desorption (BET), scanning electron microscopy (SEM) and oxygen storage capacity (OSC) measurements. For the purpose of comparison, an unincorporated CeO2-ZrO2 (CZ) was also synthesized. The XRD measurements disclose that the prepared CeO2-ZrO2-Nd2O3 has a face-centered cubic fluorite structure and nanometric particle sizes. According to the XPS results, Nd3+ ions can enter the CZ lattice and form a homogeneous solid solution. Oxygen storage capacity measurements reveal that CeO2-ZrO2-Nd2O3 displays high oxygen mobility at low temperature. The results of the activity tests show that the catalyst exhibits good three-way catalytic activity and a fairly wide range of air-to-fuel ratios.

  7. Thematic network on the analysis of thorium and its isotopes in workplace materials. Report on the 2nd intercomparison exercise

    International Nuclear Information System (INIS)

    Work Package 2 (WP 2) of the EC Thematic Network on 'The analysis of thorium and its isotopes in workplace materials' is concerned with 'Examination and comparison of analytical techniques for the determination of thorium and its progeny in bulk materials and the development of standards, both for the calibration of associated metrological facilities, and for routine quality control purposes'. The primary objective of WP 2 is 'To evaluate appropriate techniques and determine best practice for analysis of 232Th at workplace levels and environments', and is to be achieved through a series of intercomparison exercises. Following the results of the 1st intercomparison exercise, a 2nd intercomparison exercise was designed to evaluate the capability of the analytical methods currently being used by European laboratories to determine thorium at very low levels in the presence of a complex inorganic matrix. The intercomparison exercise involved the analysis of three samples of thorium in solution, prepared by the United Kingdom National Physical Laboratory. One sample was the equilibrium solution analysed in the 1st intercomparison exercise. The other two samples were prepared by dilution of a non-equilibrium thorium solution, one of which was spiked with impurity elements. (i) The results showed a further improvement in the accuracy of measurements in the equilibrium solution, when compared with the results from the 1st intercomparison study. There was an overall drop in u-test values, and the number of u-test values above the upper limit of significance fell from 6 to 3. (ii) Overall, participating laboratories also performed well in the analysis of the low-level non-equilibrium solutions. However, several laboratories using γ-spectrometry had insufficient sensitivity for measurement of the low-level non-equilibrium sample solutions and did not report results. (iii) There was little evidence that the presence of impurities had a detrimental effect on measurements made on the low-level non-equilibrium sample solutions. (iv) Some laboratories used different analytical techniques in this work than in the 1st intercomparison study. Consequently, there is a slight change in the spread of analytical techniques used.
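The u-test referred to in the results is the standard score used to judge intercomparison data: the deviation of a laboratory's result from the reference value, normalised by the combined standard uncertainty. A minimal sketch with hypothetical numbers, not values from this exercise:

```python
# Minimal sketch of the u-test (u-score) used in intercomparison exercises.
# Scores below the significance limit (about 1.96 at 95% confidence)
# indicate agreement with the reference value. The measurement values and
# uncertainties below are illustrative assumptions.
from math import sqrt

def u_score(lab_value, lab_unc, ref_value, ref_unc):
    """|lab - ref| divided by the combined standard uncertainty."""
    return abs(lab_value - ref_value) / sqrt(lab_unc**2 + ref_unc**2)

# A hypothetical 232Th activity measurement against a reference solution.
u = u_score(lab_value=10.4, lab_unc=0.3, ref_value=10.0, ref_unc=0.2)
print(round(u, 2), "significant" if u > 1.96 else "consistent")
```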

  8. PREFACE: The 2nd International Conference on Geological, Geographical, Aerospace and Earth Sciences 2014 (AeroEarth 2014)

    Science.gov (United States)

    Lumban Gaol, Ford; Soewito, Benfano

    2015-01-01

    The 2nd International Conference on Geological, Geographical, Aerospace and Earth Sciences 2014 (AeroEarth 2014) was held at Discovery Kartika Plaza Hotel, Kuta, Bali, Indonesia during 11-12 October 2014. The AeroEarth 2014 conference aims to bring together researchers and engineers from around the world. Earth provides resources and the exact conditions to make life possible. However, with the advent of technology and industrialization, the Earth's resources are being pushed to the brink of depletion. Non-sustainable industrial practices are not only endangering the supply of the Earth's natural resources, but are also putting burden on life itself by bringing about pollution and climate change. A major role of earth science scholars is to examine the delicate balance between the Earth's resources and the growing demands of industrialization. Through research and development, earth scientists have the power to preserve the planet's different resource domains by providing expert opinion and information about the forces which make life possible on Earth. We would like to express our sincere gratitude to all in the Technical Program Committee who have reviewed the papers and developed a very interesting conference program, as well as to the invited and plenary speakers. This year, we received 98 papers and, after rigorous review, 17 papers were accepted. The participants come from eight countries. There are four parallel sessions and two invited speakers. It is an honour to present this volume of IOP Conference Series: Earth and Environmental Science (EES) and we deeply thank the authors for their enthusiastic and high-grade contributions.
Finally, we would like to thank the conference chairmen, the members of the steering committee, the organizing committee, the organizing secretariat, and the conference sponsors whose financial support allowed the success of AeroEarth 2014. The Editors of the AeroEarth 2014 Proceedings: Dr. Ford Lumban Gaol and Dr. Benfano Soewito

  9. Software reliability

    CERN Document Server

    Bendell, A

    1986-01-01

    Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. This book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion on failure rate in software reliability growth mo

  10. Improving Software Citation and Credit

    CERN Document Server

    Allen, Alice; DuPrie, Kimberly; Mink, Jessica; Nemiroff, Robert; Robitaille, Thomas; Shamir, Lior; Shortridge, Keith; Taylor, Mark; Teuben, Peter; Wallin, John

    2015-01-01

    The past year has seen movement on several fronts for improving software citation, including the Center for Open Science's Transparency and Openness Promotion (TOP) Guidelines, the Software Publishing Special Interest Group that was started at January's AAS meeting in Seattle at the request of that organization's Working Group on Astronomical Software, a Sloan-sponsored meeting at GitHub in San Francisco to begin work on a cohesive research software citation-enabling platform, the work of Force11 to "transform and improve" research communication, and WSSSPE's ongoing efforts that include software publication, citation, credit, and sustainability. Brief reports on these efforts were shared at the BoF, after which participants discussed ideas for improving software citation, generating a list of recommendations to the community of software authors, journal publishers, ADS, and research authors. The discussion, recommendations, and feedback will help form recommendations for software citation to those publishers...

  11. Second-order adjoint sensitivity analysis methodology (2nd-ASAM) for computing exactly and efficiently first- and second-order sensitivities in large-scale linear systems: I. Computational methodology

    Science.gov (United States)

    Cacuci, Dan G.

    2015-03-01

    This work presents the second-order forward and adjoint sensitivity analysis methodologies (2nd-FSAM and 2nd-ASAM) for computing exactly and efficiently the second-order functional derivatives of physical (engineering, biological, etc.) system responses (i.e., "system performance parameters") to the system's model parameters. The definition of "system parameters" used in this work includes all computational input data, correlations, initial and/or boundary conditions, etc. For a physical system comprising Nα parameters and Nr responses, we note that the 2nd-FSAM requires a total of (Nα²/2 + 3Nα/2) large-scale computations for obtaining all of the first- and second-order sensitivities, for all Nr system responses. On the other hand, for one functional-type system response, the 2nd-ASAM requires one large-scale computation using the first-level adjoint sensitivity system for obtaining all of the first-order sensitivities, followed by at most Nα large-scale computations using the second-level adjoint sensitivity systems for obtaining exactly all of the second-order sensitivities. Therefore, the 2nd-FSAM should be used when Nr ≫ Nα, while the 2nd-ASAM should be used when Nα ≫ Nr. The original 2nd-ASAM presented in this work should enable the hitherto very difficult, if not intractable, exact computation of all of the second-order response sensitivities (i.e., functional Gateaux derivatives) for large systems involving many parameters, as usually encountered in practice. Very importantly, the implementation of the 2nd-ASAM requires very little additional effort beyond the construction of the adjoint sensitivity system needed for computing the first-order sensitivities.
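The computational-cost comparison stated in the abstract can be checked directly. The sketch below simply encodes the operation counts quoted above; it is bookkeeping, not the sensitivity analysis itself:

```python
# Cost bookkeeping for the two methodologies as stated in the abstract:
# 2nd-FSAM needs (Na**2/2 + 3*Na/2) large-scale computations for all
# responses, while 2nd-ASAM needs at most (1 + Na) per functional-type
# response, i.e. about Nr*(1 + Na) in total for Nr responses.

def fsam_cost(n_alpha):
    """Total 2nd-FSAM computations, independent of the response count."""
    return n_alpha**2 / 2 + 3 * n_alpha / 2

def asam_cost(n_alpha, n_responses):
    """Upper bound on total 2nd-ASAM computations for n_responses."""
    return n_responses * (1 + n_alpha)

# Many parameters, one response: the adjoint method wins decisively.
print(fsam_cost(1000), asam_cost(1000, 1))   # 501500.0 vs 1001
# Few parameters, many responses: the forward method wins.
print(fsam_cost(10), asam_cost(10, 500))     # 65.0 vs 5500
```

This is exactly the Nr ≫ Nα versus Nα ≫ Nr trade-off the abstract describes.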

  12. TESTING FOR OBJECT ORIENTED SOFTWARE

    Directory of Open Access Journals (Sweden)

    Jitendra S. Kushwah

    2011-02-01

    Full Text Available This paper deals with the design and development of an automated testing tool for Object Oriented (OO) software. By an automated testing tool, we mean a tool that automates a part of the testing process. It can include one or more of the following processes: test strategy generation, test case generation, test case execution, test data generation, and reporting and logging of results. By object-oriented software we mean software designed using an OO approach and implemented using an OO language. Testing of OO software is different from testing software created using procedural languages, and poses several new challenges. In the past, most methods for testing OO software were simple extensions of existing methods for conventional software; however, these have been shown to be inappropriate, and new techniques have been developed. This work has mainly focused on testing design specifications for OO software. As described later, there is a lack of specification-based testing tools for OO software. An advantage of testing software specifications, as compared to program code, is that specifications are generally correct whereas code is flawed. Moreover, with software engineering principles firmly established in the industry, most software developed nowadays follows all the steps of the Software Development Life Cycle (SDLC). For this work, UML specifications created in Rational Rose are taken. UML has become the de-facto standard for analysis and design of OO software. Testing is conducted at three levels: unit, integration and system. At the system level there is no difference between the testing techniques used for OO software and other software created using a procedural language, and hence conventional techniques can be used. This tool provides features for testing at the unit (class) level as well as the integration level. Further, a maintenance-level component has also been incorporated.
Results of applying this tool to sample Rational Rose files have been incorporated, and have been found to be satisfactory.
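One common specification-based approach of the kind the abstract describes is to derive unit-level test cases from a state-machine model of a class, one per modelled transition. The sketch below uses a toy dictionary model as a stand-in for the tool's actual UML/Rational Rose input; the model format and example are assumptions, not the tool's design:

```python
# Illustrative sketch of specification-based test generation: from a
# (much simplified) state-machine model of a class, derive one test case
# per transition so that every transition is exercised at least once
# (transition coverage) at the unit/class level.

spec = {  # (source state, event) -> next state
    ("Idle", "open"): "Open",
    ("Open", "write"): "Open",
    ("Open", "close"): "Idle",
}

def transition_tests(spec):
    """One (source state, event, expected next state) case per transition."""
    return [(src, event, dst) for (src, event), dst in spec.items()]

cases = transition_tests(spec)
for src, event, expected in cases:
    print(f"in state {src!r}, fire {event!r}, expect {expected!r}")
```

A real tool would additionally generate set-up sequences to reach each source state and oracles to check the resulting object state.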

  13. A software engineering process for safety-critical software application

    International Nuclear Information System (INIS)

    Application of computer software to safety-critical systems is on the increase. To be successful, the software must be designed and constructed to meet the functional and performance requirements of the system. For safety reasons, the software must be demonstrated not only to meet these requirements, but also to operate safely as a component within the system. For longer-term cost considerations, the software must be designed and structured to ease future maintenance and modifications. This paper presents a software engineering process for the production of safety-critical software for a nuclear power plant. The presentation is expository in nature of a viable high-quality safety-critical software development. It is based on the ideas of a rational design process and on the experience of adapting such a process in the production of the safety-critical software for shutdown system number two of the Wolsung 2, 3 and 4 nuclear power generation plants. This process is significantly different from a conventional process in terms of rigorous software development phases and software design techniques. The process covers documentation, design, verification and testing, using mathematically precise notations and a highly reviewable tabular format to specify software requirements, and verifying the software requirements and code against the software design using static analysis. The software engineering process described in this paper applies the principle of information-hiding decomposition in software design using a modular design technique, so that when a change is required or an error is detected, the affected scope can be readily and confidently located. It also facilitates a high degree of confidence in the 'correctness' of the software production, and provides a relatively simple and straightforward code implementation effort. 1 figs., 10 refs. (Author)

  14. Proceedings of the 2nd NUCEF international symposium NUCEF'98. Safety research and development of base technology on nuclear fuel cycle

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-03-01

    This volume contains 68 papers presented at the 2nd NUCEF International Symposium NUCEF'98 held on 16-17 November 1998, in Hitachinaka, Japan, following the 1st symposium NUCEF'95 (Proceeding: JAERI-Conf 96-003). The theme of this symposium was 'Safety Research and Development of Base Technology on Nuclear Fuel Cycle'. The papers were presented in oral and poster sessions on following research fields: (1) Criticality Safety, (2) Reprocessing and Partitioning, (3) Radioactive Waste Management. The 68 papers are indexed individually. (J.P.N.)

  15. 2nd ESMO Consensus Conference on Lung Cancer: non-small-cell lung cancer first-line/second and further lines of treatment in advanced disease

    DEFF Research Database (Denmark)

    Besse, B; Adjei, A; Baas, P; Meldgaard, P; Nicolson, M; Paz-Ares, L; Reck, M; Smit, E F; Syrigos, K; Stahel, R; Felip, E; Peters, S

    2014-01-01

    To complement the existing treatment guidelines for all tumour types, ESMO organises consensus conferences to focus on specific issues in each type of tumour. The 2nd ESMO Consensus Conference on Lung Cancer was held on 11-12 May 2013 in Lugano. A total of 35 experts met to address several questions on non-small-cell lung cancer (NSCLC) in each of four areas: pathology and molecular biomarkers, first-line/second and further lines of treatment in advanced disease, early-stage disease and locally ...

  16. Proceedings of the 2nd NUCEF international symposium NUCEF'98. Safety research and development of base technology on nuclear fuel cycle

    International Nuclear Information System (INIS)

    This volume contains 68 papers presented at the 2nd NUCEF International Symposium NUCEF'98 held on 16-17 November 1998, in Hitachinaka, Japan, following the 1st symposium NUCEF'95 (Proceeding: JAERI-Conf 96-003). The theme of this symposium was 'Safety Research and Development of Base Technology on Nuclear Fuel Cycle'. The papers were presented in oral and poster sessions on following research fields: (1) Criticality Safety, (2) Reprocessing and Partitioning, (3) Radioactive Waste Management. The 68 papers are indexed individually. (J.P.N.)

  17. Phase equilibria and crystal chemistry of the CaO–1/2 Nd2O3–CoOz system at 885 °C in air

    International Nuclear Information System (INIS)

    The phase diagram of the CaO–1/2 Nd2O3–CoOz system at 885 °C in air has been determined. The system consists of two calcium cobaltate compounds that have promising thermoelectric properties, namely, the 2D thermoelectric oxide solid solution, (Ca3−xNdx)Co4O9−z (0≤x≤0.5), which has a misfit layered structure, and Ca3Co2O6, which consists of 1D chains of alternating CoO6 trigonal prisms and CoO6 octahedra. Ca3Co2O6 was found to be a point compound without substitution of Nd on the Ca site. The reported Nd2CoO4 phase was not observed at 885 °C. A ternary (Ca1−xNd1+x)CoO4−z (x=0) phase, or (CaNdCo)O4−z, was found to be stable at this temperature. A solid solution region of distorted perovskite (Nd1−xCax)CoO3−z (0≤x≤0.25, space group Pnma) was established. In the peripheral binary systems, while a solid solution region was identified for (Nd1−xCax)2O3−z (0≤x≤0.2), Nd was not found to substitute on the Ca site of CaO. Six solid solution tie-line regions and six three-phase regions were determined in the CaO–Nd2O3–CoOz system in air. - Graphical abstract: Phase diagram of the 1/2 Nd2O3–CaO–CoOx system at 885 °C, showing the limits of various solid solutions and the tie-line relationships of various phases. - Highlights: • Phase diagram of the CaO–1/2 Nd2O3–CoOz system constructed. • System consists of thermoelectric oxide (Ca3−xNdx)Co4O9−z (0≤x≤0.5). • Structures of (Nd1−xCax)CoO3−z and (CaNdCo)O4−z determined

  18. Software Radio

    Directory of Open Access Journals (Sweden)

    Varun Sharma

    2010-05-01

    Full Text Available This paper aims to provide an overview of a rapidly growing technology in the radio domain which overcomes the drawbacks suffered by conventional analog radio. This is the age of Software radio – the technology which transforms hardware radio transceivers into smart programmable devices that can fit into the various devices available in today’s rapidly evolving wireless communication industry. This new technology has some or all of the physical-layer functions defined in software: all of the waveform processing of a wireless device, including the physical layer, moves into software. An ideal Software Radio provides improved device flexibility, software portability, and reduced development costs. This paper goes into the details of all this. It takes the reader through a brief history of conventional radios, analyzes their drawbacks, and then focuses on how Software radio overcomes these shortcomings.
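
As a toy illustration of waveform processing "moving into software", the snippet below frequency-modulates a test tone and then demodulates it with a software phase-difference discriminator in NumPy. All parameters (sample rate, deviation, message frequency) are our own choices for the sketch, not anything from the paper.

```python
import numpy as np

fs = 48_000                                # sample rate (Hz)
t = np.arange(fs) / fs                     # one second of time samples
msg = np.sin(2 * np.pi * 5 * t)            # 5 Hz test message
kf = 1_000.0                               # peak frequency deviation (Hz)

# FM modulation: integrate the message into the carrier phase
phase = 2 * np.pi * np.cumsum(kf * msg) / fs
baseband = np.exp(1j * phase)              # complex-baseband FM signal

# Software discriminator: recover the instantaneous frequency from the
# sample-to-sample phase difference, rescaled back to message amplitude
diff = baseband[1:] * np.conj(baseband[:-1])
demod = np.angle(diff) * fs / (2 * np.pi * kf)
```

Swapping the modulation scheme here is a code change, not a hardware change, which is the flexibility argument the paper makes.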

  19. Effect of Software Manager, Programmer and Customer over Software Quality

    Directory of Open Access Journals (Sweden)

    Ghrehbaghi Farhad

    2013-01-01

    Full Text Available Several factors might affect the quality of software products. In this study we focus on three significant parameters: the software manager, the programmer and the customer. Our study demonstrates that product quality will improve by increasing the information generated by these three parameters. The parameters can be viewed as a triangle whose centroid is the quality. In this perspective, if the triangle is equilateral, then the optimum quality of the product will be achieved. In other words, to generate high-quality software, the abilities of the software manager, programmer and customer must be the same. Consequently, a capable manager alone, an expert programmer alone, or a software-aware customer alone cannot guarantee the high quality of software products.
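
The triangle metaphor above can be turned into a toy scoring rule: for a fixed mean capability, the modelled quality peaks when the three scores are equal (an equilateral triangle). The formula below is our own illustrative choice, not one proposed in the paper.

```python
# Toy model of the triangle metaphor: three capability scores on a common
# scale; imbalance between them lowers the modelled quality.
def quality_index(manager, programmer, customer):
    scores = (manager, programmer, customer)
    mean = sum(scores) / 3
    imbalance = sum(abs(s - mean) for s in scores) / 3
    return mean - imbalance  # equal scores maximise the index for a given mean

balanced = quality_index(8, 8, 8)   # equilateral case
skewed = quality_index(10, 8, 6)    # same mean, unbalanced: lower index
```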

  20. Software Economies

    OpenAIRE

    Bacon, David F.; Bokelberg, Eric; Chen, Yiling; Kash, Ian; Parkes, David C; Rao, Malvika; Sridharan, Manu

    2010-01-01

    Software construction has typically drawn on engineering metaphors like building bridges or cathedrals, which emphasize architecture, specification, central planning, and determinism. Approaches to correctness have drawn on metaphors from mathematics, like formal proofs. However, these approaches have failed to scale to modern software systems, and the problem keeps getting worse. We believe that the time has come to completely re-imagine the creation of complex software, drawing on systems i...

  1. Software Innovation

    DEFF Research Database (Denmark)

    Rose, Jeremy

    2010-01-01

      Innovation is the forgotten key to modern systems development - the element that defines the enterprising engineer, the thriving software firm and the cutting edge software application.  Traditional forms of technical education pay little attention to creativity - often encouraging overly rationalistic ways of thinking which stifle the ability to innovate. Professional software developers are often drowned in commercial drudgery and overwhelmed by work pressure and deadlines. The topic that wi...

  2. Preseismic oscillating electric field "strange attractor like" precursor, of T = 6 months, triggered by Ssa tidal wave. Application on large (Ms > 6.0R) EQs in Greece (October 1st, 2006 - December 2nd, 2008)

    CERN Document Server

    Thanassoulas, C; Verveniotis, G; Zymaris, N

    2009-01-01

    In this work the preseismic "strange attractor like" precursor is studied in the domain of the Earth's oscillating electric field for T = 6 months. It is assumed that the specific oscillating electric field is generated by the corresponding lithospheric oscillation, triggered by the Ssa tidal wave of the same wavelength (6 months), under the excess strain load conditions met in the focal area of a future large earthquake. The analysis of the Earth's oscillating electric field recorded by the two distant monitoring sites of PYR and HIO over a period of 26 months (October 1st, 2006 - December 2nd, 2008) suggests that the specific precursor can successfully resolve the predictive time window in terms of months and for a "swarm" of large EQs (Ms > 6.0R), in contrast to the resolution obtained by the use of electric fields of shorter (T = 1, 14 days, single EQ identification) wavelength. Moreover, the fractal character of the "strange attractor like" precursor in the frequency domain is pointed out. Fina...

  3. Software Reviews.

    Science.gov (United States)

    Miller, Anne, Ed.; Radziemski, Cathy, Ed.

    1988-01-01

    Reviews two software packages for the Macintosh series. "Course Builder 2.0," a courseware authoring system, allows the user to create programs which stand alone and may be used independently in the classroom. "World Builder," an artificial intelligence software package, allows creative thinking, problem-solving, and decision-making. (YP)

  4. Software Reviews.

    Science.gov (United States)

    Science and Children, 1988

    1988-01-01

    Reviews six software packages for the Apple II family. Programs reviewed include "Science Courseware: Earth Science Series"; "Heat and Light"; "In Search of Space: Introduction to Model Rocketry"; "Drug Education Series: Drugs--Their Effects on You'"; "Uncertainties and Measurement"; and "Software Films: Learning about Science Series," which…

  5. A study on trait anger – anger expression and friendship commitment levels of primary school 2nd stage students who play – do not play sports

    Directory of Open Access Journals (Sweden)

    Hüseyin Kırımoğlu

    2010-09-01

    Full Text Available The aim of this research was to investigate trait anger-anger expression and friendship commitment levels depending on whether 2nd-stage students in primary schools played sports or not. A Personal Information Form, the Trait Anger-Anger Expression Scales and the Inventory of Peer Attachment were used to collect the data. The population of the research consisted of the students studying in the 2nd stage of 40 primary state schools belonging to the National Education Directorate of Hatay Province in the 2009-2010 academic year. The sample group was made up of 853 students from 21 primary schools selected from the population (262 boys and 149 girls who played sports as registered players; 233 boys and 209 girls who did not play sports). To sum up, the comparison of the trait anger and external anger scores of the participant students who played sports yielded a statistically significant difference in terms of the sex variable (p< 0.05). In the sedentary group, boys had higher internal anger and external anger scores than girls. In the comparison of friendship commitment scores of sedentary students in terms of the sex variable, a statistically significant difference was found between girls and boys, in favour of boys (p<0.05).

  6. Software management issues

    International Nuclear Information System (INIS)

    The difficulty of managing the software in large HEP collaborations appears to become progressively worse with each new generation of detector. If one were to extrapolate to the SSC, it would become a major problem. This paper explores the possible causes of the difficulty and makes suggestions on what corrective actions should be taken

  7. Software management issues

    Energy Technology Data Exchange (ETDEWEB)

    Kunz, P.F.

    1990-06-01

    The difficulty of managing the software in large HEP collaborations appears to become progressively worse with each new generation of detector. If one were to extrapolate to the SSC, it would become a major problem. This paper explores the possible causes of the difficulty and makes suggestions on what corrective actions should be taken.

  8. OIFITS 2: the 2nd version of the Data Exchange Standard for Optical (Visible/IR) Interferometry

    CERN Document Server

    Duvert, Gilles; Hummel, Christian

    2015-01-01

    This paper describes version 2 of the OI Exchange Format (OIFITS), the standard for exchanging calibrated data from optical (visible/infrared) interferometers. This IAU-endorsed standard has been in use for 10 years at most of the past and current optical interferometer projects, including COAST, NPOI, IOTA, CHARA, VLTI, PTI and the Keck interferometer. Software is available for reading, writing and merging OI Exchange Format files. This version 2 provides definitions of additional data tables (e.g. for polarisation measurements), addressing the needs of future interferometric instruments. Also included are data columns for a more rigorous description of measurement errors and their correlations. In that, this document is a step towards the design of a common data model for optical interferometry. Finally, the main OIFITS header is expanded with several new keywords summarising the content to allow data base searches.

  9. Software requirements

    CERN Document Server

    Wiegers, Karl E

    2003-01-01

    Without formal, verifiable software requirements-and an effective system for managing them-the programs that developers think they've agreed to build often will not be the same products their customers are expecting. In SOFTWARE REQUIREMENTS, Second Edition, requirements engineering authority Karl Wiegers amplifies the best practices presented in his original award-winning text?now a mainstay for anyone participating in the software development process. In this book, you'll discover effective techniques for managing the requirements engineering process all the way through the development cy

  10. Software Reviews.

    Science.gov (United States)

    Mathematics and Computer Education, 1988

    1988-01-01

    Presents reviews of six software packages. Includes (1) "Plain Vanilla Statistics"; (2) "MathCAD 2.0"; (3) "GrFx"; (4) "Trigonometry"; (5) "Algebra II"; (6) "Algebra Drill and Practice I, II, and III." (PK)

  11. Desempenho ortográfico de escolares do 2º ao 5º ano do ensino público Spelling performance of 2nd to 5th grade students from public school

    Directory of Open Access Journals (Sweden)

    Simone Aparecida Capellini

    2011-09-01

    Full Text Available OBJETIVOS: Caracterizar, comparar e classificar o desempenho de escolares do 2º ao 5º ano do ensino público segundo a semiologia dos erros. MÉTODOS: Participaram deste estudo 120 escolares do 2º ao 5º ano de escola pública municipal de Marília-SP, sendo 30 de cada série, divididos em quatro grupos: GI (2º ano; GII (3º ano; GIII (4º ano; e GIV (5º ano. Como procedimento foram aplicadas as provas do Pro-Ortografia: versão coletiva (escrita de letras do alfabeto, ditado randomizado das letras do alfabeto, ditado de palavras, ditado de pseudopalavras, ditado com figuras, escrita temática induzida por figura e versão individual (ditado de frases, erro proposital, ditado soletrado, memória lexical ortográfica. RESULTADOS: Houve diferença na comparação intergrupos, indicando melhor desempenho dos escolares a cada série subsequente, na maior parte das provas da versão coletiva e individual. Com o avanço da seriação escolar, os grupos apresentaram menor média de erros na escrita. CONCLUSÃO: O perfil de aquisição da ortografia do sistema de escrita do Português observado em escolares do ensino público é indicativo do funcionamento normal de desenvolvimento da escrita infantil.PURPOSE: To characterize, compare and classify the performance of 2nd to 5th grade students from public schools according to the semiology of spelling errors. METHODS: Participants were 120 students from 2nd to 5th grades of a public school in Marília (SP, Brazil, 30 students from each grade, who were divided into four groups: GI (2nd grade, GII (3rd grade, GIII (4th grade, and GIV (5th grade. The tasks of the Pro-Ortografia test were applied: collective version (writing of alphabet letters, randomized dictation of letters, words dictation, nonwords dictation, dictation with pictures, thematic writing induced by picture and individual version (dictation of sentences, purposeful error, spelled dictation, orthographic lexical memory. 
RESULTS: Significant difference was found in the between-group comparison indicating better performance of students in every subsequent grade in most of the individual and collective version tasks. With the increase of grade level, the groups decreased the average of writing errors. CONCLUSION: The profile of spelling acquisition of the Portuguese writing system found in these public school students indicates normal writing development in this population.

  12. Desempenho ortográfico de escolares do 2º ao 5º ano do ensino público / Spelling performance of 2nd to 5th grade students from public school

    Scientific Electronic Library Online (English)

    Simone Aparecida, Capellini; Amanda Corrêa do, Amaral; Andrea Batista, Oliveira; Maria Nobre, Sampaio; Natália, Fusco; José Francisco, Cervera-Mérida; Amparo, Ygual-Fernández.

    2011-09-01

    Full Text Available OBJETIVOS: Caracterizar, comparar e classificar o desempenho de escolares do 2º ao 5º ano do ensino público segundo a semiologia dos erros. MÉTODOS: Participaram deste estudo 120 escolares do 2º ao 5º ano de escola pública municipal de Marília-SP, sendo 30 de cada série, divididos em quatro grupos: [...] GI (2º ano); GII (3º ano); GIII (4º ano); e GIV (5º ano). Como procedimento foram aplicadas as provas do Pro-Ortografia: versão coletiva (escrita de letras do alfabeto, ditado randomizado das letras do alfabeto, ditado de palavras, ditado de pseudopalavras, ditado com figuras, escrita temática induzida por figura) e versão individual (ditado de frases, erro proposital, ditado soletrado, memória lexical ortográfica). RESULTADOS: Houve diferença na comparação intergrupos, indicando melhor desempenho dos escolares a cada série subsequente, na maior parte das provas da versão coletiva e individual. Com o avanço da seriação escolar, os grupos apresentaram menor média de erros na escrita. CONCLUSÃO: O perfil de aquisição da ortografia do sistema de escrita do Português observado em escolares do ensino público é indicativo do funcionamento normal de desenvolvimento da escrita infantil. Abstract in english PURPOSE: To characterize, compare and classify the performance of 2nd to 5th grade students from public schools according to the semiology of spelling errors. METHODS: Participants were 120 students from 2nd to 5th grades of a public school in Marília (SP), Brazil, 30 students from each grade, who w [...] ere divided into four groups: GI (2nd grade), GII (3rd grade), GIII (4th grade), and GIV (5th grade). The tasks of the Pro-Ortografia test were applied: collective version (writing of alphabet letters, randomized dictation of letters, words dictation, nonwords dictation, dictation with pictures, thematic writing induced by picture) and individual version (dictation of sentences, purposeful error, spelled dictation, orthographic lexical memory). 
RESULTS: Significant difference was found in the between-group comparison indicating better performance of students in every subsequent grade in most of the individual and collective version tasks. With the increase of grade level, the groups decreased the average of writing errors. CONCLUSION: The profile of spelling acquisition of the Portuguese writing system found in these public school students indicates normal writing development in this population.

  13. PREFACE: 2nd Russia-Japan-USA Symposium on the Fundamental and Applied Problems of Terahertz Devices and Technologies (RJUS TeraTech - 2013)

    Science.gov (United States)

    Karasik, Valeriy; Ryzhii, Viktor; Yurchenko, Stanislav

    2014-03-01

    The 2nd Russia-Japan-USA Symposium 'The Fundamental & Applied Problems of Terahertz Devices & Technologies' (RJUS TeraTech - 2013) Bauman Moscow State Technical University Moscow, Russia, 3-6 June, 2013 The 2nd Russia-Japan-USA Symposium 'The Fundamental & Applied Problems of Terahertz Devices & Technologies' (RJUS TeraTech - 2013) was held at Bauman Moscow State Technical University on 3-6 June 2013 and was devoted to modern problems of terahertz optical technologies. RJUS TeraTech 2013 was organized by Bauman Moscow State Technical University in cooperation with Tohoku University (Sendai, Japan) and the University at Buffalo (The State University of New York, USA). The Symposium was supported by Bauman Moscow State Technical University (Moscow, Russia) and the Russian Foundation for Basic Research (grant number 13-08-06100-g). RJUS TeraTech - 2013 became a foundation for sharing and discussing modern and promising achievements in fundamental and applied problems of terahertz optical technologies, devices based on graphene and graphene structures, and condensed matter of various kinds. Among the participants of RJUS TeraTech - 2013, there were more than 100 researchers and students from different countries. This volume contains the proceedings of the 2nd Russia-Japan-USA Symposium 'The Fundamental & Applied Problems of Terahertz Devices & Technologies'. 
Valeriy Karasik, Viktor Ryzhii and Stanislav Yurchenko Bauman Moscow State Technical University Symposium chair Anatoliy A Aleksandrov, Rector of BMSTU Symposium co-chair Valeriy E Karasik, Head of the Research and Educational Center 'PHOTONICS AND INFRARED TECHNOLOGY' (Russia) Invited Speakers Taiichi Otsuji, Research Institute of Electrical Communication, Tohoku University, Sendai, Japan Akira Satou, Research Institute of Electrical Communication, Tohoku University, Sendai, Japan Michael Shur, Electrical, Computer and System Engineering and Physics, Applied Physics, and Astronomy, Rensselaer Polytechnic Institute, NY, USA Natasha Kirova, University Paris-Sud, France Andrei Sergeev, Department of Electrical Engineering, The University at Buffalo, The State University of New York, Buffalo, NY, USA Magnus Willander, Linköping University (LIU), Department of Science and Technology, Linköping, Sweden Dmitry R Khohlov, Physical Faculty, Lomonosov Moscow State University, Russia Vladimir L Vaks, Institute for Physics of Microstructures of the Russian Academy of Sciences, Russia

  14. Inventory of safeguards software

    International Nuclear Information System (INIS)

    This survey will serve as a basis for determining what needs may exist in this arena for the development of next-generation safeguards systems and approaches. 23 software tools are surveyed by JAEA and NMCC. Exchanging information regarding existing software tools for safeguards, and discussing a next R and D program for developing a general-purpose safeguards tool, should be beneficial to safeguards system design and indispensable for evaluating a safeguards system for future nuclear fuel facilities. (author)

  15. Molten carbonate fuel cell product design & improvement - 2nd quarter, 1996. Quarterly report, April 1--June 30, 1996

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-05-01

    The main objective of this project is to establish the commercial readiness of a molten carbonate fuel cell power plant for distributed power generation, cogeneration, and compressor station applications. This effort includes marketing, systems design and analysis, packaging and assembly, test facility development, and technology development, improvement, and verification.

  16. THE FINANCIAL EXPERT AUTOMATED SOFTWARE COMPLEX

    Directory of Open Access Journals (Sweden)

    Zaikina L. N.

    2015-04-01

    Full Text Available In this article we describe the automated software complex called “Financial Expert”, adapted to the environment of 1C: Enterprise 8.3. This system is intended for a complex assessment of the financial and economic condition of companies in the construction branch on the basis of the integration of diverse methods: probabilistic, fuzzy-production and neural. This program has no analogs in the Russian Federation and allows creating an effective and adequate system for assessing the financial and economic condition of companies in the construction branch of the Krasnodar region. The “Financial Expert” software complex offers capabilities such as: analysis of the financial and economic condition of construction enterprises with the help of the cluster analysis and the discriminant and regression models proposed in the thesis, as well as classic models (Fulmer, Springate, the two-factor model, Taffler, and Altman's model adapted for Russia); complex assessment of the condition of construction enterprises by the analysis of quantitative and qualitative characteristics using fuzzy logic systems, including an original structure and base of fuzzy inference rules; and assessment of the financial and economic condition of construction enterprises by means of neural networks. The software complex also allows carrying out a comparative analysis of the models stated above when assessing the financial and economic condition of a particular company in the construction branch.

  17. Design and manufacture of a D-shape coil-based toroid-type HTS DC reactor using 2nd generation HTS wire

    International Nuclear Information System (INIS)

    Highlights: • The authors designed and fabricated a D-shape coil based toroid-type HTS DC reactor using 2G GdBCO HTS wires. • The toroid-type magnet consisted of 30 D-shape double pancake coils (DDCs). The total length of the wire was 2.32 km. • The conduction cooling method was adopted for reactor magnet cooling. • The maximum cooling temperature of the reactor magnet is 5.5 K. • The inductance was 408 mH in the steady-state condition (300 A operating current). - Abstract: This paper describes the design specifications and performance of a real toroid-type high temperature superconducting (HTS) DC reactor. The HTS DC reactor was designed using 2G HTS wires. The HTS coils of the toroid-type DC reactor magnet were made in the form of a D-shape. The target inductance of the HTS DC reactor was 400 mH. The expected operating temperature was under 20 K. The electromagnetic performance of the toroid-type HTS DC reactor magnet was analyzed using a finite element method program. A conduction cooling method was adopted for reactor magnet cooling. The performance of the toroid-type HTS DC reactor was analyzed through experiments conducted under steady-state and charging conditions. The fundamental design specifications and the data obtained from this research will be applied to the design of a commercial-type HTS DC reactor
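
As a rough sanity check on an inductance target of this order, the inductance of an ideal toroid with rectangular cross-section follows from its winding count and geometry. The sketch below is ours, not from the paper: the turn count and dimensions are hypothetical placeholders, and real D-shape coils deviate from the rectangular approximation.

```python
import math

MU0 = 4 * math.pi * 1e-7  # vacuum permeability (H/m)

def toroid_inductance(n_turns, inner_r, outer_r, height):
    """Inductance (H) of an ideal toroid with rectangular cross-section.

    An approximation only; the paper's coils are D-shaped, not rectangular.
    """
    return MU0 * n_turns**2 * height * math.log(outer_r / inner_r) / (2 * math.pi)

# Hypothetical winding count and geometry (not taken from the paper):
L = toroid_inductance(n_turns=3600, inner_r=0.3, outer_r=0.5, height=0.2)
print(f"~{L * 1e3:.0f} mH")
```

With these made-up numbers the estimate lands in the same order of magnitude as the 400 mH target; note the quadratic dependence on the total turn count.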

  18. Final Report of the 2nd Ex-Vessel Neutron Dosimetry Installation And Evaluations for Yonggwang Unit 1 Reactor Pressure Vessel

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Byoung Chul; Yoo, Choon Sung; Lee, Sam Lai; Chang, Kee Ok; Gong, Un Sik; Choi, Kwon Jae; Chang, Jong Hwa; Li, Nam Jin; Hong, Joon Wha

    2007-01-15

    This report describes a neutron fluence assessment performed for the Yonggwang Unit 1 pressure vessel belt line region based on the guidance specified in Regulatory Guide 1.190. In this assessment, maximum fast neutron exposures expressed in terms of fast neutron fluence (E>1 MeV) and iron atom displacements (dpa) were established for the belt line region of the pressure vessel. During Cycle 16 of reactor operation, 2nd Ex-Vessel Neutron Dosimetry Program was instituted at Yonggwang Unit 1 to provide continuous monitoring of the belt line region of the reactor vessel. The use of the Ex-Vessel Neutron Dosimetry Program coupled with available surveillance capsule measurements provides a plant specific data base that enables the evaluation of the vessel exposure and the uncertainty associated with that exposure over the service life of the unit. Ex-Vessel Neutron Dosimetry has been evaluated at the conclusion of Cycle 16.

  19. Final Report of the 2nd Ex-Vessel Neutron Dosimetry Installation And Evaluations for Kori Unit 1 Reactor Pressure Vessel

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Byoung Chul; Yoo, Choon Sung; Lee, Sam Lai; Chang, Kee Ok; Gong, Un Sik; Choi, Kwon Jae; Chang, Jong Hwa; Kim, Kwan Hyun; Hong, Joon Wha

    2007-02-15

    This report describes a neutron fluence assessment performed for the Kori Unit 1 pressure vessel beltline region based on the guidance specified in Regulatory Guide 1.190. In this assessment, maximum fast neutron exposures expressed in terms of fast neutron fluence (E>1 MeV) and iron atom displacements (dpa) were established for the beltline region of the pressure vessel. After Cycle 22 of reactor operation, 2nd Ex-Vessel Neutron Dosimetry Program was instituted at Kori Unit 1 to provide continuous monitoring of the beltline region of the reactor vessel. The use of the Ex-Vessel Neutron Dosimetry Program coupled with available surveillance capsule measurements provides a plant specific data base that enables the evaluation of the vessel exposure and the uncertainty associated with that exposure over the service life of the unit. Ex-Vessel Neutron Dosimetry has been evaluated at the conclusion of Cycle 23.

  20. ISE-SPL: uma abordagem baseada em linha de produtos de software aplicada à geração automática de sistemas para educação médica na plataforma E-learning / ISE-SPL: a software product line approach applied to automatic generation of systems for medical education in E-learning platform

    Scientific Electronic Library Online (English)

    Túlio de Paiva Marques, Carvalho; Bruno Gomes de, Araújo; Ricardo Alexsandro de Medeiros, Valentim; Jose, Diniz Junior; Francis Solange Vieira, Tourinho; Rosiane Viana Zuza, Diniz.

    2013-12-01

Full Text Available INTRODUCTION: E-learning, which refers to the use of Internet-related technologies to improve knowledge and learning, has emerged as a complementary form of education, bringing advantages such as increased accessibility to information, personalized learning, democratization of education and ease of update, distribution and standardization of the content. In this sense, this paper aims to present a tool, named ISE-SPL, whose purpose is the automatic generation of E-learning systems for medical education, making use of ISE systems (Interactive Spaced-Education) and concepts of Software Product Lines. METHODS: The tool consists of an innovative methodology for medical education that aims to assist professors of healthcare in their teaching through the use of educational technologies, all based on computing applied to healthcare (Informatics in Health). RESULTS: The tests performed to validate ISE-SPL were divided into two stages: the first was made by using a software analysis tool similar to ISE-SPL, called S.P.L.O.T, and the second was performed through usability questionnaires given to healthcare professors who used ISE-SPL. CONCLUSION: Both tests showed positive results, allowing to conclude that ISE-SPL is an efficient tool for generation of E-learning software and useful for teachers in healthcare.
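The "spaced education" component that ISE systems build on repeats questions at intervals that depend on whether the learner answered correctly. A minimal scheduler might look like the following sketch; the interval lengths are illustrative assumptions, not the actual settings of ISE-SPL:

```python
from datetime import date, timedelta

# Illustrative intervals (assumed, not taken from the ISE-SPL record):
# a correct answer pushes the question further out; an incorrect one
# brings it back sooner.
INTERVAL_CORRECT = timedelta(days=20)
INTERVAL_INCORRECT = timedelta(days=7)

def next_presentation(today, answered_correctly):
    """Schedule the next date on which a question is shown to the learner."""
    return today + (INTERVAL_CORRECT if answered_correctly else INTERVAL_INCORRECT)

d = date(2024, 1, 1)
print(next_presentation(d, True))   # 2024-01-21
print(next_presentation(d, False))  # 2024-01-08
```

A real spaced-education system would also track per-question state (number of correct repetitions, retirement criteria), but the interval rule above is the core of the scheduling idea.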

  1. The Chroma Software System for Lattice QCD

    International Nuclear Information System (INIS)

    We describe aspects of the Chroma software system for lattice QCD calculations. Chroma is an open source C++ based software system developed using the software infrastructure of the US SciDAC initiative. Chroma interfaces with output from the BAGEL assembly generator for optimized lattice fermion kernels on some architectures. It can be run on workstations, clusters and the QCDOC supercomputer

  2. The Chroma Software System for Lattice QCD

    International Nuclear Information System (INIS)

    We describe aspects of the Chroma software for lattice QCD calculations. Chroma is an open source C++ based software system developed using the software infrastructure of the US SciDAC initiative. Chroma interfaces with output from the BAGEL assembly generator for optimised lattice fermion kernels on some architectures. It can be run on workstations, clusters and the QCDOC supercomputer

  3. The Chroma Software System for Lattice QCD

    OpenAIRE

    Edwards, Robert G.; Joo, Balint

    2004-01-01

    We describe aspects of the Chroma software system for lattice QCD calculations. Chroma is an open source C++ based software system developed using the software infrastructure of the US SciDAC initiative. Chroma interfaces with output from the BAGEL assembly generator for optimised lattice fermion kernels on some architectures. It can be run on workstations, clusters and the QCDOC supercomputer.

  4. Proceedings of the Fifth Triennial Software Quality Forum 2000, Software for the Next Millennium, Software Quality Forum

    Energy Technology Data Exchange (ETDEWEB)

    Scientific Software Engineering Group, CIC-12

    2000-04-01

    The Software Quality Forum is a triennial conference held by the Software Quality Assurance Subcommittee for the Department of Energy's Quality Managers. The forum centers on key issues, information, and technology important in software development for the Nuclear Weapons Complex. This year it will be opened up to include local information technology companies and software vendors presenting their solutions, ideas, and lessons learned. The Software Quality Forum 2000 will take on a more hands-on, instructional tone than those previously held. There will be an emphasis on providing information, tools, and resources to assist developers in their goal of producing next generation software.

  5. Development of a Risk-Based Probabilistic Performance-Assessment Method for Long-Term Cover Systems - 2nd Edition

    International Nuclear Information System (INIS)

    A probabilistic, risk-based performance-assessment methodology has been developed to assist designers, regulators, and stakeholders in the selection, design, and monitoring of long-term covers for contaminated subsurface sites. This report describes the method, the software tools that were developed, and an example that illustrates the probabilistic performance-assessment method using a repository site in Monticello, Utah. At the Monticello site, a long-term cover system is being used to isolate long-lived uranium mill tailings from the biosphere. Computer models were developed to simulate relevant features, events, and processes that include water flux through the cover, source-term release, vadose-zone transport, saturated-zone transport, gas transport, and exposure pathways. The component models were then integrated into a total-system performance-assessment model, and uncertainty distributions of important input parameters were constructed and sampled in a stochastic Monte Carlo analysis. Multiple realizations were simulated using the integrated model to produce cumulative distribution functions of the performance metrics, which were used to assess cover performance for both present- and long-term future conditions. Performance metrics for this study included the water percolation reaching the uranium mill tailings, radon gas flux at the surface, groundwater concentrations, and dose. Results from uncertainty analyses, sensitivity analyses, and alternative design comparisons are presented for each of the performance metrics. The benefits from this methodology include a quantification of uncertainty, the identification of parameters most important to performance (to prioritize site characterization and monitoring activities), and the ability to compare alternative designs using probabilistic evaluations of performance (for cost savings)
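The stochastic Monte Carlo step described above — sampling uncertainty distributions of input parameters and building a cumulative distribution function of a performance metric — can be sketched as follows. The toy percolation model and all parameter distributions here are illustrative placeholders, not the report's actual integrated models:

```python
import math
import random

def percolation_flux(k_sat, rainfall, cover_thickness):
    """Toy performance model: annual water flux through a cover (mm/yr).
    Stands in for the report's integrated flow/transport models."""
    return rainfall * math.exp(-cover_thickness / k_sat)

def monte_carlo_cdf(n_realizations=10000, seed=42):
    """Sample uncertain inputs, run the model per realization, and return
    the sorted results (an empirical CDF of the performance metric)."""
    rng = random.Random(seed)
    results = []
    for _ in range(n_realizations):
        k_sat = rng.lognormvariate(0.0, 0.5)   # conductivity scale (assumed lognormal)
        rainfall = rng.gauss(380.0, 60.0)      # mm/yr (assumed normal)
        thickness = rng.uniform(1.0, 2.0)      # m (assumed uniform)
        results.append(percolation_flux(k_sat, rainfall, thickness))
    results.sort()
    return results

def percentile(sorted_vals, p):
    idx = min(int(p * len(sorted_vals)), len(sorted_vals) - 1)
    return sorted_vals[idx]

fluxes = monte_carlo_cdf()
print("median flux:", percentile(fluxes, 0.5))
print("95th percentile flux:", percentile(fluxes, 0.95))
```

Reading percentiles off the sorted realizations is exactly how a cumulative distribution function of a performance metric supports statements like "the 95th-percentile percolation is X mm/yr", which is what makes alternative cover designs comparable under uncertainty.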

  6. Development of a Risk-Based Probabilistic Performance-Assessment Method for Long-Term Cover Systems - 2nd Edition

    Energy Technology Data Exchange (ETDEWEB)

    HO, CLIFFORD K.; ARNOLD, BILL W.; COCHRAN, JOHN R.; TAIRA, RANDAL Y.

    2002-10-01

    A probabilistic, risk-based performance-assessment methodology has been developed to assist designers, regulators, and stakeholders in the selection, design, and monitoring of long-term covers for contaminated subsurface sites. This report describes the method, the software tools that were developed, and an example that illustrates the probabilistic performance-assessment method using a repository site in Monticello, Utah. At the Monticello site, a long-term cover system is being used to isolate long-lived uranium mill tailings from the biosphere. Computer models were developed to simulate relevant features, events, and processes that include water flux through the cover, source-term release, vadose-zone transport, saturated-zone transport, gas transport, and exposure pathways. The component models were then integrated into a total-system performance-assessment model, and uncertainty distributions of important input parameters were constructed and sampled in a stochastic Monte Carlo analysis. Multiple realizations were simulated using the integrated model to produce cumulative distribution functions of the performance metrics, which were used to assess cover performance for both present- and long-term future conditions. Performance metrics for this study included the water percolation reaching the uranium mill tailings, radon gas flux at the surface, groundwater concentrations, and dose. Results from uncertainty analyses, sensitivity analyses, and alternative design comparisons are presented for each of the performance metrics. The benefits from this methodology include a quantification of uncertainty, the identification of parameters most important to performance (to prioritize site characterization and monitoring activities), and the ability to compare alternative designs using probabilistic evaluations of performance (for cost savings).

  7. Comparison of Strong Gravitational Lens Model Software II. HydraLens: Computer-Assisted Strong Gravitational Lens Model Generation and Translation

    CERN Document Server

Lefor, Alan T

    2015-01-01

The behavior of strong gravitational lens model software in the analysis of lens models is not necessarily consistent among the various software available, suggesting that the use of several models may enhance the understanding of the system being studied. Among the publicly available codes, the model input files are heterogeneous, making the creation of multiple models tedious. An enhanced method of creating model files and a method to easily create multiple models may increase the number of comparison studies. HydraLens simplifies the creation of model files for four strong gravitational lens model software packages, including Lenstool, Gravlens/Lensmodel, glafic and PixeLens, using a custom designed GUI for each of the four codes that simplifies the entry of the model for each of these codes, obviating the need for user manuals to set the values of the many flags and data fields. HydraLens is designed in a modular fashion, which simplifies the addition of other strong gravitational lens codes in th...

  8. Computing and software

    Directory of Open Access Journals (Sweden)

    White, G. C.

    2004-06-01

Full Text Available The reality is that the statistical methods used for analysis of data depend upon the availability of software. Analysis of marked animal data is no different than the rest of the statistical field. The methods used for analysis are those that are available in reliable software packages. Thus, the critical importance of having reliable, up-to-date software available to biologists is obvious. Statisticians have continued to develop more robust models, ever expanding the suite of potential analysis methods available. But without software to implement these newer methods, they will languish in the abstract, and not be applied to the problems deserving them. In the Computers and Software Session, two new software packages are described, a comparison of implementation of methods for the estimation of nest survival is provided, and a more speculative paper about how the next generation of software might be structured is presented. Rotella et al. (2004) compare nest survival estimation with different software packages: SAS logistic regression, SAS non-linear mixed models, and Program MARK. Nests are assumed to be visited at various, possibly infrequent, intervals. All of the approaches described compute nest survival with the same likelihood, and require that the age of the nest is known to account for nests that eventually hatch. However, each approach offers advantages and disadvantages, explored by Rotella et al. (2004). Efford et al. (2004) present a new software package called DENSITY. The package computes population abundance and density from trapping arrays and other detection methods with a new and unique approach. DENSITY represents the first major addition to the analysis of trapping arrays in 20 years. Barker & White (2004) discuss how existing software such as Program MARK requires that each new model's likelihood be programmed specifically for that model.
They wishfully think that future software might allow the user to combine pieces of likelihood functions together to generate estimates. The idea is interesting, and maybe some bright young statistician can work out the specifics to implement the procedure. Choquet et al. (2004) describe MSURGE, a software package that implements the multistate capture-recapture models. The unique feature of MSURGE is that the design matrix is constructed with an interpreted language called GEMACO. Because MSURGE is limited to just multistate models, the special requirements of these likelihoods can be provided. The software and methods presented in these papers give biologists and wildlife managers an expanding range of possibilities for data analysis. Although ease-of-use is generally getting better, it does not replace the need for understanding of the requirements and structure of the models being computed. The internet provides access to many free software packages as well as user-discussion groups to share knowledge and ideas. (A starting point for wildlife-related applications is http://www.phidot.org.)
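The common likelihood shared by the nest-survival approaches Rotella et al. compare can be illustrated with a minimal daily-survival model fitted by maximum likelihood. The data and the grid-search optimizer below are hypothetical stand-ins, not the cited packages:

```python
import math

def neg_log_likelihood(s, intervals):
    """intervals: list of (days_exposed, survived) tuples.
    Under a constant daily survival probability s, a nest observed to
    survive t days contributes s**t; a nest that fails over t days
    contributes s**(t-1) * (1 - s) in this simplified model."""
    ll = 0.0
    for days, survived in intervals:
        if survived:
            ll += days * math.log(s)
        else:
            ll += (days - 1) * math.log(s) + math.log(1.0 - s)
    return -ll

def estimate_daily_survival(intervals, grid=10000):
    """Grid search over s in (0, 1), standing in for a numerical optimizer."""
    best_s, best_nll = None, float("inf")
    for i in range(1, grid):
        s = i / grid
        nll = neg_log_likelihood(s, intervals)
        if nll < best_nll:
            best_s, best_nll = s, nll
    return best_s

# Hypothetical monitoring data: (days exposed, nest survived the interval?)
data = [(10, True), (10, True), (5, False), (10, True), (3, False)]
s_hat = estimate_daily_survival(data)
print(round(s_hat, 3))  # → 0.947
```

Because every package maximizes this same likelihood, the estimates agree; the differences Rotella et al. explore lie in covariate handling, convergence behavior, and ease of use.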

  9. Simulating a measurement of the 2nd knee in the cosmic ray spectrum with an atmospheric fluorescence telescope tower array.

    Science.gov (United States)

    Liu, Jiali; Yang, Qunyu; Bai, Yunxiang; Cao, Zhen

    2014-01-01

A fluorescence telescope tower array has been designed to measure cosmic rays in the energy range of 10^17-10^18 eV. A full Monte Carlo simulation, including air shower production, light generation and propagation, detector response, electronics, and trigger system, has been developed for that purpose. Using such a simulation tool, the detector configuration, which includes one main tower array and two side-trigger arrays, 24 telescopes in total, has been optimized. The aperture and the event rate have been estimated. Furthermore, the performance of the Xmax technique in measuring composition has also been studied. PMID:24737964
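A full detector Monte Carlo is far beyond a snippet, but its first step — drawing primary energies from a power-law cosmic-ray spectrum over the 10^17-10^18 eV range by inverse-transform sampling — can be sketched. The spectral index of 3 is an assumption for illustration; the simulation's actual spectrum is not given in the record:

```python
import random

def sample_power_law(e_min, e_max, gamma, rng):
    """Inverse-transform sampling of dN/dE ∝ E^-gamma on [e_min, e_max].
    With a = 1 - gamma, the CDF inverts to the closed form below."""
    u = rng.random()
    a = 1.0 - gamma
    return (e_min**a + u * (e_max**a - e_min**a)) ** (1.0 / a)

rng = random.Random(0)
energies = [sample_power_law(1e17, 1e18, 3.0, rng) for _ in range(100000)]

# A steeply falling spectrum concentrates events at the low-energy end:
frac_low = sum(1 for e in energies if e < 2e17) / len(energies)
print(f"fraction below 2e17 eV: {frac_low:.3f}")
```

Each sampled energy would then seed an air-shower generator, followed by light propagation, detector response, and trigger simulation, as the abstract describes.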

  10. Software survey

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    2007-07-15

This article presented a guide to new software applications designed to facilitate exploration, drilling and production activities. Oil and gas producers can use the products for a range of functions, including reservoir characterization and accounting. In addition to a description of the software application, this article listed the names of software providers and the new features available in products. The featured software of Calgary-based providers included: PetroLOOK by Alcaro Softworks Inc.; ProphetFM and MasterDRIL by Advanced Measurements Inc.; the EDGE screening tool by Canadian Discovery Ltd.; Emission Manager and Regulatory Document Manager by Envirosoft Corporation; ResSurveil, ResBalance and ResAssist by Epic Consulting Services Ltd; FAST WellTest and FAST RTA by Fekete Associates Inc.; OMNI 3D and VISTA 2D/3D by Gedco; VisualVoxAT, SBED and SBEDStudio by Geomodeling Technology Corporation; geoSCOUT, petroCUBE and gDC by GeoLOGIC Systems Ltd.; IHS Enerdeq Desktop and PETRA by IHS; DataVera by Intervera Data Solutions; FORGAS, PIPEFLO and WELLFLO by Neotechnology Consultants Ltd.; E and P Workflow Solutions by Neuralog Inc.; Oil and Gas Solutions by the RiskAdvisory division of SAS; Petrel, GeoFrame, ECLIPSE, OFM, Osprey Risk and Avocet modeler, PIPESIM and Merak by Schlumberger Information Solutions; esi.manage and esi.executive by 3esi; and dbAFE and PROSPECTOR by Winfund Corporation. Tower Management and Maintenance System, OverSite and Safety Orientation Management System software by Edmonton-based 3C Information Solutions Inc. were also highlighted, along with PowerSHAPE, PowerMILL and FeatureCAM software by Windsor, Ontario-based Delcam.
Software products by Texas-based companies featured in this article included the HTRI Xchanger Suite by Heat Transfer Research Inc.; Drillworks by Knowledge Systems; and GeoProbe, PowerView; GeoGraphix, AssetPlanner, Nexus software, Decision Management System, AssetSolver, and OpenWorks by Landmark; and, eVIN, Rig-Hand, and OVS by Merrick Systems Inc.

  11. Software engineering

    CERN Document Server

    Thorin, Marc

    1985-01-01

    Software Engineering describes the conceptual bases as well as the main methods and rules on computer programming. This book presents software engineering as a coherent and logically built synthesis and makes it possible to properly carry out an application of small or medium difficulty that can later be developed and adapted to more complex cases. This text is comprised of six chapters and begins by introducing the reader to the fundamental notions of entities, actions, and programming. The next two chapters elaborate on the concepts of information and consistency domains and show that a proc

  12. Software Authentication

    Energy Technology Data Exchange (ETDEWEB)

    Wolford,J K; Geelhood,B D; Hamilton,V A; Ingraham,J; MacArthur,D W; Mitchell,D J; Mullens,J A; Vanier,P E; White,G K; Whiteson,R

    2001-06-21

    The effort to define guidance for authentication of software for arms control and nuclear material transparency measurements draws on a variety of disciplines and has involved synthesizing established criteria and practices with newer methods. Challenges include the need to protect classified information that the software manipulates as well as deal with the rapid pace of innovation in the technology of nuclear material monitoring. The resulting guidance will shape the design of future systems and inform the process of authentication of instruments now being developed. This paper explores the technical issues underlying the guidance and presents its major tenets.

  13. Verification-based Software-fault Detection

    OpenAIRE

    Gladisch, Christoph David

    2011-01-01

    Software is used in many safety- and security-critical systems. Software development is, however, an error-prone task. In this dissertation new techniques for the detection of software faults (or software "bugs") are described which are based on a formal deductive verification technology. The described techniques take advantage of information obtained during verification and combine verification technology with deductive fault detection and test generation in a very unified way.

  14. Software Update.

    Science.gov (United States)

    Currents, 2000

    2000-01-01

    A chart of 40 alumni-development database systems provides information on vendor/Web site, address, contact/phone, software name, price range, minimum suggested workstation/suggested server, standard reports/reporting tools, minimum/maximum record capacity, and number of installed sites/client type. (DB)

  15. Reviews, Software.

    Science.gov (United States)

    Science Teacher, 1988

    1988-01-01

    Reviews two computer software packages for use in physical science, physics, and chemistry classes. Includes "Physics of Model Rocketry" for Apple II, and "Black Box" for Apple II and IBM compatible computers. "Black Box" is designed to help students understand the concept of indirect evidence. (CW)

  16. Integrated Software Pipelining

    OpenAIRE

    Eriksson, Mattias

    2009-01-01

    In this thesis we address the problem of integrated software pipelining for clustered VLIW architectures. The phases that are integrated and solved as one combined problem are: cluster assignment, instruction selection, scheduling, register allocation and spilling. As a first step we describe two methods for integrated code generation of basic blocks. The first method is optimal and based on integer linear programming. The second method is a heuristic based on genetic algorithms. We then exte...

  17. TOWARDS DEVELOPING A PERFORMANCE EVALUATOR FOR COMPONENT BASED SOFTWARE ARCHITECTURES

    OpenAIRE

    B.BHARATHI,; Kulanthaivel, G.

    2011-01-01

Component Based Software Engineering is a branch of software engineering which concentrates on the separation of concerns based on functionality and on developing self-contained units available throughout a software system. Component Based Development (CBD) facilitates generating reusable software components, thereby enabling software products to be generated more quickly, efficiently and with desirable qualities. With the increasing avenues of component based development it would be relevant to...

  18. Software for Better Documentation of Other Software

    Science.gov (United States)

    Pinedo, John

    2003-01-01

    The Literate Programming Extraction Engine is a Practical Extraction and Reporting Language- (PERL-)based computer program that facilitates and simplifies the implementation of a concept of self-documented literate programming in a fashion tailored to the typical needs of scientists. The advantage for the programmer is that documentation and source code are written side-by-side in the same file, reducing the likelihood that the documentation will be inconsistent with the code and improving the verification that the code performs its intended functions. The advantage for the user is the knowledge that the documentation matches the software because they come from the same file. This program unifies the documentation process for a variety of programming languages, including C, C++, and several versions of FORTRAN. This program can process the documentation in any markup language, and incorporates the LaTeX typesetting software. The program includes sample Makefile scripts for automating both the code-compilation (when appropriate) and documentation-generation processes into a single command-line statement. Also included are macro instructions for the Emacs display-editor software, making it easy for a programmer to toggle between editing in a code or a documentation mode.
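The side-by-side documentation-and-code idea can be illustrated with a toy extractor: documentation lines and source lines live in one file, and a filter splits them. The `%% ` documentation marker below is an invented convention for illustration, not the PERL tool's actual syntax:

```python
def extract(literate_source):
    """Split a literate file into documentation and code.
    Convention assumed here: lines starting with '%% ' are documentation;
    everything else is source code (a simplification of the real tool)."""
    docs, code = [], []
    for line in literate_source.splitlines():
        if line.startswith("%% "):
            docs.append(line[3:])
        else:
            code.append(line)
    return "\n".join(docs), "\n".join(code)

sample = """%% Compute the factorial of n.
def fact(n):
    return 1 if n <= 1 else n * fact(n - 1)
%% The recursion bottoms out at n <= 1."""

docs, code = extract(sample)
print(docs)
```

The real engine additionally feeds the documentation stream through a markup processor (e.g. LaTeX) and the code stream through the compiler, which is what keeps the two in lockstep.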

  19. Adaptation of Black-Box Software Components

    Directory of Open Access Journals (Sweden)

    Rolf Andreas Rasenack

    2008-01-01

Full Text Available The globalization of the software market creates serious problems for software companies. Competition between software companies increases, forcing them to develop ever newer software products in ever shorter time intervals. The time to market for software systems is therefore shortened, and consequently so is the product life cycle. Software companies have thus shortened the time interval for research and development. Because of this competition, software products have to be developed at low cost, which leads to a smaller return on investment. A big challenge for software companies is to use an effective research and development process to keep these problems under control. One way to control them is to reuse existing software components and adapt those components to new functionality or accommodate mismatched interfaces. Complete redevelopment of software products is more expensive and time consuming than developing software components. The approach introduced here presents a novel technique, together with a supportive environment, that enables developers to cope with the adaptability of black-box software components. A supportive environment is designed that checks the compatibility of black-box software components with the assistance of their specifications. Generated adapter software components can take over the adaptation and extend the functionality. Besides, a pool of software components can be used to compose an application that satisfies customer needs; this pool consists of black-box software components and adapter software components which can be connected on demand.
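The generated-adapter idea — wrapping an unmodified black-box component behind the interface the application actually expects — can be sketched as follows. The component and interfaces are invented for illustration:

```python
class LegacyTemperatureSensor:
    """Black-box component with a mismatched interface: it reports Fahrenheit,
    while the application expects Celsius. Its internals cannot be changed."""
    def read_fahrenheit(self):
        return 68.0

class TemperatureSensorAdapter:
    """Generated adapter: exposes the expected interface (Celsius) on top of
    the unmodified black-box component, taking over the adaptation."""
    def __init__(self, component):
        self._component = component

    def read_celsius(self):
        return (self._component.read_fahrenheit() - 32.0) * 5.0 / 9.0

sensor = TemperatureSensorAdapter(LegacyTemperatureSensor())
print(sensor.read_celsius())  # 20.0
```

In the approach described above, such adapters would be generated automatically from the components' specifications rather than written by hand, so a pool of black-box components plus generated adapters can be composed on demand.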

  20. A Study of Mining Software Engineering Data and Software Testing

    Directory of Open Access Journals (Sweden)

    T.Murali Krishna

    2011-11-01

Full Text Available The primary goal of software development is to deliver optimal software, i.e., software produced at low cost, with high quality and productivity, and on schedule. In order to achieve this optimal software, programmers generally reuse existing libraries rather than developing similar code products from scratch. While reusing libraries, programmers face several challenges: many existing libraries are not properly documented, and many libraries contain a large number of program interfaces (PIs) through which they expose their functionality. These challenges lead to problems that hinder the production of optimal software: reuse of existing libraries consumes more time, knowledge on the reuse of program interfaces is lacking, and effective test inputs cannot be generated during white-box testing. The first two problems reduce software productivity, whereas the last one affects software testing. To resolve these problems, we propose a general framework called Netminer. Netminer contains a code search engine, with whose help we can search the open source code available on the internet. In the analysis phase, Netminer automatically compares the specifications of program interfaces with relevant code examples that are available on the internet. In the next phase, Netminer applies data mining techniques to the collected code examples and identifies common patterns. The common patterns represent the exact usage of program interfaces. We propose further approaches based on Netminer: some help programmers effectively reuse program interfaces provided by existing libraries, some identify defects under analysis from the mined specifications, and some help in generating test inputs by the use of static and dynamic test generation.
Our research study shows that the Netminer framework can be effectively used in software engineering for achieving optimal software.

  1. Treatment of reverse osmosis (RO) concentrate by the combined Fe/Cu/air and Fenton process (1stFe/Cu/air-Fenton-2ndFe/Cu/air).

    Science.gov (United States)

    Ren, Yi; Yuan, Yue; Lai, Bo; Zhou, Yuexi; Wang, Juling

    2016-01-25

To decompose or transform the toxic and refractory reverse osmosis (RO) concentrate and improve its biodegradability, a 1stFe/Cu/air-Fenton-2ndFe/Cu/air process was developed to treat RO concentrate obtained from an amino acid production plant in northern China. First, its operating conditions were optimized thoroughly. Furthermore, 5 control experiments were set up to confirm the superiority of 1stFe/Cu/air-Fenton-2ndFe/Cu/air and the synergistic reaction between Fe/Cu/air and Fenton. The results suggest that the developed method could obtain high COD removal (65.1%) and a high BOD5/COD ratio (0.26) due to the synergistic reaction between Fe/Cu/air and Fenton. Under the optimal conditions, the influent and effluent of 1stFe/Cu/air-Fenton-2ndFe/Cu/air and the 5 control experiments were analyzed using UV, FTIR, EEM and LC, which confirmed the superiority of 1stFe/Cu/air-Fenton-2ndFe/Cu/air. Therefore, the method developed in this study is a promising process for the treatment of RO concentrate. PMID:26448492

  2. Desempenho ortográfico de escolares do 2º ao 5º ano do ensino particular Spelling performance of students of 2nd to 5th grade from private teaching

    Directory of Open Access Journals (Sweden)

    Simone Aparecida Capellini

    2012-04-01

Full Text Available PURPOSE: to characterize, compare and classify the performance of students from 2nd to 5th grades of private teaching according to the semiology of errors. METHOD: 115 students from the 2nd to 5th grades were evaluated, 27 from the 2nd grade, 30 each from the 3rd and 4th grades, and 28 from the 5th grade, divided into four groups, respectively GI, GII, GIII and GIV. The tests of the Spelling Evaluation Protocol - Pro-Orthography were divided into: collective version (writing letters of the alphabet, randomized dictation of letters, word dictation, non-word dictation, dictation with pictures, thematic writing induced by picture) and individual version (dictation of sentences, purposeful error, spelled dictation, spelling lexical memory). RESULTS: there was a statistically significant difference in inter-group comparison, indicating that average accuracy increased for all tests in both the collective and individual versions; with the increase in grade level, the groups decreased their average number of writing errors based on the semiology of errors. The highest frequency of errors found was of natural spelling. CONCLUSION: data from this study showed that the increase in average accuracy according to grade level may be indicative of normal development of writing in this population. The higher frequency of natural spelling errors found indicates that formal instruction on phoneme-grapheme correspondence may not be occurring, since such errors depend directly on learning the rule of direct phoneme-grapheme correspondence.

  3. Desempenho ortográfico de escolares do 2º ao 5º ano do ensino particular / Spelling performance of students of 2nd to 5th grade from private teaching

    Scientific Electronic Library Online (English)

    Simone Aparecida, Capellini; Ana Carla Leite, Romero; Andrea Batista, Oliveira; Maria Nobre, Sampaio; Natália, Fusco; José Francisco, Cervera-Mérida; Amparo, Ygual Fernández.

    2012-04-01

Full Text Available PURPOSE: to characterize, compare and classify the performance of students from 2nd to 5th grades of private teaching according to the semiology of errors. METHOD: 115 students from the 2nd to 5th grades, 27 from the 2nd grade, 30 students from the 3rd and 4th grades, and 28 from the 5th grade, divided into four groups, respectively, GI, GII, GIII and GIV, were evaluated. The tests of the Spelling Evaluation Protocol - Pro-Orthography were divided into: collective version (writing letters of the alphabet, randomized dictation of letters, word dictation, non-word dictation, dictation with pictures, thematic writing induced by picture) and individual version (dictation of sentences, purposeful error, spelled dictation, spelling lexical memory). RESULTS: there was a statistically significant difference in inter-group comparison indicating that there was an increase in average accuracy for all tests in both the individual and collective versions. With the increase in grade level, the groups decreased the average of writing errors based on the semiology of errors. We found a higher frequency of natural spelling errors. CONCLUSION: data from this study showed that the increase in average accuracy according to grade level may be an indicative for normal development of the students' writing in this population. The higher frequency of natural spelling errors found indicates that formal instruction on phoneme-grapheme correspondence may not be occurring, since such errors are directly dependent on the learning of the rule of direct phoneme-grapheme correspondence.

  4. Effect of Software Manager, Programmer and Customer over Software Quality

    OpenAIRE

    Ghrehbaghi Farhad; Mehdi Afzali; Mahdi Bazerghan; Morteza Ramazani

    2013-01-01

Several factors can affect the quality of software products. In this study we focus on three significant parameters: the software manager, the programmer and the customer. Our study demonstrates that the quality of the product improves as the information generated by these three parameters increases. The parameters can be pictured as the vertices of a triangle whose centroid is the quality; in this perspective, if the triangle is equilateral, the optimum quality of the product will be achieved...
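The triangle analogy in this abstract can be made concrete; the following is a minimal Python sketch, not the paper's actual model, with hypothetical coordinates standing in for the "information" contributed by the three parameters:

```python
import math

# Hypothetical sketch of the triangle analogy: the three parameters are
# vertices of a triangle, and overall quality sits at its centroid.
def centroid(p1, p2, p3):
    """Centroid of a triangle: component-wise average of the vertices."""
    return tuple(sum(c) / 3.0 for c in zip(p1, p2, p3))

def is_equilateral(p1, p2, p3, tol=1e-9):
    """True when all three side lengths agree within tol."""
    s1 = math.dist(p1, p2)
    s2 = math.dist(p2, p3)
    s3 = math.dist(p3, p1)
    return abs(s1 - s2) < tol and abs(s2 - s3) < tol

# Equal contributions from manager, programmer and customer form an
# equilateral triangle -- the configuration the abstract calls optimal.
manager = (0.0, 0.0)
programmer = (1.0, 0.0)
customer = (0.5, math.sqrt(3) / 2)
print(centroid(manager, programmer, customer))
print(is_equilateral(manager, programmer, customer))
```

The coordinates here are purely illustrative; the abstract does not specify how the three parameters would be quantified.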

  5. A computer program for the calculation of 99Mo, 99mTc and 99Tc in the generator and in the eluate using Java 1.3 software

    International Nuclear Information System (INIS)

    The technetium-99m radionuclide is the workhorse of nuclear medicine and currently accounts for over 80% of all in-vivo diagnostic procedures. Technetium, element 43 in the periodic table, does not occur naturally; it was discovered in 1937 by Perrier and Segre. The daughter radionuclide technetium-99m (99mTc, T1/2 = 6 h) is formed from the decay of the parent molybdenum-99 (99Mo, T1/2 = 66 h). The prominent position of 99mTc on the market is due to its near-ideal nuclear properties, its ready availability in the form of the convenient 99Mo/99mTc generator system, and the rapid progress made in recent years in the development of a variety of 99mTc radiopharmaceuticals for applications in oncology, cardiology and other fields. The 99Mo radionuclide is produced by the fission of uranium with thermal or fast neutrons in a nuclear reactor. The separation of 99mTc from 99Mo is based on a chromatographic alumina column: the carrier-free daughter 99mTcO4- formed from 99Mo by beta decay is eluted periodically with 0.9% saline, while the molybdate remains adsorbed on the alumina column. Examination of the 99Mo-99mTc-99Tc decay scheme leads to mathematical equations for the theoretical calculation of generator parameters in order to judge the performance of a 99Mo/99mTc generator. However, these calculations are laborious and time consuming. A computer program, 'MOGEN-TEC-2', for the calculation of 99Mo/99mTc generator parameters has been developed using Java 1.3 software, which has the advantage of working under different operating systems. The input variables are entered in a text area, and the output for each elution is displayed, after execution of the program, in a label area. The input variables are the 99Mo activity, the generator elution efficiency, the number of elutions, the growth time between elutions, the decay time between elutions and the use of 99mTc. The output of the program for each elution gives the activities of 99Mo and 99mTc, the atom numbers of 99Mo, 99mTc and 99Tc, and the 99mTc/99Tc atom ratio for the generator immediately before elution, for fresh eluate, for decayed eluate and for the generator after elution. The software is being used comfortably and accurately. (author)
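The decay-scheme calculations that such a program automates rest on the Bateman equations for a two-member chain; a minimal Python sketch (not the MOGEN-TEC-2 program itself), assuming the standard half-lives of 66 h for 99Mo and 6.01 h for 99mTc and the 87.6% branching fraction of 99Mo decays that feed 99mTc:

```python
import math

HALF_MO = 66.0    # 99Mo half-life in hours
HALF_TC = 6.01    # 99mTc half-life in hours
BRANCH = 0.876    # fraction of 99Mo decays that feed 99mTc

def mo99_activity(a0, t):
    """99Mo activity after t hours, starting from activity a0."""
    return a0 * math.exp(-math.log(2) / HALF_MO * t)

def tc99m_activity(a0, t):
    """99mTc activity grown in from a0 of 99Mo after t hours
    (Bateman equation for a two-member chain, no prior elution)."""
    lam_mo = math.log(2) / HALF_MO
    lam_tc = math.log(2) / HALF_TC
    return BRANCH * lam_tc / (lam_tc - lam_mo) * a0 * (
        math.exp(-lam_mo * t) - math.exp(-lam_tc * t))

# Example: a generator loaded with 1000 MBq of 99Mo, eluted 24 h later
print(round(mo99_activity(1000.0, 24.0), 1))   # ~777 MBq of 99Mo remain
print(round(tc99m_activity(1000.0, 24.0), 1))  # ~689 MBq of 99mTc available
```

The full program additionally tracks 99Tc atom numbers and elution efficiency; the sketch above only shows the core growth/decay relation.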

  6. Progress report of the Research Group. 1st part: Tore Supra. 2nd part: Fontenay-aux-Roses

    International Nuclear Information System (INIS)

    Three major events dominated the activities of the EURATOM/CEA association during 1980: the decision to launch the realization of the TORE SUPRA project, the progressive recognition of high frequency heating as a solution for the future, and the increasing support given to the development of heating methods and diagnostics in the JET project. It is estimated that the project studies are sufficiently advanced, and that industrial fabrication problems have been sufficiently covered, for the realization of Tore Supra to begin in 1981. One of the successes of the work carried out is the complete validation of the superfluid helium cooling system. The satisfactory development of high frequency heating and the increasing credibility of this form of heating for future work are very important factors. In this context, the decision of the JET team to envisage a large amount of ion cyclotron heating is particularly important. The results obtained in 1980 are in fact very encouraging: the maximum power of the 500 kW T.F.R. generator was coupled with the plasma and it was possible to establish an energy Q-value. Even though the injection of neutral particles can now be considered a proven heating method, studies of the accompanying physical phenomena are still important, and the T.F.R. experiments carried out in this field in 1980 were very useful. The importance of the realization and development activities conducted during 1980 should not mask the enormous effort that was made, both experimentally and theoretically, to understand key physical phenomena in the plasma. The main preoccupation concerned small and large disruptions and all aspects of the associated instabilities. A detailed analysis of the experimental results using numerical models has led to improved empirical knowledge of the elementary transport phenomena taking place. Increasingly detailed studies of microinstabilities were also fruitful and have even led to a complete reversal of some of the ideas held about universal instabilities.

  7. Model-integrating software components engineering flexible software systems

    CERN Document Server

    Derakhshanmanesh, Mahdi

    2015-01-01

    In his study, Mahdi Derakhshanmanesh builds on the state of the art in modeling by proposing to integrate models into running software at the component level without translating them to code. Such so-called model-integrating software exploits all the advantages of models: models implicitly support a good separation of concerns, they are self-documenting and thus improve understandability and maintainability, and, in contrast to model-driven approaches, there is no longer a synchronization problem between the models and the code generated from them. Using model-integrating components, software will be

  8. SOFTWARE TOOL FOR LEARNING THE GENERATION OF THE CARDIOID CURVE IN AN AUTOCAD ENVIRONMENT / HERRAMIENTA SOFTWARE PARA EL APRENDIZAJE DE LA GENERACIÓN DE LA CARDIOIDE EN UN ENTORNO AUTOCAD

    Scientific Electronic Library Online (English)

    MIGUEL ÁNGEL, GÓMEZ-ELVIRA-GONZÁLEZ; JOSÉ IGNACIO, ROJAS-SOLA; MARÍA DEL PILAR, CARRANZA-CAÑADAS.

    2012-02-01

    Full Text Available Este artículo presenta una novedosa aplicación desarrollada en Visual LISP para el entorno AutoCAD, que presenta de forma rápida e intuitiva la generación de la cardioide de cinco formas diferentes, siendo dicha curva cíclica la que presenta una amplia gama de aplicaciones artísticas y técnicas, ent [...] re ellas, el perfil de algunas levas. Abstract in english This article presents a novel application which has been developed in Visual LISP for an AutoCAD environment, and which shows the generation of the cardioid curve intuitively and quickly in five different ways (using the conchoid of a circumference, the pedal curve of a circumference, the inverse of a parab [...] ola, the orthoptic curve of a circumference, and the epicycloid of a circumference). This cyclic curve has a large number of artistic and technical applications, among them the profile of some cams.
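The epicycloid construction named in the abstract can be sketched outside AutoCAD as well; a minimal Python sketch (not the Visual LISP application), tracing a circle of radius a rolling on the outside of a fixed circle of the same radius:

```python
import math

def cardioid_epicycloid(a, n=360):
    """Cardioid traced as an epicycloid: a circle of radius a rolling on
    the outside of a fixed circle of the same radius a."""
    pts = []
    for i in range(n):
        t = 2 * math.pi * i / n
        x = 2 * a * math.cos(t) - a * math.cos(2 * t)
        y = 2 * a * math.sin(t) - a * math.sin(2 * t)
        pts.append((x, y))
    return pts

# Shifting the cusp to the origin recovers the polar form r = 2a(1 - cos t),
# which ties the epicycloid construction to the usual cardioid equation.
a = 1.0
for i, (x, y) in enumerate(cardioid_epicycloid(a)):
    t = 2 * math.pi * i / 360
    assert abs(math.hypot(x - a, y) - 2 * a * (1 - math.cos(t))) < 1e-9
```

The other four constructions mentioned in the abstract (conchoid, pedal curve, inverse of a parabola, orthoptic curve) generate the same curve; only the epicycloid form is sketched here.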

  9. Report of the 2nd RCM on nanoscale radiation engineering of advanced materials for potential biomedical applications

    International Nuclear Information System (INIS)

    There are critical needs for advanced materials in the area of biomaterial engineering, primarily in generating biomaterials of enhanced specific functionality, improved biocompatibility and minimal natural rejection, but with enhanced interfacial adhesion. These can be achieved by introducing proper functionalities at nanoscale dimensions, for which, due to their characteristics, radiation techniques are uniquely suited. Accordingly, many of the IAEA Member States (MS) have an interest in creating advanced materials for various health-care applications using a wide array of radiation sources and their broad expertise. Seeking new knowledge to advance the field and tackle this specific problem, and wishing to collaborate to enhance the quality, efficiency and effectiveness of the scientific research, MS requested the support of the IAEA for such collaboration. Based on these requests, and on the conclusions and recommendations of the Consultants' Meeting on Advanced Materials on the Nano-scale Synthesized by Radiation-Induced Processes, held on 10-14 December 2007, the present CRP was formulated and started in 2009. The first RCM was held on 30 March - 3 April 2009 in Vienna, where the work plans for both individual participants and collaborations were discussed and accepted, as reported in the Meeting Report published as IAEA Working Material (http://www-naweb.iaea.org/napc/iachem/working_materials.html). The second RCM was held on 15-19 November 2010 in Paris, France, and was attended by 17 participants (chief scientific investigators or team members) and one cost-free observer from Brazil. The participants presented their research achievements since the first RCM, centred on the main expected outputs of this CRP: a. Methodologies to prepare and characterize nanogels, nanoparticles and nanoporous membranes, as well as to synthesize and modify nanoparticle surfaces by attaching organic ligands by radiation; b. 
Methodologies for the radiation synthesis of polymeric, inorganic and hybrid nanocarriers, providing controlled loading and improved release rates of drugs; and c. Demonstration of novel functional surfaces for cell-sheet engineering fabricated by utilizing advanced radiation technology, towards improved cell-matrix interactions and cell function control. This meeting report presents in its first part the summaries of the achievements, the conclusions reached and recommendations given, the various collaborations realized among the participants, as well as the list of scientific publications. The second part of the report consists of the full reports of the participants' work during the past year.

  10. MIAWARE Software

    DEFF Research Database (Denmark)

    Wilkowski, Bartlomiej; Pereira, Oscar N. M.; Dias, Paulo; Castro, Miguel; Janicki, Marcin

    2008-01-01

    This article presents MIAWARE, a software for Medical Image Analysis With Automated Reporting Engine, which was designed and developed for doctor/radiologist assistance. It allows the analysis of an image stack from a computed axial tomography scan of the lungs (thorax) and, at the same time, the marking of all pathologies on the images and the reporting of their characteristics. The reporting process is normalized - radiologists cannot describe pathological changes in their own words, but can only use some terms from a spec...

  11. Calculation Software

    Science.gov (United States)

    1994-01-01

    MathSoft Plus 5.0 is a calculation software package for electrical engineers and computer scientists who need advanced math functionality. It incorporates SmartMath, an expert system that determines a strategy for solving difficult mathematical problems. SmartMath was the result of the integration into Mathcad of CLIPS, a NASA-developed shell for creating expert systems. By using CLIPS, MathSoft, Inc. was able to save the time and money involved in writing the original program.

  12. Software preservation

    Directory of Open Access Journals (Sweden)

    Tadej Vodopivec

    2011-01-01

    Full Text Available Comtrade Ltd. covers a wide range of activities related to information and communication technologies; its deliverables include web applications, locally installed programs, system software, drivers and embedded software (used e.g. in medical devices, auto parts and communication switchboards). The company has also acquired extensive knowledge and practical experience in digital long-term preservation technologies. This wide spectrum of activities puts us in the position to discuss an often overlooked aspect of digital preservation: the preservation of software programs. Many resources are dedicated to the digital preservation of data, documents and multimedia records, but not so many to how to preserve the functionalities and features of computer programs. Exactly these functionalities - the dynamic response to inputs - make computer programs rich compared to documents or linear multimedia. The article opens questions on the beginning of the way towards permanent digital preservation. The purpose is to find a way in the right direction, where all relevant aspects are covered in proper balance. The following questions are asked: why preserve computer programs permanently at all, who should do this and for whom, when should we think about permanent program preservation, what should be preserved (such as the source code, screenshots, documentation, and the social context of the program - e.g. media response to it ...), where and how? To illustrate the theoretical concepts, the idea of a virtual national museum of electronic banking is also presented.

  13. Model-based Software Engineering

    DEFF Research Database (Denmark)

    Kindler, Ekkart

    2010-01-01

    The vision of model-based software engineering is to make models the main focus of software development and to automatically generate software from these models. Part of that idea already works today. But there are still difficulties when it comes to behaviour. Actually, there is no lack of models for behaviour, but a lack of concepts for integrating them with the other models and with existing code. In this paper, we discuss some of the main challenges in behaviour modelling and integration an...

  14. The theory of contractions of 2D 2nd order quantum superintegrable systems and its relation to the Askey scheme for hypergeometric orthogonal polynomials

    International Nuclear Information System (INIS)

    We describe a contraction theory for 2nd order superintegrable systems, showing that all such systems in 2 dimensions are limiting cases of a single system: the generic 3-parameter potential on the 2-sphere, S9 in our listing. Analogously, all of the quadratic symmetry algebras of these systems can be obtained by a sequence of contractions starting from S9. By contracting function space realizations of irreducible representations of the S9 algebra (which give the structure equations for Racah/Wilson polynomials) to the other superintegrable systems one obtains the full Askey scheme of orthogonal hypergeometric polynomials. This relates the scheme directly to explicitly solvable quantum mechanical systems. Amazingly, all of these contractions of superintegrable systems with potential are uniquely induced by Wigner Lie algebra contractions of so(3, C) and e(2, C). The present paper concentrates on describing this intimate link between Lie algebra and superintegrable system contractions, with the detailed calculations presented elsewhere. Joint work with E. Kalnins, S. Post, E. Subag and R. Heinonen.

  15. Report on the 2nd Florence International Symposium on Advances in Cardiomyopathies: 9th meeting of the European Myocardial and Pericardial Diseases WG of the ESC

    Directory of Open Access Journals (Sweden)

    Franco Cecchi

    2012-12-01

    Full Text Available A bridge between clinical and basic science aiming at cross-fertilization, with leading experts presenting alongside junior investigators, is the key feature of the “2nd Florence International Symposium on Advances in Cardiomyopathies”, 9th Meeting of the Myocardial and Pericardial Diseases Working Group of the European Society of Cardiology, which was held in Florence, Italy on 26-28 September 2012. Patients with cardiomyopathies, with an estimated 3 per thousand prevalence in the general population, constitute an increasingly large proportion of the patients seen by most cardiologists. This class of diseases, which are mostly genetically determined with different transmission modalities, can cause important and often unsolved management problems, despite rapid advances in the field. On the other hand, few other areas of cardiology have seen such an impressive contribution from basic science and translational research to the understanding of their pathophysiology and clinical management. The course was designed to constantly promote close interaction between basic science and clinical practice and to highlight the top scientific and translational discoveries in this field in 10 scientific sessions. It was preceded by two mini-courses, which included the basic concepts of cardiomyocyte mechanical and electrophysiological properties and mechanisms, how-to sessions for clinical diagnosis and management, and illustrative case study presentations of different cardiomyopathies.

  16. Sexual orientation and the 2nd to 4th finger length ratio: evidence for organising effects of sex hormones or developmental instability?

    Science.gov (United States)

    Rahman, Q; Wilson, G D

    2003-04-01

    It has been proposed that human sexual orientation is influenced by prenatal sex hormones, and some evidence from putative somatic markers of prenatal sex hormones supports this assumption. An alternative suggestion has been that homosexuality may be due to general developmental disruptions independent of hormonal effects. This study investigated the ratio of the 2nd to 4th finger digits (the 2D:4D ratio), a measure often ascribed to the organisational actions of prenatal androgens, and the fluctuating asymmetry (FA, a measure of general developmental disruption) of these features, in a sample of 240 healthy, right-handed and exclusively heterosexual and homosexual males and females (N=60 per group). Homosexual males and females showed significantly lower 2D:4D ratios in comparison to heterosexuals, but sexual orientation did not relate to any measure of FA. The evidence may suggest that homosexual males and females have been exposed to non-disruptive but elevated levels of androgens in utero. However, these data also draw attention to difficulties in the interpretation of results when somatic features are employed as biological markers of prenatal hormonal influences. PMID:12573297

  17. Ba2NdZrO5.5 as a potential substrate material for YBa2Cu3O7-δ superconducting films

    International Nuclear Information System (INIS)

    The new oxide Ba2NdZrO5.5 (BNZO) has been produced by the standard solid state reaction method. X-ray diffraction analysis (XRD) revealed that this synthesized material has an ordered complex cubic perovskite structure characteristic of the A2BB'O6 crystalline structure, with a lattice parameter of a = 8.40 Å. It was established through EDX analysis that there is no trace of impurities. The chemical stability of BNZO with YBa2Cu3O7-δ (YBCO) has been studied by means of Rietveld analysis of experimental XRD data on several samples of BNZO-YBCO composites. Quantitative phase analysis of the XRD patterns shows that all peaks have been indexed for both BNZO and YBCO, and no extra peak is detectable. YBCO and BNZO remain as two separate phases in the composites with no chemical reaction. Electrical measurements also revealed that the superconducting transition temperature of both pure YBCO and the BNZO-YBCO composites is 90 K. These favorable characteristics show that BNZO can be used as a potential substrate material for the deposition of YBCO superconducting films. (copyright 2007 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim) (orig.)
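For a cubic cell like the one reported (a = 8.40 Å), the powder-pattern peak positions used to index the XRD data follow directly from Bragg's law; a minimal Python sketch, assuming Cu K-alpha radiation (the record does not state which radiation was used):

```python
import math

A = 8.40             # lattice parameter of Ba2NdZrO5.5 in angstroms (from the record)
WAVELENGTH = 1.5406  # Cu K-alpha1 in angstroms -- an assumption, not stated in the record

def two_theta(h, k, l):
    """Expected diffraction angle 2-theta in degrees for a cubic lattice:
    d = a / sqrt(h^2 + k^2 + l^2) combined with Bragg's law lambda = 2 d sin(theta)."""
    d = A / math.sqrt(h * h + k * k + l * l)
    return 2.0 * math.degrees(math.asin(WAVELENGTH / (2.0 * d)))

# A few low-index reflections of the ordered cubic perovskite cell
for hkl in [(1, 1, 1), (2, 2, 0), (4, 0, 0)]:
    print(hkl, round(two_theta(*hkl), 2))
```

Peaks such as (111), forbidden in the simple-perovskite subcell, are the kind of superstructure reflections that signal the ordered A2BB'O6 arrangement.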

  18. Participation in EU SARNET2 Project for an Enhancement of Severe Accident Evaluation Capability in Domestic NPPs: 2nd Year R and D Activities

    International Nuclear Information System (INIS)

    The following results were obtained from the 2nd year SARNET2 research activities. · WP4-2: Preliminary analysis of the RPV lower head corium behavior and heat transfer to the RPV wall for the reference plant (APR1400) using ASTEC 2.0. · WP6-4: Preliminary analysis of the effect of metal components on the cavity concrete erosion process and cooling of the relevant corium with the OECD CCI-4 experimental data using the CORQUENCH3.3 code, and a MAAP4-based MCCI uncertainty analysis for the reference plant (OPR1000). · WP7-1: Analysis of the solidification process of molten debris according to different material compositions and analysis of the physico-chemical characteristics of material components as a preliminary step towards applying them to the reactor case. · WP7-2: Benchmark blind/open calculations for code validation against the IRSN ENACCEF experiments and reflection of the results in the SARNET2 WP7-2 reports. · WP8-3: Analysis of FP behavior for the reference plant (OPR1000) using the MIDAS code and preparation of a TMI-2 based common input deck for ST analysis in containment. The foregoing research results and the experimental database for the main SA issues obtained by this research are expected to be used for resolving the SA issues remaining in domestic NPPs (operating, to be constructed, and future) and for enhancing the capability to evaluate Level-2 PSA.

  19. Proceedings of the 2nd international workshop on electromagnetic forces and related effects on blankets and other structures surrounding the fusion plasma torus

    International Nuclear Information System (INIS)

    This publication is the collection of the papers presented at the title meeting. The subjects of the presented papers were categorized into six parts, all contained in this volume. In the first part, valuable experiences are presented concerning electromagnetic phenomena in existing large devices or those under construction. In the 2nd part, the papers mainly concern the evaluation of the electromagnetic fields and forces for the next experimental reactors. In the 3rd part, electromagnetomechanical coupling problems are treated by numerical and experimental approaches. In the 4th part, numerical and experimental approaches for ferromagnetic structures are presented. In the 5th part, papers related to structural integrity evaluation are presented. The 6th part is devoted to the proposal of an intelligent material system. A summary of the panel discussion held at the final session of the workshop is also included at the end of this volume. The 22 presented papers are indexed individually. (J.P.N.)

  20. The 2000 activities and the 2nd Workshop on Human Resources Development in the Nuclear Field as part of Asian regional cooperation

    International Nuclear Information System (INIS)

    In 1999, the Project for Human Resources Development (HRD) was initiated as defined in the framework of the Forum for Nuclear Cooperation in Asia (FNCA), organized by the Atomic Energy Commission of Japan. The objective of the HRD Project is to solidify the foundation of technologies for nuclear development and utilization in Asia by promoting human resources development in Asian countries. The Project comprises two kinds of activities: in-workshop activities and outside-of-workshop activities. As an in-workshop activity, the 2nd Workshop on Human Resources Development in the Nuclear Field was held on November 27 and 28, 2000, at the Tokai Research Institute of JAERI. As an outside-of-workshop activity, 'The presentation of the present state of international training and education in the nuclear field in Japan' was held on November 29, 2000, after the workshop. Participating countries were China, Indonesia, South Korea, Japan, Malaysia, the Philippines, Thailand, and Vietnam. The secretariat for the Human Resources Development Project is provided by the Nuclear Technology and Education Center of the Japan Atomic Energy Research Institute. This report consists of the papers and materials presented at the Workshop, the presentation documents of 'The present state of international training and education in the nuclear field in Japan', a letter of proposal from the Project Leader of Japan to the project leaders of the participating countries after the Workshop, and a presentation paper on Human Resources Development at the 3rd Coordinators Meeting of FNCA, held in Tokyo on March 14-16, 2001. (author)

  1. Influence of long-term altered gravity on the swimming performance of developing cichlid fish: including results from the 2nd German Spacelab Mission D-2

    Science.gov (United States)

    Rahmann, H.; Hilbig, R.; Flemming, J.; Slenzka, K.

    This study presents qualitative and quantitative data concerning gravity-dependent changes in the swimming behaviour of developing cichlid fish larvae (Oreochromis mossambicus) after a 9- or 10-day exposure to increased acceleration (centrifuge experiments), reduced gravity (fast-rotating clinostat), changing accelerations (parabolic aircraft flights) and near weightlessness (2nd German Spacelab Mission D-2). Changes of gravity initially cause disturbances of the swimming performance of the fish larvae. With a prolonged stay in orbit, a step-by-step normalisation of the swimming behaviour took place. After return to 1g earth conditions no somersaulting or looping could be detected, but slow and disorientated movements still occurred as compared to controls. The fish larvae adapted to earth gravity within 3-5 days. Fish at distinct early developmental stages seem to be extremely sensitive and adaptable to altered gravity, whereas older fish either do not react or show compensatory behaviour, e.g. escape reactions.

  2. U-Pb and K-Ar geochronology from the Cerro Empexa Formation, 1st and 2nd Regions, Precordillera, northern Chile

    International Nuclear Information System (INIS)

    The Cerro Empexa Formation (Galli, 1957) is a regionally distributed andesitic volcanic and continental sedimentary unit exposed in the Precordillera of the 1st and 2nd Regions of northern Chile. The formation has generally been considered to lie within the Lower or 'mid' Cretaceous; however, this assignment is based on scant, unreliable geochronologic data. Furthermore, there are conflicting interpretations as to whether the unit predates or postdates the first major Mesozoic shortening event affecting northern Chile. Because of the formation's presumed mid-Cretaceous age and its stratigraphic position over older back-arc sedimentary successions, the unit has been interpreted to represent products of the first eastward jump of the Andean magmatic arc from the arc's initial position in the Cordillera de la Costa (Scheuber and Reutter, 1992). In this paper we present the results of mapping and field observations indicating that exposures previously assigned to the Cerro Empexa Formation include two andesitic volcanic units separated by a major unconformity; the Cerro Empexa Formation proper lies above this unconformity. We also present U-Pb zircon and K-Ar geochronology indicating that the Cerro Empexa Formation is latest Cretaceous in its lower levels, and integrate our data with previously reported 40Ar/39Ar and fission-track data in the Cerros de Montecristo area (Maksaev, 1990; Maksaev and Zentilli, 1999) to show that 1800±600 m of rocks were deposited within ca. 2.5 m.y. (au)

  3. The lymph drainage of the mammary glands in the bitch: a lymphographic study. Part 1: the 1st, 2nd, 4th and 5th mammary glands

    International Nuclear Information System (INIS)

    The objective of this paper was to study the lymph drainage of the 1st, 2nd, 4th and 5th mammary glands in the bitch using indirect lymphography. The main conclusions drawn after the study of 67 normal lactating mongrel bitches were as follows: lymph drains from the first gland usually to the axillary nodes and, in a few cases, to the axillary and superficial cervical nodes simultaneously. The second gland drains to the axillary nodes. The fourth gland usually drains to the superficial inguinal nodes, but it may, rarely, drain to the superficial inguinal and medial iliac nodes simultaneously. The fifth gland drains to the superficial inguinal nodes. A lymphatic connection between the mammary glands could not be demonstrated. Furthermore, it was confirmed that lymph can pass from one gland to another, through their common regional lymph nodes, by retrograde flow. It was demonstrated that there is a connection between the superficial inguinal lymph nodes of either side. It is suggested that a lymphatic connection between the axillary and sternal nodes and between the axillary and bronchial nodes may be possible in some cases. Lymphatics of the mammary glands that cross the midline were not demonstrated

  4. [Quantitative determination of the main metabolites of acetylsalicylic acid/2nd communication: the concentrations of salicylic acid and its metabolites in patients with renal insufficiency (author's transl)].

    Science.gov (United States)

    Daneels, R; Loew, D; Pütter, J

    1975-07-01

    Quantitative Determination of the Main Metabolites of Acetylsalicylic Acid / 2nd Communication: The concentrations of salicylic acid and its metabolites in patients with renal insufficiency. 9 patients suffering from renal insufficiency of varying degrees and treated regularly by hemodialysis were given 1.5 g Colfarit (microencapsulated acetylsalicylic acid) as a single dose. The concentrations of salicylic acid (SA), salicyluric acid (SU), further salicylic acid conjugates (SAC) and salicyluric acid conjugates (SUC) were determined in the blood plasma. Urea and creatinine were likewise determined. The SA concentration decreased continually and, at the end of the trial (72 h after application), had vanished almost completely from the plasma of most patients. SU increased at first and decreased afterwards. With the exception of the dialysis time, SAC and SUC increased during the trial. After 3 days the SUC level was more than 50% of total salicylate (SSS) in most patients. SSS (the sum of SA + SU + SAC + SUC) did not change very much before dialysis, but showed a rather marked decrease during the first hours of dialysis. After dialysis the SSS levels rose again, apparently as a consequence of a redistribution and of the synthesis of conjugates with decreased tissue affinity. It could be shown that SSS in the blood plasma does not parallel SSS in the whole body. The interindividual variation of SA metabolism, as well as the variation of the biological blank values, was rather high. The results are discussed with regard to salicylate pharmacokinetics in renal insufficiency and to normal salicylate metabolism. PMID:1174413

  5. The ABINIT software project

    Science.gov (United States)

    Gonze, Xavier; Allan, Douglas

    2001-03-01

    The computation of electronic structure, total energy, forces and many related properties of condensed matter, thanks to density-functional theory (DFT), is a field in constant progress. A DFT software project that wants to stay at the frontier of knowledge cannot be the work of a single individual, nor of a small group. Moreover, up-to-date software engineering concepts can considerably ease the harmonious development of such software. The ABINIT project relies upon these ideas: concepts of reliability, portability, readability and freedom of the sources are emphasized in the course of developing a sophisticated plane-wave pseudopotential code. More than 200 automated tests secure existing capabilities despite heavy development efforts and the associated bug generation; thanks to MAKE and PERL scripts and CPP directives, the unique set of Fortran90 source files (about 100000 lines) can generate sequential (or parallel) object code for many platforms, under Unix/Linux, DOS/Windows and MacOS; strict coding rules have been followed to make the source readable. Moreover, the whole package is distributed under the GNU General Public Licence, often nicknamed 'copyleft' (see http://www.pcpm.ucl.ac.be/ABINIT).

  6. Squeezed-light generation in a nonlinear planar waveguide with a periodic corrugation.

    Czech Academy of Sciences Publication Activity Database

    Peřina ml., Jan; Haderka, Ondřej; Sibilia, C.; Bertolotti, M.; Scalora, M.

    2007-01-01

    Roč. 76, č. 3 (2007), 033813/1-033813/14. ISSN 1050-2947 Grant ostatní: GA ČR(CZ) GA202/05/0498 Institutional research plan: CEZ:AV0Z10100522 Keywords: matched 2nd-harmonic generation * photonic-bandgap structures * lithium-niobate * oscillations * enhancement * states Subject RIV: BH - Optics, Masers, Lasers Impact factor: 2.893, year: 2007

  7. SOFTWARE METRICS VALIDATION METHODOLOGIES IN SOFTWARE ENGINEERING

    Directory of Open Access Journals (Sweden)

    K.P. Srinivasan

    2014-12-01

    Full Text Available In software measurement, assessing the validity of software metrics is a difficult task owing to the lack of both theoretical and empirical methodology [41, 44, 45]. In recent years a number of researchers have addressed the issue of validating software metrics. At present, software metrics are validated theoretically using properties of measures. Software measurement plays an important role in understanding and controlling software development practices and products; its central requirement is that measures accurately represent the attributes they purport to quantify, and validation is therefore critical to the success of software measurement. Validation is a collection of analysis and testing activities across the full life cycle; it complements the efforts of other quality-engineering functions and is a critical task in any engineering project. Its objective is to discover defects in a system and to assess whether or not the system is useful and usable in an operational situation. In software engineering, validation is one of the disciplines that help build quality into software, and the major objective of the software validation process is to determine that the software performs its intended functions correctly and to provide information about its quality and reliability. This paper discusses the validation methodology, techniques and the different properties of measures that are used for software metrics validation. In most cases, both theoretical and empirical validations are conducted for software metrics validation in software engineering [1-50].
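    One concrete empirical-validation step the abstract alludes to is checking whether an internal metric tracks an external quality attribute. A minimal sketch, with all numbers invented for demonstration (module size in lines of code as the candidate metric, defect counts as the external attribute):

```python
# Empirical validation sketch: correlate a candidate metric (LOC)
# with an external quality attribute (defect counts). Data invented.
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical measurements for five modules.
loc     = [120, 300, 450, 800, 1500]   # metric under validation
defects = [  2,   4,   5,   9,   16]   # external quality attribute

print(f"correlation(LOC, defects) = {pearson_r(loc, defects):.3f}")
```

    A high correlation alone does not validate a metric; as the paper stresses, theoretical validation via properties of measures must accompany such empirical checks.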

  8. Software architecture

    CERN Document Server

    Vogel, Oliver; Chughtai, Arif

    2011-01-01

    As a software architect you work in a wide-ranging and dynamic environment. You have to understand the needs of your customer, design architectures that satisfy both functional and non-functional requirements, and lead development teams in implementing the architecture. And it is an environment that is constantly changing: trends such as cloud computing, service orientation, and model-driven procedures open up new architectural possibilities. This book will help you to develop a holistic architectural awareness and knowledge base that extends beyond concrete methods, techniques, and technologies.

  9. Ecology. 2nd German ed.

    International Nuclear Information System (INIS)

    The second edition of this outstanding textbook is now available in translation to English-speaking readers. Revised and expanded from the first edition, it brings into even greater focus the relationship between ecology and sensory physiology. (orig./HP)

  10. Patient Safety: 2nd edition

    OpenAIRE

    Vincent, C

    2010-01-01

    When you are ready to implement measures to improve patient safety, this is the book to consult. Charles Vincent, one of the world's pioneers in patient safety, discusses each and every aspect clearly and compellingly. He reviews the evidence of risks and harms to patients, and he provides practical guidance on implementing safer practices in health care. The second edition puts greater emphasis on this practical side. Examples of team-based initiatives show how patient safety can be improved...

  11. Quantum Gravity (2nd edn)

    International Nuclear Information System (INIS)

    There has been a flurry of books on quantum gravity in the past few years. The first edition of Kiefer's book appeared in 2004, at about the same time as Carlo Rovelli's book of the same title. This was soon followed by Thomas Thiemann's 'Modern Canonical Quantum General Relativity'. Although the main focus of each of these books is non-perturbative and non-string approaches to the quantization of general relativity, they are quite orthogonal in temperament, style, subject matter and mathematical detail. Rovelli and Thiemann focus primarily on loop quantum gravity (LQG), whereas Kiefer attempts a broader introduction and review of the subject that includes chapters on string theory and decoherence. Kiefer's second edition attempts an even wider and somewhat ambitious sweep, with 'new sections on asymptotic safety, dynamical triangulation, primordial black holes, the information-loss problem, loop quantum cosmology, and other topics'. The presentation of these current topics is necessarily brief given the size of the book, but effective in encapsulating the main ideas in some cases. For instance, the few pages devoted to loop quantum cosmology describe how the mini-superspace reduction of the quantum Hamiltonian constraint of LQG becomes a difference equation, whereas the discussion of 'dynamical triangulations', an approach to defining a discretized Lorentzian path integral for quantum gravity, is less detailed. The first few chapters of the book provide, in a roughly historical sequence, the covariant and canonical metric-variable approaches to the subject developed in the 1960s and 70s. The problem(s) of time in quantum gravity are nicely summarized in the chapter on quantum geometrodynamics, followed by a detailed and effective introduction of the WKB approach and the semi-classical approximation. These topics form the traditional core of the subject. The next three chapters cover LQG, quantization of black holes, and quantum cosmology. Of these, the chapter on LQG is the shortest at fourteen pages, a reflection perhaps of the fact that there are two books and a few long reviews of the subject available, written by the main protagonists in the field. The chapters on black holes and cosmology provide a more or less standard introduction to black hole thermodynamics, Hawking and Unruh radiation, quantization of the Schwarzschild metric and mini-superspace collapse models, and the DeWitt, Hartle-Hawking and Vilenkin wavefunctions. The chapter on string theory is an essay-like overview of its quantum gravitational aspects. It provides a nice introduction to selected ideas and a guide to the literature. Here a prescient student may be left wondering why there is no quantum cosmology in string theory, perhaps a deliberate omission to avoid the 'landscape' and its fauna. In summary, I think this book succeeds in its purpose of providing a broad introduction to quantum gravity, and nicely complements some of the other books on the subject. (book review)

  12. 2nd Ralf Yorque Workshop

    CERN Document Server

    1985-01-01

    These are the proceedings of the Second R. Yorque Workshop on Resource Management, which took place in Ashland, Oregon on July 23-25, 1984. The purpose of the workshop was to provide an informal atmosphere for the discussion of resource assessment and management problems. Each participant presented a one-hour morning talk; afternoons were reserved for informal discussion. The workshop was successful in stimulating ideas and interaction. The papers by R. Deriso, R. Hilborn and C. Walters all address the same basic issue, so they are grouped together. Other than that, the order of the papers in this volume was determined in the same fashion as the order of speakers during the workshop -- by random draw. Marc Mangel, Department of Mathematics, University of California, Davis, California, June 1985. TABLE OF CONTENTS: A General Theory for Fishery Modeling, Jon Schnute; Data Transformations in Regression Analysis with Applications to Stock-Recruitment Relationships, David Ruppert and Raymond J. Carroll

  13. Contaminant Hydrogeology, 2nd Edition

    Science.gov (United States)

    Smith, James E.

    Groundwater is a valuable resource that has received much attention over the last couple of decades. Extremely large sums of money have been and will be spent on groundwater contamination problems, and the public has become increasingly sensitive to groundwater issues. Groundwater contamination has even become the subject of a major Hollywood movie with the recent release of A Civil Action starring John Travolta. The high profile of groundwater contamination problems, the associated relatively strong job market over the last 20 years, and the general shift toward an environmental emphasis in science and engineering have resulted in a sustained high demand for senior undergraduate courses and graduate programs in hydrogeology. Many voice the opinion that we have seen the peak demand for hydrogeologists pass, but the placement of graduates from hydrogeology programs into career-oriented positions has remained very high.

  14. 2nd international cadmium conference

    International Nuclear Information System (INIS)

    The paper gives a short review of the conference, with survey papers on the state of development and on the progress made since the last conference in 1977. The papers read dealt with new fields of application, environmental protection, and health in the workplace (cadmium pollution, limit values). (IHOE)

  15. Statistical Physics, 2nd Edition

    Science.gov (United States)

    Mandl, F.

    1989-01-01

    The Manchester Physics Series General Editors: D. J. Sandiford; F. Mandl; A. C. Phillips Department of Physics and Astronomy, University of Manchester Properties of Matter B. H. Flowers and E. Mendoza Optics Second Edition F. G. Smith and J. H. Thomson Statistical Physics Second Edition F. Mandl Electromagnetism Second Edition I. S. Grant and W. R. Phillips Statistics R. J. Barlow Solid State Physics Second Edition J. R. Hook and H. E. Hall Quantum Mechanics F. Mandl Particle Physics Second Edition B. R. Martin and G. Shaw The Physics of Stars Second Edition A. C. Phillips Computing for Scientists R. J. Barlow and A. R. Barnett Statistical Physics, Second Edition develops a unified treatment of statistical mechanics and thermodynamics, which emphasises the statistical nature of the laws of thermodynamics and the atomic nature of matter. Prominence is given to the Gibbs distribution, leading to a simple treatment of quantum statistics and of chemical reactions. Undergraduate students of physics and related sciences will find this a stimulating account of the basic physics and its applications. Only an elementary knowledge of kinetic theory and atomic physics, as well as the rudiments of quantum theory, is presupposed for an understanding of this book. Statistical Physics, Second Edition features: A fully integrated treatment of thermodynamics and statistical mechanics. A flow diagram allowing topics to be studied in different orders or omitted altogether. Optional "starred" and highlighted sections containing more advanced and specialised material for the more ambitious reader. Sets of problems at the end of each chapter to help student understanding. Hints for solving the problems are given in an Appendix.
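    For reference, the Gibbs (canonical) distribution that the blurb singles out takes the standard form (standard notation, not quoted from the book):

```latex
p_i = \frac{e^{-E_i/k_B T}}{Z}, \qquad Z = \sum_j e^{-E_j/k_B T}
```

    Here $p_i$ is the probability of microstate $i$ with energy $E_i$ at temperature $T$; quantities such as the mean energy then follow from the partition function, $\langle E \rangle = -\partial \ln Z / \partial \beta$ with $\beta = 1/k_B T$, which is what makes the distribution such a convenient starting point for quantum statistics.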

  16. Nuclear power. 2nd ed

    International Nuclear Information System (INIS)

    The subject is dealt with in chapters, covering the following topics: basic description of nuclear energy; different reactor types; fuel cycle; historical account of nuclear weapons development; historical account of reactor accidents; confrontations between atomic energy developers and objectors, including further accounts of accidents; economic and political problems; economics of nuclear power compared with other energy sources; proliferation of nuclear weapon capability, the Non-proliferation Treaty and other measures designed to reduce proliferation dangers; national nuclear power programmes. (U.K.)

  17. The software life cycle

    CERN Document Server

    Ince, Darrel

    1990-01-01

    The Software Life Cycle deals with the software lifecycle, that is, what exactly happens when software is developed. Topics covered include aspects of software engineering, structured techniques of software development, and software project management. The use of mathematics to design and develop computer systems is also discussed. This book is comprised of 20 chapters divided into four sections and begins with an overview of software engineering and software development, paying particular attention to the birth of software engineering and the introduction of formal methods of software development.

  18. Software engineering architecture-driven software development

    CERN Document Server

    Schmidt, Richard F

    2013-01-01

    Software Engineering: Architecture-driven Software Development is the first comprehensive guide to the underlying skills embodied in the IEEE's Software Engineering Body of Knowledge (SWEBOK) standard. Standards expert Richard Schmidt explains the traditional software engineering practices recognized for developing projects for government or corporate systems. Software engineering education often lacks standardization, with many institutions focusing on implementation rather than design as it impacts product architecture. Many graduates join the workforce with incomplete skills.

  19. Implementación de estrategias curriculares en asignaturas de segundo año de la Licenciatura en Enfermería / Implementation of curricular strategies in Nursing 2nd year subjects

    Scientific Electronic Library Online (English)

    María Cristina, Pérez Guerrero; Maité, Suárez Fernández; Alina, Carrasco Milanés.

    2013-04-01

    Full Text Available Se presenta una revisión bibliográfica en la que se valora la implementación de las estrategias curriculares en la Licenciatura en Enfermería, a partir de las asignaturas impartidas en el segundo año de la carrera. Se destaca la importancia de estos recursos pedagógicos en la formación de los estudiantes, su desarrollo y contribución a la solución de situaciones relacionadas con el cuidado, la calidad de la atención de salud, la disminución de eventos adversos y la seguridad del paciente. Las estrategias curriculares vinculadas a la carrera de Enfermería constituyen una forma particular de desarrollar el proceso docente, caracterizadas por una direccionalidad coordinada que responde al perfil de salida del egresado, en la que se imbrican los contenidos y métodos teóricos y prácticos de las unidades curriculares correspondientes al plan de estudio, a partir de una estructura metodológica que garantiza su funcionamiento. Ello contribuye a la formación integral de un profesional competente. Abstract in English: A bibliographic revision is presented in order to assess the implementation of curricular strategies in Nursing 2nd year subjects. The importance of teaching aids in students' training, their development and contribution to solve issues related to care and quality of the health system, the lowering of adverse events and the patient's safety are pointed out. Curricular strategies related to Nursing represent a particular way to develop the teaching process, characterized by a coordinated direction which responds to the graduate's exit profile, in which theoretical and practical methods and contents of the curricular units that belong to the syllabus are interwoven, starting from a methodological structure that ensures their functioning. It contributes to the comprehensive formation of a competent professional.

  20. Implementación de estrategias curriculares en asignaturas de segundo año de la Licenciatura en Enfermería Implementation of curricular strategies in Nursing 2nd year subjects

    Directory of Open Access Journals (Sweden)

    María Cristina Pérez Guerrero

    2013-04-01

    Full Text Available Se presenta una revisión bibliográfica en la que se valora la implementación de las estrategias curriculares en la Licenciatura en Enfermería, a partir de las asignaturas impartidas en el segundo año de la carrera. Se destaca la importancia de estos recursos pedagógicos en la formación de los estudiantes, su desarrollo y contribución a la solución de situaciones relacionadas con el cuidado, la calidad de la atención de salud, la disminución de eventos adversos y la seguridad del paciente. Las estrategias curriculares vinculadas a la carrera de Enfermería constituyen una forma particular de desarrollar el proceso docente, caracterizadas por una direccionalidad coordinada que responde al perfil de salida del egresado, en la que se imbrican los contenidos y métodos teóricos y prácticos de las unidades curriculares correspondientes al plan de estudio, a partir de una estructura metodológica que garantiza su funcionamiento. Ello contribuye a la formación integral de un profesional competente. A bibliographic revision is presented in order to assess the implementation of curricular strategies in Nursing 2nd year subjects. The importance of teaching aids in students' training, their development and contribution to solve issues related to care and quality of the health system, the lowering of adverse events and the patient's safety are pointed out. Curricular strategies related to Nursing represent a particular way to develop the teaching process, characterized by a coordinated direction which responds to the graduate's exit profile, in which theoretical and practical methods and contents of the curricular units that belong to the syllabus are interwoven, starting from a methodological structure that ensures their functioning. It contributes to the comprehensive formation of a competent professional.