WorldWideScience

Sample records for 2nd generation software

  1. BASE - 2nd generation software for microarray data management and analysis

    Directory of Open Access Journals (Sweden)

    Nordborg Nicklas

    2009-10-01

    Full Text Available Abstract Background Microarray experiments are increasing in size and samples are collected asynchronously over long periods of time. Available data are re-analysed as more samples are hybridized. Systematic use of collected data requires tracking of biomaterials, array information, raw data, and assembly of annotations. To meet the information tracking and data analysis challenges in microarray experiments we reimplemented and improved BASE version 1.2. Results The new BASE presented in this report is a comprehensive, annotatable, local microarray data repository and analysis application providing researchers with an efficient information management and analysis tool. The information management system tracks all material from biosource, via sample and through extraction and labelling, to raw data and analysis. All items in BASE can be annotated and the annotations can be used as experimental factors in downstream analysis. BASE stores all microarray experiment related data regardless of whether analysis tools for specific techniques or data formats are readily available. The BASE team is committed to continue improving and extending BASE to make it usable for even more experimental setups and techniques, and we encourage other groups to target their specific needs by leveraging the infrastructure provided by BASE. Conclusion BASE is a comprehensive management application for information, data, and analysis of microarray experiments, available as free open source software at http://base.thep.lu.se under the terms of the GPLv3 license.
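
    The biosource-to-analysis tracking chain described above is essentially a lineage data model with free-form annotations attached at every step. The sketch below is an illustrative Python rendering of that idea, with hypothetical class and field names that are not taken from the BASE code base:

    # Minimal sketch of a biosource -> sample -> extract -> raw data lineage
    # with annotations usable as experimental factors (hypothetical names,
    # not the actual BASE data model).
    from dataclasses import dataclass, field
    from typing import Dict, List, Optional

    @dataclass
    class Item:
        name: str
        kind: str                      # e.g. "biosource", "sample", "extract", "rawdata"
        parent: Optional["Item"] = None
        annotations: Dict[str, str] = field(default_factory=dict)

        def lineage(self) -> List["Item"]:
            """Walk back from this item to its original biosource."""
            chain, node = [], self
            while node is not None:
                chain.append(node)
                node = node.parent
            return list(reversed(chain))

    biosource = Item("patient_42", "biosource", annotations={"diagnosis": "lymphoma"})
    sample = Item("tumour_biopsy", "sample", parent=biosource)
    extract = Item("RNA_extract_1", "extract", parent=sample, annotations={"labelling": "Cy3"})
    rawdata = Item("array_scan_0007", "rawdata", parent=extract)

    # Annotations anywhere along the lineage can serve as experimental factors.
    factors = {k: v for item in rawdata.lineage() for k, v in item.annotations.items()}
    print(factors)   # {'diagnosis': 'lymphoma', 'labelling': 'Cy3'}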

  2. 2nd & 3rd Generation Vehicle Subsystems

    Science.gov (United States)

    2000-01-01

    This paper contains a viewgraph presentation on the "2nd & 3rd Generation Vehicle Subsystems" project. The objective of this project is to design, develop and test advanced avionics, power systems, power control and distribution components and subsystems for insertion into a highly reliable and low-cost system for Reusable Launch Vehicles (RLV). The project is divided into two sections: 3rd Generation Vehicle Subsystems and 2nd Generation Vehicle Subsystems. The following topics are discussed under the first section, 3rd Generation Vehicle Subsystems: supporting the NASA RLV program; high-performance guidance & control adaptation for future RLVs; Evolvable Hardware (EHW) for 3rd generation avionics description; Scaleable, Fault-tolerant Intelligent Network of X(trans)ducers (SFINIX); advanced electric actuation devices and subsystem technology; hybrid power sources and regeneration technology for electric actuators; and intelligent internal thermal control. Topics discussed in the 2nd Generation Vehicle Subsystems program include: design, development and test of robust, low-maintenance avionics with no active cooling requirements and autonomous rendezvous and docking systems; design and development of low-maintenance, high-reliability, intelligent power systems (fuel cells and battery); and design of low-cost, low-maintenance, high-horsepower actuation systems (actuators).

  3. 2nd Generation Alkaline Electrolysis : Final report

    DEFF Research Database (Denmark)

    Yde, Lars; Kjartansdóttir, Cecilia Kristin

    2013-01-01

    This report provides the results of the 2nd Generation Alkaline Electrolysis project which was initiated in 2008. The project has been conducted from 2009-2012 by a consortium comprising Århus University Business and Social Science – Centre for Energy Technologies (CET (former HIRC)), Technical University of Denmark – Mechanical Engineering (DTU-ME), Technical University of Denmark – Energy Conversion (DTU-EC), FORCE Technology and GreenHydrogen.dk. The project has been supported by EUDP.

  4. 2nd Generation alkaline electrolysis. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Yde, L. [Aarhus Univ. Business and Social Science - Centre for Energy Technologies (CET), Aarhus (Denmark)]; Kjartansdottir, C.K. [Technical Univ. of Denmark. DTU Mechanical Engineering, Kgs. Lyngby (Denmark)]; Allebrod, F. [Technical Univ. of Denmark. DTU Energy Conversion, DTU Risoe Campus, Roskilde (Denmark)] [and others]

    2013-03-15

    The overall purpose of this project has been to contribute to this load management by developing a 2{sup nd} generation alkaline electrolysis system characterized by being compact, reliable, inexpensive and energy efficient. The specific targets for the project have been to: 1) Increase cell efficiency to more than 88% (relative to the higher heating value (HHV)) at a current density of 200 mA/cm{sup 2}; 2) Increase operation temperature to more than 100 degrees Celsius to make the cooling energy more valuable; 3) Obtain an operation pressure of more than 30 bar, thereby minimizing the need for further compression of hydrogen for storage; 4) Improve the stack architecture, decreasing the price of the stack by at least 50%; 5) Develop a modular design making it easy to customize plants in the size range from 20 to 200 kW; 6) Demonstrate a 20 kW 2{sup nd} generation stack in H2College at the campus of Aarhus University in Herning. The project has included research and development on three different electrode technology tracks: electrochemical plating, atmospheric plasma spray (APS) and, finally, a high temperature and pressure (HTP) track with operating temperature around 250 deg. C and pressure around 40 bar. The results show that all three electrode tracks have reached high energy efficiencies. In the electrochemical plating track a stack efficiency of 86.5% at a current density of 177 mA/cm{sup 2} and a temperature of 74.4 deg. C has been shown. The APS track showed cell efficiencies of 97%; however, coatings for the anode side still need to be developed. The HTP cell has reached 100% electric efficiency operating at 1.5 V (the thermoneutral voltage) with a current density of 1.1 A/cm{sup 2}. This track only tested small cells in an externally heated laboratory set-up, and thus the thermal loss to the surroundings cannot be given. The goal set for the 2{sup nd} generation electrolyser system has been to generate 30 bar pressure in the cell stack. An obstacle to be overcome has been to ensure equalisation of the H{sub 2} and O{sub 2} pressures to avoid mixing of the gasses. To solve this problem, a special equilibrium valve has been developed to mechanically ensure that the pressure on the H{sub 2} side at all times equals that on the O{sub 2} side. The developments have resulted in a stack design which is a cylindrical pressure vessel, with each cell having a cell ''wall'' sufficiently thick to resist the high pressure and sealed with O-rings for perfect sealing at high pressures. In tests the stack has proved to withstand a pressure of 45 bar, though some adjustment is still needed to optimize the pressure resistance and efficiency. When deciding on the new stack design, both 'zero gap' and 'non-zero gap' concepts were considered. The zero gap design is more efficient than non-zero gap; however, the design is more complex and very costly, primarily because of the additional material and production costs for zero gap electrodes. From these considerations, the concept of a ''low gap'', low diameter, high pressure, high cell number electrolyser stack was born, which could offer an improved efficiency of the electrolyser without incurring the same high material and production costs as a zero gap solution. As a result, the low gap design and pressurized stack have reduced the price of the total system by 60%, as well as reducing the system footprint.
The progress of the project required a special focus on corrosion testing and examination of polymers in order to find alternative durable membrane and gasket materials. The initial literature survey and the first tests indicated that the chemical resistance of polymers presented a greater challenge than anticipated, and that test data from commercial suppliers were insufficient to model the conditions in the electrolyser. The alkali resistant polymers (e.g. Teflon) are costly and the search for cheaper alternatives turned into a major aim. A number of different tests were run under accelerated conditions and the degradation mechani
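
    For orientation, the cell efficiencies quoted earlier in this abstract (relative to HHV) follow directly from the cell voltage: the thermoneutral voltage of water electrolysis is about 1.48 V, so operating at that voltage corresponds to roughly 100% HHV efficiency, consistent with the HTP result at 1.5 V. A small illustrative calculation (an assumed simple relation, not the report's own efficiency definition):

    # Illustrative HHV-based efficiency estimate for an electrolysis cell:
    # efficiency ~= V_thermoneutral / V_cell (assumed relation; the project's
    # efficiency definition may include further terms).
    V_TN = 1.48  # V, approximate thermoneutral voltage of water electrolysis (HHV basis)

    def hhv_efficiency(v_cell: float) -> float:
        return V_TN / v_cell

    print(f"{hhv_efficiency(1.48):.1%}")  # ~100%, as for the HTP cell at ~1.5 V
    print(f"{hhv_efficiency(1.71):.1%}")  # ~86.5%, i.e. a cell voltage near 1.71 V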

  5. Software for aerospace education: A bibliography, 2nd edition

    Science.gov (United States)

    Vogt, Gregory L.; Roth, Susan Kies; Phelps, Malcom V.

    1990-01-01

    This is the second aerospace education software bibliography to be published by the NASA Educational Technology Branch in Washington, DC. Unlike many software bibliographies, this bibliography does not evaluate and grade software according to its quality and value to the classroom, nor does it make any endorsements or warrant scientific accuracy. Rather, it describes software, its subject, approach, and technical details. This bibliography is intended as a convenience to educators. The specific software included represents replies to more than 300 queries to software producers for aerospace education programs.

  6. Super Boiler 2nd Generation Technology for Watertube Boilers

    Energy Technology Data Exchange (ETDEWEB)

    Mr. David Cygan; Dr. Joseph Rabovitser

    2012-03-31

    This report describes Phase I of a proposed two-phase project to develop and demonstrate an advanced industrial watertube boiler system with the capability of reaching 94% (HHV) fuel-to-steam efficiency and emissions below 2 ppmv NOx, 2 ppmv CO, and 1 ppmv VOC on natural gas fuel. The boiler design would have the capability to produce >1500 F, >1500 psig superheated steam, burn multiple fuels, and would be 50% smaller/lighter than currently available watertube boilers of similar capacity. This project builds upon the successful Super Boiler project at GTI, which employed a unique two-staged intercooled combustion system and an innovative heat recovery system to reduce NOx to below 5 ppmv and demonstrated a fuel-to-steam efficiency of 94% (HHV). This project was carried out under the leadership of GTI with project partners Cleaver-Brooks, Inc., Nebraska Boiler, a Division of Cleaver-Brooks, and Media and Process Technology Inc., and project advisors Georgia Institute of Technology, Alstom Power Inc., Pacific Northwest National Laboratory and Oak Ridge National Laboratory. Phase I efforts focused on developing 2nd generation boiler concepts and performance modeling; incorporating multi-fuel (natural gas and oil) capabilities; assessing heat recovery, heat transfer and steam superheating approaches; and developing the overall conceptual engineering boiler design. Based on our analysis, the 2nd generation industrial watertube boiler, when developed and commercialized, could potentially save 265 trillion Btu and $1.6 billion in fuel costs across U.S. industry through increased efficiency. Its ultra-clean combustion could eliminate 57,000 tons of NOx, 460,000 tons of CO, and 8.8 million tons of CO2 annually from the atmosphere. Reduction in boiler size will bring cost-effective package boilers into a size range previously dominated by more expensive field-erected boilers, benefiting manufacturers and end users through lower capital costs.
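
    The fuel-cost savings claimed above scale with the efficiency gain. A rough illustration (the baseline efficiency here is an assumption for the example, not a figure from the report):

    # Fraction of fuel saved when boiler efficiency rises from eta_old to eta_new
    # for the same steam output (baseline value is an assumed example).
    def fuel_saving(eta_old: float, eta_new: float) -> float:
        return 1.0 - eta_old / eta_new

    print(f"{fuel_saving(0.82, 0.94):.1%}")   # ~12.8% less fuel for the same steam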

  7. A Zero-Dimensional Model of a 2nd Generation Planar SOFC Using Calibrated Parameters

    Directory of Open Access Journals (Sweden)

    Brian Elmegaard

    2006-12-01

    Full Text Available This paper presents a zero-dimensional mathematical model of a planar 2nd generation coflow SOFC developed for simulation of power systems. The model accounts for the electrochemical oxidation of hydrogen as well as the methane reforming reaction and the water-gas shift reaction. An important part of the paper is the electrochemical sub-model, where experimental data was used to calibrate specific parameters. The SOFC model was implemented in the DNA simulation software which is designed for energy system simulation. The result is an accurate and flexible tool suitable for simulation of many different SOFC-based power systems.
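
    As a rough illustration of what a zero-dimensional electrochemical sub-model involves, the sketch below computes a cell voltage as a Nernst potential minus a lumped area-specific loss. It is a generic textbook formulation with assumed parameter values, not the calibrated DNA model described in the paper:

    # Generic 0-D SOFC cell-voltage estimate: Nernst potential minus a lumped
    # area-specific resistance (ASR) loss. Parameter values are assumptions for
    # illustration only, not the calibrated values used in the paper.
    import math

    F = 96485.0        # C/mol
    R = 8.314          # J/(mol K)

    def cell_voltage(j, T=1023.0, E0=0.95, p_H2=0.5, p_H2O=0.5, p_O2=0.21, asr=0.3e-4):
        """j in A/m^2, T in K, partial pressures in bar, asr in ohm*m^2."""
        nernst = E0 + (R * T / (2 * F)) * math.log(p_H2 * math.sqrt(p_O2) / p_H2O)
        return nernst - asr * j

    j = 3000.0                    # A/m^2 (0.3 A/cm^2)
    v = cell_voltage(j)
    power_density = v * j         # W/m^2
    print(f"V = {v:.3f} V, P = {power_density/1e4:.3f} W/cm^2")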

  8. Aging Studies of 2nd Generation BaBar RPCs

    Energy Technology Data Exchange (ETDEWEB)

    Band, H.R.; /SLAC

    2007-09-25

    The BaBar detector, operating at the PEP-II B factory of the Stanford Linear Accelerator Center (SLAC), installed over 200 2nd generation Resistive Plate Chambers (RPCs) in 2002. The streamer rates produced by backgrounds and signals from normal BaBar running vary considerably (0.1 to >20 Hz/cm2) depending on the layer and position of the chambers, thus providing a broad spectrum test of RPC performance and aging. The lowest rate chambers have performed very well, with stable efficiencies averaging 95%. Other chambers had rate-dependent inefficiencies due to Bakelite drying, which were reversed by the introduction of humidified gases. RPC inefficiencies in the highest rate regions of the higher rate chambers have been observed and also found to be rate dependent. The inefficient regions grow with time and have not yet been reduced by operation with humidified input gas. Three of these chambers were converted to avalanche mode operation and display significantly improved efficiencies. The rate of production of HF in the RPC exhaust gases was measured in avalanche and streamer mode RPCs and found to be comparable despite the lower current of the avalanche mode RPCs.

  9. Operations Analysis of the 2nd Generation Reusable Launch Vehicle

    Science.gov (United States)

    Noneman, Steven R.; Smith, C. A. (Technical Monitor)

    2002-01-01

    The Space Launch Initiative (SLI) program is developing a second-generation reusable launch vehicle. The program goals include lowering the risk of loss of crew to 1 in 10,000 and reducing annual operations cost to one third of the cost of the Space Shuttle. The SLI missions include NASA, military and commercial satellite launches and crew and cargo launches to the space station. The SLI operations analyses provide an assessment of the operational support and infrastructure needed to operate candidate system architectures. Measures of operability (i.e. system dependability, responsiveness, and efficiency) are estimated. Operations analysis is used to determine the impact of specific technologies on operations. A conceptual path to reducing annual operations costs by two thirds is based on key design characteristics, such as reusability, and improved processes lowering labor costs. New operations risks can be expected to emerge. They can be mitigated through effective risk management, with careful identification, assignment, tracking, and closure. SLI design characteristics such as nearly full reusability, high reliability, advanced automation, and lowered maintenance and servicing, coupled with improved processes, are contributors to operability and large operating cost reductions.

  10. Boosting biogas yield of anaerobic digesters by utilizing concentrated molasses from 2nd generation bioethanol plant

    OpenAIRE

    Shiplu Sarker, Henrik Bjarne Møller

    2013-01-01

    Concentrated molasses (C5 molasses) from a 2nd generation bioethanol plant has been investigated for enhancing the productivity of manure-based digesters. A batch study at mesophilic conditions (35±1°C) showed a maximum methane yield from molasses of 286 L CH4/kg VS, which was approximately 63% of the calculated theoretical yield. In addition to the batch study, co-digestion of molasses with cattle manure in a semi-continuously stirred reactor at thermophilic temperature (50±1°C) was also performed...
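
    For orientation, the two figures quoted above imply a calculated theoretical potential of roughly 286 / 0.63 ≈ 450 L CH4/kg VS for the molasses (a back-calculation from the numbers in this abstract, not an independently reported value).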

  11. Effects of Thermal Cycling on Control and Irradiated EPC 2nd Generation GaN FETs

    Science.gov (United States)

    Patterson, Richard L.; Scheick, Leif; Lauenstein, Jean-Marie; Casey, Megan; Hammoud, Ahmad

    2013-01-01

    The power systems for use in NASA space missions must work reliably under harsh conditions including radiation, thermal cycling, and exposure to extreme temperatures. Gallium nitride semiconductors show great promise, but information pertaining to their performance is scarce. Gallium nitride N-channel enhancement-mode field effect transistors made by EPC Corporation in a 2nd generation of manufacturing were exposed to radiation followed by long-term thermal cycling in order to address their reliability for use in space missions. Results of the experimental work are presented and discussed.

  12. Angiographic patterns of restenosis with 2nd generation drug-eluting stent.

    Science.gov (United States)

    Lee, Sahmin; Yoon, Chang-Hwan; Oh, Il-Young; Suh, Jung-Won; Cho, Young-Seok; Cho, Goo-Yeong; Chae, In-Ho; Choi, Dong-Ju; Youn, Tae-Jin

    2015-01-21

    The angiographic features of restenosis contain prognostic information. However, restenosis patterns of the new generation drug-eluting stents (DES), the everolimus- (EES) and resolute zotarolimus-eluting stents (ZES), have not been described. A total of 210 consecutive patients with DES restenosis were enrolled from 2003 to 2012. We analyzed 217 restenotic lesions after DES implantation, and compared the morphologic characteristics of the 2nd generation DES restenosis to those of restenosis with two 1st generation DES, the sirolimus- (SES) and paclitaxel-eluting stents (PES). Baseline characteristics were comparable between the different stent groups. The incidence of focal restenosis was significantly lower for PES than for the other stents (49.5% versus 87.0%, 76.2%, and 82.1% for PES versus SES, EES, and ZES, respectively, P < 0.001). When considering the pattern of restenosis solely within the stent margins, a further clear distinction between PES and the other stents was observed (40.0% versus 92.9%, 88.9%, and 81.2% in PES versus SES, EES, and ZES, respectively, P < 0.001). There were no significant differences in restenosis patterns among SES, EES, and ZES. In multivariate analysis, PES implantation, hypertension, and age were associated with a non-focal type of restenosis after DES implantation. After the introduction of EES and ZES into routine clinical practice in 2008, focal restenosis significantly increased from 63.9% to 76.7% and diffuse restenosis significantly decreased from 26.4% to 11.0% (P = 0.045). Focal restenosis was the most common pattern of restenosis in the new generation DES and the incidence of diffuse restenosis significantly decreased with the introduction of the 2nd generation DES. PMID:25503656

  13. The New 2nd-Generation SRF R&D Facility at Jefferson Lab: TEDF

    Energy Technology Data Exchange (ETDEWEB)

    Reece, Charles E.; Reilly, Anthony V.

    2012-09-01

    The US Department of Energy has funded a near-complete renovation of the SRF-based accelerator research and development facilities at Jefferson Lab. The project to accomplish this, the Technical and Engineering Development Facility (TEDF) Project, has completed the first of two phases. An entirely new 3,100 m{sup 2} purpose-built SRF technical work facility has been constructed and was occupied in the summer of 2012. All SRF work processes with the exception of cryogenic testing have been relocated into the new building. All cavity fabrication, processing, thermal treatment, chemistry, cleaning, and assembly work is collected conveniently into a new LEED-certified building. An innovatively designed 800 m{sup 2} cleanroom/chemroom suite provides long-term flexibility for support of multiple R&D and construction projects as well as continued process evolution. The characteristics of this first 2nd-generation SRF facility are described.

  14. Improved beam spot measurements in the 2nd generation proton beam writing system

    International Nuclear Information System (INIS)

    Nanosized ion beams (especially proton and helium) play a pivotal role in the field of ion beam lithography and ion beam analysis. Proton beam writing has shown lithographic details down to the 20 nm level, limited by the proton beam spot size. Introducing a smaller spot size will allow smaller lithographic features. Smaller probe sizes will also drastically improve the spatial resolution for ion beam analysis techniques. Among many other requirements, having an ideal resolution standard for beam focusing, together with a reliable focusing method, is an important pre-requisite for sub-10 nm beam spot focusing. In this paper we present the fabrication processes of a free-standing resolution standard with reduced side-wall projection and high side-wall verticality. The resulting grid is orthogonal (90.0° ± 0.1°) and has smooth edges with better than 6 nm side-wall projection. The new resolution standard has been used in focusing a 2 MeV H2+ beam in the 2nd generation PBW system at the Center for Ion Beam Applications, NUS. The beam size has been characterized using on- and off-axis scanning transmission ion microscopy (STIM) and ion induced secondary electron detection, carried out with a newly installed micro channel plate electron detector. The latter has been shown to be a realistic alternative to STIM measurements, as the drawback of PIN diode detector damage is alleviated. With these improvements we show reproducible beam focusing down to 14 nm.

  15. Generation of higher order Gauss-Laguerre modes in single-pass 2nd harmonic generation

    DEFF Research Database (Denmark)

    Buchhave, Preben; Tidemand-Lichtenberg, Peter

    2008-01-01

    We present a realistic method for dynamic simulation of the development of higher order modes in second harmonic generation. The deformation of the wave fronts due to the nonlinear interaction is expressed by expansion in higher order Gauss-Laguerre modes.
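
    The expansion referred to above has the generic form of a standard Gauss-Laguerre mode decomposition (written here from textbook definitions, not taken from the paper itself):

        E(r,\phi,z) = \sum_{p=0}^{\infty} \sum_{l=-\infty}^{\infty} c_{pl}(z)\, \mathrm{LG}_p^l(r,\phi,z),
        \qquad
        c_{pl}(z) = \int_0^{2\pi}\!\int_0^{\infty} E(r,\phi,z)\, \mathrm{LG}_p^{l\,*}(r,\phi,z)\, r\,dr\,d\phi,

    where the LG_p^l are the orthonormal Gauss-Laguerre modes of radial order p and azimuthal order l, and the nonlinear interaction that deforms the wave fronts shows up as a redistribution of power among the coefficients c_{pl} during propagation.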

  16. 1st or 2nd generation bioethanol-impacts of technology integration & on feed production and land use

    DEFF Research Database (Denmark)

    Bentsen, Niclas Scott; Felby, Claus

    2009-01-01

    "1st or 2nd generation bioethanol-impacts of technology integration & on feed production and land use" Liquid bio fuels are perceived as a means of mitigating CO2 emissions from transport and thus climate change, but much concern has been raised to the energy consumption from refining biomass to liquid fuels. Integrating technologies such that waste stream can be used will reduce energy consumption in the production of bioethanol from wheat. We show that the integration of bio refining and combined heat an power generation reduces process energy requirements with 30-40 % and makes bioethanol production comparable to gasoline production in terms of energy loss. Utilisation of biomass in the energy sector is inevitably linked to the utilisation of land. This is a key difference between fossil and bio based energy systems. Thus evaluations of bioethanol production based on energy balances alone are inadequate. 1st and 2nd generation bioethanol production exhibits major differences when evaluated on characteristics as feed energy and feed protein production and subsequently on land use changes. 1st generation bioethanol production based on wheat grain in Denmark may in fact reduce the pressure on agricultural land on a global scale, but increase the pressure on local/national scale. In contrast to that 2nd generation bioethanol based on wheat straw exhibits a poorer energy balance than 1st generation, but the induced imbalances on feed energy are smaller. Proteins are some of the plant components with the poorest bio synthesis efficiency and as such the area demand for their production is relatively high. Preservation of the proteins in the biomass such as in feed by-products from bioethanol production is of paramount importance in developing sustainable utilisation of biomass in the energy sector.

  17. White paper on perspectives of biofuels in Denmark - with focus on 2nd generation bioethanol; Hvidbog om perspektiver for biobraendstoffer i Danmark - med fokus paa 2. generations bioethanol

    Energy Technology Data Exchange (ETDEWEB)

    Larsen, Gy.; Foghmar, J.

    2009-11-15

    The white paper presents the perspectives - both options and barriers - for a Danish focus on production and use of biomass, including sustainable 2nd generation bioethanol, for transport. The white paper presents the current knowledge of biofuels and bioethanol and recommendations for a Danish strategy. (ln)

  18. Software of second generation

    International Nuclear Information System (INIS)

    While most of the subroutines in the existing software libraries for scientific computation are of general, unspecific use, there is increasing pressure to develop subroutines specially tailored for specific problems. Such ad-hoc modules are usually described as belonging to second generation software. The talk is aimed at presenting a few subroutines of this type, from the perspective of a physicist. Typical numerical operations on functions (differentiation, integration, solving differential equations, interpolation) will be considered. (author)

  19. Clinical evaluation of the 2nd generation radio-receptor assay for anti-thyrotropin receptor antibodies (TRAb) in Graves' disease

    International Nuclear Information System (INIS)

    Full text: Detection of autoantibodies to the TSH receptor by radioreceptor assays (RRA) is widely requested in clinical practice for the diagnosis of Graves' disease and its differentiation from diffuse thyroid autonomy. Additionally, TRAb measurement during antithyroid drug treatment can be useful to evaluate the risk of disease relapse after therapy discontinuation. Nevertheless, some patients affected by Graves' disease are TRAb negative when the 1st generation assay is used. Recently a new RRA method for the TRAb assay was developed using human recombinant TSH-receptor and a solid-phase technique. The aim of our work was the comparison between 1st and 2nd generation TRAb assays in Graves' disease patients and, particularly, the evaluation of the 2nd generation test in a sub-group of patients affected by Graves' disease but with a negative 1st generation TRAb assay. We evaluated the diagnostic performance of a newly developed 2nd generation TRAb assay (DYNOtest(r) TRAK human, BRAHMS Diagnostica GmbH, Germany) in 46 patients affected by Graves' disease with a negative 1st generation TRAb assay (TRAK Assay(r), BRAHMS Diagnostica GmbH, Germany). Control groups of 50 Graves' disease patients with a positive 1st generation TRAb assay, 50 patients affected by Hashimoto's thyroiditis and 50 patients affected by nodular goiter were also examined. 41 out of 46 patients affected by Graves' disease with a negative 1st generation TRAb assay showed a positive 2nd generation test. The overall sensitivity of the 2nd generation test was significantly improved with respect to the 1st generation assay in Graves' disease patients (χ2 = 22.5, p < 0.0001). 1 and 3 out of 50 patients affected by Hashimoto's thyroiditis were positive by the 1st and 2nd generation TRAb assays, respectively. All these patients showed primary hypothyroidism. No differences resulted in the euthyroid Hashimoto's thyroiditis sub-group or in the nodular goiter control group. The 2nd generation TRAb assay is clearly more sensitive than the 1st generation test and should be used in clinical practice to minimize the incidence of TRAb negative Graves' disease. Long-term prospective studies are needed to evaluate the prognostic role of the 2nd generation TRAb assay in Graves' disease treated by antithyroid drugs. (author)

  20. Bellman's GAP : a 2nd generation language and system for algebraic dynamic programming

    OpenAIRE

    Sauthoff, Georg

    2010-01-01

    The dissertation describes the new Bellman's GAP, which is a programming system for writing dynamic programming algorithms over sequential data. It is the second generation implementation of the algebraic dynamic programming framework (ADP). The system includes the multi-paradigm language (GAP-L), its compiler (GAP-C), functional modules (GAP-M) and a web site (GAP Pages) to experiment with GAP-L programs. GAP-L includes declarative constructs, e.g. tree grammars to model the search space, an...
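
    The core ADP idea mentioned above, separating a grammar that models the search space from an evaluation algebra that scores it, can be sketched outside GAP-L as well. The toy example below (plain Python, not GAP-L syntax) evaluates the same bracketing "grammar" under two different algebras, one minimizing multiplication cost and one counting candidates:

    # Toy illustration of algebraic dynamic programming: one search space (all
    # bracketings of a chain of matrices), two evaluation algebras.
    from functools import lru_cache

    def bracketings(dims, algebra):
        """dims: matrix dimensions, matrix i has shape (dims[i], dims[i+1])."""
        leaf, combine, choice = algebra
        @lru_cache(maxsize=None)
        def go(i, j):                       # candidates for the chain i..j-1
            if j - i == 1:
                cands = [leaf(dims[i], dims[i + 1])]
            else:
                cands = [combine(a, b)
                         for k in range(i + 1, j)
                         for a in go(i, k) for b in go(k, j)]
            return tuple(choice(cands))
        return go(0, len(dims) - 1)

    # Scoring algebra: candidate = (rows, cols, multiplication cost), keep the cheapest.
    scoring = (lambda r, c: (r, c, 0),
               lambda a, b: (a[0], b[1], a[2] + b[2] + a[0] * a[1] * b[1]),
               lambda cands: [min(cands, key=lambda x: x[2])])

    # Counting algebra: candidate = 1 per bracketing, choice sums them up.
    counting = (lambda r, c: 1,
                lambda a, b: a * b,
                lambda cands: [sum(cands)])

    dims = (10, 30, 5, 60)
    print(bracketings(dims, scoring))   # ((10, 60, 4500),) -> optimal cost 4500
    print(bracketings(dims, counting))  # (2,)              -> two bracketings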

  1. Cogeneration and production of 2nd generation bio fuels using biomass gasification; Cogeneracion y produccion de biocombustibles de 2 generacion mediante gasificacion de biomasa

    Energy Technology Data Exchange (ETDEWEB)

    Uruena Leal, A.; Diez Rodriguez, D.; Antolin Giraldo, G.

    2011-07-01

    The thermochemical decomposition process of gasification, in which a carbonaceous fuel, under certain conditions of temperature and oxygen deficiency, undergoes a series of reactions that produce gaseous products, is now widely used because of its high energy performance and the versatility of these gaseous products for energy and 2nd generation bio fuels, and because it reduces the emission of greenhouse gases. (Author)

  2. Strategies for 2nd generation biofuels in EU - Co-firing to stimulate feedstock supply development and process integration to improve energy efficiency and economic competitiveness

    International Nuclear Information System (INIS)

    The present biofuel policies in the European Union primarily stimulate 1st generation biofuels that are produced from conventional food crops. They may be a distraction from lignocellulose-based 2nd generation biofuels - and also from biomass use for heat and electricity - by keeping farmers' attention and significant investments focused on first generation biofuels and the cultivation of conventional food crops as feedstocks. This article presents two strategies that can contribute to the development of 2nd generation biofuels based on lignocellulosic feedstocks. The integration of gasification-based biofuel plants in district heating systems is one option for increasing the energy efficiency and improving the economic competitiveness of such biofuels. Another option, biomass co-firing with coal, generates high-efficiency biomass electricity and reduces CO2 emissions by replacing coal. It also offers a near-term market for lignocellulosic biomass, which can stimulate the development of supply systems for biomass that is also suitable as feedstock for 2nd generation biofuels. Regardless of the long-term priorities of biomass use for energy, the stimulation of lignocellulosic biomass production through the development of near-term and cost-effective markets is judged to be a no-regrets strategy for Europe. Strategies that induce relevant development and exploit existing energy infrastructures in order to reduce risk and lower costs are proposed as an attractive complement to present and prospective biofuel policies.

  3. Reproductive performance, biochemical composition and fatty acid profile of wild-caught and 2nd generation domesticated Farfantepenaeus duorarum (Burkenroad, 1939) broodstock

    OpenAIRE

    Emerenciano, Mauricio; Cuzon, Gerard; Mascaro, Maite; Arevalo, Miguel; Norena-barroso, Elsa; Jeronimo, Gilberto; Racotta, Ilie; Gaxiola, Gabriela

    2012-01-01

    A 30-day trial was performed to evaluate the reproductive performance of wild-caught and 2nd generation domesticated Farfantepenaeus duorarum broodstock. Changes in biochemical composition and fatty acid (FA) profiles in 1st and 4th spawn order females were used as indicators of nutritional condition. The wild population of F. duorarum presented significantly better reproductive outcomes than the domesticated one. Wild spawners achieved a significantly higher number of eggs per spawn, number of...

  4. Development of WWER-440 fuel. Use of fuel assemblies of 2-nd and 3-rd generations with increased enrichment

    International Nuclear Information System (INIS)

    The problem of increasing the power of units at NPPs with WWER-440 is of current importance. All the necessary prerequisites for solving it exist as a result of updating the design of fuel assemblies and codes. The decrease of the power peaking factor in the core is achieved by using profiled fuel assemblies, fuel with integrated burnable absorber, FAs with a modernized docking unit, and modern codes, which allow the conservatism of the RP safety substantiation to be decreased. A wide range of experimental studies has been performed on the behaviour of fuel that has reached burn-ups of 50-60 MW·day/kgU in transient and emergency conditions, and post-reactor studies of fuel assemblies, fuel rods and fuel pellets with a 5-year operating period have been performed; these prove the high reliability of the fuel and the presence of a large margin in the fuel column, which supports reactor operation at increased power. The results of the work performed on the introduction of 5-6 year fuel cycles show that the ultimate operability limit of fuel in WWER-440 reactors is far from being reached. Neutron-physical and thermal-hydraulic characteristics of the cores of operating power units with RP V-213 are such that the actual (design and measured) power peaking factors on fuel assemblies and fuel rods are, as a rule, smaller than the maximum design values. This factor is a real reserve for power uprating. There is experience of operating Units 1, 2 and 4 of the Kola NPP and Unit 2 of the Rovno NPP at increased power. Units of the Loviisa NPP are operated at 109% power. For transition to operation at increased power it is reasonable to use fuel assemblies with an increased height of the fuel column, which allows the average linear power to be decreased. Further development of the 2nd generation fuel assembly design and the subsequent transition to 3rd generation fuel assemblies provide a significant improvement of fuel utilization under the conditions of WWER-440 reactor operation with longer fuel cycles and increased power.

  5. BASE--2nd generation software for microarray data management and analysis.

    OpenAIRE

    Vallon-Christersson, Johan; Nordborg, Nicklas; Svensson, Martin; Häkkinen, Jari

    2009-01-01

    BACKGROUND: Microarray experiments are increasing in size and samples are collected asynchronously over long time. Available data are re-analysed as more samples are hybridized. Systematic use of collected data requires tracking of biomaterials, array information, raw data, and assembly of annotations. To meet the information tracking and data analysis challenges in microarray experiments we reimplemented and improved BASE version 1.2. RESULTS: The new BASE presented in this report is a compr...

  6. Evaluation of the Available Means of Support for Integration of 2nd Generation Immigrants : Pasila Community Center

    OpenAIRE

    Kähärä, Pertti; Ngugi, Simon

    2013-01-01

    The objective of this Bachelor's thesis was to evaluate the available means of support for the integration of second generation immigrants at the Pasila Community Center. The aim was to establish the experiences of the workers and young clients who frequent the community center, while at the same time gaining insight into how the various activities support the young clients. In this study, we have detailed what immigration is and shown the historical trend and demographics in Finland. We h...

  7. Implementation of 2nd-order QCD 3-jet matrix elements in Monte Carlo generators for e+e- annihilation

    International Nuclear Information System (INIS)

    Matrix elements to be used in second order Monte Carlo generators for e+e- annihilation have been derived from the O(α{sub s}{sup 2}) calculations by Ellis, Ross and Terrano and by Kramer and Lampe. They were incorporated into the JETSET 6.3 Lund String Monte Carlo program. The recombination scheme dependence of the O(α{sub s}{sup 2}) jet cross sections is studied in detail on the parton level (ERT) and on the hadron level (ERT and KL). (orig.)

  8. Control system for the 2nd generation Berkeley automounters (BAM2) at GM/CA-CAT macromolecular crystallography beamlines

    Energy Technology Data Exchange (ETDEWEB)

    Makarov, O., E-mail: makarov@anl.gov [GM/CA-CAT, Biosciences Division, Argonne National Laboratory, Argonne, IL 60439 (United States); Hilgart, M.; Ogata, C.; Pothineni, S. [GM/CA-CAT, Biosciences Division, Argonne National Laboratory, Argonne, IL 60439 (United States); Cork, C. [Physical Biosciences Division, Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States)

    2011-09-01

    GM/CA-CAT at Sector 23 of the Advanced Photon Source (APS) is an NIH funded facility for crystallographic structure determination of biological macromolecules by X-ray diffraction. A second-generation Berkeley automounter is being integrated into the beamline control system at the 23BM experimental station. This new device replaces the previous all-pneumatic gripper motions with a combination of pneumatics and XYZ motorized linear stages. The latter adds a higher degree of flexibility to the robot including auto-alignment capability, accommodation of a larger capacity sample Dewar of arbitrary shape, and support for advanced operations such as crystal washing, while preserving the overall simplicity and efficiency of the Berkeley automounter design.

  9. Contribution of ion beam analysis methods to the development of 2nd generation high temperature superconducting (HTS) wires

    Energy Technology Data Exchange (ETDEWEB)

    Usov, Igor O [Los Alamos National Laboratory; Arendt, Paul N [Los Alamos National Laboratory; Stan, Liliana [Los Alamos National Laboratory; Holesinger, Terry G [Los Alamos National Laboratory; Foltyn, Steven R [Los Alamos National Laboratory; Depaula, Raymond F [Los Alamos National Laboratory

    2009-01-01

    One of the crucial steps in the second generation high temperature superconducting wire program was the development of the buffer layer architecture. The architecture designed at the Superconductivity Technology Center at Los Alamos National Laboratory consists of several oxide layers wherein each layer plays a specific role, namely: nucleation layer, diffusion barrier, biaxially textured template, and an intermediate layer with a good match to the lattice parameter of the superconducting Y{sub 1}Ba{sub 2}Cu{sub 3}O{sub 7} (YBCO) compound. This report demonstrates how a wide range of ion beam analysis techniques (SIMS, RBS, channeling, PIXE, PIGE, NRA, ERD) was employed for analysis of each buffer layer and the YBCO films. These results assisted in the understanding of a variety of physical processes occurring during buffer layer fabrication and helped to optimize the buffer layer architecture as a whole.

  10. FT-IR Investigation of Hoveyda-Grubbs' 2nd Generation Catalyst in Self-Healing Epoxy Mixtures

    International Nuclear Information System (INIS)

    The development of smart composites capable of self-repair on aeronautical structures is still at the planning stage owing to complex issues to overcome. A very important issue to solve concerns the stability of the components of the proposed composites, which is compromised at the cure temperatures necessary for good performance of the composite. In this work we analyzed the possibility of applying Hoveyda-Grubbs' second generation catalyst (HG2) to develop self-healing systems. Our experimental results have shown critical issues in the use of epoxy precursors in conjunction with the Hoveyda-Grubbs II metathesis catalyst. However, an appropriate curing cycle of the self-healing mixture makes it possible to overcome these critical issues, allowing high curing temperatures without deactivating the self-repair activity.

  11. Automated Sequence Generation Process and Software

    Science.gov (United States)

    Gladden, Roy

    2007-01-01

    "Automated sequence generation" (autogen) signifies both a process and software used to automatically generate sequences of commands to operate various spacecraft. The autogen software comprises the autogen script plus the Activity Plan Generator (APGEN) program. APGEN can be used for planning missions and command sequences.

  12. Experimental Investigation of 2nd Generation Bioethanol Derived from Empty-fruit-bunch (EFB) of Oil-palm on Performance and Exhaust Emission of SI Engine

    Directory of Open Access Journals (Sweden)

    Yanuandri Putrasari

    2014-07-01

    Full Text Available The experimental investigation of 2nd generation bioethanol derived from EFB of oil-palm blended with gasoline at 10, 20 and 25% by volume, and of pure gasoline, was conducted through performance and exhaust emission tests on an SI engine. A four-stroke, four-cylinder, programmed fuel injection (PGMFI), 16-valve variable valve timing and electronic lift control (VTEC), single overhead camshaft (SOHC), 1,497 cm3 SI engine (Honda/L15A) was used in this investigation. The engine performance test was carried out for brake torque, power, and fuel consumption. The exhaust emission was analyzed for carbon monoxide (CO) and hydrocarbons (HC). The engine was operated over a speed range from 1,500 to 4,500 rev/min at 85% throttle opening. The results showed that the highest brake torque of the bioethanol blends was achieved by the 10% bioethanol content at 3,000 to 4,500 rpm, the brake power was greater than that of pure gasoline at 3,500 to 4,500 rpm for 10% bioethanol, and bioethanol-gasoline blends of 10 and 20% resulted in greater bsfc than pure gasoline at low speeds from 1,500 to 3,500 rpm. CO and HC emissions tended to decrease as the engine speed increased.
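
    The performance quantities compared above are related by standard definitions: brake power follows from torque and speed, and brake specific fuel consumption (bsfc) from fuel mass flow and brake power. A small sketch with assumed example numbers (not measurements from the paper); blends with a lower heating value than gasoline need a higher fuel mass flow for the same power, which is consistent with the higher bsfc reported for the 10-20% blends:

    # Standard SI-engine performance relations with assumed example values
    # (not data from the paper).
    import math

    def brake_power_kw(torque_nm: float, speed_rpm: float) -> float:
        """P = 2*pi*N*T, with N in rev/s and T in N*m."""
        return 2 * math.pi * (speed_rpm / 60.0) * torque_nm / 1000.0

    def bsfc_g_per_kwh(fuel_flow_kg_per_h: float, power_kw: float) -> float:
        return fuel_flow_kg_per_h * 1000.0 / power_kw

    P = brake_power_kw(torque_nm=110.0, speed_rpm=3000.0)          # ~34.6 kW
    print(f"brake power = {P:.1f} kW")
    print(f"bsfc        = {bsfc_g_per_kwh(9.5, P):.0f} g/kWh")     # ~275 g/kWh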

  13. Automatic Code Generation for Instrument Flight Software

    Science.gov (United States)

    Wagstaff, Kiri L.; Benowitz, Edward; Byrne, D. J.; Peters, Ken; Watney, Garth

    2008-01-01

    Automatic code generation can be used to convert software state diagrams into executable code, enabling a model-based approach to software design and development. The primary benefits of this process are reduced development time and continuous consistency between the system design (statechart) and its implementation. We used model-based design and code generation to produce software for the Electra UHF radios that is functionally equivalent to software that will be used by the Mars Reconnaissance Orbiter (MRO) and the Mars Science Laboratory to communicate with each other. The resulting software passed all of the relevant MRO flight software tests, and the project provides a useful case study for future work in model-based software development for flight software systems.
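
    As a schematic illustration of the statechart-to-code idea (not the actual tooling or state model used for the Electra radios), the sketch below turns a small transition table into generated, executable dispatch code:

    # Schematic statechart-to-code translation: a transition table is "compiled"
    # into a plain transition function. Hypothetical states and events, not the
    # Electra radio design.
    TRANSITIONS = {
        ("IDLE",      "link_request"): "HANDSHAKE",
        ("HANDSHAKE", "ack"):          "CONNECTED",
        ("HANDSHAKE", "timeout"):      "IDLE",
        ("CONNECTED", "loss_of_lock"): "IDLE",
    }

    def generate_source(transitions) -> str:
        """Emit Python source implementing the state machine."""
        lines = ["def step(state, event):"]
        for (state, event), target in transitions.items():
            lines.append(f"    if (state, event) == ({state!r}, {event!r}):")
            lines.append(f"        return {target!r}")
        lines.append("    return state  # ignore events with no transition")
        return "\n".join(lines)

    source = generate_source(TRANSITIONS)
    namespace = {}
    exec(source, namespace)           # 'build' the generated module
    step = namespace["step"]
    assert step("IDLE", "link_request") == "HANDSHAKE"
    assert step("CONNECTED", "loss_of_lock") == "IDLE"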

  14. Direct and non-destructive proof of authenticity for the 2nd generation of Brazilian real banknotes via easy ambient sonic spray ionization mass spectrometry.

    Science.gov (United States)

    Schmidt, Eduardo Morgado; Franco, Marcos Fernando; Regino, Karen Gomes; Lehmann, Eraldo Luiz; Arruda, Marco Aurélio Zezzi; de Carvalho Rocha, Werickson Fortunato; Borges, Rodrigo; de Souza, Wanderley; Eberlin, Marcos Nogueira; Correa, Deleon Nascimento

    2014-12-01

    Using a desorption/ionization technique, easy ambient sonic-spray ionization coupled to mass spectrometry (EASI-MS), documents related to the 2nd generation of Brazilian Real currency (R$) were screened in the positive ion mode for authenticity based on chemical profiles obtained directly from the banknote surface. Characteristic profiles were observed for authentic, seized suspect counterfeit and counterfeited homemade banknotes from inkjet and laserjet printers. The chemicals in the authentic banknotes' surface were detected via a few minor sets of ions, namely from the plasticizers bis(2-ethylhexyl)phthalate (DEHP) and dibutyl phthalate (DBP), most likely related to the official offset printing process, and other common quaternary ammonium cations, presenting a similar chemical profile to 1st-generation R$. The seized suspect counterfeit banknotes, however, displayed abundant diagnostic ions in the m/z 400-800 range due to the presence of oligomers. High-accuracy FT-ICR MS analysis enabled molecular formula assignment for each ion. The ions were separated by 44 m/z, which enabled their characterization as Surfynol® 4XX (S4XX, XX=40, 65, and 85), wherein increasing XX values indicate increasing amounts of ethoxylation on a backbone of 2,4,7,9-tetramethyl-5-decyne-4,7-diol (Surfynol® 104). Sodiated triethylene glycol monobutyl ether (TBG) of m/z 229 (C10H22O4Na) was also identified in the seized counterfeit banknotes via EASI(+) FT-ICR MS. Surfynol® and TBG are constituents of inks used for inkjet printing. PMID:25498934
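
    The two key mass assignments above can be reproduced from elemental monoisotopic masses: the sodiated TBG ion C10H22O4Na computes to m/z ≈ 229.14, and the 44 m/z spacing matches one ethylene-oxide (C2H4O) repeat unit. A small check using standard atomic masses (not data from the paper):

    # Monoisotopic mass check for the ions discussed above.
    MASS = {"C": 12.0, "H": 1.00783, "O": 15.99491, "Na": 22.98977}

    def mono_mass(formula: dict) -> float:
        return sum(MASS[el] * n for el, n in formula.items())

    tbg_na = {"C": 10, "H": 22, "O": 4, "Na": 1}   # sodiated triethylene glycol monobutyl ether
    eo_unit = {"C": 2, "H": 4, "O": 1}             # one ethoxylation repeat unit

    print(f"[TBG+Na]+ : m/z ~ {mono_mass(tbg_na):.2f}")   # ~229.14
    print(f"EO repeat : {mono_mass(eo_unit):.2f} Da")     # ~44.03 -> the observed 44 m/z spacing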

  15. Quantification of left and right ventricular function and myocardial mass: Comparison of low-radiation dose 2nd generation dual-source CT and cardiac MRI

    International Nuclear Information System (INIS)

    Objective: To prospectively evaluate the accuracy of left and right ventricular function and myocardial mass measurements based on a dual-step, low radiation dose protocol with prospectively ECG-triggered 2nd generation dual-source CT (DSCT), using cardiac MRI (cMRI) as the reference standard. Materials and methods: Twenty patients underwent 1.5 T cMRI and prospectively ECG-triggered dual-step pulsing cardiac DSCT. This image acquisition mode performs low-radiation (20% tube current) imaging over the majority of the cardiac cycle and applies full radiation only during a single adjustable phase. Full-radiation-phase images were used to assess cardiac morphology, while low-radiation-phase images were used to measure left and right ventricular function and mass. Quantitative CT measurements based on contiguous multiphase short-axis reconstructions from the axial CT data were compared with short-axis SSFP cardiac cine MRI. Contours were manually traced around the ventricular borders for calculation of left and right ventricular end-diastolic volume, end-systolic volume, stroke volume, ejection fraction and myocardial mass for both modalities. Statistical methods included independent t-tests, the Mann–Whitney U test, Pearson correlation statistics, and Bland–Altman analysis. Results: All CT measurements of left and right ventricular function and mass correlated well with those from cMRI: for left/right end-diastolic volume r = 0.885/0.801, left/right end-systolic volume r = 0.947/0.879, left/right stroke volume r = 0.620/0.697, left/right ejection fraction r = 0.869/0.751, and left/right myocardial mass r = 0.959/0.702. Mean radiation dose was 6.2 ± 1.8 mSv. Conclusions: Prospectively ECG-triggered, dual-step pulsing cardiac DSCT accurately quantifies left and right ventricular function and myocardial mass in comparison with cMRI, with substantially lower radiation exposure than reported for traditional retrospective ECG-gating.

  16. Stroke Symbol Generation Software for Fighter Aircraft

    Directory of Open Access Journals (Sweden)

    G.K. Tripathi

    2013-03-01

    Full Text Available This paper gives an overview of the stroke symbol generation software developed by Hindustan Aeronautics Limited for fighter aircraft. It covers the working principle of the head-up display, an overview of the target hardware on which the developed software has been integrated and tested, the software architecture, hardware-software interfaces and design details of the stroke symbol generation software. The paper also covers the issues related to stroke symbol quality which were encountered by the design team and the details of how the issues were resolved during the integration and test phase. Defence Science Journal, 2013, 63(2), pp. 153-156, DOI: http://dx.doi.org/10.14429/dsj.63.4257

  17. Developing software for Symbian OS, 2nd edition: a beginner's guide to creating Symbian OS V9 smartphone applications in C++

    CERN Document Server

    Babin, Steve

    2008-01-01

    Many problems encountered by engineers developing code for specialized Symbian subsystems boil down to a lack of understanding of the core Symbian programming concepts. Developing Software for Symbian OS remedies this problem by providing comprehensive coverage of all the key concepts. Numerous examples and descriptions are also included, which focus on the concepts the author has seen developers struggle with the most. The book covers development ranging from low-level system programming to end-user GUI applications. It also covers the development and packaging tools, as well as providing some detailed reference and examples for key APIs. The new edition includes a completely new chapter on platform security. The overall goal of the book is to provide introductory coverage of Symbian OS v9 and help developers with little or no knowledge of Symbian OS to develop as quickly as possible. There are few people with long Symbian development experience compared to demand, due to the rapid growth of Symbian in re...

  18. A Generic Software Safety Document Generator

    Science.gov (United States)

    Denney, Ewen; Venkatesan, Ram Prasad

    2004-01-01

    Formal certification is based on the idea that a mathematical proof of some property of a piece of software can be regarded as a certificate of correctness which, in principle, can be subjected to external scrutiny. In practice, however, proofs themselves are unlikely to be of much interest to engineers. Nevertheless, it is possible to use the information obtained from a mathematical analysis of software to produce a detailed textual justification of correctness. In this paper, we describe an approach to generating textual explanations from automatically generated proofs of program safety, where the proofs are of compliance with an explicit safety policy that can be varied. Key to this is tracing proof obligations back to the program, and we describe a tool which implements this to certify code auto-generated by AutoBayes and AutoFilter, program synthesis systems under development at the NASA Ames Research Center. Our approach is a step towards combining formal certification with traditional certification methods.

  19. User software for the next generation

    CERN Document Server

    Worlton, T G; Hammonds, J P; Peterson, P F; Mikkelson, D J; Mikkelson, R L

    2002-01-01

    New generations of neutron scattering sources and instrumentation are providing challenges in data handling for user software. Time-of-Flight instruments used at pulsed sources typically produce hundreds or thousands of channels of data for each detector segment. New instruments are being designed with thousands to hundreds of thousands of detector segments. High intensity neutron sources make possible parametric studies and texture studies which further increase data handling requirements. The Integrated Spectral Analysis Workbench (ISAW) software developed at Argonne handles large numbers of spectra simultaneously while providing operations to reduce, sort, combine and export the data. It includes viewers to inspect the data in detail in real time. ISAW uses existing software components and packages where feasible and takes advantage of the excellent support for user interface design and network communication in Java. The included scripting language simplifies repetitive operations for analyzing many files ...

  20. Monte Carlo generators in ATLAS software

    International Nuclear Information System (INIS)

    This document describes how Monte Carlo (MC) generators can be used in the ATLAS software framework (Athena). The framework is written in C++, using Python scripts for job configuration. Monte Carlo generators that provide the four-vectors describing the results of LHC collisions are written in general by third parties and are not part of Athena. These libraries are linked from the LCG Generator Services (GENSER) distribution. Generators are run from within Athena and the generated event output is put into a transient store, in HepMC format, using StoreGate. A common interface, implemented via inheritance of a GeneratorModule class, guarantees common functionality for the basic generation steps. The generator information can be accessed and manipulated by helper packages like TruthHelper. The ATLAS detector simulation can also access the truth information from StoreGate. Steering is done through specific interfaces to allow for flexible configuration using ATLAS Python scripts. Interfaces to most general-purpose generators, including Pythia6, Pythia8, Herwig, Herwig++ and Sherpa, are provided, as well as to more specialized packages, for example Phojet and Cascade. A second type of interface exists for the so-called Matrix Element generators that only generate the particles produced in the hard scattering process and write events in the Les Houches event format. A generic interface to pass these events to Pythia6 and Herwig for parton showering and hadronisation has been written.

  1. Next-Generation Lightweight Mirror Modeling Software

    Science.gov (United States)

    Arnold, William R., Sr.; Fitzgerald, Mathew; Rosa, Rubin Jaca; Stahl, Phil

    2013-01-01

    The advances in manufacturing techniques for lightweight mirrors, such as EXELSIS deep core low temperature fusion, Corning's continued improvements in the Frit bonding process and the ability to cast large complex designs, combined with water-jet and conventional diamond machining of glasses and ceramics, have created the need for more efficient means of generating finite element models of these structures. Traditional methods of assembling 400,000+ element models can take weeks of effort, severely limiting the range of possible optimization variables. This paper will introduce model generation software developed under NASA sponsorship for the design of both terrestrial and space based mirrors. The software deals with any current mirror manufacturing technique, single substrates, multiple arrays of substrates, as well as the ability to merge submodels into a single large model. The modeler generates both mirror and suspension system elements; suspensions can be created either for each individual petal or for the whole mirror. A typical model generation of 250,000 nodes and 450,000 elements only takes 5-10 minutes, much of that time being variable input time. The program can create input decks for ANSYS, ABAQUS and NASTRAN. An archive/retrieval system permits creation of complete trade studies, varying cell size, depth, petal size and suspension geometry, with the ability to recall a particular set of parameters and make small or large changes with ease. The input decks created by the modeler are text files which can be modified by any editor; all the key shell thickness parameters are accessible, and comments in the deck identify which groups of elements are associated with these parameters. This again makes optimization easier. With ANSYS decks, the nodes representing support attachments are grouped into components; in ABAQUS these are SETS and in NASTRAN GRIDPOINT SETS; this makes integration of these models into large telescope or satellite models possible.
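
    To give a sense of the trade-study parameters the modeler exposes (cell size, core depth, facesheet thickness), the sketch below estimates cell count and areal density for a hex-core lightweight mirror. The simplified geometry and the numbers are assumptions for illustration, not outputs of the NASA modeling software:

    # Rough hex-core lightweight-mirror trade estimate: cell count and areal
    # density versus cell size, core depth and facesheet thickness. Simplified
    # geometry and assumed values, not results from the modeling software.
    import math

    def areal_density(diameter, cell_size, core_depth, t_face, t_wall, rho):
        """All lengths in m, rho in kg/m^3; returns (n_cells, kg/m^2)."""
        area = math.pi * (diameter / 2) ** 2
        side = cell_size / math.sqrt(3)                     # hex side, across-flats = cell_size
        cell_area = 3 * math.sqrt(3) / 2 * side ** 2
        n_cells = int(area / cell_area)
        face_mass = 2 * area * t_face * rho                 # front + back facesheets
        rib_mass = n_cells * 3 * side * core_depth * t_wall * rho   # shared walls: 3 per cell
        return n_cells, (face_mass + rib_mass) / area

    n, ad = areal_density(diameter=1.2, cell_size=0.05, core_depth=0.10,
                          t_face=0.003, t_wall=0.0015, rho=2230.0)  # borosilicate-like glass
    print(f"{n} cells, ~{ad:.0f} kg/m^2")   # ~520 cells, ~27 kg/m^2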

  2. Next Generation Lightweight Mirror Modeling Software

    Science.gov (United States)

    Arnold, William R., Sr.; Fitzgerald, Mathew; Rosa, Rubin Jaca; Stahl, H. Philip

    2013-01-01

    The advances in manufacturing techniques for lightweight mirrors, such as EXELSIS deep core low temperature fusion, Corning's continued improvements in the Frit bonding process and the ability to cast large complex designs, combined with water-jet and conventional diamond machining of glasses and ceramics, have created the need for more efficient means of generating finite element models of these structures. Traditional methods of assembling 400,000+ element models can take weeks of effort, severely limiting the range of possible optimization variables. This paper will introduce model generation software developed under NASA sponsorship for the design of both terrestrial and space based mirrors. The software deals with any current mirror manufacturing technique, single substrates, multiple arrays of substrates, as well as the ability to merge submodels into a single large model. The modeler generates both mirror and suspension system elements; suspensions can be created either for each individual petal or for the whole mirror. A typical model generation of 250,000 nodes and 450,000 elements only takes 5-10 minutes, much of that time being variable input time. The program can create input decks for ANSYS, ABAQUS and NASTRAN. An archive/retrieval system permits creation of complete trade studies, varying cell size, depth, petal size and suspension geometry, with the ability to recall a particular set of parameters and make small or large changes with ease. The input decks created by the modeler are text files which can be modified by any editor; all the key shell thickness parameters are accessible, and comments in the deck identify which groups of elements are associated with these parameters. This again makes optimization easier. With ANSYS decks, the nodes representing support attachments are grouped into components; in ABAQUS these are SETS and in NASTRAN GRIDPOINT SETS; this makes integration of these models into large telescope or satellite models easier.

  3. Experimental Stochastics (2nd edition)

    International Nuclear Information System (INIS)

    Otto Moeschlin and his co-authors have written a book about simulation of stochastic systems. The book comes with a CD-ROM that contains the experiments discussed in the book, and the text from the book is repeated on the CD-ROM. According to the authors, the aim of the book is to give a quick introduction to stochastic simulation for 'all persons interested in experimental stochastics'. To please this diverse audience, the authors offer a book that has four parts. Part 1, called 'Artificial Randomness', is the longest of the four parts. It gives an overview of the generation, testing and basic usage of pseudo random numbers in simulation. Although algorithms for generating sequences of random numbers are fundamental to simulation, it is a slightly unusual choice to give the topic such weight in comparison to other algorithmic topics. The remaining three parts consist of simulation case studies. Part 2, 'Stochastic Models', treats four problems - Buffon's needle, a queuing system, and two problems related to the kinetic theory of gases. Part 3 is called 'Stochastic Processes' and discusses the simulation of discrete time Markov chains, birth-death processes, Brownian motion and diffusions. The last section of Part 3 is about simulation as a tool to understand the traffic flow in a system controlled by stoplights, an area of research for the authors. Part 4 is called 'Evaluation of Statistical Procedures'. This section contains examples where simulation is used to test the performance of statistical methods. It covers four examples: the Neyman-Pearson lemma, the Wald sequential test, Bayesian point estimation and Hartigan procedures. The CD-ROM contains an easy-to-install software package that runs under Microsoft Windows. The software contains the text and simulations from the book. What I found most enjoyable about this book is the number of topics covered in the case studies. The highly individual selection of applications, which may serve as a source of inspiration for teachers of computational stochastic methods, is the main contribution of this electronic monograph. However, both the book and software suffer from several severe problems. Firstly, I feel that the structure of the text is weak. Probably this is partly the result of the text from the CD-ROM being put into a book format, but the short paragraphs and poorly structured sentences destroy the reading experience. Secondly, although the software is functional, I believe that, like me, many users will be disappointed by the quality of the user interface and the visualizations. The opportunities to interact with the simulations are limited. Thirdly, the presentation is slightly old-fashioned and lacking in pedagogical structure. For example, flow charts and Pascal programs are used to present algorithms. To conclude, I am surprised that this electronic monograph warranted a second edition in this form. Teachers may find the examples useful as a starting point, but students and researchers are advised to look elsewhere. (book review)

  4. Integration of health management and support systems is key to achieving cost reduction and operational concept goals of the 2nd generation reusable launch vehicle

    Science.gov (United States)

    Koon, Phillip L.; Greene, Scott

    2002-07-01

    Our aerospace customers are demanding that we drastically reduce the cost of operating and supporting our products. Our space customer in particular is looking for the next generation of reusable launch vehicle systems to support more aircraft-like operation. Achieving this goal requires more than an evolution in materials, processes and systems; what is required is a paradigm shift in the design of the launch vehicles and the processing systems that support them. This paper describes the Automated Informed Maintenance System (AIM) we are developing for NASA's Space Launch Initiative (SLI) Second Generation Reusable Launch Vehicle (RLV). Our system includes an Integrated Health Management (IHM) system for the launch vehicles and ground support systems, which features model-based diagnostics and prognostics. Health Management data is used by our AIM decision support and process aids to automatically plan maintenance, generate work orders, and schedule maintenance activities along with the resources required to execute these processes. Our system will automate the ground processing for a spaceport handling multiple RLVs executing multiple missions. To accomplish this task we are applying the latest web-based distributed computing technologies and application development techniques.

  5. Automatic Testcase Generation for Flight Software

    Science.gov (United States)

    Bushnell, David Henry; Pasareanu, Corina; Mackey, Ryan M.

    2008-01-01

    The TacSat3 project is applying Integrated Systems Health Management (ISHM) technologies to an Air Force spacecraft for operational evaluation in space. The experiment will demonstrate the effectiveness and cost of ISHM and vehicle systems management (VSM) technologies through onboard operation for extended periods. We present two approaches to automatic testcase generation for ISHM: 1) A blackbox approach that views the system as a blackbox, and uses a grammar-based specification of the system's inputs to automatically generate *all* inputs that satisfy the specifications (up to prespecified limits); these inputs are then used to exercise the system. 2) A whitebox approach that performs analysis and testcase generation directly on a representation of the internal behaviour of the system under test. The enabling technologies for both these approaches are model checking and symbolic execution, as implemented in the Ames' Java PathFinder (JPF) tool suite. Model checking is an automated technique for software verification. Unlike simulation and testing which check only some of the system executions and therefore may miss errors, model checking exhaustively explores all possible executions. Symbolic execution evaluates programs with symbolic rather than concrete values and represents variable values as symbolic expressions. We are applying the blackbox approach to generating input scripts for the Spacecraft Command Language (SCL) from Interface and Control Systems. SCL is an embedded interpreter for controlling spacecraft systems. TacSat3 will be using SCL as the controller for its ISHM systems. We translated the SCL grammar into a program that outputs scripts conforming to the grammars. Running JPF on this program generates all legal input scripts up to a prespecified size. Script generation can also be targeted to specific parts of the grammar of interest to the developers. These scripts are then fed to the SCL Executive. ICS's in-house coverage tools will be run to measure code coverage. Because the scripts exercise all parts of the grammar, we expect them to provide high code coverage. This blackbox approach is suitable for systems for which we do not have access to the source code. We are applying whitebox test generation to the Spacecraft Health INference Engine (SHINE) that is part of the ISHM system. In TacSat3, SHINE will execute an on-board knowledge base for fault detection and diagnosis. SHINE converts its knowledge base into optimized C code which runs onboard TacSat3. SHINE can translate its rules into an intermediate representation (Java) suitable for analysis with JPF. JPF will analyze SHINE's Java output using symbolic execution, producing testcases that can provide either complete or directed coverage of the code. Automatically generated test suites can provide full code coverage and be quickly regenerated when code changes. Because our tools analyze executable code, they fully cover the delivered code, not just models of the code. This approach also provides a way to generate tests that exercise specific sections of code under specific preconditions. This capability gives us more focused testing of specific sections of code.
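    To illustrate the blackbox idea of bounded-exhaustive generation from a grammar, the following Python sketch enumerates every sentence of a toy grammar up to a depth limit. The grammar, symbol names and depth bound are invented for illustration; this is not the SCL grammar nor the JPF-based implementation described above.

```python
# Illustrative bounded-exhaustive generation from a toy grammar (not the
# actual SCL grammar or the JPF tooling described in the abstract).
GRAMMAR = {
    "script": [["cmd"], ["cmd", ";", "script"]],
    "cmd":    [["SET", "var", "val"], ["GET", "var"]],
    "var":    [["x"], ["y"]],
    "val":    [["0"], ["1"]],
}

def expand(symbol, depth):
    """Return every sentence derivable from `symbol` within `depth` expansion steps."""
    if symbol not in GRAMMAR:              # terminal symbol
        return [[symbol]]
    if depth == 0:                         # depth bound keeps the set finite
        return []
    results = []
    for production in GRAMMAR[symbol]:
        partials = [[]]
        for sym in production:
            partials = [p + s for p in partials for s in expand(sym, depth - 1)]
        results.extend(partials)
    return results

for sentence in expand("script", 4):       # all legal "scripts" up to the bound
    print(" ".join(sentence))
```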

  6. The 2nd Generation z(Redshift) and Early Universe Spectrometer Part I: First-light observation of a highly lensed local-ULIRG analog at high-z

    CERN Document Server

    Ferkinhoff, Carl; Parshley, Stephen; Nikola, Thomas; Stacey, Gordon J; Schoenwald, Justin; Higdon, James L; Higdon, Sarah J U; Verma, Aprajita; Riechers, Dominik; Hailey-Dunsheath, Steven; Menten, Karl; Güsten, Rolf; Wieß, Axel; Irwin, Kent; Cho, Hsiao M; Niemack, Michael; Halpern, Mark; Amiri, Mandana; Hasselfield, Matthew; Wiebe, D V; Ade, Peter A R; Tucker, Carol E

    2013-01-01

    We report first science results from our new spectrometer, the 2nd generation z(Redshift) and Early Universe Spectrometer (ZEUS-2), recently commissioned on the Atacama Pathfinder Experiment telescope (APEX). ZEUS-2 is a submillimeter grating spectrometer optimized for detecting the faint and broad lines from distant galaxies that are redshifted into the telluric windows from 200 to 850 microns. It utilizes a focal plane array of transition-edge sensed bolometers, the first use of these arrays for astrophysical spectroscopy. ZEUS-2 promises to be an important tool for studying galaxies in the years to come due to its synergy with ALMA and its capabilities in the short submillimeter windows that are unique in the post Herschel era. Here we report on our first detection of the [CII] 158 $\\mu m$ line with ZEUS-2. We detect the line at z ~ 1.8 from H-ATLAS J091043.1-000322 with a line flux of $(6.44 \\pm 0.42) \\times 10^{-18} W m^{-2}$. Combined with its far-infrared luminosity and a new Herschel-PACS detection of...

  7. Generation of test cases from software requirements using combination trees

    OpenAIRE

    Ravi Prakash Verma; Bal Gopal; Md Rizwan Beg

    2011-01-01

    Requirements play an important role in ensuring software quality, which is verified and validated through software testing. Usually the software requirements are expressed in natural language, such as English. In this paper we present an approach to generate test cases from requirements. Our approach takes requirements expressed in natural language and generates test cases using combination trees. However, until now we have had only tabular representations for combination pairs or simply the chart...
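    For readers unfamiliar with combination testing, the hedged Python sketch below greedily picks test cases until every pair of parameter values is covered. It illustrates the general all-pairs idea only; the combination-tree construction from the paper is not reproduced, and the parameters and values are invented examples.

```python
# Greedy all-pairs (pairwise) combination testing, illustrative only.
from itertools import combinations, product

parameters = {
    "browser": ["firefox", "chrome"],
    "os":      ["linux", "windows", "macos"],
    "locale":  ["en", "de"],
}

names = list(parameters)
required = set()                      # every (param, value) pair that must co-occur in some test
for p1, p2 in combinations(names, 2):
    for v1, v2 in product(parameters[p1], parameters[p2]):
        required.add(((p1, v1), (p2, v2)))

tests, covered = [], set()
for combo in product(*parameters.values()):        # scan the full cartesian space greedily
    case = dict(zip(names, combo))
    new = {pair for pair in required
           if pair[0] in case.items() and pair[1] in case.items()} - covered
    if new:                                        # keep the case only if it covers new pairs
        tests.append(case)
        covered |= new

print(len(tests), "test cases cover all", len(required), "pairs")
```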

  8. Improved Ant Algorithms for Software Testing Cases Generation

    OpenAIRE

    Shunkun Yang; Tianlong Man; Jiaqi Xu

    2014-01-01

    Existing ant colony optimization (ACO) for software testing case generation is a very popular domain in software testing engineering. However, the traditional ACO has flaws: early search pheromone is relatively scarce, search efficiency is low, the search model is too simple, and the positive feedback mechanism easily produces the phenomena of stagnation and precocity. This paper introduces improved ACO for software testing case generation: an improved local pheromone update strategy for ant colony...

  9. Automatic program generation: future of software engineering

    Energy Technology Data Exchange (ETDEWEB)

    Robinson, J.H.

    1979-01-01

    At this moment software development is still more of an art than an engineering discipline. Each piece of software is lovingly engineered, nurtured, and presented to the world as a tribute to the writer's skill. When will this change? When will the craftsmanship be removed and the programs be turned out like so many automobiles from an assembly line? Sooner or later it will happen: economic necessities will demand it. With the advent of cheap microcomputers and ever more powerful supercomputers doubling in capacity, much more software must be produced. The choices are to double the number of programmers, double the efficiency of each programmer, or find a way to produce the needed software automatically. Producing software automatically is the only logical choice. How will automatic programming come about? Some of the preliminary actions which need to be done, and are being done, are to encourage programmer plagiarism of existing software through public library mechanisms, produce well-understood packages such as compilers automatically, develop languages capable of producing software as output, and learn enough about the whole process of programming to be able to automate it. Clearly, the emphasis must not be on efficiency or size, since ever larger and faster hardware is coming.

  10. Generating Protocol Software from CPN Models Annotated with Pragmatics

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge; Kristensen, Lars M.

    2013-01-01

    Model-driven software engineering (MDSE) provides a foundation for automatically generating software based on models that focus on the problem domain while abstracting from the details of underlying implementation platforms. Coloured Petri Nets (CPNs) have been widely used to formally model and verify protocol software, but limited work exists on using CPN models of protocols as a basis for automated code generation. The contribution of this paper is a method for generating protocol software from a class of CPN models annotated with code generation pragmatics. Our code generation method consists of three main steps: automatically adding so-called derived pragmatics to the CPN model, computing an abstract template tree, which associates pragmatics with code templates, and applying the templates to generate code which can then be compiled. We illustrate our method using a unidirectional data framing protocol.
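    The following Python sketch gives a rough feel for pragmatics-driven template expansion: code templates keyed by pragmatic names are applied over a small hand-written tree. It is an illustration under simplifying assumptions, not the authors' CPN-based tooling, pragmatics or template language.

```python
# Minimal sketch of pragmatic-driven template expansion (illustrative only;
# the real method walks a Coloured Petri Net and builds an abstract
# template tree before emitting code).
from string import Template

TEMPLATES = {
    "service": Template("def ${name}(channel):\n${body}"),
    "send":    Template("    channel.send(${msg})"),
    "receive": Template("    ${var} = channel.receive()"),
}

# A toy "abstract template tree": each node pairs a pragmatic with bindings
# and child nodes.
tree = ("service", {"name": "sender"},
        [("send", {"msg": "'frame_0'"}, []),
         ("receive", {"var": "ack"}, [])])

def emit(node):
    pragmatic, bindings, children = node
    if pragmatic == "service":
        body = "\n".join(emit(c) for c in children) or "    pass"
        return TEMPLATES["service"].substitute(body=body, **bindings)
    return TEMPLATES[pragmatic].substitute(**bindings)

print(emit(tree))   # prints a small generated function
```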

  11. Next-generation business intelligence software with Silverlight 3

    CERN Document Server

    Czernicki, Bart

    2010-01-01

    Business Intelligence (BI) software is the code and tools that allow you to view different components of a business using a single visual platform, making comprehending mountains of data easier. Applications that include reports, analytics, statistics, and historical and predictive modeling are all examples of BI applications. Currently, we are in the second generation of BI software, called BI 2.0. This generation is focused on writing BI software that is predictive, adaptive, simple, and interactive. As computers and software have evolved, more data can be presented to end users with increas

  12. Chemistry: The Molecular Science, 2nd Edition

    Science.gov (United States)

    Finds ChemEd DL resources related to the sections of the General Chemistry textbook, Chemistry: The Molecular Science, 2nd Edition, by John W. Moore, Conrad L. Stanitski, Peter C. Jurs published by Brooks/Cole, 2005.

  13. Using DSL for Automatic Generation of Software Connectors.

    Czech Academy of Sciences Publication Activity Database

    Bureš, Tomáš; Malohlava, M.; Hnětynka, P.

    Los Alamitos: IEEE Computer Society, 2008, pp. 138-147. ISBN 978-0-7695-3091-8. [ICCBSS 2008. International Conference on Composition-Based Software Systems /7./. Madrid (ES), 25.02.2008-29.02.2008,] R&D Projects: GA AV ČR 1ET400300504 Institutional research plan: CEZ:AV0Z10300504 Keywords: component based systems * software connectors * code generation * domain-specific languages Subject RIV: JC - Computer Hardware; Software

  14. Creating the next generation control system software

    International Nuclear Information System (INIS)

    A new 1980's style support package for future accelerator control systems is proposed. It provides a way to create accelerator applications software without traditional programming. Visual Interactive Applications (VIA) is designed to meet the needs of expanded accelerator complexes in a more cost effective way than past experience with procedural languages by using technology from the personal computer and artificial intelligence communities. 4 refs

  15. Generating DEM from LIDAR data - comparison of available software tools

    Science.gov (United States)

    Korzeniowska, K.; Lacka, M.

    2011-12-01

    In recent years many software tools and applications have appeared that offer procedures, scripts and algorithms to process and visualize ALS data. This variety of software tools and of "point cloud" processing methods contributed to the aim of this study: to assess the algorithms available in various software tools that are used to classify LIDAR "point cloud" data, through a careful examination of Digital Elevation Models (DEMs) generated from LIDAR data on the basis of these algorithms. The work focused on the most important available software tools, both commercial and open source. Two sites in a mountain area were selected for the study. The area of each site is 0.645 sq km. DEMs generated with the analysed software tools were compared with a reference dataset, generated using manual methods to eliminate non-ground points. Surfaces were analysed using raster analysis. Minimum, maximum and mean differences between the reference DEM and the DEMs generated with the analysed software tools were calculated, together with the Root Mean Square Error. Differences between DEMs were also examined visually using transects along the grid axes in the test sites.
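    A minimal Python sketch of the raster comparison described above, computing the minimum, maximum and mean differences plus RMSE between a reference DEM and a candidate DEM; the synthetic arrays and the absence of nodata handling are simplifying assumptions.

```python
# Sketch of the raster comparison: difference statistics and RMSE between
# a reference DEM and a DEM produced by an automatic classifier.
import numpy as np

reference = np.random.default_rng(0).normal(500.0, 20.0, size=(100, 100))
candidate = reference + np.random.default_rng(1).normal(0.0, 0.3, size=(100, 100))

diff = candidate - reference
stats = {
    "min":  float(diff.min()),
    "max":  float(diff.max()),
    "mean": float(diff.mean()),
    "rmse": float(np.sqrt(np.mean(diff ** 2))),
}
print(stats)
```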

  16. Application of automated reasoning software: procedure generation system verifier

    International Nuclear Information System (INIS)

    An on-line, automated reasoning software system for verifying the actions of other software or human control systems has been developed. It was demonstrated by verifying the actions of an automated procedure generation system. The verifier uses an interactive theorem prover as its inference engine with the rules included as logic axioms. Operation of the verifier is generally transparent except when the verifier disagrees with the actions of the monitored software. Testing with an automated procedure generation system demonstrates the successful application of automated reasoning software for verification of logical actions in a diverse, redundant manner. A higher degree of confidence may be placed in the verified actions gathered by the combined system

  17. Radioisotope thermoelectric generator transportation system subsystem 143 software development plan

    International Nuclear Information System (INIS)

    This plan describes the activities to be performed and the controls to be applied to the process of specifying, developing, and qualifying the data acquisition software for the Radioisotope Thermoelectric Generator (RTG) Transportation System Subsystem 143 Instrumentation and Data Acquisition System (IDAS). This plan will serve as a software quality assurance plan, a verification and validation (V and V) plan, and a configuration management plan

  18. Developing a harmonic power flow software in distributed generation systems

    OpenAIRE

    Bureau, Cédric

    2012-01-01

    The main topic of this thesis is harmonic power flow and its use in simulation software that I have developed. The idea of the software is to combine distribution grids' description, non-linear load models and power flow methods. Nowadays, power electronics is more and more present in electric devices in distributed generation systems. Those power electronics systems can emit or absorb harmonics that can damage the devices in the grid. Thus, it is important to be able to estimate harmonic ...

  19. Abstracts: 2nd interventional MRI symposium

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1997-09-01

    Main topics of the 2nd interventional MRI symposium were: MR compatibility and pulse sequences; MR thermometry, biopsy, musculoskeletal system; laser-induced interstitial thermotherapy, radiofrequency ablations; intraoperative MR; vascular applications, breast, endoscopy; focused ultrasound, cryotherapy, perspectives; poster session with 34 posters described. (AJ)

  20. Search-Based Software Test Data Generation Using Evolutionary Computation

    Directory of Open Access Journals (Sweden)

    P. Maragathavalli

    2011-02-01

    Full Text Available Search-based Software Engineering has been utilized for a number of software engineering activities. One area where Search-Based Software Engineering has seen much application is test data generation. Evolutionary testing designates the use of metaheuristic search methods for test case generation. The search space is the input domain of the test object, with each individual, or potential solution, being an encoded set of inputs to that test object. The fitness function is tailored to find test data for the type of test that is being undertaken. Evolutionary Testing (ET) uses optimizing search techniques such as evolutionary algorithms to generate test data. The effectiveness of the GA-based testing system is compared with a Random testing system. For simple programs both testing systems work fine, but as the complexity of the program or the complexity of the input domain grows, the GA-based testing system significantly outperforms Random testing.
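    As a hedged illustration of evolutionary test data generation, the Python sketch below evolves an integer input toward a hard-to-hit branch using a branch-distance style fitness. The target predicate, genetic operators and parameters are invented for illustration and are not taken from the cited study.

```python
# Toy evolutionary test-data generation: evolve an integer input x so the
# branch `if x == 4242:` in an imagined program under test is taken.
# Fitness is a classic branch-distance measure, |x - 4242| (0 = covered).
import random

TARGET = 4242

def fitness(x):
    return abs(x - TARGET)

def evolve(pop_size=50, generations=200, span=100_000):
    pop = [random.randint(-span, span) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        if fitness(pop[0]) == 0:                   # branch reached
            return pop[0]
        parents = pop[: pop_size // 2]             # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = (a + b) // 2                   # crossover: averaging
            if random.random() < 0.3:              # mutation: small jitter
                child += random.randint(-50, 50)
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)

print("best input found:", evolve())
```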

  1. Development of the software generation method using model driven software engineering tool

    International Nuclear Information System (INIS)

    Methodologies to generate the automated software design specification and source code for nuclear I and C systems software using a model-driven language are developed in this work. For qualitative analysis of the algorithm, the activity diagram is modeled and generated using the Unified Modeling Language (UML), and then the sequence diagram is designed for automated source code generation. For validation of the generated code, code audits and module tests are performed using a test and QA tool. The code coverage and complexity of the example code are examined in this stage. The low-pressure pressurizer reactor trip module of the Plant Protection System was programmed as the subject for this task. The test results show that errors in the generated source code were easily detected using the test tool. The accuracy of input/output processing by the execution modules was clearly identified

  2. Computer Software Configuration Item-Specific Flight Software Image Transfer Script Generator

    Science.gov (United States)

    Bolen, Kenny; Greenlaw, Ronald

    2010-01-01

    A K-shell UNIX script enables the International Space Station (ISS) Flight Control Team (FCT) operators in NASA's Mission Control Center (MCC) in Houston to transfer an entire or partial computer software configuration item (CSCI) from a flight software compact disk (CD) to the onboard Portable Computer System (PCS). The tool is designed to read the content stored on a flight software CD and generate individual CSCI transfer scripts that are capable of transferring the flight software content in a given subdirectory on the CD to the scratch directory on the PCS. The flight control team can then transfer the flight software from the PCS scratch directory to the Electronically Erasable Programmable Read Only Memory (EEPROM) of an ISS Multiplexer/Demultiplexer (MDM) via the Indirect File Transfer capability. The individual CSCI scripts and the CSCI Specific Flight Software Image Transfer Script Generator (CFITSG), when executed a second time, will remove all components from their original execution. The tool will identify errors in the transfer process and create logs of the transferred software for the purposes of configuration management.

  3. Improved ant algorithms for software testing cases generation.

    Science.gov (United States)

    Yang, Shunkun; Man, Tianlong; Xu, Jiaqi

    2014-01-01

    Existing ant colony optimization (ACO) for software testing case generation is a very popular domain in software testing engineering. However, the traditional ACO has flaws: early search pheromone is relatively scarce, search efficiency is low, the search model is too simple, and the positive feedback mechanism easily produces the phenomena of stagnation and precocity. This paper introduces improved ACO for software testing case generation: an improved local pheromone update strategy for ant colony optimization, an improved pheromone volatilization coefficient for ant colony optimization (IPVACO), and an improved global path pheromone update strategy for ant colony optimization (IGPACO). At last, we put forward a comprehensive improved ant colony optimization (ACIACO), which is based on all the above three methods. The proposed technique will be compared with a random algorithm (RND) and a genetic algorithm (GA) in terms of both efficiency and coverage. The results indicate that the improved method can effectively improve search efficiency, restrain precocity, promote case coverage, and reduce the number of iterations. PMID:24883391
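    The Python sketch below illustrates, under invented formulas, the flavour of the improvements mentioned above: an adaptive volatilization coefficient and extra reinforcement of the global best path during the pheromone update. It is not the authors' IPVACO/IGPACO/ACIACO implementation.

```python
# Generic ant-colony pheromone update in the spirit of the improvements
# described above (adaptive volatilization, extra reward for the global
# best path).  The concrete formulas are illustrative, not the paper's.
def update_pheromone(tau, paths, best_path, rho=0.5, q=1.0):
    """tau: dict edge -> pheromone; paths: list of (path, cost); best_path: (path, cost)."""
    # Adaptive volatilization: evaporate less when the colony stagnates
    # (all ants found similar costs), more when costs are spread out.
    costs = [c for _, c in paths]
    spread = (max(costs) - min(costs)) / (max(costs) + 1e-9)
    rho_eff = rho * (0.5 + spread)
    for edge in tau:
        tau[edge] *= (1.0 - rho_eff)
    for path, cost in paths:                       # local deposit by each ant
        for edge in zip(path, path[1:]):
            tau[edge] = tau.get(edge, 0.0) + q / cost
    bp, bc = best_path                             # global-best reinforcement
    for edge in zip(bp, bp[1:]):
        tau[edge] = tau.get(edge, 0.0) + 2.0 * q / bc
    return tau

tau = {("s", "a"): 1.0, ("a", "t"): 1.0, ("s", "t"): 1.0}
paths = [(["s", "a", "t"], 2.0), (["s", "t"], 1.5)]
print(update_pheromone(tau, paths, best_path=(["s", "t"], 1.5)))
```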

  4. Model Based Analysis and Test Generation for Flight Software

    Science.gov (United States)

    Pasareanu, Corina S.; Schumann, Johann M.; Mehlitz, Peter C.; Lowry, Mike R.; Karsai, Gabor; Nine, Harmon; Neema, Sandeep

    2009-01-01

    We describe a framework for model-based analysis and test case generation in the context of a heterogeneous model-based development paradigm that uses and combines MathWorks and UML 2.0 models and the associated code generation tools. This paradigm poses novel challenges to analysis and test case generation that, to the best of our knowledge, have not been addressed before. The framework is based on a common intermediate representation for different modeling formalisms and leverages and extends model checking and symbolic execution tools for model analysis and test case generation, respectively. We discuss the application of our framework to software models for a NASA flight mission.

  5. Beyond the 2nd Fermi Pulsar Catalog

    CERN Document Server

    Hou, Xian; Reposeur, Thierry; Rousseau, Romain

    2013-01-01

    Over thirteen times more gamma-ray pulsars have now been studied with the Large Area Telescope on NASA's Fermi satellite than the ten seen with the Compton Gamma-Ray Observatory in the nineteen-nineties. The large sample is diverse, allowing better understanding both of the pulsars themselves and of their roles in various cosmic processes. Here we explore the prospects for even more gamma-ray pulsars as Fermi enters the 2nd half of its nominal ten-year mission. New pulsars will naturally tend to be fainter than the first ones discovered. Some of them will have unusual characteristics compared to the current population, which may help discriminate between models. We illustrate a vision of the future with a sample of six pulsars discovered after the 2nd Fermi Pulsar Catalog was written.

  6. Two Sources of Control over the Generation of Software Instructions

    CERN Document Server

    Hartley, A; Hartley, Anthony; Paris, Cecile

    1996-01-01

    This paper presents an analysis conducted on a corpus of software instructions in French in order to establish whether task structure elements (the procedural representation of the users' tasks) are alone sufficient to control the grammatical resources of a text generator. We show that the construct of genre provides a useful additional source of control enabling us to resolve undetermined cases.

  7. Newton's 2nd Law: Inquiry Approach

    Science.gov (United States)

    Cecilia Tung

    2010-01-01

    In this lab activity, learners act as fellow scientists and colleagues of Isaac Newton. He has asked them to independently test his ideas on the nature of motion, in particular his 2nd Law. The emphasis here is on the process of science rather than the actual results. Learners can use the Science Flowchart to trace and discuss their process. Time estimate and materials are given for learners to run their designed experiments.

  8. 2nd International Arctic Ungulate Conference

    OpenAIRE

    Anonymous, A.

    1996-01-01

    The 2nd International Arctic Ungulate Conference was held 13-17 August 1995 on the University of Alaska Fairbanks campus. The Institute of Arctic Biology and the Alaska Cooperative Fish and Wildlife Research Unit were responsible for organizing the conference with assistance from biologists with state and federal agencies and commercial organizations. David R. Klein was chair of the conference organizing committee. Over 200 people attended the conference, coming from 10 different countries. T...

  9. Overview of the next generation of Fermilab collider software

    International Nuclear Information System (INIS)

    Fermilab is entering an era of operating a more complex collider facility. In addition, new operator workstations are available that have increased capabilities. The task of providing updated software in this new environment precipitated a project called Colliding Beam Software (CBS). It was soon evident that a new approach was needed for developing console software. Hence CBS, although a common acronym, is too narrow a description. A new generation of the application program subroutine library has been created to enhance the existing programming environment with a set of value added tools. Several key Collider applications were written that exploit CBS tools. This paper will discuss the new tools and the underlying change in methodology in application program development for accelerator control at Fermilab. (author)

  10. Model-driven Generative Development of Measurement Software

    OpenAIRE

    Monperrus, Martin; Jézéquel, Jean-Marc; Baudry, Benoit; Champeau, Joël; Hoeltzener, Brigitte

    2011-01-01

    Metrics offer a practical approach to evaluate non-functional properties of domain-specific models. However, it is tedious and costly to develop and maintain measurement software for each domain-specific modeling language (DSML). In this paper, we present the principles of a domain-independent, metamodel-independent and generative approach to measuring models. The approach is operationalized through a prototype that synthesizes a measurement infrastructure for a DSML. This model-driven meas...

  11. 2nd International Conference on Pattern Recognition

    CERN Document Server

    Marsico, Maria

    2015-01-01

    This book contains the extended and revised versions of a set of selected papers from the 2nd International Conference on Pattern Recognition (ICPRAM 2013), held in Barcelona, Spain, from 15 to 18 February, 2013. ICPRAM was organized by the Institute for Systems and Technologies of Information, Control and Communication (INSTICC) and was held in cooperation with the Association for the Advancement of Artificial Intelligence (AAAI). The hallmark of this conference was to encourage theory and practice to meet in a single venue. The focus of the book is on contributions describing applications of Pattern Recognition techniques to real-world problems, interdisciplinary research, experimental and/or theoretical studies yielding new insights that advance Pattern Recognition methods.

  12. Community Hurricane Preparedness, 2nd Edition

    Science.gov (United States)

    COMET

    2010-05-20

    The purpose of this course is to provide emergency managers who face threats from tropical cyclones and hurricanes with basic information about: How tropical cyclones form The hazards they pose How the NWS forecasts future hurricane behavior What tools and guiding principles can help emergency managers prepare their communities The course is not intended to take the place of courses sponsored by FEMA, the National Hurricane Center, and/or state agencies. However, it will provide a good background for those who either plan to attend those courses or cannot attend them. The original module was published in 2000. This 2nd edition provides updated information on hurricane science and National Weather Service forecast products. In addition, a new section on Emergency Management discusses decision-making tools that can help emergency managers in response and evacuation decision-making during hurricane threats. This module is course number IS-324.a in FEMA's Emergency Management Institute's Independent Study catalog.

  13. Web Style Guide, 2nd Edition

    Science.gov (United States)

    The Web Style Guide, 2nd Edition, which is the online version of a book with the same name, demonstrates the step-by-step process involved in designing a Web site. Visitors are assumed to be familiar with whatever Web publishing tool they are using. The guide gives few technical details but instead focuses on the usability, layout, and attractiveness of a Web site, with the goal being to make it as popular with the intended audience as possible. Considerations such as graphics, typography, and multimedia enhancements are discussed. Web site structure, fine-tuned features on individual pages, and almost everything in between is addressed by the guide, making it a handy resource for people who place great importance on the effectiveness of their online creations.

  14. Anti-Random Test Generation In Software Testing

    OpenAIRE

    seema rani

    2011-01-01

    The main purpose of software testing is to find errors and then correct them. Random testing selects test cases randomly, but it does not exploit information from previous tests. In anti-random testing, each test is chosen so that its total distance from all previous tests is maximal. Anti-random testing is a variation of random testing, which is the process of generating random input and sending that input to a system under test. Hamming distance and Cartesian distance are used as measures of difference.

  15. Anti-Random Test Generation In Software Testing

    Directory of Open Access Journals (Sweden)

    seema rani

    2011-06-01

    Full Text Available The main purpose of software testing is to find errors and then correct them. Random testing selects test cases randomly, but it does not exploit information from previous tests. In anti-random testing, each test is chosen so that its total distance from all previous tests is maximal. Anti-random testing is a variation of random testing, which is the process of generating random input and sending that input to a system under test. Hamming distance and Cartesian distance are used as measures of difference.
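    A small Python sketch of the anti-random selection rule: among candidate inputs, choose the one maximizing the summed Hamming and Cartesian distance to all previously executed tests. The encodings, candidate pool and equal weighting of the two distances are illustrative assumptions.

```python
# Anti-random test selection sketch: pick the candidate whose total
# Hamming + Cartesian distance to all previously executed tests is largest.
import math

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def cartesian(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def next_test(candidates, executed):
    def total_distance(c):
        return sum(hamming(c, e) + cartesian(c, e) for e in executed)
    return max(candidates, key=total_distance)

executed = [(0, 0, 0), (1, 1, 0)]
candidates = [(0, 1, 1), (1, 0, 1), (1, 1, 1)]
print("next anti-random test:", next_test(candidates, executed))
```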

  16. Software para la Evaluación y Selección de Generadores Independientes / Independent Generator Evaluation and Selection Software

    Scientific Electronic Library Online (English)

    Marcos Alberto, de Armas Teyra; Miguel, Castro Fernández.

    2014-04-01

    Full Text Available SciELO Cuba | Language: Spanish Abstract in Spanish (translated): In many industries, buildings, enterprises and services it is necessary, for production, emergency or reliability reasons, to generate electric power independently of the National Electric System. In other cases, power generation plants are needed that can operate in islanded mode feeding a particular circuit. To make the most economical and efficient selection of the capacity to be installed, one must take into consideration not only the behaviour of the system into which the unit will be inserted, but also the operational characteristics of the generator under load fluctuations, its operational limits and the resulting stability. This work presents software that supports the most adequate selection by considering the capability curve and the stability of the generator with respect to the particular features of the system. With its application it is possible to reduce expenses and economic losses due to an inappropriate selection. As a case study, its application to a plant at a food packaging factory is presented. Abstract in English: In many industries, buildings and services it is necessary to employ independent power plants to supply electric power to a particular system. In other cases, islanded operation is desired due to several specific situations. In order to make the most efficient, economical and adequate selection of the generator capacity, it is necessary to consider not only the system's load characteristics but also the generator's capability limits and the resulting stability. This paper presents software that allows the adequate generator to be selected, and exposes the operating limits and the resulting stability under fluctuating load conditions. With its application it is possible to reduce economic losses and costs due to an inappropriate generator selection with an oversized or underutilized machine. As a case study, a 2500 kVA power plant is analyzed.
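    To make the idea of checking an operating point against a generator capability curve concrete, the Python sketch below tests a point against a simplified armature (MVA) limit and a crude over-excitation limit. All ratings and the operating points are hypothetical and are not taken from the plant analyzed in the paper.

```python
# Illustrative check of a generator operating point against a simplified
# capability (P-Q) curve: an armature (MVA) limit and a crude field limit.
import math

def within_capability(p_mw, q_mvar, s_rated_mva=2.5, q_field_max=1.6):
    armature_ok = math.hypot(p_mw, q_mvar) <= s_rated_mva   # stator heating limit
    field_ok = q_mvar <= q_field_max                         # over-excitation limit
    return armature_ok and field_ok

for point in [(2.0, 1.0), (2.3, 1.2), (1.0, 1.8)]:
    print(point, "->", "OK" if within_capability(*point) else "outside limits")
```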

  17. 2nd International Arctic Ungulate Conference

    Directory of Open Access Journals (Sweden)

    A. Anonymous

    1996-01-01

    Full Text Available The 2nd International Arctic Ungulate Conference was held 13-17 August 1995 on the University of Alaska Fairbanks campus. The Institute of Arctic Biology and the Alaska Cooperative Fish and Wildlife Research Unit were responsible for organizing the conference with assistance from biologists with state and federal agencies and commercial organizations. David R. Klein was chair of the conference organizing committee. Over 200 people attended the conference, coming from 10 different countries. The United States, Canada, and Norway had the largest representation. The conference included invited lectures; panel discussions, and about 125 contributed papers. There were five technical sessions on Physiology and Body Condition; Habitat Relationships; Population Dynamics and Management; Behavior, Genetics and Evolution; and Reindeer and Muskox Husbandry. Three panel sessions discussed Comparative caribou management strategies; Management of introduced, reestablished, and expanding muskox populations; and Health risks in translocation of arctic ungulates. Invited lectures focused on the physiology and population dynamics of arctic ungulates; contaminants in food chains of arctic ungulates and lessons learned from the Chernobyl accident; and ecosystem level relationships of the Porcupine Caribou Herd.

  18. 2nd European symposium on liquid ventilation.

    Science.gov (United States)

    Valls i Soler, A; Wauer, R R

    2001-03-26

    Following the successful meeting in Berlin, further advances and research results in the understanding of liquid ventilation have been achieved. About 80 applied and basic scientists met at the 1st European Symposium, from clinicians (pediatricians, neonatologists, intensivists, etc.) to other basic scientists (physiologists, biologists, bioengineers, etc.). Furthermore, we also invited representatives of the pharmaceutical industry interested in this hot topic. Our main goal is to provide an opportunity for all researchers in this field to meet together and with the top scientists of liquid ventilation research. We planned to provide both a scientific and a friendly atmosphere to enhance the exchange of experiences and to facilitate future plans. We hope this 2nd European Symposium will be a continuation point for collaboration of groups in Europe, to study all research aspects of the technique and to carry out future trials. There are still a lot of unanswered questions to be solved. Among the unsolved issues and practical questions we would like to point out the following items: 1. Perfluorocarbon: which product to use and how to deliver it. 2. Perfluorocarbon interactions in the lung. 3. Perfluorocarbon: toxicity and cytoprotection. 4. Partial liquid ventilation: ventilatory strategies from delivery to weaning. 5. Impact of partial liquid ventilation: experimental and clinical aspects. 6. General discussion and plan for the future. We know that none of these questions can be completely answered now, but hope collaboration and communication will bring us closer to achieving these goals. Moreover, concerted actions should be started to search for research grant funding. For all those reasons we would like to thank all active and passive participants who came to Bilbao to present, discuss and foster future work in liquid ventilation. PMID:11309225

  19. 2nd International technical meeting on small reactors

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2013-07-01

    The 2nd International Technical Meeting on Small Reactors was held on November 7-9, 2012 in Ottawa, Ontario. The meeting was hosted by Atomic Energy of Canada Limited (AECL) and Canadian Nuclear Society (CNS). There is growing international interest and activity in the development of small nuclear reactor technology. This meeting provided participants with an opportunity to share ideas and exchange information on new developments. This Technical Meeting covered topics of interest to designers, operators, researchers and analysts involved in the design, development and deployment of small reactors for power generation and research. A special session focussed on small modular reactors (SMR) for generating electricity and process heat, particularly in small grids and remote locations. Following the success of the first Technical Meeting in November 2010, which captured numerous accomplishments of low-power critical facilities and small reactors, the second Technical Meeting was dedicated to the achievements, capabilities, and future prospects of small reactors. This meeting also celebrated the 50th Anniversary of the Nuclear Power Demonstration (NPD) reactor which was the first small reactor (20 MWe) to generate electricity in Canada.

  20. Elements of the Next Generation Science Standards' (NGSS) New Framework for K-12 Science Education aligned with STEM designed projects created by Kindergarten, 1st and 2nd grade students in a Reggio Emilio project approach setting

    Science.gov (United States)

    Facchini, Nicole

    This paper examines how elements of the Next Generation Science Standards' (NGSS) New Framework for K-12 Science Education (National Research Council 2011), specifically the cross-cutting concept "cause and effect", are aligned with early childhood students' creation of projects of their choice. The study took place in a Reggio Emilio-inspired, K-12 school, in a multi-aged kindergarten, first and second grade classroom with 14 students. Students worked on their projects independently with the assistance of their peers and teachers. The students' projects and their alignment with the Next Generation Science Standards' New Framework were analyzed by using pre and post assessments, student interviews, and discourse analysis. Results indicate that elements of the New Framework for K-12 Science Education emerged through students' project presentations, particularly regarding the notion of "cause and effect". More specifically, results show that initially students perceived the relationship between "cause and effect" to be negative.

  1. Two live attenuated Shigella flexneri 2a strains WRSf2G12 and WRSf2G15: a new combination of gene deletions for 2nd generation live attenuated vaccine candidates.

    Science.gov (United States)

    Ranallo, Ryan T; Fonseka, Suramya; Boren, Tara L; Bedford, Lisa A; Kaminski, Robert W; Thakkar, Sejal; Venkatesan, Malabi M

    2012-07-20

    Shigella infections are a major cause of inflammatory diarrhea and dysentery worldwide. First-generation virG-based live attenuated Shigella strains have been successfully tested in phase I and II clinical trials and are a leading approach for Shigella vaccine development. Additional gene deletions in senA, senB and msbB2 have been engineered into second-generation virG-based Shigella flexneri 2a strains producing WRSf2G12 and WRSf2G15. Both strains harbor a unique combination of gene deletions designed to increase the safety of live Shigella vaccines. WRSf2G12 and WRSf2G15 are genetically stable and highly attenuated in both cell culture and animal models of infection. Ocular immunization of guinea pigs with either strain induces robust systemic and mucosal immune responses that protect against homologous challenge with wild-type Shigella. The data support further evaluation of the second-generation strains in a phase I clinical trial. PMID:22658966

  2. The 2nd International Home Care Nurses Organization conference.

    Science.gov (United States)

    Narayan, Mary

    2015-03-01

    The International Home Care Nurses Organization, a grass-roots movement, held its 2nd annual conference in Singapore in September 2014. This article describes the highlights of the conference. PMID:25738272

  3. Horizontal Branch Morphology and the 2nd Parameter Problem

    OpenAIRE

    Pecci, F. Fusi; Bellazzini, M.; Ferraro, F. R.; Buonanno, R.; Corsi, C.

    1996-01-01

    We review the most outstanding issues related to the study of the morphology of the Horizontal Branch (HB) in the Color-Magnitude Diagrams of Galactic Globular Clusters and its use as age indicator. It is definitely demonstrated (see also Bolte, this meeting) that age cannot be the only 2nd-P driving the HB morphology. Other candidate 2nd-Ps are briefly examined, with special attention to the possible influence of cluster stellar density.

  4. Software Defined Radio Architecture Contributions to Next Generation Space Communications

    Science.gov (United States)

    Kacpura, Thomas J.; Eddy, Wesley M.; Smith, Carl R.; Liebetreu, John

    2015-01-01

    Space communications architecture concepts, comprising the elements of the system, the interactions among them, and the principles that govern their development, are essential factors in developing National Aeronautics and Space Administration (NASA) future exploration and science missions. Accordingly, vital architectural attributes encompass flexibility, the extensibility to insert future capabilities, and the ability to evolve to provide interoperability with other current and future systems. Space communications architectures and technologies for this century must satisfy a growing set of requirements, including those for Earth sensing, collaborative observation missions, robotic scientific missions, human missions for exploration of the Moon and Mars where surface activities require supporting communications, and in-space observatories for observing the earth, as well as other star systems and the universe. An advanced, integrated communications infrastructure will enable the reliable, multipoint, high-data-rate capabilities needed on demand to provide continuous, maximum coverage for areas of concentrated activity. Importantly, the cost/value proposition of the future architecture must be an integral part of its design; an affordable and sustainable architecture is indispensable within anticipated future budget environments. Effective architecture design informs decision makers with insight into the capabilities needed to efficiently satisfy the demanding space-communication requirements of future missions and to formulate appropriate requirements. A driving requirement for the architecture is the extensibility to address new requirements and provide low-cost on-ramps for new capabilities insertion, ensuring graceful growth as new functionality and new technologies are infused into the network infrastructure. In addition to extensibility, another key architectural attribute is the space communication equipment's interoperability with other NASA communications systems, as well as with the communications and navigation systems operated by international space agencies and civilian and government agencies. In this paper, we review the philosophies, technologies, architectural attributes, mission services, and communications capabilities that form the structure of candidate next-generation integrated communication architectures for space communications and navigation. A key area that this paper explores is the development and operation of the software defined radio for the NASA Space Communications and Navigation (SCaN) Testbed currently on the International Space Station (ISS). Lessons learned from its development and operation feed back into the communications architecture. Leveraging the reconfigurability changes the way that operations are done, and this must be considered. Quantifying the impact on the NASA Space Telecommunications Radio System (STRS) software defined radio architecture provides feedback to keep the standard useful and up to date. NASA is not the only customer of these radios. Software defined radios are developed for other applications, and taking advantage of these developments promotes an architecture that is cost effective and sustainable. Developments in areas such as an updated operating environment, higher data rates, networking and security can be leveraged. The ability to sustain an architecture that uses radios for multiple markets can lower costs and keep new technology infused.

  5. Basis Principles of Software Development for Eddy Current Inspection of PWR/WWER Steam Generator Tubes

    International Nuclear Information System (INIS)

    Extensive inspection of PWR/WWER steam generators, together with the development of in-house designs of eddy current inspection systems including manipulators, push-pullers, controllers, probes, etc., influenced INETEC's decision to start developing its own software for EC inspections. Over the last year remarkable results were obtained: the main software packages were completed, with capabilities exceeding those of other software available on the world market. In this article some basic principles of EC NDT software development are described, including organizational aspects of the software team, a description of the tasks, and a description of the main achievements. Associated problems and future development directions are also discussed. (author)

  6. Yxilon: Designing The Next Generation, Vertically Integrable Statistical Software Environment

    OpenAIRE

    Ziegenhagen, Uwe; Klinke, Sigbert; Härdle, Wolfgang Karl

    2004-01-01

    Modern statistical computing requires smooth integration of new algorithms and quantitative analysis results in all sorts of platforms such as web browsers and standard and proprietary application software. Common statistical software packages often cannot be adapted to integrate into new environments or simply fail to meet the demands that users, and especially beginners, have. With Yxilon we propose a vertically integrable, modular statistical computing environment, providing the user a rich set of methods a...

  7. 2nd ISPRA nuclear electronics symposium, Stresa, Italy May 20-23, 1975

    International Nuclear Information System (INIS)

    Two round tables were annexed to the 2nd Ispra Nuclear Electronics Symposium. The first was concerned with software support for the implementation of microprocessors, MOS and bipolar microprocessors, environmental data systems, and the use of microprocessors and minicomputers in nuclear, biomedical and environmental fields. The future of nuclear electronics and its diversification, gravitational waves and electronics, and environmental measurements of air and water quality were discussed during the second round table, and relevant views were brought out during the discussion on the extension of nuclear electronics techniques to other fields

  8. Advanced Virgo: a 2nd generation interferometric gravitational wave detector

    CERN Document Server


    2014-01-01

    Advanced Virgo is the project to upgrade the Virgo interferometric detector of gravitational waves, with the aim of increasing the number of observable galaxies (and thus the detection rate) by three orders of magnitude. The project is now in an advanced construction phase and the assembly and integration will be completed by the end of 2015. Advanced Virgo will be part of a network with the two Advanced LIGO detectors in the US and GEO HF in Germany, with the goal of contributing to the early detections of gravitational waves and to opening a new observation window on the universe. In this paper we describe the main features of the Advanced Virgo detector and outline the status of the construction.

  9. Performance of 2nd generation CALICE/EUDET ASICs

    Science.gov (United States)

    de La Taille, C.; CALICE Collaboration; EUDET Collaboration

    2011-04-01

    The paper reviews the performance of the three ASICs: HARDROC2, SPIROC2 and SKIROC2, developed to read out the ILC calorimeter prototypes. The chips integrate 36 to 64 channels of front-end, digitization and backend electronics in SiGe 0.35 μm technology. This second version was found mature enough to be produced in several hundreds to equip large scale technological prototypes and establish the feasibility of these highly granular "imaging" calorimeters as required for particle flow algorithms at the ILC. The low noise and low power sequential readout as well as power-pulsing operation at detector level and in magnetic field are proven.

  10. Performance of 2nd generation CALICE/EUDET ASICs

    Energy Technology Data Exchange (ETDEWEB)

    La Taille, C de, E-mail: taille@lal.in2p3.fr

    2011-04-01

    The paper reviews the performance of the three ASICs: HARDROC2, SPIROC2 and SKIROC2, developed to read out the ILC calorimeter prototypes. The chips integrate 36 to 64 channels of front-end, digitization and backend electronics in SiGe 0.35 μm technology. This second version was found mature enough to be produced in several hundreds to equip large scale technological prototypes and establish the feasibility of these highly granular 'imaging' calorimeters as required for particle flow algorithms at the ILC. The low noise and low power sequential readout as well as power-pulsing operation at detector level and in magnetic field are proven.

  11. 2nd generation ASICs for CALICE/EUDET calorimeters

    International Nuclear Information System (INIS)

    Imaging calorimetry depends heavily on the development of high performance, highly integrated readout ASICs embedded inside the detector which readout the millions of foreseen channels. Suitable ASICs prototypes have been fabricated in 2006-2007 and show good preliminary performance.

  12. 2nd generation ASICs for CALICE/EUDET calorimeters

    Science.gov (United States)

    Dulucq, F.; Fleury, J.; de La Taille, C.; Martin-Chassard, G.; Raux, L.; Seguin-Moreau, N.

    2009-04-01

    Imaging calorimetry depends heavily on the development of high performance, highly integrated readout ASICs embedded inside the detector which readout the millions of foreseen channels. Suitable ASICs prototypes have been fabricated in 2006-2007 and show good preliminary performance.

  13. Safety profile of bilastine: 2nd generation H1-antihistamines.

    Science.gov (United States)

    Scaglione, F

    2012-12-01

    Bilastine is a new H1 antagonist with no sedative side effects, no cardiotoxic effects, and no hepatic metabolism. In addition, bilastine has proved to be effective for the symptomatic treatment of allergic rhinoconjunctivitis and urticaria. Pharmacological studies have shown that bilastine is highly selective for the H1 receptor in both in vivo and in vitro studies, with no apparent affinity for other receptors. The absorption of bilastine is fast, linear and dose-proportional; it appears to be safe and well tolerated at all dose levels in the healthy population. Multiple administration of bilastine has confirmed the linearity of the kinetic parameters. The distribution in the brain is undetectable. The safety profile in terms of adverse effects is very similar to placebo in all Phase I, II and III clinical trials. Bilastine (20 mg), unlike cetirizine, does not increase alcohol effects on the CNS. Bilastine 20 mg does not increase the CNS depressant effect of lorazepam. Bilastine 20 mg is similar to placebo in the driving test. Therefore, it meets the current criteria for medication used in the treatment of allergic rhinitis and urticaria. PMID:23242729

  14. JNCI 92#3/2nd pages

    Science.gov (United States)

    New Guidelines to Evaluate the Response to Treatment in Solid Tumors. Patrick Therasse, Susan G. Arbuck, Elizabeth A. Eisenhauer, Jantien Wanders, Richard S. Kaplan, Larry Rubinstein, Jaap Verweij, Martine Van Glabbeke, Allan T. van Oosterom, Michaele C. Christian, Steve G. Gwyther. Anticancer cytotoxic agents go through a process by which their antitumor activity, on the basis of the amount of tumor shrinkage they could generate, has been investigated.

  15. Development of a Prototype Automation Simulation Scenario Generator for Air Traffic Management Software Simulations

    Science.gov (United States)

    Khambatta, Cyrus F.

    2007-01-01

    A technique for automated development of scenarios for use in the Multi-Center Traffic Management Advisor (McTMA) software simulations is described. The resulting software is designed and implemented to automate the generation of simulation scenarios with the intent of reducing the time it currently takes using an observational approach. The software program is effective in achieving this goal. The scenarios created for use in the McTMA simulations are based on data taken from data files from the McTMA system, and were manually edited before incorporation into the simulations to ensure accuracy. Despite the software's overall favorable performance, several key software issues are identified. Proposed solutions to these issues are discussed. Future enhancements to the scenario generator software may address the limitations identified in this paper.

  16. SEMANTIC WEB-BASED SOFTWARE ENGINEERING BY AUTOMATED REQUIREMENTS ONTOLOGY GENERATION IN SOA

    OpenAIRE

    Vahid Rastgoo; Monireh-Sadat Hosseini; Esmaeil Kheirkhah

    2014-01-01

    This paper presents an approach for automated generation of requirements ontology using UML diagrams in service-oriented architecture (SOA). The goal of this paper is to facilitate software engineering processes such as software design, software reuse, and service discovery. The proposed method is based on four conceptual layers. The first layer includes requirements achieved by stakeholders, the second one designs service-oriented diagrams from the data in f...

  17. Automatic test vector generation and coverage analysis in model-based software development

    OpenAIRE

    Andersson, Jonny

    2005-01-01

    Thorough testing of software is necessary to assure the quality of a product before it is released. The testing process requires substantial resources in software development. Model-based software development provides new possibilities to automate parts of the testing process. By automating tests, valuable time can be saved. This thesis focuses on different ways to utilize models for automatic generation of test vectors and how test coverage analysis can be used to assure the quality of a tes...

  18. Thermoluminescent characteristics of ZrO2:Nd films

    International Nuclear Information System (INIS)

    This work presents the results obtained from analysing the photoluminescent and thermoluminescent characteristics of neodymium-activated zirconium oxide (ZrO2:Nd) and its possible application in UV radiation dosimetry. The experiments were aimed at studying characteristics such as the optimum thermal erasing treatment, the influence of light on the response, the response as a function of wavelength, the fading of the information, the temperature effect, the response as a function of time, and the repeatability of the response. The results show that ZrO2:Nd is a promising material for use as a TL dosemeter for UV radiation. (Author)

  19. The 2nd reactor core of the NS Otto Hahn

    International Nuclear Information System (INIS)

    Details of the design of the 2nd reactor core are given, followed by a brief report summarising the operating experience gained with this 2nd core, as well as by an evaluation of measured data and statements concerning the usefulness of the knowledge gained for the development of future reactor cores. Quite a number of these data have been used to improve the concept and thus the specifications for the fuel elements of the 3rd core of the reactor of the NS Otto Hahn. (orig./HP)

  20. Using Automatic Code Generation in the Attitude Control Flight Software Engineering Process

    Science.gov (United States)

    McComas, David; O'Donnell, James R., Jr.; Andrews, Stephen F.

    1999-01-01

    This paper presents an overview of the attitude control subsystem flight software development process, identifies how the process has changed due to automatic code generation, analyzes each software development phase in detail, and concludes with a summary of our lessons learned.

  1. 2nd Quarter Transportation Report FY 2014

    Energy Technology Data Exchange (ETDEWEB)

    Gregory, L.

    2014-07-30

    This report satisfies the U.S. Department of Energy (DOE), National Nuclear Security Administration Nevada Field Office (NNSA/NFO) commitment to prepare a quarterly summary report of radioactive waste shipments to the Nevada National Security Site (NNSS) Radioactive Waste Management Complex (RWMC) at Area 5. There were no shipments sent for offsite treatment and returned to the NNSS this quarter. This report summarizes the second quarter of fiscal year (FY) 2014 low-level radioactive waste (LLW) and mixed low-level radioactive waste (MLLW) shipments. This report also includes annual summaries for FY 2014 in Tables 4 and 5. Tabular summaries are provided which include the following: Sources of and carriers for LLW and MLLW shipments to and from the NNSS; Number and external volume of LLW and MLLW shipments; Highway routes used by carriers; and Incident/accident data applicable to LLW and MLLW shipments. In this report shipments are accounted for upon arrival at the NNSS, while disposal volumes are accounted for upon waste burial. The disposal volumes presented in this report do not include minor volumes of non-radioactive materials that were approved for disposal. Volume reports showing cubic feet (ft3) generated using the Low-Level Waste Information System may vary slightly due to differing rounding conventions.

  2. Development and Testing of Automatically Generated ACS Flight Software for the MAP Spacecraft

    Science.gov (United States)

    O'Donnell, James R., Jr.; McComas, David C.; Andrews, Stephen F.

    1998-01-01

    By integrating the attitude determination and control system (ACS) analysis and design, flight software development, and flight software testing processes, it is possible to improve the overall spacecraft development cycle, as well as allow for more thorough software testing. One of the ways to achieve this integration is to use code-generation tools to automatically generate components of the ACS flight software directly from a high-fidelity (HiFi) simulation. In the development of the Microwave Anisotropy Probe (MAP) spacecraft, currently underway at the NASA Goddard Space Flight Center, approximately 1/3 of the ACS flight software was automatically generated. In this paper, we will examine each phase of the ACS subsystem and flight software design life cycle: analysis, design, and testing. In the analysis phase, we scoped how much software we would automatically generate and created the initial interface. The design phase included parallel development of the HiFi simulation and the hand-coded flight software components. Everything came together in the test phase, in which the flight software was tested, using results from the HiFi simulation as one of the bases of comparison for testing. Because parts of the spacecraft HiFi simulation were converted into flight software, more care needed to be put into its development and configuration control to support both the HiFi simulation and flight software. The components of the HiFi simulation from which code was generated needed to be designed based on the fact that they would become flight software. This process involved such considerations as protecting against mathematical exceptions, using acceptable module and parameter naming conventions, and using an input/output interface compatible with the rest of the flight software. Maintaining good configuration control was an issue for the HiFi simulation and the flight software, and a way to track the two systems was devised. Finally, an integrated test approach was devised to support flight software testing at both the unit- and build-test levels using the HiFi simulation to generate data for performance verification. Another benefit of the simulation and code-generation application used on the MAP project is that it supported bringing flight software and test data into the HiFi simulation environment. It was possible to integrate parts of the hand-coded flight software into the HiFi simulation, and also possible to import flight software test data for comparison and performance verification. This capability was used to incorporate the flight software Kalman filter into the HiFi simulation. This enabled us to greatly increase the amount of testing that could be done on the filter, because we could exert a greater degree of control over the software-only simulation than over the flight software test environment. Also, since the simulation could be used to run the Kalman filter faster than real time, our testing efficiency was greatly increased. We will conclude our discussion with a summary of the lessons learned thus far using automatically generated code for the MAP project, and the spacecraft status as we work towards our scheduled launch in the year 2000.

  3. A Code Generator for Software Component Services in Smart Devices

    OpenAIRE

    Ahmad, Manzoor

    2010-01-01

    A component is built to be reused, and reusability has a significant impact on component generality and flexibility requirements. A component model plays a critical role in the reusability of software components and defines a set of standards for component implementation, evolution, composition, deployment and standardization of the run-time environment for component execution. In component based development (CBD), standardization of the runtime environment includes specification of component’s i...

  4. 2nd International Conference on Data Management Technologies and Applications

    CERN Document Server

    2013-01-01

    The 2nd International Conference on Data Management Technologies and Applications (DATA) aims to bring together researchers, engineers and practitioners interested in databases, data warehousing, data mining, data management, data security and other aspects of information systems and technology involving advanced applications of data.

  5. SrF2:Nd3+ laser fluoride ceramics.

    Science.gov (United States)

    Basiev, T T; Doroshenko, M E; Konyushkin, V A; Osiko, V V

    2010-12-01

    SrF(2):Nd(3+) fluoride ceramics of high optical quality were prepared and their spectroscopic and laser properties investigated. Oscillation of different optical centers, depending on the excitation wavelength, was obtained with a slope efficiency of up to 19%. PMID:21124595

  6. 2nd National Conference on Theoretical Physics. Abstracts Book

    International Nuclear Information System (INIS)

    The 2nd National Conference on Theoretical Physics was held on 26-29 August 2004 in Constanta, Romania. The physics fields addressed within the INIS scope are as follows: classical and quantum mechanics, general physics, physics of elementary particles and fields, nuclear physics and radiation physics, atomic and molecular physics, condensed matter physics

  7. A Handbook for Classroom Instruction That Works, 2nd Edition

    Science.gov (United States)

    Association for Supervision and Curriculum Development, 2012

    2012-01-01

    Perfect for self-help and professional learning communities, this handbook makes it much easier to apply the teaching practices from the ASCD-McREL best-seller "Classroom Instruction That Works: Research-Based Strategies for Increasing Student Achievement, 2nd Edition." The authors take you through the refined Instructional Planning Guide, so you…

  8. Food irradiation - 2nd all-German conference. Proceedings

    International Nuclear Information System (INIS)

    The 2nd conference on 'Food Irradiation' in re-united Germany took place in Eggenstein-Leopoldshafen at the Karlsruhe Nuclear Research Centre, 9th to 10th December 1992. Participants came from government investigating agencies and research institutions of the German Federal Government and the Federal States. Abstracts focus on issues of food laws and certification of irradiation treatment. (UHE)

  9. Book Review: Bioassays with Arthropods: 2nd Edition

    Science.gov (United States)

    The technical book "Bioassays with Arthropods: 2nd Edition" (2007. Jacqueline L. Robertson, Robert M. Russell, Haiganoush K. Preisler and N. E. Nevin, Eds. CRC Press, Boca Raton, FL, 224 pp.) was reviewed for the scientific readership of the peer-reviewed publication Journal of Economic Entomology. ...

  10. Learning from examples - Generation and evaluation of decision trees for software resource analysis

    Science.gov (United States)

    Selby, Richard W.; Porter, Adam A.

    1988-01-01

    A general solution method for the automatic generation of decision (or classification) trees is investigated. The approach is to provide insights through in-depth empirical characterization and evaluation of decision trees for software resource data analysis. The trees identify classes of objects (software modules) that had high development effort. Sixteen software systems ranging from 3,000 to 112,000 source lines were selected for analysis from a NASA production environment. The collection and analysis of 74 attributes (or metrics), for over 4,700 objects, captured information about the development effort, faults, changes, design style, and implementation style. A total of 9,600 decision trees were automatically generated and evaluated. The trees correctly identified 79.3 percent of the software modules that had high development effort or faults, and the trees generated from the best parameter combinations correctly identified 88.4 percent of the modules on the average.
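
    The record above describes automatically generated decision trees that flag software modules with high development effort from collected metrics. As a minimal, hypothetical illustration of that general technique (not the authors' original tool; the metric names, synthetic dataset and effort cut-off below are assumptions for the example), a classification tree can be trained on module metrics roughly as follows:

        # Sketch: classify software modules as "high effort" from simple metrics
        # using a decision tree. All metrics and labels below are synthetic.
        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.tree import DecisionTreeClassifier, export_text

        rng = np.random.default_rng(0)
        n = 500
        X = np.column_stack([
            rng.integers(50, 3000, n),   # lines of code (hypothetical metric)
            rng.integers(1, 60, n),      # cyclomatic complexity
            rng.integers(0, 40, n),      # number of changes
        ])
        # Synthetic "effort" score; the top quartile is labelled high effort.
        effort = 0.01 * X[:, 0] + 1.5 * X[:, 1] + 2.0 * X[:, 2] + rng.normal(0, 10, n)
        y = (effort > np.percentile(effort, 75)).astype(int)

        X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
        tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_train, y_train)
        print("test accuracy:", tree.score(X_test, y_test))
        print(export_text(tree, feature_names=["loc", "complexity", "changes"]))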

  11. AMON: A Software System for Automatic Generation of Ontology Mappings

    OpenAIRE

    Sánchez-Alberca, A.; García-García, R.; Sorzano, C. O. S.; Gutiérrez-Cossío, Celia; Chagoyen, Mónica; Fernández López, Mariano

    2005-01-01

    Some of the most outstanding problems in Computer Science (e.g. access to heterogeneous information sources, use of different e-commerce standards, ontology translation, etc.) are often approached through the identification of ontology mappings. A manual mapping generation slows down, or even makes unfeasible, the solution of particular cases of the aforementioned problems via ontology mappings. Some algorithms and formal models for partial tasks of automatic generation of mappings have been ...

  12. Automatically generated acceptance test: A software reliability experiment

    Science.gov (United States)

    Protzel, Peter W.

    1988-01-01

    This study presents results of a software reliability experiment investigating the feasibility of a new error detection method. The method can be used as an acceptance test and is solely based on empirical data about the behavior of internal states of a program. The experimental design uses the existing environment of a multi-version experiment previously conducted at the NASA Langley Research Center, in which the launch interceptor problem is used as a model. This allows the controlled experimental investigation of versions with well-known single and multiple faults, and the availability of an oracle permits the determination of the error detection performance of the test. Fault interaction phenomena are observed that have an amplifying effect on the number of error occurrences. Preliminary results indicate that all faults examined so far are detected by the acceptance test. This shows promise for further investigations, and for the employment of this test method on other applications.
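
    The acceptance test described above is based solely on empirical data about the behaviour of internal program states. Purely as a hedged sketch of how such a check might look (value ranges learned from reference runs and applied to a new run; all variable names and data are invented, and this is not the experiment's actual test):

        # Sketch: learn plausible ranges of internal state variables from
        # reference executions, then flag runs whose states fall outside them.
        def learn_state_ranges(reference_runs):
            """reference_runs: list of dicts mapping state-variable name -> value."""
            ranges = {}
            for run in reference_runs:
                for name, value in run.items():
                    lo, hi = ranges.get(name, (value, value))
                    ranges[name] = (min(lo, value), max(hi, value))
            return ranges

        def acceptance_test(run, ranges, tolerance=0.0):
            """Return the names of state variables outside their learned range."""
            return [name for name, value in run.items()
                    if name in ranges
                    and not (ranges[name][0] - tolerance <= value <= ranges[name][1] + tolerance)]

        reference = [{"velocity": 1.0, "range_km": 10.0}, {"velocity": 1.2, "range_km": 12.5}]
        ranges = learn_state_ranges(reference)
        print(acceptance_test({"velocity": 5.0, "range_km": 11.0}, ranges))  # ['velocity']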

  13. 2nd International Conference on Green Communications and Networks 2012

    CERN Document Server

    Ma, Maode; GCN 2012

    2013-01-01

    The objective of the 2nd International Conference on Green Communications and Networks 2012 (GCN 2012) is to facilitate an exchange of information on best practices for the latest research advances in the area of communications, networks and intelligence applications. These mainly involve computer science and engineering, informatics, communications and control, electrical engineering, information computing, and business intelligence and management. Proceedings of the 2nd International Conference on Green Communications and Networks 2012 (GCN 2012) will focus on green information technology and applications, which will provide in-depth insights for engineers and scientists in academia, industry, and government. The book addresses the most innovative research developments including technical challenges, social and economic issues, and presents and discusses the authors’ ideas, experiences, findings, and current projects on all aspects of advanced green information technology and applications. Yuhang Yang is ...

  14. Fusion Technologies: 2nd Karlsruhe International Summer School

    International Nuclear Information System (INIS)

    Nuclear fusion promises to deliver a future non-polluting energy supply with nearly unlimited fuel reserves. To win young scientists and engineers for nuclear fusion, the Karlsruhe Research Center, together with other partners in the European Fusion Education Network being established by the European Commission, organizes the 2nd Karlsruhe International Summer School on Fusion Technologies on September 1-12, 2008. The program covers all key technologies necessary for construction and operation of a fusion reactor. (orig.)

  15. Magnetic field analysis by using 2nd-order elements

    International Nuclear Information System (INIS)

    We are developing a code for analyzing static magnetic fields in magnets with cylindrical symmetry. The code under development uses the finite element method with 2nd-order elements. To confirm the accuracy of the code, we compared its calculation results with those of POISSON; the two magnetic field distributions agree within 2%. In this paper, we present the current status and point out the remaining problems. (author)
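
    For reference, second-order elements of the kind mentioned above are built from quadratic shape functions; a generic textbook form on the one-dimensional reference element \xi \in [-1, 1] (not taken from the code described in the record) is

        N_1(\xi) = \tfrac{1}{2}\,\xi(\xi - 1), \qquad
        N_2(\xi) = 1 - \xi^2, \qquad
        N_3(\xi) = \tfrac{1}{2}\,\xi(\xi + 1),

    so that a field value is interpolated inside the element as A_h(\xi) = \sum_{i=1}^{3} N_i(\xi)\, A_i.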

  16. Writing TAFs for Convective Weather, 2nd Edition

    Science.gov (United States)

    2014-09-14

    "Writing TAFs for Convective Weather, 2nd Edition" uses a severe thunderstorm event to illustrate techniques for producing an effective Terminal Aerodrome Forecast (TAF) following current National Weather Service directives. The unit offers guidance for developing TAFs for different types of convection and discusses how to concisely communicate logic and uncertainty in an aviation forecast discussion (AvnFD) or by other means. It also addresses the importance of maintaining an effective TAF weather watch and updating the TAF proactively.

  17. 2nd International Open and Distance Learning (IODL) Symposium

    OpenAIRE

    Barkan, Reviewed By Murat

    2006-01-01

    These closing remarks were prepared and presented by Prof. Dr. Murat BARKAN, Anadolu University, Eskisehir, TURKEY. DEAR GUESTS, As the 2nd International Open and Distance Learning (IODL) Symposium is now drawing to an end, I would like to thank you all for your outstanding speeches, distinguished presentations, constructive roundtable and session discussions, and active participation during the last five days. I hope you all share my view that the whole symposium has been...

  18. Software tool to generate MLC leaf sequence for the delivery of a given intensity profile

    International Nuclear Information System (INIS)

    In the step-and-shoot method of achieving beam intensity modulation, a field is divided into a set of subfields, each of which is irradiated with a uniform beam intensity level. Usually the fluence map for each beam, provided by the inverse treatment planning software, is normalized into a specified number of intensity levels. The leaf sequencing software then generates the multileaf collimator (MLC) leaf position sequence required to produce the fluence profile.
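
    As a minimal, hypothetical illustration of the step-and-shoot idea (not the tool described above), the sketch below decomposes a one-dimensional integer fluence profile into uniform-intensity segments, each defined by a pair of leaf positions:

        # Sketch: peel off one intensity level at a time; every maximal run of
        # remaining positive fluence becomes one MLC segment (left index, right
        # index), delivered with unit weight. Profile values are integer levels.
        def leaf_sequence(profile):
            remaining = list(profile)
            segments = []
            while any(v > 0 for v in remaining):
                i = 0
                while i < len(remaining):
                    if remaining[i] > 0:
                        j = i
                        while j < len(remaining) and remaining[j] > 0:
                            j += 1
                        segments.append((i, j - 1))   # leaves open from i to j-1
                        for k in range(i, j):
                            remaining[k] -= 1
                        i = j
                    else:
                        i += 1
            return segments

        print(leaf_sequence([0, 1, 3, 2, 0, 1]))   # [(1, 3), (5, 5), (2, 3), (2, 2)]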

  19. Semi-automatic generation of web-based computing environments for software libraries

    OpenAIRE

    Kressner, D.; Johansson, P.

    2002-01-01

    A set of utilities for generating web computing environments related to mathematical and engineering library software is presented. The web interface can be accessed from a standard world wide web browser with no need for additional software installations on the local machine. The environment provides a user-friendly access to computational routines, workspace management, reusable sessions and support of various data formats, including MATLAB binaries. The creation of new interfaces is a stra...

  20. TagGD: Fast and Accurate Software for DNA Tag Generation and Demultiplexing

    OpenAIRE

    Costea, Paul Igor; Lundeberg, Joakim; Akan, Pelin

    2013-01-01

    Multiplexing is of vital importance for utilizing the full potential of next generation sequencing technologies. We here report TagGD (DNA-based Tag Generator and Demultiplexor), a fully-customisable, fast and accurate software package that can generate thousands of barcodes satisfying user-defined constraints and can guarantee full demultiplexing accuracy. The barcodes are designed to minimise their interference with the experiment. Insertion, deletion and substitution events are considered ...
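
    As a rough sketch of the underlying idea only (greedy generation of DNA tags separated by a minimum pairwise Hamming distance; the tag length, distance threshold and acceptance criterion are illustrative assumptions, not TagGD's actual algorithm or constraint set):

        # Sketch: greedily collect random DNA barcodes whose pairwise Hamming
        # distance stays above a threshold. TagGD itself supports richer
        # user-defined constraints (GC content, homopolymers, indels, ...).
        import random

        def hamming(a, b):
            return sum(x != y for x, y in zip(a, b))

        def generate_barcodes(length=8, min_dist=3, n_wanted=50, seed=0):
            rng = random.Random(seed)
            barcodes = []
            attempts = 0
            while len(barcodes) < n_wanted and attempts < 100000:
                candidate = "".join(rng.choice("ACGT") for _ in range(length))
                if all(hamming(candidate, b) >= min_dist for b in barcodes):
                    barcodes.append(candidate)
                attempts += 1
            return barcodes

        tags = generate_barcodes()
        print(len(tags), tags[:3])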

  1. TagGD : Fast and Accurate Software for DNA Tag Generation and Demultiplexing

    OpenAIRE

    Costea, Paul Igor; Lundeberg, Joakim; Akan, Pelin

    2013-01-01

    Multiplexing is of vital importance for utilizing the full potential of next generation sequencing technologies. We here report TagGD (DNA-based Tag Generator and Demultiplexor), a fully-customisable, fast and accurate software package that can generate thousands of barcodes satisfying user-defined constraints and can guarantee full demultiplexing accuracy. The barcodes are designed to minimise their interference with the experiment. Insertion, deletion and substitution events are considered ...

  2. THR Simulator – the software for generating radiographs of THR prosthesis

    Directory of Open Access Journals (Sweden)

    Hou Sheng-Mou

    2009-01-01

    Full Text Available Abstract Background Measuring the orientation of the acetabular cup after total hip arthroplasty is important for prognosis. The verification of these measurement methods will be easier and more feasible if we can synthesize prosthesis radiographs in each simulated condition. One reported method used an expensive mechanical device with an indeterminable precision. We thus developed a program, THR Simulator, to directly synthesize digital radiographs of prostheses for further analysis. Under the Windows platform and using the Borland C++ Builder programming tool, we developed the THR Simulator. We first built a mathematical model of the acetabulum and femoral head. Data on the real dimensions of the prosthesis were used to generate the radiograph of the hip prosthesis. Then, with a ray tracing algorithm, we calculated the thickness each X-ray beam passed through and transformed it to grey scale by a mapping function, which was derived by fitting an exponential function to the phantom image. Finally we could generate a simulated radiograph for further analysis. Results Using THR Simulator, the users can incorporate many parameters together for radiograph synthesis. These parameters include thickness, film size, tube distance, film distance, anteversion, abduction, upper wear, medial wear, and posterior wear. These parameters are adequate for any radiographic measurement research. This THR Simulator has been used in two studies, and the errors are within 2° for anteversion and 0.2 mm for wear measurement. Conclusion We designed a program, THR Simulator, that can synthesize prosthesis radiographs. Such a program can be applied in future studies for further analysis and validation of measurement of various parameters of the pelvis after total hip arthroplasty.
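
    The thickness-to-grey-scale step described above can be illustrated with a small sketch: each ray's traversed thickness is converted to an intensity with an exponential attenuation model and scaled to 8-bit grey levels. The attenuation coefficient and scaling below are assumptions for the example, not the values fitted by THR Simulator from its phantom image.

        # Sketch: convert per-pixel metal thickness (mm) traversed by a ray into
        # an 8-bit grey value via exponential attenuation; mu is illustrative.
        import numpy as np

        def thickness_to_grey(thickness_mm, mu_per_mm=0.15, i0=255.0):
            transmitted = i0 * np.exp(-mu_per_mm * np.asarray(thickness_mm, dtype=float))
            grey = i0 - transmitted          # denser metal appears brighter
            return np.clip(np.rint(grey), 0, 255).astype(np.uint8)

        print(thickness_to_grey([0.0, 5.0, 20.0, 60.0]))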

  3. Software.

    Science.gov (United States)

    Journal of Chemical Education, 1989

    1989-01-01

    Presented are reviews of two computer software packages for Apple II computers; "Organic Spectroscopy," and "Videodisc Display Program" for use with "The Periodic Table Videodisc." A sample spectrograph from "Organic Spectroscopy" is included. (CW)

  4. 2nd International Conference on Computer Science, Applied Mathematics and Applications

    CERN Document Server

    Thi, Hoai; Nguyen, Ngoc

    2014-01-01

    The proceedings consist of 30 papers which have been selected and invited from the submissions to the 2nd International Conference on Computer Science, Applied Mathematics and Applications (ICCSAMA 2014) held on 8-9 May, 2014 in Budapest, Hungary. The conference is organized into 7 sessions: Advanced Optimization Methods and Their Applications, Queueing Models and Performance Evaluation, Software Development and Testing, Computational Methods for Mobile and Wireless Networks, Computational Methods for Knowledge Engineering, Logic Based Methods for Decision Making and Data Mining, and Nonlinear Systems and Applications. All chapters in the book discuss theoretical and practical issues connected with computational methods and optimization methods for knowledge engineering. The editors hope that this volume will be useful for graduate and Ph.D. students and researchers in Computer Science and Applied Mathematics. It is the hope of the editors that readers of this volume can find many inspiring idea...

  5. Advanced User Interface Generation in the Software Framework for Magnetic Measurements at CERN

    CERN Document Server

    Arpaia, P; La Commara, Giuseppe; Arpaia, Pasquale

    2010-01-01

    A model-based approach, the Model-View-Interactor Paradigm, for automatic generation of user interfaces in software frameworks for measurement systems is proposed. The Model-View-Interactor Paradigm is focused on the "interaction" typical in a software framework for measurement applications: the final user interacts with the automatic measurement system by executing a suitable high-level script previously written by a test engineer. According to the main design goal of frameworks, the proposed approach allows the user interfaces to be separated easily from the application logic, enhancing the flexibility and reusability of the software. As a practical case study, this approach has been applied to the flexible software framework for magnetic measurements at the European Organization for Nuclear Research (CERN). In particular, experimental results for the scenario of permeability measurements are reported.
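
    As a rough sketch of the separation this paradigm aims at (measurement logic decoupled from the view, with the user driving the system through a high-level script), and emphatically not the CERN framework's actual API (all class and method names below are invented):

        # Sketch of a Model-View-Interactor-style separation: the model knows
        # nothing about the view; the interactor executes a high-level script
        # written by a test engineer and forwards progress to the plugged-in view.
        class PermeabilityModel:
            def measure(self, current_amps):
                return 1.0 + 0.01 * current_amps      # placeholder measurement

        class ConsoleView:
            def show(self, message):
                print(message)

        class Interactor:
            def __init__(self, model, view):
                self.model, self.view = model, view

            def run_script(self, script):
                """script: iterable of (step_name, current) pairs."""
                for step_name, current in script:
                    value = self.model.measure(current)
                    self.view.show(f"{step_name}: permeability = {value:.3f}")

        Interactor(PermeabilityModel(), ConsoleView()).run_script(
            [("ramp-up", 10.0), ("plateau", 100.0)]
        )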

  6. GEOGRAPHICAL DESCRIPTION OF THE 2 ND DISTRICT OF BUCHAREST

    Directory of Open Access Journals (Sweden)

    Mariana Cârstea

    2009-10-01

    Full Text Available Located in the north-east of Bucharest, with a population of approx. 400,000 inhabitants, the current territory of the 2nd district was once part of the Vlăsiei forests, crossed by the river Colentina. It is a tabular plain with low declivity in the NW-SE direction; the only major relief features are those leading to the Colentina terrace, the tablelands and the anthropic relief. The Colentina Plain covers 36% of the Bucharest Municipality and is characterized by altitudes that vary between 88.9 m in the Free Press Square and 55 m at Cățelu. The Otopeni Field overlapping northern Bucharest (the Colentina, Băneasa and Pipera areas of the northern district) is characterized by altitudes of 85-90 m, relief fragmentation of 0.5 km per square km, a high frequency of tablelands, and increased local slopes (common values of 10 degrees). The 2nd district is in second place in terms of total area of green spaces (4,187,000 square meters), with an index of green space per capita of 13.6 square meters per head, although it is unevenly distributed within the sector. The vegetation of the 2nd district consists in particular of vegetation in parks (Circus Park, Plumbuita, Morarilor, Tei), gardens and green spaces around housing blocks. The valleys cut into the loess generally have steep sides with intense warping phenomena, and their meadows are sometimes covered by lakes or swamps. The largest lakes of the valley, created by dams, are located on the Colentina river. The defining geomorphologic characteristics are the result of erosion, transportation and deposition on the lower course of the Dâmbovița river. The average altitude of the capital is about 80 m.

  7. 2nd Karlsruhe International Summer School on Fusion Technologies

    International Nuclear Information System (INIS)

    For the second time, the Karlsruhe Research Center together with European research institutions and industries invited young scientists and engineers to its ''International Summer School on Fusion Technologies.'' Fifty participants from all over Europe attended the lectures by 35 experts presenting contributions from their areas of competence. Ten young scientists from India and another 10 from China were connected to the events by video link. Physics student Kornelia Stycz describes her impressions as a participant in the ''2nd International Summer School on Fusion Technologies.'' (orig.)

  8. GEOGRAPHICAL DESCRIPTION OF THE 2 ND DISTRICT OF BUCHAREST

    OpenAIRE

    Mariana Cârstea

    2009-01-01

    Located in the north-east of Bucharest, with a population of approx. 400,000 inhabitants, the current territory of the 2nd district was once part of the Vlăsiei forests, crossed by the river Colentina. It is a tabular plain with low declivity in the NW-SE direction; the only major relief features are those leading to the Colentina terrace, the tablelands and the anthropic relief. The Colentina Plain covers 36% of the Bucharest Municipality and is characterized by altitudes that vary between 88.9 m in the Free...

  9. Beyond Chat--New Generation Software for Real-Time Discussion.

    Science.gov (United States)

    Charlton, Colin; Little, Janet; Morris, Simon; Neilson, Irene

    This paper discusses the need for and the design requirements of generic chatroom software that is readily configurable to particular domains and that is also extensible. The architecture of a client/server chatroom generation system, ChatterBox, implemented in Java, is presented. The following components of ChatterBox are described: (1) the basic…

  10. Software quality assurance project for reactor physics codes at the Point Lepreau Generating Station

    International Nuclear Information System (INIS)

    One of the ongoing challenges faced by the Nuclear Industry is Software Quality Assurance (SQA). In this paper, a project to address SQA issues in the Reactor Physics Group at the Point Lepreau Generating Station (PLGS) will be discussed. The work illustrates a process which could be implemented at any facility to achieve code compliance to CSA Standard N286.7 requirements. (author)

  11. Scoping analysis of the Advanced Test Reactor using SN2ND

    Energy Technology Data Exchange (ETDEWEB)

    Wolters, E.; Smith, M. (NE NEAMS PROGRAM); ( SC)

    2012-07-26

    A detailed set of calculations was carried out for the Advanced Test Reactor (ATR) using the SN2ND solver of the UNIC code which is part of the SHARP multi-physics code being developed under the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program in DOE-NE. The primary motivation of this work is to assess whether high fidelity deterministic transport codes can tackle coupled dynamics simulations of the ATR. The successful use of such codes in a coupled dynamics simulation can impact what experiments are performed and what power levels are permitted during those experiments at the ATR. The advantages of the SN2ND solver over comparable neutronics tools are its superior parallel performance and demonstrated accuracy on large scale homogeneous and heterogeneous reactor geometries. However, it should be noted that virtually no effort from this project was spent constructing a proper cross section generation methodology for the ATR usable in the SN2ND solver. While attempts were made to use cross section data derived from SCALE, the minimal number of compositional cross section sets were generated to be consistent with the reference Monte Carlo input specification. The accuracy of any deterministic transport solver is impacted by such an approach and clearly it causes substantial errors in this work. The reasoning behind this decision is justified given the overall funding dedicated to the task (two months) and the real focus of the work: can modern deterministic tools actually treat complex facilities like the ATR with heterogeneous geometry modeling. SN2ND has been demonstrated to solve problems with upwards of one trillion degrees of freedom which translates to tens of millions of finite elements, hundreds of angles, and hundreds of energy groups, resulting in a very high-fidelity model of the system unachievable by most deterministic transport codes today. A space-angle convergence study was conducted to determine the meshing and angular cubature requirements for the ATR, and also to demonstrate the feasibility of performing this analysis with a deterministic transport code capable of modeling heterogeneous geometries. The work performed indicates that a minimum of 260,000 linear finite elements combined with an L3T11 cubature (96 angles on the sphere) is required for both eigenvalue and flux convergence of the ATR. A critical finding was that the fuel meat and water channels must each be meshed with at least 3 'radial zones' for accurate flux convergence. A small number of 3D calculations were also performed to show axial mesh and eigenvalue convergence for a full core problem. Finally, a brief analysis was performed with different cross section sets generated from DRAGON and SCALE, and the findings show that more effort will be required to improve the multigroup cross section generation process. The total number of degrees of freedom for a converged 27 group, 2D ATR problem is approximately 340 million. This number increases to approximately 25 billion for a 3D ATR problem. This scoping study shows that both 2D and 3D calculations are well within the capabilities of the current SN2ND solver, given the availability of a large-scale computing center such as BlueGene/P. However, dynamics calculations are not realistic without the implementation of improvements in the solver.

  12. Conference report: 2nd Medicon Valley Inhalation Symposium.

    Science.gov (United States)

    Lastow, Orest

    2014-02-01

    2nd Medicon Valley Inhalation Symposium 16 October 2013, Lund, Sweden The 2nd Medicon Valley Inhalation Symposium was arranged by the Medicon Valley Inhalation Consortium. It was held at the Medicon Village, which is the former AstraZeneca site in Lund, Sweden. It was a 1 day symposium focused on inhaled drug delivery and inhalation product development. 120 delegates listened to 11 speakers. The program was organized to follow the value chain of an inhalation product development. This year there was a focus on inhaled biomolecules. The inhaled delivery of insulin was covered by two presentations and a panel discussion. The future of inhaled drug delivery was discussed together with an overview of the current market situation. Two of the inhalation platforms, capsule inhalers and metered-dose inhalers, were discussed in terms of the present situation and the future opportunities. Much focus was on the regulatory and intellectual aspects of developing inhalation products. The manufacturing of a dry powder inhaler requires precision filling of powder, and the various techniques were presented. The benefits of nebulization and nasal delivery were illustrated with some case studies and examples. The eternal challenge of poor compliance was addressed from an industrial design perspective and some new approaches were introduced. PMID:24483190

  13. Método para generar casos de prueba funcional en el desarrollo de software / Generating functional testing case method in software development

    Scientific Electronic Library Online (English)

    Liliana, González Palacio.

    2009-07-01

    Full Text Available A crucial aspect of quality control in software development is testing and, within testing, functional tests, in which a dynamic verification of a system's behaviour is performed, based on the observation of a selected set of controlled executions or test cases [...]. Functional testing requires planning, which consists of defining the aspects to be checked and the way to verify their correct operation, the point at which test cases acquire their meaning. This research-derived article defines a method for generating functional test cases from system use cases, as an intermediate product of the co-financed project entitled "Herramienta para la documentación de pruebas funcionales". Abstract in english Testing is a main aspect in quality control of software development, especially functional tests. The aim of functional testing is to dynamically verify the system behavior, based on the observation of a given set of controlled executions or test cases. Planning is required to make functional tests, [...] defining the aspects to be checked and the way to verify their proper operation; this allows test cases to make sense. In this paper (research based), we propose a method to generate functional test cases from system use cases, based on the co-financed project "Tool for Documenting Functional Testing."

  14. Minimal Testcase Generation for Object-Oriented Software with State Charts

    Directory of Open Access Journals (Sweden)

    Ranjita Kumari Swain

    2012-08-01

    Full Text Available Today statecharts are a de facto standard in industry for modeling system behavior. Test data generation is one of the key issues in software testing. This paper proposes a reduction approach to test data generation for state-based software testing. First, a state transition graph is derived from the state chart diagram. Then, all the required information is extracted from the state chart diagram and test cases are generated. Lastly, the set of test cases is minimized by calculating the node coverage for each test case. It is also determined which test cases are covered by other test cases. The advantage of our test generation technique is that it optimizes test coverage by minimizing time and cost. The present test data generation scheme generates test cases which satisfy transition path coverage criteria, path coverage criteria and action coverage criteria. A case study on a Railway Ticket Vending Machine (RTVM) has been presented to illustrate our approach.
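
    The minimization step (dropping test cases whose covered nodes are already covered by others) can be sketched as a greedy set cover over node coverage; the coverage sets below are invented for illustration and are not taken from the RTVM case study:

        # Sketch: greedily keep the test case adding the most uncovered nodes
        # until every node covered by the full suite is covered again.
        def minimize_suite(coverage):
            """coverage: dict mapping test-case name -> set of covered node ids."""
            required = set().union(*coverage.values())
            kept, covered = [], set()
            while covered != required:
                best = max(coverage, key=lambda t: len(coverage[t] - covered))
                kept.append(best)
                covered |= coverage[best]
            return kept

        suite = {
            "tc1": {"s0", "s1", "s2"},
            "tc2": {"s1", "s2"},        # fully subsumed by tc1
            "tc3": {"s2", "s3", "s4"},
        }
        print(minimize_suite(suite))    # ['tc1', 'tc3']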

  15. Reliable Mining of Automatically Generated Test Cases from Software Requirements Specification (SRS)

    CERN Document Server

    Raamesh, Lilly

    2010-01-01

    Writing requirements is a two-way process. In this paper we classify Functional Requirements (FR) and Non-Functional Requirements (NFR) statements from Software Requirements Specification (SRS) documents. These are systematically transformed into state charts considering all relevant information. The current paper outlines how test cases can be automatically generated from these state charts. The application of the states yields the different test cases as solutions to a planning problem. The test cases can be used for automated or manual software testing at system level. The paper also presents a method for reduction of the test suite by using mining methods, thereby facilitating mining and knowledge extraction from test cases.

  16. Software for Generating Troposphere Corrections for InSAR Using GPS and Weather Model Data

    Science.gov (United States)

    Moore, Angelyn W.; Webb, Frank H.; Fishbein, Evan F.; Fielding, Eric J.; Owen, Susan E.; Granger, Stephanie L.; Bjoerndahl, Fredrik; Loefgren, Johan; Fang, Peng; Means, James D.; Bock, Yehuda; Tong, Xiaopeng

    2013-01-01

    Atmospheric errors due to the troposphere are a limiting error source for spaceborne interferometric synthetic aperture radar (InSAR) imaging. This software generates tropospheric delay maps that can be used to correct atmospheric artifacts in InSAR data. The software automatically acquires all needed GPS (Global Positioning System), weather, and Digital Elevation Map data, and generates a tropospheric correction map using a novel algorithm for combining GPS and weather information while accounting for terrain. Existing JPL software was prototypical in nature, required a MATLAB license, required additional steps to acquire and ingest needed GPS and weather data, and did not account for topography in interpolation. Previous software did not achieve a level of automation suitable for integration in a Web portal. This software overcomes these issues. GPS estimates of tropospheric delay are a source of corrections that can be used to form correction maps to be applied to InSAR data, but the spacing of GPS stations is insufficient to remove short-wavelength tropospheric artifacts. This software combines interpolated GPS delay with weather model precipitable water vapor (PWV) and a digital elevation model to account for terrain, increasing the spatial resolution of the tropospheric correction maps and thus removing short wavelength tropospheric artifacts to a greater extent. It will be integrated into a Web portal request system, allowing use in a future L-band SAR Earth radar mission data system. This will be a significant contribution to its technology readiness, building on existing investments in in situ space geodetic networks, and improving timeliness, quality, and science value of the collected data

  17. Software architecture for control and data acquisition of linear plasma generator Magnum-PSI

    Energy Technology Data Exchange (ETDEWEB)

    Groen, P.W.C., E-mail: p.w.c.groen@differ.nl [FOM Institute DIFFER – Dutch Institute For Fundamental Energy Research, Association EURATOM-FOM, Partner in the Trilateral Euregio Cluster, P.O. Box 1207, 3430 BE, Nieuwegein (Netherlands); Beveren, V. van; Broekema, A.; Busch, P.J.; Genuit, J.W.; Kaas, G.; Poelman, A.J.; Scholten, J.; Zeijlmans van Emmichoven, P.A. [FOM Institute DIFFER – Dutch Institute For Fundamental Energy Research, Association EURATOM-FOM, Partner in the Trilateral Euregio Cluster, P.O. Box 1207, 3430 BE, Nieuwegein (Netherlands)

    2013-10-15

    Highlights: • An architecture based on a modular design. • The design offers flexibility and extendability. • The design covers the overall software architecture. • It also covers its (sub)systems’ internal structure. -- Abstract: The FOM Institute DIFFER – Dutch Institute for Fundamental Energy Research has completed the construction phase of Magnum-PSI, a magnetized, steady-state, large area, high-flux linear plasma beam generator to study plasma surface interactions under ITER divertor conditions. Magnum-PSI consists of several hardware subsystems, and a variety of diagnostic systems. The COntrol, Data Acquisition and Communication (CODAC) system integrates these subsystems and provides a complete interface for the Magnum-PSI users. Integrating it all, from the lowest hardware level of sensors and actuators, via the level of networked PLCs and computer systems, up to functions and classes in programming languages, demands a sound and modular software architecture, which is extendable and scalable for future changes. This paper describes this architecture, and the modular design of the software subsystems. The design is implemented in the CODAC system at the level of services and subsystems (the overall software architecture), as well as internally in the software subsystems.

  18. Proceedings 2nd Interaction and Concurrency Experience: Structured Interactions

    CERN Document Server

    Bonchi, Filippo; Spoletini, Paola; Tuosto, Emilio; 10.4204/EPTCS.12

    2009-01-01

    This volume contains the proceedings of the 2nd Workshop on Interaction and Concurrency Experience (ICE'09). The workshop was held in Bologna, Italy on the 31st of August 2009, as a satellite workshop of CONCUR'09. The previous edition of ICE was organized in Reykjavik (2008). The ICE workshop is intended as a series of international scientific meetings oriented to researchers in various fields of theoretical computer science and, each year, the workshop focuses on a specific topic: ICE 2009 focused on structured interactions, meant as the class of synchronisations that go beyond the "simple" point-to-point synchronisations (e.g., multicast or broadcast synchronisations, event-notification based interactions, time dependent interactions, distributed transactions,...).

  19. 2nd international conference on advanced nanomaterials and nanotechnology

    CERN Document Server

    Goswami, D; Perumal, A

    2013-01-01

    Nanoscale science and technology have occupied centre stage globally in modern scientific research and discourses in the early twenty first century. The enabling nature of the technology makes it important in modern electronics, computing, materials, healthcare, energy and the environment. This volume contains selected articles presented (as Invited/Oral/Poster presentations) at the 2nd international conference on advanced nanomaterials and nanotechnology (ICANN-2011) held at the Indian Institute of Technology Guwahati during Dec 8-10, 2011. The topics covered in these proceedings include: synthesis and self-assembly of nanomaterials; nanoscale characterisation; nanophotonics and nanoelectronics; nanobiotechnology; nanocomposites; nanomagnetism; nanomaterials for energy; computational nanotechnology; and commercialization of nanotechnology. The conference was attended by around 400 participants from several countries, including delegates invited from USA, Germany, Japan, UK, Taiwan, Italy, Singapor...

  20. Anticipating Hazardous Weather and Community Risk, 2nd Edition

    Science.gov (United States)

    COMET

    2012-06-01

    Anticipating Hazardous Weather and Community Risk, 2nd Edition provides emergency managers and other decision makers with background information about weather, natural hazards, and preparedness. Additional topics include risk communication, human behavior, and effective warning partnerships, as well as a desktop exercise allowing the learner to practice the types of decisions required as hazardous situations unfold. This module offers web-based content designed to address topics covered in the multi-day Hazardous Weather and Flood Preparedness course offered by the Federal Emergency Management Agency (FEMA) and the National Weather Service (NWS). The module also complements other onsite courses by those agencies and provides useful information for evaluating and preparing for threats from a range of weather and natural hazards.

  1. 2nd International Conference on NeuroRehabilitation

    CERN Document Server

    Andersen, Ole; Akay, Metin

    2014-01-01

    The book is the proceedings of the 2nd International Conference on NeuroRehabilitation (ICNR 2014), held 24th-26th June 2014 in Aalborg, Denmark. The conference featured the latest highlights in the emerging and interdisciplinary field of neural rehabilitation engineering and identified important healthcare challenges the scientific community will be faced with in the coming years. Edited and written by leading experts in the field, the book includes keynote papers, regular conference papers, and contributions to special and innovation sessions, covering the following main topics: neuro-rehabilitation applications and solutions for restoring impaired neurological functions; cutting-edge technologies and methods in neuro-rehabilitation; and translational challenges in neuro-rehabilitation. Thanks to its highly interdisciplinary approach, the book will not only be a  highly relevant reference guide for academic researchers, engineers, neurophysiologists, neuroscientists, physicians and physiotherapists workin...

  2. Testing Product Generation in Software Product Lines Using Pairwise for Features Coverage

    Science.gov (United States)

    Pérez Lamancha, Beatriz; Polo Usaola, Macario

    A Software Product Line (SPL) is "a set of software-intensive systems sharing a common, managed set of features that satisfy the specific needs of a particular market segment or mission and that are developed from a common set of core assets in a prescribed way". Variability is a central concept that permits the generation of different products of the family by reusing core assets. It is captured through features which, for an SPL, define its scope. Features are represented in a feature model, which is later used to generate the products from the line. From the testing point of view, testing all the possible combinations in feature models is not practical because: (1) the number of possible combinations (i.e., combinations of features for composing products) may be intractable, and (2) some combinations may contain incompatible features. Thus, this paper addresses the problem by implementing combinatorial testing techniques adapted to the SPL context.
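
    A minimal sketch of the pairwise idea (greedily selecting products so that every pair of feature values appears in at least one selected product) is shown below; the feature model is invented and incompatibility constraints between features are deliberately ignored:

        # Sketch: greedy pairwise (2-wise) selection of SPL products over a
        # hypothetical feature model without cross-tree constraints.
        from itertools import combinations, product

        features = {
            "payment": ["card", "cash"],
            "gui": ["web", "mobile", "desktop"],
            "logging": ["on", "off"],
        }
        names = list(features)
        all_products = [dict(zip(names, values)) for values in product(*features.values())]

        # Every pair of (feature, value) assignments from two different features.
        required_pairs = {
            frozenset({(f1, v1), (f2, v2)})
            for f1, f2 in combinations(names, 2)
            for v1 in features[f1]
            for v2 in features[f2]
        }

        def pairs_of(p):
            return {frozenset({(f1, p[f1]), (f2, p[f2])}) for f1, f2 in combinations(names, 2)}

        chosen, uncovered = [], set(required_pairs)
        while uncovered:
            best = max(all_products, key=lambda p: len(pairs_of(p) & uncovered))
            chosen.append(best)
            uncovered -= pairs_of(best)

        print(f"{len(chosen)} products cover all {len(required_pairs)} feature-value pairs")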

  3. Towards a Pattern-based Automatic Generation of Logical Specifications for Software Models

    OpenAIRE

    Klimek, Radoslaw

    2014-01-01

    The work relates to the automatic generation of logical specifications, considered as sets of temporal logic formulas, extracted directly from developed software models. The extraction process is based on the assumption that the whole developed model is structured using only predefined workflow patterns. A method of automatic transformation of workflow patterns to logical specifications is proposed. Applying the presented concepts enables bridging the gap between the benefit...
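
    As a generic illustration of the kind of formula such a transformation yields (an assumed textbook example, not one taken from the paper): for a workflow sequence pattern in which activity a must eventually be followed by activity b, a linear temporal logic specification could contain

        \Diamond a \;\wedge\; \Box\,(a \rightarrow \Diamond b),

    read as "a eventually holds, and whenever a holds, b eventually holds afterwards".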

  4. RMAWGEN: A software project for a daily Multi-Site Weather Generator with R

    Science.gov (United States)

    Cordano, E.; Eccel, E.

    2012-04-01

    Modeling for climate change applications in agriculture or hydrology often requires daily time series of precipitation and temperature. This is the case for downscaled series from monthly or seasonal predictions of Global Climate Models (GCMs). This poster presents a software project, the R package RMAWGEN (R Multi-Site Auto-regressive Weather GENerator), to generate daily temperature and precipitation time series at several sites by using the theory of vector auto-regressive (VAR) models. The VAR model is used because it is able to maintain the temporal and spatial correlations among the several series. In particular, observed time series of daily maximum and minimum temperature and precipitation are used to calibrate the parameters of a VAR model (saved as "GPCAvarest2" or "varest2" classes, which inherit the "varest" S3 class defined in the package vars [Pfaff, 2008]). The VAR model, coupled with monthly mean weather variables downscaled from GCM predictions, therefore allows several stochastic daily scenarios to be generated. The structure of the package consists of functions that transform precipitation and temperature time series into Gaussian-distributed random variables through deseasonalization and Principal Component Analysis. A VAR model is then calibrated on the transformed time series. The time series generated by the VAR model are then inversely re-transformed into precipitation and/or temperature series. An application is included in the software package as an example; it is presented using a dataset with daily weather time series recorded at 59 different sites in Trentino (Italy) and its neighbourhood for the period 1958-2007. The software is distributed as Free Software under the General Public License (GPL) and is available on the CRAN website (http://cran.r-project.org/web/packages/RMAWGEN/index.html)
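
    RMAWGEN itself is an R package; purely to illustrate the core idea (fit a vector auto-regressive model to multi-site series and simulate new, correlated series from it), the following numpy sketch fits and simulates a VAR(1) model on synthetic data. It omits the deseasonalization and gaussianization steps that RMAWGEN performs on real weather data.

        # Sketch: least-squares fit of x_t = A x_{t-1} + e_t for several "sites",
        # then simulation of a new series that preserves the fitted structure.
        import numpy as np

        rng = np.random.default_rng(1)
        T, k = 2000, 3                       # time steps, number of sites
        A_true = np.array([[0.5, 0.2, 0.0],
                           [0.1, 0.4, 0.2],
                           [0.0, 0.1, 0.6]])
        x = np.zeros((T, k))
        for t in range(1, T):
            x[t] = A_true @ x[t - 1] + rng.normal(0, 1, k)

        X_lag, X_now = x[:-1], x[1:]
        B, *_ = np.linalg.lstsq(X_lag, X_now, rcond=None)   # x_t ~ x_{t-1} @ B
        A_hat = B.T
        resid_cov = np.cov((X_now - X_lag @ B).T)

        sim = np.zeros((500, k))
        for t in range(1, 500):
            sim[t] = A_hat @ sim[t - 1] + rng.multivariate_normal(np.zeros(k), resid_cov)

        print(np.round(A_hat, 2))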

  5. POLITO- A new open-source, platform independent software for generating high-quality lithostratigraphic columns

    Directory of Open Access Journals (Sweden)

    Cipran C. Stremtan

    2010-08-01

    Full Text Available POLITO is a free, open-source, and platform-independent software which can automatically generate lithostratigraphic columns from field data. Its simple and easy-to-use interface allows users to manipulate large datasets and create high-quality graphical outputs, either in editable vector or raster format, or as PDF files. POLITO uses USGS standard lithology patterns and can be downloaded from its Sourceforge project page (http://sourceforge.net/projects/polito/).

  6. Simulation of photovoltaic systems electricity generation using homer software in specific locations in Serbia

    OpenAIRE

    Pavlovi? Tomislav M.; Milosavljevi? Dragana D.; Pirsl Danica S.

    2013-01-01

    In this paper, basic information is given on the Homer software for PV system electricity generation and on the NASA Surface meteorology and solar energy, RETScreen, PVGIS and HMIRS (Hydrometeorological Institute of Republic of Serbia) solar databases. A comparison is made of the monthly average values of daily solar radiation per square meter received by a horizontal surface, taken from the NASA, RETScreen, PVGIS and HMIRS solar databases, for three locations in Serbia (Belgrade, Negotin and Zlati...

  7. Development of the EVEREST device modelling software for charge generation events

    International Nuclear Information System (INIS)

    The modelling of CCD and pixel detector devices is an important area in which semiconductor device modelling can be applied to better understand and optimise new designs. The three dimensional device modelling suite EVEREST has been extended to deal with charge generation events which occur when a charged particle or X-ray strikes a semiconductor detector. This report details the developments that have been made to model these events and presents some initial results to validate the changes to the software. (author)

  8. Reliable Mining of Automatically Generated Test Cases from Software Requirements Specification (SRS)

    OpenAIRE

    Lilly Raamesh; Uma, G. V.

    2010-01-01

    Writing requirements is a two-way process. In this paper we classify Functional Requirements (FR) and Non-Functional Requirements (NFR) statements from Software Requirements Specification (SRS) documents. These are systematically transformed into state charts considering all relevant information. The current paper outlines how test cases can be automatically generated from these state charts. The application of the states yields the different test cases as solutions to ...

  9. SEMANTIC WEB-BASED SOFTWARE ENGINEERING BY AUTOMATED REQUIREMENTS ONTOLOGY GENERATION IN SOA

    Directory of Open Access Journals (Sweden)

    Vahid Rastgoo

    2014-04-01

    Full Text Available This paper presents an approach for automated generation of requirements ontology using UML diagrams in service-oriented architecture (SOA). The goal of this paper is to facilitate software engineering processes such as software design, software reuse, and service discovery. The proposed method is based on four conceptual layers. The first layer includes requirements obtained from stakeholders, the second one designs service-oriented diagrams from the data in the first layer and extracts their XMI codes. The third layer includes a requirement ontology and a protocol ontology to describe the behavior of services and the relationships between them semantically. Finally, the fourth layer standardizes the concepts existing in the ontologies of the previous layer. The generated ontology goes beyond a pure domain ontology because it considers the behavior of services as well as their hierarchical relationships. Experimental results conducted on a set of UML4Soa diagrams in different scopes demonstrate the improvement of the proposed approach from different points of view, such as completeness of the requirements ontology, automatic generation, and consideration of SOA.

  10. Improvement of a plasma uniformity of the 2nd ion source of KSTAR neutral beam injector

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, S. H., E-mail: shjeong2@kaeri.re.kr; Kim, T. S.; Lee, K. W.; Chang, D. H.; In, S. R. [Korea Atomic Energy Research Institute, Daejeon 305-353 (Korea, Republic of); Bae, Y. S. [National Fusion Research Institute, Daejeon 305-806 (Korea, Republic of)

    2014-02-15

    The 2nd ion source of the KSTAR (Korea Superconducting Tokamak Advanced Research) NBI (Neutral Beam Injector) has been developed and operated since last year. A calorimetric analysis revealed that the heat load on the back plate of this ion source is relatively higher than that of the 1st ion source of the KSTAR NBI, and the spatial plasma uniformity of the ion source is not good. We therefore set out to identify the factors affecting the uniformity of the plasma density and to improve it. We estimated the effects of the direction of the filament current and of the magnetic field configuration of the plasma generator on the plasma uniformity. We also verified that the operating conditions of the ion source can change the uniformity of its plasma density.

  11. Results of the 2nd regular inspection of Unit 1 in the Oi Power Station

    International Nuclear Information System (INIS)

    The 2nd regular inspection of Unit 1 in the Oi Power Station was carried out in fiscal 1980, for the period from February 10 to July 30, 1981. The facilities subjected to the inspection were the reactor proper, cooling systems, instrumentation and control systems, radiation control systems and others. In the inspection of external appearance, leakage, performance, etc., damage was detected to fuel insert hold-down springs and to the support lattices of fuel assemblies, and abnormality was found in steam generator tubes. The radiation exposure of personnel involved in the inspection work was all below the legally permissible level. The improvement works conducted during the inspection period were the replacement of the damaged insert hold-down springs, of the fuel assemblies with damaged support lattices, and of the support pins of the upper core-structure flow guide, as well as the plugging of the abnormal heating tubes. (J.P.N.)

  12. A proposed metamodel for the implementation of object oriented software through the automatic generation of source code

    OpenAIRE

    Carvalho, J. S. C.; Cordeiro, A. G.; Feres, M. M.

    2008-01-01

    During the development of software, one of the most visible risks and perhaps the biggest implementation obstacle relates to time management. All delivery deadlines for software versions must be met, but this is not always possible, sometimes due to delays in coding. This paper presents a metamodel for software implementation, which will give rise to a development tool for automatic generation of source code, in order to make any development pattern transparent to the programmer, significantly re...

  13. Reliable Mining of Automatically Generated Test Cases from Software Requirements Specification (SRS)

    Directory of Open Access Journals (Sweden)

    Lilly Raamesh

    2010-01-01

    Full Text Available Writing requirements is a two-way process. In this paper we classify Functional Requirements (FR) and Non-Functional Requirements (NFR) statements from Software Requirements Specification (SRS) documents. These are systematically transformed into state charts considering all relevant information. The current paper outlines how test cases can be automatically generated from these state charts: applying the states yields the different test cases as solutions to a planning problem. The test cases can be used for automated or manual software testing at system level. The paper also presents a method for the reduction of the test suite using mining methods, thereby facilitating mining and knowledge extraction from test cases.
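
    The transformation itself is not reproduced in the abstract, but the core idea of deriving tests from a state chart can be sketched as follows; the transition table, event names and start state are invented for the example.

        from collections import deque

        # Hypothetical state chart: state -> list of (event, next_state).
        CHART = {
            "Idle":       [("submit", "Validating")],
            "Validating": [("ok", "Processing"), ("error", "Idle")],
            "Processing": [("done", "Idle")],
        }

        def shortest_path(chart, start, goal):
            """BFS over the chart, returning the event sequence leading from start to goal."""
            queue, seen = deque([(start, [])]), {start}
            while queue:
                state, events = queue.popleft()
                if state == goal:
                    return events
                for event, nxt in chart.get(state, []):
                    if nxt not in seen:
                        seen.add(nxt)
                        queue.append((nxt, events + [event]))
            return []

        def test_cases(chart, start="Idle"):
            """One test case per transition: reach its source state, then fire its event."""
            return [shortest_path(chart, start, src) + [event]
                    for src, transitions in chart.items()
                    for event, _dst in transitions]

        for case in test_cases(CHART):
            print(" -> ".join(case))

    One simple reduction, different from the mining-based method the paper proposes, would be to discard cases whose event sequences are prefixes of other cases.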

  14. Study On Oxidation State Of U In Ba2NdUO6

    International Nuclear Information System (INIS)

    Ba2NdUO6 is one of the important compounds formed in the solidification process for high-level liquid waste using the super-high-temperature method. Ba2NdUO6 has an ordered perovskite structure. The objective of this study is to investigate the oxidation state of U in Ba2NdUO6. The properties of Ba2NdUO6 were observed using a Faraday-type torsion magnetometer and an X-ray Photoelectron Spectrometer (XPS). The magnetic susceptibility measured in the temperature range from 4 K to room temperature showed that Ba2NdUO6 is paramagnetic and obeys the Curie-Weiss law. The effective moment of Ba2NdUO6 is 3.04 μB. The XPS spectrum showed that the U4f peaks of Ba2NdUO6 appear exactly between the binding energies of UO2 and UO3. It can be concluded that Ba2NdUO6 has binding energy peaks corresponding to pentavalent uranium

  15. CELEBRATED APRIL 2nd – INTERNATIONAL DAY OF PERSONS WITH AUTISM

    Directory of Open Access Journals (Sweden)

    Manuela KRCHANOSKA

    2014-09-01

    Full Text Available On April 2nd, the Macedonian Scientific Society for Autism organized, for the fourth time, an event on the occasion of the International Day of Persons with Autism. The event, of a cultural and artistic character, was held at the Museum of the Macedonian Struggle under the motto "They are not alone, we are with them". The huge number of citizens only confirmed the motto; it seemed that the hall of the Museum of the Macedonian Struggle was too small for the warm hearts of the audience. More than 300 guests were present in the hall, among them children with autism and their families, prominent professors, doctors, special educators and rehabilitators, psychologists, students and other citizens who gladly decided to enrich the event with their presence. The event was opened by the violinist Plamenka Trajkovska, who performed one song. After her, the President of the Macedonian Scientific Society for Autism, PhD Vladimir Trajkovski, delivered his speech. The professor told the parents of autistic children, who were present in large numbers, not to lose hope and to fight for their children, and said that the Macedonian Scientific Society for Autism will provide tremendous support and assistance in this struggle.

  16. APTWG: 2nd Asia-Pacific Transport Working Group Meeting

    Science.gov (United States)

    Dong, J. Q.; Shi, Y. J.; Tamura, N.; Jhang, Hogun; Watanabe, T.-H.; Ding, X. T.

    2013-02-01

    This conference report summarizes the contributions to and discussions at the 2nd Asia-Pacific Transport Working Group Meeting held in Chengdu, China, from 15 to 18 May 2012. The topics of the meeting were organized under five main headings: momentum transport, non-locality in transport, edge turbulence and L-H transition, three-dimensional effects on transport physics, and particle, momentum and heat pinches. It was found that lower hybrid waves and ion cyclotron waves induce co-current rotation, while electron cyclotron waves induce counter-current rotation. A four-stage picture of the low (L) to high (H) confinement transition is gradually emerging, and more detailed verification is urgently expected. The new edge-localized mode mitigation technique using supersonic molecular beam injection was shown to be effective to some extent on HL-2A and KSTAR. It was also found that low collisionality, the transition from the trapped electron mode to the ion temperature gradient mode (or the transition from higher to lower density and temperature gradients), fuelling and lithium coating favour an inward particle pinch in tokamak plasmas.

  17. A proposed metamodel for the implementation of object oriented software through the automatic generation of source code

    Directory of Open Access Journals (Sweden)

    CARVALHO, J. S. C.

    2008-12-01

    Full Text Available During the development of software, one of the most visible risks and perhaps the biggest implementation obstacle relates to time management. All delivery deadlines for software versions must be met, but this is not always possible, sometimes due to delays in coding. This paper presents a metamodel for software implementation, which will give rise to a development tool for automatic generation of source code, in order to make any development pattern transparent to the programmer, significantly reducing the time spent in coding the artifacts that make up the software.

  18. NeuroPG: open source software for optical pattern generation and data acquisition

    Science.gov (United States)

    Avants, Benjamin W.; Murphy, Daniel B.; Dapello, Joel A.; Robinson, Jacob T.

    2015-01-01

    Patterned illumination using a digital micromirror device (DMD) is a powerful tool for optogenetics. Compared to a scanning laser, DMDs are inexpensive and can easily create complex illumination patterns. Combining these complex spatiotemporal illumination patterns with optogenetics allows DMD-equipped microscopes to probe neural circuits by selectively manipulating the activity of many individual cells or many subcellular regions at the same time. To use DMDs to study neural activity, scientists must develop specialized software to coordinate optical stimulation patterns with the acquisition of electrophysiological and fluorescence data. To meet this growing need we have developed an open source optical pattern generation software package for neuroscience—NeuroPG—that combines DMD control, sample visualization, and data acquisition in one application. Built on a MATLAB platform, NeuroPG can also process, analyze, and visualize data. The software is designed specifically for the Mightex Polygon400; however, as an open source package, NeuroPG can be modified to incorporate any data acquisition, imaging, or illumination equipment that is compatible with MATLAB’s Data Acquisition and Image Acquisition toolboxes. PMID:25784873

  19. 2nd International Open and Distance Learning (IODL) Symposium

    Directory of Open Access Journals (Sweden)

    Reviewed by Murat BARKAN

    2006-10-01

    Full Text Available These closing remarks were prepared and presented by Prof. Dr. Murat BARKAN, Anadolu University, Eskisehir, TURKEY. DEAR GUESTS, As the 2nd International Open and Distance Learning (IODL) Symposium is now drawing to an end, I would like to thank you all for your outstanding speeches, distinguished presentations, constructive roundtable and session discussions, and active participation during the last five days. I hope you all share my view that the whole symposium has been a very stimulating and successful experience. Also, on behalf of all the participants, I would like to take this opportunity to thank and congratulate the symposium organization committee for their excellent job in organizing and hosting our 2nd meeting. Throughout the symposium, five workshops, six keynote speeches and 66 papers, prepared by more than 150 academics and practitioners from 23 different countries, reflected remarkable and varied views and approaches to open and flexible learning. Besides all these academic endeavours, 13 educational films were shown during the symposium. The technology exhibition, hosted by seven companies, was very effective in showcasing the current level of the technology and the success of applying theory in practice. Now I would like to go over what our workshop and keynote presenters shared with us: Prof. Marina McIsaac from Arizona State University dwelled on how to determine research topics worth examining and how to choose an appropriate research design and methods. She gave us clues on how to get articles published in professional journals. Prof. Colin Latchem from Australia and Prof. Dr. Ali Ekrem Ozkul from Anadolu University pointed to the importance of strategic planning for distance and flexible learning. They highlighted the advantages of strategic planning for policy-makers, planners, managers and staff. Dr. Wolfram Laaser from Fern University of Hagen presented different multimedia clips and directed interactive exercises using Flash MX in his workshop. Jack Koumi from the UK presented a workshop about what to teach on video and when to choose other media. He exemplified 27 added-value techniques and teaching functions for TV and video. He later specified the different capabilities and limitations of eight different media used in teaching, emphasizing the importance of optimizing media deployment. Dr. Janet Bohren from the University of Cincinnati and Jennifer McVay-Dyche from United Theological Seminary explained their experience with a course management system used to develop dialogue between K-12 teachers in Turkey and the US on the topics of religion, culture and schools. Their workshop provided an overview of a pilot study. They showed us a good case study of utilizing "Blackboard" as a means of overcoming biases and improving the mutual understanding of the American and Turkish teachers. We had very remarkable keynotes as well. Dr. Nikitas Kastis, representing the European Distance and E-Learning Network (EDEN), made his speech on distance and e-learning evolutions and trends in Europe. He informed the audience about the application and assessment criteria at the European scale concerning e-learning in education and training systems. Meanwhile, our keynote speakers drew our attention to different applications of virtual learning. Dr. Piet Kommers from the University of Twente exemplified a virtual training environment for acquiring surgical skills. Dr.
Timothy Shih from Tamkang University presented their project called Hard SCORM (Sharable Content Object Reference Model) as an asynchronous distance learning specification. In his speech titled “Engaging and Supporting Problem Solving Online”, Prof. David Jonassen from the University of Missouri reflected his vision of the future of education and explained why it should embrace problem solving. He then showed us examples of incorporating this vision into learning environments to make online problem solving possible. Dr. Wolfram Laaser from Fern University talked on applications of ICT at Europe

  20. NEXT GENERATION ANALYSIS SOFTWARE FOR COMPONENT EVALUATION - Results of Rotational Seismometer Evaluation

    Science.gov (United States)

    Hart, D. M.; Merchant, B. J.; Abbott, R. E.

    2012-12-01

    The Component Evaluation project at Sandia National Laboratories supports the Ground-based Nuclear Explosion Monitoring program by performing testing and evaluation of the components that are used in seismic and infrasound monitoring systems. In order to perform this work, Component Evaluation maintains a testing facility called the FACT (Facility for Acceptance, Calibration, and Testing) site, a variety of test bed equipment, and a suite of software tools for analyzing test data. Recently, Component Evaluation has successfully integrated several improvements to its software analysis tools and test bed equipment that have substantially improved our ability to test and evaluate components. The software tool that is used to analyze test data is called TALENT: Test and AnaLysis EvaluatioN Tool. TALENT is designed to be a single, standard interface to all test configuration, metadata, parameters, waveforms, and results that are generated in the course of testing monitoring systems. It provides traceability by capturing everything about a test in a relational database that is required to reproduce the results of that test. TALENT provides a simple, yet powerful, user interface to quickly acquire, process, and analyze waveform test data. The software tool has also been expanded recently to handle sensors whose output is proportional to rotation angle or rotation rate. As an example of this new processing capability, we show results from testing the new ATA ARS-16 rotational seismometer. The test data were collected at the USGS ASL. Four datasets were processed: 1) 1 Hz with increasing amplitude, 2) 4 Hz with increasing amplitude, 3) 16 Hz with increasing amplitude, and 4) twenty-six discrete frequencies between 0.353 Hz and 64 Hz. The results are compared to manufacturer-supplied data sheets.

  1. A software tool for simulation of surfaces generated by ball nose end milling

    DEFF Research Database (Denmark)

    Bissacco, Giuliano

    2004-01-01

    The number of models available for prediction of surface topography is very limited. The main reason is that these models cannot be based on engineering principles like those for elastic deformations. Most knowledge about surface roughness and integrity is empirical, and up to now very few mathematical relationships relating surface parameters to cutting conditions are available. Basic models of kinematical roughness, determined by the tool profile and the pattern of relative motions of tool and workpiece, have so far not been reliable. The actual roughness may be more than five times higher due to error motions, unstable built-up edge and changing tool profile due to wear [1]. Tool chatter also affects surface roughness, but its effect is normally not included in the prediction of surface roughness, since machining conditions which generate chatter must be avoided in any case. Finally, reproducibility of experimental results concerning surface roughness requires tight control of all influencing factors, which is difficult to maintain in actual machining workshops. This introduces further complications in surface topography modelling. In the light of these considerations, a simple software tool for prediction of the surface topography of ball nose end milled surfaces was developed. The software tool is based on a simplified model of the ideal tool motion and neglects the effects due to run-out, static and dynamic deflections and error motions, but has the merit of generating as output a file in a format readable by surface processor software (SPIP [2]) for the calculation of a number of surface roughness parameters. In the next paragraph a description of the basic features of ball nose end milled surfaces is given, while in paragraph 3 the model is described.
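
    A minimal sketch of such a kinematic simulation is given below; the tool radius, feed and step-over values are illustrative, and, as in the tool described above, run-out, deflection and wear are neglected. Each grid point takes the lowest height swept by the spherical tool tip over all programmed tool positions.

        import math

        def simulate_surface(radius, feed, stepover, nx=120, ny=120, dx=0.01):
            """Kinematic surface topography left by a ball nose end mill (units: mm).
            radius: ball radius; feed: spacing of tool positions along x;
            stepover: spacing of parallel passes along y; dx: grid spacing."""
            heights = [[radius] * ny for _ in range(nx)]
            xs = [i * feed for i in range(int(nx * dx / feed) + 2)]
            ys = [j * stepover for j in range(int(ny * dx / stepover) + 2)]
            for cx in xs:
                for cy in ys:
                    for i in range(nx):
                        for j in range(ny):
                            d2 = (i * dx - cx) ** 2 + (j * dx - cy) ** 2
                            if d2 < radius ** 2:
                                # Height of the sphere surface above its lowest point.
                                h = radius - math.sqrt(radius ** 2 - d2)
                                heights[i][j] = min(heights[i][j], h)
            return heights

        surface = simulate_surface(radius=3.0, feed=0.3, stepover=0.3)
        print(max(max(row) for row in surface))   # approximate scallop height in mm

    For these values the classical scallop-height estimate f^2/(8R) gives roughly 0.004 mm, which the simulated maximum should reproduce; writing the height grid to an SPIP-readable file is left out of the sketch.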

  2. Meteosat Second Generation station: processing software and computing architecture; Estacion de Recepcion de Imagenes del Satelite Meteosat Segunda generacion: Arquitectura Informatica y Software de Proceso

    Energy Technology Data Exchange (ETDEWEB)

    Martin, L.; Cony, M.; Navarro, A. A.; Zarzalejo, L. F.; Polo, J.

    2010-05-01

    The Renewable Energy Division of CIEMAT houses a specific station for receiving the Meteosat Second Generation images, which is of interest for the work being carried out on solar radiation derived from satellite images. The complexity, the huge amount of information being received and the particular characteristics of the MSG images encouraged the design and development of a specific computing architecture and the associated software for a better and more suitable use of the images. This document describes the mentioned architecture and software. (Author) 8 refs.

  3. Quetzalcoatl: Una Herramienta para Generar Contratos de Desarrollo de Software en Entornos de Outsourcing / Quetzalcoatl: A Tool for Generating Software Development Contracts in Outsourcing Environments

    Scientific Electronic Library Online (English)

    Jezreel, Mejía; Sergio D., Ixmatlahua; Alma I., Sánchez.

    2014-03-01

    Full Text Available Outsourcing is currently one of the main work activities. However, the relationships between a client and a service provider are not strong enough to meet the expectations of the agreements. The outsourcing contract for software development projects is an alternative for this kind of relationship. This article presents the architecture of the Quetzalcoatl tool, as well as the main functions the tool offers, all with the aim of generating and evaluating contracts for software development projects in outsourcing environments as support for SMEs. Abstract in English: Nowadays, outsourcing is one of the most important work activities for software development companies. However, the relationships between a client and a service provider are not strong enough to meet the expectations of the agreements. The outsourcing contract for software development projects is an alternative to this type of relationship. This paper presents the architecture of the tool named Quetzalcoatl, as well as the main functions that this tool offers, in order to generate and evaluate contracts for software development projects in outsourcing environments to support SMEs.

  4. Similar regulation of the synthesis of adenovirus fiber and of simian virus 40-specific proteins encoded by the helper-defective Ad2+SV40 hybrid viruses Ad2+ND5 and Ad2+ND4del.

    OpenAIRE

    Klockmann, U.; Klessig, D. F.; Deppert, W.

    1985-01-01

    Human adenoviruses fail to multiply effectively in monkey cells. The block to the replication of these viruses can be overcome by coinfection with simian virus 40 (SV40) or when part of the SV40 genome is integrated into and expressed as part of the adenovirus type 2 (Ad2) genome, as occurs in several Ad2+SV40 hybrid viruses, such as Ad2+ND1, Ad2+ND2, and Ad2+ND4. The SV40 helper-defective Ad2+SV40 hybrid viruses Ad2+ND5 and Ad2+ND4del were analyzed to determine why they are unable to grow ef...

  5. Computer software program for monitoring the availability of systems and components of electric power generating systems

    International Nuclear Information System (INIS)

    As the availabilities of electric power generating station systems and components become more and more important from a financial, personnel safety, and regulatory requirements standpoint, it is evident that a comprehensive, yet simple and user-friendly, program for system and component tracking and monitoring is needed to assist in effectively managing the large volume of systems and components with their large numbers of associated maintenance/availability records. A user-friendly computer software program for system and component availability monitoring has been developed that calculates, displays and monitors selected component and system availabilities. This is a Windows(TM)-based (graphical user interface) program that utilizes a system flow diagram for the data input screen, which also provides a visual representation of availability values and limits for the individual components and associated systems. The program can be customized to the user's plant-specific system and component selections and configurations. As will be discussed herein, this software program is well suited for availability monitoring and ultimately provides valuable information for improving plant performance and reducing operating costs.
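
    The calculation such a program automates is simple in itself; the sketch below, with invented component names and outage hours, computes the point availability of each component over a reporting period and the availability of a series system built from them.

        def component_availability(period_hours, outage_hours):
            """Availability = uptime / total time for the reporting period."""
            return (period_hours - sum(outage_hours)) / period_hours

        def series_availability(availabilities):
            """A series system is available only when every component is available."""
            result = 1.0
            for a in availabilities:
                result *= a
            return result

        # Hypothetical quarterly records (2190 h per quarter).
        outages = {"feed pump A": [12.0, 3.5], "emergency diesel": [30.0], "service water": []}
        avail = {name: component_availability(2190.0, hours) for name, hours in outages.items()}
        print(avail)
        print("system:", series_availability(avail.values()))

    Redundant trains would instead be combined with a parallel formula (one minus the product of the component unavailabilities); which formula applies is exactly what the system flow diagram mentioned above encodes.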

  6. Pipeliner: software to evaluate the performance of bioinformatics pipelines for next-generation resequencing.

    Science.gov (United States)

    Nevado, B; Perez-Enciso, M

    2015-01-01

    The choice of technology and bioinformatics approach is critical in obtaining accurate and reliable information from next-generation sequencing (NGS) experiments. An increasing number of software tools and methodological guidelines are being published, but deciding upon which approach and experimental design to use can depend on the particularities of the species and on the aims of the study. This leaves researchers unable to make informed decisions on these central questions. To address these issues, we developed Pipeliner - a tool to evaluate, by simulation, the performance of NGS pipelines in resequencing studies. Pipeliner provides a graphical interface allowing users to write and test their own bioinformatics pipelines with publicly available or custom software. It computes a number of statistics summarizing the performance in SNP calling, including the recovery, sensitivity and false discovery rate for heterozygous and homozygous SNP genotypes. Pipeliner can be used to answer many practical questions, for example: for a limited amount of NGS effort, how many more reliable SNPs can be detected by doubling coverage and halving sample size, or what is the false discovery rate provided by different SNP calling algorithms and options. Pipeliner thus allows researchers to carefully plan their study's sampling design and compare the suitability of alternative bioinformatics approaches for their specific study systems. Pipeliner is written in C++ and is freely available from http://github.com/brunonevado/Pipeliner. PMID:24890372
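
    The SNP-calling statistics named above are straightforward to compute once simulated (true) and called genotypes are aligned by position; the sketch below is independent of the Pipeliner code and uses made-up genotype dictionaries.

        def snp_calling_stats(true_genotypes, called_genotypes):
            """Summarise SNP-calling performance.
            Both arguments map position -> genotype string, e.g. "A/G" (het) or "T/T" (hom)."""
            true_pos, called_pos = set(true_genotypes), set(called_genotypes)
            correct = sum(1 for p in true_pos & called_pos
                          if true_genotypes[p] == called_genotypes[p])
            sensitivity = correct / len(true_pos) if true_pos else 0.0
            fdr = 1.0 - correct / len(called_pos) if called_pos else 0.0
            return {"sensitivity": sensitivity, "false_discovery_rate": fdr}

        truth = {101: "A/G", 250: "T/T", 999: "C/G"}
        calls = {101: "A/G", 250: "T/C", 1500: "G/G"}
        print(snp_calling_stats(truth, calls))   # 1 of 3 true SNPs recovered, 2 of 3 calls wrong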

  7. Technical officials guide: Nanjing 2014 : the 2nd Summer Youth Olympic Games

    OpenAIRE

    2014-01-01

    The "Technical officials guide" you are reading offers an introduction to each sport at the 2nd Summer Youth Olympic Games, Nanjing 2014, as well as providing information on technical officials' services.

  8. Pragmatics Annotated Coloured Petri Nets for Protocol Software Generation and Verification

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge; Kristensen, Lars Michael

    2014-01-01

    This paper presents the formal definition of Pragmatics Annotated Coloured Petri Nets (PA-CPNs). PA-CPNs represent a class of Coloured Petri Nets (CPNs) that are designed to support automated code generation of protocol software. PA-CPNs restrict the structure of CPN models and allow Petri net elements to be annotated with so-called pragmatics, which are exploited for code generation. The approach and tool for generating code is called PetriCode and has been discussed and evaluated in earlier work already. The contribution of this paper is to give a formal definition for PA-CPNs; in addition, we show how the structural restrictions of PA-CPNs can be exploited for making the verification of the modelled protocols more efficient. This is done by automatically deriving progress measures for the sweep-line method, and by introducing so-called service testers, that can be used to control the part of the state space that is to be explored for verification purposes.
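
    The sweep-line idea mentioned above can be illustrated without any Petri net machinery: given a monotone progress measure on states, exploration proceeds in progress order and states that lie behind the sweep-line can be released from memory. The toy state type and progress function below are invented; this is not the PetriCode implementation.

        import heapq

        def sweep_line_explore(initial, successors, progress):
            """Explore a state space in order of a monotone progress measure,
            discarding stored states that fall behind the current sweep-line."""
            frontier = [(progress(initial), initial)]
            stored = {initial}
            peak = 0
            while frontier:
                p, state = heapq.heappop(frontier)
                stored = {s for s in stored if progress(s) >= p}   # garbage-collect old states
                peak = max(peak, len(stored))
                for nxt in successors(state):
                    if nxt not in stored:
                        stored.add(nxt)
                        heapq.heappush(frontier, (progress(nxt), nxt))
            return peak

        # Toy protocol: states are (phase, retransmissions); the phase never decreases.
        def succ(s):
            return [(s[0] + 1, 0), (s[0], s[1] + 1)] if s[0] < 3 and s[1] < 2 else []

        print("peak states held in memory:", sweep_line_explore((0, 0), succ, progress=lambda s: s[0]))

    The saving comes from the garbage-collection step: with a monotone progress measure, states behind the sweep-line can never be reached again, so only a slice of the state space is held in memory at any time.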

  9. Next generation hyper-scale software and hardware systems for big data analytics

    CERN Document Server

    CERN. Geneva

    2013-01-01

    Building on foundational technologies such as many-core systems, non-volatile memories and photonic interconnects, we describe some current technologies and future research to create real-time, big data analytics IT infrastructure. We will also briefly describe some of our biologically-inspired software and hardware architectures for creating radically new hyper-scale cognitive computing systems. About the speaker: Rich Friedrich is the director of Strategic Innovation and Research Services (SIRS) at HP Labs. In this strategic role, he is responsible for research investments in nano-technology, exascale computing, cyber security, information management, cloud computing, immersive interaction, sustainability, social computing and commercial digital printing. Rich's philosophy is to fuse strategy and inspiration to create compelling capabilities for next generation information devices, systems and services. Using essential insights gained from the metaphysics of innovation, he effectively leads ...

  10. Next Generation Astronomical Data Processing using Big Data Technologies from the Apache Software Foundation

    Science.gov (United States)

    Mattmann, Chris

    2014-04-01

    In this era of exascale instruments for astronomy we must naturally develop next generation capabilities for the unprecedented data volume and velocity that will arrive due to the veracity of these ground-based sensors and observatories. Integrating scientific algorithms stewarded by scientific groups unobtrusively and rapidly; intelligently selecting data movement technologies; making use of cloud computing for storage and processing; and automatically extracting text, metadata and science from any type of file are all needed capabilities in this exciting time. Our group at NASA JPL has promoted the use of open source data management technologies available from the Apache Software Foundation (ASF) in pursuit of constructing next generation data management and processing systems for astronomical instruments, including the Expanded Very Large Array (EVLA) in Socorro, NM and the Atacama Large Millimetre/Submillimetre Array (ALMA), as well as for the KAT-7 project led by SKA South Africa as a precursor to the full MeerKAT telescope. In addition we are currently funded by the National Science Foundation in the US to work with MIT Haystack Observatory and the University of Cambridge in the UK to construct a Radio Array of Portable Interferometric Devices (RAPID) that will undoubtedly draw from the rich technology advances underway. NASA JPL is investing in a strategic initiative for Big Data that is pulling in these capabilities and technologies for astronomical instruments and also for Earth science remote sensing. In this talk I will describe the above collaborative efforts underway and point to solutions in open source from the Apache Software Foundation that can be deployed and used today and that are already bringing our teams and projects benefits. I will describe how others can take advantage of our experience and point towards future application and contribution of these tools.

  11. Optimizing Software Testing and Test Case Generation by using the concept of Hamiltonian Paths

    OpenAIRE

    Ankita Bihani; Sargam Badyal

    2014-01-01

    Software testing is a trade-off between budget, time and quality. Broadly, software testing can be classified as unit testing, integration testing, validation testing and system testing. By including the concept of Hamiltonian paths we can greatly improve the software testing of any project. This paper shows how Hamiltonian paths can be used for requirement specification. They can also be used in the acceptance testing phase for checking whether all the user requirements are met or not. Furt...
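
    As a rough illustration of the idea (not the authors' implementation), the sketch below models requirements as nodes of a hypothetical dependency graph and searches for a Hamiltonian path by backtracking; the resulting path is one acceptance-test ordering that visits every requirement exactly once.

        def hamiltonian_path(graph, path):
            """Backtracking search for a path visiting every node exactly once.
            graph maps node -> set of nodes that may be tested next."""
            if len(path) == len(graph):
                return path
            for nxt in graph[path[-1]]:
                if nxt not in path:
                    result = hamiltonian_path(graph, path + [nxt])
                    if result:
                        return result
            return None

        # Hypothetical requirement-flow graph: an edge means "can be tested next".
        REQS = {
            "login":       {"browse", "profile"},
            "browse":      {"add_to_cart", "profile"},
            "profile":     {"browse"},
            "add_to_cart": {"checkout"},
            "checkout":    set(),
        }
        print(hamiltonian_path(REQS, ["login"]))
        # -> ['login', 'profile', 'browse', 'add_to_cart', 'checkout']

    In practice a requirement graph rarely admits a Hamiltonian path, so relaxed coverage criteria or heuristic orderings are used instead; the value of the formulation is that a found path certifies that every requirement is visited exactly once.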

  12. Phase relationship in the TiO2-Nd2O3 pseudo-binary system

    International Nuclear Information System (INIS)

    Highlights: • DSC and XRD measurements for the TiO2-Nd2O3 system. • Nd2Ti2O7, Nd2TiO5, Nd2Ti3O9 and Nd4Ti9O24 exist. • Nd2Ti4O11 and Nd4Ti9O24 were the same compound. • Thermodynamic calculation of the TiO2-Nd2O3 system. - Abstract: Phase equilibria in the TiO2-Nd2O3 system have been experimentally investigated via X-ray diffraction (XRD) and differential scanning calorimetry (DSC). Four compounds, Nd2Ti2O7, Nd2TiO5, Nd2Ti3O9 and Nd4Ti9O24, were confirmed to exist. The Nd2Ti4O11 reported in the literature was proved to be the same compound as Nd4Ti9O24, and the reported phase transformation of Nd2Ti4O11 from the ? structure to the ? structure at 1373 K was not detected. All the phase diagram data from both the literature and the present work were critically reviewed and taken into account during the thermodynamic optimization of the TiO2-Nd2O3 system. A set of consistent thermodynamic parameters, which can explain most of the experimental data of the TiO2-Nd2O3 system, was achieved. The calculated phase diagram of the TiO2-Nd2O3 system is provided.

  13. Oxidized Modified Proteins in the Atherosclerosis Genesis at a Diabetes Mellitus of the 2nd Type

    Directory of Open Access Journals (Sweden)

    O.V. Zanozina

    2009-11-01

    Full Text Available The role of free-radical oxidation in atherogenesis in patients with diabetes mellitus of the 2nd type and ischemic heart disease is considered. It is established that oxidized modified proteins are closely linked to lipid peroxidation and sustain free-radical oxidation in this category of patients. It is demonstrated that detection of oxidized modified proteins can serve as an early and integral test of metabolic and, prospectively, hemostasiological disturbances in diabetes mellitus of the 2nd type.

  14. The Crest Wing Wave Energy Device : 2nd phase testing

    OpenAIRE

    Kofoed, Jens Peter; Antonishen, Michael Patrick

    2009-01-01

    This report presents the results of a continuation of an experimental study of the wave energy converting abilities of the Crest Wing wave energy converter (WEC), in the following referred to as ‘Phase 2'. The Crest Wing is a WEC that uses its movement in matching the shape of an oncoming wave to generate power. Model tests have been performed using scale models (length scale 1:30), provided by WaveEnergyFyn, in regular and irregular wave states that can be found in Assessment of Wave Energ...

  15. 2nd international expert meeting straw power; 2. Internationale Fachtagung Strohenergie

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2012-06-15

    Within the 2nd Guelzow expert discussions, held on 29-30 March 2012 in Berlin (Federal Republic of Germany), the following lectures were given: (1) Promotion of the utilisation of straw in Germany (A. Schuette); (2) The significance of straw in the heat and power generation in EU-27 member states in 2020 and in 2030 under consideration of the costs and sustainability criteria (C. Panoutsou); (3) State of the art of the energetic utilization of hay goods in Europe (D. Thraen); (4) Incineration technological characterisation of straw based on analysis data as well as measured data of large-scale installations (I. Obernberger); (5) Energetic utilization of hay goods in Germany (T. Hering); (6) Actual state of the art towards establishing the first German straw thermal power station (R. Knieper); (7) Straw thermal power plants at agricultural sow farms and poultry farms (H. Heilmann); (8) Country report power from straw in Denmark (A. Evald); (9) Country report power from straw in Poland (J. Antonowicz); (10) Country report power from straw in China (J. Zhang); (11) Energetic utilisation of straw in Czechia (D. Andert); (12) Mobile pelletization of straw (S. Auth); (13) Experiences with the straw thermal power plant from Vattenfall (N. Kirkegaard); (14) Available straw potentials in Germany (potential, straw provision costs) (C. Weiser); (15) Standardization of hay goods and test fuels - Classification and development of product standards (M. Englisch); (16) Measures for the reduction of emissions at hay good incinerators (V. Lenz); (17) Fermentation of straw - State of the art and perspectives (G. Reinhold); (18) Cellulose ethanol from agricultural residues - Sustainable biofuels (A. Hartmair); (19) Syngas by fermentation of straw (N. Dahmen); (20) Construction using straw (D. Scharmer).

  16. Measurement of leak radiation in labyrinth. Measurement of leak radiation in labyrinth of the 2nd light-ion room

    International Nuclear Information System (INIS)

    Assessment of radiation leaked or diffused into the passages of a labyrinth in a facility equipped with an accelerator is essential for the shield design of such a facility. In this study, data available to define a benchmark were collected by measuring the radiation in the labyrinth of the 2nd light-ion room of TIARA at the Takasaki Institute of the Japan Atomic Energy Research Institute. Using neutrons generated by injecting 67 MeV protons into a thick copper target, the leaked radiation was determined. The measurement items were as follows: 1) energy spectrum of the neutron source, 2) distributions of neutron and gamma-ray intensities in the radiation room, 3) dose and intensity distributions and energy spectra of neutrons and gamma-rays in the labyrinth. Based on these benchmark data, we plan to evaluate the simple empirical-formula-type and Monte Carlo-type calculation codes which are used at present for designing radiation shields. (M.N.)

  17. Power system economics : the Nordic electricity market. 2nd ed.

    International Nuclear Information System (INIS)

    This book, written as a textbook for students of engineering, is designed for the Norwegian Power Markets course which is part of the Energy and Environment Master's Program and the recently established international MSc program in Electric Power Engineering. As the title indicates, the book deals with both power system economics in general and the practical implementation and experience from the Nordic market. Areas of coverage include: -- Restructuring/deregulation of the power supply system -- Grid access including tariffs and congestion management -- Generation planning -- Market modeling -- Ancillary services -- Regulation of grid monopolies. Although Power Systems Economics is written primarily as a textbook for students, other readers will also find the book interesting. It deals with problems that have been the subject of considerable attention in the power sector for some years and it addresses issues that are still relevant and important. (au)

  18. 2nd Topical Workshop on Laser Technology and Optics Design

    CERN Document Server

    2013-01-01

    Lasers have a variety of applications in particle accelerator operation and will play a key role in the development of future particle accelerators by improving the generation of high brightness electron and exotic ion beams and through increasing the acceleration gradient. Lasers will also make an increasingly important contribution to the characterization of many complex particle beams by means of laser-based beam diagnostics methods. The second LANET topical workshop will address the key aspects of laser technology and optics design relevant to laser applications in accelerators. The workshop will cover general optics design, provide an overview of different laser sources and discuss methods to characterize beams in detail. Participants will be able to choose from a range of topical areas that go deeper into more specific aspects including tuneable lasers, design of transfer lines, noise sources and their elimination, and non-linear optics effects. The format of the workshop will be mainly training-based wit...

  19. The Crest Wing Wave Energy Device : 2nd phase testing

    DEFF Research Database (Denmark)

    Kofoed, Jens Peter; Antonishen, Michael Patrick

    2009-01-01

    This report presents the results of a continuation of an experimental study of the wave energy converting abilities of the Crest Wing wave energy converter (WEC), in the following referred to as ‘Phase 2'. The Crest Wing is a WEC that uses its movement in matching the shape of an oncoming wave to generate power. Model tests have been performed using scale models (length scale 1:30), provided by WaveEnergyFyn, in regular and irregular wave states that can be found in Assessment of Wave Energy Devices. Best Practice as used in Denmark (Frigaard et al., 2008). The tests were carried out at Dept. of Civil Engineering, Aalborg University (AAU) in the 3D deep water wave tank. The displacement and force applied to a power take off system, provided by WaveEnergyFyn, were measured and used to calculate mechanical power available to the power take off.

  20. Power system economics : the Nordic electricity market. 2nd ed.

    Energy Technology Data Exchange (ETDEWEB)

    Wangensteen, Ivar

    2012-07-01

    This book, written as a textbook for students of engineering, is designed for the Norwegian Power Markets course which is part of the Energy and Environment Master's Program and the recently established international MSc program in Electric Power Engineering. As the title indicates, the book deals with both power system economics in general and the practical implementation and experience from the Nordic market. Areas of coverage include: -- Restructuring/deregulation of the power supply system -- Grid access including tariffs and congestion management -- Generation planning -- Market modeling -- Ancillary services -- Regulation of grid monopolies. Although Power Systems Economics is written primarily as a textbook for students, other readers will also find the book interesting. It deals with problems that have been the subject of considerable attention in the power sector for some years and it addresses issues that are still relevant and important. (au)

  1. Proceedings of the 2nd KUR symposium on hyperfine interactions

    International Nuclear Information System (INIS)

    Hyperfine interactions between a nuclear spin and an electronic spin, discovered from hyperfine splitting in atomic optical spectra, have been utilized not only for the determination of nuclear parameters in nuclear physics but also for novel experimental techniques in many fields such as solid state physics, chemistry, biology and mineralogy, and for diagnostic methods in medical science. Experimental techniques based on hyperfine interactions yield information about microscopic states of matter, so they are important in materials science. Probes for materials research using hyperfine interactions have been nuclei in the ground state and radioactive isotopes prepared with nuclear reactors or particle accelerators, but the utilization of muons generated from accelerators has recently been growing. Such widespread application of hyperfine interaction techniques gives rise to some difficulty in collaboration among the various research fields. In these circumstances, the present workshop was planned four years after the last KUR symposium on the same subject. This report summarizes the contributions to the workshop in order to make them available for studies of hyperfine interactions. (J.P.N.)

  2. Introductory statement to the 2nd scientific forum on sustainable development: A role for nuclear power?

    International Nuclear Information System (INIS)

    In his Introductory Statement to the 2nd Scientific Forum on 'Sustainable Development - A Role for Nuclear Power?' (Vienna, 28 September 1999), the Director General of the IAEA focussed on the main aspects concerning the development of nuclear power: safety, competitiveness, and public support

  3. Proceedings of the 2nd Mediterranean Conference on Information Technology Applications (ITA '97)

    International Nuclear Information System (INIS)

    This is the proceedings of the 2nd Mediterranean Conference on Information Technology Applications, held in Nicosia, Cyprus, on 6-7 November 1997. It contains 16 papers. Two of these fall within the scope of INIS and deal with Telemetry, Radiation Monitoring, Environment Monitoring, Radiation Accidents, Air Pollution Monitoring, Diagnosis, Computers, Radiology and Data Processing

  4. Boundary Value Problems for the $2^{nd}$-order Seiberg-Witten Equations

    CERN Document Server

    Doria, C M

    2004-01-01

    It is shown that the non-homogeneous Dirichlet and Neumann problems for the $2^{nd}$-order Seiberg-Witten equation admit a regular solution once the $\mathcal{H}$-condition (described in the article) is satisfied. The approach consists in applying elliptic techniques to the variational setting of the Seiberg-Witten equation.

  5. Proceedings of the 2nd symposium on valves for coal conversion and utilization

    Energy Technology Data Exchange (ETDEWEB)

    Maxfield, D.A. (ed.)

    1981-01-01

    The 2nd symposium on valves for coal conversion and utilization was held October 15 to 17, 1980. It was sponsored by the US Department of Energy, Morgantown Energy Technology Center, in cooperation with the Valve Manufacturers Association. Seventeen papers have been entered individually into EDB and ERA. (LTN)

  6. Point classification of 2nd order ODEs: Tresse classification revisited and beyond

    OpenAIRE

    Kruglikov, Boris

    2008-01-01

    In 1896 Tresse gave a complete description of relative differential invariants for the pseudogroup action of point transformations on 2nd order ODEs. The purpose of this paper is to review this classification in the light of the modern geometric approach to PDEs, and also to discuss the role of absolute invariants and the equivalence problem.

  7. Technical Adequacy of the Disruptive Behavior Rating Scale-2nd Edition--Self-Report

    Science.gov (United States)

    Erford, Bradley T.; Miller, Emily M.; Isbister, Katherine

    2015-01-01

    This study provides preliminary analysis of the Disruptive Behavior Rating Scale-2nd Edition--Self-Report, which was designed to screen individuals aged 10 years and older for anxiety and behavior symptoms. Score reliability and internal and external facets of validity were good for a screening-level test.

  8. DOE performance indicators for 2nd quarter CY 1993

    Energy Technology Data Exchange (ETDEWEB)

    1993-11-01

    The Department of Energy (DOE) has established a Department-wide Performance Indicator (PI) Program for trending and analysis of operational data as directed by DOE Order 5480.26. The PI Program was established to provide a means for monitoring the environment, safety, and health (ES&H) performance of the DOE at the Secretary and other management levels. This is the tenth in a series of quarterly reports generated for the Department of Energy Idaho Operations Office (DOE-ID) by EG&G Idaho, Inc. to meet the requirements of the PI Program as directed by the DOE Standard (DOE-STD-1048-92). The information in this tenth quarterly report, while contributing to a historical database for supporting future trending analysis, does not at this time provide a sound basis for developing trend-related conclusions. In the future, it is expected that trending and analysis of operational data will enhance the safety culture in both DOE and contractor organizations by providing an early warning of deteriorating environment, safety, and health conditions. DOE-STD-1048-92 identifies four general areas of PIs. They are: Personnel Safety, Operational Incidents, Environment, and Management. These four areas have been subdivided into 26 performance indicators. Approximately 115 performance indicator control and distribution charts comprise the body of this report. A brief summary of PIs contained in each of these general areas is provided. The four EG&G facilities whose performance is charted herein are as follows: (1) The Advanced Test Reactor (ATR), (2) The Radioactive Waste Management Complex (RWMC), (3) The Waste Experimental Reduction Facility (WERF), and (4) The Test Reactor Area (TRA) Hot Cells.

  9. Real-Time Extended Interface Automata for Software Testing Cases Generation

    OpenAIRE

    Yang, Shunkun; Xu, Jiaqi; Man, Tianlong; Liu, Bin

    2014-01-01

    Testing and verification of the interfaces between software components are particularly important due to the large number of complex interactions, which requires traditional modeling languages to overcome their existing shortcomings in describing temporal information and in controlling software testing inputs. This paper presents the real-time extended interface automata (RTEIA), which add clearer and more detailed temporal information description by the application of time words...

  10. An Evaluation of Software Distributed Shared Memory for Next-Generation Processors and Networks

    OpenAIRE

    Cox, A. L.; Dwarkadas, S.; Keleher, P.; Zwaenepoel, W.

    1993-01-01

    We evaluate the effect of processor speed, network characteristics, and software overhead on the performance of release-consistent software distributed shared memory. We examine five different protocols for implementing release consistency: eager update, eager invalidate, lazy update, lazy invalidate, and a new protocol called lazy hybrid. This lazy hybrid protocol combines the benefits of both lazy update and lazy invalidate. Our simulations indicate that with the processors and networks tha...

  11. Towards a "2nd Generation" of Quality Labels: a Proposal for the Evaluation of Territorial Quality Marks / Vers une «2ème génération» de labels de qualité: une proposition pour l'évaluation des marques de qualité territoriale / Hacia una "2" generación" de sellos de calidad: una propuesta para la evaluación de las marcas de calidad territorial

    Scientific Electronic Library Online (English)

    Eduardo, Ramos; Dolores, Garrido.

    2014-12-01

    Full Text Available Abstract in Spanish: Recent literature analyses the role of territorial specificities as the core of rural territorial development strategies based on differentiation. Unfortunately, the proliferation of quality assurance schemes is producing a "labyrinth of labels" which dilutes local efforts to capitalize on rural specificities. A second generation of labels is currently being developed to simplify territorial differentiation. A number of territories in southern Europe are basing their rural development strategies on the Marca de Calidad Territorial Europea (MCTE) project. This paper proposes an original methodology, designed and developed by the authors, for the evaluation of some of the second-generation labels. This methodology has been validated in fifteen rural territories, the pioneers of the MCTE in Spain. Abstract in English: Recent literature analyses the role of territorial specificities as the core of territorial rural development strategies based on differentiation. Unfortunately, the proliferation of quality assurance schemes is provoking a "labyrinth of labels" which diffuses the local efforts for capitalizing rural specificities. A second generation of labels is currently being developed to simplify the territorial differentiation message. A number of territories in Southern Europe are basing their rural development strategies on joining the so-called European Territorial Quality Mark (ETQM) Project. This paper proposes an original methodology, designed and developed by the authors, for the evaluation of some of these second-generation labels. This methodology has been validated in 15 rural territories, the pioneers of the ETQM in Spain.

  12. Generating Variable Strength Covering Array for Combinatorial Software Testing with Greedy Strategy

    Directory of Open Access Journals (Sweden)

    Ziyuan Wang

    2013-12-01

    Full Text Available Combinatorial testing is a practical and efficient software testing technique which can detect faults triggered by interactions among factors in software. Compared to classic fixed-strength combinatorial testing, variable-strength combinatorial testing usually uses fewer test cases to detect more interaction faults, because it takes the actual interaction relationships in the software fully into account. For a model of variable-strength combinatorial testing that has been proposed previously, two heuristic algorithms, based on a one-test-at-a-time greedy strategy, are proposed in this paper to generate variable-strength covering arrays as test suites for software testing. Experimental results show that, compared to some existing algorithms and tools, the two proposed algorithms have advantages in both execution effectiveness and the size of the generated test suites.
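
    For the plain pairwise (strength-2) case, the one-test-at-a-time strategy mentioned above can be sketched as follows; the factor names and values are invented, and the algorithms in the paper additionally handle variable-strength interaction relations rather than uniform pairs.

        from itertools import combinations, product

        def pairwise_suite(parameters):
            """Greedy one-test-at-a-time generation of a pairwise covering array.
            parameters maps factor name -> list of its values."""
            names = list(parameters)
            uncovered = {((a, va), (b, vb))
                         for a, b in combinations(names, 2)
                         for va in parameters[a] for vb in parameters[b]}
            suite = []
            while uncovered:
                best, best_gain = None, -1
                # Pick the full test that covers the most still-uncovered pairs.
                for values in product(*(parameters[n] for n in names)):
                    test = dict(zip(names, values))
                    gain = sum(1 for (a, va), (b, vb) in uncovered
                               if test[a] == va and test[b] == vb)
                    if gain > best_gain:
                        best, best_gain = test, gain
                suite.append(best)
                uncovered = {((a, va), (b, vb)) for (a, va), (b, vb) in uncovered
                             if not (best[a] == va and best[b] == vb)}
            return suite

        factors = {"os": ["linux", "windows"], "browser": ["firefox", "chrome"], "db": ["mysql", "sqlite"]}
        for test in pairwise_suite(factors):
            print(test)

    Enumerating every full-factorial candidate is only feasible for small models; practical tools instead sample or build candidate tests factor by factor, which is where the heuristics evaluated in the paper come in.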

  13. Automatic Generation of Machine Emulators: Efficient Synthesis of Robust Virtual Machines for Legacy Software Migration

    DEFF Research Database (Denmark)

    Franz, Michael; Gal, Andreas

    2006-01-01

    As older mainframe architectures become obsolete, the corresponding legacy software is increasingly executed via platform emulators running on top of more modern commodity hardware. These emulators are virtual machines that often include a combination of interpreters and just-in-time compilers. Implementing interpreters and compilers for each combination of emulated and target platform independently of each other is a redundant and error-prone task. We describe an alternative approach that automatically synthesizes specialized virtual-machine interpreters and just-in-time compilers, which then execute on top of an existing software portability platform such as Java. The result is a considerably reduced implementation effort.

  14. A Software Safety Certification Plug-in for Automated Code Generators (Executive Briefing)

    Science.gov (United States)

    Denney, Ewen; Schumann, Johann; Greaves, Doug

    2006-01-01

    A viewgraph presentation describing a certification tool to check the safety of auto-generated codes is shown. The topics include: 1) Auto-generated Code at NASA; 2) Safety of Auto-generated Code; 3) Technical Approach; and 4) Project Plan.

  15. A longitudinal radiographic study of the mineralization of 2nd premolars.

    Science.gov (United States)

    Ravin, J J; Nielsen, H G

    1977-05-01

    Odontogenesis of the 2nd premolar begins in the majority of cases at the age of 3-3 1/2 years, although this period can vary more widely than that for other permanent teeth. For this reason, aplasia of this group of teeth cannot be diagnosed as early and with the same degree of certainty. A group of 104 children aged 3-7 years in whom one or more tooth germs mesial to the 1st permanent molar were not visible in the various age groups was reexamined radiographically in the region where they apparently lacked the development of tooth germs. The second examination took place 16-24 months after the first, and a comparison was made of the two examinations. The study confirms that the 2nd premolar can be very late in developing and that the chance of this being so is greater in the maxilla than in the mandible. PMID:266750

  16. 2nd-Order CESE Results For C1.4: Vortex Transport by Uniform Flow

    Science.gov (United States)

    Friedlander, David J.

    2015-01-01

    The Conservation Element and Solution Element (CESE) method was used as implemented in the NASA research code ez4d. The CESE method is a time-accurate formulation with flux conservation in both space and time. The method treats the discretized derivatives of space and time identically; while high-order versions exist, the 2nd-order accurate version was used here. As regards the ez4d code, it is an unstructured Navier-Stokes solver coded in C++ with serial and parallel versions available. As part of its architecture, ez4d can utilize multi-threading and the Message Passing Interface (MPI) for parallel runs.
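
    In the standard CESE formulation (a sketch from the general literature, not taken from the ez4d source), each conservation law $\partial u_m/\partial t + \partial f_m/\partial x = 0$ is enforced as a flux balance over every space-time conservation element $V$ of the mesh:

        $$ \oint_{S(V)} \mathbf{h}_m \cdot d\mathbf{s} = 0, \qquad \mathbf{h}_m = (f_m,\ u_m), $$

    where $S(V)$ is the boundary of the space-time region $V$ and $\mathbf{h}_m$ is the space-time flux vector of the $m$-th conserved quantity; enforcing the integral form in space and time together is what gives the method the flux conservation property referred to above.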

  17. Preface: 2nd Workshop on the State of the Art in Nuclear Cluster Physics

    International Nuclear Information System (INIS)

    The 2nd workshop on the "State of the Art in Nuclear Cluster Physics" (SOTANCP2) took place on May 25-28, 2010, at the Universite Libre de Bruxelles (Brussels, Belgium). The first workshop of this series was held in Strasbourg (France) in 2008. The purpose of SOTANCP2 was to promote the exchange of ideas and to discuss new developments in Clustering Phenomena in Nuclear Physics and Nuclear Astrophysics both from a theoretical and from an experimental point of view

  18. International symposium on peripheral nerve repair and regeneration and 2nd club Brunelli meeting.

    OpenAIRE

    Geuna Stefano; Turgut Mehmet

    2010-01-01

    Abstract The International Symposium "Peripheral Nerve Repair and Regeneration and 2nd Club Brunelli Meeting" was held on December 4-5, 2009 in Turin, Italy (Organizers: Bruno Battiston, Stefano Geuna, Isabelle Perroteau, Pierluigi Tos). Interest in the study of peripheral nerve regeneration is very much alive because complete recovery of nerve function almost never occurs after nerve reconstruction and, often, the clinical outcome is rather poor. Therefore, there is a need for defining innov...

  19. Proceedings of the 2nd IWDG International Whale Conference. Muc Mhara Ireland's Smallest Whale

    OpenAIRE

    Berrow, S. D.; Deegan, B.

    2010-01-01

    Muc Mhara – Ireland’s smallest whale. Proceedings of the 2nd Irish Whale and Dolphin Group International Whale Conference. Papers presented include, “Introduction: The harbour porpoise or Muc Mhara”, “An Irish name for the humble harbour porpoise”, “Life in the Fast Lane: Ecology and Behaviour of harbour porpoises in the Gulf of Maine”, “The ecology of harbour porpoise (Phocoena phocoena) in Irish waters: what strandings programmes tell us.”, “Passive acoustic monitoring...

  20. One-stage thumb lengthening with use of an osteocutaneous 2nd metacarpal flap

    OpenAIRE

    Givissis, Panagiotis; Stavridis, Stavros I.; Ditsios, Konstantinos; Christodoulou, Anastasios

    2009-01-01

    Traumatic thumb amputation represents an extremely disabling entity, thus rendering its reconstruction a procedure of paramount importance. A case of a patient, who sustained a traumatic amputation of his left index finger at the metacarpophalangeal joint and of his left thumb in the middle of the proximal phalanx 4 months ago and was initially treated elsewhere, is described. For the thumb reconstruction, an osteocutaneous flap of the radial side of the 2nd metacarpal, which consisted of a ...

  1. 2nd-Order CESE Results For C1.1: Transonic Ringleb Flow

    Science.gov (United States)

    Friedlander, David J.

    2015-01-01

    The Conservation Element and Solution Element (CESE) method was used as implemented in the NASA research code ez4d (an unstructured Navier-Stokes solver coded in C++ with serial and parallel versions available.) The CESE method is a time-accurate formulation with flux-conservation in both space and time. The method treats the discretized derivatives of space and time identically and while the 2nd-order accurate version was used, high-order versions exist.

  2. Performance of 2nd Generation BaBar Resistive Plate Chambers

    Energy Technology Data Exchange (ETDEWEB)

    Anulli, F.; Baldini, R.; Calcaterra, A.; de Sangro, R.; Finocchiaro, G.; Patteri, P.; Piccolo, M.; Zallo, A.; /Frascati; Cheng, C.H.; Lange, D.J.; Wright, D.M.; /LLNL,; Messner, R.; Wisniewski, William J.; /SLAC; Pappagallo, M.; /Bari U. /INFN, Bari; Andreotti, M.; Bettoni, D.; Calabrese, R.; Cibinetto, G.; Luppi, E.; Negrini, M.; /Ferrara; Capra, R.; /Genoa U. /INFN, Genoa /Naples U. /INFN, Naples /Perugia U. /INFN, Perugia /Pisa U. /INFN, Pisa /Rome U. /INFN, Rome /Oregon U. /UC, Riverside

    2005-07-12

    The BaBar detector has operated nearly 200 Resistive Plate Chambers (RPCs), constructed as part of an upgrade of the forward endcap muon detector, for the past two years. The RPCs experience widely different background and luminosity-driven singles rates (0.01-10 Hz/cm{sup 2}) depending on position within the endcap. Some regions have integrated over 0.3 C/cm{sup 2}. RPC efficiency measured with cosmic rays is high and stable. The average efficiency measured with beam is also high. However, a few of the highest rate RPCs have suffered efficiency losses of 5-15%. Although constructed with improved techniques and minimal use of linseed oil, many of the RPCs, which are operated in streamer mode, have shown increased dark currents and noise rates that are correlated with the direction of the gas flow and the integrated current. Studies of the above aging effects are presented and correlated with detector operating conditions.

  3. PRODUCTION OF 2ND GENERATION BIOETHANOL FROM LUCERNE – OPTIMIZATION OF HYDROTHERMAL PRETREATMENT

    Directory of Open Access Journals (Sweden)

    Sune Tjalfe Thomsen

    2012-02-01

    Full Text Available Lucerne (Medicago sativa) has many qualities associated with sustainable agriculture such as nitrogen fixation and high biomass yield. Therefore, there is interest in whether lucerne is a suitable biomass substrate for bioethanol production, and if hydrothermal pretreatment (HTT) of lucerne improves enzymatic convertibility, providing sufficient enzymatic conversion of carbohydrate to simple sugars for ethanol production. The HTT process was optimised for lucerne hay, and the pretreated biomass was assessed by carbohydrate analysis, inhibitor characterisation of liquid phases, and by simultaneous saccharification and fermentation (SSF) of the whole slurry with Cellubrix enzymes and Saccharomyces cerevisiae yeast. The optimal HTT conditions were 205°C for 5 minutes, resulting in pentose recovery of 81%, and an enzymatic convertibility of glucan to monomeric glucose of 74%, facilitating a conversion of 6.2% w/w of untreated material into bioethanol in SSF, which is equivalent to 1,100 litre ethanol per hectare per year.

  4. PRODUCTION OF 2ND GENERATION BIOETHANOL FROM LUCERNE – OPTIMIZATION OF HYDROTHERMAL PRETREATMENT

    OpenAIRE

    Sune Tjalfe Thomsen; Morten Jensen; Jens Ejbye Schmidt

    2012-01-01

    Lucerne (Medicago sativa) has many qualities associated with sustainable agriculture such as nitrogen fixation and high biomass yield. Therefore, there is interest in whether lucerne is a suitable biomass substrate for bioethanol production, and if hydrothermal pretreatment (HTT) of lucerne improves enzymatic convertibility, providing sufficient enzymatic conversion of carbohydrate to simple sugars for ethanol production. The HTT process was optimised for lucerne hay, and the pretreated bioma...

  5. Characterization of 2nd generation biomass under thermal conversion and the fate of nitrogen

    Energy Technology Data Exchange (ETDEWEB)

    Giuntoli, J.

    2010-11-17

    This dissertation deals with the characterization of several biomass materials under thermal conversion conditions using small-scale equipment. The fuels are tested under the conditions of slow and fast heating rate pyrolysis and combustion, with the main goal of investigating the chemistry of fuel-bound nitrogen. Among renewable sources, biomass materials hold a special position because they can, in the short term, substitute or integrate fossil fuels in all of their applications applying comparatively few changes to the existing equipment. Biomass wastes, from agriculture or other processes, are convenient in more respects since their use would not only substitute fossil fuels but it would also valorize waste streams. These materials, however, present several issues that are highly delaying their deployment on a large scale. Three of the most important problems are dealt with in this thesis: the heterogeneous nature of the materials, high amount of ash forming matter containing troublesome compounds such as K, Cl and P, and finally, high content of nitrogen. First of all, many biomass residues contain a higher amount of nitrogen compared with woody biomass or even coal. This high content of fuel-N could directly translate into high NOx emissions in combustion conditions or into a high content of nitrogen containing gases such as NH3 and HCN in the syngas from gasification. Primary measures, such as air staging, can be applied directly in the reactor in order to promote the reduction of NOx and NOx--precursors to molecular nitrogen. However, in order to apply such measures and optimize the syngas composition or minimize emissions without relying on expensive catalysts, a detailed knowledge of the mechanisms of fuel-N conversion is required. This thesis has as its main purpose to study the release of volatile nitrogen compounds under pyrolysis conditions and the analysis of the emissions of NO under combustion conditions from high-N fuels. Secondly, as explained in the first two chapters of this dissertation, the definition of biomass is very broad and it includes materials with extremely different composition and characteristics. Additionally, the interest in exploiting some of these materials, such as manures, for energy conversion has never been high enough to trigger substantial research. As a consequence, fundamental data such as reactivity and products distribution are almost completely lacking for many biowastes. One of the purposes of this thesis is, therefore, to gather extensive fundamental data for potential fuels, which have not yet fully characterized. Finally, some elements such as K, Cl, P and S, contained in biomass materials, are known to cause several problems during boiler operation. At high temperatures alkali silicates with melting temperatures lower than the operating one are formed; these partly molten particles can then create issues like slagging, fouling, loss of fluidization and, when Cl is present, corrosion of the boiler surfaces. Together with specific research on boiler materials and optimization of operating conditions, possible pre-treatments used to remove these compounds from the fuel before entering the reactor could greatly enhance the overall process. In this thesis, the effects of a water-leaching pre-treatment on the fuels' reactivity and product yields during pyrolysis are explored. 
After a general introduction, Chapter 2 has the purpose of providing the reader with an overview of definitions and concepts that are used in the rest of the dissertation. The materials studied in this work and the setups used are introduced in Chapter 3. Chapter 4, then, presents the results of measurements performed on agricultural residues under slow pyrolysis conditions. Chapter 5 reports the results of a similar analysis to the previous one, that was performed on different biomass residues: dry distiller's grains with solubles (DDGS) and chicken manure. Building up on the results of the previous two chapters, Chapter 6 describes the results of fast pyrolysis measurements of DDGS and palm

  6. Utilisation of 2nd generation web technologies in master level vocational teacher training

    OpenAIRE

    Péter Tóth

    2009-01-01

    The Masters level Opportunities and Technological Innovation in Vocational Teacher Education project (project site: http://motivate.tmpk.bmf.hu/) aims to develop the use and management of virtual learning environments in the area of vocational teacher training, drawing on a well established international partnership of institutions providing both technical and educational expertise. This paper gives an overall picture of the first results and products of the collaboration. We touch upon the g...

  7. 2nd generation lignocellulosic bioethanol: is torrefaction a possible approach to biomass pretreatment?

    Energy Technology Data Exchange (ETDEWEB)

    Chiaramonti, David; Rizzo, Andrea Maria; Prussi, Matteo [University of Florence, CREAR - Research Centre for Renewable Energy and RE-CORD, Florence (Italy); Tedeschi, Silvana; Zimbardi, Francesco; Braccio, Giacobbe; Viola, Egidio [ENEA - Laboratory of Technology and Equipment for Bioenergy and Solar Thermal, Rotondella (Italy); Pardelli, Paolo Taddei [Spike Renewables s.r.l., Florence (Italy)

    2011-03-15

    Biomass pretreatment is a key and energy-consuming step in lignocellulosic ethanol production; it is largely responsible for the energy efficiency and economic sustainability of the process. A new approach to biomass pretreatment for the lignocellulosic bioethanol chain could be mild torrefaction. Among other effects, biomass torrefaction improves the grindability of fibrous materials, thus reducing the energy demand for grinding the feedstock before hydrolysis, and opens up the biomass structure, making it more accessible to enzymes for hydrolysis. Moreover, the possibility of hydrolyzing torrefied biomass had not yet been proven. The aim of the preliminary experiments carried out was to achieve a first understanding of the possibility of combining torrefaction and hydrolysis in lignocellulosic bioethanol processes, and to evaluate it in terms of sugar and ethanol yields. Biomass from olive pruning was torrefied at different conditions, namely 180-280 °C for 60-120 min, ground and then used as substrate in hydrolysis experiments. The bioconversion was carried out at flask scale using a mixture of cellulolytic, hemicellulolytic and {beta}-glucosidase enzymes, and a commercial strain of Saccharomyces cerevisiae. The experiments demonstrated that torrefied biomass can be enzymatically hydrolyzed and fermented into ethanol, with yields comparable to those of ground untreated biomass while saving electrical energy. The comparison between the bioconversion yields achieved using only raw ground biomass or torrefied and ground biomass highlighted that: (1) mild torrefaction conditions limit sugar degradation to 5-10%; and (2) torrefied biomass does not lead to enzymatic or fermentation inhibition. Energy consumption for ethanol production was preliminarily estimated, and three different pretreatment chains, i.e., raw biomass grinding, torrefaction plus grinding, and steam explosion, were compared. Based on these preliminary results, steam explosion still has a significant advantage over the other two process chains. (orig.)

  8. Reed canary grass as a feedstock for 2nd generation bioethanol production.

    Science.gov (United States)

    Kallioinen, Anne; Uusitalo, Jaana; Pahkala, Katri; Kontturi, Markku; Viikari, Liisa; Weymarn, Niklas von; Siika-Aho, Matti

    2012-11-01

    The enzymatic hydrolysis and fermentation of reed canary grass, harvested in the spring or autumn, and barley straw were studied. Steam pretreated materials were efficiently hydrolysed by commercial enzymes with a dosage of 10-20 FPU/g d.m. Reed canary grass harvested in the spring was hydrolysed more efficiently than the autumn-harvested reed canary grass. Additional β-glucosidase improved the release of glucose and xylose during the hydrolysis reaction. The hydrolysis rate and level of reed canary grass with a commercial Trichoderma reesei cellulase could be improved by supplementation of purified enzymes. The addition of CBH II improved the hydrolysis level by 10% in 48 hours' hydrolysis. Efficient mixing was shown to be important for hydrolysis already at 10% dry matter consistency. The highest ethanol concentration (20 g/l) and yield (82%) was obtained with reed canary grass at 10% d.m. consistency. PMID:22939601
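
    As a rough aid to interpreting yield figures like the 82% reported above, the sketch below computes an ethanol yield as a percentage of the theoretical maximum from glucan, using the standard stoichiometric factors (glucan to glucose 1.111, glucose to ethanol 0.511). The dry-matter loading, glucan fraction and measured titre are illustrative assumptions, not values taken from the paper.

        # Hedged sketch: percent-of-theoretical ethanol yield from glucan.
        # Glucan fraction and ethanol titre below are illustrative, not from the paper.

        def theoretical_ethanol_g_per_l(dm_g_per_l, glucan_fraction):
            """Maximum ethanol (g/l) if all glucan is hydrolysed and fermented."""
            glucose = dm_g_per_l * glucan_fraction * 1.111   # glucan -> glucose (hydration)
            return glucose * 0.511                           # glucose -> ethanol (fermentation)

        dm_loading = 100.0        # g dry matter per litre (10% d.m. consistency)
        glucan_fraction = 0.40    # assumed glucan content of the pretreated fibre
        measured_ethanol = 20.0   # g/l, as reported above

        theoretical = theoretical_ethanol_g_per_l(dm_loading, glucan_fraction)
        print(f"theoretical max: {theoretical:.1f} g/l")
        print(f"yield: {100 * measured_ethanol / theoretical:.0f}% of theoretical")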

  9. eLEN2 — 2nd generation eLearning Exchange Networks.

    OpenAIRE

    Panckhurst, Rachel; Marsh, Debra

    2009-01-01

    Since May 2007 the authors of this paper have explored and evaluated the use, including relative merits and challenges of social networking within the context of higher education professional development programmes in France and in Britain (Marsh & Panckhurst, 2007; Panckhurst & Marsh, 2008). A social networking tool was adopted for Masters' level courses in order to try and establish an effective collaborative pedagogical environment and sense of community, by placing students at the centre ...

  10. Utilisation of 2nd generation web technologies in master level vocational teacher training

    Directory of Open Access Journals (Sweden)

    Péter Tóth

    2009-03-01

    Full Text Available The Masters level Opportunities and Technological Innovation in Vocational Teacher Education project (project site: http://motivate.tmpk.bmf.hu/) aims to develop the use and management of virtual learning environments in the area of vocational teacher training, drawing on a well established international partnership of institutions providing both technical and educational expertise. This paper gives an overall picture of the first results and products of the collaboration. We discuss in detail the goals, the assessments and the learning process of the “Multimedia and e-Learning: e-learning methods and tools” module. The main cooperative and collaborative tools available in the virtual learning environment are presented. The communication during collaborative learning, the structured debate on the forum and the benefits of collaborative learning in a VLE are discussed at the end of this paper.

  11. Using 2nd generation tyrosine kinase inhibitors in frontline management of chronic phase chronic myeloid leukemia

    OpenAIRE

    Jayakar, Vishal

    2014-01-01

    Choices in medicine come with responsibility. With several TKI's (Tyrosine kinase inhibitors) available for front-line management of CML (Chronic Myeloid Leukemia), an astute clinician has to personalise, rationalise and take a pragmatic approach towards selection of the best drug for the ‘patient in question’. Though it is hotly debated as to which TKI will triumph, the truth of this debate lies in individualising treatment rather than a general ‘all size fits all’ approach with imat...

  12. Makahiki+WattDepot : An open source software stack for next generation energy research and education

    DEFF Research Database (Denmark)

    Johnson, Philip M.; Xu, Yongwen

    2013-01-01

    The accelerating world-wide growth in demand for energy has led to the conceptualization of a “smart grid”, where a variety of decentralized, intermittent, renewable energy sources (for example, wind, solar, and wave) would provide most or all of the power required by small-scale “micro-grids” servicing hundreds to thousands of consumers. Such a smart grid will require consumers to transition from passive to active participation in order to optimize the efficiency and effectiveness of the grid’s electrical capabilities. This paper presents a software stack comprised of two open source software systems, Makahiki and WattDepot, which together are designed to engage consumers in energy issues through a combination of education, real-time feedback, incentives, and game mechanics. We detail the novel features of Makahiki and WattDepot, along with our initial experiences using them to implement an energy challenge called the Kukui Cup.

  13. Programa computacional para geração de séries sintéticas de precipitação / Software for generation of synthetic series of precipitation

    Scientific Electronic Library Online (English)

    Sidney S., Zanetti; Fernando F., Pruski; Michel C., Moreira; Gilberto C., Sediyama; Demetrius D., Silva.

    2005-04-01

    Full Text Available SciELO Brazil | Language: Portuguese Abstract (translated from Portuguese): A computer program was developed that applies the methodology for generating synthetic precipitation series developed by OLIVEIRA (2003). The application was implemented as a computational algorithm in the "Borland Delphi [...] 6.0" programming environment. The required input data come from a database in the format standardized by the National Water Agency (ANA), containing daily rainfall records from meteorological stations. From this information, the program generates synthetic series of daily precipitation containing the total rainfall in millimetres, the event duration in hours, the standardized time of occurrence of the maximum instantaneous intensity, and the standardized maximum instantaneous intensity. The generated synthetic series is stored in text-format files that can later be accessed by other applications and/or spreadsheets. In addition to the files, various results are presented as graphs and tables, facilitating the evaluation of the performance of the developed methodology. Abstract in English: A computational model was developed to generate synthetic series of rainfall using the method developed by OLIVEIRA (2003). The software was developed in the Borland Delphi 6.0 environment. The input data come from the daily precipitation data in the standardized format of the National Water Agency (ANA [...]). The software is capable of generating synthetic series of daily rainfall containing the amount and the duration of the rainfall, and the standardized event time of the maximum instantaneous intensity. The generated synthetic series are stored in text-formatted files that may be accessed by other software and/or electronic datasheets. Graphs and tables are also presented, to facilitate evaluation of the performance of the method developed.
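
    The sketch below is a generic synthetic daily-rainfall generator; it is not the OLIVEIRA (2003) methodology used by the software above, and only illustrates the kind of record the program produces (daily amount, event duration, standardized time and intensity of the peak). All distributions and parameters are illustrative assumptions.

        # Hedged sketch of a generic synthetic daily-rainfall generator.
        # NOT the OLIVEIRA (2003) method; distributions and parameters are illustrative.
        import random

        def generate_series(n_days, p_wet=0.3, mean_mm=8.0):
            series = []
            for day in range(n_days):
                if random.random() < p_wet:                     # wet day?
                    amount = random.expovariate(1.0 / mean_mm)  # rainfall depth, mm
                    duration = random.uniform(1.0, 12.0)        # event duration, h
                    t_peak = random.random()                    # standardized time of peak [0, 1]
                    i_peak = random.uniform(1.5, 4.0)           # standardized peak intensity
                    series.append((day, round(amount, 1), round(duration, 1),
                                   round(t_peak, 2), round(i_peak, 2)))
            return series

        for record in generate_series(30):
            print(record)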

  14. Research on Object-oriented Software Testing Cases of Automatic Generation

    Directory of Open Access Journals (Sweden)

    Junli Zhang

    2013-11-01

    Full Text Available In research on the automatic generation of test cases, different test cases drive different execution paths, and the probability of each path being executed differs. For paths that are easy to execute, many redundant test cases tend to be generated, while only a few test cases are generated for control paths that are hard to execute. A genetic algorithm can be used to guide test case generation: it restricts the generation of further test cases for the easily executed paths and encourages, as far as possible, the generation of test cases for the hard-to-execute paths. Building on path-oriented test case generation techniques, the genetic algorithm is therefore adopted to drive the generation process. According to the path triggered during dynamic execution of the program, the generated test cases are partitioned into equivalence classes, and the number of test cases is adjusted dynamically by a fitness value associated with each path. The method creates a sufficient number of test cases for each execution path while reducing redundancy, making it an effective approach to automatic test case generation.
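
    A minimal sketch of the idea described above, assuming a toy program under test with one hard-to-reach branch; the branch-distance fitness, population size and mutation scheme are illustrative choices, not taken from the paper.

        # Hedged sketch: genetic search for inputs that trigger a hard-to-reach path.
        # The program under test, fitness and GA parameters are illustrative assumptions.
        import random

        def program_under_test(x, y):
            """Toy program: the 'hard' path needs x > 0 and x roughly equal to 3*y."""
            if x > 0 and abs(x - 3 * y) < 2:
                return "hard_path"
            return "easy_path"

        def fitness(individual):
            """Smaller branch distance -> higher fitness (closer to the hard path)."""
            x, y = individual
            return 1.0 / (1.0 + abs(x - 3 * y)) if x > 0 else 0.0

        def evolve(pop_size=40, generations=50):
            population = [(random.randint(-100, 100), random.randint(-100, 100))
                          for _ in range(pop_size)]
            for _ in range(generations):
                population.sort(key=fitness, reverse=True)
                parents = population[:pop_size // 2]
                children = []
                while len(children) < pop_size - len(parents):
                    (x1, y1), (x2, y2) = random.sample(parents, 2)
                    child = (x1, y2)                              # one-point crossover
                    if random.random() < 0.2:                     # mutation
                        child = (child[0] + random.randint(-5, 5), child[1])
                    children.append(child)
                population = parents + children
            return max(population, key=fitness)

        best = evolve()
        print(best, program_under_test(*best))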

  15. Technical Background Material for the Wave Generation Software AwaSys 5

    DEFF Research Database (Denmark)

    Frigaard, Peter; Andersen, Thomas Lykke

    2010-01-01

    "Les Appareils Generateurs de Houle en Laboratorie" presented by Bi¶esel and Suquet in 1951 discussed and solved the analytical problems concerning a number of di®erent wave generator types. For each wave maker type the paper presented the transfer function between wave maker displacement and wave amplitude in those cases where the analytical problem could be solved. The article therefore represented a giant step in wave generation techniques and found the basis for today's wave generation in hydraulics laboratories.

  16. Higher (2nd)-order polarization-Wigner function for `even' entangled bi-modal coherent states

    CERN Document Server

    Singh, Ravi S; Yadava, Lallan; Gupta, Gyaneshwar K

    2012-01-01

    Higher (2nd)-order Wigner distribution function in quantum phase space for entangled bi-modal coherent states, a representative of higher (2nd)-order optical polarization, is introduced by generalizing the kernel (transiting) operator in the Cahill-Glauber C(s)-correspondence rule. Its structure is analyzed, revealing three oscillating peaks: two for the individual modes and a third arising from interference between the modes. The graphics of the 2nd-order polarization Wigner distribution function also demonstrate, incisively, that it is non-Gaussian in nature, attaining non-negative values in quantum phase space.

  17. A luminescence spectroscopy study of SrI{sub 2}:Nd{sup 3+} single crystals

    Energy Technology Data Exchange (ETDEWEB)

    Ogorodnikov, I.N., E-mail: i.n.ogorodnikov@gmail.com [Experimental Physics Department, Ural Federal University, 19, Mira Street, 620002 Ekaterinburg (Russian Federation); Pustovarov, V.A. [Experimental Physics Department, Ural Federal University, 19, Mira Street, 620002 Ekaterinburg (Russian Federation); Goloshumova, A.A.; Isaenko, L.I.; Yelisseyev, A.P.; Pashkov, V.M. [Institute of Geology and Mineralogy of Siberian Branch of RAS, 43, Russkaya Street, 630058 Novosibirsk (Russian Federation)

    2013-11-15

    The paper presents the results of a study on the luminescence of SrI{sub 2}:Nd{sup 3+} single crystals grown by the vertical Bridgman method. The photoluminescence (PL) spectra of SrI{sub 2}:Nd crystals show characteristic lines corresponding to transitions in trivalent Nd{sup 3+} ions; the most intense line at ca. 1070 nm is due to the radiative {sup 4}F{sub 3/2}→{sup 4}I{sub 11/2} transitions. Efficient PL excitation in the crystal transparency band occurs due to optical 4f→4f transitions from the ground {sup 4}I{sub 9/2} state, or to the charge transfer transitions I{sup -}→Nd{sup 3+}. The paper discusses two new PL emission bands at 2.8 and 3.8 eV, associated with the lattice defects formed in the SrI{sub 2} crystal upon introducing Nd{sup 3+} impurity ions; two intense narrow PL excitation bands at 5.52 and 5.31 eV, originating from free and defect-bound excitons, respectively; and an efficient channel of exciton energy transfer between the host lattice and defects. We calculated the H(k) functions describing the distribution of the elementary relaxations over the reaction rate constants and on this basis explained the nonexponential PL decay kinetics in the SrI{sub 2}:Nd{sup 3+} crystal. -- Author-Highlights: • Time-resolved luminescence spectroscopy study of SrI{sub 2}:Nd{sup 3+} single crystals. • The PL emission band at 1070 nm is due to {sup 4}F{sub 3/2}→{sup 4}I{sub 11/2} transitions in Nd{sup 3+} ions. • PL bands at 2.8 and 3.8 eV are due to lattice defects in SrI{sub 2}:Nd{sup 3+} crystals. • Efficient channel of exciton energy transfer between the host lattice and defects. • Nonexponential PL decay kinetics explained in terms of the Kohlrausch function.
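
    For reference, the Kohlrausch (stretched-exponential) decay law mentioned in the highlights is commonly written as

        I(t) = I_0 \exp\left[ -\left( t/\tau \right)^{\beta} \right], \qquad 0 < \beta \le 1,

    which is equivalent to a superposition of simple exponential decays weighted by a distribution H(k) of rate constants,

        I(t) = I_0 \int_0^{\infty} H(k)\, e^{-kt}\, dk .

    The symbols here are generic (I_0 the initial intensity, \tau a characteristic decay time, \beta the stretching exponent) and are not taken from the paper, which works directly with the recovered H(k) distributions.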

  18. Sustainable development - a role for nuclear power? 2nd scientific forum

    International Nuclear Information System (INIS)

    The 2nd Scientific Forum of the International Atomic Energy Agency (IAEA) was held during the 43rd General Conference. This paper summarizes the deliberations of the two-day Forum. The definition of 'sustainable development' of the 1987 Bruntland Commission - 'development that meets the needs of the present without compromising the ability of future generations to meet their own needs' - provided the background for the Forum's debate whether and how nuclear power could contribute to sustainable energy development. The framework for this debate comprises different perspectives on economic, energy, environmental, and political considerations. Nuclear power, along with all energy generating systems, should be judged on these considerations using a common set of criteria (e.g., emission levels, economics, public safety, wastes, and risks). First and foremost, there is a growing political concern over the possible adverse impact of increasing emissions of greenhouse gases from fossil fuel combustion. However, there is debate as to whether this would have any material impact on the predominantly economic criteria currently used to make investment decisions on energy production. According to the views expressed, the level of safety of existing nuclear power plants is no longer a major concern - a view not yet fully shared by the general public. The need to maintain the highest standards of safety in operation remains, especially under the mounting pressure of competitiveness in deregulated and liberalized energy markets. The industry must continuously reinforce a strong safety culture among reactor designers, builders, and operators. Furthermore, a convincing case for safety will have to be made for any new reactor designs. Of greater concern to the public and politicians are the issues of radioactive waste and proliferation of nuclear weapons. There is a consensus among technical experts that radioactive wastes from nuclear power can be disposed of safely and economically in deep geologic formations. However, the necessary political decisions to select sites for repositories need public support and understanding about what the industry is doing and what can be done. As to nuclear weapons proliferation, the existing safeguards system must be fully maintained and strengthened and inherently proliferation-resistant fuel cycles should be explored. Overviews of the future global energy demand and of the prospects for nuclear power in various economic regions of the world indicate that, in the case of the OECD countries, the dominant issue is economics in an increasingly free market system for electricity. For the so-called transition economies, countries of the Former Soviet Union and Central and Eastern Europe, the issue is one of managing nuclear power plant operations safely. In the case of developing countries, the dominant concern is effective management of technology, in addition to economics and finance. The prospects for nuclear power depend on the resolution of two cardinal issues. The first is economic competitiveness, and in particular, reduced capital cost. The second is public confidence in the ability of the industry to manage plant operations and its high level waste safely. There is a continuing need for dialogue and communication with all sectors of the public: economists, investors, social scientists, politicians, regulators, unions, and environmentalists. 
Of help in this dialogue would be nuclear power's relevance to and comparative advantages in addressing environmental issues, such as global climate change, local air quality, and regional acidification. Suggestions have been made for a globalized approach to critical nuclear power issues, such as waste management, innovative and proliferation-resistant reactors and fuel cycles, and international standards for new generation nuclear reactor designs.The conclusion seems to be that there is a role for nuclear energy in sustainable development, especially if greenhouse gas emissions are to be limited. Doubts persist in the minds of many energy experts over the pote

  19. Development of Data Analysis Software for Diagnostic Eddy Current Probe (D-probe) for Steam Generator Tube Inspection

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Myung Sik; Hur, Do Haeng; Kim, Kyung Mo; Han, Jung Ho; Lee, Deok Hyun [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2010-10-15

    Occurrences of stress corrosion cracking in the steam generator tubes of nuclear power plants are closely related to the residual stress existing in regions of geometric change, that is, expansion transitions, u-bends, dings, dents, bulges, etc. Therefore, information on the location, type and quantitative size of a geometric anomaly in a tube is a prerequisite for non-destructive inspection aimed at root cause analysis, early detection of a crack, and prediction of further crack evolution. KAERI developed an innovative eddy current probe, D-probe, equipped with the simultaneous dual functions of crack detection and 3-dimensional quantitative profile measurement. Its excellent performance has been verified through sampling inspections in several domestic nuclear power plants where various types of steam generator tube cracking were observed in operation. Qualified data analysis software must be furnished in order to deploy D-probe for pre- and in-service inspection of commercial power plants. This paper introduces the PC-Windows based eddy current data analysis software which is being developed for D-probe in cooperation with Zetec Inc.

  20. Preferred Site for Initiation of RNA Transcription by Escherichia coli RNA Polymerase Within the Simian Virus 40 DNA Segment of the Nondefective Adenovirus-Simian Virus 40 Hybrid Viruses Ad2+ND1 and Ad2+ND3

    Science.gov (United States)

    Zain, B. Sayeeda; Dhar, Ravi; Weissman, Sherman M.; Lebowitz, Paul; Lewis, Andrew M.

    1973-01-01

    The DNA of simian virus 40 (SV40) was transcribed into RNA by Escherichia coli RNA polymerase at 18 to 24 °C after synchronization of the initiation of RNA synthesis. After a brief synthetic period the RNA product contained relatively large amounts of sequences derived from a limited segment of SV40 DNA. The source for this pulse-labeled RNA was found to be a portion of the segment of SV40 DNA included within the nondefective adenovirus (Ad)-SV40 hybrid viruses, Ad2+ND1 and Ad2+ND3. After synthesis with [γ-32P]ATP, Ad2+ND1 and Ad2+ND3 DNA transcripts contained an initial sequence missing from Ad2 transcripts. This sequence was identified as an initiation sequence for polymerase transcription of the SV40 DNA. Thus, there is a preferred site for initiation of in vitro transcription on the segment of SV40 DNA common to the nondefective Ad2+ND1 and Ad2+ND3 hybrid viruses. Images PMID:4350713

  1. Proceedings of the 2nd Workshop on Robots in Clutter: Preparing robots for the real world (Berlin, 2013)

    OpenAIRE

    Zillich, Michael; Bennewitz, Maren; Fox, Maria; Piater, Justus; Pangercic, Dejan

    2013-01-01

    This volume represents the proceedings of the 2nd Workshop on Robots in Clutter: Preparing robots for the real world, held June 27, 2013, at the Robotics: Science and Systems conference in Berlin, Germany.

  2. Software verification for nuclear industry

    International Nuclear Information System (INIS)

    The paper considers why verification of software products is necessary throughout the software life cycle. Concepts of verification, software verification planning, and some verification methodologies for products generated throughout the software life cycle are then discussed.

  3. Future Control and Automation : Proceedings of the 2nd International Conference on Future Control and Automation

    CERN Document Server

    2012-01-01

    This volume, Future Control and Automation - Volume 1, includes the best papers selected from the 2012 2nd International Conference on Future Control and Automation (ICFCA 2012), held on July 1-2, 2012, in Changsha, China. Future control and automation is the use of control systems and information technologies to reduce the need for human work in the production of goods and services. On the basis of the classification of the manuscripts considered, this volume is divided into five sessions: Identification and Control, Navigation, Guidance and Sensor, Simulation Technology, Future Telecommunications and Control.

  4. All-optical 1st and 2nd order integration on a chip.

    Science.gov (United States)

    Ferrera, Marcello; Park, Yongwoo; Razzari, Luca; Little, Brent E; Chu, Sai T; Morandotti, Roberto; Moss, David J; Azaña, José

    2011-11-01

    We demonstrate all-optical temporal integration of arbitrary optical waveforms with temporal features as short as ~1.9 ps. By using a four-port micro-ring resonator based on CMOS-compatible doped glass technology we perform the 1st- and 2nd-order cumulative time integral of optical signals over a bandwidth that exceeds 400 GHz. This device has applications for a wide range of ultra-fast data processing and pulse shaping functions as well as in the field of optical computing for the real-time analysis of differential equations. PMID:22109195

  5. Results of the 2nd regular inspection in Unit 2 of the Hamaoka Nuclear Power Station

    International Nuclear Information System (INIS)

    The 2nd regular inspection in fiscal 1980 of Unit 2 in the Hamaoka Nuclear Power Station was made from September, 1980, to January, 1981, on its reactor and associated facilities. The inspection by external appearance examination, disassembling, leakage inspection and performance tests did not indicate any abnormality. Radiation exposure of the personnel during the inspection was less than the permissible dose. Radiation exposure data for the personnel are given in tables. The improvements and repair done accordingly were as follows: improvement of condensate recirculation piping, replacement in control-rod drives, and replacement in the power-range instrumentation. (J.P.N.)

  6. PREFACE: 2nd International Meeting for Researchers in Materials and Plasma Technology

    Science.gov (United States)

    Niño, Ely Dannier V.

    2013-11-01

    These proceedings present the written contributions of the participants of the 2nd International Meeting for Researchers in Materials and Plasma Technology, 2nd IMRMPT, which was held from February 27 to March 2, 2013 at the Pontificia Bolivariana - UPB and Industrial de Santander - UIS universities, Bucaramanga, Colombia, and was organized by the research groups GINTEP-UPB and FITEK-UIS. The IMRMPT was the second of the biennial meetings that began in 2011. The three-day scientific program of the 2nd IMRMPT consisted of 14 keynote lectures, 42 oral presentations and 48 poster presentations, with the participation of undergraduate and graduate students, professors, researchers and entrepreneurs from Colombia, Russia, France, Venezuela, Brazil, Uruguay, Argentina, Peru, Mexico and the United States, among others. The objectives of the IMRMPT were to bring together national and international researchers in order to establish scientific cooperation in the field of materials science and plasma technology; to introduce new surface treatment techniques that improve the properties of metals with respect to corrosion, hydrogen embrittlement, abrasion and hardness, among others; and to establish cooperation agreements between universities and industry. The topics covered in the 2nd IMRMPT include New Materials, Surface Physics, Laser and Hybrid Processes, Characterization of Materials, Thin Films and Nanomaterials, Surface Hardening Processes, Wear and Corrosion/Oxidation, Modeling, Simulation and Diagnostics, Plasma Applications and Technologies, Biomedical Coatings and Surface Treatments, Non Destructive Evaluation and Online Process Control, and Surface Modification (Ion Implantation, Ion Nitriding, PVD, CVD). The editors hope that those interested in the area of materials science and plasma technology enjoy reading these contributions, which reflect a wide range of topics. It is a pleasure to thank the sponsors and all the participants and contributors for making this international meeting of researchers possible. It should be noted that the event, organized by the UIS and UPB universities through their research groups FITEK and GINTEP, was a very significant contribution to the national and international scientific community, achieving interaction between research groups from academia and the business sector. On behalf of the research groups GINTEP-UPB and FITEK-UIS, we greatly appreciate the support provided by the sponsors, who made it possible to continue with the dream of research. Ely Dannier V-Niño, The Editor. The PDF file also contains a list of committees and sponsors.

  7. Future Control and Automation : Proceedings of the 2nd International Conference on Future Control and Automation

    CERN Document Server

    2012-01-01

    This volume, Future Control and Automation - Volume 2, includes the best papers from the 2012 2nd International Conference on Future Control and Automation (ICFCA 2012), held on July 1-2, 2012, in Changsha, China. Future control and automation is the use of control systems and information technologies to reduce the need for human work in the production of goods and services. On the basis of the classification of the manuscripts considered, this volume is divided into six sessions: Mathematical Modeling, Analysis and Computation, Control Engineering, Reliable Networks Design, Vehicular Communications and Networking, Automation and Mechatronics.

  8. Vibration Theory, Vol. 3 : linear stochastic vibration theory, 2nd ed.

    DEFF Research Database (Denmark)

    Nielsen, SØren R. K.

    1997-01-01

    The present textbook is based on previous lecture notes for a course on stochastic vibration theory given in the 9th semester at Aalborg University for M.Sc. students in structural engineering. The present 2nd edition of this textbook on linear stochastic vibration theory is basically unchanged in comparison to the 1st edition. Only section 4.2 on single input - single output systems and chapter 6 on offshore structures have been modified in order to enhance clarity.

  9. Collection of documents in the 2nd information exchange meeting on radioactive waste disposal research network

    International Nuclear Information System (INIS)

    The 2nd meeting on 'Radioactive Waste Disposal Research Network' was held at the Nagoya University Museum on March 30, 2007. The 'Radioactive Waste Disposal Research Network' was established in Interorganization Atomic Energy Research Program under academic collaborative agreement between Japan Atomic Energy Agency and the University of Tokyo. The objective is to develop both research infrastructures and human expertise in Japan to an adequate performance level, thereby contributing to the development of the fundamental research in the field of radioactive waste disposal. This material is a collection of presentations and discussions during the information exchange meeting. (author)

  10. Treatment of osteoarthritis of the knee: evidence-based guideline, 2nd edition.

    Science.gov (United States)

    Jevsevar, David S

    2013-09-01

    Treatment of Osteoarthritis of the Knee: Evidence-Based Guideline, 2nd Edition, is based on a systematic review of the current scientific and clinical research. This guideline contains 15 recommendations, replaces the 2008 AAOS clinical practice guideline, and was reevaluated earlier than the 5-year recommendation of the National Guideline Clearinghouse because of methodologic concerns regarding the evidence used in the first guideline. The current guideline does not support the use of viscosupplementation for the treatment of osteoarthritis of the knee. In addition, the work group highlighted the need for better research in the treatment of knee osteoarthritis. PMID:23996988

  11. Roles of doping ions in afterglow properties of blue CaAl2O4:Eu2+,Nd3+ phosphors

    Science.gov (United States)

    Wako, A. H.; Dejene, B. F.; Swart, H. C.

    2014-04-01

    Eu2+ doped and Nd3+ co-doped calcium aluminate (CaAl2O4:Eu2+,Nd3+) phosphor was prepared by a urea-nitrate solution combustion method at furnace temperatures as low as 500 °C. The produced CaAl2O4:Eu2+,Nd3+ powder was investigated in terms of phase composition, morphology and luminescence by X-Ray diffraction (XRD), Scanning Electron Microscope (SEM), Fourier Transform Infra Red spectroscopy (FTIR) and Photoluminescence (PL) techniques respectively. XRD analysis depicts a dominant monoclinic phase that indicates no change in the crystalline structure of the phosphor with varying concentration of Eu2+ and Nd3+. SEM results show agglomerates with non-uniform shapes and sizes with a number of irregular network structures having lots of voids and pores. The Energy Dispersive X-ray Spectroscopy (EDS) and (FTIR) spectra confirm the expected chemical components of the phosphor. PL measurements indicated one broadband excitation spectra from 200 to 300 nm centered around 240 nm corresponding to the crystal field splitting of the Eu2+ d-orbital and an emission spectrum in the blue region with a maximum on 440 nm. This is a strong indication that there was dominantly one luminescence center, Eu2+ which represents emission from transitions between the 4f7 ground state and the 4f6-5d1 excited state configuration. High concentrations of Eu2+ and Nd3+ generally reduce both intensity and lifetime of the phosphor powders. The optimized content of Eu2+ is 1 mol% and for Nd3+ is 1 mol% for the obtained phosphors with excellent optical properties. The phosphor also emits visible light at around 587 and 616 nm. Such emissions can be ascribed to the 5D0-7F1 and 5D0-7F2 intrinsic transition of Eu3+ respectively. The decay characteristics exhibit a significant rise in initial intensity with increasing Eu2+ doping concentration while the decay time increased with Nd3+ co-doping. The observed afterglow can be ascribed to the generation of suitable traps due to the presence of the Nd3+ ions.

  12. Roles of doping ions in afterglow properties of blue CaAl2O4:Eu2+,Nd3+ phosphors

    International Nuclear Information System (INIS)

    Eu2+ doped and Nd3+ co-doped calcium aluminate (CaAl2O4:Eu2+,Nd3+) phosphor was prepared by a urea-nitrate solution combustion method at furnace temperatures as low as 500 °C. The produced CaAl2O4:Eu2+,Nd3+ powder was investigated in terms of phase composition, morphology and luminescence by X-Ray diffraction (XRD), Scanning Electron Microscope (SEM), Fourier Transform Infra Red spectroscopy (FTIR) and Photoluminescence (PL) techniques respectively. XRD analysis depicts a dominant monoclinic phase that indicates no change in the crystalline structure of the phosphor with varying concentration of Eu2+ and Nd3+. SEM results show agglomerates with non-uniform shapes and sizes with a number of irregular network structures having lots of voids and pores. The Energy Dispersive X-ray Spectroscopy (EDS) and (FTIR) spectra confirm the expected chemical components of the phosphor. PL measurements indicated one broadband excitation spectra from 200 to 300 nm centered around 240 nm corresponding to the crystal field splitting of the Eu2+ d-orbital and an emission spectrum in the blue region with a maximum on 440 nm. This is a strong indication that there was dominantly one luminescence center, Eu2+ which represents emission from transitions between the 4f7 ground state and the 4f6–5d1 excited state configuration. High concentrations of Eu2+ and Nd3+ generally reduce both intensity and lifetime of the phosphor powders. The optimized content of Eu2+ is 1 mol% and for Nd3+ is 1 mol% for the obtained phosphors with excellent optical properties. The phosphor also emits visible light at around 587 and 616 nm. Such emissions can be ascribed to the 5D0–7F1 and 5D0–7F2 intrinsic transition of Eu3+ respectively. The decay characteristics exhibit a significant rise in initial intensity with increasing Eu2+ doping concentration while the decay time increased with Nd3+ co-doping. The observed afterglow can be ascribed to the generation of suitable traps due to the presence of the Nd3+ ions.

  13. Massive coordination of dispersed generation using PowerMatcher based software agents

    International Nuclear Information System (INIS)

    One of the outcomes of the EU Fifth Framework CRISP project (http://crisp.ecn.nl/) has been the development of a real-time control strategy based on the application of distributed intelligence (ICT) to coordinate demand and supply in electricity grids. This PowerMatcher approach has been validated in two real-life, real-time field tests. The experiments aimed at the controlled coordination of dispersed electricity suppliers (DG-RES) and demanders in distribution grids enabled by ICT networks. Optimization objectives for the technology in the tests were minimization of imbalance in a commercial portfolio and mitigation of strong load variations in a distribution network with residential micro-CHPs. With respect to the number of ICT nodes, the field tests were relatively small-scale. However, application of the technology yielded very encouraging results on both occasions. In the present paper, lessons learned from the field experiments are discussed. Furthermore, it contains an account of the roadmap for scaling up these field tests with a larger number of nodes and more diverse appliance/installation types. Due to its autonomous decision-making agent paradigm, the PowerMatcher software technology is expected to be far more scalable than central coordination approaches. Indeed, it is based on microeconomic theory and is expected to work best if it is applied on a massive scale in transparent market settings. A set of various types of supply and demand appliances was defined and implemented in a PowerMatcher software simulation environment. A large number of these PowerMatcher node-agents, each representing such a device type, was utilized in a number of scenario calculations. As the production of DG-RES resources and the demand profiles depend strongly on the time of year, climate scenarios leading to operational snapshots of the cluster were taken for a number of representative periods. The results of these larger scale simulations, as well as the scalability issues encountered, are discussed. Further issues covered are the stability of the system, as reflected by the internal price development pattern that acts as an 'invisible hand' to reach the common optimisation goal. Finally, the effects of scaling up the technology are discussed in terms of possible 'emergent behaviour' of subsets in the cluster and the primary process quality of appliances operating concertedly using the PowerMatcher.
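
    A minimal sketch of the market-based coordination idea described above: each device agent submits a demand-versus-price bid, and an auctioneer searches for the equilibrium price at which aggregate demand and supply balance. The bid shapes and the bisection search are illustrative assumptions, not the actual PowerMatcher implementation.

        # Hedged sketch of market-based coordination in the spirit of PowerMatcher.
        # Each agent returns its power demand (positive) or supply (negative) as a
        # function of price; the auctioneer bisects for the price where the sum is zero.
        # Bid shapes and parameters are illustrative, not the real PowerMatcher bids.

        def chp_agent(price):
            """Micro-CHP: supplies more electricity (negative demand) as the price rises."""
            return -min(1.0, max(0.0, (price - 10.0) / 20.0))   # kW

        def heater_agent(price):
            """Flexible heater: consumes less as the price rises."""
            return max(0.0, 2.0 - price / 15.0)                 # kW

        def clear_market(agents, lo=0.0, hi=100.0, iters=60):
            """Find the equilibrium price where aggregate demand is (approximately) zero."""
            def aggregate(p):
                return sum(agent(p) for agent in agents)
            for _ in range(iters):
                mid = 0.5 * (lo + hi)
                if aggregate(mid) > 0:       # net demand -> raise the price
                    lo = mid
                else:                        # net supply -> lower the price
                    hi = mid
            return 0.5 * (lo + hi)

        agents = [chp_agent, heater_agent]
        price = clear_market(agents)
        print(f"equilibrium price: {price:.2f}, net power: {sum(a(price) for a in agents):+.3f} kW")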

  14. Standardizing the next generation of bioinformatics software development with BioHDF (HDF5).

    Science.gov (United States)

    Mason, Christopher E; Zumbo, Paul; Sanders, Stephan; Folk, Mike; Robinson, Dana; Aydt, Ruth; Gollery, Martin; Welsh, Mark; Olson, N Eric; Smith, Todd M

    2010-01-01

    Next Generation Sequencing technologies are limited by the lack of standard bioinformatics infrastructures that can reduce data storage, increase data processing performance, and integrate diverse information. HDF technologies address these requirements and have a long history of use in data-intensive science communities. They include general data file formats, libraries, and tools for working with the data. Compared to emerging standards, such as the SAM/BAM formats, HDF5-based systems demonstrate significantly better scalability, can support multiple indexes, store multiple data types, and are self-describing. For these reasons, HDF5 and its BioHDF extension are well suited for implementing data models to support the next generation of bioinformatics applications. PMID:20865556
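
    A minimal sketch (using the h5py package) of the kind of HDF5 layout the abstract alludes to, storing aligned reads with compression, self-describing metadata and random access. The group and dataset names are illustrative assumptions and do not follow the actual BioHDF schema.

        # Hedged sketch: storing short-read data in HDF5 with h5py.
        # Group/dataset names are illustrative and do not follow the BioHDF schema.
        import h5py
        import numpy as np

        reads = ["ACGTACGTAC", "TTGCAGGCTA", "GGCATCGTAA"]
        positions = np.array([1045, 20311, 98700], dtype=np.int64)   # alignment positions
        quals = np.array([37, 32, 40], dtype=np.uint8)               # mean base quality

        with h5py.File("reads_demo.h5", "w") as f:
            grp = f.create_group("alignments/chr1")
            grp.create_dataset("sequence", data=reads,
                               dtype=h5py.string_dtype(), compression="gzip")
            grp.create_dataset("position", data=positions, compression="gzip")
            grp.create_dataset("mean_quality", data=quals, compression="gzip")
            grp.attrs["reference"] = "chr1"          # self-describing metadata

        # Random access later, without loading everything into memory:
        with h5py.File("reads_demo.h5", "r") as f:
            pos = f["alignments/chr1/position"]
            print(pos[1], f["alignments/chr1/sequence"][1])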

  15. PIPS Is not (just) Polyhedral Software Adding GPU Code Generation in PIPS

    OpenAIRE

    Amini, Mehdi; Ancourt, Corinne; Coelho, Fabien; Creusillet, Be?atrice; Guelton, Serge; Irigoin, Franc?ois; Jouvelot, Pierre; Keryell, Ronan; Villalon, Pierre

    2011-01-01

    Parallel and heterogeneous computing are growing in audience thanks to the increased performance brought by ubiquitous manycores and GPUs. However, available programming models, like OPENCL or CUDA, are far from being straightforward to use. As a consequence, several automated or semi-automated approaches have been proposed to automatically generate hardware-level codes from high-level sequential sources. Polyhedral models are becoming more popular because of their combination of expressivene...

  16. Development of new generation software tools for simulation of electron beam formation in novel high power gyrotrons

    Energy Technology Data Exchange (ETDEWEB)

    Sabchevski, S [Institute of Electronics, Bulgarian Academy of Sciences, BG-1784 Sofia (Bulgaria); Zhelyazkov, I [Faculty of Physics, Sofia University, BG-1164 Sofia (Bulgaria); Benova, E [Faculty of Physics, Sofia University, BG-1164 Sofia (Bulgaria); Atanassov, V [Institute of Electronics, Bulgarian Academy of Sciences, BG-1784 Sofia (Bulgaria); Dankov, P [Faculty of Physics, Sofia University, BG-1164 Sofia (Bulgaria); Thumm, M [Forschungszentrum Karlsruhe, Association EURATOM-FZK, Institute for Pulsed Power and Microwave Technology, D-76021 Karlsruhe (Germany); Dammertz, G [University of Karlsruhe, Institute of High Frequency Techniques and Electronics, D-76128 Karlsruhe (Germany); Piosczyk, B [Forschungszentrum Karlsruhe, Association EURATOM-FZK, Institute for Pulsed Power and Microwave Technology, D-76021 Karlsruhe (Germany); Illy, S [Forschungszentrum Karlsruhe, Association EURATOM-FZK, Institute for Pulsed Power and Microwave Technology, D-76021 Karlsruhe (Germany); Tran, M Q [Centre de Recherches en Physique des Plasmas, Association EURATOM-CRPP, Ecole Polytechnique Federale de Lausanne, CH-1015 Lausanne (Switzerland); Alberti, S [Centre de Recherches en Physique des Plasmas, Association EURATOM-CRPP, Ecole Polytechnique Federale de Lausanne, CH-1015 Lausanne (Switzerland); Hogge, J-Ph [Centre de Recherches en Physique des Plasmas, Association EURATOM-CRPP, Ecole Polytechnique Federale de Lausanne, CH-1015 Lausanne (Switzerland)

    2006-07-15

    Computer aided design (CAD) based on numerical experiments performed by using adequate physical models and efficient simulation codes is an indispensable tool for development, investigation, and optimization of gyrotrons used as radiation sources for electron cyclotron resonance heating (ECRH) of fusion plasmas. In this paper, we review briefly the state-of-the-art in the field of modelling and simulation of intense, relativistic, helical electron beams formed in the electron-optical systems (EOS) of powerful gyrotrons. We discuss both the limitations of the known computer codes and the requirements for increasing their capabilities for solution of various design problems that are being envisaged in the development of the next generation gyrotrons for ECRH. Moreover, we present the concept followed by us in an attempt to unite the advantages of the modern programming techniques with self-consistent, first-principles 3D physical models in the creation of a new highly efficient and versatile software package for simulation of powerful gyrotrons.

  17. All in a Day's Work:Careers Using Science, 2nd Edition

    Science.gov (United States)

    Megan Sullivan

    2008-06-01

    "Almost all careers in the 21st century require a working knowledge of science and mathematics," says Steve Metz, The Science Teacher field editor, in his introduction to All in a Day's Work, 2nd edition . "The pending retirement of 78 millions baby boomers can only add to the need for science and mathematics training, as companies begin recruiting replacement workers in science fields, sometimes--believe it or not--as early as middle school!" This expanded second edition will help you give students an exciting look at the vast array of jobs built on a foundation of science, including: ? the expected--high school science teacher, microbiologist, and radiation therapist, ? the unexpected--bomb investigator, space architect, and musical acoustics scientist, the adventurous--astronaut, deep-cave explorer, and dinosaur paleontologist, and ? the offbeat-- shark advocate, roller coaster designer, and oyster wrangler All in a Day's Work, 2nd edition is a compendium of 49 of the popular "Career of the Month" columns from the NSTA high school journal The Science Teacher . Each column profiles a person in a science-related job and can be reproduced and shared with your high school students as they make career and education plans. Each profile includes suggestions about how to find additional career information, including links to websites and relevant professional organizations and interest groups.

  18. Proceedings of the 2nd JAERI symposium on HTGR technologies, October 21-23, 1992, Oarai, Japan

    International Nuclear Information System (INIS)

    The Japan Atomic Energy Research Institute (JAERI) held the 2nd JAERI Symposium on HTGR Technologies on October 21 to 23, 1992, at the Oarai Park Hotel in Oarai-machi, Ibaraki-ken, Japan, with the support of the International Atomic Energy Agency (IAEA), the Science and Technology Agency of Japan and the Atomic Energy Society of Japan, on the occasion that construction of the High Temperature Engineering Test Reactor (HTTR), the first high temperature gas-cooled reactor (HTGR) in Japan, is proceeding smoothly. In the symposium, the worldwide status of research and development (R and D) on HTGRs and the future perspectives of HTGR development were discussed in 47 papers, including 3 invited lectures, focusing on the present status of HTGR projects and perspectives of HTGR development, safety, operation experience, fuel and heat utilization. A panel discussion was also organized on how HTGRs can contribute to the preservation of the global environment. About 280 participants attended the symposium from Japan, Bangladesh, Germany, France, Indonesia, People's Republic of China, Poland, Russia, Switzerland, United Kingdom, United States of America, Venezuela and the IAEA. This paper was edited as the proceedings of the 2nd JAERI Symposium on HTGR Technologies, collecting the 47 papers presented in the oral and poster sessions along with 11 panel exhibitions on the results of research and development associated with the HTTR. (author)

  19. Proceedings of the 2nd technical meeting on high temperature gas-cooled reactors

    International Nuclear Information System (INIS)

    With a view to establishing and upgrading the technology basis of HTGRs, the 2nd Technical Meeting on High Temperature Gas-cooled Reactors (HTGRs) was held on March 11 and 12, 1992, at the Tokai Research Establishment in order to review the present status and results of research and development (R and D) on HTGRs, to discuss the R and D items that should be promoted more actively in the future, and thereby to help determine the strategy for the development of high temperature engineering and testing in JAERI. At the 2nd Technical Meeting, which followed the 1st Technical Meeting held in February 1990 at the Tokai Research Establishment, expectations for the High Temperature Engineering Test Reactor (HTTR), possible contributions of HTGRs to the preservation of the global environment and the prospects of HTGRs were discussed by experts from JAERI as well as universities, national institutes and industry, focusing on the R and D of safety, high temperature components and process heat utilization. These proceedings summarize the papers presented in the oral sessions and the materials exhibited in the poster session at the meeting, and will be valuable as key material for promoting R and D on HTGRs. (author)

  20. The MeqTrees software system and its use for third-generation calibration of radio interferometers

    Science.gov (United States)

    Noordam, J. E.; Smirnov, O. M.

    2010-12-01

    Context. The formulation of the radio interferometer measurement equation (RIME) for a generic radio telescope by Hamaker et al. has provided us with an elegant mathematical apparatus for better understanding, simulation and calibration of existing and future instruments. The calibration of the new radio telescopes (LOFAR, SKA) would be unthinkable without the RIME formalism, and new software to exploit it. Aims: The MeqTrees software system is designed to implement numerical models, and to solve for arbitrary subsets of their parameters. It may be applied to many problems, but was originally geared towards implementing Measurement Equations in radio astronomy for the purposes of simulation and calibration. The technical goal of MeqTrees is to provide a tool for rapid implementation of such models, while offering performance comparable to hand-written code. We are also pursuing the wider goal of increasing the rate of evolution of radio astronomical software, by offering a tool that facilitates rapid experimentation, and exchange of ideas (and scripts). Methods: MeqTrees is implemented as a Python-based front-end called the meqbrowser, and an efficient (C++-based) computational back-end called the meqserver. Numerical models are defined on the front-end via a Python-based Tree Definition Language (TDL), then rapidly executed on the back-end. The use of TDL facilitates an extremely short turn-around time (hours rather than weeks or months) for experimentation with new ideas. This is also helped by unprecedented visualization capabilities for all final and intermediate results. A flexible data model and a number of important optimizations in the back-end ensures that the numerical performance is comparable to that of hand-written code. Results: MeqTrees is already widely used as the simulation tool for new instruments (LOFAR, SKA) and technologies (focal plane arrays). It has demonstrated that it can achieve a noise-limited dynamic range in excess of a million, on WSRT data. It is the only package that is specifically designed to handle what we propose to call third-generation calibration (3GC), which is needed for the new generation of giant radio telescopes, but can also improve the calibration of existing instruments.
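
    As a reminder of the formalism referred to above, the RIME for a single baseline pq can be written in the 2x2 form of Hamaker et al. as V_pq = J_p B J_q^H, where B is the source brightness matrix and J_p, J_q are the cumulative Jones matrices of the two stations. The sketch below evaluates this with numpy for one point source; the gain values and source parameters are illustrative assumptions, and this is not MeqTrees/TDL code.

        # Hedged sketch: evaluating the 2x2 RIME, V_pq = J_p * B * J_q^H, with numpy.
        # Gains and source parameters are illustrative; this is not MeqTrees/TDL code.
        import numpy as np

        def jones_gain(gx, gy):
            """Diagonal complex gain Jones matrix for one station."""
            return np.diag([gx, gy]).astype(complex)

        def brightness(I, Q=0.0, U=0.0, V=0.0):
            """Brightness matrix for a point source in a linear-feed basis."""
            return np.array([[I + Q, U + 1j * V],
                             [U - 1j * V, I - Q]], dtype=complex)

        J_p = jones_gain(1.02 * np.exp(1j * 0.10), 0.98 * np.exp(-1j * 0.05))
        J_q = jones_gain(0.95 * np.exp(-1j * 0.20), 1.05 * np.exp(1j * 0.15))
        B = brightness(I=1.0, Q=0.05)

        V_pq = J_p @ B @ J_q.conj().T     # predicted visibility for baseline pq
        print(np.round(V_pq, 4))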

  1. Cataloging of the Northern Sky from the POSS-II using a Next-Generation Software Technology

    Science.gov (United States)

    Djorgovski, S. G.; Weir, N.; Fayyad, U.

    Digitization of the Second Palomar Observatory Sky Survey (POSS-II) is now in progress at STScI. The resulting data set, the Palomar-STScI Digital Sky Survey (DPOSS), will consist of about 3 TB of pixel data. In order to extract useful information from this data set quickly, uniformly, and efficiently, we have developed a software system to catalog, calibrate, classify, maintain, and analyse the scans, called Sky Image Cataloging and Analysis Tool (SKICAT). It is a suite of programs designed to facilitate the maintenance and analysis of astronomical surveys comprised of multiple, overlapping images and/or catalogs. The system serves three principal functions: catalog construction (including object classification), catalog management, and catalog analysis. It provides a powerful, integrated environment for the manipulation and scientific investigation of catalogs from virtually any source. The system is a testbed for practical astronomical applications of AI technology, including machine learning, expert systems, etc., used for astronomical catalog generation and analysis. The system also provides tools to merge these catalogs into a large, complex database which may be easily queried, modified, and upgraded (e.g., as more or better calibration data are added). For example, we make considerable use of the GID3* decision tree induction software. The resulting Palomar Northern Sky Catalog (PNSC) is expected to contain galaxies and stars in 3 colors, down to the survey limiting magnitude, with the star-galaxy classification accurate to 90--95 percent near the limit. The catalog will be continuously upgraded as more calibration data become available. It will be made available to the community via computer networks and/or suitable media, probably in installments, as soon as scientific validation and quality checks are completed. Analysis software (parts of SKICAT) will also be freely available. A vast variety of scientific projects will be possible with this database, including the studies of large-scale structure, Galactic structure, automatic identifications of sources from other wavelengths (radio through x-ray), generation of objective catalogs of clusters and groups of galaxies, searches for quasars, variable or extreme-color objects, etc.
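    GID3* itself is specific to the SKICAT project; purely as an illustration of the decision-tree induction idea behind such a star-galaxy classifier, a generic sketch using scikit-learn on hypothetical image-derived features (magnitude, ellipticity, FWHM, surface brightness) might look like this:

```python
# Illustrative only: SKICAT uses the GID3* inducer; this sketch uses scikit-learn's
# generic CART trees on hypothetical image-derived features to show the idea of
# learning a star/galaxy classifier from a training catalog.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical training catalog: rows = detected objects, columns = measured features.
n = 2000
features = np.column_stack([
    rng.uniform(14, 21, n),        # magnitude
    rng.uniform(0.0, 0.6, n),      # ellipticity
    rng.uniform(1.0, 6.0, n),      # FWHM (arcsec)
    rng.uniform(18, 26, n),        # mean surface brightness
])
# Hypothetical labels (0 = star, 1 = galaxy); in practice these would come from
# higher-resolution calibration frames.
labels = (features[:, 2] > 2.5).astype(int)

X_train, X_test, y_train, y_test = train_test_split(features, labels, random_state=0)

clf = DecisionTreeClassifier(max_depth=5).fit(X_train, y_train)
print("hold-out accuracy:", clf.score(X_test, y_test))
```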

  2. A Facilitated Interface to Generate a Combined Textual and Graphical Database System Using Widely Available Software

    Directory of Open Access Journals (Sweden)

    Corey Lawson

    2012-10-01

    Full Text Available A Data-Base Management System (DBMS) is the current standard for storing information. A DBMS organizes and maintains a structure of storage of data. Databases make it possible to store vast amounts of randomly created information and then retrieve items using associative reasoning in search routines. However, the design of databases is cumbersome. If one is to use a database primarily to directly input information, each field must be predefined manually, and the fields must be organized to permit coherent data input. This static requirement is problematic and requires that database table(s) be predefined and customized at the outset, a difficult proposition since current DBMS lack a user-friendly front end to allow flexible design of the input model. Furthermore, databases are primarily text based, making it difficult to process graphical data. We have developed a general and nonproprietary approach to the problem of input modeling designed to make use of the known informational architecture to map data to a database and then retrieve the original document in freely editable form. We create form templates using ordinary word processing software: Microsoft InfoPath 2007. Each field in the form is given a unique name identifier in order to be distinguished in the database. It is possible to export text-based documents created initially in Microsoft Word by placing a colon at the beginning of any desired field location. InfoPath then captures the preceding string and uses it as the label for the field. Each form can be structured in a way to include any combination of both textual and graphical fields. We input data into InfoPath templates. We then submit the data through a web service to populate fields in an SQL database. By appropriate indexing, we can then recall the entire document from the SQL database for editing, with a corresponding audit trail. Graphical data is handled no differently than textual data and is embedded in the database itself, permitting direct query approaches. This technique makes it possible for general users to benefit from a combined text-graphical database environment with a flexible non-proprietary interface. Consequently, any template can be effortlessly transformed to a database system and easily recovered in a narrative form.
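    The paper's pipeline uses InfoPath forms submitting to SQL Server through a web service; as a minimal sketch of the underlying storage idea only (named fields, textual or graphical, keyed by a document identifier so the whole document can be recalled), here is a hypothetical example using Python's built-in sqlite3 rather than the authors' stack:

```python
# Minimal sketch of the storage idea only: named fields from a form, both text and
# image blobs, keyed by a document id so the whole document can be recalled later.
import sqlite3

conn = sqlite3.connect("forms.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS form_fields (
        doc_id     TEXT,
        field_name TEXT,     -- unique name identifier given to each form field
        text_value TEXT,     -- populated for textual fields
        blob_value BLOB,     -- populated for graphical fields
        PRIMARY KEY (doc_id, field_name)
    )
""")

def submit_field(doc_id, field_name, text=None, blob=None):
    """Insert or update one named field of a document."""
    conn.execute(
        "INSERT OR REPLACE INTO form_fields VALUES (?, ?, ?, ?)",
        (doc_id, field_name, text, blob),
    )
    conn.commit()

def recall_document(doc_id):
    """Return all fields of a document, e.g. to rebuild an editable form."""
    cur = conn.execute(
        "SELECT field_name, text_value, blob_value FROM form_fields WHERE doc_id = ?",
        (doc_id,),
    )
    return {name: text if blob is None else blob for name, text, blob in cur}

submit_field("patient-001", "History", text="Presented with ...")
submit_field("patient-001", "Sketch", blob=b"\x89PNG...")   # raw image bytes
print(recall_document("patient-001").keys())
```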

  3. Free-radical Oxidation at a Diabetes Mellitus of the 2nd Type: Sources of Formation, Components, Pathogenetic Mechanisms of Toxicity

    Directory of Open Access Journals (Sweden)

    O.V. Zanozina

    2010-07-01

    Full Text Available The new data on oxidative stress, particularly in patients with diabetes mellitus, are analyzed. The peculiarities of free-radical oxidation in this pathology are distinguished, and the sources of increased free radical generation are specified, including not only the six pathways of altered glucose metabolism but also the consequences of hyperinsulinemia, as well as the disturbance of antioxidant maintenance and functional activity due to glycolysis. The interrelation of lipid peroxidation and the oxidative modification of proteins is assessed; the relationship of free radical generation with carbonyl stress and glycolysis is emphasized, which can be regarded as a mutual aggravation syndrome. Different viewpoints on the components of oxidative stress in diabetes mellitus of the 2nd type are presented.

  4. PCID and ASPIRE 2.0 - The Next Generation of AMOS Image Processing Software

    Science.gov (United States)

    Matson, C.; Soo Hoo, T.; Murphy, M.; Calef, B.; Beckner, C.; You, S.

    One of the missions of the Air Force Maui Optical and Supercomputing (AMOS) site is to generate high-resolution images of space objects using the Air Force telescopes located on Haleakala. Because atmospheric turbulence greatly reduces the resolution of space object images collected with ground-based telescopes, methods for overcoming atmospheric blurring are necessary. One such method is the use of adaptive optics systems to measure and compensate for atmospheric blurring in real time. A second method is to use image restoration algorithms on one or more short-exposure images of the space object under consideration. At AMOS, both methods are used routinely. In the case of adaptive optics, rarely can all atmospheric turbulence effects be removed from the imagery, so image restoration algorithms are useful even for adaptive-optics-corrected images. Historically, the bispectrum algorithm has been the primary image restoration algorithm used at AMOS. It has the advantages of being extremely fast (processing times of less than one second) and insensitive to atmospheric phase distortions. In addition, multi-frame blind deconvolution (MFBD) algorithms have also been used for image restoration. It has been observed empirically and with the use of computer simulation studies that MFBD algorithms produce higher-resolution image restorations than does the bispectrum algorithm. MFBD algorithms also do not need separate measurements of a star in order to work. However, in the past, MFBD algorithms have been factors of one hundred or more slower than the bispectrum algorithm, limiting their use to non-time-critical image restorations. Recently, with the financial support of AMOS and the High-Performance Computing Modernization Office, an MFBD algorithm called Physically-Constrained Iterative Deconvolution (PCID) has been efficiently parallelized and is able to produce image restorations in only a few seconds. In addition, with the financial support of AFOSR, it has been shown that PCID achieves or closely approaches the theoretical limits to image restoration quality for a variety of scenarios. For these reasons, PCID is now being transitioned to being the site-wide image restoration algorithm. Because the algorithm can be complicated to use, a GUI is being developed to be the front end to the PCID algorithm. This interface, called the Advanced SPeckle Image Reconstruction Environment (ASPIRE) version 2.0, is the next generation of the current ASPIRE GUI used as a front end to the bispectrum algorithm. ASPIRE 2.0 will be the front-end GUI to PCID, the bispectrum algorithm, and the AMOSphere database. In this presentation we describe ASPIRE 2.0 and PCID and how to use them to obtain high-resolution images.
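    PCID itself is a parallelized multi-frame blind deconvolution code; purely as a conceptual sketch of what "physically-constrained iterative deconvolution" means in its simplest form (a data-fit gradient step followed by a positivity projection, for a single frame with a known PSF, which is far simpler than the real multi-frame blind algorithm), one might write:

```python
# Conceptual sketch only: single-frame deconvolution with a *known* PSF by projected
# gradient descent (data-fit term + positivity constraint). The real PCID algorithm
# jointly estimates the object and the PSFs of many frames.
import numpy as np

def deconvolve_positive(data, psf, n_iter=200, step=1.0):
    """Minimize ||PSF * obj - data||^2 subject to obj >= 0 (FFT-based convolution)."""
    Psf = np.fft.fft2(np.fft.ifftshift(psf), s=data.shape)
    obj = np.maximum(data, 0.0)                      # start from the blurred data
    for _ in range(n_iter):
        resid = np.real(np.fft.ifft2(Psf * np.fft.fft2(obj))) - data
        grad = np.real(np.fft.ifft2(np.conj(Psf) * np.fft.fft2(resid)))
        obj = np.maximum(obj - step * grad, 0.0)     # gradient step + positivity projection
    return obj

# Tiny synthetic test: a point source blurred by a Gaussian PSF.
y, x = np.mgrid[-16:16, -16:16]
psf = np.exp(-(x**2 + y**2) / (2 * 2.0**2)); psf /= psf.sum()
truth = np.zeros((32, 32)); truth[16, 16] = 1.0
blurred = np.real(np.fft.ifft2(np.fft.fft2(truth) * np.fft.fft2(np.fft.ifftshift(psf))))
restored = deconvolve_positive(blurred, psf)
print("peak value before:", blurred.max(), "after:", restored.max())
```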

  5. GONe: Software for estimating effective population size in species with generational overlap

    Science.gov (United States)

    Coombs, J.A.; Letcher, B.H.; Nislow, K.H.

    2012-01-01

    GONe is a user-friendly, Windows-based program for estimating effective size (Ne) in populations with overlapping generations. It uses the Jorde-Ryman modification to the temporal method to account for age structure in populations. This method requires estimates of age-specific survival and birth rate and allele frequencies measured in two or more consecutive cohorts. Allele frequencies are acquired by reading in genotypic data from files formatted for either GENEPOP or TEMPOFS. For each interval between consecutive cohorts, Ne is estimated at each locus and over all loci. Furthermore, Ne estimates are output for three different genetic drift estimators (Fs, Fc and Fk). Confidence intervals are derived from a chi-square distribution with degrees of freedom equal to the number of independent alleles. GONe has been validated over a wide range of Ne values, and for scenarios where survival and birth rates differ between sexes, sex ratios are unequal and reproductive variances differ. GONe is freely available for download at. © 2011 Blackwell Publishing Ltd.
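    GONe implements the Jorde-Ryman age-structured modification; as a simplified sketch of the plain temporal method it builds on (Nei-Tajima's Fc estimator with Waples' sampling correction, ignoring overlapping generations), assuming allele frequencies from two cohorts sampled t generations apart:

```python
# Simplified sketch of the classical temporal method; GONe's actual estimator
# additionally applies the Jorde-Ryman correction for overlapping generations
# (age-specific survival and birth rates), which is omitted here.
import numpy as np

def fc_single_locus(x, y):
    """Nei-Tajima Fc for one locus; x, y are allele-frequency vectors summing to 1."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    terms = (x - y) ** 2 / ((x + y) / 2.0 - x * y)
    return terms.mean()

def ne_temporal(x, y, t, s0, st):
    """Point estimate of Ne from two samples of s0 and st diploid individuals
    separated by t generations (discrete-generation approximation)."""
    fc = fc_single_locus(x, y)
    denom = 2.0 * (fc - 1.0 / (2 * s0) - 1.0 / (2 * st))
    return t / denom if denom > 0 else float("inf")

# Example: one locus with three alleles, cohorts sampled 2 generations apart.
x = [0.50, 0.30, 0.20]   # cohort 1 allele frequencies
y = [0.38, 0.40, 0.22]   # cohort 2 allele frequencies
print(ne_temporal(x, y, t=2, s0=50, st=50))
```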

  6. Move Table: An Intelligent Software Tool for Optimal Path Finding and Halt Schedule Generation

    Directory of Open Access Journals (Sweden)

    Anupam Agrawal

    2007-09-01

    Full Text Available This study aims to help army officials in taking decisions before war to decide the optimal path for army troops moving between two points in a real world digital terrain, considering factors like traveled distance, terrain type, terrain slope, and road network. There can optionally be one or more enemies (obstacles) located on the terrain which should be avoided. A tile-based A* search strategy with diagonal distance and tie-breaker heuristics is proposed for finding the optimal path between source and destination nodes across a real-world 3-D terrain. A performance comparison (time analysis, search space analysis, and accuracy) has been made between the multiresolution A* search and the proposed tile-based A* search for large-scale digital terrain maps. Different heuristics, which are used by the algorithms to guide these to the goal node, are presented and compared to overcome some of the computational constraints associated with path finding on large digital terrains. Finally, a halt schedule is generated using the optimal path, weather condition, moving time, priority and type of a column, so that the senior military planners can strategically decide in advance the time and locations where the troops have to halt or overtake other troops depending on their priority and also the time of reaching the destination.
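    As a rough illustration of a tile-based A* search with a diagonal-distance heuristic and a small tie-breaker (the grid, move costs and obstacle handling here are simplified assumptions, not the authors' terrain model):

```python
# Sketch of tile-based A* on an 8-connected grid with a diagonal-distance heuristic
# slightly inflated as a tie-breaker; 1 marks a blocked cell.
import heapq
import math

def diagonal_distance(a, b, tie_break=1.001):
    dx, dy = abs(a[0] - b[0]), abs(a[1] - b[1])
    # Octile/diagonal distance; the small inflation breaks ties between equal-cost
    # paths and keeps the search focused toward the goal.
    return tie_break * (max(dx, dy) + (math.sqrt(2) - 1) * min(dx, dy))

def astar(grid, start, goal):
    rows, cols = len(grid), len(grid[0])
    open_heap = [(diagonal_distance(start, goal), 0.0, start, None)]
    came_from, g_best = {}, {start: 0.0}
    while open_heap:
        _, g, cur, parent = heapq.heappop(open_heap)
        if cur in came_from:          # already expanded via a better path
            continue
        came_from[cur] = parent
        if cur == goal:               # reconstruct the path back to the start
            path = []
            while cur is not None:
                path.append(cur); cur = came_from[cur]
            return path[::-1]
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                if dr == dc == 0:
                    continue
                nxt = (cur[0] + dr, cur[1] + dc)
                if not (0 <= nxt[0] < rows and 0 <= nxt[1] < cols) or grid[nxt[0]][nxt[1]]:
                    continue
                ng = g + (math.sqrt(2) if dr and dc else 1.0)
                if ng < g_best.get(nxt, float("inf")):
                    g_best[nxt] = ng
                    heapq.heappush(open_heap, (ng + diagonal_distance(nxt, goal), ng, nxt, cur))
    return None

grid = [[0, 0, 0, 0],
        [0, 1, 1, 0],
        [0, 0, 1, 0],
        [0, 0, 0, 0]]
print(astar(grid, (0, 0), (3, 3)))
```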

  7. Knowledge grows when shared : The Launch of OpenAIRE, 2nd December in Ghent

    DEFF Research Database (Denmark)

    Elbæk, Mikael Karstensen

    2010-01-01

    Knowledge is one of the few commodities that do not devalue when used. Actually, knowledge grows when shared, and free online access to peer-reviewed scientific publications is a potent ingredient in the process of sharing. The sharing of knowledge is facilitated by the Open Access movement. However, Open Access is much more than downloading the PDF. Vice President of the European Commission and European Digital Agenda Commissioner Neelie Kroes boldly presented this message in the opening session of the OpenAIRE launch. On 2nd December 2010, OpenAIRE, the European infrastructure for Open Access, was officially launched in Ghent, Belgium. This project and initiative facilitates the success of the Open Access Pilot in FP7, as presented earlier in this journal. In this brief article I will present some of the most interesting issues that were discussed during the first session of the day.

  8. Micro-texture Analysis of 2nd Pilgered Zirconium Alloys by Electron Backscatter Diffraction

    International Nuclear Information System (INIS)

    Texture control via crystallographic orientation is one of the important techniques for producing seamless zirconium tubes for nuclear fuel cladding materials. The texture of zirconium alloys can be analyzed by several methods such as X-ray, neutron and electron diffraction, each of which has its advantages and disadvantages for texture analysis. Since thin seamless zirconium tubing is usually prepared from thick TREX by the pilgering process, the grains near the outer and inner surfaces experience different deformation forces during pilgering, which gives rise to a crystallographic orientation that depends critically on the position of the grains in the tube. In this study, micro-texture analysis of pilgered zirconium alloys was carried out by electron backscatter diffraction (EBSD) to provide information for optimum fabrication conditions. Emphasis is on the analysis of crystallographic orientation with position in 2nd pilgered zirconium tubing

  9. 2nd FP7 Conference and International Summer School Nanotechnology : From Fundamental Research to Innovations

    CERN Document Server

    Yatsenko, Leonid

    2015-01-01

    This book presents some of the latest achievements in nanotechnology and nanomaterials from leading researchers in Ukraine, Europe, and beyond. It features contributions from participants in the 2nd International Summer School “Nanotechnology: From Fundamental Research to Innovations” and International Research and Practice Conference “Nanotechnology and Nanomaterials”, NANO-2013, which were held in Bukovel, Ukraine on August 25-September 1, 2013. These events took place within the framework of the European Commission FP7 project Nanotwinning, and were organized jointly by the Institute of Physics of the National Academy of Sciences of Ukraine, University of Tartu (Estonia), University of Turin (Italy), and Pierre and Marie Curie University (France). Internationally recognized experts from a wide range of universities and research institutions share their knowledge and key results on topics ranging from nanooptics, nanoplasmonics, and interface studies to energy storage and biomedical applications. Pr...

  10. Analysis and implementation of reactor protection system circuits - case study: Egypt's 2nd research reactor

    International Nuclear Information System (INIS)

    This work presents a way to design and implement the trip unit of a reactor protection system (RPS) using a field programmable gate array (FPGA). Instead of the traditional embedded microprocessor-based interface design method, a tailor-made FPGA-based circuit is proposed to substitute for the trip unit (TU) used in Egypt's 2nd research reactor, ETRR-2. The existing embedded system is built around the STD32 field computer bus, which is used in industrial and process control applications. It is modular, rugged, reliable and easy to use, and is able to support a large mix of I/O cards and to easily change its configuration in the future. Therefore, the same bus is still used in the proposed design. The state machine of this bus is designed based on its timing diagrams and implemented in VHDL to interface with the designed TU circuit

  11. Summary of the 2nd workshop on ion beam-applied biology

    International Nuclear Information System (INIS)

    Induction of novel plant resources by ion beam irradiation has been investigated in JAERI. To share knowledge of the present status of the field and to identify future plans, the 1st Workshop on ion beam-applied biology was held last year under the title ''Development of breeding technique for ion beams''. To further improve research cooperation and to exchange useful information in the field, researchers from inside JAERI met with researchers from outside, such as those from agricultural experiment stations, companies and universities, at the 2nd workshop on ion beam-applied biology, titled ''Future development of breeding technique for ion beams''. People from RIKEN, the Institute of Radiation Breeding, the Wakasa Wan Energy Research Center and the National Institute of Radiological Science also participated in this workshop. Twelve of the presented papers are indexed individually. (J.P.N.)

  12. Belief Functions: Theory and Applications - Proceedings of the 2nd International Conference on Belief Functions

    CERN Document Server

    Masson, Marie-Hélène

    2012-01-01

    The theory of belief functions, also known as evidence theory or Dempster-Shafer theory, was first introduced by Arthur P. Dempster in the context of statistical inference, and was later developed by Glenn Shafer as a general framework for modeling epistemic uncertainty. These early contributions have been the starting points of many important developments, including the Transferable Belief Model and the Theory of Hints. The theory of belief functions is now well established as a general framework for reasoning with uncertainty, and has well understood connections to other frameworks such as probability, possibility and imprecise probability theories.   This volume contains the proceedings of the 2nd International Conference on Belief Functions that was held in Compiègne, France on 9-11 May 2012. It gathers 51 contributions describing recent developments both on theoretical issues (including approximation methods, combination rules, continuous belief functions, graphical models and independence concepts) an...
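    For readers new to the field, the combination rule at the heart of many of these contributions is Dempster's rule; for two mass functions m_1 and m_2 on a frame Theta it reads:

```latex
% Dempster's rule of combination for two mass functions m_1, m_2 on a frame \Theta:
(m_1 \oplus m_2)(A) \;=\; \frac{1}{1-K}\sum_{B \cap C = A} m_1(B)\, m_2(C),
\qquad A \neq \emptyset,
\\[4pt]
K \;=\; \sum_{B \cap C = \emptyset} m_1(B)\, m_2(C)
\quad \text{(the conflict mass; the rule is undefined when } K = 1\text{).}
```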

  13. Proceedings of the 2nd joint seminar on atomic collisions and heavy ion induced nuclear reactions

    International Nuclear Information System (INIS)

    The meeting of the 2nd joint seminar on atomic collisions and heavy ion induced nuclear reactions was held at the University of Tokyo, May 13 and 14, 1982. The aim of this seminar has been not only to recognize the common problems lying between the above two research fields, but also to obtain an overview of the theoretical and experimental approaches to clarifying the current problems. In the seminar, more than 50 participants gathered and presented 16 papers. These are two general reviews and fourteen comprehensive surveys on topical subjects which have been developed very intensively in recent years. The editors would like to thank all participants for their assistance and cooperation in making the publication of these proceedings possible. (author)

  14. Textile Tectonics : 2nd Ventulett Symposium, Georgia Tech University, School of Architecture, Atlanta, November 2008

    DEFF Research Database (Denmark)

    Mossé, Aurélie

    The meeting of architecture and textiles is a continuous but too often forgotten story of intimate exchange. However, the 2nd Ventulett Symposium, hosted by the College of Architecture within the Georgia Institute of Technology, Atlanta, GA, was one of those precious moments celebrating such a marriage. Organized by Lars Spuybroeck, principal of Nox, Rotterdam, and current Thomas W. Ventulett III distinguished chair of Architectural Design, the event embraced textile tectonics as its core topic, praising textiles as the key component of architecture, in line with Gottfried Semper's understanding of the discipline. It was an inspiring time gathering some of the most exciting architects of the moment: Lars Spuybroeck, Mark Burry, Evan Douglis, Michael Hensel and Cecil Balmond were invited to discuss their understanding of tectonics. Full text available at http://textilefutures.co.uk/exchange/bin/view/TextileFutures/TextileTectonics

  15. 2nd Response to "Feasibility of 3D reconstruction from a single 2D diffraction measurement"

    CERN Document Server

    Miao, Jianwei

    2009-01-01

    We present our 2nd response to Thibault's commentary article [1] and his reply [2], in which he commented upon our ankylography paper [3] and our 1st response [4]. In this article, we further explain why we think Thibault's theoretical analysis is flawed and his interpretation of our experiment is incorrect. Furthermore, we provide a quantitative analysis and a numerical experiment to illustrate why ankylography can in principle be applicable to general samples. Finally, we present detailed procedures for our numerical experiment on ankylographic reconstructions, which uses the traditional HIO algorithm only with the positivity constraint [5]. We welcome anyone (including Thibault) interested in ankylography to perform numerical experiments and verify our results. We will be very happy to provide any help if needed.
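    The HIO algorithm with a positivity constraint cited as [5] is the classical Fienup iteration; a minimal sketch (2D, assuming the measured Fourier magnitudes are available as an array) could look like this, independent of the authors' actual ankylography code:

```python
# Sketch of Fienup's hybrid input-output (HIO) iteration with a positivity
# constraint, as in the classical phase-retrieval literature; a toy demo only.
import numpy as np

def hio_iterate(g, mags, beta=0.9, n_iter=500):
    """g: current real-space estimate (2D), mags: measured Fourier magnitudes."""
    for _ in range(n_iter):
        G = np.fft.fft2(g)
        G_prime = mags * np.exp(1j * np.angle(G))        # impose measured magnitudes
        g_prime = np.real(np.fft.ifft2(G_prime))
        violate = g_prime < 0                            # positivity constraint
        g = np.where(violate, g - beta * g_prime, g_prime)
    return g

# Toy demo: recover a small non-negative object from its Fourier magnitudes.
rng = np.random.default_rng(1)
obj = np.zeros((64, 64)); obj[24:40, 28:36] = rng.random((16, 8))
mags = np.abs(np.fft.fft2(obj))
recon = hio_iterate(rng.random((64, 64)), mags)
print("mean magnitude residual:", np.abs(np.abs(np.fft.fft2(recon)) - mags).mean())
```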

  16. Performance evaluation of Enhanced 2nd Order Gray Edge Color Constancy Algorithm Using Bilateral Filter

    Directory of Open Access Journals (Sweden)

    Richa Dogra

    2014-04-01

    Full Text Available Color constancy techniques have become an important pre-processing step that reduces the effect of the light source on a given image or scene. Light has a strong effect on a scene, so the effect of the light source may considerably degrade the performance of applications such as face recognition, object detection and lane detection. Color constancy is the ability to detect color independent of the light source; it is a characteristic of the human color perception system which ensures that the perceived color of objects remains relatively constant under changing illumination conditions. The overall goal of this paper is to propose a new 2nd order gray edge based color constancy algorithm using a bilateral filter. The aim is to enhance the color constancy algorithm further; histogram stretching is also used to improve the results. The comparison has shown significant improvement over the available techniques.
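    As a sketch of the underlying 2nd-order gray-edge illuminant estimate (a Minkowski norm of second-order derivatives per channel), noting that the bilateral-filter pre-processing and histogram stretching proposed in the paper would wrap around this core step and are only indicated by comments:

```python
# Sketch of a 2nd-order gray-edge illuminant estimate (van de Weijer-style framework):
# Minkowski p-norm of Gaussian second derivatives per channel, followed by a
# von Kries-style correction. Not the paper's exact pipeline.
import numpy as np
from scipy.ndimage import gaussian_filter

def gray_edge_2nd_order(img, sigma=1.0, p=6):
    """img: float RGB image in [0, 1], shape (H, W, 3). Returns (corrected, illuminant)."""
    # (optional preprocessing per the paper: bilateral filtering of img would go here)
    illum = np.zeros(3)
    for c in range(3):
        dxx = gaussian_filter(img[..., c], sigma, order=(0, 2))
        dyy = gaussian_filter(img[..., c], sigma, order=(2, 0))
        grad_mag = np.sqrt(dxx**2 + dyy**2)
        illum[c] = np.mean(grad_mag**p) ** (1.0 / p)     # Minkowski p-norm per channel
    illum /= np.linalg.norm(illum) + 1e-12               # unit-length illuminant estimate
    corrected = img / (illum * np.sqrt(3) + 1e-12)       # divide out the estimated cast
    # (optional post-processing per the paper: histogram stretching would go here)
    return np.clip(corrected, 0, 1), illum

# Example with a synthetic color-cast image.
rng = np.random.default_rng(0)
scene = rng.random((128, 128, 3))
cast = scene * np.array([1.0, 0.8, 0.6])                 # warm illuminant
out, est = gray_edge_2nd_order(cast)
print("estimated illuminant direction:", est)
```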

  17. One-stage thumb lengthening with use of an osteocutaneous 2nd metacarpal flap.

    Science.gov (United States)

    Givissis, Panagiotis; Stavridis, Stavros I; Ditsios, Konstantinos; Christodoulou, Anastasios

    2009-12-01

    Traumatic thumb amputation represents an extremely disabling injury, thus rendering its reconstruction a procedure of paramount importance. We describe the case of a patient who had sustained a traumatic amputation of his left index finger at the metacarpophalangeal joint and of his left thumb in the middle of the proximal phalanx 4 months earlier and had initially been treated elsewhere. For the thumb reconstruction, an osteocutaneous flap from the radial side of the 2nd metacarpal was used, consisting of a 3.5-cm bony segment with the overlying skin and its blood and nerve supply. The flap was transferred and fixed with a plate and screws to the palmar-medial side of the stump of the thumb, the 1st web space was deepened by removing the rest of the second metacarpal, and a partial skin graft was used to cover the remaining gap. Thumb functionality was restored immediately postoperatively, and the overall result was satisfactory. PMID:19941169

  18. Proceedings of the 2nd seminar of R and D on advanced ORIENT

    International Nuclear Information System (INIS)

    The 2nd Seminar on R and D on advanced ORIENT was held at Ricotte, Japan Atomic Energy Agency, on November 7th, 2008. The first meeting of this seminar series was held in Oarai, Ibaraki, in May 2008, and more than fifty participants, including related researchers and members of the general public, attended. The second seminar was headed by the Nuclear Science and Engineering Directorate, JAEA, at Tokai, Ibaraki, with 63 participants. Spent nuclear fuel should be recognized not only as a mass of radioactive elements but also as potentially useful material including platinum metals and rare earth elements. Taking into consideration cooperation with universities, related companies and research institutes, we aimed at expanding and progressing the basic research. This report records abstracts and figures submitted by the oral speakers at this seminar. (author)

  19. 2nd INTERNATIONAL CONFERENCE ON ELECTRICAL SYSTEMS (ICES 2006), 8-10 May 2006

    Directory of Open Access Journals (Sweden)

    Tarek Bouktir

    2006-12-01

    Full Text Available The 2nd International Conference on Electrical Systems (ICES 2006) was held 8-10 May 2006 at Larbi Ben M'Hidi University, Oum El-Bouaghi, Algeria. This conference provides opportunities for professional engineers, particularly young engineers, from both industry and academia to share ideas and explore recent developments, current practices and future trends in all aspects of electrical systems and related fields. ICES 2006 was of similar standing to the previous conference (PCSE'05) in the high quality of the presentations, the technical content of the papers, and the number of delegates attending. As in PCSE'05, it had a broad theme, covering all aspects of electrical power engineering, and was attended by academics, researchers, consultants and members of the manufacturing and electrical supply industries. During the sessions, 86 papers selected from 300 submissions from 13 countries were debated.

  20. A critical discussion of the 2nd intercomparison on electron paramagnetic resonance dosimetry with tooth enamel

    International Nuclear Information System (INIS)

    Recently, we have participated in 'The 2nd International Intercomparison on EPR Tooth Dosimetry' wherein 18 laboratories had to evaluate low-radiation doses (100-1000 mGy) in intact teeth (Wieser et al., Radiat. Meas., 32 (2000a) 549). The results of this international intercomparison seem to indicate a promising picture of EPR tooth dosimetry. In this paper, the two Belgian EPR participants present a more detailed and critical study of their contribution to this intercomparison. The methods used were maximum likelihood common factor analysis (MLCFA) and spectrum subtraction. Special attention is paid to potential problems with sample preparation, intrinsic dose evaluation, linearity of the dose response, and determination of dose uncertainties

  1. The Second Beginner's Guide to Personal Computers for the Blind and Visually Impaired. 2nd Edition.

    Science.gov (United States)

    Croft, Diane L., Ed.

    The guide updates one section of a previously written manual on speech software for blind and visually impaired persons. The guide presents reviews of 14 specific commercially available software programs. Software is critiqued for IBM (Enhanced PC Talking Program, Soft Vert, Screen-Talk Pro, Artic Vision, Freedom 1, and Prompt-Writer), Apple…

  2. THINKLET: ELEMENTO CLAVE EN LA GENERACIÓN DE MÉTODOS COLABORATIVOS PARA EVALUAR USABILIDAD DE SOFTWARE / THINKLET: KEY ELEMENT IN THE COLLABORATIVE METHODS GENERATION FOR EVALUATE SOFTWARE USABILITY

    Scientific Electronic Library Online (English)

    Andrés, Solano Alegría; Yenny, Méndez Alegría; César, Collazos Ordóñez.

    2010-07-01

    Full Text Available Abstract in Spanish (translated): Usability is currently a fundamental attribute for the success of a software product. Competition among organizations forces them to improve the level of usability of their products, given the risk of losing customers if a product is not easy to use and/or easy to learn. [...] Although methods have been established to evaluate the usability of software products, most of these methods do not consider the possibility of involving several people working collaboratively in the evaluation process. For this reason, it is advisable to use the Methodology for the Design of Collaborative Usability Evaluation Methods, so that methods can be designed which allow several people from different knowledge areas to work collaboratively in the evaluation process. This article presents an overview of this methodology and places special emphasis on thinklets as key elements for the design of collaborative processes. Abstract in English: Currently, usability is a critical attribute for the success of software. Competition among organizations forces them to improve the level of product usability due to the risk of losing customers if the product is not easy to use and/or easy to learn. Methods have been established to evaluate the usability [...] of software products; however, most of these methods do not take into account the possibility of involving several people working collaboratively in the evaluation process. Therefore, the Methodology for the Design of Collaborative Usability Evaluation Methods should be used to design methods that allow several people from a range of knowledge areas to work collaboratively in the evaluation process. This paper presents the methodology mentioned above and gives special emphasis to thinklets as key elements for the design of collaborative processes.

  3. Curriculum on the Edge of Survival: How Schools Fail to Prepare Students for Membership in a Democracy. 2nd Edition

    Science.gov (United States)

    Heller, Daniel

    2012-01-01

    Typically, school curriculum has been viewed through the lens of preparation for the workplace or higher education, both worthy objectives. However, this is not the only lens, and perhaps not even the most powerful one to use, if the goal is to optimize the educational system. "Curriculum on the Edge of Survival, 2nd Edition," attempts to define…

  4. Proceedings of the 2nd International Arctic Ungulate Conference, Fairbanks, Alaska, 13-17 Aug 1995: Issue No. 2

    Directory of Open Access Journals (Sweden)

    Rolf Egil Haugerud (editor)

    1997-04-01

    Full Text Available This issue of the proceedings of AUC 1995 is the 2nd of four issues. Issues 1, 3 and 4 are found respectively in Rangifer 1996, 16 (2): 49-92; 1997, 17 (3): 10-138; and 1998, 18 (3-4): 99-154. A list of contents of the four proceedings issues is found in the latter proceedings issue.

  5. Proceedings: 2nd IEA international workshop on beryllium technology for fusion

    International Nuclear Information System (INIS)

    The 2nd IEA International Workshop on Beryllium Technology for Fusion was held September 6--8, 1995 at Jackson Lake Lodge, Wyoming. Forty-four participants took part in the workshop representing Europe, Japan, the Russian Federation, and the United States, including representatives from both government laboratories and private industry. The workshop was divided into six technical sessions and a ''town meeting'' panel discussion. Technical sessions addressed the general topics of: Thermomechanical Properties; Manufacturing Technologies; Radiation Effects; Plasma/Tritium Interactions; Safety, Applications, and Design; and Joining and Testing. This volume contains the majority of the papers presented at the workshop. In some instances, the authors of the papers could not be present at the workshop, and the papers were given by others, sometimes in summary form and in some instances combined with others. The full papers are included here in the sequence in which they would have been given. In other instances, presentations were made but no papers were submitted for publication. Those papers do not appear here. In summary, the workshop was very successful. The main objective of bringing key members of the fusion beryllium community together was certainly met. Forty-four participants registered, and 35 papers were presented. Individual papers are indexed separately on the energy databases

  6. Academic Training - 2nd Term: 08.01.2007 - 31.03.2007

    CERN Multimedia

    2006-01-01

    2006 - 2007 ACADEMIC TRAINING PROGRAMME 2nd Term : 08.01.2007 - 31.03.2007 LECTURE SERIES Applied Superconductivity by V. Palmieri, INFN, Padova, It. 17, 18, 19 January 11:00-12:00 - Auditorium, Bldg 500 String Theory for Pedestrians by B. Zwiebach, M.I.T. Cambridge, USA 29, 30, 31 January 11:00-12:00 - Auditorium, Bldg 500 on 29, 30 January TH Auditorium on 31 January Introduction to Supersymmetry by D. Kaplan, Johns Hopkins University, Baltimore, USA 12, 13, 14, 15 February 11:00-12:00 - Auditorium, Bldg 500 The Hunt for the Higgs Particle by F. Zwirner, University of Padova, It 27, 28 February, 1st March 11:00-12:00 - Auditorium, Bldg 500 From Evolution Theory to Parallel and Distributed Genetic Programming by F. Fernandez de Vega 15, 16 March 11:00-12:00 - Auditorium, Bldg 500 The lectures are open to all those interested, without application. The abstract of the lectures, as well as any change to the above information (title, dates, time, place etc.) will be published in the CERN bulletin, the WWW, an...

  8. Proceedings of the 2nd CSNI Specialist Meeting on Simulators and Plant Analysers

    International Nuclear Information System (INIS)

    The safe utilisation of nuclear power plants requires the availability of different computerised tools for analysing the plant behaviour and training the plant personnel. These can be grouped into three categories: accident analysis codes, plant analysers and training simulators. The safety analysis of nuclear power plants has traditionally been limited to the worst accident cases expected for the specific plant design. Many accident analysis codes have been developed for different plant types. The scope of the analyses has continuously expanded. The plant analysers are now emerging tools intended for extensive analysis of the plant behaviour using a best estimate model for the whole plant including the reactor and full thermodynamic process, both combined with automation and electrical systems. The comprehensive model is also supported by good visualisation tools. Training simulators with a real time plant model are tools for training the plant operators to run the plant. Modern training simulators also have features supporting visualisation of the important phenomena occurring in the plant during transients. The 2nd CSNI Specialist Meeting on Simulators and Plant Analysers in Espoo attracted some 90 participants from 17 countries. A total of 49 invited papers were presented in the meeting in addition to 7 simulator system demonstrations. Ample time was reserved for the presentations and informal discussions during the four meeting days. (orig.)

  9. Mesocosm soil ecological risk assessment tool for GMO 2nd tier studies

    DEFF Research Database (Denmark)

    D'Annibale, Alessandra; Maraldo, Kristine

    Ecological Risk Assessment (ERA) of GMO is basically identical to ERA of chemical substances, when it comes to assessing specific effects of the GMO plant material on the soil ecosystem. The tiered approach always includes the option of studying more complex but still realistic ecosystem level effects in 2nd tier caged experimental systems, cf. the new GMO ERA guidance: EFSA Journal 2010; 8(11):1879. We propose to perform a trophic structure analysis, TSA, and include the trophic structure as an ecological endpoint to gain more direct insight into the change in interactions between species, i.e. the food-web structure, instead of relying only on the indirect evidence from population abundances. The approach was applied for effect assessment in the agro-ecosystem where we combined factors of elevated CO2, viz. global climate change, and GMO plant effects. A multi-species (Collembola, Acari and Enchytraeidae) mesocosm factorial experiment was set up in a greenhouse at ambient CO2 and 450 ppm CO2 with a GM barley variety and conventional varieties. The GM barley differed concerning the composition of amino acids in the grain (antisense C-hordein line). The fungicide carbendazim acted as a positive control. After 5 and 11 weeks, data on populations, plants and soil organic matter decomposition were evaluated. Natural abundances of stable isotopes, 13C and 15N, of animals, soil, plants and added organic matter (crushed maize leaves) were used to describe the soil food web structure.

  10. Multi-Fluid Modeling of Mercury's Magnetosphere in Advance of the 2nd MESSENGER Flyby

    Science.gov (United States)

    Kidder, A.; Winglee, R. M.; Harnett, E. M.

    2008-12-01

    In preparation for the second MESSENGER flyby of Mercury on October 6, 2008, 3D multi-fluid simulations are used to predict Mercury's magnetospheric response to many possible solar wind and interplanetary magnetic field (IMF) conditions. Magnetic field components, synthetic spectrograms and the temperature and densities of planetary ions including He+ and Na+ will be plotted along the MESSENGER II flyby trajectory through Mercury's magnetotail. Although solar wind activity may be increased compared to the MESSENGER I flyby in January, we will model both quiet and active solar wind conditions in preparation for the 2nd flyby. Specifically, we look to examine the location and time scales of any flux ropes that form and the location of the bow shock and magnetopause. Additionally, we are interested in determining under what external conditions asymmetric sodium outflow occurs, as well as examining the presence of any field-aligned currents. We will also provide model comparisons with ground-based observations and data from the MESSENGER I flyby.

  11. The 2nd Berlin BedRest Study: protocol and implementation.

    Science.gov (United States)

    Belavý, D L; Bock, O; Börst, H; Armbrecht, G; Gast, U; Degner, C; Beller, G; Soll, H; Salanova, M; Habazettl, H; Heer, M; de Haan, A; Stegeman, D F; Cerretelli, P; Blottner, D; Rittweger, J; Gelfi, C; Kornak, U; Felsenberg, D

    2010-09-01

    Long-term bed-rest is used to simulate the effect of spaceflight on the human body and to test different kinds of countermeasures. The 2nd Berlin BedRest Study (BBR2-2) tested the efficacy of whole-body vibration in addition to high-load resistance exercise in preventing bone loss during bed-rest. Here we present the protocol of the study and discuss its implementation. Twenty-four male subjects underwent 60 days of six-degree head-down tilt bed-rest and were randomised to an inactive control group (CTR), a high-load resistive exercise group (RE) or a high-load resistive exercise with whole-body vibration group (RVE). Subsequent to events in the course of the study (e.g. subject withdrawal), 9 subjects participated in the CTR-group, 7 in the RVE-group and 8 (7 beyond bed-rest day-30) in the RE-group. Fluid intake, urine output and axillary temperature increased during bed-rest. Body weight changes differed between groups, with the RVE-group displaying significant decreases in body weight beyond bed-rest day-51 only. In light of events and experiences of the current study, recommendations on various aspects of bed-rest methodology are also discussed. PMID:20811145

  12. Reservoir Characteristics of 2nd Member of Jialingjiang Formation in Fuchengzhai Structure of East Sichuan

    Directory of Open Access Journals (Sweden)

    ???

    2013-08-01

    Full Text Available Based on core description, thin section identification, and physical property data analysis, the reservoir characteristics of the 2nd member of the Jialingjiang Formation in the Fuchengzhai area, eastern Sichuan, are studied in detail. The results show that the rock types of T1j2 are limestone, dolostone and cream rock, and that the reservoir rocks are mainly crystalline dolostone and grain dolostone. The combined analysis of casting thin sections and scanning electron microscopy shows that the reservoir space mainly consists of secondary pores (inter-grain pores, intercrystal pores, inter-crystal dissolved pores) and fractures. The property analysis indicates that the reservoir quality of T1j2 is poor, and that it belongs to fracture-pore and pore-fracture reservoirs. Favourable reservoirs are mainly developed in T1j22, less in T1j21, and least in T1j23. Moreover, the reservoirs of T1j2 are mainly controlled by rock types, the distribution of sedimentary facies, diagenesis, tectonic action, and so on, among which dolomite banks and tidal flats are favourable reservoir facies, dissolution and dolomitization contribute to diagenesis, and microfractures formed by tectonic activity are conducive to forming high quality reservoirs and developing their permeability.

  13. The Development of Information Literacy Assessment for 2nd Grade Students and Their Performance

    Directory of Open Access Journals (Sweden)

    Lin Ching Chen

    2013-10-01

    Full Text Available The main purpose of this study was to develop an information literacy assessment for 2nd-grade students and evaluate their performance. The assessment included a regular test and a portfolio assessment. There were 30 multiple-choice items and 3 constructed-response items in the test, while the portfolio assessment was based on the Super3 model. This study was conducted in an elementary school located in southern Taiwan. One hundred and forty-two second graders took the test, and only one class was randomly selected as the subjects for the portfolio assessment. The results showed that the test and portfolio assessment had good validity and reliability. In the fields of library literacy and media literacy, second-grade students with different abilities performed differently, while boys and girls performed similarly. Students performed well throughout the process of the Super3 model, except that in the Plan phase they still needed teachers' help to pose inquiry questions. Finally, several suggestions were proposed for information literacy assessment and future research.

  14. Open3DGRID : An open-source software aimed at high-throughput generation of molecular interaction fields (MIFs)

    DEFF Research Database (Denmark)

    Tosco, Paolo; Balle, Thomas

    Description Open3DGRID is an open-source software aimed at high-throughput generation of molecular interaction fields (MIFs). Open3DGRID can generate steric potential, electron density and MM/QM electrostatic potential fields; furthermore, it can import GRIDKONT binary files produced by GRID and CoMFA/CoMSIA fields (exported from SYBYL with the aid of a small SPL script). High computational performance is attained through implementation of parallelized algorithms for MIF generation. The most prominent features of Open3DGRID include:
    • Seamless integration with OpenBabel, PyMOL, GAUSSIAN, FIREFLY, GAMESS-US, TURBOMOLE, MOLDEN, Molecular Discovery GRID
    • Multi-threaded computation of MIFs (both MM and QM); support for MMFF94 and GAFF force fields with automated assignment of atom types to the imported molecular structures
    • Human- and machine-readable text output, integrated with 3D maps in several formats to allow visualization of results in PyMOL, MOE, Maestro and SYBYL
    • User-friendly interface to all major QM packages (e.g. GAUSSIAN, FIREFLY, GAMESS-US, TURBOMOLE, MOLDEN), which allows calculation of QM electron density and electrostatic potential 3D maps from within Open3DGRID
    • User-friendly interface to Molecular Discovery GRID to compute GRID MIFs from within Open3DGRID
    Open3DGRID is controlled through a command line interface; commands can be either entered interactively from a command prompt or read from a batch script. If PyMOL is installed on the system while Open3DGRID is being operated interactively, the setup of 3D grid computations can be followed in real time in PyMOL's viewport, making it easy to tweak grid size and training/test set composition. The main output is arranged as human-readable plain ASCII text, while a number of additional files are generated to store data and to export the results of computations for further analysis and visualization with third-party tools. In particular, Open3DGRID can export 3D maps for visualization in PyMOL, MOE, Maestro and SYBYL. Open3DGRID is written in C; while pre-built binaries are available for mainstream operating systems (Windows 32/64-bit, Linux 32/64-bit, Solaris x86 32/64-bit, FreeBSD 32/64-bit, Intel Mac OS X 32/64-bit), the source code is portable and can be compiled under any *NIX platform supporting POSIX threads. The modular nature of the code allows for easy implementation of new features, so that the core application can be customized to meet individual needs. A detailed ChangeLog is kept to keep track of additions and modifications during Open3DGRID's development.
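    As a toy illustration of what a molecular interaction field is (a probe energy evaluated on a regular 3D grid around a molecule), independent of Open3DGRID's actual force fields, probes and file formats:

```python
# Toy illustration of an electrostatic molecular interaction field: the Coulomb
# potential of a few point charges sampled on a regular 3D grid. Open3DGRID's real
# MIFs (MMFF94/GAFF probes, QM densities, GRID fields) are computed very differently.
import numpy as np

# Hypothetical "molecule": atom coordinates (Angstrom) and partial charges (e).
coords = np.array([[0.0, 0.0, 0.0],
                   [1.2, 0.0, 0.0],
                   [-0.6, 1.0, 0.0]])
charges = np.array([-0.4, 0.2, 0.2])

# Regular grid with 0.5 A spacing, padded 3 A beyond the molecule.
lo, hi = coords.min(0) - 3.0, coords.max(0) + 3.0
axes = [np.arange(l, h, 0.5) for l, h in zip(lo, hi)]
gx, gy, gz = np.meshgrid(*axes, indexing="ij")
grid_pts = np.stack([gx, gy, gz], axis=-1)                # (nx, ny, nz, 3)

# Electrostatic field: sum of q_i / r_i at every grid point (units omitted).
diff = grid_pts[..., None, :] - coords                     # (nx, ny, nz, natoms, 3)
dist = np.linalg.norm(diff, axis=-1)
mif = (charges / np.maximum(dist, 0.5)).sum(axis=-1)       # clamp r to avoid singularities

print("grid shape:", mif.shape, "field range:", mif.min(), mif.max())
```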

  15. The influence of the 1st AlN and the 2nd GaN layers on properties of AlGaN/2nd AlN/2nd GaN/1st AlN/1st GaN structure

    Energy Technology Data Exchange (ETDEWEB)

    Bi, Yang; Peng, EnChao; Lin, DeFeng [Chinese Academy of Sciences, Materials Science Center, Institute of Semiconductors, P.O. Box 912, Beijing (China); Wang, XiaoLiang; Yang, CuiBai; Xiao, HongLing; Wang, CuiMei; Feng, Chun; Jiang, LiJuan [Chinese Academy of Sciences, Materials Science Center, Institute of Semiconductors, P.O. Box 912, Beijing (China); Chinese Academy of Sciences, Key Laboratory of Semiconductor Materials Science, Institute of Semiconductors, P.O. Box 912, Beijing (China)

    2011-09-15

    This is a theoretical study of the effects of the 1st AlN interlayer and the 2nd GaN layer on the properties of the Al{sub 0.3}Ga{sub 0.7}N/2nd AlN/2nd GaN/1st AlN/1st GaN HEMT structure, carried out by self-consistently solving the coupled Schroedinger and Poisson equations. Our calculation shows that by increasing the 1st AlN thickness from 1.0 nm to 3.0 nm, the 2DEG, which is originally confined totally in the 2nd channel, gradually decreases there, begins to turn up in the 1st channel and eventually concentrates in it. The total 2DEG (2DEG in both channels) sheet density increases nearly linearly with increasing 1st AlN thickness. The slope of the potential profile of the AlGaN changes with the 1st AlN thickness, causing the unusual dependence of the total 2DEG sheet density on the thickness of the AlGaN barrier. The variations of the 2DEG distribution, the total 2DEG sheet density and the conduction band profiles as a function of the 2nd GaN thickness are also discussed. Their physical mechanisms have been investigated on the basis of surface state theory. The confinement of the 2DEG can be further enhanced by the double-AlN interlayer, compared with an InGaN back-barrier. (orig.)

  16. GENERACIÓN AUTOMÁTICA DE APLICACIONES SOFTWARE A PARTIR DEL ESTANDAR MDA BASÁNDOSE EN LA METODOLOGÍA DE SISTEMAS EXPERTOS E INTELIGENCIA ARTIFICIAL / AUTOMATIC GENERATION OF SOFTWARE APPLICATIONS FROM STANDARD MDA STANDARD BASED ON THE METHOD OF ARTIFICIAL INTELLIGENCE AND EXPERT SYSTEMS

    Directory of Open Access Journals (Sweden)

    IVÁN MAURICIO RUEDA CÁCERES

    2011-04-01

    Full Text Available Analytical summary (translated from Spanish): Many studies have been presented on the automatic generation of lines of code. This article presents a solution to the limitations of a well-known tool called MDA, making use of the technological advances of artificial intelligence and expert systems. It covers the principles of the MDA framework, transforming the models used and adding characteristics to them that will make this working methodology more efficient. The proposed model covers the phases of the software life cycle, following the business rules that are an essential part of a real software project. It is with the business rules that the transformation of the MDA standard begins, and the aim is to make a contribution towards automating the business rules in such a way that they serve for the definition of applications throughout the life cycle that generates them. Analytical summary (English): Many studies have been presented about the automatic generation of code lines; this article presents a solution for the limitations of a tool called MDA, using the technological advances of artificial intelligence and expert systems. It covers the principles of the MDA framework, transforming the models used and adding characteristics to them that allow this working methodology to become more efficient. The proposed model covers the phases of the software life cycle, following the business rules that are an essential part of a real software project. With the business rules one can start to transform the MDA standard, aiming to give a contribution to automating the business rules so that they serve to define applications throughout the life cycle that generates them.

  17. Free Open Source Software: FOSS Based GIS for Spatial Retrievals of Appropriate Locations for Ocean Energy Utilizing Electric Power Generation Plants

    OpenAIRE

    Kohei Arai

    2012-01-01

    A Free Open Source Software (FOSS) based Geographic Information System (GIS) for spatial retrieval of appropriate locations for ocean wind and tidal motion electric power generation plants is proposed. Using scatterometers onboard Earth observation satellites, coastal areas with strong winds are retrieved with FOSS/GIS based on PostgreSQL/PostGIS. PostGIS has to be modified together with the altimeter and scatterometer databases. These modifications and database creation would be a good reference to the user...
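    As a purely hypothetical sketch of the kind of spatial retrieval described (table names, columns and thresholds are invented; only psycopg2 and the PostGIS function used are standard), a query for strong-wind sites near the coastline might look like:

```python
# Hypothetical sketch: select scatterometer-derived sites with high mean wind speed
# that lie within 10 km of the coastline. Schema and thresholds are invented.
import psycopg2

conn = psycopg2.connect("dbname=ocean_energy user=gis")
with conn.cursor() as cur:
    cur.execute(
        """
        SELECT s.site_id, s.mean_wind_speed
        FROM   scatterometer_sites AS s,
               coastline AS c
        WHERE  s.mean_wind_speed > %s                                  -- strong-wind threshold (m/s)
          AND  ST_DWithin(s.geom::geography, c.geom::geography, %s)    -- within 10 km of the coast
        """,
        (10.0, 10000),
    )
    for site_id, wind in cur.fetchall():
        print(site_id, wind)
conn.close()
```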

  18. Madeira Extreme Floods: 2009/2010 Winter. Case study - 2nd and 20th of February

    Science.gov (United States)

    Pires, V.; Marques, J.; Silva, A.

    2010-09-01

    Floods are, at the world scale, the natural disaster that affects the largest fraction of the population. It is a phenomenon that extends its effects to the areas surrounding the hydrographic network (basins, rivers, dams) and the coast line. According to the USA FEMA (Federal Emergency Management Agency), a flood can be defined as: "A general and temporary condition of partial or complete inundation of two or more acres of normally dry land area or of two or more properties from: Overflow of inland or tidal waters; Unusual and rapid accumulation or runoff of surface waters from any source; Mudflow; Collapse or subsidence of land along the shore of a lake or similar body of water as a result of erosion or undermining caused by waves or currents of water exceeding anticipated cyclical levels that result in a flood as defined above." A flash flood is the result of intense and long-duration continuous precipitation and can result in casualties (e.g. the floods in mainland Portugal in 1967, 1983 and 1997). The speed and strength of floods, either localized or over large areas, result in enormous social impacts, both through the loss of human lives and through devastating damage to the landscape and human infrastructure. The winter of 2009/2010 in Madeira Island was characterized by several episodes of very intense precipitation (especially in December 2009 and February 2010), setting a new record of accumulated precipitation since records began on the island. In February, two days were especially rainy, with absolute records for the month of February (daily records since 1949): 111 mm and 97 mm on the 2nd and 20th respectively. The accumulated precipitation culminated in the terrible floods of the 20th of February, causing the loss of dozens of human lives and hundreds of millions of euros in damage. Large precipitation occurrences, either more intense precipitation in a short period or less intense precipitation over a longer period, are sometimes the precursor of geological phenomena resulting in land movement, often in the same or very nearby areas as previous episodes. Although flood episodes have a strong dependency on the topography and hydrological capacity of the terrain, human intervention is also an enormously important factor, more specifically anthropogenic factors such as deforestation, dams, changes of water fluxes, and impermeabilization of the terrain surface. The risk assessment of floods should be addressed based not only on knowledge of the meteorological and hydrometeorological factors, such as the accumulated precipitation and soil water balance, but also on the river paths and water volumes, as well as the surrounding geomorphology of the water basins. The current work focuses on the meteorological contribution to the 2010 flood episode on Madeira Island, specifically the climatic characterization of the 2009/2010 winter with particular emphasis on the days of the 2nd and 20th of February.

  19. An open-source software tool for the generation of relaxation time maps in magnetic resonance imaging

    Directory of Open Access Journals (Sweden)

    Kühne Titus

    2010-07-01

    Full Text Available Abstract Background In magnetic resonance (MR) imaging, T1, T2 and T2* relaxation times represent characteristic tissue properties that can be quantified with the help of specific imaging strategies. While there are basic software tools for specific pulse sequences, until now there is no universal software program available to automate pixel-wise mapping of relaxation times from various types of images or MR systems. Such a software program would allow researchers to test and compare new imaging strategies and thus would significantly facilitate research in the area of quantitative tissue characterization. Results After defining requirements for a universal MR mapping tool, a software program named MRmap was created using a high-level graphics language. Additional features include a manual registration tool for source images with motion artifacts and a tabular DICOM viewer to examine pulse sequence parameters. MRmap was successfully tested on three different computer platforms with image data from three different MR system manufacturers and five different types of pulse sequences: multi-image inversion recovery T1; Look-Locker/TOMROP T1; modified Look-Locker (MOLLI) T1; single-echo T2/T2*; and multi-echo T2/T2*. Computing times varied between 2 and 113 seconds. Estimates of relaxation times compared favorably to those obtained from non-automated curve fitting. Completed maps were exported in DICOM format and could be read in standard software packages used for analysis of clinical and research MR data. Conclusions MRmap is a flexible cross-platform research tool that enables accurate mapping of relaxation times from various pulse sequences. The software allows researchers to optimize quantitative MR strategies in a manufacturer-independent fashion. The program and its source code were made available as open-source software on the internet.
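    As a minimal sketch of the pixel-wise mono-exponential fitting that underlies T2/T2* mapping (S(TE) = S0 * exp(-TE/T2), fitted per pixel), not MRmap's actual implementation:

```python
# Minimal sketch of pixel-wise T2/T2* mapping via a log-linear least-squares fit
# of S(TE) = S0 * exp(-TE/T2) for every pixel at once.
import numpy as np

def t2_map(echo_stack, echo_times, signal_floor=1e-6):
    """echo_stack: (n_echoes, H, W) magnitude images; echo_times: (n_echoes,) in ms."""
    te = np.asarray(echo_times, float)
    log_s = np.log(np.maximum(echo_stack, signal_floor)).reshape(len(te), -1)
    # Linear model: log S = log S0 - TE / T2, so the second coefficient equals 1/T2.
    design = np.column_stack([np.ones_like(te), -te])          # (n_echoes, 2)
    coeffs, *_ = np.linalg.lstsq(design, log_s, rcond=None)    # coeffs: (2, H*W)
    with np.errstate(divide="ignore"):
        t2 = 1.0 / coeffs[1]
    return t2.reshape(echo_stack.shape[1:])                    # T2 in ms

# Synthetic demo: a uniform phantom whose true T2 is 80 ms.
te = np.array([10.0, 20.0, 40.0, 80.0])
stack = np.exp(-te[:, None, None] / 80.0) * np.ones((len(te), 4, 4))
print(t2_map(stack, te)[0, 0])   # ~80.0
```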

  20. PREFACE: 2nd International Conference on Competitive Materials and Technological Processes (IC-CMTP2)

    Science.gov (United States)

    László, Gömze A.

    2013-12-01

    Competitiveness is one of the most important factors in our life and it plays a key role in the efficiency both of organizations and societies. The more scientifically supported and prepared organizations develop more competitive materials with better physical, chemical and biological properties, and the leading companies apply more competitive equipment and technology processes. The aims of the 2nd International Conference on Competitive Materials and Technology Processes (ic-cmtp2) are the following: Promote new methods and results of scientific research in the fields of material, biological, environmental and technology sciences; Exchange information between the theoretical and applied sciences as well as technical and technological implementations; Promote communication between scientists of different nations, countries and continents. Among the major fields of interest are materials with extreme physical, chemical, biological, medical, thermal, mechanical properties and dynamic strength, including their crystalline and nano-structures and phase transformations, as well as methods of their technological processing, tests and measurements. Multidisciplinary applications of materials science and technological problems encountered in sectors like ceramics, glasses, thin films, aerospace, automotive and marine industry, electronics, energy, construction materials, medicine, biosciences and environmental sciences are of particular interest. In accordance with the program of the conference ic-cmtp2, more than 250 inquiries and registrations from different organizations were received. Researchers from 36 countries in Asia, Europe, Africa, North and South America arrived at the conference venue. Including co-authors, the research work of more than 500 scientists is presented in this volume. Professor Dr Gömze A László Chair, ic-cmtp2 The PDF also contains lists of the boards, session chairs and sponsors.

  1. Conference Report on the 2nd International Symposium on Lithium Applications for Fusion Devices

    Science.gov (United States)

    Ono, M.; Bell, M. G.; Hirooka, Y.; Kaita, R.; Kugel, H. W.; Mazzitelli, G.; Menard, J. E.; Mirnov, S. V.; Shimada, M.; Skinner, C. H.; Tabares, F. L.

    2012-03-01

    The 2nd International Symposium on Lithium Applications for Fusion Devices (ISLA-2011) was held on 27-29 April 2011 at the Princeton Plasma Physics Laboratory (PPPL) with broad participation from the community working on aspects of lithium research for fusion energy development. This community is expanding rapidly in many areas including experiments in magnetic confinement devices and a variety of lithium test stands, theory and modeling and developing innovative approaches. Overall, 53 presentations were given representing 26 institutions from 10 countries. The latest experimental results from nine magnetic fusion devices were given in 24 presentations, from NSTX (PPPL, USA), LTX (PPPL, USA), FT-U (ENEA, Italy), T-11M (TRINITY, RF), T-10 (Kurchatov Institute, RF), TJ-II (CIEMAT, Spain), EAST (ASIPP, China), HT-7 (ASIPP, China), and RFX (Padova, Italy). Sessions were devoted to: I. Lithium in magnetic confinement experiments (facility overviews), II. Lithium in magnetic confinement experiments (topical issues), III. Special session on liquid lithium technology, IV. Lithium laboratory test stands, V. Lithium theory/modeling/comments, VI. Innovative lithium applications and VII. Panel discussion on lithium PFC viability in magnetic fusion reactors. There was notable participation from the fusion technology communities, including the IFE, IFMIF and TBM communities providing productive exchanges with the physics oriented magnetic confinement lithium research groups. It was agreed to continue future exchanges of ideas and data to help develop attractive liquid lithium solutions for very challenging magnetic fusion issues, such as development of a high heat flux steady-state divertor concept and acceptable plasma disruption mitigation techniques while improving plasma performance with lithium. The next workshop will be held at ENEA, Frascati, Italy in 2013.

  2. La sorpresiva congruencia democrática del 2 de diciembre / The Surprising Democratic Congruence of December 2nd

    Scientific Electronic Library Online (English)

    Pedro, Nikken.

    2008-08-01

    Full Text Available SciELO Venezuela | Language: Spanish. The article begins by underlining, and valuing positively, the fact that in today's polarized Venezuela the results of the referendum on constitutional reform could be processed democratically and without violence. It then evaluates the political conditions prevailing in Venezuela after the [...] overwhelming electoral victory of the President in December 2006. Prominent among these conditions are the so-called "cinco motores de la revolución" (the five engines of the revolution), one of which is "la reforma constitucional" (the constitutional reform). The author then points out the contents he considers most salient in the President's original reform proposal. The reasons for the electoral results unfavourable to the reform proposal are evaluated, considering the contents of the proposal itself, the weaknesses of the pro-government sector in that debate and the strengths of the opposition. The article concludes by presenting the principal consequences of the electoral results of December 2nd for Venezuela's socio-political reality.

  3. Proceedings of the 2nd international advisory committee on biomolecular dynamics instrument DNA in MLF at J-PARC

    International Nuclear Information System (INIS)

    The 2nd International Advisory Committee on the 'Biomolecular Dynamics Backscattering Spectrometer DNA' was held on November 12th-13th, 2008 at the J-PARC Center, Japan Atomic Energy Agency. This IAC was organized with the aim of realizing an innovative neutron backscattering instrument in the Materials and Life Science Experimental Facility (MLF) at J-PARC, and four leading scientists in the field of neutron backscattering instruments were therefore selected as members (Dr. Dan Neumann (Chair); Prof. Ferenc Mezei; Dr. Hannu Mutka; Dr. Philip Tregenna-Piggott); the 1st IAC was held on February 27th-29th, 2008. This report includes the executive summary and the materials of the presentations at the 2nd IAC. (author)

  4. Influence of socio-cultural environment on the development of children's musical talents in the 2nd triennium of primary school

    OpenAIRE

    Antolin, Petra

    2014-01-01

    The thesis examines the impact of the socio-cultural environment on the development of musical talent among pupils in the 2nd triennium of primary school. The thesis begins with a closer look at the definition of giftedness, which may be general or specific (partial). Specific giftedness means that children achieve above-average results in one area only, while generally gifted children achieve above-average results in multiple areas. The characteristics of gifted pupils which distinguish th...

  5. Physical properties of double perovskite-type barium neodymium osmate Ba{sub 2}NdOsO{sub 6}

    Energy Technology Data Exchange (ETDEWEB)

    Wakeshima, Makoto, E-mail: wake@sci.hokudai.ac.jp [Hokkaido University, Division of Chemistry, Graduate School of Science, Sapporo-shi, Hokkaido 060-0810 (Japan); Hinatsu, Yukio [Hokkaido University, Division of Chemistry, Graduate School of Science, Sapporo-shi, Hokkaido 060-0810 (Japan); Ohoyama, Kenji [Institute of Materials Research, Tohoku University, Sendai 980-8577 (Japan)

    2013-01-15

    The crystal and magnetic structures and the physical properties of the double perovskite-type barium neodymium osmate Ba{sub 2}NdOsO{sub 6} are investigated through powder X-ray and neutron diffraction, electrical conductivity, magnetic susceptibility, and specific heat measurements. The Rietveld analysis reveals that the Nd and Os ions are arranged with regularity over the six-coordinate B sites in a distorted perovskite ABO{sub 3} framework. The monoclinic crystal structure described by space group P2{sub 1}/n (tilt system a{sup -}a{sup -}c{sup +}) becomes more distorted with decreasing temperature from 300 K down to 2.5 K. This compound shows long-range antiferromagnetic ordering of Os{sup 5+} below 65 K. An antiferromagnetic ordering of Nd{sup 3+} also occurs at lower temperatures ({approx}20 K). The magnetic structure is of Type I and the magnetic moments of the Nd{sup 3+} and Os{sup 5+} ions are in the same direction in the ab-plane. - Graphical Abstract: The magnetic structure of Ba{sub 2}NdOsO{sub 6} is of Type I, and the magnetic moments of the Nd{sup 3+} and Os{sup 5+} ions are in the same direction in the ab-plane. Highlights: • Crystal structures of Ba{sub 2}NdOsO{sub 6} are determined to be monoclinic below 300 K. • Its electrical resistivity shows Mott variable-range hopping behavior with localized carriers. • An antiferromagnetic ordering of the Os{sup 5+} moment occurs at 65 K. • The magnetic structure of Ba{sub 2}NdOsO{sub 6} is determined to be of Type I.

  6. Interaction Between Short-Term Heat Pretreatment and Avermectin On 2nd Instar Larvae of Diamondback Moth, Plutella Xylostella (Linn)

    OpenAIRE

    Gu, Xiaojun; Tian, Sufen; Wang, Dehui; Gao, Fei

    2009-01-01

    Based on the cooperative virulence index (c.f.), the interaction effect between short-term heat pretreatment and avermectin on 2nd instar larvae of diamondback moth (DBM), Plutella xylostella (Linnaeus), was assessed. The results suggested that the interaction results between short-term heat pretreatment and avermectin on the tested insects varied with temperature level as well as its duration and avermectin concentration. Interaction between heat pretreatment at 30°C and avermectin mainly r...

  7. 2nd PEGS Annual Symposium on Antibodies for Cancer Therapy: April 30–May 1, 2012, Boston, USA

    OpenAIRE

    Ho, Mitchell; Royston, Ivor; Beck, Alain

    2012-01-01

    The 2nd Annual Antibodies for Cancer Therapy symposium, organized again by Cambridge Healthtech Institute as part of the Protein Engineering Summit, was held in Boston, USA from April 30th to May 1st, 2012. Since the approval of the first cancer antibody therapeutic, rituximab, fifteen years ago, eleven have been approved for cancer therapy, although one, gemtuzumab ozogamicin, was withdrawn from the market.  The first day of the symposium started with a historical review of early work for l...

  8. Teachers' Spatial Anxiety Relates to 1st- and 2nd-Graders' Spatial Learning

    Science.gov (United States)

    Gunderson, Elizabeth A.; Ramirez, Gerardo; Beilock, Sian L.; Levine, Susan C.

    2013-01-01

    Teachers' anxiety about an academic domain, such as math, can impact students' learning in that domain. We asked whether this relation held in the domain of spatial skill, given the importance of spatial skill for success in math and science and its malleability at a young age. We measured 1st- and 2nd-grade teachers' spatial anxiety…

  9. [Near-infrared luminescence and energy transfer of ACaPO4 : Eu2+, Nd3+ (A = Li, K, Na)].

    Science.gov (United States)

    Xiao, Quan-Lan; Zou, Shao-Yu; Liu, Guan-Xi; Peng, Wen-Fang; Wan, Chui-Ming; Xie, Li-Juan; Meng, Jian-Xin

    2011-09-01

    Near-infrared (NIR) luminescence phosphors ACaPO4 : Eu2+, Nd3+ (A = Li, K, Na) were prepared by a conventional solid state method and the sensitization of Nd3+ near-infrared luminescence by Eu2+ was investigated. The characteristic NIR luminescence of Nd3+ in the ACaPO4 matrix is greatly enhanced by co-doping with Eu2+. The fluorescence properties of ACaPO4 : Eu2+, the NIR luminescence properties of ACaPO4 : Eu2+, Nd3+ and the fluorescence lifetimes were studied. The effect of the emission wavelength of Eu2+ on the NIR luminescence of Nd3+ was investigated, and the energy transfer mechanism between Eu2+ and Nd3+ was also discussed. The emission peak wavelength of Eu2+ in ACaPO4 matrices was found to red-shift along the series A = Li, K, Na, and the extent of its overlap with the different excitation peaks of Nd3+ changes markedly. It was concluded that the emission peak position of Eu2+ is a very important factor for energy transfer, and the optimal wavelength range for Eu2+ --> Nd3+ energy transfer is 500 to 550 nm. PMID:22097821

  10. Investigations of near IR photoluminescence properties in TiO2:Nd,Yb materials using hyperspectral imaging methods

    International Nuclear Information System (INIS)

    TiO2 and TiO2:Nd,Yb films were deposited by a doctor blade deposition technique from pastes prepared by a sol–gel process, and characterized by electron microscopy and spectroscopic techniques. Near infrared (NIR) photoluminescence (PL) properties upon 808 nm excitation were also examined. The rutile TiO2:Nd,Yb samples exhibited the strongest NIR PL signal. The relationship between the morphological properties, annealing temperature and the optical behavior of TiO2:Nd,Yb films is discussed. Furthermore, the study showed that hyperspectral imaging spectroscopy can be used as a rapid and nondestructive macroscopic characterization technique for the identification of spectral features and evaluation of luminescent surfaces of oxides. -- Highlights: • Films and powders of Nd- and Yb-doped titania have been synthesized. • Three modifications (anatase, rutile and Degussa P25) have been studied. • The NIR photoluminescence properties were studied by hyperspectral imaging. • Emission at 978, 1008, 1029, 1064 and 1339 nm was obtained. • The structural properties and their influence on the optical behavior are discussed

  11. Production and Characterization of Ba2NdSbO6 Complex Perovskite as a Substrate for YBa2Cu3O7-δ Superconducting Films

    Science.gov (United States)

    Madueño, Q.; Landínez Téllez, D. A.; Roa-Rojas, J.

    We report systematic studies of Ba2NdSbO6 as a substrate for the production of YBa2Cu3O7-δ superconducting thin films. Chemical stability and crystallographic coupling between Ba2NdSbO6 and YBCO were examined by characterizing Ba2NdSbO6-YBa2Cu3O7-δ (0 to 100 vol.%) polycrystalline composites. X-ray diffraction experiments showed that Ba2NdSbO6 belongs to the complex cubic perovskite family. Moreover, we determined that these materials are chemically stable, i.e. there is no chemical reaction at the interface, and the lattice parameters evidenced a match within ~2%. Morphological characterization of our samples was performed through scanning electron microscopy, which revealed the existence of separated grains of Ba2NdSbO6 and YBa2Cu3O7-δ. Compositional analysis of the samples was performed by energy dispersive X-ray experiments, which showed the absence of impurities or undesired chemical elements. DC susceptibility measurements permitted us to determine that the presence of Ba2NdSbO6 does not affect the critical temperature of the superconducting transition of YBa2Cu3O7-δ. Our results evidenced that Ba2NdSbO6 is an excellent candidate as a substrate for the fabrication of YBa2Cu3O7-δ superconducting thin films.

  12. Conceptual design and optimization of a 1-1/2 generation PFBC plant task 14. Topical report

    Energy Technology Data Exchange (ETDEWEB)

    White, J.S.; Witman, P.M.; Harbaugh, L.; Rubow, L.N.; Horazak, D.A.

    1994-12-01

    The economics and performance of advanced pressurized fluidized bed (PFBC) cycles developed for utility applications during the last 10 years (especially the 2nd-Generation PFBC cycle) are projected to be favorable compared to conventional pulverized coal power plants. However, the improved economics of 2nd-Generation PFBC cycles are accompanied by the perception of increased technological risk related to the pressurized carbonizer and its associated gas cleanup systems. A PFBC cycle that removed the uncertainties of the carbonizer while retaining the high efficiency and low cost of a 2nd-Generation PFBC cycle could improve the prospects for early commercialization and pave the way for the introduction of the complete 2nd-Generation PFBC cycle at some later date. One such arrangement is a PFBC cycle with natural gas topping combustion, referred to as the 1.5-Generation PFBC cycle. This cycle combines the advantages of the 2nd-Generation PFBC plant with the reduced risk associated with a gas turbine burning natural gas, and can potentially be part of a phased approach leading to the commercialization of utility 2nd-Generation PFBC cycles. The 1.5-Generation PFBC may also introduce other advantages over the more complicated 2nd-Generation PFBC system. This report describes the technical and economic evaluation of 1.5-Generation PFBC cycles for utility or industrial power generation.

  13. The WhiteStar development project: Westinghouse's next generation core design simulator and core monitoring software to power the nuclear renaissance

    International Nuclear Information System (INIS)

    The WhiteStar project has undertaken the development of the next generation core analysis and monitoring system for Westinghouse Electric Company. This on-going project focuses on the development of the ANC core simulator, BEACON core monitoring system and NEXUS nuclear data generation system. This system contains many functional upgrades to the ANC core simulator and BEACON core monitoring products as well as the release of the NEXUS family of codes. The NEXUS family of codes is an automated once-through cross section generation system designed for use in both PWR and BWR applications. ANC is a multi-dimensional nodal code for all nuclear core design calculations at a given condition. ANC predicts core reactivity, assembly power, rod power, detector thimble flux, and other relevant core characteristics. BEACON is an advanced core monitoring and support system which uses existing instrumentation data in conjunction with an analytical methodology for on-line generation and evaluation of 3D core power distributions. This new system is needed to design and monitor the Westinghouse AP1000 PWR. This paper provides an overview of the software system and the software development methodologies used, as well as some initial results. (authors)

  14. What difference does a year of schooling make?: Maturation of brain response and connectivity between 2nd and 3rd grades during arithmetic problem solving

    OpenAIRE

    Rosenberg-Lee, Miriam; Barth, Maria; Menon, Vinod

    2011-01-01

    Early elementary schooling in 2nd and 3rd grades (ages 7-9) is an important period for the acquisition and mastery of basic mathematical skills. Yet, we know very little about neurodevelopmental changes that might occur over a year of schooling. Here we examine behavioral and neurodevelopmental changes underlying arithmetic problem solving in a well-matched group of 2nd (n = 45) and 3rd (n = 45) grade children. Although 2nd and 3rd graders did not differ on IQ or grade- and age-normed measure...

  15. PREFACE: 1st-2nd Young Researchers Meetings in Rome - Proceedings

    Science.gov (United States)

    YRMR Organizing Committee; Cannuccia, E.; Mazzaferro, L.; Migliaccio, M.; Pietrobon, D.; Stellato, F.; Veneziani, M.

    2011-03-01

    Students in science, particularly in physics, face a fascinating and challenging future. Scientists have proposed very interesting theories, which describe the microscopic and macroscopic world fairly well, trying to match the quantum regime with cosmological scales. Between the extremes of this scenario, biological phenomena in all their complexity take place, challenging the laws we observe in the atomic and sub-atomic world. More and more accurate and complex experiments have been devised and these are now going to test the paradigms of physics. Notable experiments include: the Large Hadron Collider (LHC), which is going to shed light on the physics of the Standard Model of Particles and its extensions; the Planck-Herschel satellites, which target a very precise measurement of the properties of our Universe; and the Free Electron Laser facilities, which produce high-brilliance, ultrafast X-ray pulses, allowing the investigation of the fundamental processes of solid state physics, chemistry, and biology. These projects are the result of huge collaborations spread across the world, involving scientists belonging to different and complementary research fields: physicists, chemists, biologists and others, keen to make the best of these extraordinary laboratories. Even though each branch of science is experiencing a process of growing specialization, it is very important to keep an eye on the global picture, remaining aware of the deep interconnections between these fields. This is even more crucial for students who are beginning their research careers. These considerations motivated PhD students and young post-docs connected to the Roman scientific research area to organize a conference, to establish the background and the network for interactions and collaborations. This resulted in the 1st and 2nd Young Researchers Meetings in Rome (http://ryrm.roma2.infn.it), one-day conferences aimed primarily at graduate students and post-docs working in physics in Italy and abroad. In its first two editions, the meeting was held at the Universities of Roma "Tor Vergata" (July 2009) and "La Sapienza" (February 2010), and organized in sections dedicated to up-to-date topics spanning broad research fields: Astrophysics-Cosmology, Soft-Condensed Matter Physics, Theoretical-Particle Physics, and Medical Physics. In these proceedings we have collected some of the contributions which were presented during the meetings.

  16. The Quantum Mechanics Solver: How to Apply Quantum Theory to Modern Physics, 2nd edition

    Energy Technology Data Exchange (ETDEWEB)

    Robbin, J M [School of Mathematics, University Walk, Bristol, BS8 1TW (United Kingdom)

    2007-07-20

    The hallmark of a good book of problems is that it allows you to become acquainted with an unfamiliar topic quickly and efficiently. The Quantum Mechanics Solver fits this description admirably. The book contains 27 problems based mainly on recent experimental developments, including neutrino oscillations, tests of Bell's inequality, Bose-Einstein condensates, and laser cooling and trapping of atoms, to name a few. Unlike many collections, in which problems are designed around a particular mathematical method, here each problem is devoted to a small group of phenomena or experiments. Most problems contain experimental data from the literature, and readers are asked to estimate parameters from the data, or compare theory to experiment, or both. Standard techniques (e.g., degenerate perturbation theory, addition of angular momentum, asymptotics of special functions) are introduced only as they are needed. The style is closer to a non-specialist seminar than to an undergraduate lecture. The physical models are kept simple; the emphasis is on cultivating conceptual and qualitative understanding (although in many of the problems, the simple models fit the data quite well). Some less familiar theoretical techniques are introduced, e.g. a variational method for lower (not upper) bounds on ground-state energies for many-body systems with two-body interactions, which is then used to derive a surprisingly accurate relation between baryon and meson masses. The exposition is succinct but clear; the solutions can be read as worked examples if you don't want to do the problems yourself. Many problems have additional discussion on limitations and extensions of the theory, or further applications outside physics (e.g., the accuracy of GPS positioning in connection with atomic clocks; proton and ion tumor therapies in connection with the Bethe-Bloch formula for charged particles in solids). The problems use mainly non-relativistic quantum mechanics and are organised into three sections: Elementary Particles, Nuclei and Atoms; Quantum Entanglement and Measurement; and Complex Systems. The coverage is not comprehensive; there is little on scattering theory, for example, and some areas of recent interest, such as topological aspects of quantum mechanics and semiclassics, are not included. The problems are based on examination questions given at the Ecole Polytechnique in the last 15 years. The book is accessible to undergraduates, but working physicists should find it a delight. (Book review of 'The Quantum Mechanics Solver: How to Apply Quantum Theory to Modern Physics, 2nd edition', Jean-Louis Basdevant and Jean Dalibard, 2006 Springer, ISBN 978-3-540-27721-7)

  17. Comparison of elution efficiency of 99Mo/99mTc generator using theoretical and a free web based software method

    International Nuclear Information System (INIS)

    Full text: A generator is constructed on the principle of the decay-growth relationship between a long-lived parent radionuclide and a short-lived daughter radionuclide. Differences in the chemical properties of the daughter and parent radionuclides allow efficient separation of the two. Aim and Objectives: The present study was designed to calculate the elution efficiency of the generator using the traditional formula-based method and a free web-based software method. Materials and Methods: A 99Mo/99mTc MON.TEK (Monrol, Gebze) generator, a sterile 0.9% NaCl vial and a vacuum vial in a lead shield were used for the elution. A new 99Mo/99mTc generator (calibrated activity 30 GBq) calibrated for Thursday was received on Monday morning in our department. The generator was placed behind lead bricks in a fume hood. The rubber plugs of both the vacuum and 0.9% NaCl vials were wiped with 70% isopropyl alcohol swabs. The vacuum vial placed inside the lead shield was inserted in the vacuum position; simultaneously, the 10 ml NaCl vial was inserted in the second slot. After 1-2 min the vacuum vial was removed without moving the emptied 0.9% NaCl vial. The vacuum slot was covered with another sterile vial to maintain sterility. The RAC was measured in the calibrated dose calibrator (Capintec, 15 CRC). The elution efficiency was calculated theoretically and using free web-based software (Apache web server (www.apache.org) and PHP (www.php.net)) hosted on the web site of the Italian Association of Nuclear Medicine and Molecular Imaging (www.aimn.it). Results: The mean elution efficiency calculated by the theoretical method was 93.95% ± 0.61. The mean elution efficiency as calculated by the software was 92.85% ± 0.89. There was no statistically significant difference between the two methods. Conclusion: The free web-based software provides precise and reproducible results and thus saves time and mathematical calculation steps. This enables a rational use of the available activity and also enables selection of the type and number of procedures to perform in a busy nuclear medicine department
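
    The record does not spell out the "traditional formula-based method", so the sketch below only illustrates one common way such a calculation is done: the theoretically available 99mTc activity is obtained from the decay-growth (Bateman) relation for the 99Mo/99mTc pair, and the elution efficiency is the measured eluate activity divided by that theoretical value. The half-lives and the ~0.875 branching fraction are standard literature values; the input numbers in the example are illustrative only.

```python
# Hedged sketch of a decay-growth elution-efficiency calculation (illustrative, not
# the software described in the record).
import math

T_HALF_MO99_H = 65.94   # h, 99Mo half-life
T_HALF_TC99M_H = 6.01   # h, 99mTc half-life
BRANCHING = 0.875       # fraction of 99Mo decays that populate 99mTc

def theoretical_tc99m_activity(a_mo_at_elution_mbq, hours_since_last_elution):
    lam_mo = math.log(2) / T_HALF_MO99_H
    lam_tc = math.log(2) / T_HALF_TC99M_H
    # 99Mo activity at the previous elution (decay-correct backwards in time).
    a_mo_start = a_mo_at_elution_mbq * math.exp(lam_mo * hours_since_last_elution)
    t = hours_since_last_elution
    # Daughter activity grown in since the previous elution.
    return (BRANCHING * lam_tc / (lam_tc - lam_mo) * a_mo_start
            * (math.exp(-lam_mo * t) - math.exp(-lam_tc * t)))

def elution_efficiency(measured_eluate_mbq, a_mo_at_elution_mbq, hours_since_last_elution):
    return 100.0 * measured_eluate_mbq / theoretical_tc99m_activity(
        a_mo_at_elution_mbq, hours_since_last_elution)

# Illustrative numbers: 24 h in-growth, 20 GBq 99Mo at elution, 16.5 GBq eluted -> ~93%.
print(round(elution_efficiency(16500.0, 20000.0, 24.0), 1))
```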

  18. Free Open Source Software: FOSS Based GIS for Spatial Retrievals of Appropriate Locations for Ocean Energy Utilizing Electric Power Generation Plants

    Directory of Open Access Journals (Sweden)

    Kohei Arai

    2012-09-01

    Full Text Available A Free Open Source Software (FOSS) based Geographic Information System (GIS) for the spatial retrieval of appropriate locations for electric power generation plants utilizing ocean wind and tidal motion is proposed. Using scatterometer data from earth observation satellites, coastal areas with strong winds are retrieved with the FOSS/GIS stack PostgreSQL/PostGIS. PostGIS has to be combined with the altimeter and scatterometer database. These modifications and the database creation would be a good reference for users who would like to create a GIS together with a database using FOSS.
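
    The paper's database schema and queries are not given in the abstract; the sketch below is therefore hypothetical and only illustrates the kind of spatial retrieval described: selecting grid cells with high mean wind speed that lie within a fixed distance of the coastline, using PostGIS from Python. The table and column names (wind_grid, coastline, geom, mean_wind_ms, cell_id) and the thresholds are invented for illustration.

```python
# Hypothetical example of a PostGIS site-selection query driven from Python.
import psycopg2  # assumes a PostgreSQL/PostGIS database is reachable

QUERY = """
SELECT w.cell_id, w.mean_wind_ms
FROM wind_grid AS w
JOIN coastline AS c
  ON ST_DWithin(w.geom::geography, c.geom::geography, 20000)  -- within 20 km of shore
WHERE w.mean_wind_ms >= 8.0                                   -- candidate wind threshold
ORDER BY w.mean_wind_ms DESC;
"""

def candidate_sites(dsn):
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        cur.execute(QUERY)
        return cur.fetchall()

if __name__ == "__main__":
    for cell_id, wind in candidate_sites("dbname=ocean_energy user=gis"):
        print(cell_id, wind)
```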

  19. Experiments and Demonstrations in Physics: Bar-Ilan Physics Laboratory (2nd Edition)

    Science.gov (United States)

    Kraftmakher, Yaakov

    2014-08-01

    The following sections are included: * Data-acquisition systems from PASCO * ScienceWorkshop 750 Interface and DataStudio software * 850 Universal Interface and Capstone software * Mass on spring * Torsional pendulum * Hooke's law * Characteristics of DC source * Digital storage oscilloscope * Charging and discharging a capacitor * Charge and energy stored in a capacitor * Speed of sound in air * Lissajous patterns * I-V characteristics * Light bulb * Short time intervals * Temperature measurements * Oersted's great discovery * Magnetic field measurements * Magnetic force * Magnetic braking * Curie's point I * Electric power in AC circuits * Faraday's law of induction I * Self-inductance and mutual inductance * Electromagnetic screening * LCR circuit I * Coupled LCR circuits * Probability functions * Photometric laws * Kirchhoff's rule for thermal radiation * Malus' law * Infrared radiation * Irradiance and illuminance

  1. Development of Hydrologic Characterization Technology of Fault Zones -- Phase I, 2nd Report

    Energy Technology Data Exchange (ETDEWEB)

    Karasaki, Kenzi; Onishi, Tiemi; Black, Bill; Biraud, Sebastien

    2009-03-31

    This is the year-end report of the 2nd year of the NUMO-LBNL collaborative project: Development of Hydrologic Characterization Technology of Fault Zones under NUMO-DOE/LBNL collaboration agreement, the task description of which can be found in the Appendix 3. Literature survey of published information on the relationship between geologic and hydrologic characteristics of faults was conducted. The survey concluded that it may be possible to classify faults by indicators based on various geometric and geologic attributes that may indirectly relate to the hydrologic property of faults. Analysis of existing information on the Wildcat Fault and its surrounding geology was performed. The Wildcat Fault is thought to be a strike-slip fault with a thrust component that runs along the eastern boundary of the Lawrence Berkeley National Laboratory. It is believed to be part of the Hayward Fault system but is considered inactive. Three trenches were excavated at carefully selected locations mainly based on the information from the past investigative work inside the LBNL property. At least one fault was encountered in all three trenches. Detailed trench mapping was conducted by CRIEPI (Central Research Institute for Electric Power Industries) and LBNL scientists. Some intriguing and puzzling discoveries were made that may contradict the published work in the past. Predictions are made regarding the hydrologic property of the Wildcat Fault based on the analysis of fault structure. Preliminary conceptual models of the Wildcat Fault were proposed. The Wildcat Fault appears to have multiple splays and some low angled faults may be part of the flower structure. In parallel, surface geophysical investigations were conducted using electrical resistivity survey and seismic reflection profiling along three lines on the north and south of the LBNL site. Because of the steep terrain, it was difficult to find optimum locations for survey lines as it is desirable for them to be as straight as possible. One interpretation suggests that the Wildcat Fault is westerly dipping. This could imply that the Wildcat Fault may merge with the Hayward Fault at depth. However, due to the complex geology of the Berkeley Hills, multiple interpretations of the geophysical surveys are possible. An effort to construct a 3D GIS model is under way. The model will be used not so much for visualization of the existing data because only surface data are available thus far, but to conduct investigation of possible abutment relations of the buried formations offset by the fault. A 3D model would be useful to conduct 'what if' scenario testing to aid the selection of borehole drilling locations and configurations. Based on the information available thus far, a preliminary plan for borehole drilling is outlined. The basic strategy is to first drill boreholes on both sides of the fault without penetrating it. Borehole tests will be conducted in these boreholes to estimate the property of the fault. Possibly a slanted borehole will be drilled later to intersect the fault to confirm the findings from the boreholes that do not intersect the fault. Finally, the lessons learned from conducting the trenching and geophysical surveys are listed. It is believed that these lessons will be invaluable information for NUMO when it conducts preliminary investigations at yet-to-be selected candidate sites in Japan.

  2. FOREWORD: 2nd International Workshop on New Computational Methods for Inverse Problems (NCMIP 2012)

    Science.gov (United States)

    Blanc-Féraud, Laure; Joubert, Pierre-Yves

    2012-09-01

    This volume of Journal of Physics: Conference Series is dedicated to the scientific contributions presented during the 2nd International Workshop on New Computational Methods for Inverse Problems (NCMIP 2012). This workshop took place at Ecole Normale Supérieure de Cachan, in Cachan, France, on 15 May 2012, at the initiative of Institut Farman. The first edition of NCMIP also took place in Cachan, France, within the scope of the ValueTools Conference, in May 2011 (http://www.ncmip.org/2011/). The NCMIP Workshop focused on recent advances in the resolution of inverse problems. Indeed inverse problems appear in numerous scientific areas such as geophysics, biological and medical imaging, material and structure characterization, electrical, mechanical and civil engineering, and finance. The resolution of inverse problems consists of estimating the parameters of the observed system or structure from data collected by an instrumental sensing or imaging device. Its success firstly requires the collection of relevant observation data. It also requires accurate models describing the physical interactions between the instrumental device and the observed system, as well as the intrinsic properties of the solution itself. Finally, it requires the design of robust, accurate and efficient inversion algorithms (a minimal illustrative sketch of such an inversion is given after this record). Advanced sensor arrays and imaging devices provide high rate and high volume data; in this context, the efficient resolution of the inverse problem requires the joint development of new models and inversion methods, taking computational and implementation aspects into account. During this one-day workshop, researchers had the opportunity to bring to light and share new techniques and results in the field of inverse problems. The topics of the workshop were: algorithms and computational aspects of inversion, Bayesian estimation, kernel methods, learning methods, convex optimization, free discontinuity problems, metamodels, proper orthogonal decomposition, reduced models for the inversion, non-linear inverse scattering, image reconstruction and restoration, applications (bio-medical imaging, non-destructive evaluation etc). NCMIP 2012 was a one-day workshop. Each of the submitted papers was reviewed by 2 to 4 reviewers. Among the accepted papers, there were 8 oral presentations and 5 posters. Three international speakers were invited to give long talks. This second edition attracted 60 registered attendees in May 2012. NCMIP 2012 was supported by Institut Farman (ENS Cachan) and endorsed by the following French research networks (GDR ISIS, GDR Ondes, GDR MOA, GDR MSPC). The program committee acknowledges the following laboratories: CMLA, LMT, LSV, LURPA, SATIE, as well as the DIGITEO Network. 
Laure Blanc-Féraud and Pierre-Yves Joubert Workshop Co-chairs Laure Blanc-Féraud, I3S laboratory, CNRS, France Pierre-Yves Joubert, IEF laboratory, Paris-Sud University, CNRS, France Technical Program Committee Alexandre Baussard, ENSTA Bretagne, Lab-STICC, France Marc Bonnet, ENSTA, ParisTech, France Jerôme Darbon, CMLA, ENS Cachan, CNRS, France Oliver Dorn, School of Mathematics, University of Manchester, UK Mário Figueiredo, Instituto Superior Técnico, Lisbon, Portugal Laurent Fribourg, LSV, ENS Cachan, CNRS, France Marc Lambert, L2S Laboratory, CNRS, SupElec, Paris-Sud University, France Anthony Quinn, Trinity College, Dublin, Ireland Christian Rey, LMT, ENS Cachan, CNRS, France Joachim Weickert, Saarland University, Germany Local Chair Alejandro Mottini, Morpheme group I3S-INRIA Sophie Abriet, SATIE, ENS Cachan, CNRS, France Béatrice Bacquet, SATIE, ENS Cachan, CNRS, France Reviewers Gilles Aubert, J-A Dieudonné Laboratory, CNRS and University of Nice-Sophia Antipolis, France Alexandre Baussard, ENSTA Bretagne, Lab-STICC, France Laure Blanc-Féraud, I3S laboratory, CNRS, France Marc Bonnet, ENSTA, ParisTech, France Jerôme Darbon, CMLA, ENS Cachan, CNRS, France Oliver Dorn, School of Mathematics, University of Manchester, UK Gérard Favier, I3S laboratory, CNRS, France Mário Figueiredo, Instituto Superior Técnico, Lisb
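
    As a minimal, self-contained illustration of the inverse-problem setting sketched above (and not an example drawn from the workshop contributions), the following estimates the parameters x of a linear observation model y = A x + noise with a Tikhonov-regularized least-squares inversion.

```python
# Toy regularized inversion: recover x from noisy observations y = A x + noise.
import numpy as np

def tikhonov_inverse(A, y, alpha=1e-2):
    """Solve min_x ||A x - y||^2 + alpha ||x||^2 via the normal equations."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ y)

rng = np.random.default_rng(0)
x_true = np.array([1.0, -2.0, 0.5])
A = rng.normal(size=(50, 3))                   # forward model / sensing matrix
y = A @ x_true + 0.05 * rng.normal(size=50)    # noisy observations
print(tikhonov_inverse(A, y))                  # close to x_true
```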

  3. Software Testing

    OpenAIRE

    Sarbjeet Singh; Sukhvinder Singh; Gurpreet Singh

    2010-01-01

    Software goes through a cycle of development stages: it is envisioned, created, evaluated, fixed and then put to use. To run any software consistently without failures, bugs or errors, the most important step is to test the software. This paper covers various types of software testing (manual and automated), various software testing techniques such as black-box, white-box, gray-box, sanity and functional testing, and software test life cycle models (V-model and W-model). This pape...

  4. 2nd Generation RLV Risk Reduction Definition Program: Pratt & Whitney Propulsion Risk Reduction Requirements Program (TA-3 & TA-4)

    Science.gov (United States)

    Matlock, Steve

    2001-01-01

    This is the final report and addresses all of the work performed on this program. Specifically, it covers vehicle architecture background, definition of six baseline engine cycles, reliability baseline (space shuttle main engine QRAS), and component level reliability/performance/cost for the six baseline cycles, and selection of 3 cycles for further study. This report further addresses technology improvement selection and component level reliability/performance/cost for the three cycles selected for further study, as well as risk reduction plans, and recommendation for future studies.

  5. Research at the CEA in the field of safety in 2nd and 3rd generation light water reactors

    Science.gov (United States)

    Billot, Philippe

    2012-05-01

    The research programs at the CEA in the field of safety in nuclear reactors are carried out in a framework of international partnerships. Their purpose is to develop studies on: The methods allowing for the determination of earthquake hazards and their consequences; The behaviour of fuel in an accident situation; The comprehension of deflagration and detonation phenomena of hydrogen and the search for effective prevention methods involving an explosion risk; The cooling of corium in order to stop its progression in and outside the vessel thereby reducing the risk of perforating the basemat; The behaviour of the different fission product families according to their volatility for the UO2 and MOX fuels.

  6. Impacts of European Biofuel Policies on Agricultural Markets and Environment under Consideration of 2nd Generation Technologies and international Trade

    OpenAIRE

    Becker, A.; Adenäuer, Marcel; Blanco Fonseca, Maria

    2010-01-01

    Even though recent discussions on food prices and indirect land use change point at potential conflicts associated with the production of biofuels, the appraisal of biofuels as an effective instrument to slow down climate change and reduce energy dependency still prevails. The EU Renewable Energy Directive (EUROPEAN COMMISSION, 2009) underlines this trend by setting a target of a 10% share of energy from renewable sources in the transport sector by 2020. As economic competitiveness of biofuel pr...

  7. Licensing safety critical software

    International Nuclear Information System (INIS)

    Licensing difficulties with the shutdown system software at the Darlington Nuclear Generating Station contributed to delays in starting up the station. Even though the station has now been given approval by the Atomic Energy Control Board (AECB) to operate, the software issue has not disappeared - Ontario Hydro has been instructed by the AECB to redesign the software. This article attempts to explain why software based shutdown systems were chosen for Darlington, why there was so much difficulty licensing them, and what the implications are for other safety related software based applications

  8. Improving the performance of E-beam 2nd writing in mask alignment accuracy and pattern faultless for CPL technology

    Science.gov (United States)

    Lee, Booky; Hung, Richard; Lin, Orson; Wu, Yuan-Hsun; Kozuma, Makoto; Shih, Chiang-Lin; Hsu, Michael; Hsu, Stephen D.

    2005-01-01

    Chromeless phase lithography (CPL) is a potential technology for low-k1 optical imaging. With CPL, the local transmission can be controlled to obtain optimized through-pitch imaging performance. CPL uses zebra patterns to manipulate the local pattern transmission as a tri-tone structure in mask manufacturing. A 2nd-level writing step is needed to create the zebra pattern. The zebra pattern must be small enough not to be printed out, and the 2nd-writing overlay accuracy must be kept within 40 nm. This requirement is a challenge for the E-beam 2nd-writing function. The focus of this paper is on how to improve the overlay accuracy and obtain a precise pattern to achieve accurate pattern transmission. To accomplish this, several items were addressed. To check for possible contamination of the E-beam chamber by the conductive layer coating, we monitored the particle count in the E-beam chamber before and after loading and unloading coated blanks. The conductivity of our conductive layer was checked to eliminate the charging effect by optimizing the film thickness. The dimensions of the alignment mark were also optimized through experimentation. Finally, we checked the remaining photoresist (PR) to ensure a sufficient process window in our etching process. To verify the performance of our process we examined 3D SEM images. We also used AIMS to demonstrate the resolution improvement of CPL compared to the traditional methods (binary mask and half-tone mask). The achieved overlay accuracy and process provide a promising approach for NGL reticle manufacturing with CPL technology.

  9. The 2nd to 4th digit ratio (2D:4D) and eating disorder diagnosis in women

    OpenAIRE

    Quinton, Stephanie Jane; Smith, April Rose; Joiner, Thomas

    2011-01-01

    Eating disorders are more common in females than in males and are believed to be caused, in part, by biological and hormonal factors. Digit ratio or 2D:4D (the ratio of the 2nd to the 4th digit) is considered to be a proxy for prenatal testosterone (PT) and prenatal oestrogen (PE) exposure. However, how 2D:4D may be related to type of eating pathology is unknown. The relationship between 2D:4D and eating disorder diagnosis was investigated in recovered and currently eating disordered (n=31) a...

  10. Order and disorder in Ca2ND0.90H0.10 - a structural and thermal study

    International Nuclear Information System (INIS)

    The structure of calcium nitride hydride and its deuterided form has been re-examined at room temperature and studied at high temperature using neutron powder diffraction and thermal analysis. When synthesised at 600 deg. C, a mixture of both ordered and disordered Ca2ND0.90H0.10 phases results. The disordered phase is the minor component and has a primitive rocksalt structure (spacegroup Fm3m) with no ordering of D/N on the anion sites, and the ordered phase is best described using the rhombohedral spacegroup R-3m with D and N arranged in alternate layers in (111) planes. This mixture of ordered and disordered phases exists up to 580 deg. C, at which the loss of deuterium yields Ca2ND0.85 with the disappearance of the disordered phase. In the new ordered phase there exists a similar content of vacancies on both anion sites; to achieve this balance, a little N transfers onto the D site, whereas there is no indication of D transferring onto the N-sites. These observations are thought to indicate that the D/N ordering is difficult to achieve with fully occupied anion sites. It has previously been reported that Ca2ND has an ordered cubic cell with alternating D and N sites in the [100] directions; however, for the samples studied herein, there were clearly two coexisting phases with apparent broadening/splitting of the primitive peaks but not for the ordered peaks. The rhombohedral phase was in fact metrically cubic; however, all the observed peaks were consistent with the rhombohedral unit cell with no peaks requiring the larger ordered cubic unit cell to be utilised. Furthermore this rhombohedral cell displays the same form of N-D ordering as the Sr and Ba analogues, which are metrically rhombohedral. - Graphical abstract: Ca2ND0.90H0.10 forms a mixture of ordered and disordered phases when synthesised at 600 deg. C. The ordered phase disappears at high temperature upon release of structural deuterium/hydrogen, leaving a single, partially disordered phase. Research highlights: • Emergence of a mixture of ordered and disordered phases at the synthesis conditions used. • The disordered phase disappears at high temperature, leaving a partially ordered phase. • Suggestion of ambiguity in the anion ordering scheme.

  11. RECONSTRUCTING THE IDEA OF PRAYER SPACE: A CRITICAL ANALYSIS OF THE TEMPORARY PRAYING PLATFORM PROJECT OF 2ND YEAR ARCHITECTURE STUDENTS IN THE NATIONAL UNIVERSITY OF MALAYSIA (UKM)

    Directory of Open Access Journals (Sweden)

    Nangkula Utaberta

    2013-12-01

    Full Text Available Abstract God created humans as caliphs on this earth. Caliph means leader, caretaker and guardian. Therefore humans have an obligation to maintain, preserve and conserve nature for future generations. Today we see a lot of damage occurring on the earth caused by human behaviour. Islam sees the whole of nature as a place of prayer whose cleanliness and purity must be maintained. Therefore, as Muslims, we need to preserve nature as we keep our places of prayer. The main objective of this paper is to re-question and re-interpret the idea of sustainability in Islamic architecture through a critical analysis of the first project of the 2nd year architecture students of UKM, the "Temporary Praying Platform". The discussion is divided into three (3) main parts. The first part discusses contemporary issues in Islamic architecture, especially in the design of mosques, while the second part expands the framework of sustainability in Islamic architecture. The last part analyses a sample of design submissions by 2nd year students of UKM for the temporary praying platform project. It is expected that this paper can start a further discussion on the inner meaning in Islam and how it is implemented in the design of praying spaces in the future. Keywords: Sustainability, Islamic Architecture, Temporary Praying Platform

  12. Roles of doping ions in afterglow properties of blue CaAl{sub 2}O{sub 4}:Eu{sup 2+},Nd{sup 3+} phosphors

    Energy Technology Data Exchange (ETDEWEB)

    Wako, A.H., E-mail: wakoah@qwa.ufs.ac.za [Department of Physics, University of the Free State, QwaQwa Campus, Private Bag X13, Phuthaditjhaba 9866 (South Africa); Dejene, B.F. [Department of Physics, University of the Free State, QwaQwa Campus, Private Bag X13, Phuthaditjhaba 9866 (South Africa); Swart, H.C. [Department of Physics, University of the Free State, P.O. Box 339, Bloemfontein 9300 (South Africa)

    2014-04-15

    Eu{sup 2+} doped and Nd{sup 3+} co-doped calcium aluminate (CaAl{sub 2}O{sub 4}:Eu{sup 2+},Nd{sup 3+}) phosphor was prepared by a urea-nitrate solution combustion method at furnace temperatures as low as 500 °C. The produced CaAl{sub 2}O{sub 4}:Eu{sup 2+},Nd{sup 3+} powder was investigated in terms of phase composition, morphology and luminescence by X-Ray diffraction (XRD), Scanning Electron Microscope (SEM), Fourier Transform Infra Red spectroscopy (FTIR) and Photoluminescence (PL) techniques respectively. XRD analysis depicts a dominant monoclinic phase that indicates no change in the crystalline structure of the phosphor with varying concentration of Eu{sup 2+} and Nd{sup 3+}. SEM results show agglomerates with non-uniform shapes and sizes with a number of irregular network structures having lots of voids and pores. The Energy Dispersive X-ray Spectroscopy (EDS) and (FTIR) spectra confirm the expected chemical components of the phosphor. PL measurements indicated one broadband excitation spectra from 200 to 300 nm centered around 240 nm corresponding to the crystal field splitting of the Eu{sup 2+} d-orbital and an emission spectrum in the blue region with a maximum on 440 nm. This is a strong indication that there was dominantly one luminescence center, Eu{sup 2+} which represents emission from transitions between the 4f{sup 7} ground state and the 4f{sup 6}–5d{sup 1} excited state configuration. High concentrations of Eu{sup 2+} and Nd{sup 3+} generally reduce both intensity and lifetime of the phosphor powders. The optimized content of Eu{sup 2+} is 1 mol% and for Nd{sup 3+} is 1 mol% for the obtained phosphors with excellent optical properties. The phosphor also emits visible light at around 587 and 616 nm. Such emissions can be ascribed to the {sup 5}D{sub 0}–{sup 7}F{sub 1} and {sup 5}D{sub 0}–{sup 7}F{sub 2} intrinsic transition of Eu{sup 3+} respectively. The decay characteristics exhibit a significant rise in initial intensity with increasing Eu{sup 2+} doping concentration while the decay time increased with Nd{sup 3+} co-doping. The observed afterglow can be ascribed to the generation of suitable traps due to the presence of the Nd{sup 3+} ions.

  13. The MeqTrees software system and its use for third-generation calibration of radio interferometers

    CERN Document Server

    Noordam, Jan E; 10.1051/0004-6361/201015013

    2011-01-01

    The formulation of the radio interferometer measurement equation (RIME) by Hamaker et al. has provided us with an elegant mathematical apparatus for better understanding, simulation and calibration of existing and future instruments. The calibration of the new radio telescopes (LOFAR, SKA) would be unthinkable without the RIME formalism, and new software to exploit it. MeqTrees is designed to implement numerical models such as the RIME, and to solve for arbitrary subsets of their parameters. The technical goal of MeqTrees is to provide a tool for rapid implementation of such models, while offering performance comparable to hand-written code. We are also pursuing the wider goal of increasing the rate of evolution of radio astronomical software, by offering a tool for rapid experimentation and exchange of ideas. MeqTrees is implemented as a Python-based front-end called the meqbrowser, and an efficient (C++-based) computational back-end called the meqserver. Numerical models are defined on the front-end via a P...
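
    As a toy illustration of the 2x2 RIME that MeqTrees implements numerically (not MeqTrees code itself), the visibility on a baseline (p, q) can be modelled as V_pq = J_p B J_q^H, where B is the source brightness (coherency) matrix and J_p, J_q are per-antenna Jones matrices. The gain values below are invented, and the factor of 1/2 in B follows one common convention for an unpolarised source.

```python
# Hand-written toy RIME evaluation for a single baseline.
import numpy as np

def predict_visibility(B, J_p, J_q):
    """Apply the RIME for one baseline: V_pq = J_p @ B @ J_q^H."""
    return J_p @ B @ J_q.conj().T

# Unpolarised 1 Jy point source (one common convention: B = I/2 on the diagonal).
I_flux = 1.0
B = 0.5 * np.array([[I_flux, 0.0], [0.0, I_flux]], dtype=complex)

# Diagonal complex gain Jones matrices for the two antennas (hypothetical values).
J_p = np.diag([1.02 * np.exp(1j * 0.10), 0.98 * np.exp(-1j * 0.05)])
J_q = np.diag([0.97 * np.exp(1j * 0.02), 1.01 * np.exp(1j * 0.08)])

print(predict_visibility(B, J_p, J_q))
```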

  14. Early prediction for necessity of 2nd I-131 ablation therapy with serum thyroglobulin levels in patients with differentiated thyroid cancer

    International Nuclear Information System (INIS)

    The aim of our study was to evaluate the predictive value of serum thyroglobulin levels, measured preoperatively and just before the 1st I-131 ablation therapy with high serum TSH, for the necessity of a 2nd I-131 ablation therapy in differentiated thyroid cancer (DTC) patients. 111 patients with DTC who underwent total or near-total thyroidectomy followed by immediate I-131 ablation therapy were enrolled in this study. TSH, Tg and anti-Tg autoantibody were measured before thyroidectomy (TSHpreop, Tgpreop and Anti-Tgpreop) and just before the 1st I-131 ablation therapy (TSHabl, Tgabl and Anti-Tgabl). All TSHabl levels were above 30 mU/liter. ΔTg [(Tgpreop - Tgabl) x 100 / Tgpreop] was calculated. 29 patients (26.1%, 29/111) had to receive a 2nd I-131 ablation therapy. Of the 70 patients whose Tgabl was under 10 ng/ml, only 11 patients received a 2nd I-131 ablation therapy (15.7%). Patients with Tgabl greater than or equal to 10 ng/ml received a 2nd I-131 ablation therapy more often (18/41, 43.9%) than patients with a lower Tgabl level. There was a disparity in the necessity of a 2nd I-131 ablation therapy between the two groups (Tgabl < 10 ng/ml and Tgabl >= 10 ng/ml; two-by-two chi-square test, p=0.0016). Of the 41 patients with Tgabl greater than or equal to 10 ng/ml, 19 patients showed increased Tg levels (ΔTg < 0). Patients with negative ΔTg and Tgabl greater than or equal to 10 ng/ml showed a strikingly high necessity of a 2nd I-131 ablation therapy (11/19, 57.9%). There was also a significant disparity in the necessity of a 2nd I-131 ablation therapy between the two groups (ΔTg < 0 and Tgabl >= 10 ng/ml versus the others; two-by-two chi-square test, p=0.0012). These results suggest that a high Tgabl level just before the 1st I-131 ablation therapy can forecast the necessity of a 2nd I-131 ablation therapy. Moreover, the difference in Tg level between the preoperative status and just before the 1st I-131 ablation therapy could also suggest the necessity of a 2nd I-131 ablation therapy early in the surveillance of DTC patients
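
    The quantities used above reduce to simple arithmetic; the following small helper (illustrative only, not from the study) computes ΔTg = (Tgpreop - Tgabl) x 100 / Tgpreop and flags the high-risk combination reported in the abstract (Tgabl >= 10 ng/ml with ΔTg < 0).

```python
# Illustrative calculation of the relative thyroglobulin change and risk grouping.
def delta_tg(tg_preop, tg_abl):
    """Relative change (%) between preoperative Tg and Tg just before 1st ablation."""
    return (tg_preop - tg_abl) * 100.0 / tg_preop

def high_risk_for_second_ablation(tg_preop, tg_abl):
    """High-risk group per the abstract: Tgabl >= 10 ng/ml and a Tg increase (delta < 0)."""
    return tg_abl >= 10.0 and delta_tg(tg_preop, tg_abl) < 0

# Example: Tg rose from 8 ng/ml preoperatively to 15 ng/ml before ablation.
print(delta_tg(8.0, 15.0))                       # -87.5 (negative means an increase)
print(high_risk_for_second_ablation(8.0, 15.0))  # True
```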

  15. Determination of Conceptions of Secondary 10th Grade Students About Torque, Angular Momentum and Kepler’s 2nd Law

    Directory of Open Access Journals (Sweden)

    Ayberk Bostan Sarıoğlan

    2013-06-01

    Full Text Available Students' prior knowledge can be compatible with scientific facts, or it can act as an obstacle to learning when it differs from them. It is therefore important to reveal students' prior knowledge and to determine which of their ideas are not compatible with scientific information. This study aimed to reveal the prior knowledge of 133 tenth-grade students about torque, conservation of angular momentum and Kepler's 2nd law. The students in the sample had not yet received formal instruction on these concepts, so their ideas may be naive. They were asked three open-ended questions whose validity and reliability had been established. While students gave answers consistent with the scientific facts about the torque concept, none of the answers about the conservation of angular momentum and Kepler's 2nd law fell into the scientific-answer category; for both of these concepts, alternative (non-scientific) answers were the most common type encountered. In teaching these concepts and this law, students' prior knowledge that conflicts with scientific ideas should be taken into account, that is, teaching should be organized to promote conceptual change.

  16. 3D Modeling of Headstones of the 2nd and 3rd Century by Low Cost Photogrammetric Techniques

    Science.gov (United States)

    Landes, T.; Waton, M.-D.; Alby, E.; Gourvez, S.; Lopes, B.

    2013-07-01

    After a dozen headstones were discovered during excavations in southern Alsace, archaeologists stored them in the Regional Directorate of Cultural Affairs in Strasbourg. To complement the manual surveys they usually carry out on such stelae, they asked INSA Strasbourg to reconstruct at least the 7 figured sandstone headstones in 3D. The high accuracy required by the archaeologists can be reached with an expensive laser scanning technique. The aim of the current work is to look for an alternative method and, if appropriate, low-cost software providing similar quality and a sufficient level of detail. The 3D reconstruction of the headstones based exclusively on multi-image processing is presented. The point cloud generation step is detailed because it determines the quality of the final product; the produced point cloud was therefore assessed by comparison with a reference point cloud obtained by laser scanning. The steps leading to the photo-realistic textured 3D models of the headstones are presented and the software used for them is evaluated. The final product meets the 1 mm accuracy requirement set by the archaeologists.

  17. Software Engineering

    Science.gov (United States)

    Dr Gene Tagliarini

    CSC 450. Software Engineering (3) Prerequisite: CSC 332 and senior standing. Study of the design and production of large and small software systems. Topics include systems engineering, software life-cycle and characterization; use of software tools. Substantial software project required.

  18. FBG application in bridge health monitoring system of Wuhan Yangtze River 2nd Bridge

    Science.gov (United States)

    Liu, Jun

    2009-10-01

    Because the stability, durability and monitoring scope of traditional resistance strain sensors cannot satisfy the requirements of a bridge monitoring system, advanced fiber Bragg grating (FBG) sensors and the associated technology were adopted to build the bridge health monitoring system. The application scope and purpose of the different kinds of FBG sensors used are analysed, including stress-strain sensors, temperature sensors, crack sensors and force-ring sensors for load testing. For the key construction sections identified in the project design, the specific installation methods and construction procedures on the Wuhan Yangtze River 2nd Bridge are described. Large numbers of FBG sensors placed at selected locations form a distributed fiber sensing network for the bridge; care must be taken to link no more than twenty sensors to one fiber in order to preserve demodulator precision. The construction of the data acquisition system and its functions via the sensors and their demodulator is discussed. Each fiber of linked sensors is connected to one channel of the demodulator, sixteen channels in total; the demodulator is connected to a switch through an RJ45 interface and communicates with the acquisition server. The data acquisition software and its database were designed using the embedded database SQLite to store configuration information and to receive the data through the TCP/IP protocol. A bridge health monitoring system based on fiber Bragg grating technology has thus been built.
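
    As an illustration only of the acquisition-server pattern described above (demodulator readings received over TCP/IP and stored in an embedded SQLite database), here is a minimal sketch; the port number, message format and table layout are assumptions, not details from the Wuhan bridge system.

      import socket
      import sqlite3

      # hypothetical line-oriented message format: "<channel>,<wavelength_nm>,<timestamp>\n"
      db = sqlite3.connect("fbg_monitoring.db")
      db.execute("CREATE TABLE IF NOT EXISTS readings (channel INTEGER, wavelength_nm REAL, ts TEXT)")

      server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
      server.bind(("0.0.0.0", 5000))   # port is an assumption
      server.listen(1)
      conn, _ = server.accept()        # wait for the demodulator to connect

      buffer = b""
      while True:
          chunk = conn.recv(4096)
          if not chunk:
              break
          buffer += chunk
          while b"\n" in buffer:
              line, buffer = buffer.split(b"\n", 1)
              channel, wavelength, ts = line.decode().strip().split(",")
              db.execute("INSERT INTO readings VALUES (?, ?, ?)",
                         (int(channel), float(wavelength), ts))
              db.commit()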

  19. User Manual for Beta Version of TURBO-GRD: A Software System for Interactive Two-Dimensional Boundary/ Field Grid Generation, Modification, and Refinement

    Science.gov (United States)

    Choo, Yung K.; Slater, John W.; Henderson, Todd L.; Bidwell, Colin S.; Braun, Donald C.; Chung, Joongkee

    1998-01-01

    TURBO-GRD is a software system for interactive two-dimensional boundary/field grid generation, modification, and refinement. Its features allow users to explicitly control grid quality locally and globally. The grid control can be achieved interactively by using control points that the user picks and moves on the workstation monitor or by direct stretching and refining. The techniques used in the code are the control point form of algebraic grid generation, a damped cubic spline for edge meshing, and parametric mapping between physical and computational domains. It also performs elliptic grid smoothing and free-form boundary control for boundary geometry manipulation. Internal block boundaries are constructed and shaped by using Bezier curves. Because TURBO-GRD is a highly interactive code, users can read in an initial solution, display its solution contour in the background of the grid and control net, and exercise grid modification using the solution contour as a guide. This process can be called an interactive solution-adaptive grid generation.
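
    TURBO-GRD's control-point form is specific to the tool, but the general idea of algebraic grid generation can be illustrated with a plain transfinite interpolation (TFI) sketch that maps a computational (s, t) square onto a region bounded by four curves. This is a generic textbook scheme offered for orientation, not TURBO-GRD's algorithm.

      import numpy as np

      def tfi_grid(bottom, top, left, right, ni=21, nj=11):
          """Bilinear transfinite interpolation from four boundary curves.

          Each boundary argument is a callable returning an (x, y) point for a
          parameter in [0, 1]; corners are assumed consistent where curves meet.
          """
          grid = np.zeros((ni, nj, 2))
          for i, s in enumerate(np.linspace(0.0, 1.0, ni)):
              for j, t in enumerate(np.linspace(0.0, 1.0, nj)):
                  b, tp = np.array(bottom(s)), np.array(top(s))
                  lf, rt = np.array(left(t)), np.array(right(t))
                  corners = ((1 - s) * (1 - t) * np.array(bottom(0.0))
                             + s * (1 - t) * np.array(bottom(1.0))
                             + (1 - s) * t * np.array(top(0.0))
                             + s * t * np.array(top(1.0)))
                  # standard TFI blend: edges in, corner contribution out
                  grid[i, j] = (1 - t) * b + t * tp + (1 - s) * lf + s * rt - corners
          return grid

      # example: unit square with a bulged top boundary
      grid = tfi_grid(bottom=lambda s: (s, 0.0),
                      top=lambda s: (s, 1.0 + 0.2 * np.sin(np.pi * s)),
                      left=lambda t: (0.0, t),
                      right=lambda t: (1.0, t))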

  20. GENERATION OF A VECTOR OF NODAL FORCES PRODUCED BY LOADS PRE-SET BY THE ARBITRARY SCULPTED SURFACE DESIGNATED FOR UNIVERSAL STRESS ANALYSIS SOFTWARE

    Directory of Open Access Journals (Sweden)

    Shaposhnikov Nikolay Nikolaevich

    2012-10-01

    A user may select the surface accommodating any simulated arbitrary load; in addition, a point of the pre-set load intensity, specified in the Distributed Load Q field of the Distributed Loads interface window, and a point of zero load intensity are to be specified. These source data are used to calculate the scale coefficient for the transition from linear distances to the real value of the load intensity generated within the coordinate surface. The point of zero load intensity defines a virtual plane of zero distributed load values. The proposed software for converting arbitrary distributed loads into nodal loads is compact; it may therefore be integrated into modules capable of exporting the nodal loads into other strength analysis systems, while functioning as a problem-oriented geometrical utility of AutoCAD.
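
    Reduced to its simplest form, the conversion described above amounts to evaluating a linearly varying intensity at each node and lumping it over the node's tributary area. The sketch below shows that generic idea, assuming a planar intensity field defined by a zero-intensity point, a peak-intensity point and a peak value Q; it is not the paper's AutoCAD-integrated implementation.

      import numpy as np

      def nodal_forces(nodes, tributary_areas, zero_point, peak_point, q_peak):
          """Lump a linearly varying distributed load q onto nodes.

          nodes: (n, 3) array of node coordinates
          tributary_areas: (n,) array of areas associated with each node
          zero_point, peak_point: points where the intensity is 0 and q_peak
          q_peak: load intensity at peak_point (force per unit area)
          """
          nodes = np.asarray(nodes, dtype=float)
          direction = np.asarray(peak_point, dtype=float) - np.asarray(zero_point, dtype=float)
          scale = q_peak / np.dot(direction, direction)   # chosen so intensity equals q_peak at peak_point
          # linearly varying intensity at each node, then lumped over its tributary area
          q_at_nodes = scale * (nodes - np.asarray(zero_point, dtype=float)) @ direction
          return q_at_nodes * np.asarray(tributary_areas, dtype=float)

      # example: three nodes on a 1 m strip, peak intensity 5 kN/m^2 at x = 1
      forces = nodal_forces(nodes=[(0, 0, 0), (0.5, 0, 0), (1, 0, 0)],
                            tributary_areas=[0.25, 0.5, 0.25],
                            zero_point=(0, 0, 0), peak_point=(1, 0, 0), q_peak=5.0)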

  1. Promoting concrete algorithm for implementation in computer system and data movement in terms of software reuse to generate actual values suitable for different access

    Directory of Open Access Journals (Sweden)

    Nderim Zeqiri

    2013-04-01

    Full Text Available The construction of well-structured, functional algorithms opens new routes and at the same time increases the possibility of using them in mechatronic systems, with specific reliability requirements for any practical implementation; it is also justified in the economic context and in terms of maintenance, since it makes the system more stable. This flexibility enables a new approach: it makes the program code easy to update, and in many cases a quick access method is needed, specified here in the context of generating appropriate values for digital systems. This opens new space for better management of the respective values of a program code and for software reuse, because such a solution reduces costs and has a positive effect in terms of a digital economy.

  2. MYOB software for dummies

    CERN Document Server

    Curtis, Veechi

    2012-01-01

    Your complete guide to MYOB® AccountRight software. Now in its seventh edition, MYOB® Software For Dummies walks you through everything you need to know, from starting your MYOB® file from scratch and recording payments and receipts, to tracking profit and analysing sales. This new edition includes all the information you need on the new generation of MYOB® AccountRight software, including the new cloud computing features. Set up MYOB® software - understand how to make it work the first time. Keep track of purchases and sales - monitor customer accounts and ensure you get pai

  3. Fractally Generated Microstrip Bandpass Filter Designs Based on Dual-Mode Square Ring Resonator for Wireless Communication Systems

    Directory of Open Access Journals (Sweden)

    Jawad K. Ali

    2008-01-01

    Full Text Available A novel fractal design scheme is introduced in this paper to generate miniaturized microstrip bandpass filter designs for wireless applications. The presented fractal scheme is based on a Minkowski-like prefractal geometry. The space-filling property and self-similarity of this geometry have been found to produce reduced-size symmetrical structures corresponding to the successive iteration levels. The resulting filter designs have sizes suitable for use in modern wireless communication systems. The performance of each of the generated bandpass filter structures up to the 2nd iteration has been analyzed using the method of moments (MoM) based software IE3D, which is widely adopted in microwave research and industry. Results show that these filters possess good transmission and return loss characteristics, in addition to miniaturized sizes, meeting the design specifications of most wireless communication systems.
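
    A Minkowski-like prefractal is commonly built by recursively replacing each straight segment of the square ring with an indented generator shape. The sketch below is a generic segment-replacement routine of that kind; the indentation ratio and iteration depth are illustrative assumptions, since the paper's exact generator dimensions are not reproduced here.

      import numpy as np

      def minkowski_iterate(points, depth_ratio=0.25, n_iter=2):
          """Apply a Minkowski-like generator to each segment of a closed polyline.

          Each segment is split into thirds and the middle third is displaced
          perpendicular to the segment by depth_ratio * segment length, producing
          the space-filling indentations used to shrink ring-resonator perimeters.
          """
          pts = [np.asarray(p, dtype=float) for p in points]
          for _ in range(n_iter):
              new_pts = []
              for a, b in zip(pts, pts[1:] + pts[:1]):   # closed loop
                  seg = b - a
                  normal = np.array([-seg[1], seg[0]]) / np.linalg.norm(seg)
                  d = depth_ratio * np.linalg.norm(seg)
                  p1 = a + seg / 3.0
                  p2 = p1 + normal * d
                  p3 = a + 2.0 * seg / 3.0 + normal * d
                  p4 = a + 2.0 * seg / 3.0
                  new_pts.extend([a, p1, p2, p3, p4])
              pts = new_pts
          return np.array(pts)

      # example: start from a unit square ring and apply two iterations
      ring = minkowski_iterate([(0, 0), (1, 0), (1, 1), (0, 1)], n_iter=2)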

  4. [Near-infrared fluorescence and energy transfer in Sr2SiO4 : Eu2+, Nd3+].

    Science.gov (United States)

    Yang, Li-Li; Wan, Wen-Jiao; Li, Jin-Qing; Meng, Jian-Xin

    2011-01-01

    Sr2SiO4:Eu2+,Nd3+ was synthesized by a solid-state synthesis method, and the sensitization of Nd3+ near-infrared luminescence by Eu2+ was investigated. The characteristic near-infrared luminescence of Nd3+ in the Sr2SiO4 matrix was greatly enhanced by co-doping with Eu2+. The mechanism of energy transfer from Eu2+ to Nd3+ was analyzed by investigating the fluorescence excitation and emission spectra and the fluorescence lifetime. Excited Eu2+(II) transfers energy effectively to Nd3+ through a non-radiative energy transfer process in Sr2SiO4, resulting in greatly enhanced near-infrared luminescence of Nd3+, while sensitization of Nd3+ by Eu2+(I) was achieved by way of Eu2+(II). PMID:21428052

  5. Anatomy of a 2nd-order unconformity: stratigraphy and facies of the Bakken formation during basin realignment

    Energy Technology Data Exchange (ETDEWEB)

    Skinner, Orion; Canter, Lyn; Sonnenfeld, Mark; Williams, Mark [Whiting Oil and Gas Corp., Denver, CO (United States)

    2011-07-01

    Because classic Laramide compressional structures are relatively rare, the Williston Basin is often considered structurally simple, but because of the presence of numerous sub-basins, simplistic lithofacies generalization is impossible, and detailed facies mapping is necessary to unravel Middle Bakken paleogeography. The unconformity above the Devonian Three Forks is explained by the infilling and destruction of the Devonian Elk Point basin; it prepares the Bakken system and introduces a Mississippian Williston Basin with a very different configuration. Black shales are too often considered deposits that can only be found in deep water, but a very different conclusion must be drawn from a review of the stratigraphic geometry and facies successions. The whole Bakken is a 2nd-order lowstand to transgressive systems tract lying below the basal Lodgepole, which represents an interval of maximal flooding. This lowstand to transgressive stratigraphic context explains why the sedimentary processes and provenance show high areal variability.

  6. Study on microstructure and properties of extruded Mg-2Nd-0.2Zn alloy as potential biodegradable implant material.

    Science.gov (United States)

    Li, Junlei; Tan, Lili; Wan, Peng; Yu, Xiaoming; Yang, Ke

    2015-04-01

    Mg-2Nd-0.2Zn (NZ20) alloy was prepared for application as a biodegradable implant material in this study. The effects of the extrusion process on the microstructure, mechanical and corrosion properties of the alloy were investigated. The as-cast alloy was composed of an α-Mg matrix and Mg12Nd eutectic compound. The solution treatment led to Mg12Nd phase dissolution and grain coarsening. The alloy (E1) preheated at 380°C for 1 h and extruded at 390°C presents fine grains with numerous tiny Mg12Nd particles uniformly dispersed along the boundaries and in the interior of the grains. The alloy (E2) preheated at 480°C for 1 h and extruded at 500°C exhibits relatively larger grains with few nano-scale Mg12Nd phase particles dispersed. The alloy E1, compared with E2, showed a relatively lower corrosion rate, higher yield strength and slightly lower elongation. PMID:25686968

  7. Use of 2nd and 3rd Level Correlation Analysis for Studying Degradation in Polycrystalline Thin-Film Solar Cells

    Energy Technology Data Exchange (ETDEWEB)

    Albin, D. S.; del Cueto, J. A.; Demtsu, S. H.; Bansal, S.

    2011-03-01

    The correlation of stress-induced changes in the performance of laboratory-made CdTe solar cells with various 2nd and 3rd level metrics is discussed. The overall behavior of aggregated data showing how cell efficiency changes as a function of open-circuit voltage (Voc), short-circuit current density (Jsc), and fill factor (FF) is explained using a two-diode, PSpice model in which degradation is simulated by systematically changing model parameters. FF shows the highest correlation with performance during stress, and is subsequently shown to be most affected by shunt resistance, recombination and in some cases voltage-dependent collection. Large decreases in Jsc as well as increasing rates of Voc degradation are related to voltage-dependent collection effects and catastrophic shunting respectively. Large decreases in Voc in the absence of catastrophic shunting are attributed to increased recombination. The relevance of capacitance-derived data correlated with both Voc and FF is discussed.
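
    The two-diode description mentioned above is a standard equivalent-circuit model. As a rough illustration of how degradation trends can be mimicked by sweeping circuit parameters, the sketch below solves a single-diode J-V characteristic (a simpler cousin of the two-diode model) and reports Voc, Jsc and FF; the parameter values are purely illustrative and are not taken from the paper.

      import numpy as np
      from scipy.optimize import brentq

      def current(v, jl=0.025, j0=5e-11, n=1.5, rs=2.0, rsh=1000.0, t=300.0):
          """Implicit single-diode model; J in A/cm^2, V in volts, rs/rsh in ohm*cm^2."""
          vt = 8.617e-5 * t   # thermal voltage kT/q in volts
          f = lambda j: jl - j0 * (np.exp((v + j * rs) / (n * vt)) - 1.0) - (v + j * rs) / rsh - j
          return brentq(f, -1.0, 1.0)

      volts = np.linspace(0.0, 0.9, 200)
      amps = np.array([current(v) for v in volts])
      jsc = current(0.0)
      voc = brentq(lambda v: current(v), 0.0, 0.9)
      ff = (volts * amps).max() / (voc * jsc)
      print(f"Jsc = {jsc * 1e3:.1f} mA/cm^2, Voc = {voc:.3f} V, FF = {ff:.2f}")

    Lowering the shunt resistance rsh or raising j0 in such a model reproduces the FF-dominated degradation behaviour discussed in the abstract.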

  8. Effect of the nanocrystalline structure type on the optical properties of TiO2:Nd (1 at.%) thin films

    Science.gov (United States)

    Mazur, Michal; Wojcieszak, Damian; Kaczmarek, Danuta; Domaradzki, Jaroslaw; Zatryb, Grzegorz; Misiewicz, Jan; Morgiel, Jerzy

    2015-04-01

    Titanium dioxide thin films, each doped with the same amount of neodymium (1 at.%), were deposited by Low Pressure Hot Target Reactive Sputtering and High Energy Reactive Magnetron Sputtering processes in order to obtain anatase and rutile thin film structures, respectively. The microstructure and phase composition were analyzed using transmission electron microscopy, including high resolution electron microscopy imaging. The measurements of the optical properties showed that both prepared thin films were transparent in the visible light range and had a low extinction coefficient of ca. 3 × 10^-3. The thin film with the anatase structure had a lower cut-off wavelength and refractive index and a higher value of the optical energy band gap as compared to the TiO2:Nd coating with the rutile structure. Simultaneously, more efficient photoluminescence emission was observed for the rutile thin films.

  9. Testing the 2nd degree local polynomial approximation method using the calculations of fast power reactor two-dimensional models

    International Nuclear Information System (INIS)

    A 2nd order local polynomial approximation method (LPA2) is compared with the method that is at present used by UJV as a standard tool for fast breeder reactor neutron calculations. The comparison of the results of two-dimensional cylindrical fast reactor benchmark calculations by both methods leads to the following conclusions: a) from the point of view of computational accuracy, the LPA2 and standard methods are equivalent; b) from the point of view of machine time consumption, the LPA2 method is clearly superior. In actual situations the LPA2 method is 2.5 to 5 times faster for few-group (approximately 4) and 10 to 20 times faster for multi-group (approximately 26) calculations than the standard method. (author)

  10. A summary of the 2nd workshop on Human Resources Development (HRD) in the nuclear field in Asia. FY2000

    International Nuclear Information System (INIS)

    The Human Resources Development (HRD) Project was added in 1999 as a Cooperation Activity of 'the Forum for Nuclear Cooperation in Asia (FNCA)', which is organized by the Nuclear Committee. The HRD Project helps to solidify the foundation of nuclear development and utilization in Asia by promoting human resources development in Asian countries. The principal activity of the HRD Project is to hold the Workshop on Human Resources Development in the Nuclear Field in Asia once a year. The objective of the Workshop is to clarify the problems and needs of human resources development in each country and to provide mutual support through the exchange of information. The report consists of a summary of the 2nd Workshop on Human Resources Development in the Nuclear Field in Asia held on November 27 and 28, 2000 at the Tokai Research Establishment of JAERI. (author)

  11. Methods for designing algorithms of electric equipment protection in nuclear power plants at the 2nd level of protection

    International Nuclear Information System (INIS)

    Questions of equipment protection systems in nuclear power plants, as part of the automated process control systems of these plants, are discussed. The electrical equipment protection system is discussed with respect to protection algorithms at the 2nd protection level. Three methods can be applied in designing the algorithms or in investigating the operating condition of the equipment in a two-value state space. The characteristics, benefits and constraints of the methods are analyzed in detail. The methods include the decision table method, the failure weight method and the failure tree method. A comparison of the methods shows that, for the above purpose, the decision table method and the failure weight method are most suitable. The decision table method is also most suitable for designing the power part of the protection algorithm. (Z.M.). 3 figs., 1 tab., 2 refs

  12. A summary of the 2nd workshop on Human Resources Development (HRD) in the nuclear field in Asia. FY2000

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-06-01

    The Human Resources Development (HRD) Project was added in 1999 as a Cooperation Activity of 'the Forum for Nuclear Cooperation in Asia (FNCA)', which is organized by the Nuclear Committee. The HRD Project helps to solidify the foundation of nuclear development and utilization in Asia by promoting human resources development in Asian countries. The principal activity of the HRD Project is to hold the Workshop on Human Resources Development in the Nuclear Field in Asia once a year. The objective of the Workshop is to clarify the problems and needs of human resources development in each country and to provide mutual support through the exchange of information. The report consists of a summary of the 2nd Workshop on Human Resources Development in the Nuclear Field in Asia held on November 27 and 28, 2000 at the Tokai Research Establishment of JAERI. (author)

  13. Software Engineering Program: Software Process Improvement Guidebook

    Science.gov (United States)

    1996-01-01

    The purpose of this document is to provide experience-based guidance in implementing a software process improvement program in any NASA software development or maintenance community. This guidebook details how to define, operate, and implement a working software process improvement program. It describes the concept of the software process improvement program and its basic organizational components. It then describes the structure, organization, and operation of the software process improvement program, illustrating all these concepts with specific NASA examples. The information presented in the document is derived from the experiences of several NASA software organizations, including the SEL, the SEAL, and the SORCE. Their experiences reflect many of the elements of software process improvement within NASA. This guidebook presents lessons learned in a form usable by anyone considering establishing a software process improvement program within his or her own environment. This guidebook attempts to balance general and detailed information. It provides material general enough to be usable by NASA organizations whose characteristics do not directly match those of the sources of the information and models presented herein. It also keeps the ideas sufficiently close to the sources of the practical experiences that have generated the models and information.

  14. Laparoscopic hepatectomy is theoretically better than open hepatectomy: preparing for the 2nd International Consensus Conference on Laparoscopic Liver Resection.

    Science.gov (United States)

    Wakabayashi, Go; Cherqui, Daniel; Geller, David A; Han, Ho-Seong; Kaneko, Hironori; Buell, Joseph F

    2014-10-01

    Six years have passed since the first International Consensus Conference on Laparoscopic Liver Resection was held. This comparatively new surgical technique has evolved since then and is rapidly being adopted worldwide. We compared the theoretical differences between open and laparoscopic liver resection, using right hepatectomy as an example. We also searched the Cochrane Library using the keyword "laparoscopic liver resection." The papers retrieved through the search were reviewed, categorized, and applied to the clinical questions that will be discussed at the 2nd Consensus Conference. The laparoscopic hepatectomy procedure is more difficult to master than the open hepatectomy procedure because of the movement restrictions imposed upon us when we operate from outside the body cavity. However, good visibility of the operative field around the liver, which is located beneath the costal arch, and the magnifying effect provide for neat transection of the hepatic parenchyma. Another theoretical advantage is that pneumoperitoneum pressure reduces hemorrhage from the hepatic vein. The literature search turned up 67 papers, 23 of which we excluded, leaving only 44. Two randomized controlled trials (RCTs) are underway, but their results are yet to be published. Most of the studies (n = 15) concerned short-term results, with some addressing long-term results (n = 7), cost (n = 6), energy devices (n = 4), and so on. Laparoscopic hepatectomy is theoretically superior to open hepatectomy in terms of good visibility of the operative field due to the magnifying effect and reduced hemorrhage from the hepatic vein due to pneumoperitoneum pressure. However, there is as yet no evidence from previous studies to back this up in terms of short-term and long-term results. The 2nd International Consensus Conference on Laparoscopic Liver Resection will arrive at a consensus on the basis of the best available evidence, with video presentations focusing on surgical techniques and the publication of guidelines for the standardization of procedures based on the experience of experts. PMID:25130985

  15. [Optimization of the Barron ligature treatment of 2nd and 3rd-degree hemorrhoids using a therapeutic troika].

    Science.gov (United States)

    Förster, C F; Süssmann, H E; Patzelt-Wenczler, R

    1996-11-12

    Due to the fact that the intensity of haemorrhoidal complaints may change rapidly, numerous therapeutic approaches of minor effectiveness are also considered helpful remedies. However, the advantage of the Barron ligature is not seriously doubted. By placing it correctly at the insensitive distal rectum, haemorrhoidal operations are only necessary in very advanced stages. Can the Barron ligature be optimized even more? Three patient groups, comprising 120 patients with 2nd degree haemorrhoids who were simultaneously treated by anal dilation using an appropriate lubricant for the anal dilator, were compared with each other in a randomized, open, placebo-controlled study conducted in two centres. Treatment in these groups consisted of: (1) rubber-band ligature alone; (2) rubber-band ligature plus anal dilator and Kamillosan ointment; (3) rubber-band ligature plus anal dilator and vaseline. The observation period comprised six weeks, with a check every two weeks. Assessment criteria were: light-red haemorrhage, itching, oozing, sensation of incomplete evacuation, nodal prolapse and slight staining after defecation. The pressure ratios of the closing apparatus were investigated at the beginning and end of the study. The group treated with rubber-band ligature, anal dilator and Kamillosan ointment showed the best results. By simultaneously applying the rubber-band ligature, anal dilator and Kamillosan ointment as a lubricant, significantly better results could be obtained. The findings are based on a former retrospective study carried out in 500 patients with 2nd degree haemorrhoids, in which the application of the anal dilator and Kamillosan ointment significantly reduced the number of treatments from 5.95 to 4.2 and the number of necessary ligatures from 3.8 to 2.76, which was favourable also from the economic point of view. PMID:8984570

  16. GENII (Generation II): The Hanford Environmental Radiation Dosimetry Software System: Volume 3, Code maintenance manual: Hanford Environmental Dosimetry Upgrade Project

    Energy Technology Data Exchange (ETDEWEB)

    Napier, B.A.; Peloquin, R.A.; Strenge, D.L.; Ramsdell, J.V.

    1988-09-01

    The Hanford Environmental Dosimetry Upgrade Project was undertaken to incorporate the internal dosimetry models recommended by the International Commission on Radiological Protection (ICRP) in updated versions of the environmental pathway analysis models used at Hanford. The resulting second generation of Hanford environmental dosimetry computer codes is compiled in the Hanford Environmental Dosimetry System (Generation II, or GENII). This coupled system of computer codes is intended for analysis of environmental contamination resulting from acute or chronic releases to, or initial contamination of, air, water, or soil, on through the calculation of radiation doses to individuals or populations. GENII is described in three volumes of documentation. This volume is a Code Maintenance Manual for the serious user, including code logic diagrams, global dictionary, worksheets to assist with hand calculations, and listings of the code and its associated data libraries. The first volume describes the theoretical considerations of the system. The second volume is a Users' Manual, providing code structure, users' instructions, required system configurations, and QA-related topics. 7 figs., 5 tabs.

  17. Operation of Finnish nuclear power plants. Quarterly report, 2nd quarter 1996

    International Nuclear Information System (INIS)

    Quarterly Reports on the operation of Finnish nuclear power plants describe events and observations relating to nuclear and radiation safety which the Finnish Centre for Radiation and Nuclear Safety (STUK) considers safety significant. Safety improvements at the plants are also described. The report also includes a summary of the radiation safety of plant personnel and of the environment and tabulated data on the plants' production and load factors. In the second quarter of 1996, the Finnish nuclear power plant units were in power operation except for the annual maintenance outages of TVO plant units and the Midsummer shutdown at TVO II which was due to low electricity demand, a turbine generator inspection and repairs. The load factor average of all plant units was 88.9 %. Events in the second quarter of 1996 were classified level 0 on the International Nuclear Event Scale (INES)

  18. Development of a radioactive waste treatment equipment utilizing microwave heating, 2nd report

    International Nuclear Information System (INIS)

    The objective of the present study is to establish an incineration technique utilizing microwave heating which enables a high volume reduction of the spent ion-exchange resins and filtering media generated at nuclear facilities. The three years from 1982 to 1985, with financial aid from the Agency of Science and Technology, brought great and rapid progress to this project when the heating technique was switched from direct microwave heating to indirect heating employing a bed of silicon carbide beads. This material was also used to build a secondary furnace, its walls and roster bars, to treat the noxious gases and soot arising in the primary incineration process by the radiating heat of this material, heated to above 1000 deg C again by microwave energy rather than by the originally applied direct plasma torch combustion. The incinerator and the secondary furnace were integrated into one unit as the principal treating equipment. This novel approach made possible a well-stabilized continuous incineration operation. Further developmental efforts toward industrial application were made by setting up a pilot plant with microwave generators, 2 sets of 5 kW at 2450 MHz and 1 set of 25 kW at 915 MHz, and tests were carried out that proved a remarkably high volume reduction capability, well above roughly 200 on a weight basis. For hot test runs, a one-tenth scale pilot test setup was installed at the Tokai Laboratory of the Japan Atomic Energy Research Institute and tested with materials spiked with radioisotopes and also with spent ion-exchange resins stored there. Very satisfactory results were obtained in these proving tests, demonstrating the capability of efficient high-volume-reduction treatment of otherwise stable radioactive waste materials such as spent ion-exchange resins. (author)

  19. Multi-atom resonant photoemission and the development of next-generation software and high-speed detectors for electron spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Kay, Alexander William

    2000-09-01

    This dissertation has involved the exploration of a new effect in photoelectron emission, multi-atom resonant photoemission (MARPE), as well as the development of new software, data analysis techniques, and detectors of general use in such research. We present experimental and theoretical results related to MARPE, in which the photoelectron intensity from a core level on one atom is influenced by a core-level absorption resonance on another. We point out that some of our and others' prior experimental data has been strongly influenced by detector non-linearity and that the effects seen in new corrected data are smaller and of different form. Corrected data for the MnO(001) system with resonance between the O 1s and Mn 2p energy levels are found to be well described by an extension of well-known intraatomic resonant photoemission theory to the interatomic case, provided that interactions beyond the usual second-order Kramers-Heisenberg treatment are included. This theory is also found to simplify under certain conditions so as to yield results equivalent to a classical x-ray optical approach, with the latter providing an accurate and alternative, although less detailed and general, physical picture of these effects. Possible future applications of MARPE as a new probe of near-neighbor identities and bonding and its relationship to other known effects are also discussed. We also consider in detail specially written data acquisition software that has been used for most of the measurements reported here. This software has been used with an existing experimental system to develop the method of detector characterization and then data correction required for the work described above. The development of a next generation one-dimensional, high-speed, electron detector is also discussed. Our goal has been to design, build and test a prototype high-performance, one-dimensional pulse-counting detector that represents a significant advancement in detector technology and is well matched to modern high-brightness synchrotron radiation sources and high-transmission electron-energy analyzers as typically used in photoelectron spectroscopy experiments. The general design of the detector and the results of initial tests are discussed and the acquisition of photoelectron spectra with the first test detector is described.

  20. Multi-atom resonant photoemission and the development of next-generation software and high-speed detectors for electron spectroscopy

    International Nuclear Information System (INIS)

    This dissertation has involved the exploration of a new effect in photoelectron emission, multi-atom resonant photoemission (MARPE), as well as the development of new software, data analysis techniques, and detectors of general use in such research. We present experimental and theoretical results related to MARPE, in which the photoelectron intensity from a core level on one atom is influenced by a core-level absorption resonance on another. We point out that some of our and others' prior experimental data has been strongly influenced by detector non-linearity and that the effects seen in new corrected data are smaller and of different form. Corrected data for the MnO(001) system with resonance between the O 1s and Mn 2p energy levels are found to be well described by an extension of well-known intraatomic resonant photoemission theory to the interatomic case, provided that interactions beyond the usual second-order Kramers-Heisenberg treatment are included. This theory is also found to simplify under certain conditions so as to yield results equivalent to a classical x-ray optical approach, with the latter providing an accurate and alternative, although less detailed and general, physical picture of these effects. Possible future applications of MARPE as a new probe of near-neighbor identities and bonding and its relationship to other known effects are also discussed. We also consider in detail specially written data acquisition software that has been used for most of the measurements reported here. This software has been used with an existing experimental system to develop the method of detector characterization and then data correction required for the work described above. The development of a next generation one-dimensional, high-speed, electron detector is also discussed. Our goal has been to design, build and test a prototype high-performance, one-dimensional pulse-counting detector that represents a significant advancement in detector technology and is well matched to modern high-brightness synchrotron radiation sources and high-transmission electron-energy analyzers as typically used in photoelectron spectroscopy experiments. The general design of the detector and the results of initial tests are discussed and the acquisition of photoelectron spectra with the first test detector is described

  1. 2nd visit to the Korea Automotive Technology Institute; Kankoku jidosha kenkyuin (KATECH) homon

    Energy Technology Data Exchange (ETDEWEB)

    Inoue, S. [Japan Automobile Research Institute Inc., Tsukuba (Japan)

    2000-02-01

    The paper reports on a seminar to which the author was invited by KATECH, and a visit to Nissei Chemical Industry Co. KATECH is an institute established in 1990 by joint investment of the commerce/industry division of the Korean government and the automobile industry, for the purpose of 'heightening the automobile industrial structure and giving assistance to the national technology competition.' The number of staff members is now 99, and the scale of the total business was about 3 billion yen in the last year. At present, the following are being pursued as the main projects: the next-generation automobile technology business, technology development of low fuel consumption automobiles, and a developmental project for module structuring technology. A total of 25 members participated in the seminar from automobile makers, including Hyundai Motor Co. and Dae Woo Motor Co., and from material makers. The author gave a lecture on automotive damping materials covering the trend in Japan and the testing, evaluation and indication methods for damping performance. The author also gave practical training in the measurement of damping performance at Nissei Chemical Industry Co., which manufactures damping materials. (NEDO)

  2. Study on efficiency of dry decontamination technique by numerical method. The 2nd part (Joint research)

    International Nuclear Information System (INIS)

    System decontamination has generally been carried out with the aim of reducing the amount of radioactive waste generated and minimizing human exposure to radiation released from nuclear fuel facilities. At the Ningyo-Toge Environmental Engineering Center, metal surfaces contaminated by uranium are dry decontaminated using iodine heptafluoride (IF7) as a system decontaminant. In this dry decontamination technique, a chemical reaction occurs between the uranium compound attached to the metal surface and IF7. Only a few studies have been carried out on the decontamination efficiency, mechanism, level, etc. of dry decontamination techniques that use a decontamination gas. Therefore, the generalization of dry decontamination techniques is required, and clarifying the deposition mechanism of uranium hexafluoride is a prerequisite for clarifying the decontamination mechanism. In the present study, the efficiency of a dry decontamination technique was assessed by a numerical method using decontamination data obtained at the Ningyo-Toge Environmental Engineering Center. The concrete subject of the analysis is the deposition of uranium hexafluoride. (author)

  3. 2nd Workshop on Jet Modification in the RHIC and LHC Era

    CERN Document Server

    2013-01-01

    A workshop organized jointly by the Wayne State Heavy Ion Group and the JET Collaboration. The goal of this 2 1/2 day meeting is to review the most important new experimental measurements and theoretical breakthroughs that have occurred in the past year and to thoroughly explore the limits of perturbative QCD based approaches to the description of hard processes in heavy-ion collisions. Over the period of three days, topics covered will include new experimental observables that may discern between different perturbative approaches, the inevitable transformation of analytic schemes to Monte-Carlo event generators, and the progress made towards Next to Leading Order calculations of energy loss. The workshop is intended to be slow paced: we envision a mixture of longer invited talks and shorter contributed talks, allowing sufficient time for discussion, as well as time to follow up on more technical aspects of the data analysis and theoretical calculations. One of the outcomes of this workshop will be a ...

  4. Report on the 2nd European conference on computer-aided design (CAD) in small- and medium-size industries (MICAD 82)

    Energy Technology Data Exchange (ETDEWEB)

    Magnuson, W.G. Jr.

    1982-10-01

    A summary is presented of the 2nd European conference on computer aided design (CAD) in small- and medium-size industries (MICAD82) held in Paris, France, September 21-23, 1982. The conference emphasized applications of CAD in industries with limited investment resources and which are forced to innovate in order to sustain competition.

  5. The 2008—2013 Crisis as Metastasis. A Preview of the 2nd edition of The Cancer Stage of Capitalism by Pluto Press

    OpenAIRE

    John McMurtry

    2013-01-01

    By means of a selection of relevant excerpts, a preview is hereby offered of the 2nd edition of John McMurtry's prophetic 1999 book "The Cancer Stage of Capitalism", published by Pluto Press and entitled "The Cancer Stage of Capitalism and Its Cure".

  6. Design of control system for the 2nd and 3rd charge exchange system in J-PARC 3GeV RCS

    International Nuclear Information System (INIS)

    The J-PARC 3 GeV synchrotron accelerator uses a charge-exchange injection method with three carbon foils. To achieve this injection, three charge exchange devices are installed in the facility and controlled by one control system. The 2nd and 3rd charge exchange devices are being upgraded to increase the maintainability and pumping capability of the vacuum unit, and the control system has been reconsidered. The basic policy in redesigning the control system is to separate it from the centralized control system of the three devices and to reconstruct it as a control system independent of the centralized one. On this basis, the 2nd and 3rd charge exchange devices are being upgraded. Because the control becomes stand-alone, it is necessary to redesign the interlock unit with respect to safety. At present, the error signals of the three charge exchange units are consolidated into one signal that operates the Machine Protection System (MPS), so a long time was needed to find out where and why an error occurred. Since the MPS will now be operated by the error signal of each unit, we expect the cause of an error to be located more easily. The 2nd and 3rd charge exchange units adopt a simple control system using the Yokogawa Electric PLC FA-M3. We are designing a control system with safety features that integrates the drive unit and the vacuum unit. This report describes the design of the 2nd and 3rd charge exchange unit control system, for which the hardware of the units was reconstructed. (author)

  7. The 2nd International Conference on Nuclear Physics in Astrophysics Refereed and selected contributions Debrecen, Hungary May 16–20, 2005

    CERN Document Server

    Fülöp, Zsolt; Somorjai, Endre

    2006-01-01

    Launched in 2004, "Nuclear Physics in Astrophysics" has established itself as a successful topical conference series addressing the forefront of research in the field. This volume contains the selected and refereed papers of the 2nd conference, held in Debrecen in 2005 and reprinted from "The European Physical Journal A - Hadrons and Nuclei".

  8. The 2008—2013 Crisis as Metastasis. A Preview of the 2nd edition of The Cancer Stage of Capitalism by Pluto Press

    Directory of Open Access Journals (Sweden)

    John McMurtry

    2013-03-01

    Full Text Available By means of a selection of relevant excerpts, a preview is hereby offered of the 2nd edition of John McMurtry's prophetic 1999 book "The Cancer Stage of Capitalism", published by Pluto Press and entitled "The Cancer Stage of Capitalism and Its Cure".

  9. Incretinas, incretinomiméticos, inhibidores de DPP IV (2ª parte) / Incretins, incretin mimetics, DPP IV inhibitors (2nd part)

    Directory of Open Access Journals (Sweden)

    Claudia Bayón

    2010-09-01

    Full Text Available In recent years, a new mechanism involved in the pathophysiology of type 2 diabetes mellitus has been recognized: a deficit in the production and/or action of incretins. Incretins are gut hormones that stimulate insulin secretion in response to nutrient intake. Glucagon-like peptide-1 (GLP-1) and glucose-dependent insulinotropic polypeptide (GIP) are the main incretins discovered to date. Both also have a trophic effect on the beta cells of the pancreatic islets. GLP-1 has further actions such as inhibition of glucagon secretion, slowing of gastric emptying and inhibition of appetite. Both incretins are rapidly cleaved by the enzyme dipeptidyl peptidase 4 (DPP-4). New drugs such as incretin mimetics, analogues and DPP-4 inhibitors are emerging as a promising therapy for patients with type 2 diabetes. Conflict of interest: Dr. León Litwak - Member of the Latin American Board of Eli Lilly and Sanofi Aventis - Member of the National Board of the laboratories Novo Nordisk, Novartis, GlaxoSmithKline, Sanofi Aventis, Boehringer Ingelheim, Bristol Myers, Astra Zeneca - Principal investigator of protocols belonging to Eli Lilly, Novo Nordisk, Novartis, GlaxoSmithKline, Takeda, PPDF, Pfizer, Merck Sharp and Dohme, Amgen, Roche, Minimed, Quintiles - Lecturer for the aforementioned laboratories. Two main pathophysiological mechanisms are currently involved in Type 2 Diabetes (T2DM): insulin resistance and impairment of beta cell function. However, in recent years a new mechanism was reported: a significant decrease in incretin production and/or action. Incretins are gastrointestinal hormones whose main action is stimulating insulin secretion in response to nutrients. The best known incretins are glucagon-like peptide-1 (GLP-1) and gastric insulinotropic peptide (GIP). GLP-1 and GIP not only increase insulin secretion, but also decrease glucagon secretion, slow gastric emptying and reduce appetite, generating weight loss. Both incretins are rapidly cleaved by the enzyme dipeptidyl peptidase 4 (DPP4). In order to emulate incretin action, several drugs were developed: GLP-1 receptor agonists, GLP-1 mimetics, and DPP4 inhibitors. All of them seem to have become a very promising tool for the treatment of T2DM. Financial Interests: Dr. León Litwak - Member of the Latin American Board of Eli Lilly and Sanofi Aventis - Member of the National Board of the following laboratories: Novo Nordisk, Novartis, GlaxoSmithKline, Sanofi Aventis, Boehringer Ingelheim, Bristol Myers, Astra Zeneca - Principal Investigator of Protocols from: Eli Lilly, Novo Nordisk, Novartis, GlaxoSmithKline, Takeda, PPDF, Pfizer, Merck Sharp and Dohme, Amgen, Roche, Minimed, Quintiles - Lecturer for the former laboratories.

  10. Clinical impact of dose reductions and interruptions of second-generation tyrosine kinase inhibitors in patients with chronic myeloid leukaemia.

    OpenAIRE

    Santos, Fabio P. S.; Kantarjian, Hagop; Fava, Carmen; O’brien, Susan; Garcia-manero, Guillermo; Ravandi, Farhad; Wierda, William; Thomas, Deborah; Shan, Jianquin; Cortes, Jorge

    2010-01-01

    Second (2nd)-generation tyrosine kinase inhibitors (TKI) (dasatinib, nilotinib) are effective in patients with all phases of chronic myeloid leukaemia (CML). Dose reductions and treatment interruptions are frequently required due to toxicity, but their significance is unknown. We analysed the impact of dose reductions/interruptions and dose intensity of 2nd-generation TKI on response and survival. A total of 280 patients with CML (all phases) were analysed. Dose reductions were considered whe...

  11. Has the time for first-line treatment with second generation tyrosine kinase inhibitors in patients with chronic myelogenous leukemia already come? Systematic review and meta-analysis

    OpenAIRE

    Gurion, Ronit; Gafter-gvili, Anat; Vidal, Liat; Leader, Avi; Ram, Ron; Shacham-abulafia, Adi; Paul, Mical; Ben-bassat, Isaac; Shpilberg, Ofer; Raanani, Pia

    2013-01-01

    Second generation tyrosine kinase inhibitors have recently been introduced as first-line treatment for chronic phase chronic myelogenous leukemia. We aimed to evaluate the efficacy and safety of 2nd generation tyrosine kinase inhibitors versus imatinib as first-line treatment for these patients. We carried out a systematic review and meta-analysis of randomized controlled trials comparing 2nd generation tyrosine kinase inhibitors to imatinib as first-line treatment in chronic phase chronic my...

  12. Next generation multilocus sequence typing (NGMLST) and the analytical software program MLSTEZ enable efficient, cost-effective, high-throughput, multilocus sequencing typing.

    Science.gov (United States)

    Chen, Yuan; Frazzitta, Aubrey E; Litvintseva, Anastasia P; Fang, Charles; Mitchell, Thomas G; Springer, Deborah J; Ding, Yun; Yuan, George; Perfect, John R

    2015-02-01

    Multilocus sequence typing (MLST) has become the preferred method for genotyping many biological species, and it is especially useful for analyzing haploid eukaryotes. MLST is rigorous, reproducible, and informative, and MLST genotyping has been shown to identify major phylogenetic clades, molecular groups, or subpopulations of a species, as well as individual strains or clones. MLST molecular types often correlate with important phenotypes. Conventional MLST involves the extraction of genomic DNA and the amplification by PCR of several conserved, unlinked gene sequences from a sample of isolates of the taxon under investigation. In some cases, as few as three loci are sufficient to yield definitive results. The amplicons are sequenced, aligned, and compared by phylogenetic methods to distinguish statistically significant differences among individuals and clades. Although MLST is simpler, faster, and less expensive than whole genome sequencing, it is more costly and time-consuming than less reliable genotyping methods (e.g. amplified fragment length polymorphisms). Here, we describe a new MLST method that uses next-generation sequencing, a multiplexing protocol, and appropriate analytical software to provide accurate, rapid, and economical MLST genotyping of 96 or more isolates in a single assay. We demonstrate this methodology by genotyping isolates of the well-characterized, human pathogenic yeast Cryptococcus neoformans. PMID:25624069
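
    Whatever the sequencing platform, the core MLST bookkeeping step is mapping each isolate's per-locus allele numbers to a sequence type (ST). The sketch below shows that generic step; the loci and the allele-to-ST table are invented for illustration and are not part of the MLSTEZ software.

      # hypothetical allele-profile -> sequence-type lookup for a 3-locus scheme
      LOCI = ("locusA", "locusB", "locusC")
      ST_TABLE = {
          (1, 1, 1): "ST-1",
          (1, 2, 1): "ST-2",
          (2, 2, 3): "ST-3",
      }

      def assign_sequence_type(profile: dict) -> str:
          """profile maps locus name -> allele number, e.g. {'locusA': 1, ...}."""
          key = tuple(profile[locus] for locus in LOCI)
          return ST_TABLE.get(key, "novel ST (submit alleles to the scheme curator)")

      print(assign_sequence_type({"locusA": 1, "locusB": 2, "locusC": 1}))   # ST-2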

  13. Software Metrics and Software Metrology

    CERN Document Server

    Abran, Alain

    2010-01-01

    Most of the software measures currently proposed to the industry bring few real benefits to either software managers or developers. This book looks at the classical metrology concepts from science and engineering, using them as criteria to propose an approach to analyze the design of current software measures and then design new software measures (illustrated with the design of a software measure that has been adopted as an ISO measurement standard). The book includes several case studies analyzing strengths and weaknesses of some of the software measures most often quoted. It is meant for sof

  14. 2nd order spline interpolation of the Abel transformation for use in cylindrically-symmetric radiative source

    International Nuclear Information System (INIS)

    Inversion of the observed transverse radiance and transmittance, M(z) and N(z), into the radial emission coefficients J(r) in a cylindrically-symmetric radiation source leads to solving the generalized Abel equations S(z) = 2 ∫_z^R J(r) K(z,r) r dr / √(r^2 - z^2), 0 ≤ z ≤ R, together with the analogous integral ∫_z^R K(r) r dr / √(r^2 - z^2). This equation can be solved analytically. In 1981, Young proposed an iteration method (YI). After 1990, we successively proposed a piecewise linear interpolation (PLI) and a block-bi-quadric interpolation (BBQI). In this paper, we note that the emission coefficient J(r) is sufficiently smooth and symmetric at r = 0, from which we know that J'(0) = 0. Taking this condition into account, we propose a 2nd-order spline interpolation (2OSI). The Abel equation above is separated into a system of linear algebraic equations whose coefficient matrix is an upper Hessenberg matrix, so the equation can be solved easily and rapidly, and the results obtained by this method are smoother than those of other methods. Thus experimenters can apply this method easily. We have solved two examples using the present 2OSI. The results show that the computed values converge to the exact solutions with an increase in the number of nodes n. Therefore the method (2OSI) is effective and reasonable.
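
    In practice, discretizing an Abel-type integral on radial shells turns the inversion into a triangular (or near-triangular) linear system solved from the outside in. The sketch below illustrates this generic "onion-peeling" discretization for the plain Abel transform with piecewise-constant emission per shell; it is not the authors' 2nd-order spline scheme.

      import numpy as np

      def onion_peeling_matrix(n, dr=1.0):
          """Weights A[i, j]: contribution of shell j to the line-of-sight integral at z_i.

          Assumes piecewise-constant emission on annular shells of width dr;
          S(z_i) = 2 * sum_j A[i, j] * J_j, so J = solve(A, S / 2).
          """
          A = np.zeros((n, n))
          r = np.arange(n + 1) * dr      # shell boundaries
          z = np.arange(n) * dr          # chords at the shell inner radii
          for i in range(n):
              for j in range(i, n):
                  # half path length of chord z_i through shell j (standard geometric result)
                  outer = np.sqrt(r[j + 1] ** 2 - z[i] ** 2)
                  inner = np.sqrt(max(r[j] ** 2 - z[i] ** 2, 0.0))
                  A[i, j] = outer - inner
          return A

      # example: synthesize projections of a known profile, then invert
      n = 50
      J_true = np.exp(-np.linspace(0, 3, n) ** 2)      # Gaussian emission profile
      A = onion_peeling_matrix(n)
      S = 2.0 * A @ J_true                             # forward (projected) signal
      J_rec = np.linalg.solve(A, S / 2.0)              # upper-triangular system; recovers J_true here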

  15. Combined intracervical PGE2 and intra-amniotic PGF2 alpha for induction of 2nd trimester abortion

    DEFF Research Database (Denmark)

    Allen, J; Maigaard, S

    1987-01-01

    Fourteen consecutive patients (mean gestational age 18.1 weeks, range 15-23 weeks) referred for therapeutic termination of pregnancy were induced into abortion by intra-amniotic PGF2 alpha 40 mg followed by oxytocin stimulation. 14 other patients (mean gestational age 17.9 weeks, range 15-23 weeks) were pretreated with intracervical PGE2 1.0 mg in gel for 4 h prior to induction of abortion with intra-amniotic PGF2 alpha 40 mg without further stimulation. The induction-abortion interval for patients treated with intra-amniotic PGF2 alpha and oxytocin, was 19.1 +/- 2.94 h (+/- SE, n = 14) with a success rate of 80% after 24 h. After pretreatment with intracervical PGE2 1.0 mg in viscous gel, intra-amniotic PGF2 alpha 40 mg induced abortion after 11.2 +/- 1.12 h (+/- SE, n = 14) with a 100% success rate after 24 h. No systemic side effects of the PGE2 pretreatment were noted. No cervical laceration was observed. The results need further confirmation, but still suggest cervical priming with intracervical PGE2 1.0mg in gel and subsequent induction of abortion by intra-amniotic PGF2 alpha 40 mg as an attractive principle for 2nd trimester abortion.

  16. Radiation-induced swelling and amorphization in Ca2Nd8(SiO4)6O2

    International Nuclear Information System (INIS)

    Specimens of Ca2Nd8(SiO4)6O2 were doped with 1.2 wt% 244Cm and the effects of self-radiation damage from alpha decay were determined as a function of cumulative dose. The macroscopic volume of the specimens increased exponentially with dose to a limiting (saturation) value of approx. 8.0%. The initially crystalline material became completely X-ray amorphous at a dose of 11.7 × 10^24 alpha decays/m^3. The dissolution rate of the amorphous state was about an order of magnitude higher than that of the crystalline state. The stored energy of the amorphous state was approx. 130 J/g. Differential thermal analysis along with isochronal and isothermal-step annealing were used to study the kinetics associated with the thermal recovery of the radiation-induced swelling and amorphization. A single recovery stage associated with recrystallization of the amorphous material was observed and the activation energy was determined to be 3.1 ± 0.2 eV. (author)

  17. Explicit formulas for 2nd-order driving terms due to sextupoles and chromatic effects of quadrupoles.

    Energy Technology Data Exchange (ETDEWEB)

    Wang, C-X. (Accelerator Systems Division (APS))

    2012-04-25

    Optimization of nonlinear driving terms has become a useful tool for designing storage rings, especially modern light sources where the strong nonlinearity is dominated by the large chromatic effects of quadrupoles and strong sextupoles for chromaticity control. The Lie algebraic method is well known for computing such driving terms. However, it appears that there was a lack of explicit formulas in the public domain for such computation, resulting in uncertainty and/or inconsistency in widely used codes. This note presents explicit formulas for driving terms due to sextupoles and chromatic effects of quadrupoles, which can be considered as thin elements. The computation is accurate to the 4th-order Hamiltonian and 2nd-order in terms of magnet parameters. The results given here are the same as the APS internal note AOP-TN-2009-020. This internal note has been revised and published here as a Light Source Note in order to get this information into the public domain, since both ELEGANT and OPA are using these formulas.
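
    The note's 2nd-order expressions are not reproduced here, but for orientation, first-order driving terms of thin sextupole-like elements generically take the single-sum form

      h^{(1)}_{jklm0} = c_{jklm} \sum_{n} (b_3 L)_n \, \beta_{x,n}^{(j+k)/2} \, \beta_{y,n}^{(l+m)/2} \, e^{i[(j-k)\mu_{x,n} + (l-m)\mu_{y,n}]},

    where c_{jklm} is a convention-dependent numerical coefficient; this is a standard textbook structure, not a formula copied from the note. The 2nd-order terms it derives arise from double sums over pairs of such thin elements, which is where the cross terms between sextupoles and the chromatic kicks of quadrupoles enter.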

  18. Evaluation of neutron irradiation embrittlement in the Korean reactor pressure vessel steels (II) (2nd progress report)

    Energy Technology Data Exchange (ETDEWEB)

    Hong, J. H.; Lee, B. S.; Chi, S. H.; Kim, J. H. [Korea Atomic Energy Research Institute, Taejon (Korea)

    1999-11-01

    Reactor pressure vessel materials, which were produced by the domestic company HANJUNG, are being evaluated using the experimental irradiation facility HANARO. For this evaluation, instrumented capsules are used for neutron irradiation of various kinds of specimens made of different kinds of steels, which are VCD (Y4), VCD+Al(U4), Si+Al(Y5), U4 weld metal, and U4 HAZ, respectively. In the irradiation test the temperature should be controlled in the range of 290 {+-} 10 deg C. The current status of performing this project is as follows : (1) The key data on mechanical properties, mainly related to the fracture toughness, of the unirradiated materials have been obtained. (2) The irradiation of the 1st (preliminary) instrumented capsule has been completed. (3) The 2nd standard instrumented capsule was irradiated successfully. (4) Although the irradiation data are not sufficient at present time, post-irradiation test results showed that the Y5 steel is highly resistant to irradiation embrittlement while VCD material has lower resistance than the other steels used in this study. 12 refs., 19 figs., 8 tabs. (Author)

  19. Does the Application of Instructional Mathematics Software Have Enough Efficiency?

    Directory of Open Access Journals (Sweden)

    Zahra Kalantarnia

    2013-12-01

    Full Text Available Modern tools and new educational systems can improve teaching-learning procedures in schools. Teaching mathematics is one of the main and more difficult components of educational systems, so keeping teachers and instructors informed about new methods is essential. The usual teaching method should not be abandoned; software and media are best regarded as supplementary (remedial) teaching, while teachers continue to pursue dynamic methods for teaching and learning. The aim of this study is to examine students' views of math software and its efficiency. Twenty-two female 2nd grade students at high schools were chosen. Their views were collected through a standard questionnaire using a survey method, and the data were analysed with Kolmogorov-Smirnov and one-sample sign tests. The results indicated that students hold positive views toward co-instructional software for math learning. It therefore appears that mathematical software can be advantageous for teaching and learning at high schools.

  20. Software reliability

    CERN Document Server

    Bendell, A

    1986-01-01

    Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. This book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion on failure rate in software reliability growth mo

  1. UWB Tracking Software Development

    Science.gov (United States)

    Gross, Julia; Arndt, Dickey; Ngo, Phong; Phan, Chau; Dusl, John; Ni, Jianjun; Rafford, Melinda

    2006-01-01

    An Ultra-Wideband (UWB) two-cluster Angle of Arrival (AOA) tracking prototype system is currently being developed and tested at NASA Johnson Space Center for space exploration applications. This talk discusses the software development efforts for this UWB two-cluster AOA tracking system. The role the software plays in this system is to take waveform data from two UWB radio receivers as an input, feed this input into an AOA tracking algorithm, and generate the target position as an output. The architecture of the software (Input/Output Interface and Algorithm Core) will be introduced in this talk. The development of this software has three phases. In Phase I, the software is mostly Matlab driven and calls C++ socket functions to provide the communication links to the radios. This is beneficial in the early stage when it is necessary to frequently test changes in the algorithm. Phase II of the development is to have the software mostly C++ driven and call a Matlab function for the AOA tracking algorithm. This is beneficial in order to send the tracking results to other systems and also to improve the tracking update rate of the system. The third phase is part of future work and is to have the software completely C++ driven with a graphics user interface. This software design enables the fine resolution tracking of the UWB two-cluster AOA tracking system.
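
    To make the Input/Output Interface and Algorithm Core split easier to picture, here is a minimal Python sketch of the same structure; the names (read_waveforms, aoa_core, TargetPosition) and the placeholder computations are assumptions for illustration only and do not reflect the actual NASA implementation, which the abstract describes as Matlab and C++ based.

    # Minimal sketch of the two-part architecture described above:
    # an I/O interface that collects waveform data from two receivers
    # and an algorithm core that turns it into a target position.
    # All names and values here are illustrative placeholders.
    from dataclasses import dataclass
    from typing import Sequence

    @dataclass
    class TargetPosition:
        x: float
        y: float
        z: float

    def read_waveforms(receiver_id: int) -> Sequence[float]:
        """I/O interface: in the real system this would read a UWB radio,
        e.g. over a socket (Phase I: Matlab + C++ sockets, Phase II: C++)."""
        return [0.0] * 1024  # placeholder waveform

    def aoa_core(wave_a: Sequence[float], wave_b: Sequence[float]) -> TargetPosition:
        """Algorithm core: estimate the angles of arrival at the two clusters
        and triangulate. The actual AOA algorithm is not specified above."""
        return TargetPosition(0.0, 0.0, 0.0)  # placeholder result

    def tracking_step() -> TargetPosition:
        # One update of the tracking loop: waveforms in, position out.
        return aoa_core(read_waveforms(0), read_waveforms(1))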

  2. PREFACE: Proceedings of the 2nd International Conference on Quantum Simulators and Design (Tokyo, Japan, 31 May-3 June 2008) Proceedings of the 2nd International Conference on Quantum Simulators and Design (Tokyo, Japan, 31 May-3 June 2008)

    Science.gov (United States)

    Akai, Hisazumi; Tsuneyuki, Shinji

    2009-02-01

    This special issue of Journal of Physics: Condensed Matter comprises selected papers from the proceedings of the 2nd International Conference on Quantum Simulators and Design (QSD2008) held in Tokyo, Japan, between 31 May and 3 June 2008. This conference was organized under the auspices of the Development of New Quantum Simulators and Quantum Design Grant-in-Aid for Scientific Research on Priority Areas, Ministry of Education, Culture, Sports, Science and Technology of Japan (MEXT). The conference focused on the development of first principles electronic structure calculations and their applications. The aim was to provide an opportunity for discussion on the progress in computational materials design and, in particular, the development of quantum simulators and quantum design. Computational materials design is a computational approach to the development of new materials. The essential ingredient is the use of quantum simulators to design a material that meets a given specification of properties and functionalities. For this to be successful, the quantum simulator should be very reliable and be applicable to systems of realistic size. During the conference, new methods of quantum simulation and quantum design were discussed including methods beyond the local density approximation of density functional theory, order-N methods, methods dealing with excitations and reactions, and the application of these methods to the design of novel materials, devices and systems. The conference provided an international forum for experimental and theoretical researchers to exchange ideas. A total of 220 delegates from eight countries participated in the conference. There were 13 invited talks, ten oral presentations and 120 posters. The 3rd International Conference on Quantum Simulators and Design will be held in Germany in the autumn of 2011.

  3. What difference does a year of schooling make? Maturation of brain response and connectivity between 2nd and 3rd grades during arithmetic problem solving.

    Science.gov (United States)

    Rosenberg-Lee, Miriam; Barth, Maria; Menon, Vinod

    2011-08-01

    Early elementary schooling in 2nd and 3rd grades (ages 7-9) is an important period for the acquisition and mastery of basic mathematical skills. Yet, we know very little about neurodevelopmental changes that might occur over a year of schooling. Here we examine behavioral and neurodevelopmental changes underlying arithmetic problem solving in a well-matched group of 2nd (n = 45) and 3rd (n = 45) grade children. Although 2nd and 3rd graders did not differ on IQ or grade- and age-normed measures of math, reading and working memory, 3rd graders had higher raw math scores (effect sizes = 1.46-1.49) and were more accurate than 2nd graders in an fMRI task involving verification of simple and complex two-operand addition problems (effect size = 0.43). In both 2nd and 3rd graders, arithmetic complexity was associated with increased responses in right inferior frontal sulcus and anterior insula, regions implicated in domain-general cognitive control, and in left intraparietal sulcus (IPS) and superior parietal lobule (SPL) regions important for numerical and arithmetic processing. Compared to 2nd graders, 3rd graders showed greater activity in dorsal stream parietal areas right SPL, IPS and angular gyrus (AG) as well as ventral visual stream areas bilateral lingual gyrus (LG), right lateral occipital cortex (LOC) and right parahippocampal gyrus (PHG). Significant differences were also observed in the prefrontal cortex (PFC), with 3rd graders showing greater activation in left dorsal lateral PFC (dlPFC) and greater deactivation in the ventral medial PFC (vmPFC). Third graders also showed greater functional connectivity between the left dlPFC and multiple posterior brain areas, with larger differences in dorsal stream parietal areas SPL and AG, compared to ventral stream visual areas LG, LOC and PHG. No such between-grade differences were observed in functional connectivity between the vmPFC and posterior brain regions. These results suggest that even the narrow one-year interval spanning grades 2 and 3 is characterized by significant arithmetic task-related changes in brain response and connectivity, and argue that pooling data across wide age ranges and grades can miss important neurodevelopmental changes. Our findings have important implications for understanding brain mechanisms mediating early maturation of mathematical skills and, more generally, for educational neuroscience. PMID:21620984

  4. 2nd SUMO Conference

    CERN Document Server

    Weber, Melanie

    2015-01-01

    This contributed volume contains the conference proceedings of the Simulation of Urban Mobility (SUMO) conference 2014, Berlin. The included research papers cover a wide range of topics in traffic planning and simulation, including open data, vehicular communication, e-mobility, urban mobility, multimodal traffic as well as usage approaches. The target audience primarily comprises researchers and experts in the field, but the book may also be beneficial for graduate students.  

  5. TESTING FOR OBJECT ORIENTED SOFTWARE

    Directory of Open Access Journals (Sweden)

    Jitendra S. Kushwah

    2011-02-01

    Full Text Available This paper deals with the design and development of an automated testing tool for Object Oriented Software. By an automated testing tool, we mean a tool that automates a part of the testing process. It can include one or more of the following processes: test strategy generation, test case generation, test case execution, test data generation, reporting and logging results. By object-oriented software we mean software designed using an OO approach and implemented using an OO language. Testing of OO software is different from testing software created using procedural languages, and several new challenges are posed. In the past, most methods for testing OO software were simple extensions of existing methods for conventional software; however, they have been shown to be not very appropriate, and hence new techniques have been developed. This thesis work has mainly focused on testing design specifications for OO software. As described later, there is a lack of specification-based testing tools for OO software. An advantage of testing software specifications as compared to program code is that specifications are generally correct whereas code is often flawed. Moreover, with software engineering principles firmly established in the industry, most of the software developed nowadays follows all the steps of the Software Development Life Cycle (SDLC). For this work, UML specifications created in Rational Rose are taken. UML has become the de-facto standard for analysis and design of OO software. Testing is conducted at 3 levels: Unit, Integration and System. At the system level there is no difference between the testing techniques used for OO software and other software created using a procedural language, and hence conventional techniques can be used. This tool provides features for testing at the Unit (Class) level as well as the Integration level. Further, a maintenance-level component has also been incorporated. Results of applying this tool to sample Rational Rose files have been incorporated, and have been found to be satisfactory.
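
    Although the tool itself works from UML specifications in Rational Rose, a concrete feel for what Unit (Class) level testing means in code terms may help; the following minimal Python sketch shows a hand-written class-level test of an invented Account class. The class, its methods and the expected behaviour are purely illustrative assumptions and are unrelated to the tool described above.

    # Illustrative only: the kind of unit (class) level test that
    # specification-based testing targets. Account is an invented class.
    import unittest

    class Account:
        def __init__(self, balance: float = 0.0) -> None:
            self.balance = balance

        def deposit(self, amount: float) -> None:
            if amount <= 0:
                raise ValueError("amount must be positive")
            self.balance += amount

    class AccountUnitTest(unittest.TestCase):
        def test_deposit_increases_balance(self):
            acc = Account()
            acc.deposit(10.0)
            self.assertEqual(acc.balance, 10.0)

        def test_rejects_non_positive_deposit(self):
            with self.assertRaises(ValueError):
                Account().deposit(0.0)

    if __name__ == "__main__":
        unittest.main()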

  6. Development of an algorithm that generates thermal zones for facilitating data collection from within a building schedule simulation software; Elaboration d'un algorithme de generation de zones thermiques visant a faciliter la saisie de donnees a l'interieur d'un logiciel de simulation horaire de batiments

    Energy Technology Data Exchange (ETDEWEB)

    Bellemare, R. [Hydro-Quebec Distribution, Montreal, PQ (Canada). Services Conseil Utilisation de l' energie; Sansregret, S. [Hydro-Quebec, Shawinigan, PQ (Canada). Research Inst., Energy Technology Labs

    2008-07-01

    Hydro-Quebec has developed a software package known as PEP as part of its initiative to optimize the energy performance of its buildings. The objective was to assess the energy savings associated with measures to optimize energy efficiency. DOE2.1E was the basis for the PEP software package. This paper presented the algorithm used within PEP to automatically generate the thermal zones needed to perform building simulations. Engineering rules and hypotheses are used to generate thermal zones according to building area and type of activity within the building. A comparison of thermal losses revealed a 3 per cent difference in heating and cooling load. 1 refs., 5 tabs., 4 figs.
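
    The abstract does not spell out the engineering rules themselves, so the following short Python sketch only illustrates the general shape of such a rule-based zoning step: spaces are grouped by activity type and a group is split when it exceeds an assumed area threshold. The 200 m2 limit, the data layout and the function name generate_zones are assumptions for this sketch, not details of PEP or DOE2.1E.

    # Illustrative rule-based thermal zoning: group spaces by activity
    # type, then split groups larger than an assumed area threshold.
    from collections import defaultdict
    from typing import Dict, List, Tuple

    Space = Tuple[str, float]  # (activity type, floor area in m2)

    def generate_zones(spaces: List[Space], max_zone_area: float = 200.0) -> List[List[Space]]:
        by_activity: Dict[str, List[Space]] = defaultdict(list)
        for space in spaces:
            by_activity[space[0]].append(space)

        zones: List[List[Space]] = []
        for group in by_activity.values():
            current: List[Space] = []
            area = 0.0
            for space in group:
                if current and area + space[1] > max_zone_area:
                    zones.append(current)
                    current, area = [], 0.0
                current.append(space)
                area += space[1]
            if current:
                zones.append(current)
        return zones

    # Example: the two offices exceed the assumed limit and are split,
    # while the corridor forms its own zone.
    print(generate_zones([("office", 120.0), ("office", 150.0), ("corridor", 60.0)]))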

  7. A verification of the high density after contrast enhancement in the 2nd week in cerebroischemic lesion

    International Nuclear Information System (INIS)

    To determine the indication, it is necessary to make clear the relation among the Stage (time and course), the Strength, the Pathogenesis, and the Effects of the operation in these diseases (SSPE relation). In this report, we focused on the High Density of CT after the contrast enhancement in the cases of ischemic lesions (the High Density was named ''Ribbon H. D.''). Seventeen cases of Ribbon H. D. in fresh infarctions were verified concerning the time of the appearance of the H. D., the features of its location and nature, and the histological findings. The results were as follows: The Ribbon H. D. appeared in the early stage of infarctions, and had its peak density at the end of the 2nd week after the onset. The Ribbon H. D. was mostly located along the cortical line, showing a ribbon-like band. The Ribbon H. D. did not appear in the sharply demarcated coagulation necrosis in the early stage or in the defined Low Density (L. D.) in the late stage of infarctions. Although the Ribbon H. D. shows the extravasation of contrast media, it does not necessarily show the existence of the hemorrhagic infarction. Some part of the Ribbon H. D. changes to a well-defined L. D. and the rest of the part becomes relative isodensity in the late stage. This change corresponds to the change in the incomplete necrosis which is afterwards divided into a resolution with a cystic cavity and the glial replacement in the late stage. In conclusion, it is possible to understand that the Ribbon H. D. corresponds to the lesion of an incomplete necrosis, with neovascularization, in the early stage of infarctions. Therefore, in addition to the present indication of a by-pass operation (TIA, RIND), this incomplete necrosis (Ribbon H. D.), its surrounding area and just before the appearance of the Ribbon H. D. might be another indication of the operation. (author)

  8. The ratios of 2nd to 4th digit may be a predictor of schizophrenia in male patients.

    Science.gov (United States)

    Bolu, Abdullah; Oznur, Taner; Develi, Sedat; Gulsun, Murat; Aydemir, Emre; Alper, Mustafa; Toygar, Mehmet

    2015-05-01

    The production of androgens (mostly testosterone) during the early fetal stage is essential for the differentiation of the male brain. Some authors have suggested a relationship between androgen exposure during the prenatal period and schizophrenia. Since the 2D:4D digit ratio is widely regarded as a marker of prenatal androgen exposure, these two relationships together suggest that digit length ratios may be associated with schizophrenia in males. The study was performed in a university hospital between October 2012 and May 2013. One hundred and three male patients diagnosed with schizophrenia according to DSM-IV using SCID-I, and 100 matched healthy males, were included in the study. The Scale for the Assessment of Positive Symptoms (SAPS), Scale for the Assessment of Negative Symptoms (SANS) and Brief Psychiatric Rating Scale (BPRS) were used to assess schizophrenia symptoms. The second digit (2D) and fourth digit (4D) asymmetry index (AI), and the right- and left-hand 2D:4D ratios were calculated. All parametric data in the groups were compared using an independent t-test. The predictive power of the AI was estimated by receiver operating characteristics analysis. The 2D:4D AI was statistically significantly lower in the patient group than in the healthy control group. There were significant differences between the schizophrenia and the control groups in respect of left 2D:4D and right 2D:4D. There was no correlation between AI, left, or right 2D:4D, BPRS, or SAPS in the schizophrenia group. However, there was a negative correlation between left 2nd digit (L2D):4D and the SANS score. Our findings support the view that the 2D:4D AI can be used as a moderate indicator of schizophrenia. Even more simply, the right or left 2D:4D can be used as an indicator. L2D:4D could indicate the severity of negative symptoms. Clin. Anat. 28:551-556, 2015. © 2015 Wiley Periodicals, Inc. PMID:25779956
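
    For concreteness, the ratios themselves are simple quotients of measured digit lengths, and one common convention writes the asymmetry index as the right-minus-left difference of the two 2D:4D ratios. That convention, and the numbers in the tiny Python sketch below, are assumptions for illustration; the abstract does not spell out the exact AI formula used in the study.

    # 2D:4D ratio and a right-minus-left asymmetry index (assumed convention).
    def digit_ratio(second_digit_mm: float, fourth_digit_mm: float) -> float:
        return second_digit_mm / fourth_digit_mm

    def asymmetry_index(right_2d: float, right_4d: float,
                        left_2d: float, left_4d: float) -> float:
        # Assumed definition: right-hand ratio minus left-hand ratio.
        return digit_ratio(right_2d, right_4d) - digit_ratio(left_2d, left_4d)

    # Example with made-up measurements in millimetres.
    print(asymmetry_index(72.1, 75.0, 71.5, 73.8))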

  9. The Effects of Star Strategy of Computer-Assisted Mathematics Lessons on the Achievement and Problem Solving Skills in 2nd Grade Courses

    Directory of Open Access Journals (Sweden)

    Jale İPEK

    2013-12-01

    Full Text Available The aim of this research is to determine the effect of the STAR strategy on 2nd grade students’ academic achievement and problem-solving skills in computer-assisted mathematics instruction. Thirty students attending a 2nd grade class in a primary school in Aydın in the 2010-2011 school year formed the study group. The research took place over 7 weeks. Three instruments were used for data collection: the “Academic Achievement Test”, the “Problem Solving Achievement Test” and “The Evaluation Form of Problem Solving Skills”. At the end of the research, students’ views about computer-assisted mathematics instruction were also evaluated, and it was examined whether the differences between pre-test and post-test scores were statistically significant. According to the results, a positive increase in academic achievement and problem-solving skills was found at the end of the instruction carried out with the STAR strategy.

  10. Software Transparency

    OpenAIRE

    Leite, Julio Cesar Sampaio Do Prado

    2009-01-01

    Software transparency is a new concern that software developers must deal with. As society moves towards the digitalization of day to day processes, the transparency of these digital processes becomes of fundamental importance if citizens would like to exercise their right to know. Informed discourse is only possible if processes that affect the public are open to evaluation. Achieving software transparency to this level of openness brings up several roadblocks. Thi...

  11. Effect of Software Manager, Programmer and Customer over Software Quality

    Directory of Open Access Journals (Sweden)

    Ghrehbaghi Farhad

    2013-01-01

    Full Text Available Several factors may affect the quality of software products. In this study we focus on three significant parameters: the software manager, the programmer and the customer. Our study demonstrates that product quality improves as the information generated by these three parties increases. The parameters can be pictured as a triangle whose centroid is the quality; in this view, if the triangle is equilateral, the optimum quality of the product is achieved. In other words, to generate high-quality software, the abilities of the software manager, the programmer and the customer must be comparable. Consequently, an expert manager alone, or an expert programmer alone, even together with a software-aware customer, cannot guarantee the high quality of the software products.

  12. Fe69B20.2Nd4.2Nb3.3Y2.5Zr0.8 magnets produced by injection casting

    International Nuclear Information System (INIS)

    Fe69B20.2Nd4.2Nb3.3Y2.5Zr0.8 hard magnets in rods were produced by injection casting method. Magnetic properties, phase evolution and microstructure for Fe69B20.2Nd4.2Nb3.3Y2.5Zr0.8 magnets have been investigated and presented in as-cast and annealed states. The magnets possess soft magnetic characteristics in as-cast state, and hard magnetic characteristics after annealing. The Nb refines the grain sizes of α-Fe, Fe3B and Nd2Fe14B magnetic phases, while Y and Zr improve the glass forming ability. Good magnetic properties in Fe69B20.2Nd4.2Nb3.3Y2.5Zr0.8 magnets arise from the microstructure composed of magnetically exchange coupled nanocrystalline α-Fe, Fe3B and Nd2Fe14B phases. Optimal annealed magnets of 2 mm diameter and 30 mm length demonstrate the hard magnetic properties, i.e. intrinsic coercivity jHc of 496 kA/m, remanence Br of 0.76 T and maximum energy product (BH)max of 72.0 kJ/m3. - Highlights: • Fe69B20.2Nd4.2Nb3.3Y2.5Zr0.8 magnets are produced by injection casting. • Magnetic properties depend on the intrinsic properties of the phases. • Microstructure is composed of α-Fe, Fe3B and Nd2Fe14B nano-grains. • Magnets demonstrate jHc of 496 kA/m, Br of 0.76 T and (BH)max of 72 kJ/m3

  13. Contributions of the Geomathematics Group, TU Kaiserslautern, to the 2nd International GOCE User Workshop 2004 at ESA-ESRIN Frascati, Italy

    OpenAIRE

    Gutting, Martin; Michel, Dominik

    2004-01-01

    The following two papers present recent developments in multiscale ocean circulation modeling and multiscale gravitational field modeling that have been presented at the 2nd International GOCE User Workshop 2004 in Frascati. Part A - Multiscale Modeling of Ocean Circulation In this paper the applicability of multiscale methods to oceanography is demonstrated. More precisely, we use convolutions with certain locally supported kernels to approximate the dynamic topography and the geostrophic fl...

  14. Process of change in organisations through eHealth: 2nd International eHealth Symposium 2010, Stuttgart, Germany, June 7 - 8, 2010 ; Proceedings edited by Stefan Kirn

    OpenAIRE

    Kirn, Stefan

    2010-01-01

    Foreword: On behalf of the Organizing Committee, it is my pleasure to welcome you to Hohenheim, Stuttgart for the 2nd International eHealth Symposium which is themed 'Process of change in organisations through eHealth'. Starting with the inaugural event in 2009, which took place in Turku, Finland, we want to implement a tradition of international eHealth symposia. The presentations and associated papers in this proceedings give a current and representative outline of technical options, applic...

  15. Software Reviews.

    Science.gov (United States)

    Beezer, Robert A.; And Others

    1988-01-01

    Reviews for three software packages are given. Those packages are: Linear Algebra Computer Companion; Probability and Statistics Demonstrations and Tutorials; and Math Utilities: CURVES, SURFS, AND DIFFS. (PK)

  16. Hybrid distributed Raman amplification combining random fiber laser based 2nd-order and low-noise LD based 1st-order pumping.

    Science.gov (United States)

    Jia, Xin-Hong; Rao, Yun-Jiang; Yuan, Cheng-Xu; Li, Jin; Yan, Xiao-Dong; Wang, Zi-Nan; Zhang, Wei-Li; Wu, Han; Zhu, Ye-Yu; Peng, Fei

    2013-10-21

    A configuration of hybrid distributed Raman amplification (H-DRA), formed by incorporating a random fiber laser (RFL) based 2nd-order pump and a low-noise laser-diode (LD) based 1st-order pump, is proposed in this paper. In comparison to conventional bi-directional 1st-order DRA, the effective noise figure (ENF) is found to be lower by 0 to 4 dB due to the RFL-based 2nd-order pump, depending on the on-off gain, while the low-noise 1st-order Raman pump is used to compensate the degraded signal-to-noise ratio (SNR) towards the far end of the fiber, to avoid the potential nonlinear impact induced by excess injection of pump power, and to suppress the pump-signal relative intensity noise (RIN) transfer. As a result, the gain distribution can be optimized along an ultra-long fiber link, owing to the combination of 2nd-order RFL and low-noise 1st-order pumping, allowing the transmission distance to be extended significantly. We utilized such a configuration to achieve ultra-long-distance distributed sensing based on Brillouin optical time-domain analysis (BOTDA). A repeater-less sensing distance record of up to 154.4 km with 5 m spatial resolution and ~ ± 1.4 °C temperature uncertainty is successfully demonstrated. PMID:24150305

  17. Spectroscopic studies of 5d3/2nd 1D0,2 autoionization lines of barium under collision with rare gases

    Science.gov (United States)

    Afrousheh, K.; Marafi, M.; Kokaj, J.; Makdisi, Y.; Mathew, J.

    2012-05-01

    The spectroscopic behavior of 5d3/2nd (1D0 and 1D2) autoionizing Rydberg series of barium was studied under collision with rare gases. The series members from n = 8 to n = 64 were observed using two-photon excitation of the two valence electrons in the 6s2 1S0 ground state of barium. The barium vapor was produced in a heat-pipe-like oven, and a tunable dye laser pumped by an excimer laser was used as the excitation source. The obtained spectral data have Beutler-Fano profiles. These spectral lines were investigated when inert gases Ar, Kr, and Xe at different pressures were introduced into the oven as perturbing gases. The collision-induced line shifts were measured and the shift parameters for the even-parity 5d3/2nd 1D0 and 5d3/2nd 1D2 (n = 8-35) autoionizing states were extracted from the data. The collision-induced change in the spectral line shape at different Xe pressure was also explored.

  18. GREEN SOFTWARE ENGINEERING PROCESS : MOVING TOWARDS SUSTAINABLE SOFTWARE PRODUCT DESIGN

    OpenAIRE

    Shantanu Ray

    2013-01-01

    The Software development lifecycle (SDLC) currently focuses on systematic execution and maintenance of software by dividing the software development process into various phases that include requirements-gathering, design, implementation, testing, deployment and maintenance. The problem here is that certain important decisions taken in these phases, like use of paper, generation of e-Waste, power consumption and increased carbon footprint by means of travel, air-conditioning, etc., may harm the environment directly or indirectly.

  19. 2nd State of the Onion: Larry Wall's Keynote Address at the Second Annual O'Reilly Perl Conference

    Science.gov (United States)

    This page, part of publisher O'Reilly & Associates' Website devoted to the Perl language, contains a transcript of Larry Wall's keynote address at the second annual O'Reilly Perl Conference, which was held August 17-20, 1998, in San Jose, California. In his keynote address, Larry Wall, the original author of the Perl programming language, provides a thought-provoking (and entertaining) mix of philosophy and technology. Wall's talk touches on the future of the Perl language, the relationship of the free software community to commercial software developers, chaos, complexity, and human symbology. The page also includes copies of graphics used during the keynote.

  20. Application of Software Safety Analysis Methods

    Energy Technology Data Exchange (ETDEWEB)

    Park, G. Y.; Hur, S.; Cheon, S. W.; Kim, D. H.; Lee, D. Y.; Kwon, K. C. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Lee, S. J.; Koo, Y. H. [Doosan Heavy Industries and Construction Co., Daejeon (Korea, Republic of)

    2009-05-15

    A fully digitalized reactor protection system, which is called the IDiPS-RPS, was developed through the KNICS project. The IDiPS-RPS has four redundant and separated channels. Each channel is mainly composed of a group of bistable processors which redundantly compare process variables with their corresponding setpoints and a group of coincidence processors that generate a final trip signal when a trip condition is satisfied. Each channel also contains a test processor called the ATIP and a display and command processor called the COM. All the functions were implemented in software. During the development of the safety software, various software safety analysis methods were applied, in parallel to the verification and validation (V and V) activities, along the software development life cycle. The software safety analysis methods employed were the software hazard and operability (Software HAZOP) study, the software fault tree analysis (Software FTA), and the software failure modes and effects analysis (Software FMEA)
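
    To make the bistable/coincidence structure concrete, the following minimal Python sketch models each bistable processor as a setpoint comparison and the coincidence stage as a vote across redundant channels. The 2-out-of-4 voting threshold is an assumption for illustration only; the abstract states only that a final trip signal is generated when a trip condition is satisfied.

    # Sketch of the bistable-comparison / coincidence-voting structure
    # described above. The 2-out-of-4 voting threshold is an assumption.
    from typing import Sequence

    def bistable_trip(process_value: float, setpoint: float) -> bool:
        """One bistable processor: trip if the variable exceeds its setpoint."""
        return process_value > setpoint

    def coincidence_trip(channel_trips: Sequence[bool], required: int = 2) -> bool:
        """Coincidence processor: final trip when enough channels agree."""
        return sum(channel_trips) >= required

    # Four redundant channels measuring the same process variable:
    readings = [1.02, 0.97, 1.05, 0.99]
    setpoint = 1.00
    print(coincidence_trip([bistable_trip(r, setpoint) for r in readings]))  # True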

  1. EDITORIAL: Selected Papers from OMS'07, the 2nd Topical Meeting of the European Optical Society on Optical Microsystems (OMS)

    Science.gov (United States)

    Rendina, Ivo; Fazio, Eugenio; Ferraro, Pietro

    2008-06-01

    OMS'07 was the 2nd Topical Meeting of the European Optical Society (EOS) on Optical Microsystems (OMS). It was organized by the EOS in the frame of its international topical meeting activity, and after the success of the inaugural meeting was once again held in Italy, 30 September to 3 October 2007, amidst the wonderful scenery of the Island of Capri. The local organizing committee was composed of researchers from `La Sapienza' University in Rome and the National Council of Research (CNR) in Naples, Italy. A selected group of leading scientists in the field formed the international scientific committee. The conference was fully dedicated to the most recent advancements carried out in the field of optical microsystems. More than 150 scientists coming from five continents attended the conference and more than 100 papers were presented, organized into the following sessions: Photonic crystals and metamaterials; Optofluidic microsystems and devices; Optical microsystems and devices; New characterization methods for materials and devices; Application of optical systems; Optical sources and photodetectors; Optical resonators; Nonlinear optic devices; Micro-optical devices. Four keynote lecturers were invited for the Plenary sessions: Federico Capasso, Harvard University, USA; Bahram Javidi, University of Connecticut, USA (Distinguished Lecturer, Emeritus of LEOS--IEEE Society); Demetri Psaltis, EPFL, Lausanne, Switzerland; Amnon Yariv, California Institute of Technology, USA. Furthermore, 21 invited speakers opened each session of the conference with their talks. In addition a special session was organized to celebrate eighty years of the Istituto Nazionale di Ottica Applicata (INOA) of CNR. The special invited speaker for this session was Professor Theodor W Hänsch (Nobel Prize in Physics, 2005), who gave a lecture entitled `What can we do with optical frequency combs?' In this special issue of Journal of Optics A: Pure and Applied Optics, a selection of the most interesting papers presented at OMS'07 has been collected, reporting progress in the different aspects of microsystems design, production, characterization and application. The papers embrace most of the various topics that were debated during the conference. Abstracts for the presentations given at the conference can be found on the OMS'07 website at http://www.inoa.it/oms07/. We would like to thank all the members of the scientific and industrial committees of OMS'07 for the high scientific content of the meeting, the European Optical Society for the irreplaceable support given to the conference organization and the editorial staff at Journal of Optics A for the invaluable work done in preparing the special issue.

  2. PREFACE: The 2nd International Conference on Geological, Geographical, Aerospace and Earth Sciences 2014 (AeroEarth 2014)

    Science.gov (United States)

    Lumban Gaol, Ford; Soewito, Benfano

    2015-01-01

    The 2nd International Conference on Geological, Geographical, Aerospace and Earth Sciences 2014 (AeroEarth 2014), was held at Discovery Kartika Plaza Hotel, Kuta, Bali, Indonesia during 11 - 12 October 2014. The AeroEarth 2014 conference aims to bring together researchers and engineers from around the world. Earth provides resources and the exact conditions to make life possible. However, with the advent of technology and industrialization, the Earth's resources are being pushed to the brink of depletion. Non-sustainable industrial practices are not only endangering the supply of the Earth's natural resources, but are also putting burden on life itself by bringing about pollution and climate change. A major role of earth science scholars is to examine the delicate balance between the Earth's resources and the growing demands of industrialization. Through research and development, earth scientists have the power to preserve the planet's different resource domains by providing expert opinion and information about the forces which make life possible on Earth. We would like to express our sincere gratitude to all in the Technical Program Committee who have reviewed the papers and developed a very interesting Conference Program as well as the invited and plenary speakers. This year, we received 98 papers and after rigorous review, 17 papers were accepted. The participants come from eight countries. There are four Parallel Sessions and two invited Speakers. It is an honour to present this volume of IOP Conference Series: Earth and Environmental Science (EES) and we deeply thank the authors for their enthusiastic and high-grade contributions. Finally, we would like to thank the conference chairmen, the members of the steering committee, the organizing committee, the organizing secretariat and the financial support from the conference sponsors that allowed the success of AeroEarth 2014. The Editors of the AeroEarth 2014 Proceedings Dr. Ford Lumban Gaol Dr. Benfano Soewito

  3. Thematic network on the analysis of thorium and its isotopes in workplace materials. Report on the 2nd intercomparison exercise

    International Nuclear Information System (INIS)

    Work Package 2 (WP 2) of the EC Thematic Network on ''The analysis of thorium and its isotopes in workplace materials'' is concerned with ''Examination and comparison of analytical techniques for the determination of thorium and its progeny in bulk materials and the development of standards, both for the calibration of associated metrological facilities, and for routine quality control purposes''. The primary objective of WP 2 is ''To evaluate appropriate techniques and determine best practice for analysis of 232Th at workplace levels and environments'', and is to be achieved through a series of intercomparison exercises. Following the results of the 1st intercomparison exercise, a 2nd intercomparison exercise was designed to evaluate the capability of the analytical methods currently being used by European laboratories to determine thorium at very low levels in the presence of a complex inorganic matrix. The intercomparison exercise involved the analysis of three samples of thorium in solution, prepared by the United Kingdom National Physical Laboratory. One sample was the equilibrium solution, analysed in the 1st intercomparison exercise. The other two samples were prepared by dilution of a non-equilibrium thorium solution, one of which was spiked with impurity elements. (i) The results showed a further improvement in the accuracy of measurements in the equilibrium solution, when compared with the results from the 1st intercomparison study. There was an overall drop in u-test values and the number of u-test values above the upper limit of significance fell from 6 to 3. (ii) Overall, participating laboratories also performed well in the analysis of the low level non-equilibrium solutions. However, several laboratories using gamma-spectrometry had insufficient sensitivity for measurement of low level non-equilibrium sample solutions and did not report results. (iii) There was little evidence that the presence of impurities had a detrimental effect on measurements made on the low level non-equilibrium sample solutions. (iv) Some laboratories used different analytical techniques in this work than in the 1st intercomparison study. Consequently, there is a slight change in the spread of analytical techniques used

  4. Software Smarts

    Science.gov (United States)

    1998-01-01

    Under an SBIR (Small Business Innovative Research) contract with Johnson Space Center, Knowledge Based Systems Inc. (KBSI) developed an intelligent software environment for modeling and analyzing mission planning activities, simulating behavior, and, using a unique constraint propagation mechanism, updating plans with each change in mission planning activities. KBSI developed this technology into a commercial product, PROJECTLINK, a two-way bridge between PROSIm, KBSI's process modeling and simulation software and leading project management software like Microsoft Project and Primavera's SureTrak Project Manager.

  5. Software requirements

    CERN Document Server

    Wiegers, Karl E

    2003-01-01

    Without formal, verifiable software requirements-and an effective system for managing them-the programs that developers think they've agreed to build often will not be the same products their customers are expecting. In SOFTWARE REQUIREMENTS, Second Edition, requirements engineering authority Karl Wiegers amplifies the best practices presented in his original award-winning text - now a mainstay for anyone participating in the software development process. In this book, you'll discover effective techniques for managing the requirements engineering process all the way through the development cy

  6. Silverlight 4 Business Intelligence Software

    CERN Document Server

    Czernicki, Bart

    2010-01-01

    Business Intelligence (BI) software allows you to view different components of a business using a single visual platform, which makes comprehending mountains of data easier. BI is everywhere. Applications that include reports, analytics, statistics, and historical and predictive modeling are all examples of BI. Currently, we are in the second generation of BI software - called BI 2.0 - which is focused on writing BI software that is predictive, adaptive, simple, and interactive. As computers and software have evolved, more data can be presented to end users with increasingly visually rich tech

  7. Inventory of safeguards software

    International Nuclear Information System (INIS)

    This survey activity will serve as a basis for determining what needs may exist in this arena for the development of next-generation safeguards systems and approaches. 23 software tools are surveyed by JAEA and NMCC. Exchanging information regarding existing software tools for safeguards, and discussing a future R and D program for developing a general-purpose safeguards tool, should be beneficial to safeguards system design and indispensable for evaluating safeguards systems for future nuclear fuel facilities. (author)

  8. Analysis of Polish writing on the history of physical education and sports in the North-Eastern borderlands of the 2nd republic

    Directory of Open Access Journals (Sweden)

    Eligiusz Małolepszy

    2013-05-01

    Full Text Available The aim of this paper is to present the up-to-date state of research on physical education and sports in the North-Eastern Borderlands of the 2nd Republic, based on an analysis of the Polish literature on the subject. In terms of territorial scope, the paper covers the areas of the Polesie, Novogrodek and Vilnius voivodeships. As for studies on the history of physical education and sports in the North-Eastern Borderlands of the 2nd Republic, the most cognitively significant is the work by Laskiewicz, „Kultura fizyczna na Wileńszczyźnie w latach 1900–1939. Zarys monograficzny dziejów” (Physical Culture in the Region of Vilnius in the Years 1900–1939. An Outline of Monographic History). The history of physical culture in rural areas has been fairly well covered. In terms of historiography, there are also publications presenting physical education and sports in urban areas; these mainly refer to physical activity in larger towns and cities, e.g. in Baranowicze, Brest-on-Bug, Lida, Novogrodek and Vilnius. In terms of voivodeships, papers on physical education and sports in the Region of Vilnius clearly predominate. The presented analysis of the state of research – with reference to Polish writings – shows the necessity of supplementary preliminary archival research of the sources in order to prepare a monograph on „Dziejów wychowania fizycznego i sportu na Kresach Północno-Wschodnich II Rzeczypospolitej” (the History of Physical Education and Sports in the North-Eastern Borderlands of the 2nd Republic). A preliminary archival search should also be conducted in the archives kept by Byelorussia and Lithuania.

  9. 2nd (final) IAEA research co-ordination meeting on 'plasma-material interaction data for mixed plasma facing materials in fusion reactors'. Summary report

    International Nuclear Information System (INIS)

    The proceedings and conclusions of the 2nd Research Co-ordination Meeting on 'Plasma-Material Interaction Data for Mixed Plasma Facing Materials in Fusion Reactors', held on October 16 and 17, 2000 at the IAEA Headquarters in Vienna, are briefly described. This report includes a summary of the presentations made by the meeting participants and a review of the accomplishments of the Co-ordinated Research Project (CRP). In addition, short summaries from the participants are included indicating the specific research completed in support of this CRP. (author)

  10. Phase equilibria and crystal chemistry of the CaO–1/2 Nd2O3–CoOz system at 885 °C in air

    International Nuclear Information System (INIS)

    The phase diagram of the CaO–1/2 Nd2O3–CoOz system at 885 °C in air has been determined. The system consists of two calcium cobaltate compounds that have promising thermoelectric properties, namely, the 2D thermoelectric oxide solid solution, (Ca3−xNdx)Co4O9−z (0≤x≤0.5), which has a misfit layered structure, and Ca3Co2O6 which consists of 1D chains of alternating CoO6 trigonal prisms and CoO6 octahedra. Ca3Co2O6 was found to be a point compound without the substitution of Nd on the Ca site. The reported Nd2CoO4 phase was not observed at 885 °C. A ternary (Ca1−xNd1+x)CoO4−z (x=0) phase, or (CaNdCo)O4−z, was found to be stable at this temperature. A solid solution region of distorted perovskite (Nd1−xCax)CoO3−z (0≤x≤0.25, space group Pnma) was established. In the peripheral binary systems, while a solid solution region was identified for (Nd1−xCax)2O3−z (0≤x≤0.2), Nd was not found to substitute in the Ca site of CaO. Six solid solution tie-line regions and six three-phase regions were determined in the CaO–Nd2O3–CoOz system in air. - Graphical abstract: Phase diagram of the 1/2 Nd2O3–CaO–CoOx system at 885 °C, showing the limits of various solid solutions, and the tie-line relationships of various phases. - Highlights: • Phase diagram of the CaO–1/2 Nd2O3–CoOz system constructed. • System consists of thermoelectric oxide (Ca3−xNdx)Co4O9−z (0≤x≤0.5). • Structures of (Nd1−xCax)CoO3−z and (CaNdCo)O4−z determined

  11. Proceedings of the 2nd NUCEF international symposium NUCEF'98. Safety research and development of base technology on nuclear fuel cycle

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-03-01

    This volume contains 68 papers presented at the 2nd NUCEF International Symposium NUCEF'98 held on 16-17 November 1998, in Hitachinaka, Japan, following the 1st symposium NUCEF'95 (Proceedings: JAERI-Conf 96-003). The theme of this symposium was 'Safety Research and Development of Base Technology on Nuclear Fuel Cycle'. The papers were presented in oral and poster sessions on the following research fields: (1) Criticality Safety, (2) Reprocessing and Partitioning, (3) Radioactive Waste Management. The 68 papers are indexed individually. (J.P.N.)

  12. ISE-SPL: uma abordagem baseada em linha de produtos de software aplicada à geração automática de sistemas para educação médica na plataforma E-learning / ISE-SPL: a software product line approach applied to automatic generation of systems for medical education in E-learning platform

    Scientific Electronic Library Online (English)

    Túlio de Paiva Marques, Carvalho; Bruno Gomes de, Araújo; Ricardo Alexsandro de Medeiros, Valentim; Jose, Diniz Junior; Francis Solange Vieira, Tourinho; Rosiane Viana Zuza, Diniz.

    2013-12-01

    Full Text Available INTRODUCTION: E-learning, which refers to the use of Internet-related technologies to improve knowledge and learning, has emerged as a complementary form of education, bringing advantages such as increased accessibility to information, personalized learning, democratization of education and ease of update, distribution and standardization of content. In this sense, this paper presents a tool, named ISE-SPL, whose purpose is the automatic generation of e-learning systems for medical education, making use of ISE (Interactive Spaced-Education) systems and concepts of Software Product Lines. METHODS: The tool consists of an innovative methodology for medical education that aims to assist healthcare professors in their teaching through the use of educational technologies, all based on computing applied to healthcare (health informatics). RESULTS: The tests performed to validate ISE-SPL were divided into two stages: the first was carried out using S.P.L.O.T, a software for analysing tools similar to ISE-SPL, and the second was performed through usability questionnaires answered by healthcare professors who used ISE-SPL. CONCLUSION: Both tests showed positive results, demonstrating that ISE-SPL is an efficient tool for the generation of e-learning software and useful for professors in healthcare.

  13. Preseismic oscillating electric field "strange attractor like" precursor, of T = 6 months, triggered by Ssa tidal wave. Application on large (Ms > 6.0R) EQs in Greece (October 1st, 2006 - December 2nd, 2008)

    CERN Document Server

    Thanassoulas, C; Verveniotis, G; Zymaris, N

    2009-01-01

    In this work the preseismic "strange attractor like" precursor is studied, in the domain of the Earth's oscillating electric field for T = 6 months. It is assumed that the specific oscillating electric field is generated by the corresponding lithospheric oscillation, triggered by the Ssa tidal wave of the same wavelength (6 months) under excess strain load conditions met in the focal area of a future large earthquake. The analysis of the Earth's oscillating electric field recorded by the two distant monitoring sites of PYR and HIO over a period of 26 months (October 1st, 2006 - December 2nd, 2008) suggests that the specific precursor can successfully resolve the predictive time window in terms of months and for a "swarm" of large EQs (Ms > 6.0R), in contrast to the resolution obtained by the use of electric fields of shorter wavelength (T = 1, 14 days, single EQ identification). Moreover, the fractal character of the "strange attractor like" precursor in the frequency domain is pointed out. Fina...

  14. Proceedings of the Fifth Triennial Software Quality Forum 2000, Software for the Next Millennium, Software Quality Forum

    Energy Technology Data Exchange (ETDEWEB)

    Scientific Software Engineering Group, CIC-12

    2000-04-01

    The Software Quality Forum is a triennial conference held by the Software Quality Assurance Subcommittee for the Department of Energy's Quality Managers. The forum centers on key issues, information, and technology important in software development for the Nuclear Weapons Complex. This year it will be opened up to include local information technology companies and software vendors presenting their solutions, ideas, and lessons learned. The Software Quality Forum 2000 will take on a more hands-on, instructional tone than those previously held. There will be an emphasis on providing information, tools, and resources to assist developers in their goal of producing next generation software.

  15. Computing and software

    Directory of Open Access Journals (Sweden)

    White, G. C.

    2004-06-01

    Full Text Available The reality is that the statistical methods used for analysis of data depend upon the availability of software. Analysis of marked animal data is no different than the rest of the statistical field. The methods used for analysis are those that are available in reliable software packages. Thus, the critical importance of having reliable, up-to-date software available to biologists is obvious. Statisticians have continued to develop more robust models, ever expanding the suite of potential analysis methods available. But without software to implement these newer methods, they will languish in the abstract, and not be applied to the problems deserving them. In the Computers and Software Session, two new software packages are described, a comparison of implementation of methods for the estimation of nest survival is provided, and a more speculative paper about how the next generation of software might be structured is presented. Rotella et al. (2004) compare nest survival estimation with different software packages: SAS logistic regression, SAS non-linear mixed models, and Program MARK. Nests are assumed to be visited at various, possibly infrequent, intervals. All of the approaches described compute nest survival with the same likelihood, and require that the age of the nest is known to account for nests that eventually hatch. However, each approach offers advantages and disadvantages, explored by Rotella et al. (2004). Efford et al. (2004) present a new software package called DENSITY. The package computes population abundance and density from trapping arrays and other detection methods with a new and unique approach. DENSITY represents the first major addition to the analysis of trapping arrays in 20 years. Barker & White (2004) discuss how existing software such as Program MARK require that each new model's likelihood must be programmed specifically for that model. They wishfully think that future software might allow the user to combine pieces of likelihood functions together to generate estimates. The idea is interesting, and maybe some bright young statistician can work out the specifics to implement the procedure. Choquet et al. (2004) describe MSURGE, a software package that implements the multistate capture-recapture models. The unique feature of MSURGE is that the design matrix is constructed with an interpreted language called GEMACO. Because MSURGE is limited to just multistate models, the special requirements of these likelihoods can be provided. The software and methods presented in these papers give biologists and wildlife managers an expanding range of possibilities for data analysis. Although ease-of-use is generally getting better, it does not replace the need for understanding of the requirements and structure of the models being computed. The internet provides access to many free software packages as well as user-discussion groups to share knowledge and ideas. (A starting point for wildlife-related applications is http://www.phidot.org.)
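
    As a concrete anchor for the statement that the compared approaches share the same likelihood, one common way to write a constant-daily-survival nest model is shown below; this form is given purely for illustration, and the packages discussed (SAS logistic regression, SAS non-linear mixed models, Program MARK) may parameterize it differently, for example with covariates through a logit link:

    $$L(s)=\prod_{i\in\mathrm{survived}} s^{\,t_i}\;\prod_{j\in\mathrm{failed}}\bigl(1-s^{\,t_j}\bigr),$$

    where $s$ is the daily survival probability and $t_i$ is the number of days in the interval between successive visits to nest $i$. A logistic-regression style implementation would typically model $\mathrm{logit}(s)$ as a linear function of covariates and maximize the same kind of product.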

  16. Phase evolution of Li2ND, LiD and LiND2 in hydriding/dehydriding of Li3N

    Energy Technology Data Exchange (ETDEWEB)

    Chien, W.-M.; Lamb, Joshua [Metallurgical Materials Engineering, MS 388, University of Nevada, Reno, NV 89557 (United States); Chandra, Dhanesh [Metallurgical Materials Engineering, MS 388, University of Nevada, Reno, NV 89557 (United States)], E-mail: dchandra@unr.edu; Huq, Ashfia [Spallation Neutron Source, Oak Ridge National Laboratory, Oak Ridge, TN 37831 (United States); Richardson, James; Maxey, Evan [Intense Pulsed Neutron Source, Argonne National Laboratory, Argonne, IL 60439 (United States)

    2007-10-31

    Neutron and synchrotron studies have been performed on in situ hydriding of Li3N. Commercial Li3N is composed of α phase (~70 wt.%) and β phase (~30 wt.%). We have performed experiments to convert β to α, and studied in situ deuteration using neutron diffraction experiments. We found concurrent phase evolution of Li2ND, LiD, and LiND2. Mass percentages of the phases evolved as a function of time and temperature have been quantified using General Structure Analysis System (GSAS) refinement of the neutron diffraction data. The problem of formation of the stable LiD is discussed in light of the decrease in the amount of LiD phase when the temperature is increased from 200 to 320 °C during dehydriding; in addition, the concentration of the Li2ND phase increased at this temperature. Lattice parameters, volume changes, and phase evolution in wt.% as a function of temperature and time are presented.

  17. Measurements of oscillator strengths of the 2p5(2P1/2)nd J = 2, 3 autoionizing resonances in neon

    International Nuclear Information System (INIS)

    Oscillator strengths of the 2p5(2P1/2)nd J = 2, 3 autoionizing resonances in neon have been determined using a dc discharge plasma in conjunction with an Nd:YAG pumped dye laser system. The excited states are approached using two-step laser excitation via 2p53p'[1/2]1, 2p53p'[3/2]1 and 2p53p'[3/2]2 intermediate states which are accessed from the 2p53s [1/2]2 metastable state, populated by the discharge in the hollow cathode lamp. The f-values have been determined for the nd'[3/2]2, nd'[5/2]2 and nd'[5/2]3 series following the ΔK = ΔJ = +Δl selection rule. Employing the saturation technique the photoionization cross section at the 2p5 2P1/2 ionization threshold is determined as 5.5(6) Mb and consequently the f-values of the nd' J = 2, 3 autoionizing resonances have been extracted

  18. A feasible study of docetaxel/nedaplatin combined chemotherapy for relapsed or refractory esophageal cancer patients as a 2nd-line chemotherapy

    International Nuclear Information System (INIS)

    As a 2nd-line treatment for relapsed or refractory esophageal cancer patients after chemoradiotherapy, we performed combination chemotherapy with docetaxel (DOC)/nedaplatin (CDGP) in 11 patients. Intravenous drip infusion of DOC 30 mg/m2 and CDGP 30 mg/m2 was given on days 1, 8 and 15, with 4 weeks of treatment regarded as 1 cycle. Eight of the 11 patients received more than 2 cycles, and 4 of those 8 patients were also treated with radiation therapy (RT). Response evaluation by the Response Evaluation Criteria In Solid Tumors (RECIST) revealed partial response (PR) in 2 patients (50%), stable disease (SD) in 1 patient and progressive disease (PD) in 1 patient without RT, and PR in 3 patients and no response in 1 patient with RT, respectively. There were no treatment-related deaths and no grade 4 adverse events. Grade 3 hematological toxicity (leukopenia) was observed in 3 patients. No non-hematological toxicities of grade 3 or higher were observed. The combination chemotherapy of DOC/CDGP was concluded to be safe and effective for relapsed or refractory esophageal cancer patients as a 2nd-line treatment. (author)

  19. On factors contributing to quality of nuclear control computer software

    International Nuclear Information System (INIS)

    Safety-related computer software has increasingly come into focus in the software engineering field over the past decade. This paper describes how Ontario Hydro has addressed the software industry concerns in the methodology used for designing and implementing the unit control computer software for the new Darlington Generating Station. The cornerstone of the methodology is a software quality assurance (SQA) program, which was initially set up to cover only the software development portion of the software life cycle, but which is now being extended to cover the entire software life cycle, including commissioning, operation and maintenance of the software. (author). 3 refs., 2 figs

  20. Software Reviews.

    Science.gov (United States)

    Teles, Elizabeth, Ed.; And Others

    1990-01-01

    Reviewed are two computer software packages for Macintosh microcomputers including "Phase Portraits," an exploratory graphics tool for studying first-order planar systems; and "MacMath," a set of programs for exploring differential equations, linear algebra, and other mathematical topics. Features, ease of use, cost, availability, and hardware…

  1. Software Patterns

    Science.gov (United States)

    It is an articulation of the principles, values, and practices behind the pattern discipline. It is organized around a body of dozens of example patterns. It covers a wide range of topics, from pattern forms to pattern languages, to the history of software patterns and pattern ethics. (UNC E-Learning Grant)

  2. Integrated Software Pipelining

    OpenAIRE

    Eriksson, Mattias

    2009-01-01

    In this thesis we address the problem of integrated software pipelining for clustered VLIW architectures. The phases that are integrated and solved as one combined problem are: cluster assignment, instruction selection, scheduling, register allocation and spilling. As a first step we describe two methods for integrated code generation of basic blocks. The first method is optimal and based on integer linear programming. The second method is a heuristic based on genetic algorithms. We then exte...

  3. GREEN SOFTWARE ENGINEERING PROCESS : MOVING TOWARDS SUSTAINABLE SOFTWARE PRODUCT DESIGN

    Directory of Open Access Journals (Sweden)

    Shantanu Ray

    2013-02-01

    Full Text Available The software development lifecycle (SDLC) currently focuses on the systematic execution and maintenance of software by dividing the development process into phases that include requirements gathering, design, implementation, testing, deployment and maintenance. The problem is that certain important decisions taken in these phases, such as the use of paper, generation of e-waste, power consumption, and the increased carbon footprint caused by travel, air conditioning, etc., may harm the environment directly or indirectly. There is a dearth of models that define how software can be developed and maintained in an environmentally friendly way. This paper discusses changes to the existing SDLC and suggests appropriate steps which can lead to lower carbon emissions, power and paper use, thus helping organizations to move towards greener and sustainable software development.

  4. A Survey on Software Testing Techniques using Genetic Algorithm

    OpenAIRE

    Chayanika Sharma; Professor Sangeeta Sabharwal; Asst. Professor Ritu Sibal

    2013-01-01

    The overall aim of the software industry is to ensure delivery of high quality software to the end user. To ensure high quality software, it is required to test software. Testing ensures that software meets user specifications and requirements. However, the field of software testing has a number of underlying issues, such as effective generation of test cases and prioritisation of test cases, which need to be tackled. These issues increase the effort, time and cost of testing. ...

  5. Software requirements: Guidance and control software development specification

    Science.gov (United States)

    Withers, B. Edward; Rich, Don C.; Lowman, Douglas S.; Buckland, R. C.

    1990-01-01

    The software requirements for an implementation of Guidance and Control Software (GCS) are specified. The purpose of the GCS is to provide guidance and engine control to a planetary landing vehicle during its terminal descent onto a planetary surface and to communicate sensory information about that vehicle and its descent to some receiving device. The specification was developed using the structured analysis for real time system specification methodology by Hatley and Pirbhai and was based on a simulation program used to study the probability of success of the 1976 Viking Lander missions to Mars. Three versions of GCS are being generated for use in software error studies.

  6. A Discrete Mechanical Model of 2D Carbon Allotropes Based on a 2nd-Generation REBO Potential: Geometry and Prestress of Single-Walled CNTs

    OpenAIRE

    Favata, Antonino; Micheletti, Andrea; Podio-guidugli, Paolo; Pugno, Nicola

    2014-01-01

    Predicting the natural equilibrium radius of a single-walled Carbon NanoTube (CNT) of given chirality, and evaluating - if any - the accompanying prestress state, are two important issues, the first of which has been repeatedly taken up in the last decade or so. In this paper, we work out both such a prediction and such an evaluation for achiral (that is, armchair and zigzag) CNTs, modeled as discrete elastic structures whose shape and volume changes are governed by a Reac...

  7. SHARK (System for coronagraphy with High order Adaptive optics from R to K band): a proposal for the LBT 2nd generation instrumentation

    Science.gov (United States)

    Farinato, Jacopo; Pedichini, Fernando; Pinna, Enrico; Baciotti, Francesca; Baffa, Carlo; Baruffolo, Andrea; Bergomi, Maria; Bruno, Pietro; Cappellaro, Enrico; Carbonaro, Luca; Carlotti, Alexis; Centrone, Mauro; Close, Laird; Codona, Johanan; Desidera, Silvano; Dima, Marco; Esposito, Simone; Fantinel, Daniela; Farisato, Giancarlo; Fontana, Adriano; Gaessler, Wolfgang; Giallongo, Emanuele; Gratton, Raffaele; Greggio, Davide; Guerra, Juan Carlos; Guyon, Olivier; Hinz, Philip; Leone, Francesco; Lisi, Franco; Magrin, Demetrio; Marafatto, Luca; Munari, Matteo; Pagano, Isabella; Puglisi, Alfio; Ragazzoni, Roberto; Salasnich, Bernardo; Sani, Eleonora; Scuderi, Salvo; Stangalini, Marco; Testa, Vincenzo; Verinaud, Christophe; Viotto, Valentina

    2014-08-01

    This article presents a proposal aimed at investigating the technical feasibility and the scientific capabilities of high contrast cameras to be implemented at LBT. Such an instrument will fully exploit the unique LBT capabilities in Adaptive Optics (AO), as demonstrated by the First Light Adaptive Optics (FLAO) system, which is obtaining excellent results in terms of performance and reliability. The aim of this proposal is to show the scientific interest of such a project, together with a conceptual opto-mechanical study which shows its technical feasibility, taking advantage of the already existing AO systems, which are delivering the highest Strehl ratios achieved at existing telescopes. Two channels are foreseen for SHARK, a near infrared channel (2.5-0.9 µm) and a visible one (0.9-0.6 µm), both providing imaging and coronagraphic modes. The visible channel is equipped with a very fast and low noise detector running at 1.0 kfps and an IFU spectroscopic port to provide low and medium resolution spectra of 1.5 x 1.5 arcsec fields. The search for extrasolar giant planets is the main science case and the driver for the technical choices of SHARK, while leaving room for several other interesting scientific topics, which are briefly described here.

  8. Design and manufacture of a D-shape coil-based toroid-type HTS DC reactor using 2nd generation HTS wire

    Science.gov (United States)

    Kim, Kwangmin; Go, Byeong-Soo; Sung, Hae-Jin; Park, Hea-chul; Kim, Seokho; Lee, Sangjin; Jin, Yoon-Su; Oh, Yunsang; Park, Minwon; Yu, In-Keun

    2014-09-01

    This paper describes the design specifications and performance of a real toroid-type high temperature superconducting (HTS) DC reactor. The HTS DC reactor was designed using 2G HTS wires. The HTS coils of the toroid-type DC reactor magnet were made in the form of a D-shape. The target inductance of the HTS DC reactor was 400 mH. The expected operating temperature was under 20 K. The electromagnetic performance of the toroid-type HTS DC reactor magnet was analyzed using a finite element method program. A conduction cooling method was adopted for reactor magnet cooling. The performance of the toroid-type HTS DC reactor was analyzed through experiments conducted under steady-state and charging conditions. The fundamental design specifications and the data obtained from this research will be applied to the design of a commercial-type HTS DC reactor.
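
    To give a rough feel for how a 400 mH target relates to winding geometry, the sketch below estimates the inductance of an ideal toroid with a rectangular cross-section in Python. The turn count and dimensions are hypothetical placeholders, not the reactor's actual design values, and the rectangular-section formula is only a crude stand-in for D-shaped coils.

```python
# Rough toroidal-inductance estimate (rectangular cross-section approximation).
# All dimensions are hypothetical placeholders, not the reactor's actual design values.
import math

MU0 = 4e-7 * math.pi  # vacuum permeability, H/m

def toroid_inductance(n_turns, r_inner, r_outer, height):
    """L = mu0 * N^2 * h * ln(b/a) / (2*pi) for an ideal toroid."""
    return MU0 * n_turns**2 * height * math.log(r_outer / r_inner) / (2 * math.pi)

if __name__ == "__main__":
    # Hypothetical geometry: 30 D-shaped coils of ~120 turns each.
    n_total = 30 * 120
    L = toroid_inductance(n_total, r_inner=0.35, r_outer=0.65, height=0.30)
    print(f"Estimated inductance: {L*1e3:.0f} mH (target quoted in the record: 400 mH)")
```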

  9. Software for mass spectrometer control

    International Nuclear Information System (INIS)

    The paper describes a software application for control of the refurbished MAT 250 mass spectrometer. The spectrometer was brought up to date using a new hardware structure, on top of which the control software application was developed. The software application is composed of dedicated modules that perform given operations. The instructions that these modules have to perform are generated by a principal module, which makes possible the exchange of information between the modules that compose the application. The use of a modular structure makes it easy to add new functions in the future. The application developed in our institute transformed the MAT 250 mass spectrometer into a device endowed with new-generation tools. (authors)

  10. Software testing in roughness calculation

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Y L [Center for Measurement Standards/ITRI, Bldg. 16, 321 Kuang Fu Rd, Sec. 2, Hsinchu, Taiwan (China); Hsieh, P F [United Microelectronics Corp., No. 3, Li-Hsin 2nd Rd., Hsinchu Science Park, Hsinchu, Taiwan (China); Fu, W E [Center for Measurement Standards/ITRI, Bldg. 16, 321 Kuang Fu Rd, Sec. 2, Hsinchu, Taiwan (China)

    2005-01-01

    A test method to determine the quality of the functions provided by software for roughness measurement is presented in this study. The function quality of the software should be assessed throughout the entire life cycle of the software package. The specific function, or output accuracy, is crucial for the analysis of experimental data. For scientific applications, however, commercial software is usually embedded in a specific instrument, which is used for measurement or analysis during the manufacturing process. In general, the error ratio caused by the software becomes more apparent when dealing with relatively small quantities, such as measurements in the nanometer-scale range. The model of 'using a data generator' proposed by NPL (UK) was applied in this study. An example of roughness software was tested and analyzed by the above-mentioned process. After selecting the 'reference results', the 'reference data' were generated by a programmable 'data generator'. The filter function with a 0.8 mm cutoff value, defined in ISO 11562, was tested with 66 sinusoidal datasets at different wavelengths. Test results from the commercial software and a CMS-written program were compared to the theoretical data calculated from the ISO standards. For the filter function in this software, the results showed a significant disagreement between the reference and test results. The short-cutoff feature for filtering at high frequencies does not function properly, while the long-cutoff feature shows the maximum difference in the filtering ratio, more than 70% between the wavelengths of 300 µm and 500 µm. In conclusion, the commercial software needs to be tested more extensively for specific applications by appropriate design of reference datasets to ensure its function quality.
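
    The 'data generator' approach described above can be illustrated with a short sketch: known sinusoidal profiles are passed through a discrete Gaussian profile filter and the observed amplitude transmission is compared with the theoretical ISO 11562 characteristic. The wavelengths, sampling step and evaluation length below are illustrative assumptions, not the paper's 66-wavelength dataset, and the filter is a generic Gaussian mean-line filter rather than the tested commercial implementation.

```python
# Sketch of the "data generator" idea: known sinusoidal inputs are filtered and the
# observed transmission is compared against the theoretical ISO 11562 Gaussian
# characteristic. Wavelengths and sampling are illustrative, not the paper's dataset.
import numpy as np

ALPHA = np.sqrt(np.log(2) / np.pi)  # ~0.4697, ISO 11562 constant

def gaussian_weights(dx, cutoff):
    """Discrete Gaussian weighting function, truncated at +/- one cutoff length."""
    x = np.arange(-cutoff, cutoff + dx, dx)
    s = np.exp(-np.pi * (x / (ALPHA * cutoff)) ** 2) / (ALPHA * cutoff)
    return s / np.sum(s)  # normalise so the DC gain is 1

def mean_line(profile, dx, cutoff=0.8):
    """Waviness (mean line) profile obtained by Gaussian filtering."""
    return np.convolve(profile, gaussian_weights(dx, cutoff), mode="same")

if __name__ == "__main__":
    dx = 0.001          # sampling step, mm
    length = 8.0        # evaluation length, mm
    x = np.arange(0, length, dx)
    for wavelength in (0.3, 0.5, 0.8, 2.5):   # mm
        z = np.sin(2 * np.pi * x / wavelength)
        w = mean_line(z, dx)
        # measure transmission on the central part to avoid edge effects
        core = slice(len(x) // 4, 3 * len(x) // 4)
        measured = w[core].max()                       # output amplitude (input = 1)
        theory = np.exp(-np.pi * (ALPHA * 0.8 / wavelength) ** 2)
        print(f"lambda={wavelength} mm  measured={measured:.3f}  ISO theory={theory:.3f}")
```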

  11. EPIQR software

    Energy Technology Data Exchange (ETDEWEB)

    Flourentzos, F. [Federal Institute of Technology, Lausanne (Switzerland); Droutsa, K. [National Observatory of Athens, Athens (Greece); Wittchen, K.B. [Danish Building Research Institute, Hoersholm (Denmark)

    1999-11-01

    The support of the EPIQR method is a multimedia computer program. Several modules help users of the method to treat the data collected during a diagnosis survey, to set up refurbishment scenarios and calculate their cost or energy performance, and finally to visualize the results in a comprehensive way and to prepare quality reports. This article presents the structure and the main features of the software. (au)

  12. Instrumentation-software system of controllable well logging neutron generator AINK-36-3Ts and its application in petroleum geology

    International Nuclear Information System (INIS)

    In 1999 the Science and Technical Department of JSC Tatneftegeofizika developed an instrumentation-software system for pulsed neutron logging, AINK-36-3Ts, which offers advanced capabilities in logging oil and gas wells. This system provides: (i) a multispaced (3 spacings) pulsed neutron gamma log; (ii) a multispaced neutron activation oxygen log; (iii) a natural gamma-ray log. Specific features of this system are the use of symmetrical spacings (60 cm), the use of PIC processors in the downhole unit, and the use of realistic adaptive models of the PNGL response. Basic instrumentation specifications of the system are as follows: yield of 14-MeV neutrons: 6×10^7 n/s; modulation frequency of neutron bursts: 20 Hz; guaranteed serviceability period of the neutron tube: 100 h; outer diameter of the logging tool: 36 mm; total length of the logging tool: 3.1 m; number of spacings: 3 (30, 60, -60 cm); telemetry type: Manchester II. Preprocessing of the tool response is performed in the downhole unit; the digital information is then transmitted to the surface and processed on a surface-based workstation. High stability of the neutron yield and the two-exponent model applied in data processing allow one to determine the parameter ?af with an accuracy of up to 5% and the fluid production rate in the well (from the neutron-activation oxygen log) with an accuracy of up to 5%. This instrumentation-software system can be applied in oil and gas fields to solve the following problems: lithological analysis of productive formations and estimating their flow properties; controlling the position of WOC and GFC in pay zones under production; determining the residual oil saturation in completely watered, originally oil-prone formations; and identifying the hydrodynamic communication between perforated productive formations and above- and below-lying water-bearing beds. The use of symmetrical spacings provides a separate estimation of upward and downward water (or watered product) flow rates. The AINK-36-3Ts instrumentation-software system is presently being tested on the oil fields of Tatarstan. (author)
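
    As an illustration of the two-exponent model mentioned above, the following sketch fits a two-exponential die-away curve to synthetic count data with SciPy. The decay constants, amplitudes and timing are invented for the example; the actual AINK-36-3Ts processing chain is not reproduced here.

```python
# Illustrative two-exponential fit to a synthetic thermal-neutron die-away curve.
# Decay constants and amplitudes are invented for the example; the real AINK-36-3Ts
# processing chain is not reproduced here.
import numpy as np
from scipy.optimize import curve_fit

def two_exp(t, a1, lam1, a2, lam2):
    """Borehole + formation components of the die-away signal."""
    return a1 * np.exp(-lam1 * t) + a2 * np.exp(-lam2 * t)

rng = np.random.default_rng(0)
t = np.linspace(0.05, 2.0, 80)                     # ms after the neutron burst
true = two_exp(t, 800.0, 6.0, 300.0, 1.5)          # hypothetical count rates
counts = rng.poisson(true)                          # counting statistics

p0 = (500.0, 5.0, 200.0, 1.0)                       # rough initial guess
popt, pcov = curve_fit(two_exp, t, counts, p0=p0, sigma=np.sqrt(counts + 1))
a1, lam1, a2, lam2 = popt
print(f"fast component: A={a1:.0f}, lambda={lam1:.2f} 1/ms")
print(f"slow component: A={a2:.0f}, lambda={lam2:.2f} 1/ms")
```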

  13. Software preservation

    Directory of Open Access Journals (Sweden)

    Tadej Vodopivec

    2011-01-01

    Full Text Available Comtrade Ltd. covers a wide range of activities related to information and communication technologies; its deliverables include web applications, locally installed programs, system software, drivers, and embedded software (used e.g. in medical devices, auto parts, communication switchboards). Extensive knowledge and practical experience of digital long-term preservation technologies have also been acquired. This wide spectrum of activities puts us in a position to discuss an often overlooked aspect of digital preservation - the preservation of software programs. There are many resources dedicated to the digital preservation of data, documents and multimedia records, but not so many about how to preserve the functionalities and features of computer programs. Exactly these functionalities - dynamic response to inputs - make computer programs rich compared to documents or linear multimedia. The article opens questions at the start of the path towards permanent digital preservation. The purpose is to find a way in the right direction, where all relevant aspects will be covered in proper balance. The following questions are asked: why preserve computer programs permanently at all, who should do this and for whom, when should we think about permanent program preservation, what should be preserved (such as source code, screenshots, documentation, and the social context of the program - e.g. media response to it ...), and where and how? To illustrate the theoretical concepts, the idea of a virtual national museum of electronic banking is also presented.

  14. SOFTWARE TOOL FOR LEARNING THE GENERATION OF THE CARDIOID CURVE IN AN AUTOCAD ENVIRONMENT / HERRAMIENTA SOFTWARE PARA EL APRENDIZAJE DE LA GENERACIÓN DE LA CARDIODE EN UN ENTORNO AUTOCAD

    Scientific Electronic Library Online (English)

    MIGUEL ÁNGEL, GÓMEZ-ELVIRA-GONZÁLEZ; JOSÉ IGNACIO, ROJAS-SOLA; MARÍA DEL PILAR, CARRANZA-CAÑADAS.

    2012-02-01

    Full Text Available This article presents a novel application developed in Visual LISP for the AutoCAD environment, which shows the generation of the cardioid curve intuitively and quickly in five different ways (using the conchoid of a circumference, the pedal curve of a circumference, the inverse of a parabola, the orthoptic curve of a circumference, and the epicycloid of a circumference). This cyclic curve has a large number of artistic and technical applications, among them the profile of some cams.

  15. Focal plane detector for QDD spectrography in Institute of Nuclear Study and detector for SMART 2nd focal plane in RIKEN

    Energy Technology Data Exchange (ETDEWEB)

    Fuchi, Yoshihide [Tokyo Univ., Tanashi (Japan). Inst. for Nuclear Study

    1996-09-01

    The focal plane detector for the QDD spectrograph at the Institute of Nuclear Study is composed of a drift space and a proportional counter tube, the latter consisting of a position detector and two ΔE detectors for particle identification. In this detector, a uniform parallel electric field is obtained by placing a guard plate at the same height as the drift plate at the outer part of the detector. The detector for the SMART 2nd focal plane at RIKEN, on the other hand, is composed of a drift space and a single-wire proportional counter, and has two cathode-readout single-wire drift counters set so as to straddle the focal plane. (G.K.)

  16. Proceedings of the 2nd international workshop on electromagnetic forces and related effects on blankets and other structures surrounding the fusion plasma torus

    Science.gov (United States)

    Takagi, T.; Nishiguchi, I.; Yoshida, Y.

    This publication is a collection of the papers presented at the title meeting. The subjects of the papers are categorized into six parts contained in this volume. In the 1st part, valuable experiences are presented concerning electromagnetic phenomena in existing large devices or those under construction. In the 2nd part, the papers mainly concern the evaluation of electromagnetic fields and forces for the next experimental reactors. In the 3rd part, electromagneto-mechanical coupling problems are treated by numerical and experimental approaches. In the 4th part, numerical and experimental approaches for ferromagnetic structures are presented. In the 5th part, papers related to structural integrity evaluation are presented. The 6th part is devoted to the proposal of an intelligent material system. A summary of the panel discussion held at the final session of the workshop is also included at the end of this volume. Twenty-two of the presented papers are indexed individually.

  17. Double NASICON-type cell: ordered Nd3+ distribution in Li0.2Nd0.8/3Zr2(PO4)3.

    Science.gov (United States)

    Barré, Maud; Crosnier-Lopez, Marie-Pierre; Le Berre, Françoise; Bohnké, Odile; Suard, Emmanuelle; Fourquet, Jean-Louis

    2008-06-21

    The NASICON compound Li0.2Nd0.8/3Zr2(PO4)3, synthesized by a sol-gel process, has been structurally characterized by TEM and powder diffraction (neutron and X-ray). It crystallizes in the space group R-3 (No. 148): at room temperature, the Nd3+ ions present an ordered distribution in the [Zr2(PO4)3]- network which leads to a doubling of the classical c parameter (a = 8.7160(3) Å, c = 46.105(1) Å). Above 600 °C, Nd3+ diffusion occurs, leading at 1000 °C to the loss of the supercell. This reversible cationic diffusion in a preserved 3D [Zr2(PO4)3]- network is followed through thermal X-ray diffraction. Ionic conductivity measurements have been undertaken by impedance spectroscopy, and some results concerning the sintering of the NASICON compound are given. PMID:18521448

  18. Final Report of the 2nd Ex-Vessel Neutron Dosimetry Installation And Evaluations for Kori Unit 1 Reactor Pressure Vessel

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Byoung Chul; Yoo, Choon Sung; Lee, Sam Lai; Chang, Kee Ok; Gong, Un Sik; Choi, Kwon Jae; Chang, Jong Hwa; Kim, Kwan Hyun; Hong, Joon Wha

    2007-02-15

    This report describes a neutron fluence assessment performed for the Kori Unit 1 pressure vessel beltline region based on the guidance specified in Regulatory Guide 1.190. In this assessment, maximum fast neutron exposures expressed in terms of fast neutron fluence (E>1 MeV) and iron atom displacements (dpa) were established for the beltline region of the pressure vessel. After Cycle 22 of reactor operation, 2nd Ex-Vessel Neutron Dosimetry Program was instituted at Kori Unit 1 to provide continuous monitoring of the beltline region of the reactor vessel. The use of the Ex-Vessel Neutron Dosimetry Program coupled with available surveillance capsule measurements provides a plant specific data base that enables the evaluation of the vessel exposure and the uncertainty associated with that exposure over the service life of the unit. Ex-Vessel Neutron Dosimetry has been evaluated at the conclusion of Cycle 23.

  19. Analysis of the 2nd Ispra test on the 1/6TH scale model of the SNR-300 fast breeder reactor with the PISCES-2 DELK code

    International Nuclear Information System (INIS)

    This paper presents a numerical simulation of the 2nd test carried out at the JRC in Ispra (Italy) on the 1/6th scale vessel model of the SNR-300 LMFBR. The computer code used for the analysis was the hybrid code PISCES-2D-ELK, an explicit finite-difference code capable of simulating fluids and structures in Eulerian and Lagrangian reference frames, with a thin-shell option available. The analysis was performed for a total transient time of 10 ms and the results are presented in the form of contour/velocity-vector plots at several intervals. The evolution of some important parameters, such as the impulse and strain at the shield tank, the pressure and impulse at the vessel roof (water hammer), and the strain profiles of the shield tank and the main vessel, is compared with the measured experimental results. (orig./GL)

  20. Use of thermodynamic calculation for investigating phase diagram of the ternary system NaCl-PbCl2-NdCl3

    International Nuclear Information System (INIS)

    A thermodynamic calculation of the fusibility diagram of the ternary system NaCl-PbCl2-NdCl3 has been carried out using literature and experimental data on the fusibility diagrams of the binary systems forming it, as well as data on the heats of crystallization of the components. The equations are derived under the assumption that the ternary system is pseudo-ideal. The calculated composition of the ternary eutectic is 64 mol.% PbCl2, 26 mol.% NaCl, 10 mol.% NdCl3, with an average crystallization temperature of 391 °C; that of the ternary peritectic is 49 mol.% PbCl2, 35 mol.% NaCl, 16 mol.% NdCl3, with an average peritectic transformation temperature of 416 °C. The results obtained agree well with the experimental data.
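
    To make the idea of a liquidus calculation for an ideal (or pseudo-ideal) system concrete, the sketch below applies the Schröder-van Laar equation to a hypothetical binary system and locates the eutectic as the intersection of the two liquidus branches. The melting points and enthalpies of fusion are illustrative values, not the NaCl-PbCl2-NdCl3 data used in the record, and the real ternary calculation involves additional terms not shown here.

```python
# Minimal ideal-solution liquidus sketch (Schroeder-van Laar equation) for a
# hypothetical binary A-B system; enthalpies of fusion and melting points are
# illustrative, not the NaCl-PbCl2-NdCl3 data used in the record.
import numpy as np

R = 8.314  # J/(mol K)

def liquidus_T(x, t_m, dh_fus):
    """Temperature at which a liquid with component mole fraction x is saturated
    with the pure solid: ln x = -(dH_fus/R) * (1/T - 1/T_m)."""
    return 1.0 / (1.0 / t_m - R * np.log(x) / dh_fus)

# hypothetical components A and B
TM_A, DH_A = 1074.0, 28000.0   # K, J/mol
TM_B, DH_B = 774.0, 22000.0

xb = np.linspace(0.01, 0.99, 981)
T_from_A = liquidus_T(1.0 - xb, TM_A, DH_A)   # A-rich liquidus branch
T_from_B = liquidus_T(xb, TM_B, DH_B)         # B-rich liquidus branch

i = np.argmin(np.abs(T_from_A - T_from_B))    # eutectic = branch intersection
print(f"eutectic at x_B = {xb[i]:.2f}, T = {T_from_A[i]:.0f} K")
```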

  1. Synthesis and characterization of new cuprates (Pb0.75Mo0.25)Sr2(Nd1-xCax)Cu2O7-δ

    International Nuclear Information System (INIS)

    New cuprates with nominal composition (Pb0.75Mo0.25)Sr2(Nd1-xCax)Cu2O7-δ (0 ≤ x ≤ 1) were synthesized by solid state reactions in N2. The crystal structure was characterized by x-ray powder diffraction as tetragonal. Dc electrical resistance and ac magnetic susceptibility measurements were applied to check the existence of superconductivity in these cuprates. Superconductivity with Tc up to 73 K is observed in the range 0.4 ≤ x ≤ 0.6. Synthesis in N2 is found to be necessary for obtaining superconductivity in these cuprates. Preparation in air or post-treatment in flowing oxygen destroys superconductivity. Comparison with previous Pb-based 1212 superconducting oxides is made. The valence of Pb and the possible position of Mo in the lattice are discussed. (author)

  2. Final Report of the 2nd Ex-Vessel Neutron Dosimetry Installation And Evaluations for Yonggwang Unit 1 Reactor Pressure Vessel

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Byoung Chul; Yoo, Choon Sung; Lee, Sam Lai; Chang, Kee Ok; Gong, Un Sik; Choi, Kwon Jae; Chang, Jong Hwa; Li, Nam Jin; Hong, Joon Wha

    2007-01-15

    This report describes a neutron fluence assessment performed for the Yonggwang Unit 1 pressure vessel belt line region based on the guidance specified in Regulatory Guide 1.190. In this assessment, maximum fast neutron exposures expressed in terms of fast neutron fluence (E>1 MeV) and iron atom displacements (dpa) were established for the belt line region of the pressure vessel. During Cycle 16 of reactor operation, 2nd Ex-Vessel Neutron Dosimetry Program was instituted at Yonggwang Unit 1 to provide continuous monitoring of the belt line region of the reactor vessel. The use of the Ex-Vessel Neutron Dosimetry Program coupled with available surveillance capsule measurements provides a plant specific data base that enables the evaluation of the vessel exposure and the uncertainty associated with that exposure over the service life of the unit. Ex-Vessel Neutron Dosimetry has been evaluated at the conclusion of Cycle 16.

  3. Office Computer Software: A Comprehensive Review of Software Programs.

    Science.gov (United States)

    Secretary, 1992

    1992-01-01

    Describes types of software including system software, application software, spreadsheets, accounting software, graphics packages, desktop publishing software, database, desktop and personal information management software, project and records management software, groupware, and shareware. (JOW)

  4. Phase equilibria and crystal chemistry of the CaO-1/2Nd2O3-CoOz system at 885 °C in air

    Science.gov (United States)

    Wong-Ng, W.; Laws, W.; Talley, K. R.; Huang, Q.; Yan, Y.; Martin, J.; Kaduk, J. A.

    2014-07-01

    The phase diagram of the CaO-1/2Nd2O3-CoOz system at 885 °C in air has been determined. The system contains two calcium cobaltate compounds that have promising thermoelectric properties, namely the 2D thermoelectric oxide solid solution (Ca3-xNdx)Co4O9-z (0 ≤ x ≤ 0.5), which has a misfit layered structure, and Ca3Co2O6, which consists of 1D chains of alternating CoO6 trigonal prisms and CoO6 octahedra. Ca3Co2O6 was found to be a point compound without substitution of Nd on the Ca site. The reported Nd2CoO4 phase was not observed at 885 °C. A ternary (Ca1-xNd1+x)CoO4-z (x=0) phase, or (CaNdCo)O4-z, was found to be stable at this temperature. A solid solution region of distorted perovskite (Nd1-xCax)CoO3-z (0 ≤ x ≤ 0.25, space group Pnma) was established. In the peripheral binary systems, a solid solution region was identified for (Nd1-xCax)2O3-z (0 ≤ x ≤ 0.2), while Nd was not found to substitute on the Ca site of CaO. Six solid-solution tie-line regions and six three-phase regions were determined in the CaO-Nd2O3-CoOz system in air.

  5. Development of a Risk-Based Probabilistic Performance-Assessment Method for Long-Term Cover Systems - 2nd Edition

    International Nuclear Information System (INIS)

    A probabilistic, risk-based performance-assessment methodology has been developed to assist designers, regulators, and stakeholders in the selection, design, and monitoring of long-term covers for contaminated subsurface sites. This report describes the method, the software tools that were developed, and an example that illustrates the probabilistic performance-assessment method using a repository site in Monticello, Utah. At the Monticello site, a long-term cover system is being used to isolate long-lived uranium mill tailings from the biosphere. Computer models were developed to simulate relevant features, events, and processes, including water flux through the cover, source-term release, vadose-zone transport, saturated-zone transport, gas transport, and exposure pathways. The component models were then integrated into a total-system performance-assessment model, and uncertainty distributions of important input parameters were constructed and sampled in a stochastic Monte Carlo analysis. Multiple realizations were simulated using the integrated model to produce cumulative distribution functions of the performance metrics, which were used to assess cover performance for both present-day and long-term future conditions. Performance metrics for this study included the water percolation reaching the uranium mill tailings, radon gas flux at the surface, groundwater concentrations, and dose. Results from uncertainty analyses, sensitivity analyses, and alternative design comparisons are presented for each of the performance metrics. The benefits of this methodology include a quantification of uncertainty, the identification of parameters most important to performance (to prioritize site characterization and monitoring activities), and the ability to compare alternative designs using probabilistic evaluations of performance (for cost savings).
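
    The stochastic Monte Carlo workflow described above can be sketched in a few lines: sample uncertain inputs from assumed distributions, evaluate a model for each realization, and build a cumulative distribution function of the performance metric. The toy water-balance model and the distributions below are illustrative assumptions only, not the Monticello cover model.

```python
# Sketch of the stochastic workflow described above: sample uncertain inputs,
# run a simple surrogate model per realization, and build a CDF of the metric.
# The "model" here is a toy water-balance expression, not the Monticello cover model.
import numpy as np

rng = np.random.default_rng(42)
n_real = 5000

# Uncertain inputs (all distributions are illustrative assumptions)
precip = rng.normal(380.0, 60.0, n_real)            # mm/yr annual precipitation
et_frac = rng.uniform(0.80, 0.98, n_real)           # fraction lost to evapotranspiration
runoff_frac = rng.uniform(0.00, 0.10, n_real)       # fraction lost to runoff

# Toy performance metric: percolation flux through the cover (mm/yr)
percolation = np.clip(precip * (1.0 - et_frac - runoff_frac), 0.0, None)

# Empirical CDF of the performance metric
sorted_q = np.sort(percolation)
cdf = np.arange(1, n_real + 1) / n_real
p95 = np.percentile(percolation, 95)
print(f"median percolation: {np.median(percolation):.1f} mm/yr, 95th percentile: {p95:.1f} mm/yr")
```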

  6. SOFTWARE METRICS VALIDATION METHODOLOGIES IN SOFTWARE ENGINEERING

    Directory of Open Access Journals (Sweden)

    K.P. Srinivasan

    2014-12-01

    Full Text Available In software measurement, assessing the validity of software metrics is a very difficult task due to the lack of both theoretical and empirical methodology [41, 44, 45]. During recent years, a number of researchers have addressed the issue of validating software metrics. At present, software metrics are validated theoretically using properties of measures. Software measurement plays an important role in understanding and controlling software development practices and products. The major requirement in software measurement is that the measures must represent accurately those attributes they purport to quantify, and validation is critical to the success of software measurement. Normally, validation is a collection of analysis and testing activities across the full life cycle that complements the efforts of other quality engineering functions; it is a critical task in any engineering project. The objective of validation is to discover defects in a system and to assess whether or not the system is useful and usable in an operational situation. In software engineering, validation is one of the disciplines that help build quality into software. The major objective of the software validation process is to determine that the software performs its intended functions correctly and to provide information about its quality and reliability. This paper discusses the validation methodology, techniques and different properties of measures that are used for software metrics validation. In most cases, theoretical and empirical validations are conducted for software metrics validation in software engineering [1-50].

  7. Trans-articular chondrosarcoma grade 2 of proximal phalanx resulting in its fracture along with destruction of middle phalanx of 2nd toe right foot: a case report and review of the literature

    OpenAIRE

    Bashir, Sheikh Irfan; Gupta, Rajesh; Khan, Haris Nazir; Ahmed, Rayees; Mohd, Ashraf; Salaria, Abdul Q.

    2009-01-01

    The foot is an unusual site for chondrosarcoma and involvement of the phalanges is extremely rare. We report a case of grade 2 chondrosarcoma of the proximal phalanx resulting in its fracture along with trans-articular extension to the middle phalanx of the 2nd toe of the right foot in a 62 year old female. The patient presented with a 1½-year history of pain and swelling in the right 2nd toe. X-ray showed an expanding lytic lesion with amorphous calcification along with a fracture of the proximal phalanx. Fi...

  8. Honeywell Modular Automation System Computer Software Documentation

    International Nuclear Information System (INIS)

    The purpose of this Computer Software Document (CSWD) is to provide configuration control of the Honeywell Modular Automation System (MAS) in use at the Plutonium Finishing Plant (PFP). This CSWD describes the hardware and the PFP-developed software for control of stabilization furnaces. The Honeywell software can generate configuration reports for the developed control software. These reports are described in the following section and are attached as addenda. This plan applies to the PFP Engineering Manager, Thermal Stabilization Cognizant Engineers, and the Shift Technical Advisors responsible for the Honeywell MAS software/hardware and administration of the Honeywell System

  9. Software for parallel processing applications

    International Nuclear Information System (INIS)

    Parallel computing has been used to solve large computing problems in high-energy physics. Typical problems include offline event reconstruction, Monte Carlo event generation and reconstruction, and lattice QCD calculations. Fermilab has extensive experience in parallel computing, using CPS (cooperative processes software) and networked UNIX workstations for the loosely-coupled problems of event reconstruction and Monte Carlo generation, and CANOPY and ACPMAPS for lattice QCD. Both systems will be discussed. Parallel software has been developed by many other groups, both commercial and research-oriented. Examples include PVM, Express and network-Linda for workstation clusters and PCN and STRAND88 for more tightly-coupled machines
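
    The loosely-coupled pattern mentioned above (independent event batches farmed out to worker processes) can be illustrated with plain Python multiprocessing; this is a generic sketch, not the CPS, CANOPY or ACPMAPS APIs named in the record.

```python
# Generic illustration of the loosely-coupled pattern: independent event batches
# are processed in parallel worker processes. This is plain Python multiprocessing,
# not the CPS, CANOPY or ACPMAPS systems mentioned in the record.
import multiprocessing as mp
import random

def generate_and_reconstruct(seed):
    """Stand-in for Monte Carlo generation + reconstruction of one event batch."""
    rng = random.Random(seed)
    hits = [rng.gauss(0.0, 1.0) for _ in range(10_000)]
    return sum(h * h for h in hits) / len(hits)   # toy "reconstructed" quantity

if __name__ == "__main__":
    seeds = range(32)                              # 32 independent batches
    with mp.Pool(processes=4) as pool:
        results = pool.map(generate_and_reconstruct, seeds)
    print(f"processed {len(results)} batches, mean observable = {sum(results)/len(results):.3f}")
```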

  10. CHECK-software for calculating the leak of the primary circuit coolant into the boiler water of steam generators of NPP with WWER

    International Nuclear Information System (INIS)

    The CHECK program, employed for calculating the leak of primary-circuit coolant into the boiler water of the steam generator at NPPs with WWER-type reactors, is described. The algorithm is based on the analysis of a system of linear differential equations describing the mass transfer of radionuclides through the secondary-circuit system. The results of the calculation are displayed and printed out. The CHECK program is implemented for IBM-compatible computers and occupies approximately 130 kbytes
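
    A minimal sketch of the kind of linear compartment model the abstract refers to is shown below: the radionuclide inventories N in a few secondary-circuit compartments obey dN/dt = A·N + s, where s is the source from the primary-to-secondary leak. The compartment structure, transfer coefficients, decay constant and leak rate are all illustrative assumptions, not the CHECK program's actual model.

```python
# Minimal linear compartment model: dN/dt = A N + s, where N holds radionuclide
# inventories in secondary-circuit compartments and s is the source from the
# primary-to-secondary leak. Matrix, leak rate and decay constant are illustrative.
import numpy as np
from scipy.integrate import solve_ivp

decay = 1.93e-4          # 1/s, hypothetical nuclide decay constant
leak_rate = 1.0e3        # Bq/s entering the steam generator water

# Compartments: [steam generator water, steam/condensate, blowdown system]
A = np.array([
    [-(decay + 2.0e-4 + 5.0e-5),  1.0e-4,            0.0   ],
    [ 2.0e-4,                    -(decay + 1.0e-4),   0.0   ],
    [ 5.0e-5,                     0.0,               -decay ],
])
s = np.array([leak_rate, 0.0, 0.0])

sol = solve_ivp(lambda t, n: A @ n + s, t_span=(0.0, 72 * 3600.0),
                y0=np.zeros(3), dense_output=True)
print("activities after 72 h [Bq]:", np.round(sol.y[:, -1], 1))
```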

  11. Software engineering architecture-driven software development

    CERN Document Server

    Schmidt, Richard F

    2013-01-01

    Software Engineering: Architecture-driven Software Development is the first comprehensive guide to the underlying skills embodied in the IEEE's Software Engineering Body of Knowledge (SWEBOK) standard. Standards expert Richard Schmidt explains the traditional software engineering practices recognized for developing projects for government or corporate systems. Software engineering education often lacks standardization, with many institutions focusing on implementation rather than design as it impacts product architecture. Many graduates join the workforce with incomplete skil

  12. International Conference on Small and Special Electrical Machines, 2nd, London, England, September 22-24, 1981, Proceedings

    Science.gov (United States)

    Papers are presented on recent research concerning small and special electrical machines, including machine selection and environmental aspects; induction motors; stepping motors and drives; actuators, torque motors, and couplers; hysteresis and reluctance motors; synchronous motors and generators (including permanent magnet); control schemes and servo machines; and dc motors (including permanent magnet and brushless). Topics examined include the reliability of small ironless rotor dc motors, a new form of induction motor for fan drives, a study of the components of interbar voltage and magnetic field at the surface of small skewed diecast aluminum rotors, the microprocessor control of a step motor with various inertia loads, the synchronization of reluctance motor without pole-slipping, and the normal force in linear stepping motors. Also discussed are a direct simulation method using magnetic equivalent circuits for converter-fed reluctance machines, the synchronous performance of a single-phase machine with induced excitation, the application of design and analysis in small machines for aircraft, the microprocessor control of an inverter-driven reluctance motor, an electric main propulsion drive for a remotely piloted vehicle, and small dc motors with controllable electronic commutators. No individual items are abstracted in this volume

  13. Microgravimetric and magnetometric three-dimensional analysis in the 2nd section of the Bosque de Chapultepec

    Science.gov (United States)

    Escobedo-Zenil, D.; Sanchez-Gonzalez, J.; García-Serrano, A.

    2013-05-01

    The Bosque de Chapultepec is the most important recreational area in Mexico City. In the early 20th century, construction material in this region was exploited illegally, generating a clandestine system of mines without any registration or census. Later, in the early 1950s, the creation of a park in the area was planned; nonetheless, many mines were blocked by debris or vegetation and only a few were filled to build the infrastructure of the park. In June 2006, the collapse of the foundation slab of the Lago Mayor emptied 5000 cubic meters of water, which made clear the need for near-surface geophysical studies to locate instabilities due to underground cavities. This work describes the progress of microgravimetry and magnetometry studies in a forest region where the collapse of a mine entrance occurred. This mine has 4 known branches, but it is unknown whether these branches extend further or connect to the entrance of another filled mine located approximately 100 meters away. The results of this work, in correlation with the geological model and preliminary results of seismic and electrical methods, show lateral variations that may be associated with cavities and possible structural faults, which represent hazards to the Bosque de Chapultepec.

  14. Software for surface analysis

    Science.gov (United States)

    Watson, D. G.; Doern, F. E.

    1985-04-01

    Two software packages designed to aid in the analysis of digitally stored Secondary Ion Mass Spectrometric (SIMS) and electron spectroscopic data are described. The first, MASS, is a program that normalizes, and allows the application of sensitivity coefficients to SIMS depth profiles. The second, DIP, is a digital image processor designed to enhance secondary, backscattered, and Auger electron spectroscopic (AES) maps. DIP can also provide quantitative area analysis of AES maps. The algorithms are currently optimized to handle data generated by Physical Electronics Industries data acquisition systems, but are generally applicable.
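
    The two operations attributed to MASS, normalising a SIMS depth profile to the matrix signal and applying sensitivity coefficients, can be sketched as follows. The species, relative sensitivity factor and count values are placeholders, not data from the package.

```python
# Sketch of the two operations attributed to MASS: normalising the raw SIMS depth
# profile to the matrix signal and applying a relative sensitivity factor (RSF).
# Species, RSF value and count data are placeholders, not output of the package.
import numpy as np

def to_concentration(impurity_counts, matrix_counts, rsf):
    """Standard SIMS quantification: C = RSF * I_impurity / I_matrix (RSF in atoms/cm^3)."""
    return rsf * impurity_counts / matrix_counts

depth_nm = np.linspace(0.0, 500.0, 6)                                        # sputter depth
matrix_counts = np.array([1.00e6, 1.10e6, 0.90e6, 1.00e6, 1.05e6, 0.95e6])   # e.g. matrix ion
impurity_counts = np.array([2.0e3, 4.0e3, 9.0e3, 6.0e3, 3.0e3, 1.5e3])       # e.g. dopant ion

RSF_HYPOTHETICAL = 5.0e22   # atoms/cm^3, placeholder value

conc = to_concentration(impurity_counts, matrix_counts, RSF_HYPOTHETICAL)
for d, c in zip(depth_nm, conc):
    print(f"{d:6.1f} nm : {c:.2e} atoms/cm^3")
```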

  15. New hotspotting software available

    Science.gov (United States)

    Wessel, Paul

    A new software package for use in “hotspotting” has been released and is available without charge to help clarify the method and stimulate plate motion research. Hotspotting was introduced by Wessel and Kroenke [1997a] as a new geometric technique and immediately generated much interest [e.g., Schneider, 1997; Stein, 1997]. The method involves drawing crustal flowlines back in time from seamount locations. Because seamount ages are not used, one can include all available seamounts. If hotspots are stationary, if the stage rotations are accurate, and if the seamounts were indeed formed at hotspots, then these flowlines will intersect at the hotspot locations.
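
    The flowline construction can be sketched by rotating a seamount location back in time through a sequence of stage Euler rotations (Rodrigues' rotation about each pole). The stage poles, angles and the seamount position below are invented for illustration; this is not the released hotspotting package itself.

```python
# Toy flowline construction: rotate a seamount location back in time through a
# sequence of stage Euler rotations. Poles, angles and the seamount position are
# invented for illustration; this is not the released hotspotting package itself.
import numpy as np

def to_xyz(lat, lon):
    lat, lon = np.radians([lat, lon])
    return np.array([np.cos(lat) * np.cos(lon), np.cos(lat) * np.sin(lon), np.sin(lat)])

def to_latlon(v):
    return np.degrees(np.arcsin(v[2])), np.degrees(np.arctan2(v[1], v[0]))

def rotate(point, pole, angle_deg):
    """Rodrigues rotation of a unit vector about a unit-vector Euler pole."""
    a = np.radians(angle_deg)
    return (point * np.cos(a)
            + np.cross(pole, point) * np.sin(a)
            + pole * np.dot(pole, point) * (1.0 - np.cos(a)))

# Hypothetical stage rotations (pole lat, pole lon, opening angle in degrees)
stages = [(68.0, -68.0, -8.0), (60.0, -75.0, -12.0), (55.0, -80.0, -15.0)]

seamount = to_xyz(19.0, -155.0)       # illustrative seamount location
flowline = [to_latlon(seamount)]
p = seamount
for plat, plon, ang in stages:        # apply stages successively, back in time
    p = rotate(p, to_xyz(plat, plon), ang)
    flowline.append(to_latlon(p))

for lat, lon in flowline:
    print(f"lat {lat:7.2f}, lon {lon:8.2f}")
```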

  16. CROSS-SECTION GENERATION OF VARIOUS GEO-SCIENTIFIC FEATURES WITHOUT CONTOUR DIGITIZATION USING A VISUAL C++ BASED SOFTWARE APPLICATION ‘VIGAT 2005’

    Directory of Open Access Journals (Sweden)

    Dasgupta A. R.

    2007-06-01

    Full Text Available A cross-section can be described as a two-dimensional dataset where the horizontal distances are represented on the x-axis and the depth on the y-axis. A cross-section is a window into the subsurface. This work presents the construction of cross sections with the help of 'Vigat 2005', a Visual C++ based software application. Its main purpose is to provide cross-section views of geoscientific features and to interpret their variation within the area of study. In a geological context, a profile or cross section is an exposure of the ground showing depositional strata. Geological cross sections are a very powerful means of conveying structural geometries. They are planar, usually vertical, graphical representations of earth sections showing the stratigraphical successions, age, structure, and rock types present in the subsurface, and they allow a better conceptualization of the 3-D geometry of the structures. By using 'Vigat 2005', a cross-section graphic can be displayed by the user with a simple click of the mouse. The program offers easy-to-use functionality to facilitate the completion of the desired tasks. Specific boundary conditions representing the movement of a rock block over a fault can be displayed using the graphical user interface. Relief or slope variation of the study area can also be viewed. A topographical map provides an aerial (overhead) view of a landscape; a more pictorial representation of the landscape can be created by making a topographic profile of the region. A topographic profile is a cross section showing elevations and slopes along a given line, and constructing such a profile through the topography is a precise method to determine slope variations. The most important advantage of 'Vigat 2005' is that users do not need to digitize contours. This work focuses on the design and implementation of an optimized interpretive environment built using Visual C++ tools; the strong programming capabilities of Visual C++ have been utilized to their full extent in the development of 'Vigat 2005'.
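
    The topographic-profile idea described above can be sketched independently of 'Vigat 2005' (which is a Visual C++ application): sample a gridded elevation model along a line with bilinear interpolation and report elevation against distance. The synthetic surface, grid spacing and profile end points below are illustrative assumptions.

```python
# Sketch of extracting a topographic profile along a line from a gridded DEM using
# bilinear interpolation. The synthetic surface and grid spacing are illustrative;
# Vigat 2005 itself is a Visual C++ application and is not reproduced here.
import numpy as np

def bilinear(grid, x, y):
    """Sample grid[row, col] at fractional (x=col, y=row) coordinates."""
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    x1, y1 = min(x0 + 1, grid.shape[1] - 1), min(y0 + 1, grid.shape[0] - 1)
    fx, fy = x - x0, y - y0
    top = grid[y0, x0] * (1 - fx) + grid[y0, x1] * fx
    bot = grid[y1, x0] * (1 - fx) + grid[y1, x1] * fx
    return top * (1 - fy) + bot * fy

# Synthetic DEM: a ridge plus a gentle regional slope (elevations in metres)
ny, nx, cell = 200, 200, 10.0   # 10 m grid cells
yy, xx = np.mgrid[0:ny, 0:nx]
dem = 300.0 + 0.2 * xx + 80.0 * np.exp(-((yy - 100) ** 2) / (2 * 15.0 ** 2))

# Profile end points in grid coordinates (col, row)
(ax, ay), (bx, by) = (10.0, 20.0), (180.0, 170.0)
n_samples = 100
cols = np.linspace(ax, bx, n_samples)
rows = np.linspace(ay, by, n_samples)
dist = np.hypot(cols - ax, rows - ay) * cell
elev = np.array([bilinear(dem, c, r) for c, r in zip(cols, rows)])

for d, z in zip(dist[::20], elev[::20]):
    print(f"{d:7.1f} m : {z:6.1f} m")
```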

  17. Software Model Of Software-Development Process

    Science.gov (United States)

    Lin, Chi Y.; Synott, Debra J.; Levary, Reuven R.

    1990-01-01

    A collection of computer programs constitutes a software tool for simulation of medium- to large-scale software-development projects. It is necessary to include easily identifiable and readily quantifiable characteristics such as costs, times, and numbers of errors. A mathematical model incorporating these and other factors of the dynamics of the software-development process is implemented in the Software Life Cycle Simulator (SLICS) computer program, which simulates the dynamics of the software-development process. In combination with input and output expert software systems and a knowledge-based management software system, it develops information for use in managing a large software-development project. It is intended to aid managers in planning, managing, and controlling software-development processes by reducing uncertainties in budgets, required personnel, and schedules.

  18. Progress report of the Research Group. 1st part: Tore Supra. 2nd part: Fontenay-aux-Roses

    International Nuclear Information System (INIS)

    Three major events dominated the activities of the EURATOM/CEA association during 1980: the decision to launch the realization of the TORE SUPRA project, the progressive recognition of high frequency heating as a solution for the future, and the increasing support given to the development of heating methods and diagnostics in the JET project. It is estimated that project studies are sufficiently advanced and that industrial fabrication problems have been sufficiently covered for the realization of Tore Supra to begin in 1981. One of the successes of the work carried out is the complete validation of the superfluid helium cooling system. The satisfactory development of high frequency heating and the increasing credibility of this form of heating for future work are very important factors. In this context, the decision of the JET to envisage a large amount of ion cyclotron heating is particularly important. The results obtained in 1980 are in fact very encouraging. The maximum power of the 500 kW T.F.R. generator was coupled to the plasma and it was possible to establish an energy Q-value. Even though the injection of neutral particles can now be considered a proven heating method, studies of the accompanying physical phenomena are still important. The T.F.R. experiments carried out in this field in 1980 were very useful. The importance of the realization and development activities conducted during 1980 should not mask the enormous effort that was made, both experimentally and theoretically, in order to understand key physical phenomena in plasma. The main preoccupation concerned small and large disruptions and all aspects of the associated instabilities. A detailed analysis of the experimental results using numerical models has led to improved empirical knowledge of the elementary transport phenomena taking place. Increasingly detailed studies of microinstabilities were also fruitful and have even led to a complete reversal of some of the ideas held about universal instabilities

  19. Avaliação do Acesso em Saúde na 2ª Microrregião de Saúde, CE / Evaluation of the Access to Health in the 2nd Microregion of Health-CE

    Scientific Electronic Library Online (English)

    Maria Verônica Sales da, Silva; Maria Josefina da, Silva; Lucilane Maria Sales da, Silva; Adail Afrânio Marcelino do, Nascimento; Ana Kelve Castro, Damasceno.

    2012-05-01

    Full Text Available The aim of this research is to evaluate the indicators of the Regulation Center of the 2nd Health Microregion. METHODOLOGY: a documental, descriptive and evaluative study, carried out in the 2nd Health Microregion - CE and developed in the appointment-setting centers (CMC) of the municipalities, involving 16 professionals linked to microregional and regional regulation. Data were collected from documental sources between February and August 2007, and the project was submitted to the Ethics Committee of the Universidade Federal do Ceará (UFC). RESULTS: application of the coverage parameter showed that the municipalities had low coverage of specialized appointments for the microregional population. Only two municipalities have more than 90% coverage; in the others this parameter is below the estimate, with two municipalities showing 28% coverage and only one showing 13%, the lowest percentage. Supervision of the agreed and integrated programming (PPI) in the municipalities is still an incipient activity within the scope of control actions. The appointment-setting centers of the municipalities of the 2nd microregion have strangled supply and demand indicators, and the municipal managers do not plan enough to meet the population's demand. The regulation structured in this microregion does not fulfil its role of optimizing the use of reference services in supra-municipal spaces according to the criteria of the population's health needs.

  20. Desempenho ortográfico de escolares do 2º ao 5º ano do ensino particular / Spelling performance of students of 2nd to 5th grade from private teaching

    Scientific Electronic Library Online (English)

    Simone Aparecida, Capellini; Ana Carla Leite, Romero; Andrea Batista, Oliveira; Maria Nobre, Sampaio; Natália, Fusco; José Francisco, Cervera-Mérida; Amparo, Ygual Fernández.

    2012-04-01

    Full Text Available PURPOSE: to characterize, compare and classify the spelling performance of students from the 2nd to 5th grades of private teaching according to the semiology of errors. METHOD: 115 students from the 2nd to 5th grades were evaluated (27 from the 2nd grade, 30 each from the 3rd and 4th grades, and 28 from the 5th grade), divided into four groups, respectively GI, GII, GIII and GIV. The tests of the Spelling Evaluation Protocol (Pró-Ortografia) were divided into a collective version (writing the letters of the alphabet, randomized dictation of the letters of the alphabet, word dictation, pseudoword dictation, dictation with pictures, picture-induced thematic writing) and an individual version (dictation of sentences, purposeful error, spelled dictation, orthographic lexical memory). RESULTS: there was a statistically significant difference in the inter-group comparison, indicating an increase in the average number of correct answers on all tests of both the collective and individual versions; with increasing grade level, the groups decreased their average number of writing errors classified by the semiology of errors. The highest frequency of errors found was natural spelling errors. CONCLUSION: the data from this study showed that the increase in average accuracy with grade level may be indicative of normal development of writing in this population. The higher frequency of natural spelling errors indicates that formal instruction on phoneme-grapheme correspondence may not be occurring, since these errors depend directly on learning the rule of direct phoneme-grapheme correspondence.

  1. Avaliação do Acesso em Saúde na 2ª Microrregião de Saúde, CE / Evaluation of the Access to Health in the 2nd Microregion of Health-CE

    Directory of Open Access Journals (Sweden)

    Maria Verônica Sales da Silva

    2012-05-01

    Full Text Available The target of this research is to evaluate the indicators of the Regulation Center of the 2nd Microregional of Health. METHODOLOGY: documental, descriptive and evaluative study, held in the 2nd Microregional of Health - CE, developed in the appointment setting centers (CMC) of the districts. The study involved 16 professionals linked to the microregional regulation. Data were gathered between February and August 2007 from documental sources. Regarding ethical matters, the project was submitted to the Ethics Committee of UFC. RESULTS: application of the coverage parameter allowed us to identify that the districts showed low coverage of specialized appointments for the microregional population. Only 2 districts have more than 90% coverage; in the other ones this parameter is below the estimate, with two districts showing 28% coverage and only one with 13%, the lowest percentage. The supervision of the PPI in the districts is still an incipient activity in terms of control actions. The CMC of the districts of the 2nd microregion have strangled supply and demand indicators, and the municipal managers do not programme enough to meet the population's needs. The structured regulation in this microregion does not optimize the utilization of reference services in the supra-municipal spaces according to the criteria of the population's health needs.

  2. Linear algebra applications using Matlab software

    OpenAIRE

    Cornelia Victoria Anghel

    2005-01-01

    The paper presents two ways of generating special matrices using functions included in the MatLab software package. The MatLab software package contains a set of functions that generate special matrices used in linear algebra applications and in signal processing across different fields of activity. The paper presents two types of special matrices that can be generated using syntax written in the MatLab command window; to validate the command we need to press the Enter...
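
    The record refers to MATLAB's built-in special-matrix generators; to keep a single language for the examples in this document, the same idea is sketched below with SciPy/NumPy equivalents (toeplitz, hankel, hilbert). This only illustrates what such generator functions produce and does not show the MatLab syntax discussed in the paper.

```python
# Special-matrix generation illustrated with SciPy/NumPy equivalents of the kind of
# functions the record discusses for MATLAB. This sketch does not reproduce MATLAB syntax.
import numpy as np
from scipy.linalg import toeplitz, hankel, hilbert

# Toeplitz matrix: constant along each diagonal, defined by first column and first row.
T = toeplitz([1, 2, 3, 4], [1, 5, 6, 7])

# Hankel matrix: constant along each anti-diagonal.
H = hankel([1, 2, 3, 4], [4, 8, 9, 10])

# Hilbert matrix: H[i, j] = 1 / (i + j + 1), a classic ill-conditioned test matrix.
HB = hilbert(4)

for name, m in (("Toeplitz", T), ("Hankel", H), ("Hilbert", HB)):
    print(name, "condition number:", f"{np.linalg.cond(m):.2e}")
```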

  3. Space Flight Software Development Software for Intelligent System Health Management

    Science.gov (United States)

    Trevino, Luis C.; Crumbley, Tim

    2004-01-01

    The slide presentation examines the Marshall Space Flight Center Flight Software Branch, including software development projects, mission critical space flight software development, software technical insight, advanced software development technologies, and continuous improvement in the software development processes and methods.

  4. Classification of automatic software build methods

    OpenAIRE

    Kawalerowicz, Marcin

    2013-01-01

    The process of creating working software from source code and other components (like libraries, database files, etc.) is called a "software build". Apart from linking and compiling, it can include other steps such as automated testing, static code analysis, documentation generation, deployment and more. All of those steps can be automated using a build description of some sort (e.g. a script). This article classifies the automatic software build processes beginning at the build script and...
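    As a minimal sketch of such a build description (the step commands below are generic placeholders, not taken from the article), a script can simply run each stage in order and stop at the first failure:

```python
#!/usr/bin/env python3
"""Minimal automated-build sketch: compile, test and analyse in sequence.
The concrete commands are illustrative placeholders only."""
import subprocess
import sys

STEPS = [
    ("compile",         ["python", "-m", "compileall", "src"]),
    ("unit tests",      ["python", "-m", "pytest", "tests"]),
    ("static analysis", ["python", "-m", "pyflakes", "src"]),
]

def run_build() -> int:
    for name, cmd in STEPS:
        print(f"== {name} ==")
        if subprocess.run(cmd).returncode != 0:
            print(f"build failed at step: {name}")
            return 1
    print("build succeeded")
    return 0

if __name__ == "__main__":
    sys.exit(run_build())
```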

  5. CROSS-SECTION GENERATION OF VARIOUS GEO-SCIENTIFIC FEATURES WITHOUT CONTOUR DIGITIZATION USING A VISUAL C++ BASED SOFTWARE APPLICATION 'VIGAT 2005'

    Scientific Electronic Library Online (English)

    Srivastava, Naveenchandra N; Rathod, Brijesh G; Solanki, Ajay M; Machhar, Suresh P; Patel, Vivek R; Dasgupta, A. R..

    2007-06-01

    Full Text Available A cross-section can be described as a two-dimensional dataset where horizontal distances are represented on the x-axis and depth on the y-axis. A cross-section is a window into the subsurface. This work presents the construction of cross sections with the help of 'Vigat 2005', a Visual C++ based software application. Its main purpose is to provide cross-section views of geo-scientific features and to interpret their variation within the area of study. In a geological context, a profile or cross section is an exposure of the ground showing depositional strata. Geological cross sections are a very powerful means of conveying structural geometries. They are planar, usually vertical, graphical representations of earth sections showing the stratigraphical successions, age, structure, and rock types present in the subsurface. Geological cross sections allow a better conceptualization of the 3-D geometry of the structures. Using 'Vigat 2005', a cross-section graphic can be displayed by the user with a simple click of the mouse. It offers easy-to-use functionality to facilitate the completion of the desired tasks. Specific boundary conditions representing the movement of a rock block over a fault can be displayed using the graphical user interface. Relief or slope variation of the study area can also be viewed. A topographical map provides an aerial (overhead) view of a landscape. It is possible to create a more pictorial representation of the landscape by making a topographic profile of the region. A topographic profile is a cross section showing elevations and slopes along a given line. A precise method to determine slope variations is to construct a profile or cross section through the topography. The most important advantage of 'Vigat 2005' is that users do not need to digitize contours. This work focuses on the design and implementation of an optimized interpretation environment built with Visual C++ tools; the strong programming capabilities of C++ have been fully exploited in the construction of 'Vigat 2005'.

  6. User's guide for PLTWIND Version 1.0: PC-based software for generating plots of monitored wind data and gridded wind fields for Hanford emergency response applications

    Energy Technology Data Exchange (ETDEWEB)

    Glantz, C.S.; Burk, K.W.

    1995-09-01

    This document is a user's guide for the PLoT Near-Surface WIND (PLTWIND) modeling system. PLTWIND is a personal-computer-based software product designed to produce graphical displays of Hanford wind observations and model-generated wind fields. The real-time wind data processed by PLTWIND are acquired from the mainframe computer system at the Hanford Meteorology Station and copied to PLTWIND systems over the Hanford Local Area Network (HLAN). PLTWIND is designed for operation on an IBM-compatible PC with a connection to the HLAN. An HP-compatible pen plotter or laser printer (with a minimum of 1.5 megabytes of memory and Plotter-in-a-Cartridge hardware) is required to generate hardcopies of PLTWIND's graphical products. PLTWIND's products are intended for use by emergency response personnel in evaluating atmospheric dispersion characteristics in the near-surface environment. Model products provide important atmospheric information to hazard evaluators; however, these products are only tools for assessing near-surface atmospheric transport and should not be interpreted as providing a definitive representation of atmospheric conditions.
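    A minimal sketch of plotting a gridded near-surface wind field of this kind (synthetic data and Python/Matplotlib here, not PLTWIND itself) might look like:

```python
# Synthetic gridded wind field rendered as a quiver plot; all values invented.
import numpy as np
import matplotlib.pyplot as plt

x, y = np.meshgrid(np.linspace(0, 30, 16), np.linspace(0, 30, 16))  # grid in km
speed = 4.0 + 0.05 * x                    # hypothetical wind speed (m/s)
direction = np.deg2rad(240.0 + 0.5 * y)   # hypothetical direction the wind blows from

# Convert meteorological direction to u/v components for plotting.
u = -speed * np.sin(direction)
v = -speed * np.cos(direction)

q = plt.quiver(x, y, u, v, speed)
plt.colorbar(q, label="wind speed (m/s)")
plt.xlabel("east (km)")
plt.ylabel("north (km)")
plt.title("Gridded near-surface wind field (synthetic example)")
plt.show()
```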

  7. Model-based software design

    Science.gov (United States)

    Iscoe, Neil; Liu, Zheng-Yang; Feng, Guohui; Yenne, Britt; Vansickle, Larry; Ballantyne, Michael

    1992-01-01

    Domain-specific knowledge is required to create specifications, generate code, and understand existing systems. Our approach to automating software design is based on instantiating an application domain model with industry-specific knowledge and then using that model to achieve the operational goals of specification elicitation and verification, reverse engineering, and code generation. Although many different specification models can be created from any particular domain model, each specification model is consistent and correct with respect to the domain model.
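    A hypothetical sketch of this idea (names and structure invented for illustration, not taken from the paper): instantiate a small application domain model and generate code from it.

```python
# Toy domain model and code generator illustrating model-based design.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Attribute:
    name: str
    type: str

@dataclass
class Entity:
    name: str
    attributes: List[Attribute] = field(default_factory=list)

def generate_class(entity: Entity) -> str:
    """Emit a Python class definition for one entity of the domain model."""
    lines = [f"class {entity.name}:"]
    args = ", ".join(f"{a.name}: {a.type}" for a in entity.attributes)
    lines.append(f"    def __init__(self, {args}):")
    for a in entity.attributes:
        lines.append(f"        self.{a.name} = {a.name}")
    return "\n".join(lines)

# One possible specification instantiated from the domain model.
account = Entity("Account", [Attribute("owner", "str"), Attribute("balance", "float")])
print(generate_class(account))
```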

  8. Squeezed-light generation in a nonlinear planar waveguide with a periodic corrugation.

    Czech Academy of Sciences Publication Activity Database

    Peřina ml., Jan; Haderka, Ondřej; Sibilia, C.; Bertolotti, M.; Scalora, M.

    2007-01-01

    Roč. 76, č. 3 (2007), 033813/1-033813/14. ISSN 1050-2947 Grant ostatní: GA ČR(CZ) GA202/05/0498 Institutional research plan: CEZ:AV0Z10100522 Keywords : matched 2nd-harmonic generation * photonic-bandgap structures * lithium-niobate * oscillations * enhancement * states Subject RIV: BH - Optics, Masers, Lasers Impact factor: 2.893, year: 2007

  9. An Automatic Posture Planning Software of Arc Robot Based on SolidWorks API

    Directory of Open Access Journals (Sweden)

    Junfeng Li

    2009-06-01

    Full Text Available An automatic posture planning software package is presented in this paper. The main function of this software is posture planning of the welding torch. The software is developed on the SolidWorks API programming platform. Its four components are the path generation module, the posture generation module, the posture adjustment module and the data output module. The software can add a posture planning function to ROBOGUIDE. Planning experiments are executed to prove the validity of the software.
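    A conceptual sketch of such a four-module pipeline (illustrative Python only; the function names, angles and data layout are assumptions, not the authors' code):

```python
# Toy chain of the four modules: path generation, posture generation,
# posture adjustment, data output.
from typing import List, Tuple

Point = Tuple[float, float, float]

def generate_path(start: Point, end: Point, steps: int) -> List[Point]:
    """Interpolate weld-path points between two seam endpoints."""
    return [tuple(s + (e - s) * i / (steps - 1) for s, e in zip(start, end))
            for i in range(steps)]

def generate_postures(path: List[Point], work_angle: float) -> List[dict]:
    """Attach a nominal torch posture (work/travel angles) to every point."""
    return [{"point": p, "work_angle": work_angle, "travel_angle": 0.0} for p in path]

def adjust_postures(postures: List[dict], tilt: float) -> List[dict]:
    """Apply a user adjustment, e.g. tilting the torch to avoid collisions."""
    return [{**p, "work_angle": p["work_angle"] + tilt} for p in postures]

def output_data(postures: List[dict]) -> None:
    for p in postures:
        print(p)

path = generate_path((0.0, 0.0, 0.0), (100.0, 0.0, 0.0), 5)
output_data(adjust_postures(generate_postures(path, 45.0), -5.0))
```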

  10. Assessing multizone airflow software

    Energy Technology Data Exchange (ETDEWEB)

    Lorenzetti, D.M.

    2001-12-01

    Multizone models form the basis of most computer simulations of airflow and pollutant transport in buildings. In order to promote computational efficiency, some multizone simulation programs, such as COMIS and CONTAM, restrict the form that their flow models may take. While these tools allow scientists and engineers to explore a wide range of building airflow problems, increasingly their use has led to new questions not answerable by the current generation of programs. This paper, directed at software developers working on the next generation of building airflow models, identifies structural aspects of COMIS and related programs that prevent them from easily incorporating desirable new airflow models. The paper also suggests criteria for evaluating alternate simulation environments for future modeling efforts.

  11. Simultaneous determination of ceftriaxone and streptomycin in mixture by 'ratio-spectra' 2nd derivative and 'zero-crossing' 3rd derivative spectrophotometry.

    Science.gov (United States)

    Morelli, B

    1994-05-01

    Binary mixtures of the antibiotics ceftriaxone sulphate and streptomycin sodium are assayed by 'ratio-spectra' 2nd derivative and 'zero-crossing' 3rd derivative spectrophotometry. Neither procedure requires any separation step and/or solving of equations. In the first method, calibration plots are linear up to 40 µg/ml of ceftriaxone at 225, 241.5, 255.5, 255.5-241.5 and 225-241.5 nm (peak-to-peak), with r ranging from 0.9999 to 1.0000, and up to 30 µg/ml of streptomycin at 206 nm, r 0.9998. Detection limits, at the P = 0.05 level of significance: ceftriaxone, from 0.24 to 0.47 µg/ml (at the various wavelengths); streptomycin, 0.42 µg/ml. By the second method, lines of regression are linear up to 40 µg/ml of ceftriaxone, at 227.8 and 241.7 nm (r, 0.9999 and 1.0000) and up to 35 µg/ml of streptomycin (r, 0.9999). Detection limits were calculated to be 0.35 and 0.15 µg/ml for ceftriaxone and 0.27 µg/ml for streptomycin. Both methods were successfully applied to laboratory mixtures and to mixtures of commercial injections of these drugs. PMID:18965982
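    A minimal numerical sketch of the 'ratio-spectra' 2nd-derivative idea (synthetic Gaussian bands with invented wavelengths and concentrations, not the reported method parameters):

```python
# Divide the mixture spectrum by the standard spectrum of one analyte, then
# take the 2nd derivative: the divisor's contribution becomes constant and
# drops out, leaving a signal proportional to the other component.
import numpy as np

wl = np.linspace(200, 280, 801)                            # wavelength axis (nm)
band = lambda c, mu, s: c * np.exp(-(wl - mu) ** 2 / (2 * s ** 2))

ceftriaxone = band(1.0, 240, 12)                           # unit-concentration spectra
streptomycin = band(1.0, 210, 8)
mixture = 0.6 * ceftriaxone + 0.4 * streptomycin           # "unknown" binary mixture

ratio = mixture / streptomycin                             # divide by one standard
second_deriv = np.gradient(np.gradient(ratio, wl), wl)

# In practice a working wavelength window is chosen where the divisor is not
# vanishingly small; here we just read the amplitude near 241.5 nm.
print(second_deriv[np.argmin(np.abs(wl - 241.5))])
```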

  12. Report on the 2nd Florence International Symposium on Advances in Cardiomyopathies: 9th meeting of the European Myocardial and Pericardial Diseases WG of the ESC

    Directory of Open Access Journals (Sweden)

    Franco Cecchi

    2012-12-01

    Full Text Available A bridge between clinical and basic science aiming at cross-fertilization, with leading experts presenting alongside junior investigators, is the key feature of the “2nd Florence International Symposium on Advances in Cardiomyopathies”, 9th Meeting of the Myocardial and Pericardial Diseases Working Group of the European Society of Cardiology, which was held in Florence, Italy on 26-28 September 2012. Patients with cardiomyopathies, with an estimated prevalence of 3 per thousand in the general population, constitute an increasingly large proportion of patients seen by most cardiologists. This class of diseases, which are mostly genetically determined with different transmission modalities, can cause important and often unsolved management problems, despite rapid advances in the field. On the other hand, few other areas of cardiology have seen such an impressive contribution from basic science and translational research to the understanding of their pathophysiology and clinical management. The course was designed to constantly promote close interaction between basic science and clinical practice and to highlight the top scientific and translational discoveries in this field in 10 scientific sessions. It was preceded by two mini-courses, which included the basic concepts of cardiomyocyte mechanical and electrophysiological properties and mechanisms, how-to sessions for clinical diagnosis and management, and illustrative case study presentations of different cardiomyopathies.

  13. The lymph drainage of the mammary glands in the bitch: a lymphographic study. Part 1: the 1st, 2nd, 4th and 5th mammary glands

    International Nuclear Information System (INIS)

    The objective of this paper was to study the lymph drainage of the 1st, 2nd, 4th and 5th mammary glands in the bitch using indirect lymphography. The main conclusions drawn after the study of 67 normal lactating mongrel bitches were as follows: lymph drains from the first gland usually to the axillary nodes and, in a few cases, to the axillary and superficial cervical nodes simultaneously. The second gland drains to the axillary nodes. The fourth gland usually drains to the superficial inguinal nodes, but it may, rarely, drain to the superficial inguinal and medial iliac nodes simultaneously. The fifth gland drains to the superficial inguinal nodes. A lymphatic connection between the mammary glands could not be demonstrated. Furthermore, it was confirmed that lymph can pass from one gland to another, through their common regional lymph nodes, by retrograde flow. It was demonstrated that there is a connection between the superficial inguinal lymph nodes of either side. It is suggested that a lymphatic connection between the axillary and sternal nodes and between the axillary and bronchial nodes should be possible in some cases. Lymphatics of the mammary glands that cross the midline were not demonstrated

  14. Ba2NdZrO5.5 as a potential substrate material for YBa2Cu3O7-δ superconducting films

    International Nuclear Information System (INIS)

    The new oxide Ba2NdZrO5.5 (BNZO) has been produced by the standard solid state reaction method. X-ray diffraction analysis (XRD) revealed that this synthesized material has an ordered complex cubic perovskite structure characteristic of A2BB'O6 crystalline structure with a lattice parameter of a = 8.40 Å. It was established through EDX analysis that there is no trace of impurities. Chemical stability of BNZO with YBa2Cu3O7-δ (YBCO) has been studied by means of Rietveld analysis of experimental XRD data on several samples of BNZO-YBCO composites. Quantitative analysis of phases on XRD patterns show that all peaks have been indexed for both BNZO and YBCO, and no extra peak is detectable. YBCO and BNZO remain as two different separate phases in the composites with no chemical reaction. Electrical measurements also revealed that superconducting transition temperature of pure YBCO and BNZO-YBCO composites is 90 K. These favorable characteristics of BNZO show that it can be used as a potential substrate material for deposition of YBCO superconducting films. (copyright 2007 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim) (orig.)

  15. U-Pb and K-Ar geochronology from the Cerro Empexa Formation, 1st and 2nd Regions, Precordillera, northern Chile

    International Nuclear Information System (INIS)

    The Cerro Empexa Formation (Galli, 1957) is a regionally distributed andesitic volcanic and continental sedimentary unit exposed in the Precordillera of the 1st and 2nd Regions of northern Chile. The formation has generally been considered to lie within the Lower or 'mid' Cretaceous, however, this assignment is based on scant, unreliable geochronologic data. Furthermore, there are conflicting interpretations as to whether the unit predates or postdates the first major Mesozoic shortening event affecting northern Chile. Because of the formation's presumed mid-Cretaceous age and its stratigraphic position over older back-arc sedimentary successions, the unit has been interpreted to represent products of the first eastward jump in the Andean magmatic arc from the arc's initial position in the Cordillera de la Costa (Scheuber and Reutter, 1992). In this paper we present the results of mapping and field observations that indicate exposures previously assigned to the Cerro Empexa Formation include two andesitic volcanic units separated by a major unconformity. The Cerro Empexa Formation proper lies above this unconformity. We also present U-Pb zircon and K-Ar geochronology that indicate the Cerro Empexa Formation is latest Cretaceous in its lower levels, and integrate our data with previously reported 40Ar/39Ar and fission-track data in the Cerros de Montecristo area (Maksaev, 1990; Maksaev and Zentilli, 1999) to show that 1800±600 m of rocks were deposited within ca. 2.5 m.y. (au)

  16. The theory of contractions of 2D 2nd order quantum superintegrable systems and its relation to the Askey scheme for hypergeometric orthogonal polynomials

    Science.gov (United States)

    Miller, Willard, Jr.

    2014-05-01

    We describe a contraction theory for 2nd order superintegrable systems, showing that all such systems in 2 dimensions are limiting cases of a single system: the generic 3-parameter potential on the 2-sphere, S9 in our listing. Analogously, all of the quadratic symmetry algebras of these systems can be obtained by a sequence of contractions starting from S9. By contracting function space realizations of irreducible representations of the S9 algebra (which give the structure equations for Racah/Wilson polynomials) to the other superintegrable systems one obtains the full Askey scheme of orthogonal hypergeometric polynomials. This relates the scheme directly to explicitly solvable quantum mechanical systems. Amazingly, all of these contractions of superintegrable systems with potential are uniquely induced by Wigner Lie algebra contractions of so(3, C) and e(2, C). The present paper concentrates on describing this intimate link between Lie algebra and superintegrable system contractions, with the detailed calculations presented elsewhere. Joint work with E. Kalnins, S. Post, E. Subag and R. Heinonen.
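    For orientation (quoted from the standard Kalnins-Kress-Miller classification rather than from this abstract), the generic 3-parameter system S9 on the 2-sphere is usually written as:

```latex
% Generic superintegrable system S9 on the 2-sphere (standard form; shown
% here for context, not taken from the abstract above).
\[
  H \;=\; J_1^2 + J_2^2 + J_3^2
        \;+\; \frac{a_1}{s_1^2} + \frac{a_2}{s_2^2} + \frac{a_3}{s_3^2},
  \qquad s_1^2 + s_2^2 + s_3^2 = 1,
\]
% with J_3 = s_1 \partial_{s_2} - s_2 \partial_{s_1} and cyclic permutations.
```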

  17. Sexual orientation and the 2nd to 4th finger length ratio: evidence for organising effects of sex hormones or developmental instability?

    Science.gov (United States)

    Rahman, Q; Wilson, G D

    2003-04-01

    It has been proposed that human sexual orientation is influenced by prenatal sex hormones. Some evidence examining putative somatic markers of prenatal sex hormones supports this assumption. An alternative suggestion has been that homosexuality may be due to general developmental disruptions independent of hormonal effects. This study investigated the ratio of the 2nd to 4th finger digits (the 2D:4D ratio), a measure often ascribed to the organisational actions of prenatal androgens, and the fluctuating asymmetry (FA-a measure of general developmental disruption) of these features, in a sample of 240 healthy, right handed and exclusively heterosexual and homosexual males and females (N=60 per group). Homosexual males and females showed significantly lower 2D:4D ratios in comparison to heterosexuals, but sexual orientation did not relate to any measures of FA. The evidence may suggest that homosexual males and females have been exposed to non-disruptive, but elevated levels of androgens in utero. However, these data also draw attention to difficulties in the interpretation of results when somatic features are employed as biological markers of prenatal hormonal influences. PMID:12573297

  18. Stable isotope and trace element studies on gladiators and contemporary Romans from Ephesus (Turkey, 2nd and 3rd Ct. AD) -- implications for differences in diet.

    Science.gov (United States)

    Lösch, Sandra; Moghaddam, Negahnaz; Grossschmidt, Karl; Risser, Daniele U; Kanz, Fabian

    2014-01-01

    The gladiator cemetery discovered in Ephesus (Turkey) in 1993 dates to the 2nd and 3rd century AD. The aim of this study is to reconstruct diverse diet, social stratification, and migration of the inhabitants of Roman Ephesus and the distinct group of gladiators. Stable carbon, nitrogen, and sulphur isotope analysis were applied, and inorganic bone elements (strontium, calcium) were determined. In total, 53 individuals, including 22 gladiators, were analysed. All individuals consumed C3 plants like wheat and barley as staple food. A few individuals show indication of consumption of C4 plants. The δ13C values of one female from the gladiator cemetery and one gladiator differ from all other individuals. Their δ34S values indicate that they probably migrated from another geographical region or consumed different foods. The δ15N values are relatively low in comparison to other sites from Roman times. A probable cause for the depletion of 15N in Ephesus could be the frequent consumption of legumes. The Sr/Ca-ratios of the gladiators were significantly higher than the values of the contemporary Roman inhabitants. Since the Sr/Ca-ratio reflects the main Ca-supplier in the diet, the elevated values of the gladiators might suggest a frequent use of a plant ash beverage, as mentioned in ancient texts. PMID:25333366

  19. Influence of long-term altered gravity on the swimming performance of developing cichlid fish: including results from the 2nd German Spacelab Mission D-2

    Science.gov (United States)

    Rahmann, H.; Hilbig, R.; Flemming, J.; Slenzka, K.

    This study presents qualitative and quantitative data concerning gravity-dependent changes in the swimming behaviour of developing cichlid fish larvae (Oreochromis mossambicus) after a 9- or 10-day exposure to increased acceleration (centrifuge experiments), reduced gravity (fast-rotating clinostat), changing accelerations (parabolic aircraft flights) and near weightlessness (2nd German Spacelab Mission D-2). Changes of gravity initially cause disturbances of the swimming performance of the fish larvae. With prolonged stay in orbit, a step-by-step normalisation of the swimming behaviour took place. After return to 1g earth conditions no somersaulting or looping could be detected in the fish, but slow and disorientated movements still occurred compared with controls. The fish larvae adapted to earth gravity within 3-5 days. Fish at distinct early developmental stages thus seem to be extremely sensitive and adaptable to altered gravity, whereas older fish either do not react or show compensatory behaviour, e.g. escape reactions.

  20. Reliability of software

    International Nuclear Information System (INIS)

    Common factors and differences in the reliability of hardware and software; reliability increase by means of methods of software redundancy. Maintenance of software for long term operating behavior. (HP)
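    One of the software-redundancy methods alluded to above can be sketched as N-version programming with majority voting (an illustrative toy in Python; the three "versions" here are stand-ins, not from the report):

```python
# Run independently developed versions of the same function and vote.
from collections import Counter

def version_a(x: float) -> float: return x * x
def version_b(x: float) -> float: return x ** 2
def version_c(x: float) -> float: return x * x if x < 1e6 else float("nan")  # faulty branch

def voted_result(x: float) -> float:
    """Return the majority answer of the redundant versions."""
    results = [version_a(x), version_b(x), version_c(x)]
    value, votes = Counter(results).most_common(1)[0]
    if votes < 2:
        raise RuntimeError("no majority: redundant versions disagree")
    return value

print(voted_result(3.0))
```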

  1. Free Software Foundation Software Directory - Science

    Science.gov (United States)

    The Free Software Directory is a project of the Free Software Foundation (FSF) and United Nations Education, Scientific and Cultural Organization (UNESCO). We catalog useful free software that runs under free operating systems-- particularly the GNU operating system and its GNU/Linux variants.

  2. Software Testing Techniques and Strategies

    OpenAIRE

    Isha; Sunita Sangwan

    2014-01-01

    Software testing provides a means to reduce errors and to cut maintenance and overall software costs. Numerous software development and testing methodologies, tools, and techniques have emerged over the last few decades promising to enhance software quality. This paper describes software testing, the need for software testing, and software testing goals and principles. It then describes different software testing techniques and different software testing strategies.

  3. A Matrix Approach to Software Process Definition

    Science.gov (United States)

    Schultz, David; Bachman, Judith; Landis, Linda; Stark, Mike; Godfrey, Sally; Morisio, Maurizio; Powers, Edward I. (Technical Monitor)

    2000-01-01

    The Software Engineering Laboratory (SEL) is currently engaged in a Methodology and Metrics program for the Information Systems Center (ISC) at Goddard Space Flight Center (GSFC). This paper addresses the Methodology portion of the program. The purpose of the Methodology effort is to assist a software team lead in selecting and tailoring a software development or maintenance process for a specific GSFC project. It is intended that this process will also be compliant with both ISO 9001 and the Software Engineering Institute's Capability Maturity Model (CMM). Under the Methodology program, we have defined four standard ISO-compliant software processes for the ISC, and three tailoring criteria that team leads can use to categorize their projects. The team lead would select a process and appropriate tailoring factors, from which a software process tailored to the specific project could be generated. Our objective in the Methodology program is to present software process information in a structured fashion, to make it easy for a team lead to characterize the type of software engineering to be performed, and to apply tailoring parameters to search for an appropriate software process description. This will enable the team lead to follow a proven, effective software process and also satisfy NASA's requirement for compliance with ISO 9001 and the anticipated requirement for CMM assessment. This work is also intended to support the deployment of sound software processes across the ISC.
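    A hypothetical sketch of the "matrix" idea (the criteria and process names below are invented for illustration and are not the SEL/ISC definitions): tailoring parameters index into a table of standard processes.

```python
# Tailoring criteria -> recommended standard process (toy lookup table).
PROCESS_MATRIX = {
    # (project size, criticality, lifecycle type)
    ("small", "low",  "maintenance"): "Lightweight maintenance process",
    ("small", "high", "development"): "Full development process with added reviews",
    ("large", "low",  "development"): "Standard development process",
    ("large", "high", "development"): "Full development process with CMM/ISO audits",
}

def select_process(size: str, criticality: str, lifecycle: str) -> str:
    """Pick a process description for the given tailoring parameters."""
    return PROCESS_MATRIX.get((size, criticality, lifecycle),
                              "No predefined process: tailor manually")

print(select_process("small", "high", "development"))
```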

  4. Mobile Software Licensing

    OpenAIRE

    Dusparic, Ivana

    2003-01-01

    Licensing models and license management systems are going through major changes because the traditional shrink-wrapped way of licensing software does not meet the needs of the modern software market. In particular the emergence of new platforms, such as mobile computing, and the assumption by software vendors that clients possess network-connected software will revolutionize how software is licensed and managed. Software license systems are moving towards more flexibility and s...

  5. SOFTWARE DEVELOPMENT TECHNIQUES

    OpenAIRE

    Asst. Prof. Rajani Kota

    2012-01-01

    Software development is the set of activities and processes for programmers that eventually results in a software product. This may include requirement analysis, software design, implementation, testing, documentation and maintenance, and then describing computer programs that meet user requirements within the constraints of the environment. It is a structure imposed on the development of a software product. Software development is the most important process in developing a Software/t...

  6. Developing CMS software documentation system

    CERN Document Server

    Stankevicius, Mantas

    2012-01-01

    CMSSW (CMS SoftWare) is the overall collection of software and services needed by the simulation, calibration and alignment, and reconstruction modules that process data so that physicists can perform their analyses. It is a long-term project with a large amount of source code. In large-scale and complex projects it is important to have software documentation that is as up-to-date and automated as possible. The core of the documentation should be version-based and available online with the source code. CMS uses Doxygen and Twiki as the main tools to provide automated and non-automated documentation. Both of them are heavily cross-linked to prevent duplication of information. Doxygen is used to generate functional documentation and dependency graphs from the source code. Twiki is divided into two parts: WorkBook and Software Guide. WorkBook contains tutorial-type instructions on accessing computing resources and using the software to perform analysis within the CMS collaboration, and Software Guide gives further details....

  7. Developing CMS software documentation system

    Science.gov (United States)

    Stankevicius, M.; Lassila-Perini, K.; Malik, S.

    2012-12-01

    CMSSW (CMS SoftWare) is the overall collection of software and services needed by the simulation, calibration and alignment, and reconstruction modules that process data so that physicists can perform their analyses. It is a long-term project with a large amount of source code. In large-scale and complex projects it is important to have software documentation that is as up-to-date and automated as possible. The core of the documentation should be version-based and available online with the source code. CMS uses Doxygen and Twiki as the main tools to provide automated and non-automated documentation. Both of them are heavily cross-linked to prevent duplication of information. Doxygen is used to generate functional documentation and dependency graphs from the source code. Twiki is divided into two parts: WorkBook and Software Guide. WorkBook contains tutorial-type instructions on accessing computing resources and using the software to perform analysis within the CMS collaboration, and Software Guide gives further details. This note describes the design principles, the basic functionality and the technical implementations of the CMSSW documentation.
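    As a small illustration of the Doxygen side of this workflow (an invented Python function, not CMSSW source; Doxygen can also parse Python when configured to do so), '##' comment blocks are turned into functional documentation:

```python
## @package toy_tracker
#  Illustrative Doxygen-style comments; Doxygen generates functional
#  documentation and call/dependency graphs from blocks like these.

## Estimate a transverse momentum from track parameters.
#  @param curvature  signed track curvature in 1/cm
#  @param field      magnetic field in tesla
#  @return           transverse momentum estimate in GeV/c
def transverse_momentum(curvature: float, field: float) -> float:
    # Toy formula for demonstration purposes only.
    return 0.3 * field * 0.01 / abs(curvature)
```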

  8. TPV-Application As Small Back-up Generator For Standalone Photovoltaic Systems

    Science.gov (United States)

    Mattarolo, G.; Bard, J.; Schmid, J.

    2004-11-01

    Stand-alone PV applications that supply a constant load can benefit from a small, reliable back-up generator. It makes it possible to reduce the size of the PV array and the battery significantly, with only a very small contribution from the back-up generator in the range of 5 to 10% of the total energy demand. In addition, a significant reduction of the investment cost and improvements in the operational safety of remote PV applications can be achieved. In the power range from a few W to some kW, a TPV generator can be competitive with other established electric generator technologies. TPV offers a compact, reliable, quiet and safe technology with the potential for low cost and versatile fuel usage, including bio fuels. Starting in 1994, a TPV system has been developed for grid-independent operation of gas heating systems. With improving efficiency, the focus was shifted towards a CHP development based on natural gas for households. The realised system concept can theoretically achieve 7% efficiency based on a Kanthal emitter operating at 1300°C and GaSb cells. In the framework of the research and training network TPVCell the system will be used to realise a TPV generator with a minimum efficiency of 2%. In the next step it is planned to improve the existing recuperative burner concept by software-based design methods and to realise a new prototype. For the long term, the overall system efficiency target is 10%. In the first part, the paper briefly explains the system concept and shows the achieved results. In the second part, the authors present simulation results for the application of such a TPV system in stand-alone photovoltaic systems.
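    A back-of-the-envelope sketch of the sizing argument (all numbers below are invented for illustration, not taken from the paper):

```python
# If the back-up generator covers ~8% of annual demand, the PV array and
# battery only need to be sized for the remaining share under typical
# (rather than worst-case) conditions.
annual_load_kwh = 1200.0        # hypothetical constant load, ~137 W average
backup_fraction = 0.08          # back-up supplies 8% of annual energy
pv_share = (1.0 - backup_fraction) * annual_load_kwh

backup_energy = backup_fraction * annual_load_kwh
tpv_efficiency = 0.10           # long-term system efficiency target quoted above
fuel_input_kwh = backup_energy / tpv_efficiency

print(f"PV share: {pv_share:.0f} kWh/a")
print(f"Back-up share: {backup_energy:.0f} kWh/a, fuel input ~{fuel_input_kwh:.0f} kWh/a")
```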

  9. Software and systems traceability

    CERN Document Server

    Cleland-Huang, Jane; Zisman, Andrea

    2012-01-01

    ""Software and Systems Traceability"" provides a comprehensive description of the practices and theories of software traceability across all phases of the software development lifecycle. The term software traceability is derived from the concept of requirements traceability. Requirements traceability is the ability to track a requirement all the way from its origins to the downstream work products that implement that requirement in a software system. Software traceability is defined as the ability to relate the various types of software artefacts created during the development of software syst

  10. A new approach to PLC software design.

    Science.gov (United States)

    Kandare, Gregor; Godena, Giovanni; Strmcnik, Stanko

    2003-04-01

    This paper presents a model-based approach to PLC software development. The essence of this approach is the introduction of a new procedural modeling language called ProcGraph. In contrast to commonly used methods, ProcGraph deals with the procedural aspect of the control system and allows software specification at a higher level of abstraction. The modeling language has been supported with the development of a software tool which facilitates graphical model design and automatic code generation. The specification notation has been tested in the development of software for industrial applications. The supporting tool has been tested in a laboratory environment. PMID:12708546

  11. Durational and generational differences in Mexican immigrant obesity: Is acculturation the explanation?

    OpenAIRE

    Creighton, Mathew J.; Goldman, Noreen; Pebley, Anne R.; Chung, Chang Y.

    2012-01-01

    Using the Los Angeles Family and Neighborhood Survey (L.A.FANS-2; n = 1610), we explore the link between Mexican immigrant acculturation, diet, exercise and obesity. We distinguish Mexican immigrants and 2nd generation Mexicans from 3rd+ generation whites, blacks and Mexicans. First, we examine variation in social and linguistic measures by race/ethnicity, duration of residence and immigrant generation. Second, we consider the association between acculturation, diet and exercise. Third, we ev...

  12. Next generation sequencing technologies and the changing landscape of phage genomics

    OpenAIRE

    Klumpp, Jochen; Fouts, Derrick E.; Sozhamannan, Shanmuga

    2012-01-01

    The dawn of next generation sequencing technologies has opened up exciting possibilities for whole genome sequencing of a plethora of organisms. The 2nd and 3rd generation sequencing technologies, based on cloning-free, massively parallel sequencing, have enabled the generation of a deluge of genomic sequences of both prokaryotic and eukaryotic origin in the last seven years. However, whole genome sequencing of bacterial viruses has not kept pace with this revolution, despite the fact that th...

  13. SOFTWARE TESTING AND SOFTWARE DEVELOPMENT LIFECYCLES

    Directory of Open Access Journals (Sweden)

    Chitra Wasnik

    2013-03-01

    Full Text Available Software Testing is the process used to help identify the correctness, completeness, security, and quality of developed computer software. What is software testing? It is the process of validating and verifying that a program does what it is expected to do. Software testing is an empirical investigation conducted to provide stakeholders with information about the quality of the product or service under test, with respect to the context in which it is intended to operate. Software testing also provides an objective, independent view of the software that allows the business to appreciate and understand the risks of implementing the software. Test techniques include, but are not limited to, the process of executing a program or application with the intent of finding software bugs. It can also be stated as the process of validating and verifying that a software program/application/product meets the business and technical requirements that guided its design and development, so that it works as expected and can be implemented with the same characteristics. The Software Development Life Cycle (SDLC) is a methodology that is typically used to develop, maintain and replace information systems, improving the quality of the software design and development process.
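    As a minimal illustration of "validating that a program does what it is expected to do" (an invented function and tests, using Python's standard unittest module):

```python
import unittest

def apply_discount(price: float, percent: float) -> float:
    """Function under test (example code only)."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

class TestApplyDiscount(unittest.TestCase):
    def test_typical_case(self):
        self.assertEqual(apply_discount(200.0, 25), 150.0)

    def test_invalid_percent_rejected(self):
        with self.assertRaises(ValueError):
            apply_discount(100.0, 120)

if __name__ == "__main__":
    unittest.main()
```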

  14. Maneuver Automation Software

    Science.gov (United States)

    Uffelman, Hal; Goodson, Troy; Pellegrin, Michael; Stavert, Lynn; Burk, Thomas; Beach, David; Signorelli, Joel; Jones, Jeremy; Hahn, Yungsun; Attiyah, Ahlam; Illsley, Jeannette

    2009-01-01

    The Maneuver Automation Software (MAS) automates the process of generating commands for maneuvers to keep the spacecraft of the Cassini-Huygens mission on a predetermined prime mission trajectory. Before MAS became available, a team of approximately 10 members had to work about two weeks to design, test, and implement each maneuver in a process that involved running many maneuver-related application programs and then serially handing off data products to other parts of the team. MAS enables a three-member team to design, test, and implement a maneuver in about one-half hour after Navigation has process-tracking data. MAS accepts more than 60 parameters and 22 files as input directly from users. MAS consists of Practical Extraction and Reporting Language (PERL) scripts that link, sequence, and execute the maneuver- related application programs: "Pushing a single button" on a graphical user interface causes MAS to run navigation programs that design a maneuver; programs that create sequences of commands to execute the maneuver on the spacecraft; and a program that generates predictions about maneuver performance and generates reports and other files that enable users to quickly review and verify the maneuver design. MAS can also generate presentation materials, initiate electronic command request forms, and archive all data products for future reference.
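    A conceptual sketch of the "single button" idea (Python pseudocode with hypothetical program names; MAS itself is implemented as PERL scripts wrapping the real maneuver applications):

```python
# Sequence the maneuver-related programs and stop the chain on any failure.
import subprocess

PIPELINE = [
    ("design maneuver",        ["maneuver_design", "--traj", "prime.traj"]),
    ("build command sequence", ["seq_builder", "--input", "maneuver.out"]),
    ("predict performance",    ["predictor", "--report", "summary.txt"]),
]

def run_pipeline() -> None:
    for name, cmd in PIPELINE:       # program names are placeholders
        print(f"running: {name}")
        subprocess.run(cmd, check=True)

if __name__ == "__main__":
    run_pipeline()
```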

  15. Robotic Software Integration Using MARIE

    Directory of Open Access Journals (Sweden)

    Dominic Letourneau; Clement Raievsky

    2008-11-01

    Full Text Available This paper presents MARIE, a middleware framework oriented towards developing and integrating new and existing software for robotic systems. By using a generic communication framework, MARIE aims to create a flexible distributed component system that allows robotics developers to share software programs and algorithms, and design prototypes rapidly based on their own integration needs. The use of MARIE is illustrated with the design of a socially interactive autonomous mobile robot platform capable of map building, localization, navigation, tasks scheduling, sound source localization, tracking and separation, speech recognition and generation, visual tracking, message reading and graphical interaction using a touch screen interface.

  16. 2nd seminar on radioscopy

    International Nuclear Information System (INIS)

    This seminar record contains 10 articles on the following subjects: The function and properties of modern radioscopy systems (H. Heidt); systems producing pictures compared to weld seams (R. Limpert); automatic fault assessment on cast parts (D. Filbert); practical experience with computer-aided irradiation tests on car steering housings and crude cast wheels - SABA test system (M. Purschke); concepts and experience in the automatic interpretation of weld seam pictures (P. Rose); the significance of the size and shape of burnt spots in the radioscopy process (E. Kirchner); detector systems in radioscopy (R. Link); radioscopy systems with automatic testing and integrated picture processing (R. Grimm); automatic testing of light alloy wheels for cars in practice (Part 1: K. Doerrich, Part 2: G. Korthals). (HM)

  17. Green Chemistry (2nd edition)

    Science.gov (United States)

    Manahan, Stanley

    Measurement science used to characterize environmentally important species is a central aspect of this textbook and accompanying lecture support/PowerPoint presentations. In addition to discussing the hows and whys of measurement, the discussion of measurements up to 2007 as related to the future of the environment provides a context for learning the material. The content of each chapter is available for download in Power Point file format.

  18. Quantum Gravity (2nd edn)

    International Nuclear Information System (INIS)

    There has been a flurry of books on quantum gravity in the past few years. The first edition of Kiefer's book appeared in 2004, about the same time as Carlo Rovelli's book with the same title. This was soon followed by Thomas Thiemann's 'Modern Canonical Quantum General Relativity'. Although the main focus of each of these books is non-perturbative and non-string approaches to the quantization of general relativity, they are quite orthogonal in temperament, style, subject matter and mathematical detail. Rovelli and Thiemann focus primarily on loop quantum gravity (LQG), whereas Kiefer attempts a broader introduction and review of the subject that includes chapters on string theory and decoherence. Kiefer's second edition attempts an even wider and somewhat ambitious sweep with 'new sections on asymptotic safety, dynamical triangulation, primordial black holes, the information-loss problem, loop quantum cosmology, and other topics'. The presentation of these current topics is necessarily brief given the size of the book, but effective in encapsulating the main ideas in some cases. For instance the few pages devoted to loop quantum cosmology describe how the mini-superspace reduction of the quantum Hamiltonian constraint of LQG becomes a difference equation, whereas the discussion of 'dynamical triangulations', an approach to defining a discretized Lorentzian path integral for quantum gravity, is less detailed. The first few chapters of the book provide, in a roughly historical sequence, the covariant and canonical metric variable approach to the subject developed in the 1960s and 70s. The problem(s) of time in quantum gravity are nicely summarized in the chapter on quantum geometrodynamics, followed by a detailed and effective introduction of the WKB approach and the semi-classical approximation. These topics form the traditional core of the subject. The next three chapters cover LQG, quantization of black holes, and quantum cosmology. Of these the chapter on LQG is the shortest at fourteen pages - a reflection perhaps of the fact that there are two books and a few long reviews of the subject available written by the main protagonists in the field. The chapters on black holes and cosmology provide a more or less standard introduction to black hole thermodynamics, Hawking and Unruh radiation, quantization of the Schwarzschild metric and mini-superspace collapse models, and the DeWitt, Hartle-Hawking and Vilenkin wavefunctions. The chapter on string theory is an essay-like overview of its quantum gravitational aspects. It provides a nice introduction to selected ideas and a guide to the literature. Here a prescient student may be left wondering why there is no quantum cosmology in string theory, perhaps a deliberate omission to avoid the 'landscape' and its fauna. In summary, I think this book succeeds in its purpose of providing a broad introduction to quantum gravity, and nicely complements some of the other books on the subject. (book review)

  19. Reactor accidents. 2nd ed.

    International Nuclear Information System (INIS)

    This report offers an overview of the discussion on the source term of reactor accidents. The magnitude of the source term in accident scenarios is the determining factor for the consequences. Several Dutch publications are summarized, as well as American reports relevant to the Netherlands. An accident scenario is defined that is representative of a severe reactor accident. Consequences for public health as well as for long-term soil contamination are calculated. It is pointed out how safety standards with respect to site selection have changed in comparison with the FRG and the US. The question of liability for reactor accidents is also discussed. (G.J.P.)

  20. 2nd Ralf Yorque Workshop

    CERN Document Server

    1985-01-01

    These are the proceedings of the Second R. Yorque Workshop on Resource Management, which took place in Ashland, Oregon on July 23-25, 1984. The purpose of the workshop was to provide an informal atmosphere for the discussion of resource assessment and management problems. Each participant presented a one-hour morning talk; afternoons were reserved for informal chatting. The workshop was successful in stimulating ideas and interaction. The papers by R. Deriso, R. Hilborn and C. Walters all address the same basic issue, so they are lumped together. Other than that, the order of the papers in this volume was determined in the same fashion as the order of speakers during the workshop -- by random draw. Marc Mangel, Department of Mathematics, University of California, Davis, California, June 1985. TABLE OF CONTENTS: A General Theory for Fishery Modeling, Jon Schnute; Data Transformations in Regression Analysis with Applications to Stock-Recruitment Relationships, David Ruppert and Raymond J. Carroll ...

  1. Patient Safety: 2nd edition

    OpenAIRE

    Vincent, C.

    2010-01-01

    When you are ready to implement measures to improve patient safety, this is the book to consult. Charles Vincent, one of the world's pioneers in patient safety, discusses each and every aspect clearly and compellingly. He reviews the evidence of risks and harms to patients, and he provides practical guidance on implementing safer practices in health care. The second edition puts greater emphasis on this practical side. Examples of team based initiatives show how patient safety can be improved...

  2. The truncated Newton using 1st and 2nd order adjoint-state method: a new approach for traveltime tomography without rays

    Science.gov (United States)

    Bretaudeau, F.; Metivier, L.; Brossier, R.; Virieux, J.

    2013-12-01

    Traveltime tomography algorithms generally use ray tracing. The use of rays in tomography may not be suitable for handling very large datasets or for performing tomography in very complex media. Traveltime maps can be computed with a finite-difference (FD) approach, avoiding complex ray-tracing algorithms in the forward modeling (Vidale 1998, Zhao 2004). However, rays back-traced from receiver to source following the gradient of traveltime are still used to compute the Fréchet derivatives. As a consequence, the sensitivity information computed using back-traced rays is not numerically consistent with the FD modeling used (the derivatives are only a rough approximation of the true derivatives of the forward modeling). Leung & Qian (2006) proposed a new approach that avoids ray tracing, where the gradient of the misfit function is computed using the adjoint-state method. An adjoint-state variable is thus computed simultaneously for all receivers using a numerical method consistent with the forward modeling, and for the computational cost of one forward modeling. However, in their formulation, the receivers have to be located at the boundary of the investigated model, and the optimization approach is limited to simple gradient-based methods (i.e. steepest descent, conjugate gradient) as only the gradient is computed. The Hessian operator nevertheless plays an important role in gradient-based reconstruction methods, providing the information necessary to rescale the gradient, correct for illumination deficits and remove artifacts. Leung & Qian (2006) use LBFGS, a quasi-Newton method that provides an improved estimation of the influence of the inverse Hessian. Lelievre et al. (2011) also proposed a tomography approach in which the Fréchet derivatives are computed directly during the forward modeling using explicit symbolic differentiation of the modeling equations, resulting in a consistent Gauss-Newton inversion. We are interested here in the use of a new optimization approach known as the truncated Newton (TCN) method (Métivier et al. 2012), with a more accurate estimation of the impact of the Hessian, and we propose an efficient implementation for first-arrival traveltime tomography. In TCN, the model update Δm is obtained through the iterative resolution of the Newton linear system H Δm = -g. Based on a matrix-free conjugate-gradient resolution, the iterative solver requires only the computation of the gradient and of Hessian-vector products. We propose a generalization of the computation of the gradient using the adjoint-state method that allows receivers to be located anywhere. The Hessian-vector products are then computed using an original formulation based on a 2nd-order adjoint-state method, at the cost of an additional forward modeling. The TCN algorithm is composed of two nested loops: an internal loop to compute Δm, and an external loop where a line search is performed to update the subsurface parameters. TCN thus considers locally the inversion of the traveltime data using an estimation of the full Hessian (both 1st and 2nd order terms) at an acceptable cost. Tomography with TCN improves on simple gradient-based adjoint-state tomography thanks to its good convergence properties and better handling of illumination, and it is a promising tool for multi-parameter inversion as rescaling is given by the Hessian.
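    A generic matrix-free truncated-Newton sketch (illustrative Python on a toy quadratic, not the authors' code): the inner conjugate-gradient loop only needs Hessian-vector products, which in the abstract above are supplied by the 2nd-order adjoint-state computation.

```python
# Truncated Newton: solve H dm = -g approximately with CG, then update m.
import numpy as np

def truncated_newton(m0, gradient, hess_vec, n_outer=10, n_inner=20, tol=1e-6):
    """Minimise a misfit given callbacks for its gradient and Hessian-vector product."""
    m = m0.copy()
    for _ in range(n_outer):
        g = gradient(m)
        if np.linalg.norm(g) < tol:
            break
        # Inner loop: conjugate gradients on the Newton system H dm = -g.
        dm = np.zeros_like(m)
        r = -g - hess_vec(m, dm)
        p = r.copy()
        for _ in range(n_inner):
            Hp = hess_vec(m, p)
            alpha = (r @ r) / (p @ Hp)
            dm += alpha * p
            r_new = r - alpha * Hp
            if np.linalg.norm(r_new) < tol:
                break
            p = r_new + ((r_new @ r_new) / (r @ r)) * p
            r = r_new
        m += dm          # a line search would scale this step in practice
    return m

# Toy quadratic test: gradient and Hessian-vector product of 0.5*m'Am - b'm.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
m_opt = truncated_newton(np.zeros(2),
                         gradient=lambda m: A @ m - b,
                         hess_vec=lambda m, v: A @ v)
print(m_opt, np.linalg.solve(A, b))
```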

  3. Agile Software Development

    Science.gov (United States)

    Biju, Soly Mathew

    2008-01-01

    Many software development firms are now adopting the agile software development method. This method involves the customer at every level of software development, thus reducing the impact of change in the requirement at a later stage. In this article, the principles of the agile method for software development are explored and there is a focus on…

  4. Software distribution using xnetlib

    Energy Technology Data Exchange (ETDEWEB)

    Dongarra, J.J. [Univ. of Tennessee, Knoxville, TN (US). Dept. of Computer Science; Oak Ridge National Lab., TN (US)]; Rowan, T.H. [Oak Ridge National Lab., TN (US)]; Wade, R.C. [Univ. of Tennessee, Knoxville, TN (US). Dept. of Computer Science]

    1993-06-01

    Xnetlib is a new tool for software distribution. Whereas its predecessor netlib uses e-mail as the user interface to its large collection of public-domain mathematical software, xnetlib uses an X Window interface and socket-based communication. Xnetlib makes it easy to search through a large distributed collection of software and to retrieve requested software in seconds.

  5. Ensuring Software IP Cleanliness

    Directory of Open Access Journals (Sweden)

    Mahshad Koohgoli

    2007-12-01

    Full Text Available At many points in the life of a software enterprise, determination of intellectual property (IP) cleanliness becomes critical. The value of an enterprise that develops and sells software may depend on how clean the software is from the IP perspective. This article examines various methods of ensuring software IP cleanliness and discusses some of the benefits and shortcomings of current solutions.

  6. Implementación de estrategias curriculares en asignaturas de segundo año de la Licenciatura en Enfermería Implementation of curricular strategies in Nursing 2nd year subjects

    Directory of Open Access Journals (Sweden)

    María Cristina Pérez Guerrero

    2013-04-01

    Full Text Available A bibliographic review is presented in order to assess the implementation of curricular strategies in the Nursing degree, starting from the subjects taught in the second year of the programme. The importance of these teaching resources in students' training, their development, and their contribution to solving situations related to care, the quality of health care, the reduction of adverse events and patient safety are pointed out. Curricular strategies linked to the Nursing degree represent a particular way of developing the teaching process, characterized by a coordinated direction that responds to the graduate's profile, in which the theoretical and practical contents and methods of the curricular units belonging to the syllabus are interwoven, starting from a methodological structure that guarantees their functioning. This contributes to the comprehensive formation of a competent professional.

  7. Implementación de estrategias curriculares en asignaturas de segundo año de la Licenciatura en Enfermería / Implementation of curricular strategies in Nursing 2nd year subjects

    Scientific Electronic Library Online (English)

    María Cristina, Pérez Guerrero; Maité, Suárez Fernández; Alina, Carrasco Milanés.

    2013-04-01

    Full Text Available SciELO Cuba | Language: Spanish. A bibliographic review is presented in order to assess the implementation of curricular strategies in the Nursing degree, starting from the subjects taught in the second year of the programme. The importance of these teaching resources in students' training, their development, and their contribution to solving situations related to care, the quality of health care, the reduction of adverse events and patient safety are pointed out. Curricular strategies linked to the Nursing degree represent a particular way of developing the teaching process, characterized by a coordinated direction that responds to the graduate's profile, in which the theoretical and practical contents and methods of the curricular units belonging to the syllabus are interwoven, starting from a methodological structure that guarantees their functioning. This contributes to the comprehensive formation of a competent professional.

  8. The 2000 activities and the 2nd Workshop on Human Resources Development in the Nuclear Field as part of Asian regional cooperation

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-06-01

    In 1999, the Project for Human Resources Development (HRD) was initiated as defined in the framework of the Forum for Nuclear Cooperation in Asia (FNCA), organized by the Atomic Energy Commission of Japan. The objective of the HRD Project is to solidify the foundation of technologies for nuclear development and utilization in Asia by promoting human resources development in Asian countries. The Project comprises two kinds of activities: in-workshop activities and outside-of-workshop activities. As an in-workshop activity, the 2nd Workshop on Human Resources Development in the Nuclear Field was held on November 27 and 28, 2000, at the Tokai Research Institute of JAERI. As an outside-of-workshop activity, 'The presentation of the present state of international training and education in the nuclear field in Japan' was held on November 29, 2000, after the workshop. Participating countries were China, Indonesia, South Korea, Japan, Malaysia, the Philippines, Thailand, and Vietnam. The secretariat for the Human Resources Development Project is provided by the Nuclear Technology and Education Center of the Japan Atomic Energy Research Institute. This report consists of the presentation papers and materials from the Workshop, the presentation documents of 'The present state of international training and education in the nuclear field in Japan', a letter of proposal from the Project Leader of Japan to the project leaders of the participating countries after the Workshop, and a presentation paper on Human Resources Development at the 3rd Coordinators Meeting of FNCA in Tokyo on March 14-16, 2001. (author)

  9. Educational Software Directory

    Science.gov (United States)

    Educational Software Directory.net is an online directory that provides information on various software programs used in education. The website is hosted by KnowPlay, an online software store, which charges software makers to have their information posted here. The collection of programs and related websites can be searched using keywords or browsed by topic areas, which are: Children's, Games, Language Arts, Math, Multimedia, Music, Reference, Science, Social Studies, Special Needs, Teacher's, and Training. Other sections of the website provide links to organizations that offer online reviews of educational software, publishers of educational software, educational organizations involved in educational software, and online magazines, news sources, and related resources.

  10. Software Language Evolution:

    OpenAIRE

    Vermolen, S. D.

    2012-01-01

    Software plays a critical role in our daily life. Vast amounts of money are spent on more and more complex systems. All software, regardless of whether it controls a plane or the game on your phone, is never finished. Software changes when it contains bugs or when new functionality is added. This process of change is called software evolution. Despite what the name suggests, in practice this is a rapid process. Software is described in a software language. Not only software can evolve, also the la...

  11. Patterns of Software Development Process

    Directory of Open Access Journals (Sweden)

    Sandro Javier Bolaños Castro

    2011-12-01

    Full Text Available "Times New Roman","serif";mso-fareast-font-family:"Times New Roman";mso-ansi-language:EN-US;mso-fareast-language:EN-US;mso-bidi-language:AR-SA">This article presents a set of patterns that can be found to perform best practices in software processes that are directly related to the problem of implementing the activities of the process, the roles involved, the knowledge generated and the inputs and outputs belonging to the process. In this work, a definition of the architecture is encouraged by using different recurrent configurations that strengthen the process and yield efficient results for the development of a software project. The patterns presented constitute a catalog, which serves as a vocabulary for communication among project participants [1], [2], and also can be implemented through software tools, thus facilitating patterns implementation [3]. Additionally, a tool that can be obtained under GPL (General Public license is provided for this purpose

  12. Linear algebra applications using Matlab software

    Directory of Open Access Journals (Sweden)

    Cornelia Victoria Anghel

    2005-10-01

    Full Text Available The paper presents two ways of generating special matrices using functions included in the MatLab software package. MatLab contains a set of functions that generate special matrices used in linear algebra applications and in signal processing across different fields of activity. The paper presents two types of special matrices that can be generated by typing the corresponding syntax in the MatLab command window and pressing the Enter key to execute the command. The applications presented in the paper are examples of numerical computation using MatLab and belong to the scientific field of „Computer Assisted Mathematics", thus creating a symbiosis between mathematics and informatics.
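    As a rough illustration only (the abstract does not reproduce the paper's actual syntaxes), the following Python/SciPy sketch generates two classic special matrices with direct MATLAB counterparts such as hilb and toeplitz; the specific functions and sizes chosen here are assumptions.

```python
# Illustrative sketch only: Python/SciPy analogues of MATLAB special-matrix
# generators (e.g. hilb, toeplitz); the function choices are assumptions,
# not the examples used in the paper.
import numpy as np
from scipy.linalg import hilbert, toeplitz

# Hilbert matrix H[i, j] = 1 / (i + j + 1): a classic, badly conditioned test matrix.
H = hilbert(4)

# Toeplitz matrix: constant along each diagonal, widely used in signal processing.
T = toeplitz([1, 2, 3, 4])

print("Hilbert matrix:\n", H)
print("Condition number of H:", np.linalg.cond(H))
print("Toeplitz matrix:\n", T)
```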

  13. IMAGE Software Suite

    Science.gov (United States)

    Gallagher, Dennis L.; Rose, M. Franklin (Technical Monitor)

    2000-01-01

    The IMAGE Mission is generating a truly unique set of magnetospheric measurements through a first-of-its-kind complement of remote, global observations. These data are being distributed in the Universal Data Format (UDF), which consists of data, calibration, and documentation. This is an open dataset, available to all by request to the National Space Science Data Center (NSSDC) at NASA Goddard Space Flight Center. Browse data, consisting of summary observations, are also available through the NSSDC in the Common Data Format (CDF), along with graphic representations of the browse data. The browse data can be accessed through the NSSDC CDAWeb services or by using NSSDC-provided software tools. This presentation documents the software tools provided by the IMAGE team for viewing and analyzing the UDF telemetry data. Like the IMAGE data, these tools are openly available. What these tools can do, how they can be obtained, and how they are expected to evolve will be discussed.
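    The abstract names the CDF browse format but not a specific programming interface; purely as a hedged sketch, the snippet below shows how such a CDF file could be opened with the third-party cdflib Python package. The file name and variable name are hypothetical placeholders, not actual IMAGE products.

```python
# Hedged sketch: reading a CDF-format browse file with the third-party
# cdflib package (pip install cdflib). The file name and variable name
# below are hypothetical placeholders, not actual IMAGE products.
import cdflib

cdf = cdflib.CDF("image_browse_sample.cdf")  # hypothetical file name

# File-level metadata: CDF version, variable lists, global attributes, etc.
print(cdf.cdf_info())

# Read one variable by name; "Epoch" is only a typical CDF time variable,
# used here as a placeholder.
epoch = cdf.varget("Epoch")
print("Number of records:", len(epoch))
```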

  14. Software and Computing News

    CERN Multimedia

    Barberis, D

    The last several months have been very busy ones for the ATLAS software developers, who have been trying to cope with the competing demands of multiple software stress tests and testbeds. These include Data Challenge Two (DC2), the Combined Testbeam (CTB), preparations for the Physics Workshop to be held in Rome in June 2005, and other testbeds, primarily one for the High-Level Trigger. Data Challenge 2 (DC2): the primary goal of DC2 was to validate the computing model and to provide a test of simulating a day's worth of ATLAS data (10 million events) and of fully processing it and making it available to the physicists within 10 days (i.e. a 10% scale test). DC2 consists of three parts: the generation, simulation, and mixing of a representative sample of physics events with background events; the reconstruction of the mixed samples with initial classification into the different physics signatures; and the distribution of the data to multiple remote sites (Tier-1 centers) for analysis by physicists. Figu...

  15. Computer software review procedures

    International Nuclear Information System (INIS)

    This article reviews the procedures used to review software written for computer-based instrumentation and control functions in nuclear facilities. Computer-based control systems are becoming much more prevalent in such installations, both in new systems and as retrofits to existing ones. Currently, the Nuclear Regulatory Commission uses Regulatory Guide 1.152, "Criteria for Programmable Digital Computer System Software in Safety-Related Systems of Nuclear Power Plants", and ANSI/IEEE-ANS-7-4.3.2-1982, "Application Criteria for Programmable Digital Computer Systems in Safety Systems of Nuclear Power Generating Stations", for guidance when performing reviews of digital systems. There is great concern about the process of verification and validation of these codes, so when such systems are inspected, inspectors examine very closely the processes followed in developing the codes, the errors that were detected, how they were found, and the analysis that went into tracing down the causes behind the errors, to ensure such errors are not propagated again in the future.

  16. H. Beale et al., Cases, Materials and Texts on Contract Law, 2nd ed. (Oxford: Hart Publishing, 2010); and T. K. Graziano, Comparative Contract Law: Cases, Materials and Exercises (Basingstoke: Palgrave MacMillan, 2009)

    OpenAIRE

    Johnstone, Rachael L.

    2011-01-01

    Review essay of the following books on comparative law: Hugh Beale, Bénédicte Fauvarque-Cosson, Jacobien Rutgers, Denis Tallon and Stefan Vogenauer, Cases, Materials and Text on Contract Law, 2nd ed. (Ius Commune Casebooks for the Common Law of Europe No. 6) (Oxford, United Kingdom: Hart Publishing, 2010) lxxxiv + 1358 pp. paper. 38.95 GBP; and Thomas Kadner Graziano, Comparative Contract Law: Cases, Materials and Exercises (Basingstoke, United Kingdom: Palgrave Macmillan, 2009) xi + 510 pp. ...

  17. Electrical properties of ZrO2-Y2O3, HfO2-Nd2O3, HfO2-Y2O3 system films

    International Nuclear Information System (INIS)

    The electrophysical properties of ZrO2-Y2O3 and HfO2-Nd2O3 system films are studied as a function of the conditions under which they are obtained. The introduction of REE oxides into ZrO2 and HfO2 films makes it possible to stabilize the electrophysical parameters of the films over wide ranges of substrate temperature and layer growth rate.

  18. Double nitrates of neodymium and potassium of the compositions 2KNO3·Nd(NO3)3·2H2O and 3KNO3·2Nd(NO3)3·H2O

    International Nuclear Information System (INIS)

    Double salts of composition 2KNO3·Nd(NO3)3·2H2O at 50 deg C and 3KNO3·2Nd(NO3)3·H2O at 50, 65, and 100 deg C are prepared from aqueous solutions containing neodymium and potassium nitrates. The concentration limits for crystallization of the compounds are given, some crystallo-optical characteristics are investigated, X-ray phase analysis is performed, and their thermal properties are studied.

  19. ADAPTABLE SOFTWARE SHELLS VERSUS MICROSOFT SOFTWARE SHELLS

    OpenAIRE

    Todoroi, Dumitru

    2008-01-01

    The development and evolution of the Microsoft Office and Microsoft Windows shells is based, in general, on a particular methodology of software creation and implementation built on macros, subroutines, custom commands and specialized features; this methodology of the Microsoft software shells is analyzed. A universal methodology for the creation of Adaptable Software is proposed. The present result builds on [Tod-08.1], which is an evolution of the Fulbright research project no. 22131 "Societal Information Syste...

  20. Containment and surveillance for software

    International Nuclear Information System (INIS)

    Some operators and state authorities are offering their computer systems, both hardware and software, to be used for safeguards purposes by the International Atomic Energy Agency. Therefore a need exists to develop a method of authenticating the data produced by a computer program before it can be used by the Agency. As part of a complete Computer Systems Authentication (COMSAT) package, a method of software containment and surveillance has been developed to complement existing software authentication techniques. The package is applicable to both operator- and Agency-provided systems. A program to demonstrate the principles has been written. With this facility, the Agency will be able to leave unattended software in the field, either to be used by the operator to generate data for inspection on their own computer, or to save an inspector having to re-install inspection-specific software on an Agency computer, in the knowledge that the operation of the protected computer is being continuously monitored. If adopted, either of these uses will enable the Agency to reduce its costs. (Author)
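    The abstract does not describe the COMSAT internals; purely as an illustration of the general idea of authenticating data produced by unattended software, the sketch below tags an output file with a keyed HMAC so that a verifier holding the same secret key can detect later modification. The key handling, file names, and function names are hypothetical, not taken from the source.

```python
# Illustrative sketch of the general idea of authenticating program output
# (not the COMSAT design itself): tag a data file with an HMAC so a verifier
# holding the same secret key can detect later modification.
import hmac
import hashlib

SECRET_KEY = b"shared-inspection-key"  # hypothetical; real keys would live in secure storage

def sign_file(path: str) -> str:
    """Return a hex HMAC-SHA256 tag computed over the file contents."""
    with open(path, "rb") as f:
        return hmac.new(SECRET_KEY, f.read(), hashlib.sha256).hexdigest()

def verify_file(path: str, tag: str) -> bool:
    """Check a previously issued tag using a constant-time comparison."""
    return hmac.compare_digest(sign_file(path), tag)

# Hypothetical usage:
# tag = sign_file("inspection_data.bin")
# assert verify_file("inspection_data.bin", tag)
```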