WorldWideScience
 
 
1

STARS 2.0: 2nd-generation open-source archiving and query software  

Science.gov (United States)

The Subaru Telescope is in the process of developing an open-source alternative to the 1st-generation software and databases (STARS 1) used for archiving and query. For STARS 2, we have chosen PHP and Python for scripting and MySQL as the database software. We have collected feedback from staff and observers, and used this feedback to significantly improve the design and functionality of our future archiving and query software. Archiving - We identified two weaknesses in the 1st-generation STARS archiving software: a complex and inflexible table structure, and system administration that was not coordinated with our business model of taking pictures from the summit and archiving them in both Hawaii and Japan. We adopted a simplified and normalized table structure with passive keyword collection, and we are designing an archive-to-archive file transfer system that automatically reports real-time status and error conditions and permits error recovery. Query - We identified several weaknesses in the 1st-generation STARS query software: inflexible query tools, poor sharing of calibration data, and no automatic file transfer mechanisms to observers. We are developing improved query tools, better sharing of calibration data, and multi-protocol unassisted file transfer mechanisms for observers. In the process, we have redefined a 'query': from an invisible search result that can be transferred only once, immediately, and only in-house, with little status and error reporting and no error recovery, to a stored search result that can be monitored and transferred to different locations with multiple protocols, that reports status and error conditions, and that permits recovery from errors.
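The 'stored query' described above maps naturally onto a small relational record. Below is a minimal sketch of the idea, using Python's standard-library sqlite3 module as a stand-in for the MySQL backend mentioned in the abstract; the table and column names are hypothetical and are not the actual STARS 2 schema.

    import sqlite3

    conn = sqlite3.connect(":memory:")  # stand-in for the MySQL archive database
    conn.executescript("""
    CREATE TABLE stored_query (
        query_id    INTEGER PRIMARY KEY,
        criteria    TEXT NOT NULL,   -- the saved search criteria
        status      TEXT NOT NULL,   -- e.g. QUEUED, TRANSFERRING, DONE, ERROR
        protocol    TEXT,            -- transfer protocol chosen by the observer
        destination TEXT,            -- where the result set is being delivered
        last_error  TEXT             -- populated on failure to allow recovery
    );
    """)

    # An observer saves a search; the stored result set can later be monitored,
    # re-transferred with a different protocol, or recovered after an error.
    conn.execute(
        "INSERT INTO stored_query (criteria, status, protocol, destination) "
        "VALUES (?, ?, ?, ?)",
        ("frame_type='OBJECT' AND obs_date='2008-06-30'", "QUEUED", "sftp", "observer-host:/data"),
    )
    conn.execute(
        "UPDATE stored_query SET status='ERROR', last_error='network timeout' WHERE query_id=1"
    )
    print(conn.execute("SELECT query_id, status, last_error FROM stored_query").fetchall())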

Winegar, Tom

2008-07-01

2

2nd Generation Alkaline Electrolysis : Final report  

DEFF Research Database (Denmark)

This report provides the results of the 2nd Generation Alkaline Electrolysis project, which was initiated in 2008. The project was conducted from 2009-2012 by a consortium comprising Århus University Business and Social Science – Centre for Energy Technologies (CET, formerly HIRC), Technical University of Denmark – Mechanical Engineering (DTU-ME), Technical University of Denmark – Energy Conversion (DTU-EC), FORCE Technology and GreenHydrogen.dk. The project was supported by EUDP.

Yde, Lars; Kjartansdóttir, Cecilia Kristin

2013-01-01

3

The 2nd Generation Real Time Mission Monitor (RTMM) Development  

Science.gov (United States)

The NASA Real Time Mission Monitor (RTMM) is a visualization and information system that fuses multiple Earth science data sources to enable real-time decision-making for airborne and ground validation experiments. Developed at the National Aeronautics and Space Administration (NASA) Marshall Space Flight Center, RTMM is a situational awareness, decision-support system that integrates satellite imagery and orbit data, radar and other surface observations (e.g., lightning location network data), airborne navigation and instrument data sets, model output parameters, and other applicable Earth science data sets. The integration and delivery of this information is made possible using data acquisition systems, network communication links, network server resources, and visualizations through the Google Earth virtual globe application. In order to improve the usefulness and efficiency of the RTMM system, capabilities are being developed to allow the end-user to easily configure RTMM applications based on their mission-specific requirements and objectives. This second-generation RTMM is being redesigned to take advantage of the Google Earth plug-in capabilities, so that multiple applications can run in a web browser rather than in the original single-application Google Earth approach. Currently RTMM employs a limited Service Oriented Architecture approach to enable discovery of mission-specific resources. We are expanding the RTMM architecture so that it will more effectively utilize the Open Geospatial Consortium Sensor Web Enablement services and other new software tools and components. These modifications and extensions will result in a robust, versatile RTMM system that will give the user much greater flexibility in choosing which science data sets and support applications to view and use. The improvements brought about by the RTMM 2nd-generation system will provide mission planners and airborne scientists with enhanced decision-making tools and capabilities to more efficiently plan, prepare and execute missions, as well as to play back and review past mission data. To paraphrase the old television commercial: RTMM doesn't make the airborne science, it makes the airborne science easier.
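As a rough illustration of the Sensor Web Enablement service discovery mentioned above, the following Python sketch issues a standard OGC GetCapabilities request to a Sensor Observation Service and lists what the service advertises. The endpoint URL is a placeholder, not an actual RTMM or NASA resource, and the element filtering is deliberately simplistic.

    import requests
    import xml.etree.ElementTree as ET

    SOS_URL = "https://example.org/sos"   # hypothetical SOS endpoint

    params = {
        "service": "SOS",
        "request": "GetCapabilities",
        "AcceptVersions": "2.0.0",
    }
    response = requests.get(SOS_URL, params=params, timeout=30)
    response.raise_for_status()

    # Crude scan of the capabilities document; real clients would use the
    # proper OGC namespaces rather than matching on tag suffixes.
    root = ET.fromstring(response.content)
    identifiers = [el.text for el in root.iter() if el.tag.endswith("identifier")]
    print(identifiers[:10])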

Blakeslee, Richard; Goodman, Michael; Meyer, Paul; Hardin, Danny; Hall, John; He, Yubin; Regner, Kathryn; Conover, Helen; Smith, Tammy; Lu, Jessica; Garrett, Michelle

2009-01-01

4

Super Boiler 2nd Generation Technology for Watertube Boilers  

Energy Technology Data Exchange (ETDEWEB)

This report describes Phase I of a proposed two-phase project to develop and demonstrate an advanced industrial watertube boiler system with the capability of reaching 94% (HHV) fuel-to-steam efficiency and emissions below 2 ppmv NOx, 2 ppmv CO, and 1 ppmv VOC on natural gas fuel. The boiler design would have the capability to produce >1500 F, >1500 psig superheated steam, burn multiple fuels, and be 50% smaller/lighter than currently available watertube boilers of similar capacity. This project builds upon the successful Super Boiler project at GTI, which employed a unique two-stage intercooled combustion system and an innovative heat recovery system to reduce NOx to below 5 ppmv and demonstrated fuel-to-steam efficiency of 94% (HHV). This project was carried out under the leadership of GTI with project partners Cleaver-Brooks, Inc., Nebraska Boiler, a Division of Cleaver-Brooks, and Media and Process Technology Inc., and project advisors Georgia Institute of Technology, Alstom Power Inc., Pacific Northwest National Laboratory and Oak Ridge National Laboratory. Phase I efforts focused on developing 2nd-generation boiler concepts and performance modeling; incorporating multi-fuel (natural gas and oil) capabilities; assessing heat recovery, heat transfer and steam superheating approaches; and developing the overall conceptual engineering boiler design. Based on our analysis, the 2nd-generation Industrial Watertube Boiler, when developed and commercialized, could potentially save 265 trillion Btu and $1.6 billion in fuel costs across U.S. industry through increased efficiency. Its ultra-clean combustion could eliminate 57,000 tons of NOx, 460,000 tons of CO, and 8.8 million tons of CO2 emissions annually. Reduction in boiler size will bring cost-effective package boilers into a size range previously dominated by more expensive field-erected boilers, benefiting manufacturers and end users through lower capital costs.

Mr. David Cygan; Dr. Joseph Rabovitser

2012-03-31

5

Aging Studies of 2nd Generation BaBar RPCs  

Energy Technology Data Exchange (ETDEWEB)

The BaBar detector, operating at the PEP-II B factory of the Stanford Linear Accelerator Center (SLAC), installed over 200 2nd-generation Resistive Plate Chambers (RPCs) in 2002. The streamer rates produced by backgrounds and signals during normal BaBar running vary considerably (0.1 to >20 Hz/cm2) depending on the layer and position of the chambers, thus providing a broad-spectrum test of RPC performance and aging. The lowest-rate chambers have performed very well, with stable efficiencies averaging 95%. Other chambers had rate-dependent inefficiencies due to Bakelite drying, which were reversed by the introduction of humidified gases. RPC inefficiencies in the highest-rate regions of the higher-rate chambers have also been observed and found to be rate dependent. The inefficient regions grow with time and have not yet been reduced by operation with humidified input gas. Three of these chambers were converted to avalanche-mode operation and display significantly improved efficiencies. The rate of production of HF in the RPC exhaust gases was measured in avalanche- and streamer-mode RPCs and found to be comparable, despite the lower current of the avalanche-mode RPCs.

Band, H.R.; /SLAC

2007-09-25

6

From 1st- to 2nd-Generation Biofuel Technologies: Extended Executive Summary  

Energy Technology Data Exchange (ETDEWEB)

This report looks at the technical challenges facing 2nd-generation biofuels, evaluates their costs and examines related current policies to support their development and deployment. The potential for production of more advanced biofuels is also discussed. Although significant progress continues to be made to overcome the technical and economic challenges, 2nd-generation biofuels still face major constraints to their commercial deployment.

NONE

2008-07-01

7

Colchicine treatment of jute seedlings in the 1st and 2nd generation after irradiation  

International Nuclear Information System (INIS)

Colchicine treatment (0.05% for 12 h) of 15-day-old seedlings in the 1st generation after X-ray or gamma-ray exposure was lethal. In contrast, the same colchicine treatment of 15-day-old seedlings in the 2nd generation was effective in inducing polyploids. (author)

8

Characterization and calibration of 2nd generation slope measuring profiler  

Science.gov (United States)

High spectral resolution and nanometer-sized foci at 3rd-generation SR beamlines can only be achieved by means of ultra-precise optical elements. The improved brilliance and the coherence of free electron lasers (FEL) push the accuracy limits even further and make the development of a new generation of ultra-precise reflective optical elements mandatory. Typical elements are wave-front-preserving plane mirrors (lengths of up to 1 m, residual slope errors of ~0.05 µrad (rms) and micro-roughness values of 0.1 nm (rms)) and curved optical elements such as spheres, toroids or elliptical cylinders (residual slope errors of ~0.25 µrad (rms) and better). These challenging specifications and the ongoing progress in finishing technology need to be matched by metrology instruments of improved accuracy. We discuss the results of recent metrology developments made in the BESSY-II optics laboratory (BOL) at the Helmholtz Zentrum Berlin (HZB), using the vertical angle comparator (VAC) employed to calibrate the nanometer optical component measuring machine (NOM). The BESSY-NOM is an ultra-accurate slope measuring instrument characterized by an accuracy of 0.05 µrad (rms) for plane substrates and 0.2 µrad (rms) for significantly curved surfaces.

Siewert, Frank; Buchheim, Jana; Zeschke, Thomas

2010-05-01

9

The 1997 Protocol and the European Union (European Union and '2nd generation' responsibility conventions)  

International Nuclear Information System (INIS)

The issue of accession of the Eastern European Member States to the 1997 Protocol is discussed with focus on the European Union's authority and enforcement powers. Following up the article published in the preceding issue of this journal, the present contribution analyses the relations of the '2nd generation' responsibility conventions to the law of the European Union. (orig.)

10

Sustainable Production of Fuel: A Study for Customer Adoption of 2nd Generation of Biofuel

Digital Repository Infrastructure Vision for European Research (DRIVER)

Finding a new fuel to substitute for gasoline, whose supply is shrinking rapidly every year, is an urgent problem worldwide. In this situation, biofuel is considered one kind of new fuel that causes little pollution. Nowadays, 1st-generation biofuel is familiar to people and adopted by customers, which gives it a stable market share. Since it also brings new problems, 2nd-generation biofuel has appeared to address them. In the thesis, I compared the pros and cons between the 1st and 2n...

Jin, Ying

2010-01-01

11

Effects of Thermal Cycling on Control and Irradiated EPC 2nd Generation GaN FETs  

Science.gov (United States)

The power systems for use in NASA space missions must work reliably under harsh conditions including radiation, thermal cycling, and exposure to extreme temperatures. Gallium nitride semiconductors show great promise, but information pertaining to their performance is scarce. Gallium nitride N-channel enhancement-mode field effect transistors made by EPC Corporation in a 2nd generation of manufacturing were exposed to radiation followed by long-term thermal cycling in order to address their reliability for use in space missions. Results of the experimental work are presented and discussed.

Patterson, Richard L.; Scheick, Leif; Lauenstein, Jean-Marie; Casey, Megan; Hammoud, Ahmad

2013-01-01

12

Boosting biogas yield of anaerobic digesters by utilizing concentrated molasses from 2nd generation bioethanol plant  

Energy Technology Data Exchange (ETDEWEB)

Concentrated molasses (C5 molasses) from a 2nd-generation bioethanol plant has been investigated for enhancing the productivity of manure-based digesters. A batch study at mesophilic conditions (35 ± 1 °C) showed a maximum methane yield from molasses of 286 L CH4/kg VS, approximately 63% of the calculated theoretical yield. In addition to the batch study, co-digestion of molasses with cattle manure in a semi-continuously stirred reactor at thermophilic temperature (50 ± 1 °C) was also performed with a stepwise increase in molasses concentration. The results from this experiment revealed a maximum average biogas yield of 1.89 L/L/day when molasses contributed 23% of the VS co-digested with cattle manure. However, digesters fed with more than 32% of VS from molasses and with a short adaptation period showed VFA accumulation and reduced methane productivity, indicating that when using molasses as a biogas booster this level should not be exceeded.

Sarker, Shiplu [Department of Renewable Energy, Faculty of Engineering and Science, University of Agder, Grimstad-4879 (Norway); Moeller, Henrik Bjarne [Department of Biosystems Engineering, Faculty of Science and Technology, Aarhus University, Research center Foulum, Blichers Alle, Post Box 50, Tjele-8830 (Denmark)

2013-07-01

13

The New 2nd-Generation SRF R&D Facility at Jefferson Lab: TEDF  

Energy Technology Data Exchange (ETDEWEB)

The US Department of Energy has funded a near-complete renovation of the SRF-based accelerator research and development facilities at Jefferson Lab. The project to accomplish this, the Technical and Engineering Development Facility (TEDF) Project, has completed the first of two phases. An entirely new 3,100 m² purpose-built SRF technical work facility has been constructed and was occupied in the summer of 2012. All SRF work processes, with the exception of cryogenic testing, have been relocated into the new building. All cavity fabrication, processing, thermal treatment, chemistry, cleaning, and assembly work is collected conveniently into a new LEED-certified building. An innovatively designed 800 m² cleanroom/chemroom suite provides long-term flexibility for support of multiple R&D and construction projects as well as continued process evolution. The characteristics of this first 2nd-generation SRF facility are described.

Reece, Charles E.; Reilly, Anthony V.

2012-09-01

14

Improved beam spot measurements in the 2nd generation proton beam writing system  

Energy Technology Data Exchange (ETDEWEB)

Nanosized ion beams (especially proton and helium) play a pivotal role in the fields of ion beam lithography and ion beam analysis. Proton beam writing has demonstrated lithographic details down to the 20 nm level, limited by the proton beam spot size. Introducing a smaller spot size will allow smaller lithographic features. Smaller probe sizes will also drastically improve the spatial resolution of ion beam analysis techniques. Among many other requirements, an ideal resolution standard for beam focusing and a reliable focusing method are important prerequisites for sub-10 nm beam spot focusing. In this paper we present the fabrication process for a free-standing resolution standard with reduced side-wall projection and high side-wall verticality. The resulting grid is orthogonal (90.0° ± 0.1°) and has smooth edges with a side-wall projection of better than 6 nm. The new resolution standard has been used in focusing a 2 MeV H2+ beam in the 2nd-generation PBW system at the Centre for Ion Beam Applications, NUS. The beam size has been characterized using on- and off-axis scanning transmission ion microscopy (STIM) and ion-induced secondary electron detection, carried out with a newly installed microchannel plate electron detector. The latter has been shown to be a realistic alternative to STIM measurements, as the drawback of PIN diode detector damage is alleviated. With these improvements we show reproducible beam focusing down to 14 nm.

Yao, Yong [Centre for Ion Beam Applications, Department of Physics, National University of Singapore, Singapore 117542 (Singapore); Mourik, Martin W. van [Coherence and Quantum Technology, Department of Applied Physics, Eindhoven University of Technology, 5600 MB Eindhoven (Netherlands); Santhana Raman, P. [Centre for Ion Beam Applications, Department of Physics, National University of Singapore, Singapore 117542 (Singapore); Kan, Jeroen A. van, E-mail: phyjavk@nus.edu.sg [Centre for Ion Beam Applications, Department of Physics, National University of Singapore, Singapore 117542 (Singapore)

2013-07-01

15

Improved beam spot measurements in the 2nd generation proton beam writing system  

International Nuclear Information System (INIS)

Nanosized ion beams (especially proton and helium) play a pivotal role in the fields of ion beam lithography and ion beam analysis. Proton beam writing has demonstrated lithographic details down to the 20 nm level, limited by the proton beam spot size. Introducing a smaller spot size will allow smaller lithographic features. Smaller probe sizes will also drastically improve the spatial resolution of ion beam analysis techniques. Among many other requirements, an ideal resolution standard for beam focusing and a reliable focusing method are important prerequisites for sub-10 nm beam spot focusing. In this paper we present the fabrication process for a free-standing resolution standard with reduced side-wall projection and high side-wall verticality. The resulting grid is orthogonal (90.0° ± 0.1°) and has smooth edges with a side-wall projection of better than 6 nm. The new resolution standard has been used in focusing a 2 MeV H2+ beam in the 2nd-generation PBW system at the Centre for Ion Beam Applications, NUS. The beam size has been characterized using on- and off-axis scanning transmission ion microscopy (STIM) and ion-induced secondary electron detection, carried out with a newly installed microchannel plate electron detector. The latter has been shown to be a realistic alternative to STIM measurements, as the drawback of PIN diode detector damage is alleviated. With these improvements we show reproducible beam focusing down to 14 nm.

16

Improved beam spot measurements in the 2nd generation proton beam writing system  

Science.gov (United States)

Nanosized ion beams (especially proton and helium) play a pivotal role in the fields of ion beam lithography and ion beam analysis. Proton beam writing has demonstrated lithographic details down to the 20 nm level, limited by the proton beam spot size. Introducing a smaller spot size will allow smaller lithographic features. Smaller probe sizes will also drastically improve the spatial resolution of ion beam analysis techniques. Among many other requirements, an ideal resolution standard for beam focusing and a reliable focusing method are important prerequisites for sub-10 nm beam spot focusing. In this paper we present the fabrication process for a free-standing resolution standard with reduced side-wall projection and high side-wall verticality. The resulting grid is orthogonal (90.0° ± 0.1°) and has smooth edges with a side-wall projection of better than 6 nm. The new resolution standard has been used in focusing a 2 MeV H2+ beam in the 2nd-generation PBW system at the Centre for Ion Beam Applications, NUS. The beam size has been characterized using on- and off-axis scanning transmission ion microscopy (STIM) and ion-induced secondary electron detection, carried out with a newly installed microchannel plate electron detector. The latter has been shown to be a realistic alternative to STIM measurements, as the drawback of PIN diode detector damage is alleviated. With these improvements we show reproducible beam focusing down to 14 nm.

Yao, Yong; van Mourik, Martin W.; Santhana Raman, P.; van Kan, Jeroen A.

2013-07-01

17

A preliminary study investigating class characteristics in the Gurmukhi handwriting of 1st and 2nd generation Punjabis.  

Science.gov (United States)

Gurmukhi is a written script of the Punjabi language, which is spoken by 104 million people worldwide. A previous study of Punjabi residents showed that it contains several unique class characteristics. In this paper, these class characteristics and others were analysed in both 1st-generation and 2nd-generation Punjabi descendants residing in the United Kingdom. Using the Pearson chi-squared test, eight characteristic features were found to be statistically different in the Gurmukhi handwriting of the two populations (p < 0.01). Additionally, there are several changes in previously identified class characteristics, such as script type and angularity of characters, between the 1st-generation and 2nd-generation Punjabi populations. These class characteristics may be of value to forensic document examiners and allow them to identify the population and the generation of the writer of a suspect document. PMID:18953800

Turner, Ian J; Sidhu, Rajvinder K; Love, Julian M

2008-09-01

18

Generation of higher order Gauss-Laguerre modes in single-pass 2nd harmonic generation  

DEFF Research Database (Denmark)

We present a realistic method for dynamic simulation of the development of higher order modes in second harmonic generation. The deformation of the wave fronts due to the nonlinear interaction is expressed by expansion in higher order Gauss-Laguerre modes.

Buchhave, Preben; Tidemand-Lichtenberg, Peter

2008-01-01

19

Generative Software Development  

Digital Repository Infrastructure Vision for European Research (DRIVER)

Generation of software from modeling languages such as UML and domain-specific languages (DSLs) has become an important paradigm in software engineering. In this contribution, we present some positions on software development in a model-based, generative manner, based on home-grown DSLs as well as the UML. This includes development of DSLs as well as development of models in these languages in order to generate executable code, test cases or models in different languages. Dev...
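As a toy illustration of the model-driven, generative development described above, the sketch below turns a tiny hand-written 'model' (a Python dictionary standing in for a DSL or UML class description) into executable source code. It is not taken from the cited work; the names and structure are invented for the example.

    # Toy model: one "class" with typed attributes, standing in for a DSL/UML description.
    model = {
        "name": "Sensor",
        "attributes": [("id", "int"), ("label", "str"), ("value", "float")],
    }

    def generate_class(m):
        """Emit Python source for a simple data-holder class from the model."""
        lines = [f"class {m['name']}:"]
        args = ", ".join(f"{name}: {typ}" for name, typ in m["attributes"])
        lines.append(f"    def __init__(self, {args}):")
        for name, _ in m["attributes"]:
            lines.append(f"        self.{name} = {name}")
        return "\n".join(lines)

    source = generate_class(model)
    print(source)                   # inspect the generated code
    namespace = {}
    exec(source, namespace)         # compile it, as a generator back-end would
    s = namespace["Sensor"](1, "temperature", 21.5)
    print(s.label, s.value)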

Rumpe, Bernhard; Schindler, Martin; Völkel, Steven; Weisemöller, Ingo

2014-01-01

20

1st or 2nd generation bioethanol-impacts of technology integration & on feed production and land use  

DEFF Research Database (Denmark)

"1st or 2nd generation bioethanol-impacts of technology integration & on feed production and land use" Liquid bio fuels are perceived as a means of mitigating CO2 emissions from transport and thus climate change, but much concern has been raised to the energy consumption from refining biomass to liquid fuels. Integrating technologies such that waste stream can be used will reduce energy consumption in the production of bioethanol from wheat. We show that the integration of bio refining and combined heat an power generation reduces process energy requirements with 30-40 % and makes bioethanol production comparable to gasoline production in terms of energy loss. Utilisation of biomass in the energy sector is inevitably linked to the utilisation of land. This is a key difference between fossil and bio based energy systems. Thus evaluations of bioethanol production based on energy balances alone are inadequate. 1st and 2nd generation bioethanol production exhibits major differences when evaluated on characteristics as feed energy and feed protein production and subsequently on land use changes. 1st generation bioethanol production based on wheat grain in Denmark may in fact reduce the pressure on agricultural land on a global scale, but increase the pressure on local/national scale. In contrast to that 2nd generation bioethanol based on wheat straw exhibits a poorer energy balance than 1st generation, but the induced imbalances on feed energy are smaller. Proteins are some of the plant components with the poorest bio synthesis efficiency and as such the area demand for their production is relatively high. Preservation of the proteins in the biomass such as in feed by-products from bioethanol production is of paramount importance in developing sustainable utilisation of biomass in the energy sector.

Bentsen, Niclas Scott; Felby, Claus

2009-01-01

21

White paper on perspectives of biofuels in Denmark - with focus on 2nd generation bioethanol; Hvidbog om perspektiver for biobraendstoffer i Danmark - med fokus paa 2. generations bioethanol  

Energy Technology Data Exchange (ETDEWEB)

The white paper presents the perspectives - both options and barriers - for a Danish focus on production and use of biomass, including sustainable 2nd generation bioethanol, for transport. The white paper presents the current knowledge of biofuels and bioethanol and recommendations for a Danish strategy. (ln)

Larsen, Gy.; Foghmar, J.

2009-11-15

22

Self-assembling software generator  

Science.gov (United States)

A technique to generate an executable task includes inspecting a task specification data structure to determine what software entities are to be generated to create the executable task, inspecting the task specification data structure to determine how the software entities will be linked after generating the software entities, inspecting the task specification data structure to determine logic to be executed by the software entities, and generating the software entities to create the executable task.
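A rough Python sketch of the idea in the abstract follows: a task specification data structure is inspected to decide which software entities to create, how they are linked, and what logic each executes, and an executable task is assembled from it. The structure and names are purely illustrative and are not the patented implementation.

    # Hypothetical task specification: entities, their logic, and how they are linked.
    task_spec = {
        "entities": {
            "read":   lambda _: list(range(5)),         # produces data
            "square": lambda xs: [x * x for x in xs],   # transforms data
            "total":  lambda xs: sum(xs),               # reduces data
        },
        "links": [("read", "square"), ("square", "total")],   # execution order
    }

    def generate_task(spec):
        """Inspect the specification and assemble an executable task (a callable)."""
        order = [src for src, _ in spec["links"]] + [spec["links"][-1][1]]
        def task():
            value = None
            for name in order:                # entities run in linked order
                value = spec["entities"][name](value)
            return value
        return task

    executable = generate_task(task_spec)
    print(executable())   # 0 + 1 + 4 + 9 + 16 = 30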

Bouchard, Ann M. (Albuquerque, NM); Osbourn, Gordon C. (Albuquerque, NM)

2011-11-25

23

REFUEL. Potential and realizable cost reduction of 2nd generation biofuels  

International Nuclear Information System (INIS)

In the REFUEL project, steering possibilities for and impacts of a greater market penetration of biofuels are assessed. Several benefits are attributed to second-generation biofuels, fuels made from lignocellulosic feedstock, such as higher productivity, smaller impacts on land use and food markets, and improved greenhouse gas emission reductions. The chances of second-generation biofuels entering the market autonomously are assessed, and several policy measures enhancing those chances are evaluated. It is shown that most second-generation biofuels might become competitive in the biofuel market if the production of biodiesel from oil crops becomes limited by land availability. Setting high biofuel targets, setting greenhouse gas emission caps on biofuels and setting sub-targets for second-generation biofuels all have a similar impact of stimulating second generation's entrance into the biofuel market. Conversely, low biofuel targets and high imports can have a discouraging impact on second-generation biofuel development, and thereby on overall greenhouse gas performance. Since this paper shows preliminary results from the REFUEL study, one is advised to contact the authors before quantitatively referring to this paper.

24

Strategies for 2nd generation biofuels in EU - Co-firing to stimulate feedstock supply development and process integration to improve energy efficiency and economic competitiveness  

International Nuclear Information System (INIS)

The present biofuel policies in the European Union primarily stimulate 1st-generation biofuels that are produced from conventional food crops. They may be a distraction from lignocellulose-based 2nd-generation biofuels - and also from biomass use for heat and electricity - by keeping farmers' attention and significant investments focused on first-generation biofuels and the cultivation of conventional food crops as feedstocks. This article presents two strategies that can contribute to the development of 2nd-generation biofuels based on lignocellulosic feedstocks. The integration of gasification-based biofuel plants in district heating systems is one option for increasing the energy efficiency and improving the economic competitiveness of such biofuels. Another option, biomass co-firing with coal, generates high-efficiency biomass electricity and reduces CO2 emissions by replacing coal. It also offers a near-term market for lignocellulosic biomass, which can stimulate the development of supply systems for biomass also suitable as feedstock for 2nd-generation biofuels. Regardless of the long-term priorities of biomass use for energy, the stimulation of lignocellulosic biomass production through the development of near-term and cost-effective markets is judged to be a no-regrets strategy for Europe. Strategies that induce a relevant development and exploit existing energy infrastructures, in order to reduce risk and reach lower costs, are proposed as an attractive complement to present and prospective biofuel policies.

25

Development of WWER-440 fuel. Use of fuel assemblies of 2-nd and 3-rd generations with increased enrichment  

International Nuclear Information System (INIS)

The problem of increasing the power of units at NPPs with WWER-440 reactors is of current importance. All the necessary prerequisites for solving it exist as a result of updates to the fuel assembly designs and codes. The decrease of the power peaking factor in the core is achieved by using profiled fuel assemblies, integrated fuel burnable absorber and FAs with a modernized docking unit, together with modern codes, which allows the conservatism of the RP safety substantiation to be reduced. A wide range of experimental studies of fuel behaviour at burn-ups of 50-60 MW·day/kgU in transition and emergency conditions has been performed, as well as post-irradiation examinations of fuel assemblies, fuel rods and fuel pellets with a 5-year operating period; these prove the high reliability of the fuel and the presence of a large margin in the fuel column, which supports reactor operation at increased power. The results of the work performed on the introduction of 5- and 6-year fuel cycles show that the ultimate operational limits of fuel in WWER-440 reactors are far from being reached. The neutron-physical and thermal-hydraulic characteristics of the cores of operating power units with RP V-213 are such that the actual (design and measured) power peaking factors for fuel assemblies and fuel rods are, as a rule, smaller than the maximum design values. This factor is a real reserve for power uprating. There is experience of operating Units 1, 2 and 4 of the Kola NPP and Unit 2 of the Rovno NPP at increased power. The units of the Loviisa NPP are operated at 109% power. During the transition to operation at increased power it is reasonable to use fuel assemblies with an increased height of the fuel column, which allows the average linear power to be decreased. Further development of the 2nd-generation fuel assembly design and the subsequent transition to 3rd-generation working fuel assemblies provide a significant improvement in fuel utilization under conditions of WWER-440 reactor operation with longer fuel cycles and increased power.

26

Experimental Investigation of 2nd Generation Bioethanol Derived from Empty-Fruit-Bunch (EFB) of Oil-Palmon Performance and Exhaust Emission of SI Engine  

Digital Repository Infrastructure Vision for European Research (DRIVER)

The experimental investigation of 2nd-generation bioethanol derived from EFB of oil palm, blended with gasoline at 10, 20 and 25% by volume, and of pure gasoline was conducted through performance and exhaust emission tests on an SI engine. A four-stroke, four-cylinder, programmed fuel injection (PGMFI), 16-valve variable valve timing and electronic lift control (VTEC), single overhead camshaft (SOHC), 1497 cm3 SI engine (Honda/L15A) was used in this investigation. The engine performance test was carried ...

Yanuandri Putrasari; Haznan Abimanyu; Achmad Praptijanto; Arifin Nur; Yan Irawan; Sabar Pangihutan

2014-01-01

27

Performance Evaluation of Electrochem's PEM Fuel Cell Power Plant for NASA's 2nd Generation Reusable Launch Vehicle  

Science.gov (United States)

NASA's Next Generation Launch Technology (NGLT) program is being developed to meet national needs for civil and commercial space access, with goals of reducing launch costs, increasing reliability, and reducing maintenance and operating costs. To this end, NASA is considering an all-electric capability for NGLT vehicles, requiring advanced electrical power generation technology at a nominal 20 kW level with peak power capabilities of six times the nominal power. The proton exchange membrane (PEM) fuel cell has been identified as a viable candidate to supply this electrical power; however, several technology aspects need to be assessed. Electrochem, Inc., under contract to NASA, has developed a breadboard power generator to address these technical issues, with the goal of maximizing system reliability while minimizing cost and system complexity. This breadboard generator operates with dry hydrogen and oxygen gas, using eductors to recirculate the gases and thereby eliminating gas humidification and blowers from the system. Except for a coolant pump, the system design incorporates passive components, allowing the fuel cell to readily follow a duty cycle profile and to operate at 6:1 peak power levels for 30-second durations. Performance data for the fuel cell stack, along with system performance, are presented to highlight the benefits of the fuel cell stack design and system design for NGLT vehicles.

Kimble, Michael C.; Hoberecht, Mark

2003-01-01

28

Stroke Symbol Generation Software for Fighter Aircraft  

Digital Repository Infrastructure Vision for European Research (DRIVER)

This paper gives an overview of the stroke symbol generation software developed by Hindustan Aeronautics Limited for fighter aircraft. It covers the working principle of the head-up display, an overview of the target hardware on which the developed software has been integrated and tested, the software architecture, hardware-software interfaces and design details of the stroke symbol generation software. The paper also covers the issues related to stroke symbol quality which were encountered by the desig...

Tripathi, G. K.; Prashant Kumar

2013-01-01

29

Contribution of ion beam analysis methods to the development of 2nd generation high temperature superconducting (HTS) wires  

Energy Technology Data Exchange (ETDEWEB)

One of the crucial steps in the second-generation high temperature superconducting wire program was development of the buffer layer architecture. The architecture designed at the Superconductivity Technology Center at Los Alamos National Laboratory consists of several oxide layers wherein each layer plays a specific role, namely: nucleation layer, diffusion barrier, biaxially textured template, and an intermediate layer with a good match to the lattice parameter of the superconducting Y1Ba2Cu3O7 (YBCO) compound. This report demonstrates how a wide range of ion beam analysis techniques (SIMS, RBS, channeling, PIXE, PIGE, NRA, ERD) was employed for analysis of each buffer layer and the YBCO films. These results assisted in understanding a variety of physical processes occurring during buffer layer fabrication and helped to optimize the buffer layer architecture as a whole.

Usov, Igor O [Los Alamos National Laboratory; Arendt, Paul N [Los Alamos National Laboratory; Stan, Liliana [Los Alamos National Laboratory; Holesinger, Terry G [Los Alamos National Laboratory; Foltyn, Steven R [Los Alamos National Laboratory; Depaula, Raymond F [Los Alamos National Laboratory

2009-01-01

30

FT-IR Investigation of Hoveyda-Grubbs' 2nd Generation Catalyst in Self-Healing Epoxy Mixtures

International Nuclear Information System (INIS)

The development of smart composites capable of self-repair for aeronautical structures is still at the planning stage owing to complex issues that remain to be overcome. A very important issue concerns the stability of the components of the proposed composites, which is compromised at the cure temperatures necessary for good performance of the composite. In this work we analyzed the possibility of applying Hoveyda-Grubbs' second-generation catalyst (HG2) to develop self-healing systems. Our experimental results have shown critical issues in the use of epoxy precursors in conjunction with the Hoveyda-Grubbs II metathesis catalyst. However, an appropriate curing cycle of the self-healing mixture makes it possible to overcome these critical issues, allowing high curing temperatures without deactivating the self-repair activity.

31

Control system for the 2nd generation Berkeley automounters (BAM2) at GM/CA-CAT macromolecular crystallography beamlines  

Energy Technology Data Exchange (ETDEWEB)

GM/CA-CAT at Sector 23 of the Advanced Photon Source (APS) is an NIH funded facility for crystallographic structure determination of biological macromolecules by X-ray diffraction. A second-generation Berkeley automounter is being integrated into the beamline control system at the 23BM experimental station. This new device replaces the previous all-pneumatic gripper motions with a combination of pneumatics and XYZ motorized linear stages. The latter adds a higher degree of flexibility to the robot including auto-alignment capability, accommodation of a larger capacity sample Dewar of arbitrary shape, and support for advanced operations such as crystal washing, while preserving the overall simplicity and efficiency of the Berkeley automounter design.

Makarov, O., E-mail: makarov@anl.gov [GM/CA-CAT, Biosciences Division, Argonne National Laboratory, Argonne, IL 60439 (United States); Hilgart, M.; Ogata, C.; Pothineni, S. [GM/CA-CAT, Biosciences Division, Argonne National Laboratory, Argonne, IL 60439 (United States); Cork, C. [Physical Biosciences Division, Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States)

2011-09-01

32

Anaerobic digestion in combination with 2nd generation ethanol production for maximizing biofuels yield from lignocellulosic biomass – testing in an integrated pilot-scale biorefinery plant.  

DEFF Research Database (Denmark)

An integrated biorefinery concept for 2nd generation bioethanol production together with biogas production from the fermentation effluent was tested at pilot scale. The pilot plant comprised pretreatment, enzymatic hydrolysis, hexose and pentose fermentation into ethanol, and anaerobic digestion of the fermentation effluent in a UASB (upflow anaerobic sludge blanket) reactor. Operation of the 770-liter UASB reactor was tested under both mesophilic (38 °C) and thermophilic (53 °C) conditions with increasing loading rates of the liquid fraction of the effluent from ethanol fermentation. At an OLR of 3.5 kg-VS/(m3·d) a methane yield of 340 L/kg-VS was achieved for thermophilic operation, while 270 L/kg-VS was obtained under mesophilic conditions. Thermophilic operation was, however, less robust towards further increases of the loading rate, and for loading rates higher than 5 kg-VS/(m3·d) the yield was higher for mesophilic than for thermophilic operation. The effluent from the ethanol fermentation showed no signs of toxicity to the anaerobic microorganisms. Implementation of the biogas production from the fermentation effluent accounted for about 30% higher biofuel yield in the biorefinery compared to a system with only bioethanol production.

Uellendahl, Hinrich; Ahring, Birgitte Kiær

33

Power plant intake quantification of wheat straw composition for 2nd generation bioethanol optimization : A Near Infrared Spectroscopy (NIRS) feasibility study  

DEFF Research Database (Denmark)

Optimization of 2nd-generation bioethanol production from wheat straw requires comprehensive knowledge of the composition of the plant intake feedstock. Near Infrared Spectroscopy is evaluated as a potential method for instantaneous quantification of the salient fermentation wheat straw components: cellulose (glucan), hemicelluloses (xylan, arabinan), and lignin. Aiming at chemometric multivariate calibration, 44 pre-selected samples were subjected to spectroscopy and reference analysis. For glucan and xylan, prediction accuracies (slope: 0.89, 0.94) and precisions (r2: 0.87) were obtained, corresponding to error of prediction levels of 8-9%. Models for arabinan and lignin were marginally poorer, and especially for lignin a further expansion of the feasibility dataset was deemed necessary. The results are affected by significant influences from sub-sampling/mass reduction errors in the laboratory regimen. The relatively high proportion of outliers excluded from the present models (10-20%) may indicate that comminution sample preparation is most likely always needed. Different solutions to these issues are suggested.
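The chemometric multivariate calibration referred to above is typically a partial least squares (PLS) regression from the NIR spectra to the reference concentrations. The sketch below shows how such a calibration might be set up with scikit-learn on synthetic data; the sample count echoes the 44 pre-selected samples, but the spectra, reference values and number of components are placeholders, not data from the study.

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(0)
    n_samples, n_wavelengths = 44, 400
    X = rng.normal(size=(n_samples, n_wavelengths))              # stand-in NIR spectra
    y = X[:, :5].sum(axis=1) + 0.1 * rng.normal(size=n_samples)  # stand-in glucan reference values

    pls = PLSRegression(n_components=5)   # in practice the component count is tuned by cross-validation
    pls.fit(X, y)
    y_hat = pls.predict(X).ravel()

    slope = np.polyfit(y, y_hat, 1)[0]
    r2 = np.corrcoef(y, y_hat)[0, 1] ** 2
    print(f"slope = {slope:.2f}, r2 = {r2:.2f}")   # compare with the ~0.9 figures quoted above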

Lomborg, Carina J.; Thomsen, Mette Hedegaard

2010-01-01

34

Experimental Investigation of 2nd Generation Bioethanol Derived from Empty-fruit-bunch (EFB) of Oil-palm on Performance and Exhaust Emission of SI Engine

Directory of Open Access Journals (Sweden)

The experimental investigation of 2nd-generation bioethanol derived from EFB of oil palm, blended with gasoline at 10, 20 and 25% by volume, and of pure gasoline was conducted through performance and exhaust emission tests on an SI engine. A four-stroke, four-cylinder, programmed fuel injection (PGMFI), 16-valve variable valve timing and electronic lift control (VTEC), single overhead camshaft (SOHC), 1,497 cm3 SI engine (Honda/L15A) was used in this investigation. The engine performance test covered brake torque, power, and fuel consumption. The exhaust emissions were analyzed for carbon monoxide (CO) and hydrocarbons (HC). The engine was operated over a speed range from 1,500 to 4,500 rev/min at 85% throttle opening. The results showed that the highest brake torque of the bioethanol blends was achieved with 10% bioethanol content at 3,000 to 4,500 rpm, the brake power was greater than that of pure gasoline at 3,500 to 4,500 rpm for 10% bioethanol, and bioethanol-gasoline blends of 10 and 20% resulted in greater bsfc than pure gasoline at low speeds from 1,500 to 3,500 rpm. The CO and HC emissions tended to decrease as the engine speed increased.

Yanuandri Putrasari

2014-07-01

35

Direct and non-destructive proof of authenticity for the 2nd generation of Brazilian real banknotes via easy ambient sonic spray ionization mass spectrometry.  

Science.gov (United States)

Using a desorption/ionization technique, easy ambient sonic-spray ionization coupled to mass spectrometry (EASI-MS), documents related to the 2nd generation of Brazilian Real currency (R$) were screened in the positive ion mode for authenticity based on chemical profiles obtained directly from the banknote surface. Characteristic profiles were observed for authentic banknotes, seized suspect counterfeits, and homemade counterfeits from inkjet and laserjet printers. The chemicals on the authentic banknotes' surface were detected via a few minor sets of ions, namely from the plasticizers bis(2-ethylhexyl) phthalate (DEHP) and dibutyl phthalate (DBP), most likely related to the official offset printing process, and other common quaternary ammonium cations, presenting a chemical profile similar to that of 1st-generation R$. The seized suspect counterfeit banknotes, however, displayed abundant diagnostic ions in the m/z 400-800 range due to the presence of oligomers. High-accuracy FT-ICR MS analysis enabled molecular formula assignment for each ion. The ions were separated by 44 m/z, which enabled their characterization as Surfynol® 4XX (S4XX, XX = 40, 65, and 85), wherein increasing XX values indicate increasing amounts of ethoxylation on a backbone of 2,4,7,9-tetramethyl-5-decyne-4,7-diol (Surfynol® 104). Sodiated triethylene glycol monobutyl ether (TBG) of m/z 229 (C10H22O4Na) was also identified in the seized counterfeit banknotes via EASI(+) FT-ICR MS. Surfynol® and TBG are constituents of inks used for inkjet printing. PMID:25498934

Schmidt, Eduardo Morgado; Franco, Marcos Fernando; Regino, Karen Gomes; Lehmann, Eraldo Luiz; Arruda, Marco Aurélio Zezzi; de Carvalho Rocha, Werickson Fortunato; Borges, Rodrigo; de Souza, Wanderley; Eberlin, Marcos Nogueira; Correa, Deleon Nascimento

2014-12-01

36

QMG: mesh generation and related software  

Science.gov (United States)

The QMG package does finite element mesh generation in two and three dimensions. The package includes geometric modeling software, the mesh generator itself, and a finite element solver. It is free software whose source code is downloadable from the Web. QMG2.0 runs under Unix and Windows NT.

37

Experimental and numerical validation of the effective medium theory for the B-term band broadening in 1st and 2nd generation monolithic silica columns.  

Science.gov (United States)

Effective medium theory (EMT) expressions for the B-term band broadening in monolithic silica columns are presented at the whole-column as well as at the mesoporous skeleton level. Given the bi-continuous nature of the monolithic medium, regular as well as inverse formulations of the EMT-expressions have been established. The established expressions were validated by applying them to a set of experimental effective diffusion (Deff) data obtained via peak parking on a number of 1st- and 2nd-generation monolithic silica columns, as well as to a set of numerical diffusion simulations in a simplified monolithic column representation (tetrahedral skeleton model) with different external porosities and internal diffusion coefficients. The numerically simulated diffusion data can be very closely represented over a very broad range of zone retention factors (up to k″ = 80) using the established EMT-expressions, especially when using the inverse variant. The expressions also allow the experimentally measured effective diffusion data to be represented very closely. The measured Deff/Dmol values were found to decrease significantly with increasing retention factor, in general going from about Deff/Dmol = 0.55 to 0.65 at low k″ (k″ ≈ 1.5-3.8) to Deff/Dmol = 0.25 at very high k″ (k″ ≈ 40-80). These values are significantly larger than observed in fully-porous and core-shell particles. The intra-skeleton diffusion coefficient (Dpz) was typically found to be of the order of Dpz/Dmol = 0.4, compared to Dpz/Dmol = 0.2-0.35 observed in most particle-based columns. These higher Dpz/Dmol values are the cause of the higher Deff/Dmol values observed. In addition, it appears that the higher internal diffusion is linked to the higher porosity of the mesoporous skeleton, which has a relatively open structure with relatively wide pores. The observed (weak) relation between Dpz/Dmol and the zone retention factor appears to be in good agreement with that predicted when applying the regular variant of the EMT-expression directly to the mesoporous skeleton level. PMID:24909439
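For orientation, the classical Maxwell effective-medium relation on which such whole-column expressions build can be written (LaTeX notation) for a medium with external porosity \varepsilon, mobile-zone diffusion coefficient D_mol and intra-skeleton diffusion coefficient D_pz. This is only the textbook two-phase form; the regular and inverse formulations established in the paper additionally account for retention through the zone retention factor k″.

    \frac{D_{\mathrm{eff,bed}}}{D_{\mathrm{mol}}}
      = \frac{D_{pz} + 2D_{\mathrm{mol}} + 2(1-\varepsilon)\,(D_{pz}-D_{\mathrm{mol}})}
             {D_{pz} + 2D_{\mathrm{mol}} - (1-\varepsilon)\,(D_{pz}-D_{\mathrm{mol}})}

As sanity checks, D_pz = D_mol gives D_eff,bed = D_mol, while D_pz = 0 reduces to the pure obstruction factor 2\varepsilon/(3-\varepsilon) of an impermeable skeleton.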

Deridder, Sander; Vanmessen, Alison; Nakanishi, Kazuki; Desmet, Gert; Cabooter, Deirdre

2014-07-18

38

Quantification of left and right ventricular function and myocardial mass: Comparison of low-radiation dose 2nd generation dual-source CT and cardiac MRI  

International Nuclear Information System (INIS)

Objective: To prospectively evaluate the accuracy of left and right ventricular function and myocardial mass measurements based on a dual-step, low radiation dose protocol with prospectively ECG-triggered 2nd generation dual-source CT (DSCT), using cardiac MRI (cMRI) as the reference standard. Materials and methods: Twenty patients underwent 1.5 T cMRI and prospectively ECG-triggered dual-step pulsing cardiac DSCT. This image acquisition mode performs low-radiation (20% tube current) imaging over the majority of the cardiac cycle and applies full radiation only during a single adjustable phase. Full-radiation-phase images were used to assess cardiac morphology, while low-radiation-phase images were used to measure left and right ventricular function and mass. Quantitative CT measurements based on contiguous multiphase short-axis reconstructions from the axial CT data were compared with short-axis SSFP cardiac cine MRI. Contours were manually traced around the ventricular borders for calculation of left and right ventricular end-diastolic volume, end-systolic volume, stroke volume, ejection fraction and myocardial mass for both modalities. Statistical methods included independent t-tests, the Mann–Whitney U test, Pearson correlation statistics, and Bland–Altman analysis. Results: All CT measurements of left and right ventricular function and mass correlated well with those from cMRI: for left/right end-diastolic volume r = 0.885/0.801, left/right end-systolic volume r = 0.947/0.879, left/right stroke volume r = 0.620/0.697, left/right ejection fraction r = 0.869/0.751, and left/right myocardial mass r = 0.959/0.702. Mean radiation dose was 6.2 ± 1.8 mSv. Conclusions: Prospectively ECG-triggered, dual-step pulsing cardiac DSCT accurately quantifies left and right ventricular function and myocardial mass in comparison with cMRI, with substantially lower radiation exposure than reported for traditional retrospective ECG-gating.
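For readers unfamiliar with the functional indices compared above, the volumetric definitions applied once the ventricular contours are traced are simple. The following is a generic illustration with made-up volumes, not data or code from the study.

    def stroke_volume(edv_ml, esv_ml):
        """Stroke volume = end-diastolic minus end-systolic volume (mL)."""
        return edv_ml - esv_ml

    def ejection_fraction(edv_ml, esv_ml):
        """Ejection fraction as a percentage of the end-diastolic volume."""
        return 100.0 * (edv_ml - esv_ml) / edv_ml

    # Illustrative (not measured) left-ventricular volumes:
    print(stroke_volume(120.0, 50.0))       # 70.0 mL
    print(ejection_fraction(120.0, 50.0))   # ~58.3 %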

39

Development of automated crack propagation analysis system. 2nd report, the crack propagation analysis system and finite element model generation for the crack propagation  

International Nuclear Information System (INIS)

The authors have been developing a fully automated three-dimensional crack propagation analysis system. Although three-dimensional finite element analyses have become a common design-analysis tool in industry, many difficulties still exist in performing three-dimensional crack propagation analyses. This is because, although fully automatic mesh generation techniques are available for tetrahedral finite elements, hexahedral elements are commonly used in three-dimensional crack analyses. Furthermore, the analysis models tend to be large in scale. The key components of the present analysis system are the mesh generation software and the virtual crack closure-integral method (VCCM) for second-order tetrahedral finite elements. VCCM is an energy-based method for computing the stress intensity factors. In this paper, methodologies for automatic mesh generation in crack propagation analysis are described in detail and some numerical examples are presented. (author)
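For context, the crack-closure idea behind the VCCM can be illustrated by the classical two-dimensional, mode I virtual crack closure relation (LaTeX notation), in which the energy release rate is computed from the nodal force F_y at the crack tip and the opening displacement \Delta v of the node pair just behind it, with \Delta A the virtually closed crack area and E' = E for plane stress or E/(1-\nu^2) for plane strain:

    G_I = \frac{F_y\,\Delta v}{2\,\Delta A}, \qquad K_I = \sqrt{E'\,G_I}

The VCCM cited in the abstract generalizes this closure-integral evaluation to second-order tetrahedral elements; the relations above are only the textbook special case, not the authors' formulation.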

40

Generating Mechanisms for Evolving Software Mirror Graph  

Directory of Open Access Journals (Sweden)

Following the growing research interest in complex networks, in recent years many researchers have treated static structures of software as complex networks and revealed that most of these networks demonstrate the small-world effect and follow a scale-free degree distribution. Different from the perspectives adopted in these works, our previous work proposed the software mirror graph to model the dynamic execution processes of software and revealed that the software mirror graph may also be small-world and scale-free. To explain how the software mirror graph evolves into a small-world and scale-free structure, in this paper we further propose a mathematical model based on the mechanisms of growth, preferential attachment, and walking. This model captures some of the features of the software mirror graph, and the simulation results show that it can generate a network with properties similar to those of the software mirror graph. The implications are also discussed in this paper.
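A minimal sketch of the growth and preferential-attachment ingredients named above is given below (the 'walking' mechanism of the proposed model is omitted); the parameters and names are illustrative, and this is not the authors' model.

    import random

    def grow_network(n_nodes, m_per_node, seed=0):
        """Grow a graph by adding nodes that attach preferentially to high-degree nodes."""
        random.seed(seed)
        edges = [(0, 1)]                 # small seed graph
        degree = {0: 1, 1: 1}
        for new in range(2, n_nodes):
            nodes = list(degree)
            weights = [degree[v] for v in nodes]          # preferential attachment
            targets = set()
            while len(targets) < min(m_per_node, len(nodes)):
                targets.add(random.choices(nodes, weights=weights)[0])
            degree[new] = 0
            for t in targets:
                edges.append((new, t))
                degree[new] += 1
                degree[t] += 1
        return edges, degree

    edges, degree = grow_network(1000, 2)
    print(max(degree.values()), min(degree.values()))   # hubs emerge alongside low-degree nodes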

Ling-Zan Zhu

2012-09-01

41

Stroke Symbol Generation Software for Fighter Aircraft  

Directory of Open Access Journals (Sweden)

This paper gives an overview of the stroke symbol generation software developed by Hindustan Aeronautics Limited for fighter aircraft. It covers the working principle of the head-up display, an overview of the target hardware on which the developed software has been integrated and tested, the software architecture, hardware-software interfaces and design details of the stroke symbol generation software. The paper also covers the issues related to stroke symbol quality which were encountered by the design team and the details of how these issues were resolved during the integration and test phase. Defence Science Journal, 2013, 63(2), pp. 153-156, DOI: http://dx.doi.org/10.14429/dsj.63.4257

G.K. Tripathi

2013-03-01

42

Developing software for Symbian OS, 2nd edition: a beginner's guide to creating Symbian OS V9 smartphone applications in C++

CERN Document Server

Many problems encountered by engineers developing code for specialized Symbian subsystems boil down to a lack of understanding of the core Symbian programming concepts. Developing Software for Symbian OS remedies this problem by providing comprehensive coverage of all the key concepts. Numerous examples and descriptions are also included, which focus on the concepts the author has seen developers struggle with the most. The book covers development ranging from low-level system programming to end-user GUI applications. It also covers the development and packaging tools, as well as providing detailed reference material and examples for key APIs. The new edition includes a completely new chapter on platform security. The overall goal of the book is to provide introductory coverage of Symbian OS v9 and help developers with little or no knowledge of Symbian OS to develop as quickly as possible. There are few people with long Symbian development experience compared to demand, due to the rapid growth of Symbian in re...

Babin, Steve

2008-01-01

43

Experimental Stochastics (2nd edition)

International Nuclear Information System (INIS)

Otto Moeschlin and his co-authors have written a book about simulation of stochastic systems. The book comes with a CD-ROM that contains the experiments discussed in the book, and the text from the book is repeated on the CD-ROM. According to the authors, the aim of the book is to give a quick introduction to stochastic simulation for 'all persons interested in experimental stochastics'. To please this diverse audience, the authors offer a book that has four parts. Part 1, called 'Artificial Randomness', is the longest of the four parts. It gives an overview of the generation, testing and basic usage of pseudo random numbers in simulation. Although algorithms for generating sequences of random numbers are fundamental to simulation, it is a slightly unusual choice to give it such weight in comparison to other algorithmic topics. The remaining three parts consist of simulation case studies. Part 2, 'Stochastic Models', treats four problems - Buffon's needle, a queuing system, and two problems related to the kinetic theory of gases. Part 3 is called 'Stochastic Processes' and discusses the simulation of discrete time Markov chains, birth-death processes, Brownian motion and diffusions. The last section of Part 3 is about simulation as a tool to understand the traffic flow in a system controlled by stoplights, an area of research for the authors. Part 4 is called 'Evaluation of Statistical Procedures'. This section contains examples where simulation is used to test the performance of statistical methods. It covers four examples: the Neyman-Pearson lemma, the Wald sequential test, Bayesian point estimation and Hartigan procedures. The CD-ROM contains an easy-to-install software package that runs under Microsoft Windows. The software contains the text and simulations from the book. What I found most enjoyable about this book is the number of topics covered in the case studies. The highly individual selection of applications, which may serve as a source of inspiration for teachers of computational stochastic methods, is the main contribution of this electronic monograph. However, both the book and software suffer from several severe problems. Firstly, I feel that the structure of the text is weak. Probably this is partly the result of the text from the CD-ROM being put into a book format, but the short paragraphs and poorly structured sentences destroy the reading experience. Secondly, although the software is functional, I believe that, like me, many users will be disappointed by the quality of the user interface and the visualizations. The opportunities to interact with the simulations are limited. Thirdly, the presentation is slightly old fashioned and lacking in pedagogical structure. For example, flow charts and Pascal programs are used to present algorithms. To conclude, I am surprised that this electronic monograph warranted a second edition in this form. Teachers may find the examples useful as a starting point, but students and researchers are advised to look elsewhere. (book review)

44

Automated Environment Generation for Software Model Checking  

Science.gov (United States)

A key problem in model checking open systems is environment modeling (i.e., representing the behavior of the execution context of the system under analysis). Software systems are fundamentally open since their behavior is dependent on patterns of invocation of system components and values defined outside the system but referenced within the system. Whether reasoning about the behavior of whole programs or about program components, an abstract model of the environment can be essential in enabling sufficiently precise yet tractable verification. In this paper, we describe an approach to generating environments of Java program fragments. This approach integrates formally specified assumptions about environment behavior with sound abstractions of environment implementations to form a model of the environment. The approach is implemented in the Bandera Environment Generator (BEG) which we describe along with our experience using BEG to reason about properties of several non-trivial concurrent Java programs.

Tkachuk, Oksana; Dwyer, Matthew B.; Pasareanu, Corina S.

2003-01-01

45

Next generation lightweight mirror modeling software  

Science.gov (United States)

The advances in manufacturing techniques for lightweight mirrors, such as EXELSIS deep core low temperature fusion, Corning's continued improvements in the Frit bonding process and the ability to cast large complex designs, combined with water-jet and conventional diamond machining of glasses and ceramics, have created the need for more efficient means of generating finite element models of these structures. Traditional methods of assembling 400,000+ element models can take weeks of effort, severely limiting the range of possible optimization variables. This paper will introduce model generation software developed under NASA sponsorship for the design of both terrestrial and space-based mirrors. The software deals with any current mirror manufacturing technique, single substrates, multiple arrays of substrates, as well as the ability to merge submodels into a single large model. The modeler generates both mirror and suspension system elements; suspensions can be created either for each individual petal or for the whole mirror. A typical model generation of 250,000 nodes and 450,000 elements only takes 3-5 minutes, much of that time being variable input time. The program can create input decks for ANSYS, ABAQUS and NASTRAN. An archive/retrieval system permits creation of complete trade studies, varying cell size, depth, petal size and suspension geometry, with the ability to recall a particular set of parameters and make small or large changes with ease. The input decks created by the modeler are text files which can be modified by any text editor; all the shell thickness parameters and suspension spring rates are accessible, and comments in the deck identify which groups of elements are associated with these parameters. This again makes optimization easier. With ANSYS decks, the nodes representing support attachments are grouped into components; in ABAQUS these are SETS and in NASTRAN GRIDPOINT SETS; this makes integration of these models into large telescope or satellite models easier.
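
The modeler itself is not reproduced here; purely as an illustration of the idea of a parametric, text-deck-emitting generator, the following minimal Python sketch meshes a hypothetical flat face sheet with quad shell elements and writes a simple comma-separated deck (invented card names, not exact NASTRAN/ANSYS/ABAQUS syntax); cell size and shell thickness are the kind of free parameters a trade study would vary.

    """Toy parametric mesh/deck generator -- an illustration only, not the
    NASA-sponsored modeler described above. Assumed: a flat square face sheet
    meshed with quad shell elements; cell size and thickness are parameters."""

    def generate_deck(width_mm=1000.0, cell_mm=50.0, thickness_mm=3.0):
        n = int(width_mm / cell_mm) + 1          # nodes per edge
        lines = ["$ generated face-sheet deck (illustrative, not exact solver syntax)"]
        # node cards: id, x, y, z
        for j in range(n):
            for i in range(n):
                nid = j * n + i + 1
                lines.append(f"NODE, {nid}, {i * cell_mm:.1f}, {j * cell_mm:.1f}, 0.0")
        # quad shell elements referencing their four corner nodes
        lines.append(f"$ shell thickness parameter: {thickness_mm:.2f} mm")
        for j in range(n - 1):
            for i in range(n - 1):
                eid = j * (n - 1) + i + 1
                n1 = j * n + i + 1
                lines.append(f"QUAD, {eid}, {n1}, {n1 + 1}, {n1 + n + 1}, {n1 + n}")
        return "\n".join(lines)

    if __name__ == "__main__":
        deck = generate_deck(width_mm=200.0, cell_mm=50.0)
        print(deck[:400])   # the text deck can be edited like the modeler's output

Because the output is plain text, it can be edited or post-processed in the same spirit as the decks produced by the real modeler.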

Arnold, William R.; Fitzgerald, Matthew; Rosa, Rubin Jaca; Stahl, H. Philip

2013-09-01

46

Next Generation Lightweight Mirror Modeling Software  

Science.gov (United States)

The advances in manufacturing techniques for lightweight mirrors, such as EXELSIS deep core low temperature fusion, Corning's continued improvements in the Frit bonding process and the ability to cast large complex designs, combined with water-jet and conventional diamond machining of glasses and ceramics, have created the need for more efficient means of generating finite element models of these structures. Traditional methods of assembling 400,000+ element models can take weeks of effort, severely limiting the range of possible optimization variables. This paper will introduce model generation software developed under NASA sponsorship for the design of both terrestrial and space-based mirrors. The software deals with any current mirror manufacturing technique, single substrates, multiple arrays of substrates, as well as the ability to merge submodels into a single large model. The modeler generates both mirror and suspension system elements; suspensions can be created either for each individual petal or for the whole mirror. A typical model generation of 250,000 nodes and 450,000 elements only takes 5-10 minutes, much of that time being variable input time. The program can create input decks for ANSYS, ABAQUS and NASTRAN. An archive/retrieval system permits creation of complete trade studies, varying cell size, depth, petal size and suspension geometry, with the ability to recall a particular set of parameters and make small or large changes with ease. The input decks created by the modeler are text files which can be modified by any editor; all the key shell thickness parameters are accessible, and comments in the deck identify which groups of elements are associated with these parameters. This again makes optimization easier. With ANSYS decks, the nodes representing support attachments are grouped into components; in ABAQUS these are SETS and in NASTRAN GRIDPOINT SETS; this makes integration of these models into large telescope or satellite models easier.

Arnold, William R., Sr.; Fitzgerald, Mathew; Rosa, Rubin Jaca; Stahl, H. Philip

2013-01-01

47

Next-Generation Lightweight Mirror Modeling Software  

Science.gov (United States)

The advances in manufacturing techniques for lightweight mirrors, such as EXELSIS deep core low temperature fusion, Corning's continued improvements in the Frit bonding process and the ability to cast large complex designs, combined with water-jet and conventional diamond machining of glasses and ceramics, have created the need for more efficient means of generating finite element models of these structures. Traditional methods of assembling 400,000+ element models can take weeks of effort, severely limiting the range of possible optimization variables. This paper will introduce model generation software developed under NASA sponsorship for the design of both terrestrial and space-based mirrors. The software deals with any current mirror manufacturing technique, single substrates, multiple arrays of substrates, as well as the ability to merge submodels into a single large model. The modeler generates both mirror and suspension system elements; suspensions can be created either for each individual petal or for the whole mirror. A typical model generation of 250,000 nodes and 450,000 elements only takes 5-10 minutes, much of that time being variable input time. The program can create input decks for ANSYS, ABAQUS and NASTRAN. An archive/retrieval system permits creation of complete trade studies, varying cell size, depth, petal size and suspension geometry, with the ability to recall a particular set of parameters and make small or large changes with ease. The input decks created by the modeler are text files which can be modified by any editor; all the key shell thickness parameters are accessible, and comments in the deck identify which groups of elements are associated with these parameters. This again makes optimization easier. With ANSYS decks, the nodes representing support attachments are grouped into components; in ABAQUS these are SETS and in NASTRAN GRIDPOINT SETS; this makes integration of these models into large telescope or satellite models possible.

Arnold, William R., Sr.; Fitzgerald, Mathew; Rosa, Rubin Jaca; Stahl, Phil

2013-01-01

48

A Practical GLR Parser Generator for Software Reverse Engineering  

Directory of Open Access Journals (Sweden)

Full Text Available Traditional parser generators use deterministic parsing methods. These methods cannot meet the parsing requirements of software reverse engineering effectively. A new parser generator is presented which can generate a GLR parser with automatic error recovery. The generated GLR parser has parsing speed comparable to the traditional LALR(1) parser and can be used for parsing in software reverse engineering.
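
The abstract does not describe the generator's internals; as a rough, toy illustration of why generalized parsing tolerates the ambiguous grammars common in reverse engineering, the Python sketch below keeps every possible shift-reduce stack alive (brute force, no graph-structured stack and no error recovery) for a small invented grammar. A real GLR parser shares stacks for efficiency.

    """Brute-force nondeterministic shift-reduce parsing -- a toy stand-in for GLR.
    Grammar (ambiguous, hypothetical example):  E -> E + E | a"""

    RULES = [("E", ("E", "+", "E")), ("E", ("a",))]

    def parses(tokens, start="E"):
        seen, frontier = set(), [((), 0)]        # state = (stack of symbols, tokens consumed)
        while frontier:
            stack, pos = frontier.pop()
            if (stack, pos) in seen:
                continue
            seen.add((stack, pos))
            if stack == (start,) and pos == len(tokens):
                return True                      # at least one derivation accepts the input
            if pos < len(tokens):                # shift the next token
                frontier.append((stack + (tokens[pos],), pos + 1))
            for lhs, rhs in RULES:               # reduce any matching handle on top of the stack
                if len(stack) >= len(rhs) and stack[-len(rhs):] == rhs:
                    frontier.append((stack[:-len(rhs)] + (lhs,), pos))
        return False

    print(parses(list("a+a+a")))   # True: the ambiguous input is accepted
    print(parses(list("a++a")))    # False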

Teng Geng

2014-03-01

49

A Practical GLR Parser Generator for Software Reverse Engineering  

Digital Repository Infrastructure Vision for European Research (DRIVER)

Traditional parser generators use deterministic parsing methods. These methods cannot meet the parsing requirements of software reverse engineering effectively. A new parser generator is presented which can generate a GLR parser with automatic error recovery. The generated GLR parser has parsing speed comparable to the traditional LALR(1) parser and can be used for parsing in software reverse engineering.

Teng Geng; Fu Xu; Han Mei; Wei Meng; Zhibo Chen; Changqing Lai

2014-01-01

50

Techno-economic evaluation of 2nd generation bioethanol production from sugar cane bagasse and leaves integrated with the sugar-based ethanol process  

Directory of Open Access Journals (Sweden)

Full Text Available Abstract Background Bioethanol produced from the lignocellulosic fractions of sugar cane (bagasse and leaves), i.e. second generation (2G) bioethanol, has a promising market potential as an automotive fuel; however, the process is still under investigation on pilot/demonstration scale. From a process perspective, improvements in plant design can lower the production cost, providing better profitability and competitiveness if the conversion of the whole sugar cane is considered. Simulations have been performed with AspenPlus to investigate how process integration can affect the minimum ethanol selling price of this 2G process (MESP-2G), as well as improve the plant energy efficiency. This is achieved by integrating the well-established sucrose-to-bioethanol process with the enzymatic process for lignocellulosic materials. Bagasse and leaves were steam pretreated using H3PO4 as catalyst and separately hydrolysed and fermented. Results The addition of a steam dryer, doubling of the enzyme dosage in enzymatic hydrolysis, including leaves as raw material in the 2G process, heat integration and the use of more energy-efficient equipment led to a 37% reduction in MESP-2G compared to the Base case. Modelling showed that the MESP for 2G ethanol was 0.97 US$/L, while in the future it could be reduced to 0.78 US$/L. In this case the overall production cost of 1G + 2G ethanol would be about 0.40 US$/L with an output of 102 L/ton dry sugar cane including 50% leaves. Sensitivity analysis of the future scenario showed that a 50% decrease in the cost of enzymes, electricity or leaves would lower the MESP-2G by about 20%, 10% and 4.5%, respectively. Conclusions According to the simulations, the production of 2G bioethanol from sugar cane bagasse and leaves in Brazil is already competitive (without subsidies) with 1G starch-based bioethanol production in Europe. Moreover, 2G bioethanol could be produced at a lower cost if subsidies were used to compensate for the opportunity cost from the sale of excess electricity and if the cost of enzymes continues to fall.
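
As a quick, illustrative reading of the reported sensitivities (treating each 50% cost reduction as an independent, linear effect, which is an assumption made here and not the paper's model), the future-scenario MESP-2G can be scaled as follows.

    # Illustrative arithmetic only; percentages are taken from the abstract, but the
    # independence/linearity of the effects is an assumption made for this sketch.
    mesp_future = 0.78                      # US$/L, future-scenario MESP-2G
    sensitivities = {"enzymes": 0.20, "electricity": 0.10, "leaves": 0.045}

    for item, reduction in sensitivities.items():
        print(f"50% cheaper {item:11s} -> MESP-2G ~ {mesp_future * (1 - reduction):.2f} US$/L")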

Macrelli Stefano

2012-04-01

51

Minimal unroll factor for code generation of software pipelining  

Digital Repository Infrastructure Vision for European Research (DRIVER)

We address the problem of generating compact code from software pipelined loops. Although software pipelining is a powerful technique to extract fine-grain parallelism, it generates lifetime intervals spanning multiple loop iterations. These intervals require periodic register allocation (also called variable expansion), which in turn yields a code generation challenge. We are looking for the minimal unrolling factor enabling the periodic register allocation of software pipelined kernels. Th...
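
The abstract is truncated here; for background only (this is the classical modulo-variable-expansion baseline, not the paper's minimal-unroll-factor algorithm), a value that stays live for L cycles under initiation interval II needs about ceil(L/II) renamed copies, and a common baseline kernel unroll factor is a common multiple of those counts. A minimal sketch of that baseline, with invented lifetimes:

    # Baseline illustration only (not the paper's minimal-unroll algorithm):
    # classical modulo variable expansion unrolls the software-pipelined kernel
    # by a common multiple of ceil(lifetime / II) over all loop variants.
    from math import ceil, lcm          # math.lcm requires Python >= 3.9

    def baseline_unroll_factor(lifetimes, II):
        copies = [max(1, ceil(L / II)) for L in lifetimes]   # renamed copies per value
        return lcm(*copies)

    # hypothetical kernel: II = 4 cycles, three values live for 6, 8 and 10 cycles
    print(baseline_unroll_factor([6, 8, 10], II=4))   # lcm(2, 2, 3) = 6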

Bachir, Mounira; Touati, Sid-ahmed-ali; Brault, Frederic; Gregg, David; Cohen, Albert

2013-01-01

52

The 2nd Generation z(Redshift) and Early Universe Spectrometer Part I: First-light observation of a highly lensed local-ULIRG analog at high-z  

CERN Document Server

We report first science results from our new spectrometer, the 2nd generation z(Redshift) and Early Universe Spectrometer (ZEUS-2), recently commissioned on the Atacama Pathfinder Experiment telescope (APEX). ZEUS-2 is a submillimeter grating spectrometer optimized for detecting the faint and broad lines from distant galaxies that are redshifted into the telluric windows from 200 to 850 microns. It utilizes a focal plane array of transition-edge sensed bolometers, the first use of these arrays for astrophysical spectroscopy. ZEUS-2 promises to be an important tool for studying galaxies in the years to come due to its synergy with ALMA and its capabilities in the short submillimeter windows that are unique in the post Herschel era. Here we report on our first detection of the [CII] 158 $\\mu m$ line with ZEUS-2. We detect the line at z ~ 1.8 from H-ATLAS J091043.1-000322 with a line flux of $(6.44 \\pm 0.42) \\times 10^{-18} W m^{-2}$. Combined with its far-infrared luminosity and a new Herschel-PACS detection of...

Ferkinhoff, Carl; Parshley, Stephen; Nikola, Thomas; Stacey, Gordon J; Schoenwald, Justin; Higdon, James L; Higdon, Sarah J U; Verma, Aprajita; Riechers, Dominik; Hailey-Dunsheath, Steven; Menten, Karl; Güsten, Rolf; Weiß, Axel; Irwin, Kent; Cho, Hsiao M; Niemack, Michael; Halpern, Mark; Amiri, Mandana; Hasselfield, Matthew; Wiebe, D V; Ade, Peter A R; Tucker, Carol E

2013-01-01

53

A Practical Evaluation of Next Generation Sequencing & Molecular Cloning Software  

Digital Repository Infrastructure Vision for European Research (DRIVER)

Laboratories using Next Generation Sequencing (NGS) technologies and/ or high-throughput molecular cloning experiments can spend a significant amount of their research budget on data analysis and data management. The decision to develop in-house software, to rely on combinations of free software packages, or to purchase commercial software can significantly affect productivity and ROI. In this talk, we will describe a practical software evaluation process that was developed to assist core fac...

Meintjes, Peter; Qaadri, Kashef; Olsen, Christian

2013-01-01

54

Automatic program generation: future of software engineering  

Energy Technology Data Exchange (ETDEWEB)

At this moment software development is still more of an art than an engineering discipline. Each piece of software is lovingly engineered, nurtured, and presented to the world as a tribute to the writer's skill. When will this change? When will the craftsmanship be removed and the programs be turned out like so many automobiles from an assembly line? Sooner or later it will happen: economic necessities will demand it. With the advent of cheap microcomputers and ever more powerful supercomputers doubling in capacity, much more software must be produced. The choices are to double the number of programmers, double the efficiency of each programmer, or find a way to produce the needed software automatically. Producing software automatically is the only logical choice. How will automatic programming come about? Some of the preliminary actions which need to be done, and are being done, are to encourage programmer plagiarism of existing software through public library mechanisms, produce well-understood packages such as compilers automatically, develop languages capable of producing software as output, and learn enough about the whole process of programming to be able to automate it. Clearly, the emphasis must not be on efficiency or size, since ever larger and faster hardware is coming.

Robinson, J.H.

1979-01-01

55

Next-generation business intelligence software with Silverlight 3  

CERN Document Server

Business Intelligence (BI) software is the code and tools that allow you to view different components of a business using a single visual platform, making comprehending mountains of data easier. Applications that include reports, analytics, statistics, and historical and predictive modeling are all examples of BI applications. Currently, we are in the second generation of BI software, called BI 2.0. This generation is focused on writing BI software that is predictive, adaptive, simple, and interactive. As computers and software have evolved, more data can be presented to end users with increas

Czernicki, Bart

2010-01-01

56

Automating Traceability for Generated Software Artifacts  

Science.gov (United States)

Program synthesis automatically derives programs from specifications of their behavior. One advantage of program synthesis, as opposed to manual coding, is that there is a direct link between the specification and the derived program. This link is, however, not very fine-grained: it can be best characterized as Program is-derived-from Specification. When the generated program needs to be understood or modified, more fine-grained linking is useful. In this paper, we present a novel technique for automatically deriving traceability relations between parts of a specification and parts of the synthesized program. The technique is very lightweight and works, with varying degrees of success, for any process in which one artifact is automatically derived from another. We illustrate the generality of the technique by applying it to two kinds of automatic generation: synthesis of Kalman Filter programs from specifications using the AutoFilter program synthesis system, and generation of assembly language programs from C source code using the GCC C compiler. We evaluate the effectiveness of the technique in the latter application.

Richardson, Julian; Green, Jeffrey

2004-01-01

57

Search-based software test data generation using evolutionary computation  

CERN Document Server

Search-Based Software Engineering has been utilized for a number of software engineering activities. One area where Search-Based Software Engineering has seen much application is test data generation. Evolutionary testing designates the use of metaheuristic search methods for test case generation. The search space is the input domain of the test object, with each individual, or potential solution, being an encoded set of inputs to that test object. The fitness function is tailored to find test data for the type of test being undertaken. Evolutionary Testing (ET) uses optimizing search techniques such as evolutionary algorithms to generate test data. The effectiveness of a GA-based testing system is compared with a random testing system. For simple programs both testing systems work fine, but as the complexity of the program or of the input domain grows, the GA-based testing system significantly outperforms random testing.
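
A minimal evolutionary-testing sketch (toy program and branch-distance-style fitness, not the system evaluated in the paper): the GA evolves integer inputs toward an equality branch that random sampling over a wide range rarely hits.

    import random

    def program_under_test(x, y):
        # target branch: hard to hit by pure chance over a wide integer range
        if x * 2 == y + 1000:
            return "target branch"
        return "other branch"

    def fitness(ind):
        x, y = ind
        # branch-distance style fitness: smaller |x*2 - (y+1000)| is better
        return -abs(x * 2 - (y + 1000))

    def evolve(pop_size=40, generations=200, lo=-10_000, hi=10_000):
        pop = [(random.randint(lo, hi), random.randint(lo, hi)) for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=fitness, reverse=True)
            if fitness(pop[0]) == 0:                       # target branch covered
                return pop[0]
            parents = pop[: pop_size // 2]
            children = []
            while len(children) < pop_size - len(parents):
                (x1, y1), (x2, y2) = random.sample(parents, 2)
                child = (x1, y2)                           # simple crossover
                if random.random() < 0.3:                  # mutation on x
                    child = (child[0] + random.randint(-50, 50), child[1])
                children.append(child)
            pop = parents + children
        return max(pop, key=fitness)

    best = evolve()
    print(best, program_under_test(*best))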

Maragathavalli, P

2011-01-01

58

Beyond the 2nd Fermi Pulsar Catalog  

CERN Document Server

Over thirteen times more gamma-ray pulsars have now been studied with the Large Area Telescope on NASA's Fermi satellite than the ten seen with the Compton Gamma-Ray Observatory in the nineteen-nineties. The large sample is diverse, allowing better understanding both of the pulsars themselves and of their roles in various cosmic processes. Here we explore the prospects for even more gamma-ray pulsars as Fermi enters the 2nd half of its nominal ten-year mission. New pulsars will naturally tend to be fainter than the first ones discovered. Some of them will have unusual characteristics compared to the current population, which may help discriminate between models. We illustrate a vision of the future with a sample of six pulsars discovered after the 2nd Fermi Pulsar Catalog was written.

Hou, Xian; Reposeur, Thierry; Rousseau, Romain

2013-01-01

59

Computer Software Configuration Item-Specific Flight Software Image Transfer Script Generator  

Science.gov (United States)

A K-shell UNIX script enables the International Space Station (ISS) Flight Control Team (FCT) operators in NASA s Mission Control Center (MCC) in Houston to transfer an entire or partial computer software configuration item (CSCI) from a flight software compact disk (CD) to the onboard Portable Computer System (PCS). The tool is designed to read the content stored on a flight software CD and generate individual CSCI transfer scripts that are capable of transferring the flight software content in a given subdirectory on the CD to the scratch directory on the PCS. The flight control team can then transfer the flight software from the PCS scratch directory to the Electronically Erasable Programmable Read Only Memory (EEPROM) of an ISS Multiplexer/ Demultiplexer (MDM) via the Indirect File Transfer capability. The individual CSCI scripts and the CSCI Specific Flight Software Image Transfer Script Generator (CFITSG), when executed a second time, will remove all components from their original execution. The tool will identify errors in the transfer process and create logs of the transferred software for the purposes of configuration management.

Bolen, Kenny; Greenlaw, Ronald

2010-01-01

60

Newton's 2nd Law: Inquiry Approach  

Science.gov (United States)

In this lab activity, learners act as fellow scientists and colleagues of Isaac Newton. He has asked them to independently test his ideas on the nature of motion, in particular his 2nd Law. The emphasis here is on the process of science rather than the actual results. Learners can use the Science Flowchart to trace and discuss their process. Time estimate and materials are given for learners to run their designed experiments.
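
For example (illustrative numbers only, not part of the published activity), one prediction learners could test with a cart and a spring scale:

    # Newton's 2nd law, F_net = m * a: predict the acceleration of a 0.50 kg cart
    # pulled by a constant net force of 1.0 N (illustrative numbers only).
    m = 0.50          # kg
    f_net = 1.0       # N
    a = f_net / m     # m/s^2
    print(f"predicted acceleration: {a:.1f} m/s^2")   # 2.0 m/s^2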

Cecilia Tung

2010-01-01

 
 
 
 
61

2nd International Arctic Ungulate Conference  

Digital Repository Infrastructure Vision for European Research (DRIVER)

The 2nd International Arctic Ungulate Conference was held 13-17 August 1995 on the University of Alaska Fairbanks campus. The Institute of Arctic Biology and the Alaska Cooperative Fish and Wildlife Research Unit were responsible for organizing the conference with assistance from biologists with state and federal agencies and commercial organizations. David R. Klein was chair of the conference organizing committee. Over 200 people attended the conference, coming from 10 different countries. T...

Anonymous, A.

1996-01-01

62

A code generation framework for the ALMA common software  

Science.gov (United States)

Code generation helps in smoothing the learning curve of a complex application framework and in reducing the number of Lines Of Code (LOC) that a developer needs to craft. The ALMA Common Software (ACS) has adopted code generation in specific areas, but we are now exploiting the more comprehensive approach of Model Driven code generation to directly transform a UML Model into a full implementation in the ACS framework. This approach makes it easier for newcomers to grasp the principles of the framework. Moreover, a lower handcrafted LOC reduces the error rate. Additional benefits achieved by model driven code generation are: software reuse, implicit application of design patterns and automatic test generation. A model driven approach to design also makes it possible to use the same model with different frameworks, by generating for different targets. The generation framework presented in this paper uses openArchitectureWare as the model-to-text translator. OpenArchitectureWare provides a powerful functional language that makes it easier to implement the correct mapping of data types, the main difficulty encountered in the translation process. The output is an ACS application readily usable by the developer, including the necessary deployment configuration, thus minimizing any configuration burden during testing. The specific application code is implemented by extending generated classes. Therefore, generated and manually crafted code are kept apart, simplifying the code generation process and aiding the developers by keeping a clean logical separation between the two. Our first results show that code generation dramatically improves code productivity.
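
ACS uses UML models and openArchitectureWare; purely to illustrate the generated-versus-handcrafted split described above, the following Python sketch (invented model format and names, not the ACS tooling) renders a base class from a small model and leaves the application-specific code to a handwritten subclass.

    # Illustration of the "extend generated classes" pattern; the model format,
    # names and target language here are invented, not those of ACS/openArchitectureWare.
    MODEL = {
        "component": "LampController",
        "properties": [("brightness", "float"), ("is_on", "bool")],
    }

    def generate_base_class(model):
        lines = [f"class {model['component']}Base:",
                 '    """GENERATED CODE - do not edit; extend in a subclass."""']
        for name, typ in model["properties"]:
            lines += [f"    def get_{name}(self) -> {typ}:",
                      f"        return self._{name}",
                      f"    def set_{name}(self, value: {typ}) -> None:",
                      f"        self._{name} = value"]
        return "\n".join(lines)

    generated_source = generate_base_class(MODEL)
    namespace = {}
    exec(generated_source, namespace)                 # stand-in for writing a .py file

    class LampController(namespace["LampControllerBase"]):   # handcrafted part
        def switch_on(self):
            self.set_is_on(True)
            self.set_brightness(1.0)

    lamp = LampController()
    lamp.switch_on()
    print(lamp.get_brightness())                      # 1.0

Keeping the generated base class separate from the handwritten subclass mirrors the clean logical separation between generated and manually crafted code described in the abstract.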

Troncoso, Nicolás; von Brand, Horst H.; Ibsen, Jorge; Mora, Matias; Gonzalez, Victor; Chiozzi, Gianluca; Jeram, Bogdan; Sommer, Heiko; Zamora, Gabriel; Tejeda, Alexis

2010-07-01

63

Generation of Embedded Hardware/Software from SystemC  

Directory of Open Access Journals (Sweden)

Full Text Available Designers increasingly rely on reusing intellectual property (IP) and on raising the level of abstraction to respect system-on-chip (SoC) market characteristics. However, most hardware and embedded software code is recoded manually from system level. This recoding step often results in new coding errors that must be identified and debugged. Thus, shorter time-to-market requires automation of the system synthesis from high-level specifications. In this paper, we propose a design flow intended to reduce the SoC design cost. This design flow unifies hardware and software using a single high-level language. It integrates hardware/software (HW/SW) generation tools and an automatic interface synthesis through a custom library of adapters. We have validated our interface synthesis approach on a hardware producer/consumer case study and on the design of a given software radiocommunication application.

Ouadjaout Salim

2006-01-01

64

Overview of the next generation of Fermilab collider software  

International Nuclear Information System (INIS)

Fermilab is entering an era of operating a more complex collider facility. In addition, new operator workstations are available that have increased capabilities. The task of providing updated software in this new environment precipitated a project called Colliding Beam Software (CBS). It was soon evident that a new approach was needed for developing console software. Hence CBS, although a common acronym, is too narrow a description. A new generation of the application program subroutine library has been created to enhance the existing programming environment with a set of value added tools. Several key Collider applications were written that exploit CBS tools. This paper will discuss the new tools and the underlying change in methodology in application program development for accelerator control at Fermilab. (author)

65

Community Hurricane Preparedness, 2nd Edition  

Science.gov (United States)

The purpose of this course is to provide emergency managers who face threats from tropical cyclones and hurricanes with basic information about: how tropical cyclones form; the hazards they pose; how the NWS forecasts future hurricane behavior; and what tools and guiding principles can help emergency managers prepare their communities. The course is not intended to take the place of courses sponsored by FEMA, the National Hurricane Center, and/or state agencies. However, it will provide a good background for those who either plan to attend those courses or cannot attend them. The original module was published in 2000. This 2nd edition provides updated information on hurricane science and National Weather Service forecast products. In addition, a new section on Emergency Management discusses decision-making tools that can help emergency managers in response and evacuation decision-making during hurricane threats. This module is course number IS-324.a in FEMA's Emergency Management Institute's Independent Study catalog.

2014-09-14

66

2nd International Conference on Pattern Recognition  

CERN Document Server

This book contains the extended and revised versions of a set of selected papers from the 2nd International Conference on Pattern Recognition (ICPRAM 2013), held in Barcelona, Spain, from 15 to 18 February, 2013. ICPRAM was organized by the Institute for Systems and Technologies of Information, Control and Communication (INSTICC) and was held in cooperation with the Association for the Advancement of Artificial Intelligence (AAAI). The hallmark of this conference was to encourage theory and practice to meet in a single venue. The focus of the book is on contributions describing applications of Pattern Recognition techniques to real-world problems, interdisciplinary research, experimental and/or theoretical studies yielding new insights that advance Pattern Recognition methods.

Marsico, Maria

2015-01-01

67

BOOK REVIEW: Experimental Stochatics (2nd edition)  

Science.gov (United States)

Otto Moeschlin and his co-authors have written a book about simulation of stochastic systems. The book comes with a CD-ROM that contains the experiments discussed in the book, and the text from the book is repeated on the CD-ROM. According to the authors, the aim of the book is to give a quick introduction to stochastic simulation for `all persons interested in experimental stochastics'. To please this diverse audience, the authors offer a book that has four parts. Part 1, called `Artificial Randomness', is the longest of the four parts. It gives an overview of the generation, testing and basic usage of pseudo random numbers in simulation. Although algorithms for generating sequences of random numbers are fundamental to simulation, it is a slightly unusual choice to give it such weight in comparison to other algorithmic topics. The remaining three parts consist of simulation case studies. Part 2, `Stochastic Models', treats four problems---Buffon's needle, a queuing system, and two problems related to the kinetic theory of gases. Part 3 is called `Stochastic Processes' and discusses the simulation of discrete time Markov chains, birth--death processes, Brownian motion and diffusions. The last section of Part 3 is about simulation as a tool to understand the traffic flow in a system controlled by stoplights, an area of research for the authors. Part 4 is called `Evaluation of Statistical Procedures'. This section contains examples where simulation is used to test the performance of statistical methods. It covers four examples: the Neymann--Pearson lemma, the Wald sequential test, Bayesian point estimation and Hartigan procedures. The CD-ROM contains an easy-to-install software package that runs under Microsoft Windows. The software contains the text and simulations from the book. What I found most enjoyable about this book is the number of topics covered in the case studies. The highly individual selection of applications, which may serve as a source of inspiration for teachers of computational stochastic methods, is the main contribution of this electronic monograph. However, both the book and software suffer from several severe problems. Firstly, I feel that the structure of the text is weak. Probably this is partly the result of the text from the CD-ROM being put into a book format, but the short paragraphs and poorly structured sentences destroy the reading experience. Secondly, although the software is functional, I believe that, like me, many users will be disappointed by the quality of the user interface and the visualizations. The opportunities to interact with the simulations are limited. Thirdly, the presentation is slightly old fashioned and lacking in pedagogical structure. For example, flow charts and Pascal programs are used to present algorithms. To conclude, I am surprised that this electronic monograph warranted a second edition in this form. Teachers may find the examples useful as a starting point, but students and researchers are advised to look elsewhere. JG was supported by KBN grant no 2 P03A 020 24.

Wiberg, P.

2004-05-01

68

Optimized generation of high resolution breast anthropomorphic software phantoms  

Energy Technology Data Exchange (ETDEWEB)

Purpose: The authors present an efficient method for generating anthropomorphic software breast phantoms with high spatial resolution. Employing the same region growing principles as in their previous algorithm for breast anatomy simulation, the present method has been optimized for computational complexity to allow for fast generation of the large number of phantoms required in virtual clinical trials of breast imaging. Methods: The new breast anatomy simulation method performs a direct calculation of the Cooper's ligaments (i.e., the borders between simulated adipose compartments). The calculation corresponds to quadratic decision boundaries of a maximum a posteriori classifier. The method is multiscale due to the use of octree-based recursive partitioning of the phantom volume. The method also provides user-control of the thickness of the simulated Cooper's ligaments and skin. Results: Using the proposed method, the authors have generated phantoms with voxel size in the range of (25-1000 µm)³/voxel. The power regression of the simulation time as a function of the reciprocal voxel size yielded a log-log slope of 1.95 (compared to a slope of 4.53 of our previous region growing algorithm). Conclusions: A new algorithm for computer simulation of breast anatomy has been proposed that allows for fast generation of high resolution anthropomorphic software phantoms.
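
The quoted log-log slope is an ordinary least-squares fit in log space of simulation time against reciprocal voxel size; a small sketch of that kind of fit with fabricated, stand-in timing numbers (not the authors' measurements):

    import numpy as np

    # Fabricated example data standing in for measured generation times:
    # voxel sizes in micrometres and simulation times in seconds.
    voxel_um = np.array([1000.0, 500.0, 250.0, 100.0, 50.0, 25.0])
    time_s   = np.array([0.4, 1.5, 6.0, 38.0, 150.0, 600.0])

    # Fit time ~ C * (1/voxel)^slope  <=>  log(time) = slope*log(1/voxel) + log(C)
    slope, log_c = np.polyfit(np.log(1.0 / voxel_um), np.log(time_s), 1)
    print(f"log-log slope: {slope:.2f}")   # the paper reports ~1.95 for the new method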

Pokrajac, David D.; Maidment, Andrew D. A.; Bakic, Predrag R. [Computer and Information Sciences Department, Delaware State University, Dover, Delaware 19901 (United States); Department of Radiology, University of Pennsylvania, Philadelphia, Pennsylvania 19104 (United States)

2012-04-15

69

Optimized generation of high resolution breast anthropomorphic software phantoms  

International Nuclear Information System (INIS)

Purpose: The authors present an efficient method for generating anthropomorphic software breast phantoms with high spatial resolution. Employing the same region growing principles as in their previous algorithm for breast anatomy simulation, the present method has been optimized for computational complexity to allow for fast generation of the large number of phantoms required in virtual clinical trials of breast imaging. Methods: The new breast anatomy simulation method performs a direct calculation of the Cooper's ligaments (i.e., the borders between simulated adipose compartments). The calculation corresponds to quadratic decision boundaries of a maximum a posteriori classifier. The method is multiscale due to the use of octree-based recursive partitioning of the phantom volume. The method also provides user-control of the thickness of the simulated Cooper's ligaments and skin. Results: Using the proposed method, the authors have generated phantoms with voxel size in the range of (25-1000 µm)³/voxel. The power regression of the simulation time as a function of the reciprocal voxel size yielded a log-log slope of 1.95 (compared to a slope of 4.53 of our previous region growing algorithm). Conclusions: A new algorithm for computer simulation of breast anatomy has been proposed that allows for fast generation of high resolution anthropomorphic software phantoms.

70

Input language in software generation system for film data processing  

International Nuclear Information System (INIS)

The input language constructed in the course of developing the software generation system for film data processing at JINR is described. The language has made it possible to considerably simplify work within the system framework and to standardize the calling of any program package included in the system. The input language is intended for non-programmer users. A detailed description of the main directives of the language, as well as their parameters, is presented. The directives are of three types: input/output, descriptive and initiation. A particular feature of the language is that it can be used both in batch and in interactive mode. This has been realized by means of the capabilities of the CCL language provided by the CDC-6500 system software

71

2nd International Arctic Ungulate Conference  

Directory of Open Access Journals (Sweden)

Full Text Available The 2nd International Arctic Ungulate Conference was held 13-17 August 1995 on the University of Alaska Fairbanks campus. The Institute of Arctic Biology and the Alaska Cooperative Fish and Wildlife Research Unit were responsible for organizing the conference with assistance from biologists with state and federal agencies and commercial organizations. David R. Klein was chair of the conference organizing committee. Over 200 people attended the conference, coming from 10 different countries. The United States, Canada, and Norway had the largest representation. The conference included invited lectures; panel discussions, and about 125 contributed papers. There were five technical sessions on Physiology and Body Condition; Habitat Relationships; Population Dynamics and Management; Behavior, Genetics and Evolution; and Reindeer and Muskox Husbandry. Three panel sessions discussed Comparative caribou management strategies; Management of introduced, reestablished, and expanding muskox populations; and Health risks in translocation of arctic ungulates. Invited lectures focused on the physiology and population dynamics of arctic ungulates; contaminants in food chains of arctic ungulates and lessons learned from the Chernobyl accident; and ecosystem level relationships of the Porcupine Caribou Herd.

A. Anonymous

1996-01-01

72

Curvature of 2nd type induced on plane distribution  

Directory of Open Access Journals (Sweden)

Full Text Available In a multidimensional projective space, a plane distribution is considered. The curvature of a group connection of the 2nd type, induced by a composite clothing of the plane distribution, is constructed. It is proved that the immovability of Cartan's plane and Bortolotti's hyperplane in the case of a holonomic distribution implies the vanishing of the curvature tensor of the 2nd type.

Omelyan O.

2014-11-01

73

Software para la Evaluación y Selección de Generadores Independientes / Independent Generator Evaluation and Selection Software  

Scientific Electronic Library Online (English)

Full Text Available SciELO Cuba | Language: Spanish Abstract in Spanish (translated to English): In many industries, buildings, companies and services it is necessary, for production, emergency or reliability reasons, to generate electric power independently of the National Electric System. In other cases, power generation plants are needed that can operate in an isolated [...] manner, feeding a given circuit. To make the most economical and efficient selection of the capacity to be installed, one must take into consideration not only the behaviour of the system into which the unit will be inserted, but also the operational characteristics of the generator under load fluctuations, its operational limits and the resulting stability. This work presents software that allows the most suitable selection to be made, considering the capability curve and the stability of the generator with respect to the particularities of the system. With its application it is possible to reduce the expenses and economic losses caused by an inappropriate selection. As a case study, its application to a plant in a food packaging factory is presented. Abstract in English: In many industries, buildings and services it is necessary to employ independent power plants to supply electric power to a particular system. In other cases, islanded operation is desired in several specific situations. In order to make the most efficient, economical and adequate selection of [...] the generator capacity, it is necessary to consider not only the system's load characteristics, but also the capability limits and the resulting stability of the power generator. This paper presents software that allows the user to select an adequate generator and shows its operating limits and the resulting stability under fluctuating load conditions. With its application it is possible to reduce the economic losses and costs caused by an inappropriate generator selection with an oversized or underutilized machine. As a case study, a 2500 kVA power plant is analyzed.

Marcos Alberto, de Armas Teyra; Miguel, Castro Fernández.

2014-04-01

74

2nd International technical meeting on small reactors  

Energy Technology Data Exchange (ETDEWEB)

The 2nd International Technical Meeting on Small Reactors was held on November 7-9, 2012 in Ottawa, Ontario. The meeting was hosted by Atomic Energy of Canada Limited (AECL) and Canadian Nuclear Society (CNS). There is growing international interest and activity in the development of small nuclear reactor technology. This meeting provided participants with an opportunity to share ideas and exchange information on new developments. This Technical Meeting covered topics of interest to designers, operators, researchers and analysts involved in the design, development and deployment of small reactors for power generation and research. A special session focussed on small modular reactors (SMR) for generating electricity and process heat, particularly in small grids and remote locations. Following the success of the first Technical Meeting in November 2010, which captured numerous accomplishments of low-power critical facilities and small reactors, the second Technical Meeting was dedicated to the achievements, capabilities, and future prospects of small reactors. This meeting also celebrated the 50th Anniversary of the Nuclear Power Demonstration (NPD) reactor which was the first small reactor (20 MWe) to generate electricity in Canada.

NONE

2013-07-01

75

2nd International technical meeting on small reactors  

International Nuclear Information System (INIS)

The 2nd International Technical Meeting on Small Reactors was held on November 7-9, 2012 in Ottawa, Ontario. The meeting was hosted by Atomic Energy of Canada Limited (AECL) and Canadian Nuclear Society (CNS). There is growing international interest and activity in the development of small nuclear reactor technology. This meeting provided participants with an opportunity to share ideas and exchange information on new developments. This Technical Meeting covered topics of interest to designers, operators, researchers and analysts involved in the design, development and deployment of small reactors for power generation and research. A special session focussed on small modular reactors (SMR) for generating electricity and process heat, particularly in small grids and remote locations. Following the success of the first Technical Meeting in November 2010, which captured numerous accomplishments of low-power critical facilities and small reactors, the second Technical Meeting was dedicated to the achievements, capabilities, and future prospects of small reactors. This meeting also celebrated the 50th Anniversary of the Nuclear Power Demonstration (NPD) reactor which was the first small reactor (20 MWe) to generate electricity in Canada.

76

Effective Test Case Generation Using Antirandom Software Testing  

Directory of Open Access Journals (Sweden)

Full Text Available Random Testing is a primary technique for software testing. Antirandom Testing improves the fault-detection capability of Random Testing by employing the location information of previously executed test cases. Antirandom Testing selects each test case such that it is as different as possible from all the previously executed test cases. The implementation is illustrated using basic examples. Moreover, compared with Random Testing, test cases generated in Antirandom Testing are more evenly spread across the input domain. Antirandom Testing has conventionally been applied to programs that have only numerical input types, because the distance between numerical inputs is readily measurable. The vast majority of research involves distance techniques for generating the antirandom test cases. Different from these techniques, we focus on the concrete values of program inputs by proposing a new method to generate the antirandom test cases. The proposed method enables Antirandom Testing to be applied to all kinds of programs regardless of their input types. Empirical studies are further conducted for comparison, and an evaluation of the effectiveness of these methods is also presented. Experimental results show that, compared with random and Hamming distance techniques, the proposed method significantly reduces the number of test cases required to detect the first failure. Overall, the proposed antirandom testing gives higher fault coverage than antirandom testing with the Hamming distance method, which in turn gives higher fault coverage than pure random testing.
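
A minimal sketch of the distance-based variant referred to above (total Hamming distance over binary inputs; the paper's own value-based method for arbitrary input types is different): each new test case is the candidate farthest, in total distance, from all previously executed ones.

    from itertools import product

    def hamming(a, b):
        return sum(x != y for x, y in zip(a, b))

    def antirandom_suite(n_bits=4, n_tests=6):
        candidates = list(product((0, 1), repeat=n_bits))
        suite = [candidates[0]]                     # arbitrary first test case
        while len(suite) < n_tests:
            # pick the candidate maximising total distance to all executed tests
            best = max((c for c in candidates if c not in suite),
                       key=lambda c: sum(hamming(c, t) for t in suite))
            suite.append(best)
        return suite

    for t in antirandom_suite():
        print(t)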

Kulvinder Singh,

2010-11-01

77

A NEO population generation and observation simulation software tool  

Science.gov (United States)

One of the main targets of ESA's Space Situational Awareness (SSA) program is to build a wide knowledge base about objects that can potentially harm Earth (Near-Earth Objects, NEOs). An important part of this effort is to create the Small Bodies Data Centre (SBDC) which is going to aggregate measurement data from a fully-integrated NEO observation sensor network. Until this network is developed, artificial NEO measurement data is needed in order to validate SBDC algorithms. Moreover, to establish a functioning NEO observation sensor network, it has to be determined where to place sensors, what technical requirements have to be met in order to be able to detect NEOs and which observation strategies work the best. Because of this, a sensor simulation software was needed. This paper presents a software tool which allows users to create and analyse NEO populations and to simulate and analyse population observations. It is a console program written in Fortran and comes with a Graphical User Interface (GUI) written in Java and C. The tool can be divided into the components ``Population Generator'' and ``Observation Simulator''. The Population Generator component is responsible for generating and analysing a NEO population. Users can choose between creating fictitious (random) and synthetic populations. The latter are based on one of two models describing the orbital and size distribution of observed NEOs: the existing so-called ``Bottke Model'' (Bottke et al. 2000, 2002) and the new ``Granvik Model'' (Granvik et al. 2014, in preparation) which has been developed in parallel to the tool. Generated populations can be analysed by defining 2D, 3D and scatter plots using various NEO attributes. As a result, the tool creates the appropriate files for the plotting tool ``gnuplot''. The tool's Observation Simulator component provides the Observation Simulation and Observation Analysis functions. Users can define sensor systems using ground- or space-based locations as well as optical or radar sensors and simulate observation campaigns. The tool outputs field-of-view crossings and actual detections of the selected NEO population objects. Using the Observation Analysis, users are able to process and plot the results of the Observation Simulation. In order to enable end-users to handle the tool in a user-intuitive and comfortable way, a GUI has been created based on the modular Eclipse Rich Client Platform (RCP) technology. Through the GUI users can easily enter input data for the tool, execute it and view its output data in a clear way. Additionally, the GUI runs gnuplot to create plot pictures and presents them to the user. Furthermore, users can create projects to organise executions of the tool.
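
The tool itself is Fortran with a Java/C GUI and is not reproduced here; the following Python sketch only illustrates the ``fictitious (random) population'' idea, drawing orbital elements uniformly from invented, roughly NEO-like ranges (not the Bottke or Granvik model) and writing a gnuplot-friendly table.

    import random

    def fictitious_neo_population(n=1000, seed=42):
        """Uniformly drawn orbital elements in roughly NEO-like ranges
        (illustrative ranges only; a real generator would also enforce the
        NEO condition q = a*(1 - e) < 1.3 AU)."""
        rng = random.Random(seed)
        pop = []
        for i in range(n):
            pop.append({
                "id": i,
                "a_au": rng.uniform(0.5, 4.0),      # semi-major axis
                "e": rng.uniform(0.0, 0.9),         # eccentricity
                "i_deg": rng.uniform(0.0, 40.0),    # inclination
                "h_mag": rng.uniform(15.0, 28.0),   # absolute magnitude (size proxy)
            })
        return pop

    # dump a gnuplot-friendly table: semi-major axis vs eccentricity
    with open("population_a_e.dat", "w") as f:
        for obj in fictitious_neo_population():
            f.write(f"{obj['a_au']:.3f} {obj['e']:.3f}\n")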

Müller, Sven; Gelhaus, Johannes; Hahn, Gerhard; Franco, Raffaella

78

Elements of the Next Generation Science Standards' (NGSS) New Framework for K-12 Science Education aligned with STEM designed projects created by Kindergarten, 1st and 2nd grade students in a Reggio Emilio project approach setting  

Science.gov (United States)

This paper examines how elements of the Next Generation Science Standards' (NGSS) New Framework for K-12 Science Education standards (National Research Council 2011)---specifically the cross-cutting concept "cause and effect" are aligned with early childhood students' creation of projects of their choice. The study took place in a Reggio Emilio-inspired, K-12 school, in a multi-aged kindergarten, first and second grade classroom with 14 students. Students worked on their projects independently with the assistance of their peers and teachers. The students' projects and the alignment with the Next Generation Science Standards' New Framework were analyzed by using pre and post assessments, student interviews, and discourse analysis. Results indicate that elements of the New Framework for K-12 Science Education emerged through students' project presentation, particularly regarding the notion of "cause and effect". More specifically, results show that initially students perceived the relationship between "cause and effect" to be negative.

Facchini, Nicole

79

Test case generation for On Board Computer Software Major Components  

Directory of Open Access Journals (Sweden)

Full Text Available PES Institute of Technology, along with five other institutions, is developing a student imaging satellite. In imaging satellite development, software implementation plays an important role. The On Board Computer (OBC) is the unit which runs the satellite software. The satellite software has sub-components such as telemetry, control modes, data processing, actuator and telecommand. Design, development and testing of these components have been completed successfully. Testing plays a very important role in assuring the functionality of the software components. In this paper, we present the testing methodology that we used, along with the main software components that we tested for the on-board computer. These test cases are used to assure the functional correctness of the control modes, actuator, telecommand, telemetry and data processing components of the on-board computer.

Dinesha H A

2013-12-01

80

JNCI 92#3/2nd pages  

Science.gov (United States)

New Guidelines to Evaluate the Response to Treatment in Solid Tumors. Patrick Therasse, Susan G. Arbuck, Elizabeth A. Eisenhauer, Jantien Wanders, Richard S. Kaplan, Larry Rubinstein, Jaap Verweij, Martine Van Glabbeke, Allan T. van Oosterom, Michaele C. Christian, Steve G. Gwyther. Anticancer cytotoxic agents go through a process by which their antitumor activity, on the basis of the amount of tumor shrinkage they could generate, has been investigated.

 
 
 
 
81

Safety profile of bilastine: 2nd generation H1-antihistamines.  

Science.gov (United States)

Bilastine is a new H1 antagonist with no sedative side effects, no cardiotoxic effects, and no hepatic metabolism. In addition, bilastine has proved to be effective for the symptomatic treatment of allergic rhinoconjunctivitis and urticaria. Pharmacological studies have shown that bilastine is highly selective for the H1 receptor in both in vivo and in vitro studies, with no apparent affinity for other receptors. The absorption of bilastine is fast, linear and dose-proportional; it appears to be safe and well tolerated at all dose levels in the healthy population. Multiple administration of bilastine has confirmed the linearity of the kinetic parameters. The distribution in the brain is undetectable. The safety profile in terms of adverse effects is very similar to placebo in all Phase I, II and III clinical trials. Bilastine (20 mg), unlike cetirizine, does not increase the effects of alcohol on the CNS. Bilastine 20 mg does not increase the CNS depressant effect of lorazepam. Bilastine 20 mg is similar to placebo in the driving test. Therefore, it meets the current criteria for medication used in the treatment of allergic rhinitis and urticaria. PMID:23242729

Scaglione, F

2012-12-01

82

2nd generation of RSL’s spectrum database SPECCHIO  

Digital Repository Infrastructure Vision for European Research (DRIVER)

The organised storage of spectral data described by accompanying metadata is important for long-term use and data sharing with other scientists. The recently redesigned SPECCHIO system acts as a repository for spectral field campaign and reference signatures. An analysis of the metadata space has resulted in a non-redundant relational data model and efficient graphical user interfaces with underlying processing mechanisms minimizing the required user interaction during data capture. Data retrieval i...

Hueni, A.; Nieke, J.; Schopfer, Jürg; Kneubühler, Mathias; Itten, Klaus I.

2007-01-01

83

Generation and Optimization of Test cases for Object-Oriented Software Using State Chart Diagram  

CERN Document Server

The process of testing any software system is an enormous task which is time consuming and costly. The time and required effort to do sufficient testing grow, as the size and complexity of the software grows, which may cause overrun of the project budget, delay in the development of software system or some test cases may not be covered. During SDLC (software development life cycle), generally the software testing phase takes around 40-70% of the time and cost. State-based testing is frequently used in software testing. Test data generation is one of the key issues in software testing. A properly generated test suite may not only locate the errors in a software system, but also help in reducing the high cost associated with software testing. It is often desired that test data in the form of test sequences within a test suite can be automatically generated to achieve required test coverage. This paper proposes an optimization approach to test data generation for the state-based software testing. In this paper, ...

Swain, Ranjita Kumari; Mohapatra, Durga Prasad

2012-01-01

84

S-Cube: Enabling the Next Generation of Software Services  

Science.gov (United States)

The Service Oriented Architecture (SOA) paradigm is increasingly adopted by industry for building distributed software systems. However, when designing, developing and operating innovative software services and service-based systems, several challenges exist. Those challenges include how to manage the complexity of those systems, how to establish, monitor and enforce Quality of Service (QoS) and Service Level Agreements (SLAs), as well as how to build those systems such that they can proactively adapt to dynamically changing requirements and context conditions. Developing foundational solutions for those challenges requires joint efforts of different research communities such as Business Process Management, Grid Computing, Service Oriented Computing and Software Engineering. This paper provides an overview of S-Cube, the European Network of Excellence on Software Services and Systems. S-Cube brings together researchers from leading research institutions across Europe, who join their competences to develop foundations, theories as well as methods and tools for future service-based systems.

Metzger, Andreas; Pohl, Klaus

85

Thermoluminescent characteristics of ZrO2:Nd films  

International Nuclear Information System (INIS)

In this work, the results obtained from analysing the photoluminescent and thermoluminescent characteristics of neodymium-activated zirconium oxide (ZrO2:Nd) are presented, together with its possible application in UV radiation dosimetry. The experiments were aimed at studying characteristics such as the optimal thermal erasing treatment, the influence of light on the response, the response as a function of wavelength, the fading of the stored information, the temperature effect, the response as a function of time and the reproducibility of the response. The results show that ZrO2:Nd is a promising material for use as a TL dosimeter for UV radiation. (Author)

86

The 2nd reactor core of the NS Otto Hahn  

International Nuclear Information System (INIS)

Details of the design of the 2nd reactor core are given, followed by a brief report summarising the operating experience gained with this 2nd core, as well as by an evaluation of measured data and statements concerning the usefulness of the knowledge gained for the development of future reactor cores. Quite a number of these data have been used to improve the concept and thus the specifications for the fuel elements of the 3rd core of the reactor of the NS Otto Hahn. (orig./HP)

87

GelClust: a software tool for gel electrophoresis images analysis and dendrogram generation.  

Science.gov (United States)

This paper presents GelClust, a new software tool designed for processing gel electrophoresis images and generating the corresponding phylogenetic trees. Unlike most commercial and non-commercial related software, we found that GelClust is very user-friendly and guides the user from image to dendrogram through seven simple steps. Furthermore, the software, which is implemented in the C# programming language under the Windows operating system, is more accurate than similar software with regard to image processing and is the only software able to detect and correct gel 'smile' effects completely automatically. These claims are supported with experiments. PMID:23727299
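
GelClust is a C#/Windows application; as a hypothetical illustration of the lane-profile-to-dendrogram step it automates, the sketch below clusters made-up binary band-presence patterns with SciPy and draws the tree.

    import numpy as np
    from scipy.cluster.hierarchy import linkage, dendrogram
    from scipy.spatial.distance import pdist
    import matplotlib.pyplot as plt

    # Hypothetical band-presence matrix: one row per gel lane, one column per band position.
    lanes = ["isolate A", "isolate B", "isolate C", "isolate D"]
    bands = np.array([
        [1, 0, 1, 1, 0, 1],
        [1, 0, 1, 1, 0, 0],
        [0, 1, 0, 1, 1, 1],
        [0, 1, 0, 1, 1, 0],
    ])

    # UPGMA (average linkage) on a Dice distance is a common choice for gel band data.
    dist = pdist(bands.astype(bool), metric="dice")
    tree = linkage(dist, method="average")

    dendrogram(tree, labels=lanes)
    plt.tight_layout()
    plt.savefig("dendrogram.png")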

Khakabimamaghani, Sahand; Najafi, Ali; Ranjbar, Reza; Raam, Monireh

2013-08-01

88

Basis Principles of Software Development for Eddy Current Inspection of PWR/WWER Steam Generator Tubes  

International Nuclear Information System (INIS)

Extensive inspection of PWR/WWER steam generators, together with the development of in-house designs of eddy current inspection systems including manipulators, push-pullers, controllers, probes, etc., influenced INETEC's decision to start developing its own software for EC inspections. In the last year remarkable results were obtained: the main software packages were completed, with more capabilities than other software available on the world market. This article describes some basic principles of EC NDT software development, including organizational aspects of the software team, a description of the tasks and a description of the main achievements. Associated problems and future development directions are also discussed. (author)

89

CLIP: Bridging Pipelines to Instrument Control Software  

Science.gov (United States)

ESO develops in collaboration with the instrument consortia the instrument control software and data processing pipelines for the VLT and VLTI instruments. As astronomical instruments become more complex, there is a growing need for more comprehensive data processing to be performed directly by the instrument control software. The Common Library for Image Processing (CLIP) has been developed as a flexible and scalable framework to provide an interface to pipeline data processing capabilities for the instrument acquisition process. This paper gives an overview of the design of the CLIP and how it copes with the constraints and requirements of the 2nd Generation Instruments at the ESO Very Large Telescope.

Ballester, P.; Biereichel, P.; Kaufer, A.; Kiekebusch, M.; Lorch, H.

2008-08-01

90

Application of a path sensitizing method on automated generation of test specifications for control software  

International Nuclear Information System (INIS)

An automated generation method for test specifications has been developed for sequential control software in plant control equipment. Sequential control software can be represented as sequential circuits. The control software implemented in control equipment is designed from these circuit diagrams. In logic tests of VLSIs, path sensitizing methods are widely used to generate test specifications. However, such methods generate test specifications for a single point in time only and cannot be directly applied to sequential control software. The basic idea of the proposed method is as follows. Specifications of each logic operator in the diagrams are defined in the software design process. Therefore, test specifications of each operator in the control software can be determined from these specifications, and the validity of the software can be judged by inspecting all of the operators in the logic circuit diagrams. Candidates for sensitized paths, on which test data for each operator propagates, can be generated by the path sensitizing method. To confirm the feasibility of the method, it was experimentally applied to control software in digital control equipment. The program could generate test specifications exactly, and the feasibility of the method was confirmed. (orig.) (3 refs., 7 figs.)
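
The core idea of sensitizing a path, choosing inputs so that a change at one internal operator propagates to an observable output, can be shown on a tiny combinational circuit. The gate netlist and the brute-force search below are illustrative assumptions only, not the automated method of the paper (which additionally handles the sequential, multi-cycle case).

```python
from itertools import product

# Tiny netlist: each internal signal is computed from inputs or earlier signals.
def evaluate(a, b, c, flip_g1=False):
    g1 = a and b            # AND gate under test
    if flip_g1:
        g1 = not g1         # inject a change at g1's output
    g2 = g1 or c            # OR gate
    out = not g2            # inverter driving the observable output
    return out

# An input vector sensitizes the path g1 -> g2 -> out if flipping g1 changes out.
sensitizing = [
    (a, b, c)
    for a, b, c in product([0, 1], repeat=3)
    if evaluate(a, b, c) != evaluate(a, b, c, flip_g1=True)
]
print("test vectors sensitizing the path through g1:", sensitizing)
```

Here every vector with c = 0 sensitizes the path, because c = 0 is the non-controlling value of the OR gate; that is exactly the reasoning a path sensitizing method automates.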

91

2nd International Conference on Data Management Technologies and Applications  

CERN Document Server

The 2nd International Conference on Data Management Technologies and Applications (DATA) aims to bring together researchers, engineers and practitioners interested in databases, data warehousing, data mining, data management, data security and other aspects of information systems and technology involving advanced applications of data.

2013-01-01

92

SEMANTIC WEB-BASED SOFTWARE ENGINEERING BY AUTOMATED REQUIREMENTS ONTOLOGY GENERATION IN SOA  

Digital Repository Infrastructure Vision for European Research (DRIVER)

This paper presents an approach for the automated generation of a requirements ontology using UML diagrams in a service-oriented architecture (SOA). The goal of this paper is to facilitate software engineering processes such as software design, software reuse, service discovery, etc. The proposed method is based on four conceptual layers. The first layer includes requirements obtained from stakeholders; the second one designs service-oriented diagrams from the data in f...

Vahid Rastgoo; Monireh-Sadat Hosseini; Esmaeil Kheirkhah

2014-01-01

93

Open-source software for generating electrocardiogram signals  

CERN Document Server

ECGSYN, a dynamical model that faithfully reproduces the main features of the human electrocardiogram (ECG), including heart rate variability, RR intervals and QT intervals, is presented. Details of the underlying algorithm and an open-source software implementation in Matlab, C and Java are described. An example of how this model will facilitate comparisons of signal processing techniques is provided.
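
ECGSYN itself is distributed in Matlab, C and Java; the fragment below is only a much-simplified sketch in the same spirit, building one idealized beat from a sum of Gaussians placed at assumed P-Q-R-S-T phase angles rather than integrating the full dynamical model with heart rate variability.

```python
import numpy as np

fs = 250                      # sampling rate in Hz (assumed)
rr = 0.9                      # a single fixed RR interval in seconds (no HRV here)
t = np.arange(0, rr, 1.0 / fs)
theta = 2 * np.pi * t / rr - np.pi     # beat phase in (-pi, pi]

# Assumed (angle, amplitude, width) triples loosely mimicking P, Q, R, S, T waves.
waves = [(-np.pi / 3, 0.12, 0.25), (-np.pi / 12, -0.10, 0.10),
         (0.0, 1.00, 0.10), (np.pi / 12, -0.25, 0.10), (np.pi / 2, 0.35, 0.40)]

ecg = np.zeros_like(theta)
for ang, amp, width in waves:
    ecg += amp * np.exp(-((theta - ang) ** 2) / (2 * width ** 2))

print(f"{len(ecg)} samples, R-peak amplitude ~ {ecg.max():.2f} (arbitrary units)")
```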

McSharry, P E; Sharry, Patrick E. Mc; Cifford, Gari D.

2004-01-01

94

A software architecture for dynamically generated adaptive Web stores  

Digital Repository Infrastructure Vision for European Research (DRIVER)

We provide technical details about the software and hardware architecture of SETA, a prototype toolkit for the creation of Web stores which personalize the interaction with customers. SETA is based on a multi-agent architecture and on the use of MAS technologies, which support a seamless communication among the system agents and an easy distribution of such agents on different computers.

Goy, Annamaria; Ardissono, Liliana; Petrone, Giovanna; Segnan, Marino

2001-01-01

95

Using Automatic Code Generation in the Attitude Control Flight Software Engineering Process  

Science.gov (United States)

This paper presents an overview of the attitude control subsystem flight software development process, identifies how the process has changed due to automatic code generation, analyzes each software development phase in detail, and concludes with a summary of our lessons learned.

McComas, David; O'Donnell, James R., Jr.; Andrews, Stephen F.

1999-01-01

97

2nd International Conference on Green Communications and Networks 2012  

CERN Document Server

The objective of the 2nd International Conference on Green Communications and Networks 2012 (GCN 2012) is to facilitate an exchange of information on best practices for the latest research advances in the area of communications, networks and intelligence applications. These mainly involve computer science and engineering, informatics, communications and control, electrical engineering, information computing, and business intelligence and management. Proceedings of the 2nd International Conference on Green Communications and Networks 2012 (GCN 2012) will focus on green information technology and applications, which will provide in-depth insights for engineers and scientists in academia, industry, and government. The book addresses the most innovative research developments including technical challenges, social and economic issues, and presents and discusses the authors’ ideas, experiences, findings, and current projects on all aspects of advanced green information technology and applications. Yuhang Yang is ...

Ma, Maode; GCN 2012

2013-01-01

98

A Novel Scheme to Design Software-Controllable Vector Microwave Signal Generator and its Application  

Directory of Open Access Journals (Sweden)

Full Text Available With the rapid development of wireless communications, there will be many communication standards in the future, which may make it costly to buy a dedicated vector microwave signal generator for each of them. Hence, this study investigated a novel vector microwave signal generation method, which models the vector baseband signal in CAD software (Agilent ADS) and then controls conventional microwave signal generation hardware to output vector microwave signals. Compared with the specialized vector microwave signal generators developed by Agilent, Anritsu, etc., our software-controllable microwave signal source is cheaper, more flexible and more convenient. Moreover, as an application of our method, we model and realize the TD-SCDMA baseband signal corrupted by a multipath channel and Additive White Gaussian Noise (AWGN) in ADS software and then control the hardware (Agilent E4432B) to generate the TD-SCDMA microwave signals. The measurements of the TD-SCDMA microwave signals confirm the validity of our method.
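
The paper models its waveforms in Agilent ADS; as a generic stand-in, the sketch below builds a vector baseband signal (QPSK here, not TD-SCDMA), passes it through an assumed two-tap multipath channel and AWGN, and leaves I/Q samples of the kind such a software-controlled source would download to the signal generator hardware. All parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_sym, sps = 200, 8                       # symbols and samples per symbol (assumed)

# QPSK symbols, rectangular pulse shaping (a real design would use RRC filtering).
bits = rng.integers(0, 2, size=(n_sym, 2))
symbols = ((2 * bits[:, 0] - 1) + 1j * (2 * bits[:, 1] - 1)) / np.sqrt(2)
baseband = np.repeat(symbols, sps)

# Assumed two-tap multipath channel plus additive white Gaussian noise at 15 dB SNR.
channel = np.array([1.0, 0.35 * np.exp(1j * 0.8)])
faded = np.convolve(baseband, channel, mode="same")
snr_db = 15
noise_power = np.mean(np.abs(faded) ** 2) / 10 ** (snr_db / 10)
noise = np.sqrt(noise_power / 2) * (rng.standard_normal(faded.size)
                                    + 1j * rng.standard_normal(faded.size))
iq = faded + noise                         # I/Q samples ready for upload to the hardware
print("generated", iq.size, "complex baseband samples")
```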

L. Meng

2010-01-01

99

Learning from examples - Generation and evaluation of decision trees for software resource analysis  

Science.gov (United States)

A general solution method for the automatic generation of decision (or classification) trees is investigated. The approach is to provide insights through in-depth empirical characterization and evaluation of decision trees for software resource data analysis. The trees identify classes of objects (software modules) that had high development effort. Sixteen software systems ranging from 3,000 to 112,000 source lines were selected for analysis from a NASA production environment. The collection and analysis of 74 attributes (or metrics), for over 4,700 objects, captured information about the development effort, faults, changes, design style, and implementation style. A total of 9,600 decision trees were automatically generated and evaluated. The trees correctly identified 79.3 percent of the software modules that had high development effort or faults, and the trees generated from the best parameter combinations correctly identified 88.4 percent of the modules on the average.
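
The kind of decision-tree classification the abstract describes can be sketched with a standard library learner. The synthetic module metrics, the made-up labelling rule and the attribute names below are assumptions for illustration; the NASA data set and the paper's 74 attributes are not reproduced.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 400

# Synthetic module metrics standing in for the paper's attributes (assumed names).
loc = rng.integers(50, 3000, n)                 # source lines of code
changes = rng.integers(0, 40, n)                # number of changes
fan_out = rng.integers(0, 25, n)                # calls to other modules
X = np.column_stack([loc, changes, fan_out])

# Label "high effort" with a made-up rule plus noise, purely for demonstration.
y = ((loc > 1500) | (changes > 25)).astype(int)
y ^= rng.random(n) < 0.05

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)
print(export_text(tree, feature_names=["loc", "changes", "fan_out"]))
print("held-out accuracy:", tree.score(X_te, y_te))
```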

Selby, Richard W.; Porter, Adam A.

1988-01-01

100

Mongolia 360°, 2nd Land Art Biennial, Creating Identities.  

Digital Repository Infrastructure Vision for European Research (DRIVER)

LAM 360º – 2nd Land Art Mongolia Biennial curated by Anna Brietzke, Orna Tsultem, Fumio Nanjo Locations: Ikh Gazriin Chuluu (Dundgobi) (45°29'33.24"N, 107°13'28.50"E) and National Mongolian Modern Art Gallery, Ulaanbataar, Mongolia. International site specific visual art event in the Mongolian Gobi Desert, a seminar on Art and Politics and an exhibition of documents and artifacts related to the works produced in the Gobi desert at Ikh Gazriin Chuluu.

Macleod, Anna

2012-01-01

 
 
 
 
101

2nd International Conference on Intelligent Technologies and Engineering Systems  

CERN Document Server

This book includes the original, peer reviewed research papers from the 2nd International Conference on Intelligent Technologies and Engineering Systems (ICITES2013), which took place on December 12-14, 2013 at Cheng Shiu University in Kaohsiung, Taiwan. Topics covered include: laser technology, wireless and mobile networking, lean and agile manufacturing, speech processing, microwave dielectrics, intelligent circuits and systems, 3D graphics, communications, and structure dynamics and control.

Chen, Cheng-Yi; Yang, Cheng-Fu

2014-01-01

102

Introduction on the 2nd annual general meeting of ARCCNM  

International Nuclear Information System (INIS)

This paper outlines general information on the 2nd annual general meeting of ARCCNM (Asian Regional Cooperative Council for Nuclear Medicine). The international symposium exchanged recent developments in basic and clinical nuclear medicine. The Asian School of Nuclear Medicine is an educational enterprise of ARCCNM, and its objective is to organize and coordinate academic and training programs in nuclear medicine. It will promote nuclear medicine in the Asian region through enhancing regional scientific activities and research collaboration

103

2nd International Open and Distance Learning (IODL) Symposium  

Digital Repository Infrastructure Vision for European Research (DRIVER)

These closing remarks were prepared and presented by Prof. Dr. Murat BARKAN, Anadolu University, Eskisehir, TURKEY. DEAR GUESTS, As the 2nd International Open and Distance Learning (IODL) Symposium is now drawing to an end, I would like to thank you all for your outstanding speeches, distinguished presentations, constructive roundtable and session discussions, and active participation during the last five days. I hope you all share my view that the whole symposium has been...

Barkan, Reviewed By Murat

2006-01-01

104

Automatic Generation of Supervisory Control System Software Using Graph Composition  

Science.gov (United States)

This paper describes the automatic generation of system descriptions for SCADA (Supervisory Control And Data Acquisition) systems. The proposed method produces various types of data and programs for SCADA systems from equipment definitions using conversion rules. First, the method builds directed graphs, which represent connections between the equipment, from the equipment definitions. System descriptions are then generated using the conversion rules, by analyzing these directed graphs and finding the groups of equipment that involve similar operations. The method can make the conversion rules multi-level by using the composition of graphs, and can thus reduce the number of rules. The developer can define and manage these rules efficiently.
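
A minimal sketch of the first two steps, building a directed connection graph from equipment definitions and grouping equipment with the same local pattern so that one conversion rule can serve the whole group, is shown below. The equipment names, types and grouping signature are assumptions, not the paper's actual rule language.

```python
from collections import defaultdict

# Assumed equipment definitions: name -> (type, downstream connections)
equipment = {
    "pump_1":  ("pump",  ["valve_1"]),
    "pump_2":  ("pump",  ["valve_2"]),
    "valve_1": ("valve", ["tank_1"]),
    "valve_2": ("valve", ["tank_1"]),
    "tank_1":  ("tank",  []),
}

# Step 1: build the directed connection graph.
graph = {name: conns for name, (_, conns) in equipment.items()}

# Step 2: group equipment whose local pattern (own type + downstream types) matches,
# so a single conversion rule can generate the SCADA descriptions for the group.
def signature(name):
    etype, conns = equipment[name]
    return (etype, tuple(sorted(equipment[c][0] for c in conns)))

groups = defaultdict(list)
for name in graph:
    groups[signature(name)].append(name)

for sig, members in groups.items():
    print(f"rule for pattern {sig}: applies to {members}")
```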

Nakata, Hideo; Sano, Tatsuro; Kojima, Taizo; Seo, Kazuo; Uchida, Tomoyuki; Nakamura, Yasuaki

105

Software System For Generating, Identification And Simulation Of The Biotechnological Processes Models  

Directory of Open Access Journals (Sweden)

Full Text Available The paper deals with the development of a software system (BIOCALC) for the modelling, identification and simulation of biotechnological processes. BIOCALC is based on the algorithm developed by George Bastin and Daniel Dochain (Bastin, et al., 1990), starting with the reaction scheme of the process. The BIOCALC software generates the differential equations of the process, identifies the model coefficients and allows the model simulation, comparing the results to the experimental data.
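
For a single assumed reaction S -> X with Monod kinetics, the general mass-balance form used by such tools, d(xi)/dt = K*phi(xi) - D*xi + F, can be written out and integrated as below. The kinetic parameters, yield and feed values are illustrative assumptions; this is a sketch of the model structure, not BIOCALC's generated equations.

```python
import numpy as np
from scipy.integrate import solve_ivp

mu_max, K_s, Y = 0.4, 0.5, 0.6     # assumed Monod parameters and yield
D, S_in = 0.05, 10.0               # dilution rate and feed substrate concentration

def rhs(t, xi):
    X, S = xi
    phi = mu_max * S / (K_s + S) * X          # reaction rate
    K = np.array([1.0, -1.0 / Y])             # stoichiometric (yield) vector
    F = np.array([0.0, D * S_in])             # feed term
    return K * phi - D * xi + F

sol = solve_ivp(rhs, (0.0, 50.0), [0.1, 5.0], dense_output=True)
X_end, S_end = sol.y[:, -1]
print(f"biomass {X_end:.2f} g/L, substrate {S_end:.2f} g/L at t = 50 h")
```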

Sergiu CARAMAN

2001-12-01

106

Software System For Generating, Identification And Simulation Of The Biotechnological Processes Models  

Digital Repository Infrastructure Vision for European Research (DRIVER)

The paper deals with the development of a software system (BIOCALC) for the modelling, identification and simulation of the biotechnological processes. BIOCALC is based on the algorithm developed by George Bastin and Daniel Dochain in (Bastin, et al., 1990), starting with the reaction scheme of the process. The BIOCALC software generates the differential equations of the process, identifies the model coefficients and allows the model simulation comparing the results to the experimental data.

Caraman, Sergiu; Frangu, Laurentiu; Turtoi, Maria

2001-01-01

107

Contribution to the software execution platform integration in a generative model driven engineering  

Digital Repository Infrastructure Vision for European Research (DRIVER)

To minimize the inherent complexity of multitasking programs, a promising approach is to automate developments. In practice, automation is achieved by generators. Those generators produce applications which execute on software multitasking platforms (for example multitasking operating systems). Such generators are in fact specific to selected platforms. They are made of implementation rules which are specific to each platform. In order to cope with adaptable and flexible solutions, this study...

Thomas, Frédéric

2008-01-01

108

Model-based Testing: Next Generation Functional Software Testing  

Digital Repository Infrastructure Vision for European Research (DRIVER)

The idea of model-based testing is to use an explicit abstract model of a SUT and its environment to automatically derive tests for the SUT: the behavior of the model of the SUT is interpreted as the intended behavior of the SUT. The technology of automated model-based test case generation has matured to the point where large-scale deployments of this technology are becoming commonplace. The prerequisites for success, such as qualification of the test team, integrated tool chain availabilit...

Legeard, Bruno

2010-01-01

109

Software Reliability Testing Data Generation Approach Based on a Mixture Model  

Directory of Open Access Journals (Sweden)

Full Text Available To solve the problem of generating software reliability test cases and test data for real-time control systems, this study applies a reliability test case generation approach based on a mixture of the operation profile and a Markov chain. The approach describes the software operation profile with UML use cases, establishes a use model based on the UML model so that the testing model can be derived from it automatically, generates a reliability test case set from the testing model, and generates reliability test input data semi-automatically by eliciting input and output variables and abstracting test input and output classes. The results of reliability testing on actual airborne software show that the framework of the test case set generated by the proposed model is fairly stable. Thus, the proposed approach is suitable for semi-automatically generating reliability test data for real-time control system software; it greatly simplifies the reliability testing process, improves testing efficiency and ensures testing validity.
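
The Markov-chain usage-model idea can be sketched as a weighted random walk over an operation profile. The states and transition probabilities below are made-up assumptions; the UML-derived use model of the paper is not reproduced.

```python
import random

random.seed(7)

# Assumed operation profile as a Markov usage model: state -> [(next_state, probability)].
usage_model = {
    "Start":  [("Login", 1.0)],
    "Login":  [("Query", 0.7), ("Update", 0.2), ("Logout", 0.1)],
    "Query":  [("Query", 0.4), ("Update", 0.3), ("Logout", 0.3)],
    "Update": [("Query", 0.5), ("Logout", 0.5)],
    "Logout": [],                                   # terminal state
}

def generate_test_case(max_len=20):
    """Sample one usage sequence by walking the chain until a terminal state."""
    state, path = "Start", ["Start"]
    while usage_model[state] and len(path) < max_len:
        nxt, probs = zip(*usage_model[state])
        state = random.choices(nxt, weights=probs)[0]
        path.append(state)
    return path

test_suite = [generate_test_case() for _ in range(5)]
for case in test_suite:
    print(" -> ".join(case))
```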

Qin Zheng

2010-01-01

110

Energy, environment and technological innovation: Rome 2nd international congress  

Energy Technology Data Exchange (ETDEWEB)

From the three volumes containing the proceedings of the October 12-16, 1992 2nd International Congress on Energy, Environment and Technological Innovation held at the University of Rome 'La Sapienza', separate abstracts were prepared for 41 papers. The selection of papers included recent developments and research trends in the following high-tech areas: biomass plantations, wind turbines, photovoltaic power plants, solar architecture, building energy management, global warming, automobile air pollution abatement, district heating with cogeneration, and hydrogen fuels for transportation.

1992-01-01

111

2nd Karlsruhe International Summer School on Fusion Technologies  

International Nuclear Information System (INIS)

For the second time, the Karlsruhe Research Center together with European research institutions and industries invited young scientists and engineers to its ''International Summer School on Fusion Technologies.'' Fifty participants from all over Europe attended the lectures by 35 experts presenting contributions from their areas of competence. Ten young scientists from India and another 10 from China were connected to the events by video link. Physics student Kornelia Stycz describes her impressions as a participant in the ''2nd International Summer School on Fusion Technologies.'' (orig.)

112

THR Simulator – the software for generating radiographs of THR prosthesis  

Directory of Open Access Journals (Sweden)

Full Text Available Abstract Background Measuring the orientation of the acetabular cup after total hip arthroplasty is important for prognosis. The verification of these measurement methods will be easier and more feasible if we can synthesize prosthesis radiographs in each simulated condition. One reported method used an expensive mechanical device with an indeterminable precision. We thus developed a program, THR Simulator, to directly synthesize digital radiographs of prostheses for further analysis. Under the Windows platform and using the Borland C++ Builder programming tool, we developed THR Simulator. We first built a mathematical model of the acetabulum and femoral head. Data on the real dimensions of the prosthesis were adopted to generate the radiograph of the hip prosthesis. Then, with a ray tracing algorithm, we calculated the thickness each X-ray beam passed through, and transformed it to grey scale using a mapping function derived by fitting an exponential function to the phantom image. Finally we could generate a simulated radiograph for further analysis. Results Using THR Simulator, users can incorporate many parameters together for radiograph synthesis. These parameters include thickness, film size, tube distance, film distance, anteversion, abduction, upper wear, medial wear, and posterior wear. These parameters are adequate for any radiographic measurement research. THR Simulator has been used in two studies, and the errors are within 2° for anteversion and 0.2 mm for wear measurement. Conclusion We designed a program, THR Simulator, that can synthesize prosthesis radiographs. Such a program can be applied in future studies for further analysis and validation of the measurement of various parameters of the pelvis after total hip arthroplasty.
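
The mapping from ray-path thickness to pixel grey level can be sketched with an exponential attenuation law. The sphere geometry, attenuation coefficient and 8-bit grey-scale mapping below are assumptions for illustration, not THR Simulator's fitted phantom calibration.

```python
import numpy as np

# Assumed metal sphere (femoral-head-like) of radius r, projected onto an n x n detector.
n, r, mu = 64, 20.0, 0.15          # pixels, sphere radius in pixels, attenuation per pixel unit
yy, xx = np.mgrid[0:n, 0:n]
d2 = (xx - n / 2) ** 2 + (yy - n / 2) ** 2

# Parallel-beam thickness through the sphere along each ray (chord length), 0 outside.
thickness = np.where(d2 < r ** 2, 2.0 * np.sqrt(np.maximum(r ** 2 - d2, 0.0)), 0.0)

# Beer-Lambert attenuation, then map transmitted intensity to an 8-bit grey level.
transmitted = np.exp(-mu * thickness)
radiograph = np.uint8(255 * transmitted)
print("centre pixel:", radiograph[n // 2, n // 2], " background pixel:", radiograph[0, 0])
```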

Hou Sheng-Mou

2009-01-01

113

Software module for geometric product modeling and NC tool path generation  

International Nuclear Information System (INIS)

The intelligent CAD/CAM system named VIRTUAL MANUFACTURE has been created. It consists of four intelligent software modules: the module for virtual NC machine creation, the module for geometric product modeling and automatic NC path generation, the module for virtual NC machining and the module for virtual product evaluation. In this paper the second intelligent software module is presented. This module enables feature-based product modeling, carried out via automatic saving of the designed product's geometric features as knowledge data. The knowledge data are afterwards applied for automatic NC program generation for machining the designed product. (Author)

114

Scoping analysis of the Advanced Test Reactor using SN2ND  

Energy Technology Data Exchange (ETDEWEB)

A detailed set of calculations was carried out for the Advanced Test Reactor (ATR) using the SN2ND solver of the UNIC code, which is part of the SHARP multi-physics code being developed under the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program in DOE-NE. The primary motivation of this work is to assess whether high fidelity deterministic transport codes can tackle coupled dynamics simulations of the ATR. The successful use of such codes in a coupled dynamics simulation can impact what experiments are performed and what power levels are permitted during those experiments at the ATR. The advantages of the SN2ND solver over comparable neutronics tools are its superior parallel performance and demonstrated accuracy on large scale homogeneous and heterogeneous reactor geometries. However, it should be noted that virtually no effort from this project was spent constructing a proper cross section generation methodology for the ATR usable in the SN2ND solver. While attempts were made to use cross section data derived from SCALE, the minimal number of compositional cross section sets were generated to be consistent with the reference Monte Carlo input specification. The accuracy of any deterministic transport solver is impacted by such an approach and clearly it causes substantial errors in this work. The reasoning behind this decision is justified given the overall funding dedicated to the task (two months) and the real focus of the work: can modern deterministic tools actually treat complex facilities like the ATR with heterogeneous geometry modeling. SN2ND has been demonstrated to solve problems with upwards of one trillion degrees of freedom, which translates to tens of millions of finite elements, hundreds of angles, and hundreds of energy groups, resulting in a very high-fidelity model of the system unachievable by most deterministic transport codes today. A space-angle convergence study was conducted to determine the meshing and angular cubature requirements for the ATR, and also to demonstrate the feasibility of performing this analysis with a deterministic transport code capable of modeling heterogeneous geometries. The work performed indicates that a minimum of 260,000 linear finite elements combined with a L3T11 cubature (96 angles on the sphere) is required for both eigenvalue and flux convergence of the ATR. A critical finding was that the fuel meat and water channels must each be meshed with at least 3 'radial zones' for accurate flux convergence. A small number of 3D calculations were also performed to show axial mesh and eigenvalue convergence for a full core problem. Finally, a brief analysis was performed with different cross section sets generated from DRAGON and SCALE, and the findings show that more effort will be required to improve the multigroup cross section generation process. The total number of degrees of freedom for a converged 27 group, 2D ATR problem is approximately 340 million. This number increases to approximately 25 billion for a 3D ATR problem. This scoping study shows that both 2D and 3D calculations are well within the capabilities of the current SN2ND solver, given the availability of a large-scale computing center such as BlueGene/P. However, dynamics calculations are not realistic without the implementation of improvements in the solver.

Wolters, E.; Smith, M. (NE NEAMS PROGRAM); ( SC)

2012-07-26

115

Software quality assurance project for reactor physics codes at the Point Lepreau Generating Station  

International Nuclear Information System (INIS)

One of the ongoing challenges faced by the Nuclear Industry is Software Quality Assurance (SQA). In this paper, a project to address SQA issues in the Reactor Physics Group at the Point Lepreau Generating Station (PLGS) will be discussed. The work illustrates a process which could be implemented at any facility to achieve code compliance to CSA Standard N286.7 requirements. (author)

116

Real-time infrared scene generation software for I2RSS hardware in the loop  

Science.gov (United States)

This paper describes the current research and development of advanced scene generation technology for integration into the I2RSS - Hardware-in-the-Loop (HWIL) facilities at the US Army AMRDEC at Redstone Arsenal, AL. A real-time dynamic infra-red (IR) scene generator has been developed in support of a high altitude scenario leveraging COTS hardware and open source software. The Multi-Spectral Mode Scene Generator (MMSG) is an extensible software architecture that is powerful yet flexible. The I2RSS scene generator has implemented dynamic signature by integrating the signature prediction codes along with Open Source Software, COTS hardware along with custom built interfaces. A modular, plug-in framework has been developed that supports rapid reconfiguration to permit the use of a variety of state data input sources, geometric model formats, and signature and material databases. The platform independent software yields a cost-effective upgrade path to integrate best-of-breed graphics and system architectures.

Lyles, Patrick V.; Cosby, David S.; Buford, James A., Jr.; Bunfield, Dennis H.

2005-05-01

117

2nd international conference on advanced nanomaterials and nanotechnology  

CERN Document Server

Nanoscale science and technology have occupied centre stage globally in modern scientific research and discourses in the early twenty first century. The enabling nature of the technology makes it important in modern electronics, computing, materials, healthcare, energy and the environment. This volume contains selected articles presented (as Invited/Oral/Poster presentations) at the 2nd international conference on advanced materials and nanotechnology (ICANN-2011) held recently at the Indian Institute of Technology Guwahati, during Dec 8-10, 2011. The topics covered in these proceedings include: Synthesis and self assembly of nanomaterials; Nanoscale characterisation; Nanophotonics & Nanoelectronics; Nanobiotechnology; Nanocomposites; Nanomagnetism; Nanomaterials for Energy; Computational Nanotechnology; and Commercialization of Nanotechnology. The conference was represented by around 400 participants from several countries including delegates invited from USA, Germany, Japan, UK, Taiwan, Italy, Singapor...

Goswami, D; Perumal, A

2013-01-01

118

Isotope effects on vapour phase 2nd viral coefficients  

International Nuclear Information System (INIS)

Vapor phase 2nd virial coefficient isotope effects (VCIE's) are interpreted. A useful correlation is developed between -Δ(B-b0)/(B-b0) = -VCIE and the reference condensed phase reduced isotopic partition function ratio [ln(fc/fg)]*. B is the second virial coefficient, b0 = 2πσ³/3, σ is the Lennard-Jones size parameter, and Δ is an isotopic difference, light-heavy. [ln(fc/fg)]* can be obtained from vapor pressure isotope effects for T/TCRITICAL [...] ln(fp/f2g), where ln(fp/f2g) is the reduced isotopic partition function ratio describing the equilibrium between monomers and interacting pairs. At temperatures well removed from crossovers in ln(fp/f2g) or [ln(fc/fg)]*, ln(fp/f2g) = (0.4±0.2)[ln(fc/fg)]*. (author)

119

2nd International Conference on NeuroRehabilitation  

CERN Document Server

The book is the proceedings of the 2nd International Conference on NeuroRehabilitation (ICNR 2014), held 24th-26th June 2014 in Aalborg, Denmark. The conference featured the latest highlights in the emerging and interdisciplinary field of neural rehabilitation engineering and identified important healthcare challenges the scientific community will be faced with in the coming years. Edited and written by leading experts in the field, the book includes keynote papers, regular conference papers, and contributions to special and innovation sessions, covering the following main topics: neuro-rehabilitation applications and solutions for restoring impaired neurological functions; cutting-edge technologies and methods in neuro-rehabilitation; and translational challenges in neuro-rehabilitation. Thanks to its highly interdisciplinary approach, the book will not only be a  highly relevant reference guide for academic researchers, engineers, neurophysiologists, neuroscientists, physicians and physiotherapists workin...

Andersen, Ole; Akay, Metin

2014-01-01

120

Minimal Testcase Generation for Object-Oriented Software with State Charts  

Directory of Open Access Journals (Sweden)

Full Text Available Today statecharts are a de facto standard in industry for modeling system behavior. Test data generation is one of the key issues in software testing. This paper proposes a reduction approach to test data generation for state-based software testing. First, the state transition graph is derived from the state chart diagram. Then, all the required information is extracted from the state chart diagram and test cases are generated. Lastly, the set of test cases is minimized by calculating the node coverage for each test case; it is also determined which test cases are covered by other test cases. The advantage of our test generation technique is that it optimizes test coverage by minimizing time and cost. The present test data generation scheme generates test cases which satisfy the transition path coverage criterion, path coverage criterion and action coverage criterion. A case study on a Railway Ticket Vending Machine (RTVM) has been presented to illustrate our approach.
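
The minimization step, keeping only test cases whose covered nodes are not already covered by others, can be sketched as a greedy set cover. The test cases and node sets below are assumptions for illustration, not the RTVM case study.

```python
# Assumed mapping of generated test cases to the state-chart nodes they cover.
coverage = {
    "tc1": {"Idle", "Selecting", "Paid"},
    "tc2": {"Idle", "Selecting"},                      # subsumed by tc1
    "tc3": {"Idle", "Selecting", "Paid", "Dispensed"},
    "tc4": {"Idle", "Cancelled"},
}

def minimize(coverage):
    """Greedy set cover: repeatedly keep the test case adding the most uncovered nodes."""
    required = set().union(*coverage.values())
    covered, kept = set(), []
    while covered != required:
        best = max(coverage, key=lambda tc: len(coverage[tc] - covered))
        if not coverage[best] - covered:
            break                                      # remaining nodes unreachable
        kept.append(best)
        covered |= coverage[best]
    return kept

print("minimized suite:", minimize(coverage))          # e.g. ['tc3', 'tc4']
```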

Ranjita Kumari Swain

2012-08-01

 
 
 
 
121

Reliable Mining of Automatically Generated Test Cases from Software Requirements Specification (SRS)  

CERN Document Server

Writing requirements is a two-way process. In this paper we classify Functional Requirement (FR) and Non-Functional Requirement (NFR) statements from Software Requirements Specification (SRS) documents. These are systematically transformed into state charts, considering all relevant information. The paper then outlines how test cases can be automatically generated from these state charts. The application of the states yields the different test cases as solutions to a planning problem. The test cases can be used for automated or manual software testing at the system level. The paper also presents a method for reducing the test suite by using mining methods, thereby facilitating mining and knowledge extraction from test cases.

Raamesh, Lilly

2010-01-01

122

Software for Generating Troposphere Corrections for InSAR Using GPS and Weather Model Data  

Science.gov (United States)

Atmospheric errors due to the troposphere are a limiting error source for spaceborne interferometric synthetic aperture radar (InSAR) imaging. This software generates tropospheric delay maps that can be used to correct atmospheric artifacts in InSAR data. The software automatically acquires all needed GPS (Global Positioning System), weather, and Digital Elevation Map data, and generates a tropospheric correction map using a novel algorithm for combining GPS and weather information while accounting for terrain. Existing JPL software was prototypical in nature, required a MATLAB license, required additional steps to acquire and ingest needed GPS and weather data, and did not account for topography in interpolation. Previous software did not achieve a level of automation suitable for integration in a Web portal. This software overcomes these issues. GPS estimates of tropospheric delay are a source of corrections that can be used to form correction maps to be applied to InSAR data, but the spacing of GPS stations is insufficient to remove short-wavelength tropospheric artifacts. This software combines interpolated GPS delay with weather model precipitable water vapor (PWV) and a digital elevation model to account for terrain, increasing the spatial resolution of the tropospheric correction maps and thus removing short wavelength tropospheric artifacts to a greater extent. It will be integrated into a Web portal request system, allowing use in a future L-band SAR Earth radar mission data system. This will be a significant contribution to its technology readiness, building on existing investments in in situ space geodetic networks, and improving timeliness, quality, and science value of the collected data
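
As a rough illustration of the combination step, the sketch below interpolates sparse synthetic GPS zenith delays onto a grid and rescales them with an assumed exponential height dependence taken from a toy DEM. The scale height, station layout and delay values are assumptions; this is not the JPL algorithm or its data services.

```python
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(3)

# Assumed GPS stations: (x, y) position in km, station height in km, zenith wet delay in cm.
stations_xy = rng.uniform(0, 100, size=(12, 2))
station_h = rng.uniform(0.0, 1.5, size=12)
H = 2.0                                                         # assumed delay scale height in km
zwd = (15 + 3 * np.sin(stations_xy[:, 0] / 30)) * np.exp(-station_h / H)

# Reduce station delays to a common (sea-level) reference before horizontal interpolation.
zwd_ref = zwd * np.exp(station_h / H)

# Target grid with a toy DEM; interpolate, then rescale to each grid cell's elevation.
gx, gy = np.meshgrid(np.linspace(0, 100, 50), np.linspace(0, 100, 50))
dem = 0.8 + 0.5 * np.sin(gx / 20) * np.cos(gy / 25)             # elevation in km
grid_ref = griddata(stations_xy, zwd_ref, (gx, gy), method="linear")
correction_map = grid_ref * np.exp(-dem / H)                    # delay map to subtract from InSAR

print("median tropospheric delay on grid: %.1f cm" % np.nanmedian(correction_map))
```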

Moore, Angelyn W.; Webb, Frank H.; Fishbein, Evan F.; Fielding, Eric J.; Owen, Susan E.; Granger, Stephanie L.; Bjoerndahl, Fredrik; Loefgren, Johan; Fang, Peng; Means, James D.; Bock, Yehuda; Tong, Xiaopeng

2013-01-01

123

Software architecture for control and data acquisition of linear plasma generator Magnum-PSI  

International Nuclear Information System (INIS)

Highlights: • An architecture based on a modular design. • The design offers flexibility and extendability. • The design covers the overall software architecture. • It also covers its (sub)systems’ internal structure. -- Abstract: The FOM Institute DIFFER – Dutch Institute for Fundamental Energy Research has completed the construction phase of Magnum-PSI, a magnetized, steady-state, large area, high-flux linear plasma beam generator to study plasma surface interactions under ITER divertor conditions. Magnum-PSI consists of several hardware subsystems, and a variety of diagnostic systems. The COntrol, Data Acquisition and Communication (CODAC) system integrates these subsystems and provides a complete interface for the Magnum-PSI users. Integrating it all, from the lowest hardware level of sensors and actuators, via the level of networked PLCs and computer systems, up to functions and classes in programming languages, demands a sound and modular software architecture, which is extendable and scalable for future changes. This paper describes this architecture, and the modular design of the software subsystems. The design is implemented in the CODAC system at the level of services and subsystems (the overall software architecture), as well as internally in the software subsystems.

124

Software architecture for control and data acquisition of linear plasma generator Magnum-PSI  

Energy Technology Data Exchange (ETDEWEB)

Highlights: • An architecture based on a modular design. • The design offers flexibility and extendability. • The design covers the overall software architecture. • It also covers its (sub)systems’ internal structure. -- Abstract: The FOM Institute DIFFER – Dutch Institute for Fundamental Energy Research has completed the construction phase of Magnum-PSI, a magnetized, steady-state, large area, high-flux linear plasma beam generator to study plasma surface interactions under ITER divertor conditions. Magnum-PSI consists of several hardware subsystems, and a variety of diagnostic systems. The COntrol, Data Acquisition and Communication (CODAC) system integrates these subsystems and provides a complete interface for the Magnum-PSI users. Integrating it all, from the lowest hardware level of sensors and actuators, via the level of networked PLCs and computer systems, up to functions and classes in programming languages, demands a sound and modular software architecture, which is extendable and scalable for future changes. This paper describes this architecture, and the modular design of the software subsystems. The design is implemented in the CODAC system at the level of services and subsystems (the overall software architecture), as well as internally in the software subsystems.

Groen, P.W.C., E-mail: p.w.c.groen@differ.nl [FOM Institute DIFFER – Dutch Institute For Fundamental Energy Research, Association EURATOM-FOM, Partner in the Trilateral Euregio Cluster, P.O. Box 1207, 3430 BE, Nieuwegein (Netherlands); Beveren, V. van; Broekema, A.; Busch, P.J.; Genuit, J.W.; Kaas, G.; Poelman, A.J.; Scholten, J.; Zeijlmans van Emmichoven, P.A. [FOM Institute DIFFER – Dutch Institute For Fundamental Energy Research, Association EURATOM-FOM, Partner in the Trilateral Euregio Cluster, P.O. Box 1207, 3430 BE, Nieuwegein (Netherlands)

2013-10-15

125

Infrared Properties of Cataclysmic Variables from 2MASS Results from the 2nd Incremental Data Release  

CERN Document Server

Because accretion-generated luminosity dominates the radiated energy of most cataclysmic variables, they have been 'traditionally' observed primarily at short wavelengths. Infrared observations of cataclysmic variables contribute to the understanding of key system components that are expected to radiate at these wavelengths, such as the cool outer disk, accretion stream, and secondary star. We have compiled the J, H, and Ks photometry of all cataclysmic variables located in the sky coverage of the 2 Micron All Sky Survey (2MASS) 2nd Incremental Data Release. This data set comprises 251 systems with reliably identified near-IR counterparts and S/N > 10 photometry in one or more of the three near-IR bands.

Hoard, D W; Clark, L L; Bowers, T P; Bowers, Timothy P.

2001-01-01

126

The Second Stellar Spectrum and the non-LTE Problem of the 2nd Kind  

Science.gov (United States)

This paper presents an overview of the radiative transfer problem of calculating the spectral line intensity and polarization that emerges from a (generally magnetized) astrophysical plasma composed of atoms and molecules whose excitation state is significantly influenced by radiative transitions produced by an anisotropic radiation field. The numerical solution of this non-LTE problem of the 2nd kind is facilitating the physical understanding of the second solar spectrum and the exploration of the complex magnetism of the extended solar atmosphere, but much more could be learned if high-sensitivity polarimeters were developed also for the present generation of night-time telescopes. Interestingly, I find that the population ratio between the levels of some resonance line transitions can be efficiently modulated by the inclination of a weak magnetic field when the anisotropy of the incident radiation is significant, something that could provide a new diagnostic tool in astrophysics.

Trujillo Bueno, Javier

2009-09-01

127

Results of the 2nd regular inspection of Unit 1 in the Oi Power Station  

International Nuclear Information System (INIS)

The 2nd regular inspection of Unit 1 in the Oi Power Station was carried out in fiscal 1980, covering the period from February 10 to July 30, 1981. The facilities subjected to the inspection were the reactor proper, cooling systems, instrumentation and control systems, radiation control systems and others. In the inspection of external appearance, leakage, performance, etc., damage was detected to fuel insert hold-down springs and the support lattices of fuel assemblies, and an abnormality was found in steam generator tubes. The radiation exposure of personnel involved in the inspection work was all below the legally permissible level. The improvement works conducted during the period of inspection were the replacement of the damaged insert hold-down springs, the fuel assemblies with damaged support lattices and the support pins of the upper core-structure flow guide, as well as the plugging of the abnormal heating tubes. (J.P.N.)

128

Improvement of a plasma uniformity of the 2nd ion source of KSTAR neutral beam injector  

International Nuclear Information System (INIS)

The 2nd ion source of the KSTAR (Korea Superconducting Tokamak Advanced Research) NBI (Neutral Beam Injector) has been developed and operated since last year. A calorimetric analysis revealed that the heat load on the back plate of this ion source is relatively higher than that of the 1st ion source of the KSTAR NBI, and the spatial plasma uniformity of the ion source is poor. Therefore, we set out to identify the factors affecting the uniformity of the plasma density and to improve it. We estimated the effects of the direction of the filament current and the magnetic field configuration of the plasma generator on the plasma uniformity. We also verified that the operating conditions of the ion source can change the uniformity of its plasma density

129

Research on custom-built generation of GIS application software based on metadata  

Science.gov (United States)

In order to decrease the workload of code development, and to support the reuse of common functions and the customization of personal functions, this paper proposes a custom-built generation platform for GIS application software based on metadata. The platform divides the GIS application software into three parts: user interface, business logic and functional model. The three parts are loosely coupled through metadata. This multi-layer loose coupling gives the system its flexibility: it allows users to customize the user interface, functions and data to meet their personal demands. The platform locates resources rapidly based on metadata, and then customizes the GIS application software, including basic GIS functions, a personal user interface and flexible data. To validate the platform, an urban geologic survey information system was generated by the platform as a test. The result shows that the platform achieves hot-plugging of extension components by modifying metadata and provides good reuse of GIS functions. Furthermore, it hides the diversity of GIS components, simplifies the development of GIS application software and improves development efficiency.
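
The metadata-driven assembly idea can be sketched as a registry of reusable functions wired to a user interface described entirely by a metadata document. The JSON schema, function names and registry below are assumptions for illustration, not the platform described in the paper.

```python
import json

# Assumed metadata document describing which functions and UI items a user wants.
metadata = json.loads("""
{
  "app": "urban_geologic_survey",
  "functions": ["load_layer", "buffer"],
  "ui": [{"label": "Load layer", "action": "load_layer"},
         {"label": "Buffer 500 m", "action": "buffer", "args": {"distance": 500}}]
}
""")

# Reusable functional modules registered independently of any particular application.
REGISTRY = {
    "load_layer": lambda **kw: print("loading layer", kw),
    "buffer":     lambda distance=100, **kw: print(f"buffering by {distance} m"),
    "overlay":    lambda **kw: print("overlay analysis", kw),
}

# "Generate" the application: only functions named in the metadata are wired to the UI.
app = [(item["label"], REGISTRY[item["action"]], item.get("args", {}))
       for item in metadata["ui"] if item["action"] in metadata["functions"]]

for label, func, args in app:                 # simulate the user invoking each menu entry
    print(f"[{metadata['app']}] {label}:")
    func(**args)
```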

Hong, Lu; Zhang, Fu; Zhang, Zihui; Li, Anbo

2008-10-01

130

On the 2nd order autocorrelation of an XUV attosecond pulse train  

International Nuclear Information System (INIS)

Full text: We present the first direct measurement of sub-fs light bunching, extending well established fs optical metrology to XUV attosecond (as) pulses. A mean train pulse duration of 780 as has been extracted through a 2nd order autocorrelation approach, utilizing a nonlinear effect that is induced solely by the XUV radiation to be characterized. The approach is based on (i) a bisected spherical mirror XUV wavefront divider used as an autocorrelator and (ii) the two-photon ionization of atomic He by a superposition of the 7th to the 15th harmonic of a Ti:sapph laser. The measured temporal mean width is more than twice its Fourier transform limited (FTL) value, in contrast to the as train pulse durations measured through other approaches, which were found much closer to the FTL values. We have investigated, and discuss here, the origin of this discrepancy. An assessment of the validity of the 2nd order AC approach for the broad band XUV radiation of as pulses is implemented through ab initio calculations (solution of the 3D TDSE of He in the presence of the harmonic superposition) modeling the spectral and temporal response of the two-XUV-photon He ionization detector employed. It is found that neither the spectral nor the temporal response affects the measured duration. The mean width of the as train bursts is estimated from the spectral phases of the individual harmonics as they result from the rescattering model, taking into account the spatially modulated temporal width of the radiation due to the spatiotemporal intensity distribution of the driving field during the harmonic generation process. The measured value is found in reasonable agreement with the estimated duration. The method used for the 2nd order AC in itself initiates further XUV-pump-XUV-probe studies of sub-fs-scale dynamics and at the same time becomes highly pertinent in connection with nonlinear experiments using XUV free-electron laser sources. Refs. 4 (author)
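
A purely numerical illustration of a 2nd order (two-photon-like) autocorrelation of a harmonic-comb field is sketched below; equal harmonic amplitudes, flat phases and the chosen delay grid are assumptions, and this toy signal is not the He-ionization measurement itself.

```python
import numpy as np

T0 = 2.67                                            # fundamental optical period in fs (assumed)
t = np.linspace(-2 * T0, 2 * T0, 4000)
harmonics = [7, 9, 11, 13, 15]                       # odd harmonics forming the XUV comb

# Assumed equal amplitudes and flat phases -> a Fourier-limited attosecond pulse train.
E = sum(np.cos(2 * np.pi * q * t / T0) for q in harmonics)

def second_order_ac(E, delays_idx):
    """Two-photon-like signal vs delay: S(tau) ~ integral |(E(t)+E(t-tau))^2|^2 dt."""
    out = []
    for d in delays_idx:
        Ed = np.roll(E, d)
        out.append(np.sum(((E + Ed) ** 2) ** 2))
    return np.array(out)

delays_idx = np.arange(-400, 401, 4)
S = second_order_ac(E, delays_idx)
tau = delays_idx * (t[1] - t[0])
print("AC modulation contrast (max/mean):", S.max() / S.mean())
```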

131

Real-time scene generation using high-speed pixel processing hardware and open source software  

Science.gov (United States)

Modern, military scene generators increasingly utilize advanced features of consumer graphics hardware to produce wave band-specific sensor scenes. Unfortunately the advances available in the consumer graphics accelerator market do not translate immediately into product applications for military scene generators required to test next generation sensors. Testing infrared (IR) sensors used in terminal homing missiles and missile warning systems (MWS) require generating frame rates of 200 Hz or more. Modern IR emitter arrays are now able to project dynamic scenes at this higher rate, however personal computer (PC) based scene rendering systems cannot generate high-resolution, real-time frames fast enough. InterSpace has leveraged its high-speed pixel processor technology to produce high-speed rendering based on PC devices. The Hardware-in-the-Loop Functional Area of the US Army Aviation and Missile Research Development and Engineering Center has developed a suite of modular software to perform deterministic real-time, wave band-specific rendering of sensor scenes, leveraging the features of commodity graphics hardware and open source software. Together, these technologies provide the performance and accuracy to drive high-rate, high-dynamic range scene projectors.

Price, Matthew; Cosby, David

2006-05-01

132

APTWG: 2nd Asia-Pacific Transport Working Group Meeting  

International Nuclear Information System (INIS)

This conference report summarizes the contributions to and discussions at the 2nd Asia-Pacific Transport Working Group Meeting held in Chengdu, China, from 15 to 18 May 2012. The topics of the meeting were organized under five main headings: momentum transport, non-locality in transport, edge turbulence and L–H transition, three-dimensional effects on transport physics, and particle, momentum and heat pinches. It is found that lower hybrid wave and ion cyclotron wave induce co-current rotation while electron cyclotron wave induces counter-current rotation. A four-stage imaging for low (L) to high (H) confinement transition gradually emerges and a more detailed verification is urgently expected. The new edge-localized modes mitigation technique with supersonic molecular beam injection was approved to be effective to some extent on HL-2A and KSTAR. It is also found that low collisionality, trapped electron mode to ion temperature gradient transition (or transition of higher to lower density and temperature gradients), fuelling and lithium coating are in favour of inward pinch of particles in tokamak plasmas. (paper)

133

APTWG: 2nd Asia-Pacific Transport Working Group Meeting  

Science.gov (United States)

This conference report summarizes the contributions to and discussions at the 2nd Asia-Pacific Transport Working Group Meeting held in Chengdu, China, from 15 to 18 May 2012. The topics of the meeting were organized under five main headings: momentum transport, non-locality in transport, edge turbulence and L-H transition, three-dimensional effects on transport physics, and particle, momentum and heat pinches. It is found that lower hybrid wave and ion cyclotron wave induce co-current rotation while electron cyclotron wave induces counter-current rotation. A four-stage imaging for low (L) to high (H) confinement transition gradually emerges and a more detailed verification is urgently expected. The new edge-localized modes mitigation technique with supersonic molecular beam injection was approved to be effective to some extent on HL-2A and KSTAR. It is also found that low collisionality, trapped electron mode to ion temperature gradient transition (or transition of higher to lower density and temperature gradients), fuelling and lithium coating are in favour of inward pinch of particles in tokamak plasmas.

Dong, J. Q.; Shi, Y. J.; Tamura, N.; Jhang, Hogun; Watanabe, T.-H.; Ding, X. T.

2013-02-01

134

Book review: Psychology in a work context (2nd Ed.  

Directory of Open Access Journals (Sweden)

Full Text Available Bergh, Z. & Theron, A.L. (Eds) (2003). Psychology in a work context (2nd Ed.). Cape Town: Oxford University Press.

This book is an overview and introduction to Industrial and Organisational Psychology. It is a work of ambitious scope, and it is clear that the contributors have invested a great deal of thought and effort in the planning and execution of the book. The current version is the second edition, and it looks set to become one of those standard textbooks that are revised every few years to keep up with changing times. It is a handsome volume, produced to a high standard of editorial care, pleasingly laid out and organised well enough to be useful as an occasional reference source. An English-Afrikaans glossary, tables of contents for every chapter as well as for the entire book, a comprehensive index and extensive bibliography make it easy to retrieve the information relating to a particular topic. Every chapter ends with a conclusion summarising the gist of the material covered. Quality illustrations lighten the tone and help to bring some of the concepts to life. Learning outcomes and self-assessment exercises and questions for every chapter will be useful to the lecturer using the book as a source for a tutored course, and for the student studying by distance learning. If sold at the suggested retail price, the book represents good value compared to imported textbooks that cover similar ground.

Nanette Tredoux

2003-10-01

135

CELEBRATED APRIL 2nd – INTERNATIONAL DAY OF PERSONS WITH AUTISM  

Directory of Open Access Journals (Sweden)

Full Text Available On April 2nd, the Macedonian Scientific Society for Autism organized, for the fourth time, an event on the occasion of the International Day of Persons with Autism. The event, of a cultural and artistic character, was held at the Museum of the Macedonian Struggle under the motto “They are not alone, we are with them”. The huge number of citizens only confirmed the motto. It seemed that the hall of the Museum of the Macedonian Struggle was too small for the warm hearts of the audience. More than 300 guests were present in the hall, among whom were children with autism and their families, prominent professors, doctors, special educators and rehabilitators, psychologists, students and other citizens who gladly decided to enrich the event with their presence. The event was opened by the violinist Plamenka Trajkovska, who performed one song. After her, the President of the Macedonian Scientific Society for Autism, PhD Vladimir Trajkovski, delivered his speech. The professor told the parents of autistic children, who were present in large numbers, not to lose hope, to fight for their children, and that the Macedonian Scientific Society for Autism will provide tremendous support and assistance in this struggle.

Manuela KRCHANOSKA

2014-09-01

136

GENESIS: Agile Generation of Information Management Oriented Software / GENESIS: Generación ágil de software orientado a gestión de información  

Scientific Electronic Library Online (English)

Full Text Available SciELO Colombia | Language: English Abstract in Spanish: The specification of an information system can be clear from the beginning: it must acquire, display, query and modify data, using a database. The issue is deciding which information to manage. In the case that gave rise to this work, the information evolves permanently, even up to the end of the project. This implies building a new system each time the information is redefined. This article presents Genesis, an agile infrastructure for the immediate construction of whatever information system is required. The experts describe their information and queries; Genesis produces the corresponding software, generating the appropriate graphical interfaces and database. Abstract in English: The specification for an information system can be clear from the beginning: it must acquire, display, query and modify data, using a database. The main issue is to decide which information to manage. In the case originating this work, information was always evolving, even up to the end of the project. This implies the construction of a new system each time the information is redefined. This article presents Genesis, an agile infrastructure for the immediate construction of required information systems. Experts describe their information needs and queries, and Genesis generates the corresponding application, with the appropriate graphical interfaces and database.

Claudia, Jiménez Guarín; Juan Erasmo, Gómez.

2010-05-01

137

Study On Oxidation State Of U In Ba2NdUO6  

International Nuclear Information System (INIS)

Ba2NdUO6 is one of the important compounds formed in a solidification process for high level liquid waste using a super high temperature method. Ba2NdUO6 has an ordered perovskite structure. The objective of this study is to investigate the oxidation state of U in Ba2NdUO6. The properties of Ba2NdUO6 were observed using a Faraday-type torsion magnetometer and an X-ray Photoelectron Spectrometer (XPS). The magnetic susceptibility measured in the temperature range of 4 K to room temperature showed that Ba2NdUO6 is paramagnetic and obeys the Curie-Weiss law. The effective moment of Ba2NdUO6 is 3.04 μB. The XPS spectrum showed that the U4f peaks of Ba2NdUO6 appeared exactly between the binding energies of UO2 and UO3. It can be concluded that Ba2NdUO6 has binding energy peaks corresponding to pentavalent uranium
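
For reference, the effective moment quoted above follows from a Curie-Weiss fit of the measured susceptibility; a standard textbook form of the relation (Gaussian/cgs units, per mole of magnetic ions) is sketched below. The specific fit parameters for Ba2NdUO6 are not reproduced here.

```latex
% Curie-Weiss law fitted to the measured susceptibility, and the effective moment
% extracted from the Curie constant C (in emu K mol^-1):
\chi(T) = \frac{C}{T - \theta_{\mathrm{CW}}}, \qquad
\mu_{\mathrm{eff}} = \sqrt{\frac{3 k_B C}{N_A}} \approx 2.828\,\sqrt{C}\;\mu_B
```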

138

The rank of the 2nd Gaussian map for general curves  

CERN Document Server

We prove that, for the general curve of genus g, the 2nd Gaussian map is injective if g = 18. The proof relies on the study of the limit of the 2nd Gaussian map when the general curve of genus g degenerates to a general stable binary curve, i.e. the union of two rational curves meeting at g+1 points.

Calabri, Alberto; Miranda, Rick

2009-01-01

139

Measurement of the (5p1/2nd)3 auto-ionizing levels of strontium  

International Nuclear Information System (INIS)

The (5p1/2nd)3 (n = 11 to 24) auto-ionizing levels of Sr have been measured by the laser multi-step excitation technique. Their effective quantum numbers have been determined. The interaction with the (5p3/2nd)3 series has been discussed

140

Psychiatric Diagnosis and Concomitant Medical Treatment for 1st and 2nd Grade Children  

Science.gov (United States)

This study examined the proportion of children in 1st and 2nd grade classes who were currently prescribed medication for psychotropic disorders. The study also examined the attitudes of 1st and 2nd grade teachers toward diagnosis of psychiatric disorders and use of psychiatric medication to treat children. Results of the current study indicate…

Cornell-Swanson, La Vonne; Frankenberger, William; Ley, Katie; Bowman, Krista

2007-01-01

 
 
 
 
141

2nd International Open and Distance Learning (IODL) Symposium

Directory of Open Access Journals (Sweden)

Full Text Available These closing remarks were prepared and presented by Prof. Dr. Murat BARKAN, Anadolu University, Eskisehir, TURKEY. DEAR GUESTS, As the 2nd International Open and Distance Learning (IODL) Symposium is now drawing to an end, I would like to thank you all for your outstanding speeches, distinguished presentations, constructive roundtable and session discussions, and active participation during the last five days. I hope you all share my view that the whole symposium has been a very stimulating and successful experience. Also, on behalf of all the participants, I would like to take this opportunity to thank and congratulate the symposium organization committee for their excellent job in organizing and hosting our 2nd meeting. Throughout the symposium, five workshops, six keynote speeches and 66 papers, which were prepared by more than 150 academicians and practitioners from 23 different countries, reflected remarkable and various views and approaches about open and flexible learning. Besides all these academic endeavors, 13 educational films were displayed during the symposium. The technology exhibition, hosted by seven companies, was very effective in showcasing the current level of the technology and the success of applications of theory into practice. Now I would like to go over what our workshop and keynote presenters shared with us: Prof. Marina McIsaac from Arizona State University dwelled on how to determine research topics worthwhile to be examined and how to choose appropriate research designs and methods. She gave us clues on how to get articles published in professional journals. Prof. Colin Latchem from Australia and Prof. Dr. Ali Ekrem Ozkul from Anadolu University pointed to the importance of strategic planning for distance and flexible learning. They highlighted the advantages of strategic planning for policy-makers, planners, managers and staff. Dr. Wolfram Laaser from Fern University of Hagen presented different multimedia clips and directed interactive exercises using Flash MX in his workshop. Jack Koumi from the UK presented a workshop about what to teach on video and when to choose other media. He exemplified 27 added-value techniques and teaching functions for TV and video. He later specified different capabilities and limitations of eight different media used in teaching, emphasizing the importance of optimizing media deployment. Dr. Janet Bohren from the University of Cincinnati and Jennifer McVay-Dyche from United Theological Seminary explained their experience with a course management system used to develop dialogue between K-12 teachers in Turkey and the US, on the topics of religion, culture and schools. Their workshop provided an overview of a pilot study. They showed us a good case study of utilizing "Blackboard" as a means of getting rid of biases and improving the mutual understanding of the American and Turkish teachers. We had very remarkable keynotes as well. Dr. Nikitas Kastis, representing the European Distance and E-Learning Network (EDEN), made his speech on distance and e-Learning evolutions and trends in Europe. He informed the audience about the application and assessment criteria at the European scale concerning e-Learning in the education and training systems. Meanwhile, our keynote speakers drew our attention to different applications of virtual learning. Dr. Piet Kommers from the University of Twente exemplified a virtual training environment for acquiring surgical skills. Dr. Timothy Shih from Tamkang University presented their project called Hard SCORM (Sharable Content Object Reference Model) as an asynchronous distance learning specification. In his speech titled "Engaging and Supporting Problem Solving Online", Prof. David Jonassen from the University of Missouri reflected his vision of the future of education and explained why it should embrace problem solving. Then he showed us examples of incorporating this vision with learning environments for making online problem solving possible. Dr. Wolfram Laaser from Fern University talked on applications of ICT in Europe

Reviewed by Murat BARKAN

2006-10-01

142

Software Holography: Interferometric Data Analysis for the Challenges of Next Generation Observatories  

CERN Document Server

Next generation radio observatories such as the MWA, LWA, LOFAR, CARMA and SKA provide a number of challenges for interferometric data analysis. These challenges include heterogeneous arrays, direction-dependent instrumental gain, and refractive and scintillating atmospheric conditions. From the analysis perspective, this means that calibration solutions can not be described using a single complex gain per antenna. In this paper we use the optimal map-making formalism developed for CMB analyses to extend traditional interferometric radio analysis techniques--removing the assumption of a single complex gain per antenna and allowing more complete descriptions of the instrumental and atmospheric conditions. Due to the similarity with holographic mapping of radio antenna surfaces, we call this extended analysis approach software holography. The resulting analysis algorithms are computationally efficient, unbiased, and optimally sensitive. We show how software holography can be used to solve some of the challenges...

Morales, Miguel F

2008-01-01

143

Reliable Mining of Automatically Generated Test Cases from Software Requirements Specification (SRS)

Directory of Open Access Journals (Sweden)

Full Text Available Writing requirements is a two-way process. In this paper we classify Functional Requirements (FR) and Non-Functional Requirements (NFR) statements from Software Requirements Specification (SRS) documents. These are systematically transformed into state charts considering all relevant information. The current paper outlines how test cases can be automatically generated from these state charts. The application of the states yields the different test cases as solutions to a planning problem. The test cases can be used for automated or manual software testing at the system level. The paper also presents a method for test suite reduction using mining methods, thereby facilitating mining and knowledge extraction from test cases.
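As a hedged illustration of the generation step described above (not the paper's actual tool), the sketch below enumerates transition paths through a small state chart and emits each path as an executable event sequence; the example chart, events and depth limit are hypothetical.

```python
# Hypothetical sketch: derive test cases from a state chart by enumerating
# transition paths from the initial state to end states (transition coverage).
from collections import defaultdict

def build_state_chart(transitions):
    """transitions: iterable of (source, event, target) triples."""
    chart = defaultdict(list)
    for src, event, dst in transitions:
        chart[src].append((event, dst))
    return chart

def generate_test_cases(chart, start, finals, max_depth=10):
    """Return event sequences (test cases) covering paths from start to final states."""
    cases = []

    def walk(state, path):
        if state in finals and path:
            cases.append(list(path))
        if len(path) >= max_depth:
            return
        for event, nxt in chart[state]:
            if (event, nxt) in path:          # avoid re-taking the same transition
                continue
            path.append((event, nxt))
            walk(nxt, path)
            path.pop()

    walk(start, [])
    return cases

# Example state chart for a login requirement (illustrative only)
chart = build_state_chart([
    ("Idle", "enter_credentials", "Validating"),
    ("Validating", "valid", "LoggedIn"),
    ("Validating", "invalid", "Idle"),
    ("LoggedIn", "logout", "Idle"),
])
for case in generate_test_cases(chart, "Idle", {"LoggedIn"}):
    print(" -> ".join(f"{event}/{state}" for event, state in case))
```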

Lilly Raamesh

2010-01-01

144

Comparison of HIV-1 drug resistance profiles generated from novel software applications for routine patient care  

Science.gov (United States)

Introduction Clinical laboratories performing routine HIV-1 genotyping antiviral drug resistance (DR) testing need reliable and up-to-date information systems to provide accurate and timely test results to optimize antiretroviral treatment in HIV-1-infected patients. Materials and Methods Three software applications were used to compare DR profiles generated from the analysis of HIV-1 protease (PR) and reverse transcriptase (RT) gene sequences obtained by Sanger sequencing assay in 100 selected clinical plasma samples from March 2013 through May 2014. Interpretative results obtained from the Trugene HIV-1 Genotyping assay (TG; Guidelines v17.0) were compared with a newly FDA-registered data processing module (DPM v1.0) and the research-use-only ViroScore-HIV (VS) software, both of which use the latest versions of Stanford HIVdb (SD v7.0) and geno2pheno (G2P v3.3) interpretive algorithms (IA). Differences among the DR interpretive algorithms were compared according to drug class (NRTI, NNRTI, PI) and each drug. HIV-1 tropism and integrase inhibitor resistance were not evaluated (not available in TG). Results Overall, only 17 of the 100 TG sequences obtained yielded equivalent DR profiles among all 3 software applications for every IA and for all drug classes. DPM and VS generated equivalent results with >99.9% agreement. Excluding AZT, DDI, D4T and rilpivirine (not available in G2P), ranges of agreement in DR profiles among the three IA (using the DPM) are shown in Table 1. Conclusions Substantial discrepancies (interpretive algorithms for ETR, while G2P differed from TG and SD for resistance to TDF and TPV/r. Use of more than one DR interpretive algorithm using well-validated software applications, such as DPM v1.0 and VS, would enable clinical laboratories to provide clinically useful and accurate DR results for patient care needs. PMID:25397496

Gonzalez, Dimitri; Digmann, Benjamin; Barralon, Matthieu; Boulme, Ronan; Sayada, Chalom; Yao, Joseph

2014-01-01

145

Wind-US Results for the AIAA 2nd Propulsion Aerodynamics Workshop  

Science.gov (United States)

This presentation contains Wind-US results presented at the 2nd Propulsion Aerodynamics Workshop. The workshop was organized by the American Institute of Aeronautics and Astronautics, Air Breathing Propulsion Systems Integration Technical Committee with the purpose of assessing the accuracy of computational fluid dynamics for air breathing propulsion applications. Attendees included representatives from government, industry, academia, and commercial software companies. Participants were encouraged to explore and discuss all aspects of the simulation process including the effects of mesh type and refinement, solver numerical schemes, and turbulence modeling. The first set of challenge cases involved computing the thrust and discharge coefficients for a 25deg conical nozzle for a range of nozzle pressure ratios between 1.4 and 7.0. Participants were also asked to simulate two cases in which the 25deg conical nozzle was bifurcated by a solid plate, resulting in vortex shedding (NPR=1.6) and shifted plume shock (NPR=4.0). A second set of nozzle cases involved computing the discharge and thrust coefficients for a convergent dual stream nozzle for a range of subsonic nozzle pressure ratios. The workshop committee also compared the plume mixing of these cases across various codes and models. The final test case was a serpentine inlet diffuser with an outlet to inlet area ratio of 1.52 and an offset of 1.34 times the inlet diameter. Boundary layer profiles, wall static pressure, and total pressure at downstream rake locations were examined.
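As a reminder of the quantities being benchmarked, the standard textbook definitions of the nozzle discharge and gross thrust coefficients are given below; they are included for orientation only and are not taken from the workshop results.

```latex
% Textbook definitions (not taken from the workshop write-up):
% discharge coefficient = actual over ideal mass flow,
% thrust coefficient   = measured gross thrust over ideal (fully expanded) thrust.
\[
  C_d = \frac{\dot{m}_{\mathrm{actual}}}{\dot{m}_{\mathrm{ideal}}},
  \qquad
  C_{fg} = \frac{F_{\mathrm{gross}}}{\dot{m}_{\mathrm{actual}}\, V_{\mathrm{ideal}}},
  \qquad
  V_{\mathrm{ideal}} = \sqrt{\frac{2\gamma R T_t}{\gamma-1}
      \left[1-\left(\frac{p_\infty}{p_t}\right)^{(\gamma-1)/\gamma}\right]},
\]
where \(p_t/p_\infty\) is the nozzle pressure ratio (NPR), \(T_t\) the total temperature,
\(\gamma\) the ratio of specific heats and \(R\) the gas constant.
```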

Dippold, Vance III; Foster, Lancert; Mankbadi, Mina

2014-01-01

146

Display system software for the integration of an ADAGE 3000 programmable display generator into the solid modeling package C.A.D. software  

Science.gov (United States)

A software system that integrates an ADAGE 3000 Programmable Display Generator into a C.A.D. software package known as the Solid Modeling Program is described. The Solid Modeling Program (SMP) is an interactive program that is used to model complex solid objects through the composition of primitive geometric entities. In addition, SMP provides extensive facilities for model editing and display. The ADAGE 3000 Programmable Display Generator (PDG) is a color, raster scan, programmable display generator with a 32-bit bit-slice, bipolar microprocessor (BPS). The modularity of the system architecture and the width and speed of the system bus allow for additional co-processors in the system. These co-processors combine to provide efficient operations on and rendering of graphics entities. The resulting software system takes advantage of the graphics capabilities of the PDG in the operation of SMP by distributing its processing modules between the host and the PDG. Initially, the target host computer was a PRIME 850, which was later substituted with a VAX-11/785. Two versions of the software system were developed, Phase I and Phase II. In Phase I, the ADAGE 3000 is used as a frame buffer. In Phase II, SMP was functionally partitioned and some of its functions were implemented in the ADAGE 3000 by means of ADAGE's SOLID 3000 software package.

Montoya, R. J.; Lane, H. H., Jr.

1986-01-01

147

Evolution of a Reconfigurable Processing Platform for a Next Generation Space Software Defined Radio  

Science.gov (United States)

The National Aeronautics and Space Administration (NASA)/Harris Ka-Band Software Defined Radio (SDR) is the first fully reprogrammable, space-qualified SDR operating in the Ka-Band frequency range. Providing exceptionally higher data communication rates than previously possible, this SDR offers in-orbit reconfiguration, multi-waveform operation, and fast deployment due to its highly modular hardware and software architecture. Currently in operation on the International Space Station (ISS), this new paradigm of reconfigurable technology is enabling experimenters to investigate navigation and networking in the space environment. The modular SDR and the NASA-developed Space Telecommunications Radio System (STRS) architecture standard are the basis for Harris' reusable, digital signal processing space platform trademarked as AppSTAR. As a result, two new space radio products are a synthetic aperture radar payload and an Automatic Dependent Surveillance-Broadcast (ADS-B) receiver. In addition, Harris is currently developing many new products similar to the Ka-Band software defined radio for other applications. For NASA's next generation flight Ka-Band radio development, leveraging these advancements could lead to a more robust and more capable software defined radio. The space environment has special considerations different from terrestrial applications that must be considered for any system operated in space. Each space mission has unique requirements that can make these systems unique. These unique requirements can make products that are expensive and limited in reuse. Space systems put a premium on size, weight and power. A key trade is the amount of reconfigurability in a space system. The more reconfigurable the hardware platform, the easier it is to adapt the platform to the next mission, and this reduces the amount of non-recurring engineering costs. However, more reconfigurable platforms often use more spacecraft resources. Software has similar considerations to hardware. Having an architecture standard promotes reuse of software and firmware. Space platforms have limited processor capability, which makes the trade on the amount of flexibility paramount.

Kacpura, Thomas J.; Downey, Joseph A.; Anderson, Keffery R.; Baldwin, Keith

2014-01-01

148

Radcalc for Windows 2.0 transportation packaging software to determine hydrogen generation and transportation classification  

Energy Technology Data Exchange (ETDEWEB)

Radcalc for Windows is a user-friendly, menu-driven, Windows-compatible software program with applications in the transportation of radioactive materials. It calculates the radiolytic generation of hydrogen gas in the matrix of low-level and high-level radioactive wastes. It also calculates the pressure buildup due to hydrogen and the decay heat generated in a package at seal time. It computes the quantity of a radionuclide and its associated products for a given period of time. In addition, the code categorizes shipment quantities as reportable quantity (RQ), radioactive Type A or Type B, limited quantity (LQ), low specific activity (LSA), highway route controlled quantity (HRCQ), and fissile excepted, using US Department of Transportation (DOT) definitions and methodologies.
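To make the hydrogen-generation bookkeeping concrete, here is a minimal sketch of the standard G-value approach; it is not Radcalc's implementation, and the decay heat, absorbed fraction, G(H2), free volume and temperature used are illustrative assumptions.

```python
# Illustrative G-value calculation, NOT the Radcalc implementation.
# Moles of H2 per second = (absorbed decay power in eV/s) * G(H2)/100 / N_A.
EV_PER_JOULE = 6.241509e18      # eV per joule
AVOGADRO = 6.02214076e23        # molecules per mole
R_GAS = 8.314462                # J/(mol K)

def hydrogen_generation_rate(decay_heat_w, absorbed_fraction, g_h2):
    """g_h2: radiolytic yield in molecules of H2 per 100 eV absorbed (waste-form specific)."""
    absorbed_ev_per_s = decay_heat_w * absorbed_fraction * EV_PER_JOULE
    molecules_per_s = absorbed_ev_per_s * g_h2 / 100.0
    return molecules_per_s / AVOGADRO            # mol H2 / s

def pressure_buildup(mol_h2, free_volume_m3, temperature_k):
    """Ideal-gas estimate of the added partial pressure (Pa) from generated hydrogen."""
    return mol_h2 * R_GAS * temperature_k / free_volume_m3

# Example: 5 W decay heat, 80% absorbed in the waste matrix, G(H2) = 0.4 molecules/100 eV,
# sealed for one year with 50 litres of free volume at 300 K (all values illustrative).
rate = hydrogen_generation_rate(5.0, 0.8, 0.4)
mol_year = rate * 3600 * 24 * 365
print(f"H2 generation: {rate:.3e} mol/s, ~{mol_year:.2f} mol/yr")
print(f"Pressure increase: {pressure_buildup(mol_year, 0.050, 300.0) / 1e3:.1f} kPa")
```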

Green, J.R.

1996-10-21

149

Meteosat Second Generation station: processing software and computing architecture; Estacion de Recepcion de Imagenes del Satelite Meteosat Segunda generacion: Arquitectura Informatica y Software de Proceso  

Energy Technology Data Exchange (ETDEWEB)

The Renewable Energy Division of CIEMAT houses a specific station for receiving the Meteosat Second Generation images, which is of interest in the works being carried out concerning the solar radiation derived from satellite images. The complexity, the huge amount of information being received and the particular characteristics of the MSG images encouraged the design and development of a specific computer structure and the associated software as well, for a better and more suitable use of the images. This document describes the mentioned structure and software. (Author) 8 refs.

Martin, L.; Cony, M.; Navarro, A. A.; Zarzalejo, L. F.; Polo, J.

2010-05-01

150

Similar regulation of the synthesis of adenovirus fiber and of simian virus 40-specific proteins encoded by the helper-defective Ad2+SV40 hybrid viruses Ad2+ND5 and Ad2+ND4del.  

Digital Repository Infrastructure Vision for European Research (DRIVER)

Human adenoviruses fail to multiply effectively in monkey cells. The block to the replication of these viruses can be overcome by coinfection with simian virus 40 (SV40) or when part of the SV40 genome is integrated into and expressed as part of the adenovirus type 2 (Ad2) genome, as occurs in several Ad2+SV40 hybrid viruses, such as Ad2+ND1, Ad2+ND2, and Ad2+ND4. The SV40 helper-defective Ad2+SV40 hybrid viruses Ad2+ND5 and Ad2+ND4del were analyzed to determine why they are unable to grow ef...

Klockmann, U.; Klessig, D. F.; Deppert, W.

1985-01-01

151

DCT and Eigenvectors of Covariance of 1st and 2nd order Discrete fractional Brownian motion  

Digital Repository Infrastructure Vision for European Research (DRIVER)

This paper establishes connection between discrete cosine transform (DCT) and 1st and 2nd order discrete-time fractional Brownian motion process. It is proved that the eigenvectors of the auto-covariance matrix of a 1st and 2nd order discrete-time fractional Brownian motion can be approximated by DCT basis vectors in the asymptotic sense. Perturbation in eigenvectors from DCT basis vectors is modeled using the analytic perturbation theory of linear operators.
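The following numpy sketch illustrates the flavour of the result for the 1st-order case: it builds the autocovariance matrix of fractional Gaussian noise (the increments of fBm), diagonalizes it, and measures how well each DCT-II basis vector aligns with some eigenvector. The sample size and Hurst exponent are arbitrary choices, and the construction is a generic illustration rather than the paper's derivation.

```python
# Sketch of the claim for the 1st-order case: eigenvectors of the autocovariance
# matrix of fractional Gaussian noise (increments of fBm) are close to DCT basis
# vectors. Uses the standard fGn autocovariance; parameters are illustrative.
import numpy as np
from scipy.fft import dct

def fgn_covariance(n, hurst):
    k = np.arange(n)
    gamma = 0.5 * (np.abs(k + 1) ** (2 * hurst)
                   - 2 * np.abs(k) ** (2 * hurst)
                   + np.abs(k - 1) ** (2 * hurst))
    i, j = np.meshgrid(k, k)
    return gamma[np.abs(i - j)]

n, H = 64, 0.8
C = fgn_covariance(n, H)
eigvals, eigvecs = np.linalg.eigh(C)                  # eigenvectors of the covariance
dct_basis = dct(np.eye(n), type=2, norm="ortho")       # columns are DCT-II basis vectors

# For each DCT vector, the largest |correlation| with any eigenvector should be near 1.
match = np.max(np.abs(dct_basis.T @ eigvecs), axis=1)
print(f"worst-case alignment between DCT vectors and eigenvectors: {match.min():.3f}")
```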

Gupta, Anubha; Joshi, Shivdutt

2013-01-01

152

Quetzalcoatl: Una Herramienta para Generar Contratos de Desarrollo de Software en Entornos de Outsourcing / Quetzalcoatl: A Tool for Generating Software Development Contracts in Outsourcing Environments  

Scientific Electronic Library Online (English)

Full Text Available SciELO Portugal | Language: Spanish Abstract in Spanish (translated) Outsourcing is currently one of the main work activities. However, the relationships between a client and a service provider are not strong enough to meet the expectations of the agreements. The outsourcing contract for software development projects is an alternative to this type of relationship. This article presents the architecture of the Quetzalcoatl tool, as well as the main functions the tool offers, all with the aim of generating and evaluating contracts for software development projects in outsourcing environments as support for SMEs. Abstract in English Nowadays, outsourcing is one of the most important work activities for software development companies. However, the relationships between a client and a service provider are not strong enough to meet the expectations of the agreements. The outsourcing contract for software development projects is an alternative to this type of relationship. This paper presents the architecture of the tool named Quetzalcoatl, as well as the main functions that this tool offers in order to generate and evaluate contracts for software development projects in outsourcing environments to support SMEs.

Jezreel, Mejía; Sergio D., Ixmatlahua; Alma I., Sánchez.

2014-03-01

153

Pragmatics Annotated Coloured Petri Nets for Protocol Software Generation and Verification  

DEFF Research Database (Denmark)

This paper presents the formal definition of Pragmatics Annotated Coloured Petri Nets (PA-CPNs). PA-CPNs represent a class of Coloured Petri Nets (CPNs) that are designed to support automated code generation of protocol software. PA-CPNs restrict the structure of CPN models and allow Petri net elements to be annotated with so-called pragmatics, which are exploited for code generation. The approach and tool for generating code is called PetriCode and has been discussed and evaluated in earlier work already. The contribution of this paper is to give a formal definition for PA-CPNs; in addition, we show how the structural restrictions of PA-CPNs can be exploited for making the verification of the modelled protocols more efficient. This is done by automatically deriving progress measures for the sweep-line method, and by introducing so-called service testers, which can be used to control the part of the state space that is to be explored for verification purposes.

Simonsen, Kent Inge; Kristensen, Lars Michael

2014-01-01

154

Next generation hyper-scale software and hardware systems for big data analytics  

CERN Document Server

Building on foundational technologies such as many-core systems, non-volatile memories and photonic interconnects, we describe some current technologies and future research to create real-time, big-data-analytics IT infrastructure. We will also briefly describe some of our biologically-inspired software and hardware architecture for creating radically new hyper-scale cognitive computing systems. About the speaker: Rich Friedrich is the director of Strategic Innovation and Research Services (SIRS) at HP Labs. In this strategic role, he is responsible for research investments in nano-technology, exascale computing, cyber security, information management, cloud computing, immersive interaction, sustainability, social computing and commercial digital printing. Rich's philosophy is to fuse strategy and inspiration to create compelling capabilities for next generation information devices, systems and services. Using essential insights gained from the metaphysics of innovation, he effectively leads ...

CERN. Geneva

2013-01-01

155

Next Generation Astronomical Data Processing using Big Data Technologies from the Apache Software Foundation  

Science.gov (United States)

In this era of exascale instruments for astronomy we must naturally develop next generation capabilities for the unprecedented data volume and velocity that will arrive due to the veracity of these ground-based sensors and observatories. Integrating scientific algorithms stewarded by scientific groups unobtrusively and rapidly; intelligently selecting data movement technologies; making use of cloud computing for storage and processing; and automatically extracting text and metadata and science from any type of file are all needed capabilities in this exciting time. Our group at NASA JPL has promoted the use of open source data management technologies available from the Apache Software Foundation (ASF) in pursuit of constructing next generation data management and processing systems for astronomical instruments including the Expanded Very Large Array (EVLA) in Socorro, NM and the Atacama Large Millimetre/Submillimetre Array (ALMA); as well as for the KAT-7 project led by SKA South Africa as a precursor to the full MeerKAT telescope. In addition, we are currently funded by the National Science Foundation in the US to work with MIT Haystack Observatory and the University of Cambridge in the UK to construct a Radio Array of Portable Interferometric Devices (RAPID) that will undoubtedly draw from the rich technology advances underway. NASA JPL is investing in a strategic initiative for Big Data that is pulling in these capabilities and technologies for astronomical instruments and also for Earth science remote sensing. In this talk I will describe the above collaborative efforts underway and point to solutions in open source from the Apache Software Foundation that can be deployed and used today and that are already bringing our teams and projects benefits. I will describe how others can take advantage of our experience and point towards future application and contribution of these tools.

Mattmann, Chris

2014-04-01

156

Software for generation and analysis of photoelastic fringes in plates with a single hole subjected to in-plane loads  

International Nuclear Information System (INIS)

A software package for generating and analyzing photoelastic images of infinite rectangular plates subjected to in-plane loads is presented. It allows the user to generate photoelastic images as produced in a polariscope fed by monochromatic light. Both circular and plane polariscopes in conditions of dark or light field can be selected. Tools for obtaining light intensity distributions along horizontal and vertical lines and for extracting the darkest regions of photoelastic fringes are also available. The extraction of such regions can be done by digital image processing (DIP). This process produces thin lines, from which the principal stresses and the stress intensity factor used in fracture mechanics can be obtained. The software was developed to run in a DOS environment in Super VGA mode. The synthetic photoelastic images are generated in 64 gray levels. This software is a useful tool for teaching the fundamentals of photoelasticity and will help researchers in the development of photoelastic experiments. (author). 6 refs., 7 figs
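As a minimal illustration of the synthesis step (not the authors' DOS/Super VGA implementation), the snippet below evaluates the textbook dark-field isochromatic intensity, I = I0·sin²(πN), with the fringe order N obtained from the stress-optic law; the material fringe value, thickness and stress field are made-up numbers.

```python
# Hedged sketch of synthetic dark-field isochromatic fringes for monochromatic
# light, using the textbook stress-optic relation; all values are illustrative.
import numpy as np

def dark_field_intensity(sigma1, sigma2, thickness, fringe_constant, i0=1.0):
    """I = I0 * sin^2(pi * N), with fringe order
    N = thickness * (sigma1 - sigma2) / fringe_constant (material fringe value)."""
    fringe_order = thickness * (sigma1 - sigma2) / fringe_constant
    return i0 * np.sin(np.pi * fringe_order) ** 2

# Principal-stress difference varying linearly across a strip in pure bending,
# which gives the familiar pattern of straight, equally spaced fringes.
y = np.linspace(-0.01, 0.01, 21)             # position across a 20 mm strip, m
delta_sigma = 7.0e6 * (y / 0.01)              # sigma1 - sigma2 in Pa (illustrative)
intensity = dark_field_intensity(delta_sigma, 0.0, thickness=0.005, fringe_constant=7000.0)
print(np.round(intensity, 3))                 # minima sit wherever the fringe order is an integer
```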

157

Computer-generated holograms (CGH) realization: the integration of dedicated software tool with digital slides printer  

Science.gov (United States)

The latest generation of digital printers is usually characterized by a spatial resolution high enough to allow the designer to produce a binary CGH directly on a transparent film, avoiding photographic reduction techniques. These devices are able to produce slides or offset prints. Furthermore, services supplied by commercial printing companies provide an inexpensive method to rapidly verify the validity of a design by means of a test-and-trial process. Notably, this low-cost approach appears to be suitable for a didactical environment. On the basis of these considerations, a set of software tools able to design CGHs has been developed. The guidelines inspiring the work have been the following: (1) a ray-tracing approach, considering the object to be reproduced as a source of spherical waves; (2) optimization and speed-up of the algorithms used, in order to produce portable code, runnable on several hardware platforms. In this paper, calculation methods to obtain some fundamental geometric functions (points, lines, curves) are described. Furthermore, by the juxtaposition of these primitive functions it is possible to produce the holograms of more complex objects. Many examples of generated CGHs are presented.
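The following sketch shows the point-source construction in its simplest form: spherical waves from a few object points are summed at the hologram plane, interfered with an off-axis plane reference wave, and binarized for printing. Wavelength, geometry and sampling are illustrative assumptions, not the authors' parameters.

```python
# Minimal binary-CGH sketch: spherical waves from object points interfere with an
# off-axis plane reference; the intensity is thresholded to a printable pattern.
import numpy as np

wavelength = 633e-9                         # He-Ne red, m (illustrative)
k = 2 * np.pi / wavelength
n_pix, pitch = 512, 10e-6                    # hologram plane: 512x512 cells of 10 um
x = (np.arange(n_pix) - n_pix / 2) * pitch
X, Y = np.meshgrid(x, x)

# A few object points (x, y, z in metres) and an off-axis plane reference wave
points = [(0.0, 0.0, 0.05), (1.0e-3, 0.5e-3, 0.05), (-0.8e-3, -0.2e-3, 0.06)]
reference = np.exp(1j * k * np.sin(np.deg2rad(1.0)) * X)

field = np.zeros_like(X, dtype=complex)
for px, py, pz in points:
    r = np.sqrt((X - px) ** 2 + (Y - py) ** 2 + pz ** 2)
    field += np.exp(1j * k * r) / r          # spherical wave from each object point
field /= np.abs(field).max()                 # balance object and reference amplitudes

interference = np.abs(field + reference) ** 2
binary_cgh = (interference > np.median(interference)).astype(np.uint8)
print(binary_cgh.shape, float(binary_cgh.mean()))   # ~0.5 fill factor after median threshold
```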

Guarnieri, Vittorio; Francini, Franco

1997-12-01

158

Phase relationship in the TiO2-Nd2O3 pseudo-binary system  

International Nuclear Information System (INIS)

Highlights: • DSC and XRD measurements for the TiO2-Nd2O3 system. • Nd2Ti2O7, Nd2TiO5, Nd2Ti3O9 and Nd4Ti9O24 exist. • Nd2Ti4O11 and Nd4Ti9O24 were the same compound. • Thermodynamic calculation on the TiO2-Nd2O3 system. - Abstract: Phase equilibria in the TiO2-Nd2O3 system have been experimentally investigated via X-ray diffraction (XRD) and differential scanning calorimetry (DSC). Four compounds, Nd2Ti2O7, Nd2TiO5, Nd2Ti3O9 and Nd4Ti9O24, were confirmed to exist. The Nd2Ti4O11 reported in the literature was proved to be the same compound as Nd4Ti9O24, and the reported polymorphic phase transformation of Nd2Ti4O11 at 1373 K was not detected. All the phase diagram data from both the literature and the present work were critically reviewed and taken into account during the thermodynamic optimization of the TiO2-Nd2O3 system. A set of consistent thermodynamic parameters, which can explain most of the experimental data of the TiO2-Nd2O3 system, was achieved. The calculated phase diagram of the TiO2-Nd2O3 system was provided.

159

A new analytic solution for 2nd-order Fermi acceleration  

International Nuclear Information System (INIS)

A new analytic solution for 2nd-order Fermi acceleration is presented. In particular, we consider time-dependent rates for stochastic acceleration, diffusive and convective escape as well as adiabatic losses. The power law index q of the turbulence spectrum is unconstrained and can therefore account for Kolmogorov (q = 5/3) and Kraichnan (q = 3/2) turbulence, Bohm diffusion (q = 1) as well as the hard-sphere approximation (q = 2). This considerably improves beyond solutions known to date and will prove a useful tool for more realistic modelling of 2nd-order Fermi acceleration in a variety of astrophysical environments
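As background for the kind of solution discussed above, a commonly used form of the isotropic momentum-diffusion (Fokker-Planck) equation for 2nd-order Fermi acceleration is sketched below; the notation is generic and is not taken from the paper.

```latex
% A commonly used momentum-diffusion equation for 2nd-order Fermi acceleration;
% generic notation, not the paper's.
\[
  \frac{\partial f}{\partial t}
  = \frac{1}{p^{2}}\frac{\partial}{\partial p}
    \left[ p^{2} D_{pp}(p,t)\,\frac{\partial f}{\partial p}
           + p^{3}\,\frac{f}{t_{\mathrm{ad}}(t)} \right]
  - \frac{f}{t_{\mathrm{esc}}(t)} + Q(p,t),
  \qquad
  D_{pp} \propto p^{q},
\]
where $q=5/3$ (Kolmogorov), $q=3/2$ (Kraichnan), $q=1$ (Bohm) and $q=2$ (hard sphere),
$t_{\mathrm{esc}}$ combines diffusive and convective escape, and $t_{\mathrm{ad}}$
parametrizes adiabatic losses.
```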

160

Oxidized Modified Proteins in the Atherosclerosis Genesis at a Diabetes Mellitus of the 2nd Type  

Directory of Open Access Journals (Sweden)

Full Text Available The role of free-radical oxidation in atherogenesis in patients with diabetes mellitus of the 2nd type and ischemic heart disease is considered. It is established that oxidized modified proteins are in tight contact with lipid peroxidation and support free-radical oxidation in this category of patients. It is demonstrated that oxidized modified protein detection can serve as both an early and an integral test of metabolic disturbances and, in perspective, of hemostasiologic disturbances in diabetes mellitus of the 2nd type.

O.V. Zanozina

2009-11-01

 
 
 
 
161

A study of Ca 4p1/2,3/2nd (J = 3) autoionizing states  

International Nuclear Information System (INIS)

The spectral properties of Ca 4pnd (J = 3) autoionizing states have been studied by employing the combination of multichannel quantum defect theory (MQDT) with the K-matrix method. The cross section of 4p3/215d excited from 4s15d and energy levels of 4pnd (J = 3) are calculated by using a theoretical model with more complete channels, and the configuration interaction is analysed in detail between 4p1/2nd, 4p3/2nd3/2,5/2 and 4p1/2,3/2ng of five Rydberg series

162

Power system economics : the Nordic electricity market. 2nd ed.  

Energy Technology Data Exchange (ETDEWEB)

This book written as a textbook for students of engineering is designed for the Norwegian Power Markets course which is part of the Energy and Environment Master's Program and the recently established international MSc program in Electric Power Engineering. As the title indicates, the book deals with both power system economics in general and the practical implementation and experience from the Nordic market. Areas of coverage include: -- Restructuring/deregulation of the power supply system -- Grid access including tariffs and congestion management -- Generation planning -- Market modeling -- Ancillary services -- Regulation of grid monopolies. Although Power Systems Economics is written primarily as a textbook for students, other readers will also find the book interesting. It deals with problems that have been subject of considerable attention in the power sector for some years and it addresses issues that are still relevant and important. (au)

Wangensteen, Ivar

2012-07-01

163

Power system economics : the Nordic electricity market. 2nd ed.  

International Nuclear Information System (INIS)

This book written as a textbook for students of engineering is designed for the Norwegian Power Markets course which is part of the Energy and Environment Master's Program and the recently established international MSc program in Electric Power Engineering. As the title indicates, the book deals with both power system economics in general and the practical implementation and experience from the Nordic market. Areas of coverage include: -- Restructuring/deregulation of the power supply system -- Grid access including tariffs and congestion management -- Generation planning -- Market modeling -- Ancillary services -- Regulation of grid monopolies. Although Power Systems Economics is written primarily as a textbook for students, other readers will also find the book interesting. It deals with problems that have been subject of considerable attention in the power sector for some years and it addresses issues that are still relevant and important. (au)

164

Experimental investigation of the generation mechanism of aerodynamic noise. 2nd Report. On correlation between surface pressure fluctuation and aerodynamic sound radiated from a circular cylinder; Kurikion no hassei kiko ni kansuru jikken kaiseki. 2. Hyomen atsuryoku hendo to kurikion no sogo sokan ni tsuite  

Energy Technology Data Exchange (ETDEWEB)

The mechanism of aerodynamic sound generation from a circular cylinder is investigated experimentally using coherence functions between the surface pressure fluctuation and the radiated sound at Reynolds numbers from 10^4 to 1.4 × 10^5. The correlation between the surface pressure fluctuation and the radiated sound at the fundamental frequency is good, indicating the strong contribution of ordered structures to aerodynamic sound generation. The characteristic length of the ordered structure Lc is estimated using the integral scale of the spanwise coherence function of surface pressure fluctuations. The sound pressure is calculated using a modified Curle's equation, with the characteristic length and measured surface pressure fluctuations. The predicted spectra of radiated sound are in good agreement with those actually measured up to five times the fundamental frequency. This result shows that Lc is useful for estimating the character of radiated sound from a circular cylinder. 16 refs., 13 figs., 2 tabs.
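For orientation, the far-field, compact-body form of Curle's equation is reproduced below in generic textbook notation; the "modified" equation used in the study additionally weights the surface-pressure integral by the spanwise correlation length Lc, which is not shown here.

```latex
% Far-field, compact-body form of Curle's equation (textbook notation).
\[
  p'(\mathbf{x},t) \;\approx\;
  \frac{x_i}{4\pi c_0\,|\mathbf{x}|^{2}}\,
  \frac{\partial}{\partial t}
  \oint_{S} p\!\left(\mathbf{y},\,t-\tfrac{|\mathbf{x}|}{c_0}\right) n_i\,\mathrm{d}S(\mathbf{y}),
\]
so the radiated sound follows from the measured surface-pressure fluctuations, their
time derivative, and the spanwise extent $L_c$ over which they remain coherent.
```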

Iida, A.; Kato, C.; Otaguro, T. [Hitachi, Ltd., Tokyo (Japan); Fujita, H. [Nihon University, Tokyo (Japan). College of Science and Technology

1996-12-25

165

2nd international expert meeting straw power; 2. Internationale Fachtagung Strohenergie  

Energy Technology Data Exchange (ETDEWEB)

Within the 2nd Guelzow expert discussions at 29th to 30th March, 2012 in Berlin (Federal Republic of Germany), the following lectures were held: (1) Promotion of the utilisation of straw in Germany (A. Schuette); (2) The significance of straw in the heat and power generation in EU-27 member states in 2020 and in 2030 under consideration of the costs and sustainability criteria (C. Panoutsou); (3) State of he art of the energetic utilization of hay goods in Europe (D. Thraen); (4) Incineration technological characterisation of straw based on analysis data as well as measured data of large-scale installations (I. Obernberger); (5) Energetic utilization of hay goods in Germany (T. Hering); (6) Actual state of the art towards establishing the first German straw thermal power station (R. Knieper); (7) Straw thermal power plants at agricultural sow farms and poultry farms (H. Heilmann); (8) Country report power from straw in Denmark (A. Evald); (9) Country report power from straw in Poland (J. Antonowicz); (10) Country report power from straw in China (J. Zhang); (11) Energetic utilisation of straw in Czechia (D. Andert); (12) Mobile pelletization of straw (S. Auth); (13) Experiences with the straw thermal power plant from Vattenfall (N. Kirkegaard); (14) Available straw potentials in Germany (potential, straw provision costs) (C. Weiser); (15) Standardization of hay good and test fuels - Classification and development of product standards (M. Englisch); (16) Measures of reduction of emissions at hay good incinerators (V. Lenz); (17) Fermentation of straw - State of the art and perspectives (G. Reinhold); (18) Cellulosis - Ethanol from agricultural residues - Sustainable biofuels (A. Hartmair); (19) Syngas by fermentation of straw (N. Dahmen); (20) Construction using straw (D. Scharmer).

NONE

2012-06-15

166

Proceedings of the 2nd KUR symposium on hyperfine interactions  

International Nuclear Information System (INIS)

Hyperfine interactions between a nuclear spin and an electronic spin discovered from hyperfine splitting in atomic optical spectra have been utilized not only for the determination of nuclear parameters in nuclear physics but also for novel experimental techniques in many fields such as solid state physics, chemistry, biology, mineralogy and for diagnostic methods in medical science. Experimental techniques based on hyperfine interactions yield information about microscopic states of matter so that they are important in material science. Probes for material research using hyperfine interactions have been nuclei in the ground state and radioactive isotopes prepared with nuclear reactors or particle accelerators. But utilization of muons generated from accelerators is recently growing. Such wide spread application of hyperfine interaction techniques gives rise to some difficulty in collaboration among various research fields. In these circumstances, the present workshop was planned after four years since the last KUR symposium on the same subject. This report summarizes the contributions to the workshop in order to be available for the studies of hyperfine interactions. (J.P.N.)

167

Measurement of leak radiation in labyrinth. Measurement of leak radiation in labyrinth of the 2nd light-ion room  

International Nuclear Information System (INIS)

Assessment of radiation leaked or diffused into the passages of a labyrinth in a facility equipped with an accelerator is essential for the shield design of such a facility. In this study, data available to define a benchmark were collected by measuring the radiation in the labyrinth of the 2nd light-ion room of TIARA at the Takasaki Institute of the Japan Atomic Energy Research Institute. Using neutrons generated by injecting 67 MeV protons into a thick copper target, the leaked radiation was determined. The measurement items were as follows: 1) energy spectrum of the neutron source, 2) distributions of neutron and γ-ray intensities in the radiation room, 3) dose and intensity distributions and energy spectra of neutrons and γ-rays in the labyrinth. Based on these benchmark data, we plan to evaluate the simple empirical-formula-type and Monte Carlo-type calculation codes which are currently used for designing radiation shields. (M.N.)

168

Implementation of a Software-Defined Radio Transceiver on High-Speed Digitizer/Generator SDR14  

Digital Repository Infrastructure Vision for European Research (DRIVER)

This thesis describes the specification, design and implementation of a software-defined radio system on a two-channel 14-bit digitizer/generator. The multi-stage interpolations and decimations which are required to operate two analog-to-digital converters at 800 megasamples per second (MSps) and two digital-to-analog converters at 1600 MSps from a 25 MSps software-side interface, were designed and implemented. Quadrature processing was used throughout the system, and a combination of fine-tu...
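As a rough, host-side illustration of the rate conversion involved (not the thesis's fixed-point FPGA design), the sketch below decimates an 800 MSps stream by a factor of 32 in stages using scipy; the test signal and stage ordering are arbitrary choices.

```python
# Conceptual, floating-point sketch of the 32x multi-stage decimation needed to
# bring an 800 MSps ADC stream down to a 25 MSps software-side interface.
import numpy as np
from scipy.signal import decimate

fs_adc = 800e6
t = np.arange(200_000) / fs_adc
adc_samples = np.cos(2 * np.pi * 5e6 * t) + 0.01 * np.random.randn(t.size)

# Decimate in stages (2*2*2*4 = 32) rather than 32x at once, which keeps the
# anti-aliasing filters short -- the usual reason for a multi-stage design.
x = adc_samples
for factor in (2, 2, 2, 4):
    x = decimate(x, factor, ftype="fir", zero_phase=True)

print(f"output rate: {fs_adc / 32 / 1e6:.1f} MSps, {x.size} samples")
```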

Björklund, Daniel

2012-01-01

169

Preparations for the 2nd IGS Reprocessing Campaign  

Science.gov (United States)

The Analysis Centers (ACs) of the International GNSS Service (IGS) are now completing their first collective reanalysis of the history of global network GPS data collected since 1994. A consistent set of the latest models and methodology is being used to generate GPS orbits, Earth orientation parameters (EOPs), station coordinate time series, and station and satellite clocks. These results have been contributed to the new ITRF2008 multi-technique terrestrial reference frame and EOP combination. Preparations will begin during 2010 for the next IGS reprocessing effort. Despite the major progress made in the first IGS reanalysis, further analysis improvements remain to be implemented. The list includes: add GLONASS as well as GPS observations; adopt a new reference frame based on ITRF2008; update the IGS antenna calibrations based on the first reprocessing results and other sources; use the new EGM2008 geopotential model with perhaps revised time-varying coefficients; implement a model for previously neglected higher-order ionospheric effects; consider the satellite dynamical effects of Earth albedo reflection and re-radiated thermal emissions; apply various refinements in modeling tropospheric delays; include station displacements due the S1 and S2 atmospheric pressure tides; use a new model for the subdaily EOP tidal variations, if available; reconsider the handling of EOP constraints and a prioris by ACs; incorporate all high-order relativistic effects; and revisit the treatment of all analysis constraints to remove as many as possible and to understand better the effects of those that remain. Other operational aspects need to be evaluated also, such as how best to treat non-tidal loading station displacements, whether to continue forming weekly SINEX solutions or to move instead to daily integrations, and more consistent and rigorous methods to combine AC solutions.

Ray, J.

2009-12-01

170

Library perceptions of using social software as blogs in the idea generation phase of service innovations : Lessons from an experiment  

DEFF Research Database (Denmark)

This article investigates the use of social software such as blogs to communicate with and to involve users in the idea generation process of service innovations. After a theoretical discussion of user involvement and more specifically user involvement using web-tools with specific focus on blogs, the article reports findings and lessons from a field experiment at a university library. In the experiment, a blog was established to collect service innovation ideas from the library users. The experiment shows that a blog may engage a limited number of users in the idea generation process and generate a useful, but modest amount of ideas.

Scupola, Ada; Nicolajsen, Hanne Westh

2013-01-01

171

Towards a "2nd Generation" of Quality Labels: a Proposal for the Evaluation of Territorial Quality Marks / Vers une «2ème génération» de labels de qualité: une proposition pour l'évaluation des marques de qualité territoriale / Hacia una "2" generación" de sellos de calidad: una propuesta para la evaluación de las marcas de calidad territorial  

Scientific Electronic Library Online (English)

Full Text Available SciELO Colombia | Language: English Abstract in Spanish (translated) Recent literature analyses the role of territorial specificities as the core of territorial rural development strategies based on differentiation. Unfortunately, the proliferation of quality assurance schemes is provoking a "labyrinth of labels", which dilutes local efforts to capitalize on rural specificities. A second generation of labels is currently being developed to simplify territorial differentiation. A number of territories in Southern Europe are basing their rural development strategies on the European Territorial Quality Mark (ETQM) project. This paper proposes an original methodology, designed and developed by the authors, for the evaluation of some of these second-generation labels. The methodology has been validated in fifteen rural territories, the pioneers of the ETQM in Spain. Abstract in English Recent literature analyses the role of territorial specificities as the core of territorial rural development strategies based on differentiation. Unfortunately, the proliferation of quality assurance schemes is provoking a "labyrinth of labels" which diffuses the local efforts for capitalizing on rural specificities. A second generation of labels is currently being developed to simplify the territorial differentiation message. A number of territories in Southern Europe are basing their rural development strategies by joining the so-called European Territorial Quality Mark (ETQM) Project. This paper proposes an original methodology, designed and developed by the authors, for the evaluation of some of these second-generation labels. This methodology has been validated in 15 rural territories as the pioneers of the ETQM in Spain.

Eduardo, Ramos; Dolores, Garrido.

2014-12-01

172

Evaluation of a Hand Washing Program for 2nd-Graders  

Science.gov (United States)

The purpose of this project was to determine if a multiple-week learner-centered hand washing program could improve hand hygiene behaviors of 2nd-graders in a northern Illinois public school system. Volunteers from the Rockford Hand Washing Coalition went into 19 different classrooms for 4 consecutive weeks and taught a learner-centered program.…

Tousman, Stuart; Arnold, Dani; Helland, Wealtha; Roth, Ruth; Heshelman, Nannatte; Castaneda, Oralia; Fischer, Emily; O'Neil, Kristen; Bileto, Stephanie

2007-01-01

173

All in a Day's Work: Careers Using Science, 2nd Edition (e-book)  

Science.gov (United States)

"Almost all careers in the 21st century require a working knowledge of science and mathematics," says Steve Metz, The Science Teacher field editor, in his introduction to All in a Day's Work, 2nd edition . "The pending retirement of 78 mi

Megan Sullivan

2009-06-23

174

Point classification of 2nd order ODEs: Tresse classification revisited and beyond  

CERN Document Server

In 1896 Tresse gave a complete description of relative differential invariants for the pseudogroup action of point transformations on the 2nd order ODEs. The purpose of this paper is to review, in light of modern geometric approach to PDEs, this classification and also discuss the role of absolute invariants and the equivalence problem.

Kruglikov, Boris

2008-01-01

175

Proceedings of the 2nd symposium on valves for coal conversion and utilization  

Energy Technology Data Exchange (ETDEWEB)

The 2nd symposium on valves for coal conversion and utilization was held October 15 to 17, 1980. It was sponsored by the US Department of Energy, Morgantown Energy Technology Center, in cooperation with the Valve Manufacturers Association. Seventeen papers have been entered individually into EDB and ERA. (LTN)

Maxfield, D.A. (ed.)

1981-01-01

176

Proceedings of the 2nd Mediterranean Conference on Information Technology Applications (ITA '97)  

International Nuclear Information System (INIS)

This is the proceedings of the 2nd Mediterranean Conference on Information Technology Applications, held in Nicosia, Cyprus, between 6-7 November, 1997. It contains 16 papers. Two of these fall within the scope of INIS and are dealing with Telemetry, Radiation Monitoring, Environment Monitoring, Radiation Accidents, Air Pollution Monitoring, Diagnosis, Computers, Radiology and Data Processing

177

8. Book Review: ‘Broken Bones: Anthropological Analysis of Blunt Force Trauma’ 2 nd edition, 2014  

Directory of Open Access Journals (Sweden)

Full Text Available 'Broken Bones: Anthropological Analysis of Blunt Force Trauma' 2nd edition, 2014. Editors: Vicki L. Wedel and Alison Galloway; Publisher: Charles C. Thomas, Illinois. pp 479 + xxiii. ISBN: 978-0-398-08768-5 (hardback); ISBN: 978-0-398-08769-2 (eBook).

R. Gaur

2014-04-01

178

Technical Adequacy of the Disruptive Behavior Rating Scale-2nd Edition--Self-Report  

Science.gov (United States)

This study provides preliminary analysis of the Disruptive Behavior Rating Scale-2nd Edition--Self-Report, which was designed to screen individuals aged 10 years and older for anxiety and behavior symptoms. Score reliability and internal and external facets of validity were good for a screening-level test.

Erford, Bradley T.; Miller, Emily M.; Isbister, Katherine

2015-01-01

179

Efficient FPGA implementation of 2nd order digital controllers using Matlab/Simulink  

Digital Repository Infrastructure Vision for European Research (DRIVER)

This paper explains a method for the design and implementation of a digital controller based on a Field Programmable Gate Array (FPGA) device. It is more compact and power efficient, and provides high-speed capabilities, compared to software-based PID controllers. The proposed method is based on the implementation of the digital controller as a digital filter using DSP architectures. The PID controller is designed using MATLAB and Simulink to generate a set of coefficients associated with the desired contro...
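As a hedged sketch of why such a controller maps naturally onto a 2nd-order digital filter, the snippet below implements the incremental (velocity) form of a discrete PID, whose three coefficients could be computed offline (for example in MATLAB/Simulink) and loaded into FPGA logic; the gains, sample time and toy plant are illustrative assumptions.

```python
# Sketch of the idea: a discrete PID in velocity form is the 2nd-order difference
# equation u[n] = u[n-1] + b0*e[n] + b1*e[n-1] + b2*e[n-2], so only three
# coefficients need to reach the FPGA. Gains and sample time are illustrative.

def pid_coefficients(kp, ki, kd, ts):
    """Backward-difference discretization of a parallel PID."""
    b0 = kp + ki * ts + kd / ts
    b1 = -kp - 2.0 * kd / ts
    b2 = kd / ts
    return b0, b1, b2

def pid_step(u_prev, e, e1, e2, coeffs):
    b0, b1, b2 = coeffs
    return u_prev + b0 * e + b1 * e1 + b2 * e2

# Closed loop against a trivial first-order plant, just to exercise the filter.
coeffs = pid_coefficients(kp=2.0, ki=10.0, kd=0.01, ts=1e-3)
y, u, e1, e2 = 0.0, 0.0, 0.0, 0.0
for _ in range(2000):
    e = 1.0 - y                              # unit step reference
    u = pid_step(u, e, e1, e2, coeffs)
    e2, e1 = e1, e
    y += 1e-3 * (u - y)                      # plant: dy/dt = u - y, Euler step
print(f"output after 2 s: {y:.3f}")          # should settle near the reference 1.0
```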

Vikas gupta; Khare, K.; Singh, R. P.

2011-01-01

180

Optimizing Software Testing and Test Case Generation by using the concept of Hamiltonian Paths  

Directory of Open Access Journals (Sweden)

Full Text Available Software testing is a trade-off between budget, time and quality. Broadly, software testing can be classified as Unit testing, Integration testing, Validation testing and System testing. By including the concept of Hamiltonian paths we can greatly improve the software testing of any project. This paper shows how Hamiltonian paths can be used for requirement specification. They can also be used in the acceptance testing phase to check whether all the user requirements are met. Further, it gives the necessary calculations and algorithms to show the feasibility of its implementation.
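The sketch below is a generic illustration of the underlying idea (not the paper's algorithm or its calculations): requirements are vertices of a graph, and a backtracking search for a Hamiltonian path yields a single tour that exercises every requirement exactly once. The example graph is hypothetical.

```python
# Illustrative backtracking search for a Hamiltonian path in a small graph of
# requirement states; visiting every vertex exactly once gives one acceptance
# tour that touches each requirement.
def hamiltonian_path(adjacency):
    """adjacency: dict mapping vertex -> set of neighbouring vertices."""
    vertices = list(adjacency)

    def extend(path, visited):
        if len(path) == len(vertices):
            return path
        for nxt in sorted(adjacency[path[-1]]):
            if nxt not in visited:
                result = extend(path + [nxt], visited | {nxt})
                if result:
                    return result
        return None

    for start in vertices:
        result = extend([start], {start})
        if result:
            return result
    return None

requirements_graph = {
    "login": {"profile", "search"},
    "profile": {"login", "checkout"},
    "search": {"login", "checkout"},
    "checkout": {"profile", "search"},
}
print(hamiltonian_path(requirements_graph))   # e.g. ['login', 'profile', 'checkout', 'search']
```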

Ankita Bihani

2014-04-01

 
 
 
 
181

Methodology and Toolset for Model Verification, Hardware/Software co-simulation, Performance Optimisation and Customisable Source-code generation  

DEFF Research Database (Denmark)

The MODUS project aims to provide a pragmatic and viable solution that will allow SMEs to substantially improve their positioning in the embedded-systems development market. The MODUS tool will provide a model verification and Hardware/Software co-simulation tool (TRIAL) and a performance optimisation and customisable source-code generation tool (TUNE). The concept centres on automated modelling and optimisation of embedded-systems development. The tool will enable model verification by guiding the selection of existing open-source model verification engines, based on the automated analysis of system properties, and by producing inputs to be fed into these engines; interfacing with standard (SystemC) simulation platforms for HW/SW co-simulation; customisable source-code generation that respects coding standards and conventions; and software performance-tuning optimisation through automated design transformations.

Berger, Michael Stübert; Soler, José

2013-01-01

182

Real-time infrared and semi-active laser scene generation software for AMSTAR hardware in the loop  

Science.gov (United States)

This paper describes the current research and development of advanced scene generation technology for integration into the Advanced Multispectral Simulation Test and Acceptance Resource (AMSTAR) Hardware-in-the-Loop (HWIL) facilities at the US Army AMRDEC and US Army Redstone Technical Test Center at Redstone Arsenal, AL. A real-time multi-mode (infra-red (IR) and semi-active laser (SAL)) scene generator for a tactical sensor system has been developed leveraging COTS hardware and open source software (OSS). A modular, plug-in architecture has been developed that supports rapid reconfiguration to permit the use of a variety of state data input sources, geometric model formats, and signature and material databases. The platform-independent software yields a cost-effective upgrade path to integrate best-of-breed personal computer (PC) graphics processing unit (GPU) technology.

Cosby, David S.; Lyles, Patrick; Bunfield, Dennis; Trimble, Darian; Rossi, Todd

2005-05-01

183

GSIMF: a web service based software and database management system for the next generation grids  

International Nuclear Information System (INIS)

To process the vast amount of data from high energy physics experiments, physicists rely on Computational and Data Grids; yet, the distribution, installation, and updating of a myriad of different versions of different programs over the Grid environment is complicated, time-consuming, and error-prone. Our Grid Software Installation Management Framework (GSIMF) is a set of Grid Services that has been developed for managing versioned and interdependent software applications and file-based databases over the Grid infrastructure. This set of Grid services provides a mechanism to install software packages on distributed Grid computing elements, thus automating the software and database installation management process on behalf of the users. This enables users to remotely install programs and tap into the computing power provided by Grids.

184

Input-profile-based software failure probability quantification for safety signal generation systems  

International Nuclear Information System (INIS)

The approaches for software failure probability estimation are mainly based on the results of testing. Test cases represent the inputs that are encountered in actual use. The test inputs for a safety-critical application such as a reactor protection system (RPS) of a nuclear power plant are the inputs which cause the activation of a protective action such as a reactor trip. A digital system treats inputs from instrumentation sensors as discrete digital values by using an analog-to-digital converter. The input profile must be determined in consideration of these characteristics for effective software failure probability quantification. Another important characteristic of software testing is that we do not have to repeat the test for the same input value, since the software response is deterministic for each specific digital input. With these considerations, we propose an effective software testing method for quantifying the failure probability. As an example application, the input profile of the digital RPS is developed based on typical plant data. The proposed method in this study is expected to provide a simple but realistic means to quantify the software failure probability based on the input profile and system dynamics.
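A hedged sketch of the overall idea follows: digitized sensor inputs are drawn from an assumed input profile, duplicate inputs are not re-tested because the software response is deterministic, and a simple zero-failure confidence bound is reported. The trip logic, profile parameters and bound are illustrative and are not the paper's quantification method.

```python
# Hedged illustration of input-profile-based test generation; everything below
# (trip logic, profile, bound) is an assumption for demonstration only.
import math
import random

def software_under_test(pressure_counts):
    """Toy trip logic standing in for the RPS software: trip above a setpoint."""
    return pressure_counts >= 3276            # 80% of a 12-bit ADC range

def run_profile_based_tests(n_draws, seed=1):
    rng = random.Random(seed)
    tested, failures = set(), 0
    for _ in range(n_draws):
        counts = int(rng.gauss(mu=3300, sigma=40))     # assumed input profile
        counts = max(0, min(4095, counts))              # 12-bit digitization
        if counts in tested:
            continue                                    # deterministic: no need to retest
        tested.add(counts)
        expected = counts >= 3276                       # oracle from the requirement
        if software_under_test(counts) != expected:
            failures += 1
    return len(tested), failures

n_distinct, failures = run_profile_based_tests(100_000)
if failures == 0:
    # Simple zero-failure bound at 95% confidence over the sampled profile.
    bound = 1 - math.exp(math.log(0.05) / n_distinct)
    print(f"{n_distinct} distinct inputs, no failures; p_failure < {bound:.2e} (95% conf.)")
```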

185

User's manual for the UNDERDOG [Underground Nuclear Depository Evaluation, Reduction, and Detailed Output Generator] data reduction software  

International Nuclear Information System (INIS)

UNDERDOG is a computer program that aids experimentalists in the process of data reduction. This software allows a user to reduce, extract, and generate displays of data collected at the WIPP site. UNDERDOG contains three major functional components: a Data Reduction package, a Data Analysis interface, and a Publication-Quality Output generator. It also maintains audit trails of all actions performed for quality assurance purposes and provides mechanisms which control an individual's access to the data. UNDERDOG was designed to run on a Digital Equipment Corporation VAX computer using the VMS operating system. 8 refs., 24 figs., 2 tabs

186

Extreme sea storm in the Mediterranean Sea. Trends during the 2nd half of the 20th century.  

Science.gov (United States)

Extreme sea storms in the Mediterranean Sea: trends during the 2nd half of the 20th century. Piero Lionello, University of Salento, piero.lionello@unisalento.it; Maria Barbara Galati, University of Salento, mariabarbara.galati@unisalento.it; Cosimo Pino, University of Salento, pino@le.infn.it. The analysis of extreme Significant Wave Height (SWH) values and their trend is crucial for planning and managing coastal defences and off-shore activities. The analysis provided by this study covers a 44-year long period (1958-2001). First, the WW3 (Wave Watch 3) model forced with the REMO-Hipocas regional model wind fields has been used for the hindcast of extreme SWH values over the Mediterranean basin with a 0.25 deg lat-lon resolution. Subsequently, the model results have been processed with ad hoc software to detect storms. GEV analysis has been performed and a set of indicators for extreme SWH have been computed, using the Mann-Kendall test for assessing the statistical significance of trends in different parameters such as the number of extreme events, their duration and their intensity. Results suggest a transition towards weaker extremes and a milder climate over most of the Mediterranean Sea.
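To make the statistical machinery concrete, the sketch below runs the two steps named above (a GEV fit to annual maxima and a Mann-Kendall trend test) on synthetic data; it does not use the WW3/REMO-Hipocas hindcast, and all parameter values are invented.

```python
# GEV fit to synthetic annual maxima of SWH plus a Mann-Kendall trend test.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
years = np.arange(1958, 2002)
annual_max_swh = stats.genextreme.rvs(c=-0.1, loc=4.0, scale=0.8,
                                      size=years.size, random_state=rng)

# GEV fit (note: scipy's shape parameter c is the negative of the usual xi)
c_hat, loc_hat, scale_hat = stats.genextreme.fit(annual_max_swh)
swh_50yr = stats.genextreme.ppf(1 - 1 / 50, c_hat, loc_hat, scale_hat)
print(f"fitted GEV: shape={c_hat:.2f}, loc={loc_hat:.2f} m, scale={scale_hat:.2f} m")
print(f"50-year return level: {swh_50yr:.2f} m")

def mann_kendall(x):
    """Two-sided Mann-Kendall trend test (no tie correction); returns (S, p-value)."""
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    return s, 2 * (1 - stats.norm.cdf(abs(z)))

s_stat, p_value = mann_kendall(annual_max_swh)
print(f"Mann-Kendall S={s_stat}, p={p_value:.2f}")
```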

Pino, C.; Lionello, P.; Galati, M. B.

2009-04-01

187

Generating Variable Strength Covering Array for Combinatorial Software Testing with Greedy Strategy  

Directory of Open Access Journals (Sweden)

Full Text Available Combinatorial testing is a practical and efficient software testing technique which can detect faults triggered by interactions among factors in software. Compared to classic fixed-strength combinatorial testing, variable-strength combinatorial testing usually uses fewer test cases to detect more interaction faults, because it takes the actual interaction relationships in the software into account. For a model of variable-strength combinatorial testing that has been proposed previously, two heuristic algorithms, which are based on the one-test-at-a-time greedy strategy, are proposed in this paper to generate variable-strength covering arrays as test suites in software testing. Experimental results show that, compared to some existing algorithms and tools, the two proposed algorithms have advantages in both execution effectiveness and the optimality of the size of the generated test suite.
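For readers unfamiliar with one-test-at-a-time greedy generation, the sketch below builds a pairwise (strength-2) covering array, i.e. the fixed-strength special case of the variable-strength arrays discussed; the candidate count and fallback rule are arbitrary choices, and this is not one of the paper's two algorithms.

```python
# Generic one-test-at-a-time greedy generator for a pairwise covering array.
import itertools
import random

def greedy_pairwise(factor_levels, candidates_per_row=30, seed=0):
    rng = random.Random(seed)
    k = len(factor_levels)
    uncovered = {((i, a), (j, b))
                 for i, j in itertools.combinations(range(k), 2)
                 for a in range(factor_levels[i])
                 for b in range(factor_levels[j])}

    def pairs(row):
        return {((i, row[i]), (j, row[j])) for i, j in itertools.combinations(range(k), 2)}

    suite = []
    while uncovered:
        best, best_gain = None, -1
        for _ in range(candidates_per_row):       # pick the best of several random rows
            row = tuple(rng.randrange(v) for v in factor_levels)
            gain = len(pairs(row) & uncovered)
            if gain > best_gain:
                best, best_gain = row, gain
        if best_gain == 0:                         # fall back: force-cover one remaining pair
            (i, a), (j, b) = next(iter(uncovered))
            forced = [rng.randrange(v) for v in factor_levels]
            forced[i], forced[j] = a, b
            best = tuple(forced)
        suite.append(best)
        uncovered -= pairs(best)
    return suite

tests = greedy_pairwise([3, 3, 2, 2])              # four factors with 3, 3, 2, 2 levels
print(len(tests), "tests")
for t in tests:
    print(t)
```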

Ziyuan Wang

2013-12-01

188

Revised data for 2nd version of nuclear criticality safety handbook/data collection  

International Nuclear Information System (INIS)

This paper outlines the data prepared for the 2nd version of the Data Collection of the Nuclear Criticality Safety Handbook. These data are discussed in the order of its preliminary table of contents. The nuclear characteristic parameters (k∞, M², D) were derived, and subcriticality judgment graphs were drawn for eleven kinds of fuels which are often encountered in the criticality safety evaluation of fuel cycle facilities. For the calculation of criticality data, benchmark calculations using the combination of the continuous-energy Monte Carlo criticality code MVP and the Japanese Evaluated Nuclear Data Library JENDL-3.2 were made. The calculation errors were evaluated for this combination. The implementation of the experimental results obtained by using the NUCEF facilities into the 2nd version of the Data Collection is under discussion; therefore, the related data were only mentioned. A database is being prepared to retrieve the revised data easily. (author)

189

eLEN2 — 2nd generation eLearning Exchange Networks.  

Digital Repository Infrastructure Vision for European Research (DRIVER)

Since May 2007 the authors of this paper have explored and evaluated the use, including relative merits and challenges of social networking within the context of higher education professional development programmes in France and in Britain (Marsh & Panckhurst, 2007; Panckhurst & Marsh, 2008). A social networking tool was adopted for Masters' level courses in order to try and establish an effective collaborative pedagogical environment and sense of community, by placing students at the centre ...

Panckhurst, Rachel; Marsh, Debra

2009-01-01

190

2nd generation lignocellulosic bioethanol: is torrefaction a possible approach to biomass pretreatment?  

Energy Technology Data Exchange (ETDEWEB)

Biomass pretreatment is a key and energy-consuming step for lignocellulosic ethanol production; it largely determines the energy efficiency and economic sustainability of the process. A new approach to biomass pretreatment for the lignocellulosic bioethanol chain could be mild torrefaction. Among other effects, biomass torrefaction improves the grindability of fibrous materials, thus reducing the energy demand for grinding the feedstock before hydrolysis, and opens up the biomass structure, making it more accessible to enzymes for hydrolysis. The aim of the preliminary experiments carried out was to gain a first understanding of the possibility of combining torrefaction and hydrolysis in lignocellulosic bioethanol processes, and to evaluate it in terms of sugar and ethanol yields; before this work, the possibility of hydrolyzing torrefied biomass had not yet been demonstrated. Biomass from olive pruning was torrefied at different conditions, namely 180-280 C for 60-120 min, ground and then used as substrate in hydrolysis experiments. The bioconversion was carried out at flask scale using a mixture of cellulolytic, hemicellulolytic and {beta}-glucosidase enzymes, and a commercial strain of Saccharomyces cerevisiae. The experiments demonstrated that torrefied biomass can be enzymatically hydrolyzed and fermented into ethanol, with yields comparable to ground untreated biomass, while saving electrical energy. The comparison between the bioconversion yields achieved using only raw ground biomass or torrefied and ground biomass highlighted that: (1) mild torrefaction conditions limit sugar degradation to 5-10%; and (2) torrefied biomass does not lead to enzymatic or fermentation inhibition. Energy consumption for ethanol production has been preliminarily estimated, and three different pretreatment chains, i.e., raw biomass grinding, torrefaction plus grinding, and steam explosion, were compared. Based on these preliminary results, steam explosion still has a significant advantage over the other two process chains. (orig.)

Chiaramonti, David; Rizzo, Andrea Maria; Prussi, Matteo [University of Florence, CREAR - Research Centre for Renewable Energy and RE-CORD, Florence (Italy); Tedeschi, Silvana; Zimbardi, Francesco; Braccio, Giacobbe; Viola, Egidio [ENEA - Laboratory of Technology and Equipment for Bioenergy and Solar Thermal, Rotondella (Italy); Pardelli, Paolo Taddei [Spike Renewables s.r.l., Florence (Italy)

2011-03-15

191

Utilisation of 2nd generation web technologies in master level vocational teacher training  

Directory of Open Access Journals (Sweden)

Full Text Available The Masters level Opportunities and Technological Innovation in Vocational Teacher Education project (project site: http://motivate.tmpk.bmf.hu/) aims to develop the use and management of virtual learning environments in the area of vocational teacher training, drawing on a well-established international partnership of institutions providing both technical and educational expertise. This paper gives an overall picture of the first results and products of the collaboration. We touch upon the goals, the assessments and the learning process of the "Multimedia and e-Learning: e-learning methods and tools" module in detail. The main cooperative and collaborative devices available in the virtual learning environment are presented. The communication during collaborative learning, the structured debate on the forum and the benefits of collaborative learning in a VLE are discussed at the end of this paper.

Péter Tóth

2009-03-01

192

Performance of 2nd Generation BaBar Resistive Plate Chambers  

Energy Technology Data Exchange (ETDEWEB)

The BaBar detector has operated nearly 200 Resistive Plate Chambers (RPCs), constructed as part of an upgrade of the forward endcap muon detector, for the past two years. The RPCs experience widely different background and luminosity-driven singles rates (0.01-10 Hz/cm{sup 2}) depending on position within the endcap. Some regions have integrated over 0.3 C/cm{sup 2}. RPC efficiency measured with cosmic rays is high and stable. The average efficiency measured with beam is also high. However, a few of the highest rate RPCs have suffered efficiency losses of 5-15%. Although constructed with improved techniques and minimal use of linseed oil, many of the RPCs, which are operated in streamer mode, have shown increased dark currents and noise rates that are correlated with the direction of the gas flow and the integrated current. Studies of the above aging effects are presented and correlated with detector operating conditions.

Anulli, F.; Baldini, R.; Calcaterra, A.; de Sangro, R.; Finocchiaro, G.; Patteri, P.; Piccolo, M.; Zallo, A.; /Frascati; Cheng, C.H.; Lange, D.J.; Wright, D.M.; /LLNL,; Messner, R.; Wisniewski, William J.; /SLAC; Pappagallo, M.; /Bari U. /INFN, Bari; Andreotti, M.; Bettoni, D.; Calabrese, R.; Cibinetto, G.; Luppi, E.; Negrini, M.; /Ferrara; Capra, R.; /Genoa U. /INFN, Genoa /Naples U. /INFN, Naples /Perugia U. /INFN, Perugia /Pisa U. /INFN, Pisa /Rome U. /INFN, Rome /Oregon U. /UC, Riverside

2005-07-12

193

PRODUCTION OF 2ND GENERATION BIOETHANOL FROM LUCERNE – OPTIMIZATION OF HYDROTHERMAL PRETREATMENT  

Directory of Open Access Journals (Sweden)

Full Text Available Lucerne (Medicago sativa) has many qualities associated with sustainable agriculture, such as nitrogen fixation and high biomass yield. Therefore, there is interest in whether lucerne is a suitable biomass substrate for bioethanol production, and whether hydrothermal pretreatment (HTT) of lucerne improves enzymatic convertibility, providing sufficient enzymatic conversion of carbohydrate to simple sugars for ethanol production. The HTT process was optimised for lucerne hay, and the pretreated biomass was assessed by carbohydrate analysis, inhibitor characterisation of liquid phases, and by simultaneous saccharification and fermentation (SSF) of the whole slurry with Cellubrix enzymes and Saccharomyces cerevisiae yeast. The optimal HTT conditions were 205°C for 5 minutes, resulting in a pentose recovery of 81% and an enzymatic convertibility of glucan to monomeric glucose of 74%, facilitating a conversion of 6.2% w/w of untreated material into bioethanol in SSF, which is equivalent to 1,100 litres of ethanol per hectare per year.
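
As a quick plausibility check of the figures quoted above (not taken from the paper itself), the per-hectare ethanol volume follows from the 6.2% w/w mass conversion once a lucerne dry-matter yield is assumed; the ~14 t DM/ha/yr value and the ethanol density below are assumptions used only to illustrate the arithmetic.

# Hypothetical back-calculation: what biomass yield makes a 6.2% w/w
# conversion correspond to ~1,100 L ethanol per hectare per year?
dry_matter_yield_kg_per_ha = 14_000   # assumed lucerne yield, kg DM/ha/yr
conversion_w_w = 0.062                # 6.2% of untreated material to ethanol (from the abstract)
ethanol_density_kg_per_l = 0.789      # density of ethanol at ~20 C

ethanol_kg = dry_matter_yield_kg_per_ha * conversion_w_w
ethanol_l = ethanol_kg / ethanol_density_kg_per_l
print(f"{ethanol_l:.0f} L ethanol per hectare per year")   # ~1,100 L/ha/yr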

Sune Tjalfe Thomsen,

2012-02-01

194

Utilisation of 2nd generation web technologies in master level vocational teacher training  

Digital Repository Infrastructure Vision for European Research (DRIVER)

The Masters level Opportunities and Technological Innovation in Vocational Teacher Education project (project site: http://motivate.tmpk.bmf.hu/) aims to develop the use and management of virtual learning environments in the area of vocational teacher training, drawing on a well established international partnership of institutions providing both technical and educational expertise. This paper gives an overall picture of the first results and products of the collaboration. We touch upon the g...

Péter Tóth

2009-01-01

195

Software concept of in-service diagnostic systems for nuclear steam generating facilities  

International Nuclear Information System (INIS)

The concept of software systems of in-service diagnostics is presented for the primary circuits of WWER-440 and WWER-1000 reactors. The basic and supplementary systems and user software are described for the collection, processing and evaluation of diagnostic signals from the primary circuits of the Dukovany and Bohunice nuclear power plants and the design is presented of the hierarchical structure of computers in the diagnostic systems of the Mochovce and Temelin nuclear power plants. The systems are operated using computers of Czechoslovak make of the ADT production series with operating systems RTE-II or DOS IV. (J.B.)

196

Automatic Generation of Machine Emulators: Efficient Synthesis of Robust Virtual Machines for Legacy Software Migration  

DEFF Research Database (Denmark)

As older mainframe architectures become obsolete, the corresponding legacy software is increasingly executed via platform emulators running on top of more modern commodity hardware. These emulators are virtual machines that often include a combination of interpreters and just-in-time compilers. Implementing interpreters and compilers for each combination of emulated and target platform independently of each other is a redundant and error-prone task. We describe an alternative approach that automatically synthesizes specialized virtual-machine interpreters and just-in-time compilers, which then execute on top of an existing software portability platform such as Java. The result is a considerably reduced implementation effort.
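
To illustrate the general idea of deriving an interpreter from a machine description (a toy sketch only, not the synthesis system described in the abstract), a dispatch loop can be driven by a declarative instruction table; the instruction set below is invented.

# Toy illustration: an interpreter "generated" from a declarative instruction
# specification for a tiny stack machine. Real emulator synthesis works from a
# full architecture description and also emits a just-in-time compiler.
SPEC = {
    "PUSH":  lambda st, arg: st.append(arg),
    "ADD":   lambda st, _:   st.append(st.pop() + st.pop()),
    "MUL":   lambda st, _:   st.append(st.pop() * st.pop()),
    "PRINT": lambda st, _:   print(st[-1]),
}

def make_interpreter(spec):
    """Build an interpreter function from the instruction specification table."""
    def run(program):
        stack = []
        for opcode, arg in program:
            spec[opcode](stack, arg)    # dispatch through the generated table
        return stack
    return run

interp = make_interpreter(SPEC)
interp([("PUSH", 6), ("PUSH", 7), ("MUL", None), ("PRINT", None)])   # prints 42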

Franz, Michael; Gal, Andreas

2006-01-01

197

Proceedings of the 2nd KAERI-JAERI Joint Seminar on the PIE Technology  

International Nuclear Information System (INIS)

This proceedings contains articles of the 2nd KAERI-JAERI Joint Seminar on the PIE Technology. It was held on Sep. 20-22, 1995 in Taejon, Korea. This proceedings is comprised of 4 sessions. The main topic titles of session are as follows: current status and future program on PIE, operating experiences of PIE facility, PIE techniques and evaluation of PIE data. (Yi, J. H.)

198

Experiences in commissioning of 1st and 2nd unit of NPP Mochovce  

International Nuclear Information System (INIS)

The paper focuses on the responsibilities and position of VUJE Trnava Inc. in the commissioning of new NPP units. It describes the sequence of activities during commissioning, from the preparation of measurements to the evaluation of measured data. The paper concentrates on the commissioning of the 1st and 2nd units of NPP Mochovce, which were commissioned in 1998 and 1999. Basic characteristics of the NPP Mochovce design are also included in this paper. (author)

199

Pb2+, Nd3+ and Eu3+ as local structural probes in sodium borate glasses  

International Nuclear Information System (INIS)

The structure of sodium borate glasses has been investigated using Pb2+, Nd3+ and Eu3+ as local probes. Materials with low modifier concentrations were found to have an approximately two-dimensional (2D) B-O network. Increasing the proportion of Na2O leads to a gradual 2D → 3D transition of the network former which is completely achieved in glasses containing 25 mol% Na2O. (Auth.)

200

Production of artificial ionospheric layers by frequency sweeping near the 2nd gyroharmonic  

Digital Repository Infrastructure Vision for European Research (DRIVER)

Artificial ionospheric plasmas descending from the background F-region have been observed on multiple occasions at the High Frequency Active Auroral Research Program (HAARP) facility since it reached full 3.6 MW power. Proximity of the transmitter frequency to the 2nd harmonic of the electron gyrofrequency (2fce) has been noted as a requirement for their occurrence, and their disappearance after only a few minutes has been attributed to the increasing...

Pedersen, T.; Mccarrick, M.; Reinisch, B.; Watkins, B.; Hamel, R.; Paznukhov, V.

2011-01-01

 
 
 
 
201

Oxidized Modified Proteins in the Atherosclerosis Genesis at a Diabetes Mellitus of the 2nd Type  

Digital Repository Infrastructure Vision for European Research (DRIVER)

The role of free-radical oxidation in atherogenesis in patients with diabetes mellitus of the 2nd type and ischemic heart disease is considered. It is established that oxidized modified proteins are closely linked with lipid peroxidation and sustain free-radical oxidation in this category of patients. It is demonstrated that the detection of oxidized modified proteins can serve as both an early and an integral test of metabolic disturbances and, in perspective, hemostasiologic disturbances...

Zanozina, O. V.; Borovkov, N. N.; Sherbatyuk, T. G.

2009-01-01

202

Proceedings of the 2nd IWDG International Whale Conference. Muc Mhara Ireland's Smallest Whale  

Digital Repository Infrastructure Vision for European Research (DRIVER)

Muc Mhara – Ireland’s smallest whale. Proceedings of the 2nd Irish Whale and Dolphin Group International Whale Conference. Papers presented include, “Introduction: The harbour porpoise or Muc Mhara”, “An Irish name for the humble harbour porpoise”, “Life in the Fast Lane: Ecology and Behaviour of harbour porpoises in the Gulf of Maine”, “The ecology of harbour porpoise (Phocoena phocoena) in Irish waters: what strandings programmes tell us.”, “Passive acoustic monitoring...

Berrow, S. D.; Deegan, B.

2010-01-01

203

Very large millimeter/submillimeter array toward search for 2nd Earth  

Science.gov (United States)

ALMA (Atacama Large Millimeter/submillimeter Array) is a revolutionary radio telescope and its early scientific operation has just started. It is expected that ALMA will resolve several cosmic questions and will give us a new cosmic view. Our passion for astronomy naturally goes beyond ALMA because we believe that 21st-century astronomy should pursue the new scientific frontier. In this conference, we propose a project for a future radio telescope to search for habitable planets and finally detect a 2nd Earth as a migratable planet. Detection of a 2nd Earth is one of the ultimate dreams not only for astronomers but also for every human being. To directly detect a 2nd Earth, we have to carefully design the sensitivity and angular resolution of the telescope by conducting a trade-off analysis between the confusion limit and the minimum detectable temperature. The result of the sensitivity analysis is derived assuming an array that has sixty-four (64) 50-m antennas with 25-µm surface accuracy mainly located within an area of 300 km (up to 3000 km), dual-polarization SSB receivers with the best noise temperature performance achieved by ALMA or better, and an IF bandwidth of 128 or 256 GHz. We tentatively name this telescope "Very Large Millimeter/Submillimeter Array (VLMSA)". Since this sensitivity is extremely high, we can have many opportunities to study galaxies, star formation, cosmology and, of course, the new scientific frontier.
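
The minimum detectable brightness temperature mentioned above is governed by the radiometer relation. As a hedged reminder of the scaling (the standard interferometry textbook form, not the specific VLMSA sensitivity figures), for an array of N antennas with system temperature T_sys, correlated bandwidth Δν and integration time τ:

\Delta T_{\mathrm{rms}} \;\propto\; \frac{T_{\mathrm{sys}}}{\sqrt{N(N-1)\,\Delta\nu\,\tau}}

Increasing the number of antennas, the IF bandwidth or the integration time therefore lowers the detectable temperature until source confusion rather than thermal noise becomes the limit, which is the trade-off the abstract refers to.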

Iguchi, Satoru; Saito, Masao

2012-09-01

204

A proposed approach for developing next-generation computational electromagnetics software  

Energy Technology Data Exchange (ETDEWEB)

Computations have become a tool coequal with mathematics and measurements as a means of performing electromagnetic analysis and design. This is demonstrated by the volume of articles and meeting presentations in which computational electromagnetics (CEM) is routinely employed to address an increasing variety of problems. Yet, in spite of the substantial resources invested in CEM software over the past three decades, little real progress seems to have been made towards providing the EM engineer software tools having a functionality equivalent to that expected of hardware instrumentation. Furthermore, the bulk of CEM software now available is generally of limited applicability to large, complex problems because most modeling codes employ a single field propagator, or analytical form, of Maxwell's Equations. The acknowledged advantages of hybrid models, i.e., those which employ different propagators in differing regions of a problem, are relatively unexploited. The thrust of this discussion is to propose a new approach designed to address both problems outlined above, integrating advances being made in both software and hardware development. After briefly reviewing the evolution of modeling CEM software to date and pointing out the deficiencies thereof, we describe an approach for making CEM tools more truly 'user friendly' called EMSES (Electromagnetic Modeling and Simulation Environment for Systems). This will be achieved through two main avenues. One is developing a common problem-description language implemented in a visual programming environment working together with a translator that produces the specific model description needed by various numerical treatments, in order to optimize user efficiency. The other is to employ a new modeling paradigm based on the idea of field propagators to expedite the development of the hybrid models that are needed to optimize computation efficiency.

Miller, E.K.; Kruger, R.P. [Los Alamos National Lab., NM (United States); Moraites, S. [Simulated Life Systems, Inc., Chambersburg, PA (United States)

1993-02-01

205

A proposed approach for developing next-generation computational electromagnetics software  

Energy Technology Data Exchange (ETDEWEB)

Computations have become a tool coequal with mathematics and measurements as a means of performing electromagnetic analysis and design. This is demonstrated by the volume of articles and meeting presentations in which computational electromagnetics (CEM) is routinely employed to address an increasing variety of problems. Yet, in spite of the substantial resources invested in CEM software over the past three decades, little real progress seems to have been made towards providing the EM engineer software tools having a functionality equivalent to that expected of hardware instrumentation. Furthermore, the bulk of CEM software now available is generally of limited applicability to large, complex problems because most modeling codes employ a single field propagator, or analytical form, of Maxwell's Equations. The acknowledged advantages of hybrid models, i.e., those which employ different propagators in differing regions of a problem, are relatively unexploited. The thrust of this discussion is to propose a new approach designed to address both problems outlined above, integrating advances being made in both software and hardware development. After briefly reviewing the evolution of modeling CEM software to date and pointing out the deficiencies thereof, we describe an approach for making CEM tools more truly 'user friendly' called EMSES (Electromagnetic Modeling and Simulation Environment for Systems). This will be achieved through two main avenues. One is developing a common problem-description language implemented in a visual programming environment working together with a translator that produces the specific model description needed by various numerical treatments, in order to optimize user efficiency. The other is to employ a new modeling paradigm based on the idea of field propagators to expedite the development of the hybrid models that are needed to optimize computation efficiency.

Miller, E.K.; Kruger, R.P. (Los Alamos National Lab., NM (United States)); Moraites, S. (Simulated Life Systems, Inc., Chambersburg, PA (United States))

1993-01-01

206

A technical note about Phidel: A new software for evaluating magnetic induction field generated by power lines  

International Nuclear Information System (INIS)

The Regional Environment Protection Agency of Friuli Venezia Giulia (ARPA FVG, Italy) has performed an analysis of existing software designed to calculate the magnetic induction field generated by power lines. As far as the agency's requirements are concerned, the tested programs display some difficulties in the immediate processing of electrical and geometrical data supplied by plant owners, and in certain cases turn out to be inadequate for representing complex configurations of power lines. Phidel, an innovative software package, tackles and works out all the above-mentioned problems. Therefore, the obtained results, when compared with those of other programs, are the closest to experimental measurements. The output data can be employed both in the GIS and Excel environments, allowing the immediate overlaying of digital cartography and the determination of the 3 and 10 µT bands, in compliance with the Italian Decree of the President of the Council of Ministers of 8 July 2003. (authors)
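
For context on what such software computes (a minimal sketch only, not the Phidel algorithm), the quasi-static magnetic induction from long straight conductors can be superposed using the standard B = µ0·I/(2π·r) result per conductor, treating the three-phase currents as phasors; the line geometry and current below are invented.

import cmath, math

MU0 = 4e-7 * math.pi   # vacuum permeability, H/m

def b_field_rms(conductors, px, py):
    """RMS magnetic induction (tesla) at point (px, py) from infinitely long
    conductors given as (x, y, rms_current_phasor), currents normal to the plane."""
    bx, by = 0j, 0j
    for cx, cy, i_phasor in conductors:
        dx, dy = px - cx, py - cy
        r2 = dx * dx + dy * dy
        # field of a long straight wire, direction perpendicular to the radius vector
        bx += MU0 * i_phasor / (2 * math.pi) * (-dy / r2)
        by += MU0 * i_phasor / (2 * math.pi) * (dx / r2)
    return math.sqrt(abs(bx) ** 2 + abs(by) ** 2)

# Hypothetical single-circuit line: three conductors at 12 m height, 500 A RMS, 120 deg apart
I = 500.0
line = [(-4.0, 12.0, I * cmath.exp(1j * 0.0)),
        ( 0.0, 12.0, I * cmath.exp(1j * 2 * math.pi / 3)),
        ( 4.0, 12.0, I * cmath.exp(-1j * 2 * math.pi / 3))]
b = b_field_rms(line, 0.0, 1.0)            # 1 m above ground, under the line
print(f"B = {b * 1e6:.2f} microtesla")     # compare with the 3 and 10 uT bands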

207

Sustainable development - a role for nuclear power? 2nd scientific forum  

International Nuclear Information System (INIS)

The 2nd Scientific Forum of the International Atomic Energy Agency (IAEA) was held during the 43rd General Conference. This paper summarizes the deliberations of the two-day Forum. The definition of 'sustainable development' of the 1987 Bruntland Commission - 'development that meets the needs of the present without compromising the ability of future generations to meet their own needs' - provided the background for the Forum's debate whether and how nuclear power could contribute to sustainable energy development. The framework for this debate comprises different perspectives on economic, energy, environmental, and political considerations. Nuclear power, along with all energy generating systems, should be judged on these considerations using a common set of criteria (e.g., emission levels, economics, public safety, wastes, and risks). First and foremost, there is a growing political concern over the possible adverse impact of increasing emissions of greenhouse gases from fossil fuel combustion. However, there is debate as to whether this would have any material impact on the predominantly economic criteria currently used to make investment decisions on energy production. According to the views expressed, the level of safety of existing nuclear power plants is no longer a major concern - a view not yet fully shared by the general public. The need to maintain the highest standards of safety in operation remains, especially under the mounting pressure of competitiveness in deregulated and liberalized energy markets. The industry must continuously reinforce a strong safety culture among reactor designers, builders, and operators. Furthermore, a convincing case for safety will have to be made for any new reactor designs. Of greater concern to the public and politicians are the issues of radioactive waste and proliferation of nuclear weapons. There is a consensus among technical experts that radioactive wastes from nuclear power can be disposed of safely and economically in deep geologic formations. However, the necessary political decisions to select sites for repositories need public support and understanding about what the industry is doing and what can be done. As to nuclear weapons proliferation, the existing safeguards system must be fully maintained and strengthened and inherently proliferation-resistant fuel cycles should be explored. Overviews of the future global energy demand and of the prospects for nuclear power in various economic regions of the world indicate that, in the case of the OECD countries, the dominant issue is economics in an increasingly free market system for electricity. For the so-called transition economies, countries of the Former Soviet Union and Central and Eastern Europe, the issue is one of managing nuclear power plant operations safely. In the case of developing countries, the dominant concern is effective management of technology, in addition to economics and finance. The prospects for nuclear power depend on the resolution of two cardinal issues. The first is economic competitiveness, and in particular, reduced capital cost. The second is public confidence in the ability of the industry to manage plant operations and its high level waste safely. There is a continuing need for dialogue and communication with all sectors of the public: economists, investors, social scientists, politicians, regulators, unions, and environmentalists. 
Of help in this dialogue would be nuclear power's relevance to and comparative advantages in addressing environmental issues, such as global climate change, local air quality, and regional acidification. Suggestions have been made for a globalized approach to critical nuclear power issues, such as waste management, innovative and proliferation-resistant reactors and fuel cycles, and international standards for new generation nuclear reactor designs.The conclusion seems to be that there is a role for nuclear energy in sustainable development, especially if greenhouse gas emissions are to be limited. Doubts persist in the minds of many energy experts over the pote

208

A Software Safety Certification Plug-in for Automated Code Generators (Executive Briefing)  

Science.gov (United States)

A viewgraph presentation describing a certification tool to check the safety of auto-generated codes is shown. The topics include: 1) Auto-generated Code at NASA; 2) Safety of Auto-generated Code; 3) Technical Approach; and 4) Project Plan.

Denney, Ewen; Schumann, Johann; Greaves, Doug

2006-01-01

209

Makahiki+WattDepot : An open source software stack for next generation energy research and education  

DEFF Research Database (Denmark)

The accelerating world-wide growth in demand for energy has led to the conceptualization of a “smart grid”, where a variety of decentralized, intermittent, renewable energy sources (for example, wind, solar, and wave) would provide most or all of the power required by small-scale “micro-grids” servicing hundreds to thousands of consumers. Such a smart grid will require consumers to transition from passive to active participation in order to optimize the efficiency and effectiveness of the grid’s electrical capabilities. This paper presents a software stack comprised of two open source software systems, Makahiki and WattDepot, which together are designed to engage consumers in energy issues through a combination of education, real-time feedback, incentives, and game mechanics. We detail the novel features of Makahiki and WattDepot, along with our initial experiences using them to implement an energy challenge called the Kukui Cup.

Johnson, Philip M.; Xu, Yongwen

2013-01-01

210

Programa computacional para geração de séries sintéticas de precipitação / Software for generation of synthetic series of precipitation  

Scientific Electronic Library Online (English)

Full Text Available SciELO Brazil | Language: Portuguese. Abstract (Portuguese, translated): A computer program was developed to apply the methodology for generating synthetic precipitation series developed by OLIVEIRA (2003). The application was implemented as a computational algorithm in the "Borland Delphi 6.0" programming environment. The required input data come from a database in the format standardized by the Agência Nacional de Águas (ANA), containing daily rainfall records from meteorological stations. From this information, the program is able to generate synthetic series of daily precipitation containing the total rainfall in millimetres, the event duration in hours, the standardized time of occurrence of the maximum instantaneous intensity, and the standardized maximum instantaneous intensity. The generated synthetic series are stored in text-format files that can later be accessed by other applications and/or spreadsheets. In addition to the files, various results are presented as graphs and tables, facilitating evaluation of the performance of the methodology. Abstract (English): A computational model was developed to generate synthetic series of rainfall using the method developed by OLIVEIRA (2003). The software was developed in the Borland Delphi 6.0 environment. The input data come from daily precipitation records in the standardized format of the National Water Agency (ANA). The software is capable of generating synthetic series of daily rainfall containing the amount and duration of the rainfall, and the standardized event time of the maximum instantaneous intensity. The generated synthetic series are stored in text-formatted files that may be accessed by other software and/or electronic spreadsheets. Graphs and tables are also presented, to allow easy evaluation of the performance of the method developed.
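
As a rough illustration of how a daily synthetic rainfall generator can work (a generic sketch under simple assumptions, not the OLIVEIRA (2003) method used by the software above), a two-state Markov chain can decide wet/dry days and an exponential distribution can draw the rainfall depth on wet days; all parameter values are invented.

import random

def synthetic_daily_rainfall(n_days, p_wet_after_dry=0.25, p_wet_after_wet=0.55,
                             mean_depth_mm=8.0, seed=42):
    """Generate a synthetic series of daily rainfall depths (mm).
    Occurrence: two-state Markov chain; depth on wet days: exponential."""
    rng = random.Random(seed)
    series, wet = [], False
    for _ in range(n_days):
        p = p_wet_after_wet if wet else p_wet_after_dry
        wet = rng.random() < p
        series.append(rng.expovariate(1.0 / mean_depth_mm) if wet else 0.0)
    return series

year = synthetic_daily_rainfall(365)
print(f"annual total: {sum(year):.0f} mm, wet days: {sum(d > 0 for d in year)}")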

Sidney S., Zanetti; Fernando F., Pruski; Michel C., Moreira; Gilberto C., Sediyama; Demetrius D., Silva.

2005-04-01

211

Research on Object-oriented Software Testing Cases of Automatic Generation  

Directory of Open Access Journals (Sweden)

Full Text Available In research on the automatic generation of test cases, different test cases drive different execution paths, and the probability of these paths being executed also differs. For paths that are easy to execute, many redundant test cases tend to be generated, while only a few test cases are generated for control paths that are hard to execute. A genetic algorithm can be used to guide the automatic generation of test cases: for the former paths it restricts the generation of such test cases, while for the latter it encourages their generation as much as possible. Therefore, building on the technology of path-oriented automatic test case generation, a genetic algorithm is adopted to construct the generation process. According to the path triggered during the dynamic execution of the program, the generated test cases are separated into different equivalence classes, and the number of test cases is adjusted dynamically by the fitness corresponding to the paths. The method can create a certain number of test cases for each execution path to ensure sufficiency, and it also reduces redundant test cases, so it is an effective method for the automatic generation of test cases.
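
To make the idea concrete (a deliberately simplified sketch, not the algorithm evaluated in the paper), a genetic search can evolve integer inputs until a hard-to-reach branch of a program under test is executed; the program, fitness function and GA parameters below are all invented.

import random

def program_under_test(x, y):
    """Toy program with a hard-to-trigger path."""
    if x * 3 == y + 100 and y > 50:
        return "rare-path"
    return "common-path"

def fitness(ind):
    """Branch-distance style fitness: smaller is better, 0 means the rare path is hit."""
    x, y = ind
    return abs(x * 3 - (y + 100)) + max(0, 51 - y)

def evolve(pop_size=60, generations=200, lo=-500, hi=500):
    rng = random.Random(1)
    pop = [(rng.randint(lo, hi), rng.randint(lo, hi)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        if fitness(pop[0]) == 0:
            return pop[0]                       # test case covering the rare path
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            (x1, y1), (x2, y2) = rng.sample(parents, 2)
            child = (x1, y2)                    # one-point crossover on the two genes
            if rng.random() < 0.3:              # mutation
                child = (child[0] + rng.randint(-10, 10), child[1] + rng.randint(-10, 10))
            children.append(child)
        pop = parents + children
    return None

best = evolve()
print(best, program_under_test(*best) if best else "path not covered")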

Junli Zhang

2013-11-01

212

Technical Background Material for the Wave Generation Software AwaSys 5  

Digital Repository Infrastructure Vision for European Research (DRIVER)

"Les Appareils Generateurs de Houle en Laboratorie" presented by Bi¶esel and Suquet in 1951 discussed and solved the analytical problems concerning a number of di®erent wave generator types. For each wave maker type the paper presented the transfer function between wave maker displacement and wave amplitude in those cases where the analytical problem could be solved. The article therefore represented a giant step in wave generation techniques and found the basis for today's wave generation ...

Frigaard, Peter; Andersen, Thomas Lykke

2010-01-01

213

Software tool for analysing the family shopping basket without candidate generation  

Directory of Open Access Journals (Sweden)

Full Text Available Tools that yield useful knowledge to support marketing decisions are currently needed in the e-commerce environment. This requires a process that uses a series of data-processing techniques; data mining is one such technique, enabling automatic information discovery. This work presents association rules as a suitable technique for discovering how customers buy from a company offering business-to-consumer (B2C) e-business, aimed at supporting decision-making in supplying its customers or capturing new ones. Many algorithms such as Apriori, DHP, Partition, FP-Growth and Eclat are available for implementing association rules; the following criteria were defined for selecting the appropriate algorithm: database inserts, computational cost, performance and execution time. The development of a software tool is also presented, which followed the CRISP-DM approach; this software tool comprises the following four sub-modules: data pre-processing, data mining, results analysis and results application. The application design used a three-layer architecture: presentation logic, business logic and service logic. Data warehouse design and algorithm design were included in developing this data-mining software tool. It was tested using a FoodMart company database; the tests covered performance, functionality and validity of results, thereby allowing association rules to be found. The results led to the conclusion that using association rules as a data-mining technique facilitates analysing volumes of information for B2C e-business services, which represents a competitive advantage for those companies using the Internet as their sales medium.
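
As a toy illustration of mining the shopping basket without global candidate generation (a much-simplified pattern-growth sketch over projected transaction lists, not the FP-tree implementation a production tool would use), frequent itemsets can be grown recursively from each frequent item; the transactions and the support threshold are invented.

from collections import defaultdict

def frequent_itemsets(transactions, min_support):
    """Recursive pattern growth: extend each frequent item using only the
    transactions that contain the current prefix (no global candidate generation)."""
    results = {}

    def grow(prefix, db):
        counts = defaultdict(int)
        for t in db:
            for item in t:
                counts[item] += 1
        for item, cnt in counts.items():
            if cnt >= min_support:
                itemset = prefix | {item}
                results[frozenset(itemset)] = cnt
                # project the database on this item, keeping only lexicographically
                # larger items so every itemset is enumerated exactly once
                projected = [frozenset(i for i in t if i > item) for t in db if item in t]
                grow(itemset, projected)

    grow(frozenset(), [frozenset(t) for t in transactions])
    return results

baskets = [{"bread", "milk"}, {"bread", "beer", "eggs"},
           {"milk", "beer", "bread"}, {"bread", "milk", "eggs"}]
for itemset, cnt in sorted(frequent_itemsets(baskets, 2).items(), key=lambda kv: -kv[1]):
    print(set(itemset), cnt)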

Roberto Carlos Naranjo Cuervo

2010-05-01

214

Characterization of the 1st and 2nd EF-hands of NADPH oxidase 5 by fluorescence, isothermal titration calorimetry, and circular dichroism  

Directory of Open Access Journals (Sweden)

Full Text Available Abstract Background Superoxide generated by non-phagocytic NADPH oxidases (NOXs) is of growing importance for physiology and pathobiology. The calcium binding domain (CaBD) of NOX5 contains four EF-hands, each binding one calcium ion. To better understand the metal binding properties of the 1st and 2nd EF-hands, we characterized the N-terminal half of CaBD (NCaBD) and its calcium-binding knockout mutants. Results The isothermal titration calorimetry measurement for NCaBD reveals that the calcium binding of the two EF-hands is loosely coupled and can be treated as independent binding events. However, the Ca2+ binding studies on NCaBD(E31Q) and NCaBD(E63Q) showed their binding constants to be 6.5 × 10⁵ and 5.0 × 10² M⁻¹ with ΔH values of -14 and -4 kJ/mol, respectively, suggesting that the intrinsic calcium binding of the 1st non-canonical EF-hand is largely enhanced by the binding of Ca2+ to the 2nd canonical EF-hand. The fluorescence quenching and CD spectra support a conformational change upon Ca2+ binding, which moves Trp residues toward a more non-polar and exposed environment and also increases the α-helix secondary structure content. All measurements exclude Mg2+ binding in NCaBD. Conclusions We demonstrated that the 1st non-canonical EF-hand of NOX5 has very weak Ca2+ binding affinity compared with the 2nd canonical EF-hand. Both EF-hands interact with each other in a cooperative manner to enhance their Ca2+ binding affinity. Our characterization reveals that the two EF-hands in the N-terminal NOX5 are Ca2+ specific.

Wei Chin-Chuan

2012-04-01

215

Conference of the Liber Groupe des Cartothecaires, Paris 29th August-2nd September 2006  

Directory of Open Access Journals (Sweden)

Full Text Available Following on from Copenhagen (2000), Helsinki (2002) and Cambridge (2004), the meeting of the "Groupe des Cartothécaires" took place in Paris from the 29th of August to the 2nd of September 2006. It was hosted by the Bibliothèque nationale de France (BnF), with the support of the Institut Géographique National (IGN), the Service Historique de la Défense (SHD), the Archives nationales and the Library of the Château de Chantilly. The Conference was organised by the Comité Français de Cartographie (CFC). A committee was set up within the CFC, which organised the scientific programme and participated in the general organisation.

Helène Richard

2007-04-01

216

Future Control and Automation : Proceedings of the 2nd International Conference on Future Control and Automation  

CERN Document Server

This volume, Future Control and Automation - Volume 1, includes the best papers selected from the 2012 2nd International Conference on Future Control and Automation (ICFCA 2012), held on July 1-2, 2012, in Changsha, China. Future control and automation is the use of control systems and information technologies to reduce the need for human work in the production of goods and services. This volume can be divided into five sessions on the basis of the classification of the manuscripts considered, listed as follows: Identification and Control, Navigation, Guidance and Sensor, Simulation Technology, Future Telecommunications and Control.

2012-01-01

217

Group field theory as the 2nd quantization of Loop Quantum Gravity  

CERN Document Server

We construct a 2nd quantized reformulation of canonical Loop Quantum Gravity at both kinematical and dynamical level, in terms of a Fock space of spin networks, and show in full generality that it leads directly to the Group Field Theory formalism. In particular, we show the correspondence between canonical LQG dynamics and GFT dynamics leading to a specific GFT model from any definition of quantum canonical dynamics of spin networks. We exemplify the correspondence of dynamics in the specific example of 3d quantum gravity. The correspondence between canonical LQG and covariant spin foam models is obtained via the GFT definition of the latter.

Oriti, Daniele

2013-01-01

218

The Development of Information Literacy Assessment for 2nd Grade Students and Their Performance  

Digital Repository Infrastructure Vision for European Research (DRIVER)

The main purpose of this study was to develop an Information Literacy Assessment for 2nd-grade students and evaluate their performance. The assessment included a regular test and a portfolio assessment. There were 30 multiple-choice items and 3 constructed-response items in the test, while the portfolio assessment was based on the Super3 model. This study was conducted in an elementary school located in southern Taiwan. One hundred and forty-two second graders took the test, and only one clas...

Lin Ching Chen; Yu-Pin Chen

2013-01-01

219

2nd irradiation test and PIEs for developing neutron absorbing and burnable poison materials  

International Nuclear Information System (INIS)

DyxTiyOz and GdxTiyOz have been developed as neutron absorbing and burnable poison materials since these lanthanoid oxides have been considered as good irradiation resistant materials. DyxTiyOz has been used as a control rod material whereas GdxTiyOz is considered as a burnable poison material. The feasibility study of the 1st irradiation and PIE results was reported in. The present paper describes the 2nd irradiation test results including a safety analysis and some preliminary PIE results

220

PREFACE: 2nd International Meeting for Researchers in Materials and Plasma Technology  

Science.gov (United States)

These proceedings present the written contributions of the participants of the 2nd International Meeting for Researchers in Materials and Plasma Technology (2nd IMRMPT), which was held from February 27 to March 2, 2013 at the Pontificia Bolivariana (UPB) and Industrial de Santander (UIS) universities, Bucaramanga, Colombia, and was organized by the research groups GINTEP-UPB and FITEK-UIS. The IMRMPT was the second of the biennial meetings that began in 2011. The three-day scientific program of the 2nd IMRMPT consisted of 14 keynote lectures, 42 oral presentations and 48 poster presentations, with the participation of undergraduate and graduate students, professors, researchers and entrepreneurs from Colombia, Russia, France, Venezuela, Brazil, Uruguay, Argentina, Peru, Mexico and the United States, among others. The objectives of the IMRMPT were to bring together national and international researchers in order to establish scientific cooperation in the field of materials science and plasma technology; to introduce new surface treatment techniques that improve the properties of metals with respect to deterioration due to corrosion, hydrogen embrittlement, abrasion and hardness, among others; and to establish cooperation agreements between universities and industry. The topics covered in the 2nd IMRMPT include New Materials, Surface Physics, Laser and Hybrid Processes, Characterization of Materials, Thin Films and Nanomaterials, Surface Hardening Processes, Wear and Corrosion/Oxidation, Modeling, Simulation and Diagnostics, Plasma Applications and Technologies, Biomedical Coatings and Surface Treatments, Non-Destructive Evaluation and Online Process Control, and Surface Modification (Ion Implantation, Ion Nitriding, PVD, CVD). The editors hope that those interested in the area of materials science and plasma technology enjoy the contributions, which reflect a wide range of topics. It is a pleasure to thank the sponsors and all the participants and contributors for making this international meeting of researchers possible. It should be noted that the event, organized by the UIS and UPB universities through their research groups FITEK and GINTEP, was a very significant contribution to the national and international scientific community, achieving interaction between different research groups from academia and the business sector. On behalf of the research groups GINTEP-UPB and FITEK-UIS, we greatly appreciate the support provided by the sponsors, who allowed us to continue with the dream of research. Ely Dannier V-Niño, The Editor. The PDF file also contains a list of committees and sponsors.

Niño, Ely Dannier V.

2013-11-01

 
 
 
 
221

2nd workshop on Wendelstein VII-X, Schloss Ringberg, Bavaria, 13-16 June 1988  

International Nuclear Information System (INIS)

This IPP-Report is based on the 'Summary of the Workshop' by H. Wobig, and contains a number of figures and tables from contributed papers with some short descriptive remarks. About 40 papers were presented at the 2nd Workshop on Wendelstein VII-X. The programme of the workshop is given in appendix 1. There were nearly 50 participants as listed in appendix 2, several of them on a part-time basis. Appendix 3 gives the correspondence for the numbers of figures and tables to those contained in the contributions to the workshop. (orig.)

222

Future Control and Automation : Proceedings of the 2nd International Conference on Future Control and Automation  

CERN Document Server

This volume, Future Control and Automation - Volume 2, includes the best papers from the 2012 2nd International Conference on Future Control and Automation (ICFCA 2012), held on July 1-2, 2012, in Changsha, China. Future control and automation is the use of control systems and information technologies to reduce the need for human work in the production of goods and services. This volume can be divided into six sessions on the basis of the classification of the manuscripts considered, listed as follows: Mathematical Modeling, Analysis and Computation, Control Engineering, Reliable Networks Design, Vehicular Communications and Networking, Automation and Mechatronics.

2012-01-01

223

Radcalc: Transportation packaging software to determine hydrogen generation and transportation classification  

Energy Technology Data Exchange (ETDEWEB)

Ionizing radiation present in high-level and low-level radioactive wastes may cause the radiolytic formation of hydrogen gas. The safe transportation of radioactive wastes, therefore, requires a reliable method to determine the quantity of hydrogen in sealed packages. However, direct measurement of hydrogen buildup in packages is not always possible and can be time consuming and costly. The U.S. Nuclear Regulatory Commission recognized these difficulties and accepted the G value method for calculating hydrogen gas generated in containers. Radcalc for Windows is a computer program that uses the G value method to calculate hydrogen gas generation in packages. The program calculates the hydrogen generation rate, hydrogen gas volume, related pressure, and decay heat generation rate in a package. It also incorporates a robust decay algorithm that calculates the activity of a radionuclide and its associated products over a specified period of time.
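
To show the flavour of the G value calculation mentioned above (a simplified, hypothetical sketch, not the Radcalc implementation, which also handles daughter products, absorbed-energy fractions and package geometry), the hydrogen generation rate follows directly from the decay heat absorbed by the waste matrix and the G(H2) value in molecules per 100 eV.

AVOGADRO = 6.022e23    # molecules per mole
EV_TO_J = 1.602e-19    # joules per electronvolt
R = 8.314              # J/(mol K), ideal gas constant

def hydrogen_generation(decay_heat_w, g_h2, absorbed_fraction=1.0):
    """Hydrogen generation rate in mol/s from the G value method.
    g_h2 is in molecules of H2 per 100 eV of absorbed energy."""
    ev_per_s = decay_heat_w * absorbed_fraction / EV_TO_J
    molecules_per_s = ev_per_s * g_h2 / 100.0
    return molecules_per_s / AVOGADRO

# Hypothetical drum: 0.5 W decay heat, G(H2) = 1.6 molecules/100 eV, fully absorbed
rate = hydrogen_generation(0.5, 1.6)
one_year = 3.156e7                                  # seconds
moles = rate * one_year
litres_stp = moles * R * 273.15 / 101325 * 1000     # ideal-gas volume at 0 C, 1 atm
print(f"{rate:.2e} mol/s  ->  {moles:.2f} mol  ~ {litres_stp:.0f} L H2 per year")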

Green, J.R.; Hillesland, K.E.; Roetman, V.E.; Field, J.G. [Westinghouse Hanford Co., Richland, WA (United States)

1996-12-31

224

Radcalc: Transportation packaging software to determine hydrogen generation and transportation classification  

International Nuclear Information System (INIS)

Ionizing radiation present in high-level and low-level radioactive wastes may cause the radiolytic formation of hydrogen gas. The safe transportation of radioactive wastes, therefore, requires a reliable method to determine the quantity of hydrogen in sealed packages. However, direct measurement of hydrogen buildup in packages is not always possible and can be time consuming and costly. The U.S. Nuclear Regulatory Commission recognized these difficulties and accepted the G value method for calculating hydrogen gas generated in containers. Radcalc for Windows is a computer program that uses the G value method to calculate hydrogen gas generation in packages. The program calculates the hydrogen generation rate, hydrogen gas volume, related pressure, and decay heat generation rate in a package. It also incorporates a robust decay algorithm that calculates the activity of a radionuclide and its associated products over a specified period of time

225

Software tools for automatic generation of finite element mesh and application of biomechanical calculation in medicine  

Directory of Open Access Journals (Sweden)

Full Text Available Cardiovascular diseases are common, and a particular difficulty in curing them is diagnostics. Modern medical instruments can provide data that are much more adequate for computer modeling. Computer simulations of blood flow through the cardiovascular organs give powerful advantages to scientists today. The motivation for this work is raw data that our Center recently received from a multislice CT scanner at the University Clinical Center in Heidelberg. In this work the raw data from the CT scanner were used to create a 3D model of the aorta. In this process we used Gmsh and TetGen (Hang Si), as well as our own software tools, and the result was the 8-node (brick) mesh on which the calculation was run. The results obtained were very satisfactory so...
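
As a small illustration of what an 8-node (hexahedral "brick") mesh is at the data-structure level (a generic structured-grid sketch for a box, not the patient-specific meshing pipeline described above), nodes and element connectivity can be generated as follows.

def brick_mesh(nx, ny, nz, dx=1.0, dy=1.0, dz=1.0):
    """Structured mesh of nx*ny*nz 8-node brick (hexahedral) elements in a box.
    Returns (nodes, elements); each element lists its 8 node indices."""
    def nid(i, j, k):                      # global node index from grid indices
        return i + j * (nx + 1) + k * (nx + 1) * (ny + 1)

    nodes = [None] * ((nx + 1) * (ny + 1) * (nz + 1))
    for k in range(nz + 1):
        for j in range(ny + 1):
            for i in range(nx + 1):
                nodes[nid(i, j, k)] = (i * dx, j * dy, k * dz)

    elements = []
    for k in range(nz):
        for j in range(ny):
            for i in range(nx):
                elements.append([
                    nid(i, j, k), nid(i + 1, j, k), nid(i + 1, j + 1, k), nid(i, j + 1, k),
                    nid(i, j, k + 1), nid(i + 1, j, k + 1), nid(i + 1, j + 1, k + 1), nid(i, j + 1, k + 1)])
    return nodes, elements

nodes, elems = brick_mesh(4, 3, 2)
print(len(nodes), "nodes,", len(elems), "brick elements")   # 60 nodes, 24 elements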

Milašinović Danko Z.

2008-01-01

226

Proceedings of the 2nd technical meeting on high temperature gas-cooled reactors  

International Nuclear Information System (INIS)

From the point of view of establishing and upgrading the technology basis of HTGRs, the 2nd Technical Meeting on High Temperature Gas-cooled Reactors (HTGRs) was held on March 11 and 12, 1992, at the Tokai Research Establishment in order to review the present status and results of research and development (R and D) on HTGRs, to discuss the R and D items which should be promoted more actively in the future, and thereby to help determine the strategy for the development of high temperature engineering and examination in JAERI. At the 2nd Technical Meeting, which followed the 1st Technical Meeting held in February 1990 at the Tokai Research Establishment, expectations for the High Temperature Engineering Test Reactor (HTTR), possible contributions of HTGRs to the preservation of the global environment, and the prospects of HTGRs were especially discussed, focusing on the R and D on safety, high temperature components and process heat utilization, by experts from JAERI as well as universities, national institutes, industry and so on. This proceedings summarizes the papers presented in the oral sessions and the materials exhibited in the poster session at the meeting, and will be valuable as key material for promoting the R and D on HTGRs from now on. (author)

227

Proceedings of the 2nd JAERI symposium on HTGR technologies October 21 ? 23, 1992, Oarai, Japan  

International Nuclear Information System (INIS)

The Japan Atomic Energy Research Institute (JAERI) held the 2nd JAERI Symposium on HTGR Technologies on October 21 to 23, 1992, at the Oarai Park Hotel in Oarai-machi, Ibaraki-ken, Japan, with the support of the International Atomic Energy Agency (IAEA), the Science and Technology Agency of Japan and the Atomic Energy Society of Japan, on the occasion that the construction of the High Temperature Engineering Test Reactor (HTTR), the first high temperature gas-cooled reactor (HTGR) in Japan, is proceeding smoothly. In this symposium, the worldwide present status of research and development (R and D) on HTGRs and the future perspectives of HTGR development were discussed in 47 papers including 3 invited lectures, focusing on the present status of HTGR projects and perspectives of HTGR development, safety, operation experience, fuel and heat utilization. A panel discussion was also organized on how HTGRs can contribute to the preservation of the global environment. About 280 participants attended the symposium from Japan, Bangladesh, Germany, France, Indonesia, the People's Republic of China, Poland, Russia, Switzerland, the United Kingdom, the United States of America, Venezuela and the IAEA. This paper was edited as the proceedings of the 2nd JAERI Symposium on HTGR Technologies, collecting the 47 papers presented in the oral and poster sessions along with 11 panel exhibitions on the results of research and development associated with the HTTR. (author)

228

A Customer Value Creation Framework for Businesses That Generate Revenue with Open Source Software  

Directory of Open Access Journals (Sweden)

Full Text Available Technology entrepreneurs must create value for customers in order to generate revenue. This article examines the dimensions of customer value creation and provides a framework to help entrepreneurs, managers, and leaders of open source projects create value, with an emphasis on businesses that generate revenue from open source assets. The proposed framework focuses on a firm's pre-emptive value offering (also known as a customer value proposition). This is a firm's offering of the value it seeks to create for a customer, in order to meet his or her requirements.

Aparna Shanker

2012-03-01

229

Efficient FPGA implementation of 2nd order digital controllers using Matlab/Simulink  

Directory of Open Access Journals (Sweden)

Full Text Available This paper explains a method for the design and implementation of a digital controller based on a Field Programmable Gate Array (FPGA) device. It is more compact and power efficient and provides high speed capabilities compared to software-based PID controllers. The proposed method is based on implementing the digital controller as a digital filter using DSP architectures. The PID controller is designed using MATLAB and Simulink to generate a set of coefficients associated with the desired controller characteristics. The controller coefficients are then included in the VHDL that implements the PID controller on the FPGA. A MATLAB program is used to design the PID controller and to calculate and plot the time response of the control system. The synthesis report summarises the resource utilization of the selected FPGA.
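
The "2nd order digital controller" in the title refers to the fact that a discrete PID can be written as a second-order difference equation with three coefficients, which maps naturally onto a DSP-style filter in FPGA fabric. A minimal reference model (a generic sketch, not the paper's VHDL) is:

class DiscretePID:
    """Velocity-form PID: u[n] = u[n-1] + k0*e[n] + k1*e[n-1] + k2*e[n-2]."""
    def __init__(self, kp, ki, kd, ts):
        self.k0 = kp + ki * ts + kd / ts
        self.k1 = -kp - 2.0 * kd / ts
        self.k2 = kd / ts
        self.e1 = self.e2 = self.u = 0.0

    def update(self, error):
        self.u += self.k0 * error + self.k1 * self.e1 + self.k2 * self.e2
        self.e2, self.e1 = self.e1, error
        return self.u

# Hypothetical gains and a 1 ms sample time; response of the controller output to a constant error
pid = DiscretePID(kp=2.0, ki=5.0, kd=0.01, ts=1e-3)
for n in range(5):
    print(f"u[{n}] = {pid.update(1.0):.4f}")

In a fixed-point FPGA implementation these three coefficients would be quantized and the multiply-accumulate operations mapped onto the device's DSP slices.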

Vikas gupta

2011-08-01

230

Net Generation at Social Software: Challenging Assumptions, Clarifying Relationships and Raising Implications for Learning  

Science.gov (United States)

This paper takes as its starting point assumptions about the use of information and communication technology (ICT) by people born after 1983, the so-called net generation. The focus of the paper is on social networking. A questionnaire survey was carried out with 1070 students from schools in Eastern Finland. Data are presented on students' ICT-skills…

Valtonen, Teemu; Dillon, Patrick; Hacklin, Stina; Vaisanen, Pertti

2010-01-01

231

Massive coordination of dispersed generation using PowerMatcher based software agents  

International Nuclear Information System (INIS)

One of the outcomes of the EU Fifth Framework CRISP project (http://crisp.ecn.nl/) has been the development of a real-time control strategy based on the application of distributed intelligence (ICT) to coordinate demand and supply in electricity grids. This PowerMatcher approach has been validated in two real-life and real-time field tests. The experiments aimed at the controlled coordination of dispersed electricity suppliers (DG-RES) and demanders in distribution grids enabled by ICT networks. Optimization objectives for the technology in the tests were minimization of imbalance in a commercial portfolio and mitigation of strong load variations in a distribution network with residential micro-CHPs. With respect to the number of ICT nodes, the field tests were on a relatively small scale. However, application of the technology has yielded some very encouraging results on both occasions. In the present paper, lessons learned from the field experiments are discussed. Furthermore, it contains an account of the roadmap for scaling up these field tests with a larger number of nodes and with more diverse appliance/installation types. Due to its autonomous decision-making agent paradigm, the PowerMatcher software technology is expected to be far more scalable than central coordination approaches. Indeed, it is based on microeconomic theory and is expected to work best if it is applied on a massive scale in transparent market settings. A set of various types of supply and demand appliances was defined and implemented in a PowerMatcher software simulation environment, and a massive number of these PowerMatcher node agents, each representing such a device type, was used in a number of scenario calculations. As the production of DG-RES resources and the demand profiles are strongly dependent on the time of year, climate scenarios leading to operational snapshots of the cluster were taken for a number of representative periods. The results of these larger-scale simulations, as well as the scalability issues encountered, are discussed. Further issues covered are the stability of the system as reflected by the internal price development pattern that acts as an 'invisible hand' to reach the common optimisation goal. Finally, the effects of scaling up the technology are discussed in terms of possible 'emergent behaviour' of subsets in the cluster and the primary process quality of appliances operating concertedly using the PowerMatcher.
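
To give a feel for the market-based coordination concept (a generic toy sketch of price-based equilibrium, not the actual PowerMatcher protocol or its bid format), each device agent can express its demand as a function of price and an auctioneer can search for the price at which aggregate demand and supply balance; all devices and numbers below are invented.

def clearing_price(bid_functions, p_min=0.0, p_max=1.0, iters=40):
    """Bisection for the price at which total net demand (demand minus supply)
    of all agent bid functions is approximately zero."""
    def net_demand(price):
        return sum(bid(price) for bid in bid_functions)
    lo, hi = p_min, p_max
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if net_demand(mid) > 0:    # more demand than supply -> raise the price
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Hypothetical device agents: kW demanded (+) or supplied (-) as a function of price
heat_pump   = lambda p: 3.0 if p < 0.30 else 1.0    # sheds load when the price is high
freezer     = lambda p: 0.5 if p < 0.80 else 0.0    # flexible only at extreme prices
micro_chp   = lambda p: -1.0 if p > 0.20 else 0.0   # generates when the price is attractive
pv_inverter = lambda p: -2.2                         # must-run renewable infeed

price = clearing_price([heat_pump, freezer, micro_chp, pv_inverter])
print(f"clearing price ~ {price:.2f}")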

232

Radcalc for windows benchmark study: A comparison of software results with Rocky Flats hydrogen gas generation data  

Energy Technology Data Exchange (ETDEWEB)

Radcalc for Windows Version 2.01 is a user-friendly software program developed by Waste Management Federal Services, Inc., Northwest Operations for the U.S. Department of Energy (McFadden et al. 1998). It is used for transportation and packaging applications in the shipment of radioactive waste materials. Among its applications are the classification of waste per the U.S. Department of Transportation regulations, the calculation of decay heat and daughter products, and the calculation of the radiolytic production of hydrogen gas. The Radcalc program has been extensively tested and validated (Green et al. 1995, McFadden et al. 1998) by comparison of each Radcalc algorithm to hand calculations. An opportunity to benchmark Radcalc hydrogen gas generation calculations against experimental data arose when the Rocky Flats Environmental Technology Site (RFETS) Residue Stabilization Program collected hydrogen gas generation data to determine compliance with requirements for shipment of waste in the TRUPACT-II (Schierloh 1998). The residue/waste drums tested at RFETS contain contaminated, solid, inorganic materials in polyethylene bags. The contamination is predominantly due to plutonium and americium isotopes. The information provided by Schierloh (1998) of RFETS includes decay heat, hydrogen gas generation rates, calculated G{sub eff} values, and waste material type, making the experimental data ideal for benchmarking Radcalc. The following sections discuss the RFETS data and the Radcalc cases modeled with the data. Results are tabulated and also provided graphically.

MCFADDEN, J.G.

1999-07-19

233

Radcalc for windows benchmark study: A comparison of software results with Rocky Flats hydrogen gas generation data  

International Nuclear Information System (INIS)

Radcalc for Windows Version 2.01 is a user-friendly software program developed by Waste Management Federal Services, Inc., Northwest Operations for the U.S. Department of Energy (McFadden et al. 1998). It is used for transportation and packaging applications in the shipment of radioactive waste materials. Among its applications are the classification of waste per the U.S. Department of Transportation regulations, the calculation of decay heat and daughter products, and the calculation of the radiolytic production of hydrogen gas. The Radcalc program has been extensively tested and validated (Green et al. 1995, McFadden et al. 1998) by comparison of each Radcalc algorithm to hand calculations. An opportunity to benchmark Radcalc hydrogen gas generation calculations against experimental data arose when the Rocky Flats Environmental Technology Site (RFETS) Residue Stabilization Program collected hydrogen gas generation data to determine compliance with requirements for shipment of waste in the TRUPACT-II (Schierloh 1998). The residue/waste drums tested at RFETS contain contaminated, solid, inorganic materials in polyethylene bags. The contamination is predominantly due to plutonium and americium isotopes. The information provided by Schierloh (1998) of RFETS includes decay heat, hydrogen gas generation rates, calculated Geff values, and waste material type, making the experimental data ideal for benchmarking Radcalc. The following sections discuss the RFETS data and the Radcalc cases modeled with the data. Results are tabulated and also provided graphically.

234

Radcalc for windows benchmark study: A comparison of software results with Rocky Flats hydrogen gas generation data  

International Nuclear Information System (INIS)

Radcalc for Windows Version 2.01 is a user-friendly software program developed by Waste Management Federal Services, Inc., Northwest Operations for the U.S. Department of Energy (McFadden et al. 1998). It is used for transportation and packaging applications in the shipment of radioactive waste materials. Among its applications are the classification of waste per the U.S. Department of Transportation regulations, the calculation of decay heat and daughter products, and the calculation of the radiolytic production of hydrogen gas. The Radcalc program has been extensively tested and validated (Green et al. 1995, McFadden et al. 1998) by comparison of each Radcalc algorithm to hand calculations. An opportunity to benchmark Radcalc hydrogen gas generation calculations to experimental data arose when the Rocky Flats Environmental Technology Site (RFETS) Residue Stabilization Program collected hydrogen gas generation data to determine compliance with requirements for shipment of waste in the TRUPACT-II (Schierloh 1998). The residue/waste drums tested at RFETS contain contaminated, solid, inorganic materials in polyethylene bags. The contamination is predominantly due to plutonium and americium isotopes. The information provided by Schierloh (1998) of RFETS includes decay heat, hydrogen gas generation rates, calculated G(sub eff) values, and waste material type, making the experimental data ideal for benchmarking Radcalc. The following sections discuss the RFETS data and the Radcalc cases modeled with the data. Results are tabulated and also provided graphically.

235

Development of new generation software tools for simulation of electron beam formation in novel high power gyrotrons  

International Nuclear Information System (INIS)

Computer aided design (CAD) based on numerical experiments performed by using adequate physical models and efficient simulation codes is an indispensable tool for development, investigation, and optimization of gyrotrons used as radiation sources for electron cyclotron resonance heating (ECRH) of fusion plasmas. In this paper, we review briefly the state-of-the-art in the field of modelling and simulation of intense, relativistic, helical electron beams formed in the electron-optical systems (EOS) of powerful gyrotrons. We discuss both the limitations of the known computer codes and the requirements for increasing their capabilities for solution of various design problems that are being envisaged in the development of the next generation gyrotrons for ECRH. Moreover, we present the concept followed by us in an attempt to unite the advantages of the modern programming techniques with self-consistent, first-principles 3D physical models in the creation of a new highly efficient and versatile software package for simulation of powerful gyrotrons

236

Development of new generation software tools for simulation of electron beam formation in novel high power gyrotrons  

Energy Technology Data Exchange (ETDEWEB)

Computer aided design (CAD) based on numerical experiments performed by using adequate physical models and efficient simulation codes is an indispensable tool for development, investigation, and optimization of gyrotrons used as radiation sources for electron cyclotron resonance heating (ECRH) of fusion plasmas. In this paper, we review briefly the state-of-the-art in the field of modelling and simulation of intense, relativistic, helical electron beams formed in the electron-optical systems (EOS) of powerful gyrotrons. We discuss both the limitations of the known computer codes and the requirements for increasing their capabilities for solution of various design problems that are being envisaged in the development of the next generation gyrotrons for ECRH. Moreover, we present the concept followed by us in an attempt to unite the advantages of the modern programming techniques with self-consistent, first-principles 3D physical models in the creation of a new highly efficient and versatile software package for simulation of powerful gyrotrons.

Sabchevski, S [Institute of Electronics, Bulgarian Academy of Sciences, BG-1784 Sofia (Bulgaria); Zhelyazkov, I [Faculty of Physics, Sofia University, BG-1164 Sofia (Bulgaria); Benova, E [Faculty of Physics, Sofia University, BG-1164 Sofia (Bulgaria); Atanassov, V [Institute of Electronics, Bulgarian Academy of Sciences, BG-1784 Sofia (Bulgaria); Dankov, P [Faculty of Physics, Sofia University, BG-1164 Sofia (Bulgaria); Thumm, M [Forschungszentrum Karlsruhe, Association EURATOM-FZK, Institute for Pulsed Power and Microwave Technology, D-76021 Karlsruhe (Germany); Dammertz, G [University of Karlsruhe, Institute of High Frequency Techniques and Electronics, D-76128 Karlsruhe (Germany); Piosczyk, B [Forschungszentrum Karlsruhe, Association EURATOM-FZK, Institute for Pulsed Power and Microwave Technology, D-76021 Karlsruhe (Germany); Illy, S [Forschungszentrum Karlsruhe, Association EURATOM-FZK, Institute for Pulsed Power and Microwave Technology, D-76021 Karlsruhe (Germany); Tran, M Q [Centre de Recherches en Physique des Plasmas, Association EURATOM-CRPP, Ecole Polytechnique Federale de Lausanne, CH-1015 Lausanne (Switzerland); Alberti, S [Centre de Recherches en Physique des Plasmas, Association EURATOM-CRPP, Ecole Polytechnique Federale de Lausanne, CH-1015 Lausanne (Switzerland); Hogge, J-Ph [Centre de Recherches en Physique des Plasmas, Association EURATOM-CRPP, Ecole Polytechnique Federale de Lausanne, CH-1015 Lausanne (Switzerland)

2006-07-15

237

Multigrid preconditioning of the generator two-phase mixture balance equations in the Genepi software  

International Nuclear Information System (INIS)

In the framework of two-phase fluid simulations of the steam generators of pressurized water nuclear reactors, we present in this paper a geometric version of a pseudo-Full MultiGrid (pseudo-FMG) Full Approximation Storage (FAS) preconditioning of the balance equations in the GENEPI code. In our application, the 3D steady-state flow is reached by a transient computation using a semi-implicit fractional step algorithm for the averaged two-phase mixture balance equations (mass, momentum and energy for the secondary flow). Our application, running on workstation clusters, is based on a CEA code-linker and the PVM package. The difficulties of applying the geometric FAS multigrid method to the momentum and mass balance equations are addressed. The use of a sequential pseudo-FMG FAS two-grid method for both the energy and the mass/momentum balance equations, using dynamic multigrid cycles, leads to perceptible improvements in the convergence of the computations. An original parallel red-black pseudo-FMG FAS three-grid algorithm is presented as well. The numerical tests (steam generator mockup simulations) underline the sizable increase in the speed of convergence of the computations, especially for those involving a large number of degrees of freedom (about 100 thousand cells). The two-phase mixture balance equation residuals are quickly reduced: the speed-up reached stands between 2 and 3, depending on the number of grids. The effects of the numerical parameters on the convergence behavior are investigated
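The FAS idea referred to above transfers the full solution, not just the error, to the coarse grid, which is what allows multigrid to act directly on nonlinear balance equations. The sketch below shows the structure of a generic two-grid FAS cycle on a deliberately simple 1D model problem (-u'' + u^3 = f) with nonlinear Gauss-Seidel smoothing; it illustrates the algorithm only and has nothing of the Genepi discretization, parallelization or mixture equations in it, and all parameters are illustrative.

import numpy as np

def A(u, h):
    """Nonlinear operator A(u) = -u'' + u^3 on a uniform grid, zero Dirichlet BCs."""
    up = np.pad(u, 1)                              # add the boundary zeros
    return (2*u - up[:-2] - up[2:]) / h**2 + u**3

def smooth(u, f, h, sweeps):
    """Nonlinear Gauss-Seidel: one scalar Newton step per point, lexicographic order."""
    for _ in range(sweeps):
        for i in range(u.size):
            left  = u[i-1] if i > 0 else 0.0
            right = u[i+1] if i < u.size-1 else 0.0
            g  = (2*u[i] - left - right) / h**2 + u[i]**3 - f[i]
            dg = 2.0 / h**2 + 3.0*u[i]**2
            u[i] -= g / dg
    return u

def restrict(v):
    """Full-weighting restriction (fine grid has 2*coarse + 1 interior points)."""
    return 0.25*v[:-2:2] + 0.5*v[1::2] + 0.25*v[2::2]

def prolong(vc, n_fine):
    """Linear interpolation from coarse to fine interior points."""
    vf = np.zeros(n_fine)
    vf[1::2] = vc
    vcp = np.pad(vc, 1)
    vf[0::2] = 0.5*(vcp[:-1] + vcp[1:])
    return vf

def fas_two_grid(u, f, h, nu1=3, nu2=3, coarse_sweeps=60):
    """One FAS two-grid cycle for A(u) = f."""
    u = smooth(u, f, h, nu1)                       # pre-smoothing
    r = f - A(u, h)                                # fine-grid residual
    uc, rc = restrict(u), restrict(r)
    fc = A(uc, 2*h) + rc                           # FAS coarse right-hand side
    vc = smooth(uc.copy(), fc, 2*h, coarse_sweeps) # approximate coarse solve
    u += prolong(vc - uc, u.size)                  # coarse-grid correction
    return smooth(u, f, h, nu2)                    # post-smoothing

if __name__ == "__main__":
    n = 63                                         # fine interior points (2*31 + 1)
    h = 1.0 / (n + 1)
    x = np.linspace(h, 1.0 - h, n)
    f = np.sin(np.pi * x)                          # arbitrary right-hand side
    u = np.zeros(n)
    for cycle in range(8):
        u = fas_two_grid(u, f, h)
        print(f"cycle {cycle+1}: residual = {np.linalg.norm(f - A(u, h)):.3e}")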

238

A Facilitated Interface to Generate a Combined Textual and Graphical Database System Using Widely Available Software  

Directory of Open Access Journals (Sweden)

Full Text Available Data-Base Management System (DBMS) is the current standard for storing information. A DBMS organizes and maintains a structure of storage of data. Databases make it possible to store vast amounts of randomly created information and then retrieve items using associative reasoning in search routines. However, design of databases is cumbersome. If one is to use a database primarily to directly input information, each field must be predefined manually, and the fields must be organized to permit coherent data input. This static requirement is problematic and requires that database table(s) be predefined and customized at the outset, a difficult proposition since current DBMS lack a user-friendly front end to allow flexible design of the input model. Furthermore, databases are primarily text based, making it difficult to process graphical data. We have developed a general and nonproprietary approach to the problem of input modeling designed to make use of the known informational architecture to map data to a database and then retrieve the original document in freely editable form. We create form templates using ordinary word processing software: Microsoft InfoPath 2007. Each field in the form is given a unique name identifier in order to be distinguished in the database. It is possible to export text-based documents created initially in Microsoft Word by placing a colon at the beginning of any desired field location. InfoPath then captures the preceding string and uses it as the label for the field. Each form can be structured in a way to include any combination of both textual and graphical fields. We input data into InfoPath templates. We then submit the data through a web service to populate fields in an SQL database. By appropriate indexing, we can then recall the entire document from the SQL database for editing, with a corresponding audit trail. Graphical data is handled no differently than textual data and is embedded in the database itself, permitting direct query approaches. This technique makes it possible for general users to benefit from a combined text-graphical database environment with a flexible non-proprietary interface. Consequently, any template can be effortlessly transformed to a database system and easily recovered in a narrative form.
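The essential mapping described above (every named form field becomes a row in a relational table, graphical content is stored as a binary object alongside the text, and the whole document can be pulled back out for editing) can be sketched with any SQL backend. The snippet below uses Python's built-in sqlite3 module purely as a stand-in for the SQL database and web service in the article; the table layout and field names are hypothetical, not taken from the authors' system.

import sqlite3

# One row per submitted field: document id, field name, text value and/or binary (image) value.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE form_fields (
                   doc_id     TEXT,
                   field_name TEXT,
                   text_value TEXT,
                   blob_value BLOB)""")

def submit_document(doc_id, fields):
    """Store a dict of {field_name: value}; bytes values are treated as graphical content."""
    for name, value in fields.items():
        if isinstance(value, (bytes, bytearray)):
            conn.execute("INSERT INTO form_fields VALUES (?, ?, NULL, ?)", (doc_id, name, value))
        else:
            conn.execute("INSERT INTO form_fields VALUES (?, ?, ?, NULL)", (doc_id, name, str(value)))
    conn.commit()

def recall_document(doc_id):
    """Rebuild the document as a dict so it can be re-edited in narrative form."""
    rows = conn.execute("SELECT field_name, text_value, blob_value FROM form_fields "
                        "WHERE doc_id = ?", (doc_id,))
    return {name: text if blob is None else blob for name, text, blob in rows}

if __name__ == "__main__":
    submit_document("report-001", {"Patient name": "J. Doe",
                                   "Findings": "No abnormality detected",
                                   "Sketch": b"\x89PNG...fake image bytes"})
    print(recall_document("report-001"))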

Corey Lawson

2012-10-01

239

Summary of the 2nd workshop on ion beam-applied biology  

International Nuclear Information System (INIS)

Induction of novel plant resources by ion beam irradiation has been investigated in JAERI. To share knowledge of the present status of the field and to identify future plans, the 1st Workshop on ion beam-applied biology was held last year under the title ''Development of breeding technique for ion beams''. To further improve research cooperation and to exchange useful information in the field, researchers inside JAERI and researchers from outside, such as those from agricultural experiment stations, companies, and universities, met at the 2nd workshop on ion beam-applied biology, titled ''Future development of breeding technique for ion beams''. People from RIKEN, the Institute of Radiation Breeding, the Wakasa Wan Energy Research Center, and the National Institute of Radiological Science also participated in this workshop. Twelve of the presented papers are indexed individually. (J.P.N.)

240

Simultaneous vibrant soundbridge implantation and 2nd stage auricular reconstruction for microtia with aural atresia  

Directory of Open Access Journals (Sweden)

Full Text Available Aural atresia and severe microtia are associated malformations that result in problems with hearing and cosmesis, associated speech and language difficulties and diminished self-esteem. In cases where middle ear ossiculoplasty and aural atresia canalplasty are expected to give poor hearing outcomes that would eventually require the use of hearing aids, bone anchored hearing aids or active middle ear implants may be better options. This case report describes a simultaneous Vibrant Soundbridge implantation and 2nd stage auricular reconstruction with rib graft cartilage for an 11-year-old boy with grade III microtia and aural atresia 8 months after the 1st stage reconstruction. Audiometric results of the Vibrant Soundbridge aided ear were comparable to that of the contralateral hearing aid aided ear.

Jocelynne del Prado

2011-07-01

 
 
 
 
241

Glass fiber laser at 1.36 μm from SiO2:Nd  

Energy Technology Data Exchange (ETDEWEB)

By adding 14 mol % P2O5 to the core of a SiO2:Nd fiber, laser emission was obtained at 1.36 μm. From the fluorescence spectra and laser thresholds for the 4F3/2 → 4I11/2 and 4F3/2 → 4I13/2 transitions, the net gain at 1.36 μm is 0.024 dB/mW, and the ratio of excited-state absorption (the 4F3/2 → 4G7/2 transition) to stimulated emission is estimated to be 0.78.
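The practical meaning of the quoted excited-state-absorption (ESA) ratio can be made explicit. If the ESA cross-section from the upper laser level is taken as 0.78 of the stimulated-emission cross-section, then in a simplified single-pass picture (neglecting ground-state absorption and mode-overlap details, which the paper treats more carefully) the net gain per excited ion scales as

g_{\mathrm{net}} \propto N_{2}\left(\sigma_{\mathrm{e}}-\sigma_{\mathrm{ESA}}\right)=N_{2}\,\sigma_{\mathrm{e}}\,(1-0.78)=0.22\,N_{2}\,\sigma_{\mathrm{e}},

so only roughly a fifth of the intrinsic emission cross-section contributes to gain at 1.36 μm, which is why the measured net gain coefficient is modest.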

Hakimi, F.; Po, H.; Tumminelli, R.; McCollum, B.C.; Zenteno, L.; Cho, N.M.; Snitzer, E. (Polaroid Corporation, 38 Henry Street, Cambridge, Massachusetts 02139 (US))

1989-10-01

242

A critical discussion of the 2nd intercomparison on electron paramagnetic resonance dosimetry with tooth enamel  

International Nuclear Information System (INIS)

Recently, we have participated in 'The 2nd International Intercomparison on EPR Tooth Dosimetry' wherein 18 laboratories had to evaluate low-radiation doses (100-1000 mGy) in intact teeth (Wieser et al., Radiat. Meas., 32 (2000a) 549). The results of this international intercomparison seem to indicate a promising picture of EPR tooth dosimetry. In this paper, the two Belgian EPR participants present a more detailed and critical study of their contribution to this intercomparison. The methods used were maximum likelihood common factor analysis (MLCFA) and spectrum subtraction. Special attention is paid to potential problems with sample preparation, intrinsic dose evaluation, linearity of the dose response, and determination of dose uncertainties

243

2nd FP7 Conference and International Summer School Nanotechnology : From Fundamental Research to Innovations  

CERN Document Server

This book presents some of the latest achievements in nanotechnology and nanomaterials from leading researchers in Ukraine, Europe, and beyond. It features contributions from participants in the 2nd International Summer School “Nanotechnology: From Fundamental Research to Innovations” and International Research and Practice Conference “Nanotechnology and Nanomaterials”, NANO-2013, which were held in Bukovel, Ukraine on August 25-September 1, 2013. These events took place within the framework of the European Commission FP7 project Nanotwinning, and were organized jointly by the Institute of Physics of the National Academy of Sciences of Ukraine, University of Tartu (Estonia), University of Turin (Italy), and Pierre and Marie Curie University (France). Internationally recognized experts from a wide range of universities and research institutions share their knowledge and key results on topics ranging from nanooptics, nanoplasmonics, and interface studies to energy storage and biomedical applications. Pr...

Yatsenko, Leonid

2015-01-01

244

2. slovenski MoodleMoot = 2nd Slovenian MoodleMoot  

Directory of Open Access Journals (Sweden)

Full Text Available Moodle, an open source learning management system, is becoming widely used and recognised all over the world. Slovenian Moodle users have been participating and sharing their experience in the Moodle.si community since 2006. The initiator of the Moodle.si community – the Faculty of Management Koper organised the first Slovenian MoodleMoot Conference last year. The event was organised again in May 2008. The conference was organised by the Centre for E-Learning of the Faculty of Management Koper in co-operation with the Open Source Centre – Slovenia, Artesia and the National School for Leadership in Education. This paper presents the 2nd International Moodle.si Conference.

Viktorija Sulcic

2008-09-01

245

Proceedings of the 2nd joint seminar on atomic collisions and heavy ion induced nuclear reactions  

International Nuclear Information System (INIS)

The meeting of the 2nd joint seminar on atomic collisions and heavy ion induced nuclear reactions was held at the University of Tokyo, May 13 and 14, 1982. The aim of this seminar has been not only to recognize the common problems lying between the above two research fields, but also to obtain an overview of the theoretical and experimental approaches to the current problems. In the seminar, more than 50 participants gathered and presented 16 papers. These are two general reviews and fourteen comprehensive surveys on topical subjects which have been developed very intensively in recent years. The editors would like to thank all participants for their assistance and cooperation in making possible the publication of these proceedings. (author)

246

Proceedings of the 2nd seminar of R and D on advanced ORIENT  

International Nuclear Information System (INIS)

The 2nd Seminar of R and D on advanced ORIENT was held at Ricotte, Japan Atomic Energy Agency, on November 7th, 2008. The first meeting of this seminar series was held at Oarai, Ibaraki in May 2008, and more than fifty participants, including related researchers and members of the general public, attended. The second seminar was headed by the Nuclear Science and Engineering Directorate, JAEA at Tokai, Ibaraki, with 63 participants. Spent nuclear fuel should be recognized not only as a mass of radioactive elements but also as potentially useful material including platinum metals and rare earth elements. Taking into consideration cooperation with universities, related companies and research institutes, we aimed at expanding and progressing the basic research. This report records abstracts and figures submitted by the oral speakers at this seminar. (author)

247

Nonlinear Dynamics of Memristor Based 2nd and 3rd Order Oscillators  

Exceptional behaviours of the Memristor are illustrated in Memristor-based second order (Wien oscillator) and third order (phase shift oscillator) oscillator systems in this thesis. Conventional concepts about sustained oscillation are challenged by demonstrating the possibility of sustained oscillation with oscillating resistance and dynamic poles. Mathematical models are also proposed for analysis, and simulations are presented to support the surprising characteristics of the Memristor-based oscillator systems. This thesis also describes a comparative study among the Wien family oscillators with one Memristor. In the case of the phase shift oscillator, one-Memristor and three-Memristor systems are illustrated and compared to generalize the nonlinear dynamics observed for both the 2nd order and 3rd order systems. Detailed explanations are provided with analytical models to clarify the unconventional properties of Memristor-based oscillatory systems.
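The state-dependent, time-varying resistance at the heart of these circuits can be illustrated with the common linear ion-drift memristor model, in which the device resistance is a mix of R_on and R_off weighted by an internal state variable driven by the current. The sketch below simply integrates that textbook model under a sinusoidal drive to show the oscillating resistance and the pinched hysteresis it produces; it is not the oscillator analysis from the thesis, and all component values are invented.

import numpy as np

# Linear ion-drift memristor model (HP-style), driven by a sinusoidal voltage.
R_on, R_off = 100.0, 16e3        # ohm
D, mu_v     = 10e-9, 1e-14       # device length (m), dopant mobility (m^2 s^-1 V^-1)
w = 0.5 * D                      # internal state: doped-region width
dt, T = 1e-5, 0.2                # time step and total simulation time (s)
amp, freq = 1.0, 10.0            # drive amplitude (V) and frequency (Hz)

t = np.arange(0.0, T, dt)
v = amp * np.sin(2 * np.pi * freq * t)
i_hist, r_hist = [], []

for vk in v:
    x = w / D
    R = R_on * x + R_off * (1.0 - x)          # state-dependent ("memristive") resistance
    i = vk / R
    w += mu_v * R_on / D * i * dt             # linear drift of the state variable
    w = min(max(w, 0.0), D)                   # keep the state inside the device
    i_hist.append(i)
    r_hist.append(R)

print(f"resistance swings between {min(r_hist):.0f} and {max(r_hist):.0f} ohm")
# Plotting i_hist against v would show the pinched hysteresis loop characteristic of a memristor.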

Talukdar, Abdul Hafiz

2011-05-01

248

2nd symposium on materials research 1991. Papers and posters. Vol. 3  

International Nuclear Information System (INIS)

The '2nd symposium on materials research' was intended to document and take stock of the technological status of the Federal Republic in the area of new materials. Through overview lectures and subject-related lectures, results ranging from fundamental research up to practical materials developments are presented. In the first volume, the topic areas of polymers and metals are discussed; in the second volume, ceramic materials, composites, as well as measurement technology, testing methods and analysis engineering; and in the third volume, thin film technology and tribology. This was followed by a poster presentation (286 posters) on the subjects of ceramic materials, powder metallurgy, high temperature and special materials, composites and new polymers. (MM)

249

Black Hole Evaporation and Generalized 2nd Law with Nonequilibrium Thermodynamics  

CERN Document Server

In general, when a black hole evaporates, there arises a net energy flow from the black hole into its outside environment due to Hawking radiation and energy accretion onto the black hole. The existence of an energy flow means that the thermodynamic state of the whole system, which consists of a black hole and its environment, is in a nonequilibrium state. To understand the details of the evaporation process, the nonequilibrium effects of the energy flow should be taken into account. The nonequilibrium nature of black hole evaporation is a challenging topic involving issues not only of black hole physics but also of nonequilibrium physics. Using the nonequilibrium thermodynamics which has been formulated recently, this report shows: (1) the self-gravitational effect of the black hole, which appears as its negative heat capacity, guarantees the validity of the generalized 2nd law without entropy production inside the outside environment, (2) the nonequilibrium effect of the energy flow tends to shorten the evaporation time (life time) of the black hole, an...
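For orientation, the standard equilibrium relations this kind of analysis builds on can be written down explicitly. In Planck units (G = c = hbar = k_B = 1) a Schwarzschild black hole of mass M has the Hawking temperature, Bekenstein-Hawking entropy and (negative) heat capacity below, and the generalized second law requires the total entropy of hole plus environment not to decrease. These are the textbook formulas, not the nonequilibrium corrections derived in the report.

T_{H}=\frac{1}{8\pi M},\qquad
S_{\mathrm{BH}}=\frac{A}{4}=4\pi M^{2},\qquad
C=\frac{dM}{dT_{H}}=-8\pi M^{2}<0,\qquad
\Delta S_{\mathrm{gen}}=\Delta S_{\mathrm{BH}}+\Delta S_{\mathrm{env}}\ge 0 .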

Saida, Hiromi

2007-01-01

250

Results of the 2nd regular inspection of the experimental fast reactor Joyo  

International Nuclear Information System (INIS)

The 2nd regular inspection and also some facility modification were made on the experimental fast reactor Joyo from August, 1980, to March, 1981. To secure the safety of the reactor and its stable operation, the inspection involved the overhaul and inspection of components, piping and instrumentation, and the tests of function, leakage, calibration and performance. The reactor and the facilities inspected were reactor proper, fuel facility, cooling system, measurement and control systems, waste facility, radiation control facility, reactor containment, etc. Both facilities and performance of Joyo passed the regular inspection. The following matters are described: the contents of the inspection and the results (facilities and performance); the works done for the installation, etc. during the inspection; and radiation control data during the inspection. (J.P.N.)

251

Efficiency losses at least halved. 2nd generation of CCS technologies; Effizienzeinbussen mindestens halbiert. CCS-Technologien der zweiten Generation  

Energy Technology Data Exchange (ETDEWEB)

In cooperation with Darmstadt Technical University, Alstom is testing two new processes for CO2 removal from flue gas. Especially the chemical looping process is very promising as it involves an extremely low reduction of the power plant efficiency.

Jopp, Klaus

2011-07-01

252

The influence of the 1st AlN and the 2nd GaN layers on properties of AlGaN/2nd AlN/2nd GaN/1st AlN/1st GaN structure  

Science.gov (United States)

This is a theoretical study of the influence of the 1st AlN interlayer and the 2nd GaN layer on the properties of the Al0.3Ga0.7N/2nd AlN/2nd GaN/1st AlN/1st GaN HEMT structure, carried out by self-consistently solving the coupled Schrödinger and Poisson equations. Our calculation shows that by increasing the 1st AlN thickness from 1.0 nm to 3.0 nm, the 2DEG, which is originally confined totally in the 2nd channel, gradually decreases there, begins to turn up and eventually concentrates in the 1st one. The total 2DEG (2DEG in both channels) sheet density increases nearly linearly with increasing 1st AlN thickness. The slope of the potential profile of the AlGaN also changes with the 1st AlN thickness, causing the unusual dependence of the total 2DEG sheet density on the thickness of the AlGaN barrier. The variations of the 2DEG distribution, the total 2DEG sheet density and the conduction band profiles as a function of the 2nd GaN thickness have also been discussed. Their physical mechanisms have been investigated on the basis of surface state theory. The confinement of the 2DEG can be further enhanced by the double-AlN interlayer, compared with the InGaN back-barrier.
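The self-consistent Schrödinger-Poisson procedure used here can be illustrated on a strongly simplified 1D toy structure: solve the single-band Schrödinger equation for the current potential, fill the subbands with a 2D density of states to get the electron density, solve Poisson's equation for the Hartree potential that density creates, and iterate with damping until the two are consistent. The Python/NumPy sketch below uses invented layer thicknesses, band offsets and a fixed Fermi level, and ignores polarization charges and donors; it shows the structure of the iteration, not the authors' actual device model.

import numpy as np

# --- toy structure and constants (illustrative values, not the paper's device) ---
kT    = 0.0259                         # eV at 300 K
mstar = 0.2                            # GaN-like effective mass in units of m0
hb2m  = 0.0381 / mstar                 # hbar^2/(2 m*) in eV nm^2
coul  = 18.1 / 9.0                     # e^2/(eps0*eps_r) in eV nm, eps_r ~ 9 assumed
dos2d = mstar / (np.pi * 2 * 0.0381)   # m*/(pi hbar^2) in nm^-2 eV^-1
E_F   = 0.10                           # Fermi level in eV (assumed fixed by the contacts)

dz = 0.1
z  = np.arange(0.0, 30.0, dz)          # 30 nm domain
N  = z.size

# Fixed band profile: a 1.5 eV barrier for z < 5 nm and a linear confining field in the channel
V_fixed = np.where(z < 5.0, 1.5, 0.05 * (z - 5.0))

# Second-derivative matrix with zero Dirichlet boundaries
lap = (np.diag(-2.0 * np.ones(N)) + np.diag(np.ones(N - 1), 1) + np.diag(np.ones(N - 1), -1)) / dz**2

V_H = np.zeros(N)
for it in range(150):
    # 1) Schroedinger step: H = -hbar^2/(2 m*) d2/dz2 + V_fixed + V_H
    H = -hb2m * lap + np.diag(V_fixed + V_H)
    E, psi = np.linalg.eigh(H)
    psi2 = psi**2 / dz                                           # normalized |psi|^2 in nm^-1

    # 2) Fill the lowest subbands with the 2D density of states
    n_sub = dos2d * kT * np.log1p(np.exp((E_F - E[:10]) / kT))   # nm^-2 per subband
    n = psi2[:, :10] @ n_sub                                     # electron density in nm^-3

    # 3) Poisson step: d2 V_H/dz2 = -(e^2/eps) n gives a repulsive Hartree bump
    V_new = np.linalg.solve(lap, -coul * n)

    # 4) Damped mixing for stability of the fixed-point iteration
    V_H = 0.95 * V_H + 0.05 * V_new

sheet = n.sum() * dz                                             # 2DEG sheet density in nm^-2
print(f"2DEG sheet density ~ {sheet * 1e14:.2e} cm^-2")
print("lowest subbands (eV):", np.round(E[:3], 3))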

Bi, Yang; Wang, XiaoLiang; Yang, CuiBai; Xiao, HongLing; Wang, CuiMei; Peng, EnChao; Lin, DeFeng; Feng, Chun; Jiang, LiJuan

2011-09-01

253

Evidence for the stable existence of the Fe2Nd phase in the Fe-Nd system  

Science.gov (United States)

We did directional solidification experiments on Fe20Nd80 alloys at several low cooling rates in order to observe the solidification sequence, displayed along the length of the sample. We also took measurements of the transition temperatures around the eutectic temperature using a Calvet type calorimeter. Our results show that the formation of the eutectic L → (Nd) + Fe17Nd2 occurs first, and is followed by the eutectic L → (Nd) + Fe2Nd. The calorimetric measurements give the temperature of the first eutectic to be 688 °C, the temperature of the peritectic formation of the phase Fe2Nd to be 682 °C and the temperature of its eutectoid decomposition as 659 °C. Thermomagnetic measurements give the Curie temperature of the Fe2Nd phase as 250 °C. A new phase diagram for this system is proposed.

Santos, I. A.; Gama, S.

1999-08-01

254

Comparative evaluation of results in the combined radiotherapy of patients with cervix cancer of 2nd and 3rd stages  

International Nuclear Information System (INIS)

Better treatment results were shown for combined radiotherapy of patients with cervix cancer of the 2nd and 3rd stages using moving and static methods of external and internal irradiation with 'Agat B'. 87.0 per cent of patients with cervix cancer of the 2nd stage and 70.2 per cent of the 3rd stage treated by this method have chances of survival for more than 4 years. Compared to the other methods, cystitis complications decreased by 19.1 per cent and rectum complications by 20.0 per cent after afterloading. (Translated by J.U., 7 tabs.)

255

Move Table: An Intelligent Software Tool for Optimal Path Finding and Halt Schedule Generation  

Directory of Open Access Journals (Sweden)

Full Text Available This study aims to help army officials in taking decisions before war to decide the optimal path for army troops moving between two points in a real world digital terrain, considering factors like traveled distance, terrain type, terrain slope, and road network. There can optionally be one or more enemies (obstacles) located on the terrain which should be avoided. A tile-based A* search strategy with diagonal distance and tie-breaker heuristics is proposed for finding the optimal path between source and destination nodes across a real-world 3-D terrain. A performance comparison (time analysis, search space analysis, and accuracy) has been made between the multiresolution A* search and the proposed tile-based A* search for large-scale digital terrain maps. Different heuristics, which are used by the algorithms to guide these to the goal node, are presented and compared to overcome some of the computational constraints associated with path finding on large digital terrains. Finally, a halt schedule is generated using the optimal path, weather condition, moving time, priority and type of a column, so that the senior military planners can strategically decide in advance the time and locations where the troops have to halt or overtake other troops depending on their priority and also the time of reaching the destination.
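The core of a tile-based A* search with a diagonal-distance (octile) heuristic and a small tie-breaker can be sketched compactly. The version below is a generic illustration in Python, not the authors' implementation: the terrain costs, the obstacle encoding and the tie-breaking factor are invented, and a real digital terrain map would fold slope and road-network weights into the step cost.

import heapq

def octile(a, b):
    """Diagonal-distance heuristic for 8-connected grids (unit straight cost, sqrt(2) diagonal)."""
    dx, dy = abs(a[0] - b[0]), abs(a[1] - b[1])
    return (dx + dy) + (2**0.5 - 2) * min(dx, dy)

def astar(grid, start, goal):
    """Tile-based A* over a 2D cost grid; cells with value None are obstacles (e.g. enemy zones)."""
    rows, cols = len(grid), len(grid[0])
    moves = [(-1, 0), (1, 0), (0, -1), (0, 1), (-1, -1), (-1, 1), (1, -1), (1, 1)]
    tie = 1.0 + 1e-4                                  # tie-breaker: slightly inflate the heuristic
    g = {start: 0.0}
    parent = {start: None}
    open_heap = [(tie * octile(start, goal), start)]
    closed = set()
    while open_heap:
        _, cur = heapq.heappop(open_heap)
        if cur == goal:                               # reconstruct the path back to the start
            path = []
            while cur is not None:
                path.append(cur)
                cur = parent[cur]
            return path[::-1]
        if cur in closed:
            continue
        closed.add(cur)
        for dx, dy in moves:
            nxt = (cur[0] + dx, cur[1] + dy)
            if not (0 <= nxt[0] < rows and 0 <= nxt[1] < cols):
                continue
            if grid[nxt[0]][nxt[1]] is None:          # impassable / enemy tile
                continue
            step = (2**0.5 if dx and dy else 1.0) * grid[nxt[0]][nxt[1]]
            new_g = g[cur] + step
            if new_g < g.get(nxt, float("inf")):
                g[nxt] = new_g
                parent[nxt] = cur
                heapq.heappush(open_heap, (new_g + tie * octile(nxt, goal), nxt))
    return None                                       # no route found

if __name__ == "__main__":
    # 1.0 = easy terrain, 3.0 = steep slope, None = blocked
    terrain = [[1.0, 1.0, 1.0, 1.0],
               [1.0, None, 3.0, 1.0],
               [1.0, None, 1.0, 1.0],
               [1.0, 1.0, 1.0, 1.0]]
    print(astar(terrain, (0, 0), (3, 3)))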

Anupam Agrawal

2007-09-01

256

Move Table: An Intelligent Software Tool for Optimal Path Finding and Halt Schedule Generation  

Directory of Open Access Journals (Sweden)

Full Text Available This study aims to help army officials in taking decisions before war to decide the optimal path for army troops moving between two points in a real world digital terrain, considering factors like traveled distance, terrain type, terrain slope, and road network. There can optionally be one or more enemies (obstacles) located on the terrain which should be avoided. A tile-based A* search strategy with diagonal distance and tie-breaker heuristics is proposed for finding the optimal path between source and destination nodes across a real-world 3-D terrain. A performance comparison (time analysis, search space analysis, and accuracy) has been made between the multiresolution A* search and the proposed tile-based A* search for large-scale digital terrain maps. Different heuristics, which are used by the algorithms to guide these to the goal node, are presented and compared to overcome some of the computational constraints associated with path finding on large digital terrains. Finally, a halt schedule is generated using the optimal path, weather condition, moving time, priority and type of a column, so that the senior military planners can strategically decide in advance the time and locations where the troops have to halt or overtake other troops depending on their priority and also the time of reaching the destination.

Anupam Agrawal

2007-09-01

257

Performance of the discriminating thermoluminescence personal dosimeter and results in the 2nd IAEA/RCA personal dosimetry intercomparison  

International Nuclear Information System (INIS)

The composition, experimental method, back-scattering of the slab phantom, energy response, directional response, radiation energy discrimination method and dose calibration method of the discriminating thermoluminescence personal dosimeter are presented. Good results were achieved in both phases of the 2nd IAEA/RCA personal dosimeter intercomparison with these dosimeters. The dosimeters have now been purchased by about 50 personal dose monitoring departments.

258

Observation in a School without Walls: Peer Observation of Teaching in a 2nd-12th Grade Independent School  

Science.gov (United States)

What happens when teachers start to observe each other's classes? How do teachers make meaning of observing and being observed? What effects, if any, does requiring peer observation have on the teaching community? This research explores these questions in a qualitative study of peer observation of teaching (POT) in the 2nd-12th grades of an…

Salvador, Josephine

2012-01-01

259

THINKLET: ELEMENTO CLAVE EN LA GENERACIÓN DE MÉTODOS COLABORATIVOS PARA EVALUAR USABILIDAD DE SOFTWARE / THINKLET: KEY ELEMENT IN THE COLLABORATIVE METHODS GENERATION FOR EVALUATE SOFTWARE USABILITY  

Scientific Electronic Library Online (English)

Full Text Available SciELO Colombia | Language: Spanish. Abstract in Spanish: Usability is now a fundamental attribute for the success of a software product. Competition among organizations forces them to improve the usability level of their products, given the risk of losing customers if a product is not easy to use and/or easy to learn. [...] Although methods have been established for evaluating the usability of software products, most of these methods do not consider the possibility of involving several people working collaboratively in the evaluation process. For this reason, it would be advisable to use the Methodology for the Design of Collaborative Usability Evaluation Methods, so that methods can be designed that allow several people from different areas of knowledge to work collaboratively in the evaluation process. This article presents the methodology in general terms and places special emphasis on thinklets as key elements for the design of collaborative processes. Abstract in English: Currently, usability is a critical attribute to the success of software. The competition among organizations forces them to improve the level of product usability due to the risk of losing customers if the product is not easy to use and/or easy to learn. Methods have been established to evaluate the usability of software products; however, most of these methods don't take into account the possibility of involving several people working collaboratively in the evaluation process. Therefore, the Methodology for Design of Collaborative Usability Evaluation Methods should be used to design methods that allow several people from a range of knowledge areas to work collaboratively in the evaluation process. This paper presents the methodology mentioned and gives special emphasis to Thinklets, as key elements for the design of collaborative processes.

Andrés, Solano Alegría; Yenny, Méndez Alegría; César, Collazos Ordóñez.

2010-07-01

260

An open-source software tool for the generation of relaxation time maps in magnetic resonance imaging  

Digital Repository Infrastructure Vision for European Research (DRIVER)

Abstract Background In magnetic resonance (MR) imaging, T1, T2 and T2* relaxation times represent characteristic tissue properties that can be quantified with the help of specific imaging strategies. While there are basic software tools for specific pulse sequences, until now there is no universal software program available to automate pixel-wise mapping of relaxation times from various types of images or MR systems. Such a software program would allow researchers to test and...
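The pixel-wise mapping such a tool automates boils down to fitting a relaxation model to the signal of every pixel across a series of images. A minimal sketch of a mono-exponential T2 (or T2*) fit is shown below; it assumes magnitude images acquired at several echo times and uses a simple log-linear least-squares fit per pixel, which is only one of several possible fitting strategies and is not taken from the software described in the abstract.

import numpy as np

def t2_map(images, echo_times, threshold=20.0):
    """Pixel-wise mono-exponential T2/T2* fit: S(TE) = S0 * exp(-TE / T2).
    images: array of shape (n_echoes, ny, nx); echo_times: length n_echoes, in ms.
    Returns a T2 map in ms (0 where the first-echo signal is below the noise threshold)."""
    te = np.asarray(echo_times, dtype=float)
    sig = np.clip(np.asarray(images, dtype=float), 1e-6, None)   # avoid log(0)
    logs = np.log(sig).reshape(len(te), -1)                      # (n_echoes, n_pixels)
    # Linear least squares of ln S = ln S0 - TE/T2 for every pixel at once
    slope, intercept = np.polyfit(te, logs, 1)
    t2 = np.zeros_like(slope)
    decaying = slope < 0
    t2[decaying] = -1.0 / slope[decaying]
    t2[sig[0].reshape(-1) < threshold] = 0.0                     # mask background pixels
    return t2.reshape(sig.shape[1:])

if __name__ == "__main__":
    # Synthetic test: a 32x32 phantom with true T2 = 80 ms sampled at 8 echo times
    te = np.arange(10, 90, 10)
    imgs = 1000.0 * np.exp(-te[:, None, None] / 80.0) * np.ones((len(te), 32, 32))
    print(np.round(t2_map(imgs, te)[16, 16], 2))                 # ~80.0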

Kühne Titus; Wassmuth Ralf; Abdel-Aty Hassan; Rudolph Andre; Messroghli Daniel R; Dietz Rainer; Schulz-Menger Jeanette

2010-01-01

 
 
 
 
261

Academic Training - 2nd Term: 08.01.2007 - 31.03.2007  

CERN Multimedia

2006 - 2007 ACADEMIC TRAINING PROGRAMME 2nd Term : 08.01.2007 - 31.03.2007 LECTURE SERIES Applied Superconductivity by V. Palmieri, INFN, Padova, It. 17, 18, 19 January 11:00-12:00 - Auditorium, Bldg 500 String Theory for Pedestrians by B. Zwiebach, M.I.T. Cambridge, USA 29, 30, 31 January 11:00-12:00 - Auditorium, Bldg 500 on 29, 30 January TH Auditorium on 31 January Introduction to Supersymmetry by D. Kaplan, John Hopkins University, Baltimore, USA 12, 13, 14, 15 February 11:00-12:00 - Auditorium, Bldg 500 The Hunt for the Higgs Particle by F. Zwirner, University of Padova, It 27, 28 February, 1st March 11:00-12:00 - Auditorium, Bldg 500 From Evolution Theory to Parallel and Distributed Genetic Programming by F. Fernandez de Vega 15, 16, March 11:00-12:00 - Auditorium, Bldg 500 The lectures are open to all those interested, without application. The abstract of the lectures, as well as any change to the above information (title, dates, time, place etc.) will be published in the CERN bulletin, the WWW, an...

2006-01-01

262

Academic Training - 2nd Term: 08.01.2007 - 31.03.2007  

CERN Multimedia

2006 - 2007 ACADEMIC TRAINING PROGRAMME 2nd Term : 08.01.2007 - 31.03.2007 LECTURE SERIES Applied Superconductivity by V. Palmieri, INFN, Padova, It. 17, 18, 19 January 11:00-12:00 - Auditorium, bldg. 500 String Theory for Pedestrians by B. Zwiebach, M.I.T. Cambridge, USA 29, 30, 31 January 11:00-12:00 - Auditorium, bldg. 500 on 29, 30 January TH Auditorium on 31 January Introduction to Supersymmetry by D. Kaplan, John Hopkins University, Baltimore, USA 12, 13, 14, 15 February 11:00-12:00 - Auditorium, bldg. 500 The Hunt for the Higgs Particle by F. Zwirner, University of Padova, It 27, 28 February, 1st March 11:00-12:00 - Auditorium, bldg. 500 From Evolution Theory to Parallel and Distributed Genetic Programming by F. Fernandez de Vega 15, 16, March 11:00-12:00 - Auditorium, bldg. 500 The lectures are open to all those interested, without application. The abstract of the lectures, as well as any change to the above information (title, dates, time, place etc.) will be published in the WWW, and ...

2006-01-01

263

Proceedings of the 2nd CSNI Specialist Meeting on Simulators and Plant Analysers  

International Nuclear Information System (INIS)

The safe utilisation of nuclear power plants requires the availability of different computerised tools for analysing the plant behaviour and training the plant personnel. These can be grouped into three categories: accident analysis codes, plant analysers and training simulators. The safety analysis of nuclear power plants has traditionally been limited to the worst accident cases expected for the specific plant design. Many accident analysis codes have been developed for different plant types. The scope of the analyses has continuously expanded. The plant analysers are now emerging tools intended for extensive analysis of the plant behaviour using a best estimate model for the whole plant including the reactor and full thermodynamic process, both combined with automation and electrical systems. The comprehensive model is also supported by good visualisation tools. Training simulators with real time plant model are tools for training the plant operators to run the plant. Modern training simulators have also features supporting visualisation of the important phenomena occurring in the plant during transients. The 2nd CSNI Specialist Meeting on Simulators and Plant Analysers in Espoo attracted some 90 participants from 17 countries. A total of 49 invited papers were presented in the meeting in addition to 7 simulator system demonstrations. Ample time was reserved for the presentations and informal discussions during the four meeting days. (orig.)

264

Multiferroicity in La1/2Nd1/2FeO3 nanoparticles  

Science.gov (United States)

Nano-sized La1/2Nd1/2FeO3 (LNF) powder is synthesized by the sol-gel citrate method. The Rietveld refinement of the X-ray diffraction profile of the sample at room temperature (303 K) shows the orthorhombic phase with Pbnm symmetry. The particle size is obtained by transmission electron microscope. The antiferromagnetic nature of the sample is explained using zero field cooled and field cooled magnetisation and the corresponding hysteresis loop. A signature of weak ferromagnetic phase is observed in LNF at low temperature which is explained on the basis of spin glass like behaviour of surface spins. The dielectric relaxation of the sample has been investigated using impedance spectroscopy in the frequency range from 42 Hz to 1 MHz and in the temperature range from 303 K to 513 K. The Cole-Cole model is used to analyse the dielectric relaxation of LNF. The frequency dependent conductivity spectra follow the power law. The magneto capacitance measurement of the sample confirms its multiferroic behaviour.
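For reference, the standard forms usually meant by the Cole-Cole model and the power law of ac conductivity are given below, with the static and high-frequency permittivities, the relaxation time and the broadening parameter as symbols; the specific parameter values fitted for LNF are not reproduced here.

\varepsilon^{*}(\omega)=\varepsilon_{\infty}+\frac{\varepsilon_{s}-\varepsilon_{\infty}}{1+(i\omega\tau)^{1-\alpha}},\qquad
\sigma(\omega)=\sigma_{\mathrm{dc}}+A\,\omega^{n}.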

Chanda, Sadhan; Saha, Sujoy; Dutta, Alo; Mahapatra, A. S.; Chakrabarti, P. K.; Kumar, Uday; Sinha, T. P.

2014-11-01

265

Mesocosm soil ecological risk assessment tool for GMO 2nd tier studies  

DEFF Research Database (Denmark)

Ecological Risk Assessment (ERA) of GMO is basically identical to ERA of chemical substances, when it comes to assessing specific effects of the GMO plant material on the soil ecosystem. The tiered approach always includes the option of studying more complex but still realistic ecosystem level effects in 2nd tier caged experimental systems, cf. the new GMO ERA guidance: EFSA Journal 2010; 8(11):1879. We propose to perform a trophic structure analysis, TSA, and include the trophic structure as an ecological endpoint to gain more direct insight into the change in interactions between species, i.e. the food-web structure, instead of relying only on the indirect evidence from population abundances. The approach was applied for effect assessment in the agro-ecosystem where we combined factors of elevated CO2, viz. global climate change, and GMO plant effects. A multi-species (Collembola, Acari and Enchytraeidae) mesocosm factorial experiment was set up in a greenhouse at ambient CO2 and 450 ppm CO2 with a GM barley variety and conventional varieties. The GM barley differed concerning the composition of amino acids in the grain (antisense C-hordein line). The fungicide carbendazim acted as a positive control. After 5 and 11 weeks, data on populations, plants and soil organic matter decomposition were evaluated. Natural abundances of stable isotopes, 13C and 15N, of animals, soil, plants and added organic matter (crushed maize leaves) were used to describe the soil food web structure.

D'Annibale, Alessandra; Maraldo, Kristine

266

DRS // CUMULUS Oslo 2013. The 2nd International Conference for Design Education Researchers  

Directory of Open Access Journals (Sweden)

Full Text Available 14-17 May 2013, Oslo, Norway. We have received more than 200 full papers for the 2nd International Conference for Design Education Researchers in Oslo. This international conference is a springboard for sharing ideas and concepts about contemporary design education research. Contributors are invited to submit research that deals with different facets of contemporary approaches to design education research. All papers will be double-blind peer-reviewed. This conference is open to research in any aspect and discipline of design education. Conference theme: Design Learning for Tomorrow - Design Education from Kindergarten to PhD. Designed artefacts and solutions influence our lives and values, both from a personal and societal perspective. Designers, decision makers, investors and consumers hold different positions in the design process, but they all make choices that will influence our future visual and material culture. To promote sustainability and meet global challenges for the future, professional designers are dependent on critical consumers and a design literate general public. For this purpose design education is important for all. We propose that design education in general education represents both a foundation for professional design education and a vital requirement for developing the general public's competence for informed decision making. REGISTRATION AT http://www.hioa.no/DRScumulus

Liv Merete Nielsen

2013-01-01

267

The Development of Information Literacy Assessment for 2nd Grade Students and Their Performance  

Directory of Open Access Journals (Sweden)

Full Text Available The main purpose of this study was to develop an Information Literacy Assessment for 2nd-grade students and evaluate their performance. The assessment included a regular test and a portfolio assessment. There were 30 multiple-choice items and 3 constructed-response items in the test, while the portfolio assessment was based on the Super3 model. This study was conducted in an elementary school located in southern Taiwan. One hundred and forty-two second graders took the test, and only one class was randomly selected as the subjects for the portfolio assessment. The results showed that the test and portfolio assessment had good validity and reliability. In the fields of library literacy and media literacy, second-grade students with different abilities performed differently, while boys and girls performed similarly. Students performed well in the process of the Super3 model, only in the Plan Phase, they still needed teachers’ help to pose inquiry questions. At last, several suggestions were proposed for information literacy assessment and future research.

Lin Ching Chen

2013-10-01

268

Proceedings: 2nd IEA international workshop on beryllium technology for fusion  

International Nuclear Information System (INIS)

The 2nd IEA International Workshop on Beryllium Technology for Fusion was held September 6--8, 1995 at Jackson Lake Lodge, Wyoming. Forty-four participants took part in the workshop representing Europe, Japan, the Russian Federation, and the United States including representatives from both government laboratories and private industry. The workshop was divided into six technical sessions and a ''town meeting'' panel discussion. Technical sessions addressed the general topics of: Thermomechanical Properties; Manufacturing Technologies; Radiation Effects; Plasma/Tritium Interactions; Safety, Applications, and Design; and Joining and Testing. This volume contains the majority of the papers presented at the workshop. In some instances, the authors of the papers could not be present at the workshop, and the papers were given by others, sometimes in summary form and in some instances combined with others. The full papers are included here in the sequence in which they would have been given. In other instances, presentations were made but no papers were submitted for publication. Those papers do not appear here. In summary, the workshop was very successful. The main objectives of bringing key members of the fusion beryllium community together was certainly met. Forty-four participants registered, and 35 papers were presented. Individual papers are indexed separately on the energy data bases

269

Evaluation of the Battelle Developmental Inventory, 2nd Edition, Screening Test for Use in States' Child Outcomes Measurement Systems under the Individuals with Disabilities Education Act  

Science.gov (United States)

This study evaluated the Battelle Developmental Inventory, 2nd Edition, Screening Test (BDI-2 ST) for use in states' child outcomes accountability systems under the Individuals with Disabilities Education Act. Complete Battelle Developmental Inventory, 2nd Edition (BDI-2), assessment data were obtained for 142 children, ages 2 to 62 months, who…

Elbaum, Batya; Gattamorta, Karina A.; Penfield, Randall D.

2010-01-01

270

The influence of the 1st AlN and the 2nd GaN layers on properties of AlGaN/2nd AlN/2nd GaN/1st AlN/1st GaN structure  

Energy Technology Data Exchange (ETDEWEB)

This is a theoretical study of the influence of the 1st AlN interlayer and the 2nd GaN layer on the properties of the Al{sub 0.3}Ga{sub 0.7}N/2nd AlN/2nd GaN/1st AlN/1st GaN HEMT structure, carried out by self-consistently solving the coupled Schroedinger and Poisson equations. Our calculation shows that by increasing the 1st AlN thickness from 1.0 nm to 3.0 nm, the 2DEG, which is originally confined totally in the 2nd channel, gradually decreases there, begins to turn up and eventually concentrates in the 1st one. The total 2DEG (2DEG in both channels) sheet density increases nearly linearly with increasing 1st AlN thickness. The slope of the potential profile of the AlGaN also changes with the 1st AlN thickness, causing the unusual dependence of the total 2DEG sheet density on the thickness of the AlGaN barrier. The variations of the 2DEG distribution, the total 2DEG sheet density and the conduction band profiles as a function of the 2nd GaN thickness have also been discussed. Their physical mechanisms have been investigated on the basis of surface state theory. The confinement of the 2DEG can be further enhanced by the double-AlN interlayer, compared with the InGaN back-barrier. (orig.)

Bi, Yang; Peng, EnChao; Lin, DeFeng [Chinese Academy of Sciences, Materials Science Center, Institute of Semiconductors, P.O. Box 912, Beijing (China); Wang, XiaoLiang; Yang, CuiBai; Xiao, HongLing; Wang, CuiMei; Feng, Chun; Jiang, LiJuan [Chinese Academy of Sciences, Materials Science Center, Institute of Semiconductors, P.O. Box 912, Beijing (China); Chinese Academy of Sciences, Key Laboratory of Semiconductor Materials Science, Institute of Semiconductors, P.O. Box 912, Beijing (China)

2011-09-15

271

Madeira Extreme Floods: 2009/2010 Winter. Case study - 2nd and 20th of February  

Science.gov (United States)

Floods are, at the world scale, the natural disaster that affects the largest fraction of the population. It is a phenomenon that extends its effects to the surrounding areas of the hydrographic network (basins, rivers, dams) and the coast line. According to the USA FEMA (Federal Emergency Management Agency), flood can be defined as: "A general and temporary condition of partial or complete inundation of two or more acres of normally dry land area or of two or more properties from: Overflow of inland or tidal waters; Unusual and rapid accumulation or runoff of surface waters from any source; Mudflow; Collapse or subsidence of land along the shore of a lake or similar body of water as a result of erosion or undermining caused by waves or currents of water exceeding anticipated cyclical levels that result in a flood as defined above." A flash flood is the result of intense and long-lasting continuous precipitation and can result in fatalities (e.g. the floods in mainland Portugal in 1967, 1983 and 1997). The speed and strength of floods, whether localized or over large areas, result in enormous social impacts, both through the loss of human lives and through devastating damage to the landscape and human infrastructures. The winter of 2009/2010 on Madeira Island was characterized by several episodes of very intense precipitation (especially in December 2009 and February 2010), resulting in a new record of accumulated precipitation since records began on the island. In February, two days were especially rainy, with absolute records for the month of February (daily records since 1949): 111 mm and 97 mm on the 2nd and 20th respectively. The accumulated precipitation culminated in the terrible floods of the 20th of February, causing the loss of dozens of human lives and hundreds of millions of euros in damage. Large precipitation occurrences, whether more intense precipitation over a short period or less intense precipitation over a longer period, are sometimes the precursor of geological phenomena resulting in land movement, often in the same areas as, or very near, previous episodes. Although flood episodes depend strongly on the topography and hydrological capacity of the terrain, human intervention is also an enormously important factor, more specifically anthropogenic factors such as deforestation, dams, changes of water fluxes, and sealing of the terrain surface. The risk assessment of floods should be addressed based not only on knowledge of the meteorological and hydrometeorological factors, such as accumulated precipitation and soil water balance, but also on the river path and water volumes, as well as the surrounding geomorphology of the water basins. The current work focuses on the meteorological contribution to the 2010 flood episode on Madeira Island, specifically the climatic characterization of the 2009/2010 winter, with particular attention to the 2nd and 20th of February.

Pires, V.; Marques, J.; Silva, A.

2010-09-01

272

Re-fighting the 2nd Anglo-Boer War: historians in the trenches  

Directory of Open Access Journals (Sweden)

Full Text Available Some one hundred years ago, South Africa was torn apart by the 2nd Anglo-Boer War (1899-1902). The war was a colossal psychological experience fought at great expense: it cost Britain twenty-two thousand men and £223 million. The social, economic and political cost to South Africa was greater than the statistics immediately indicate: at least ten thousand fighting men in addition to the camp deaths, where a combination of indifference and incompetence resulted in the deaths of 27 927 Boers and at least 14 154 Black South Africans. Yet these numbers belie the consequences. It was easy for the British to 'forget' the pain of the war, which seemed so insignificant after the losses sustained in 1914-18. With a long history of far-off battles and foreign wars, the British casualties of the Anglo-Boer War became increasingly insignificant, as opposed to the lesser numbers held in the collective Afrikaner mind. This impact may be stated somewhat more candidly in terms of the war participation ratio for the belligerent populations. After all, not all South Africans fought in uniform. For the Australian colonies these varied between 4½ per thousand (New South Wales) and 42.3 per thousand (Tasmania); New Zealand 8 per thousand, Britain 8½ per thousand, and Canada 12.3 per thousand; while in parts of South Africa this was perhaps as high as 900 per thousand. The deaths and the high South African participation ratio, together with the unjustness of the war in the eyes of most Afrikaners, introduced a bitterness, if not a hatred, which has cast long shadows upon twentieth-century South Africa.

Ian Van der Waag

2012-02-01

273

Conference Report on the 2nd International Symposium on Lithium Applications for Fusion Devices  

Science.gov (United States)

The 2nd International Symposium on Lithium Applications for Fusion Devices (ISLA-2011) was held on 27-29 April 2011 at the Princeton Plasma Physics Laboratory (PPPL) with broad participation from the community working on aspects of lithium research for fusion energy development. This community is expanding rapidly in many areas including experiments in magnetic confinement devices and a variety of lithium test stands, theory and modeling and developing innovative approaches. Overall, 53 presentations were given representing 26 institutions from 10 countries. The latest experimental results from nine magnetic fusion devices were given in 24 presentations, from NSTX (PPPL, USA), LTX (PPPL, USA), FT-U (ENEA, Italy), T-11M (TRINITY, RF), T-10 (Kurchatov Institute, RF), TJ-II (CIEMAT, Spain), EAST (ASIPP, China), HT-7 (ASIPP, China), and RFX (Padova, Italy). Sessions were devoted to: I. Lithium in magnetic confinement experiments (facility overviews), II. Lithium in magnetic confinement experiments (topical issues), III. Special session on liquid lithium technology, IV. Lithium laboratory test stands, V. Lithium theory/modeling/comments, VI. Innovative lithium applications and VII. Panel discussion on lithium PFC viability in magnetic fusion reactors. There was notable participation from the fusion technology communities, including the IFE, IFMIF and TBM communities providing productive exchanges with the physics oriented magnetic confinement lithium research groups. It was agreed to continue future exchanges of ideas and data to help develop attractive liquid lithium solutions for very challenging magnetic fusion issues, such as development of a high heat flux steady-state divertor concept and acceptable plasma disruption mitigation techniques while improving plasma performance with lithium. The next workshop will be held at ENEA, Frascati, Italy in 2013.

Ono, M.; Bell, M. G.; Hirooka, Y.; Kaita, R.; Kugel, H. W.; Mazzitelli, G.; Menard, J. E.; Mirnov, S. V.; Shimada, M.; Skinner, C. H.; Tabares, F. L.

2012-03-01

274

Archaeometric study of glass beads from the 2nd century BC cemetery of Numantia  

Directory of Open Access Journals (Sweden)

Full Text Available Recent archaeological fieldwork undertaken in the Celtiberian cremation necropolis of Numantia (Soria, Spain) has provided a group of glass beads from the 2nd century BC. Such glass beads were part, together with other metallic and ceramic items, of the offerings deposited with the dead. They are ring-shaped in typology and deep-blue, amber, or semitransparent white in colour. This paper reports results derived from the chemical and microstructural characterization carried out on a representative sample set of this group of beads. The main goal of the research was to find out about their production technology and to explore their probable provenance. In addition, corrosion mechanisms were also assessed to determine the influence of cremation on the beads' structure. The resulting data suggest that the blue and amber beads were made using soda-lime silicate glass, whereas the semi-transparent white ones were manufactured from alumino-silicate glass. It was also determined that some transition metal oxides were used as chromophores, as well as lead oxide for decoration.

The recent excavation of the Celtiberian necropolis of Numantia (Garray, Soria) has made it possible to recover a group of glass beads from the 2nd century BC. The beads, together with other metal and ceramic objects, formed part of the offerings deposited with the deceased; they are ring-shaped in typology and coloured dark blue, amber and semi-transparent white. This paper presents the results obtained from the chemical and microstructural characterization of a representative sample of this group. The main objective of the research was to gather information on their manufacturing technology and to assess their possible provenance. Their corrosion mechanisms were also investigated to determine whether cremation had induced changes in their structure. The results indicate that the blue and amber beads were made with soda-lime silicate glass and the semi-transparent white ones with aluminosilicate glass, using transition metal oxides as chromophores and lead oxide for the decoration.

García Heras, Manuel

2003-06-01

275

Open3DGRID : An open-source software aimed at high-throughput generation of molecular interaction fields (MIFs)  

DEFF Research Database (Denmark)

Description Open3DGRID is an open-source software aimed at high-throughput generation of molecular interaction fields (MIFs). Open3DGRID can generate steric potential, electron density and MM/QM electrostatic potential fields; furthermore, it can import GRIDKONT binary files produced by GRID and CoMFA/CoMSIA fields (exported from SYBYL with the aid of a small SPL script). High computational performance is attained through implementation of parallelized algorithms for MIF generation. Most prominent features in Open3DGRID include: •Seamless integration with OpenBabel, PyMOL, GAUSSIAN, FIREFLY, GAMESS-US, TURBOMOLE, MOLDEN, Molecular Discovery GRID •Multi-threaded computation of MIFs (both MM and QM); support for MMFF94 and GAFF force-fields with automated assignment of atom types to the imported molecular structures •Human and machine-readable text output, integrated with 3D maps in several formats to allow visualization of results in PyMOL, MOE, Maestro and SYBYL •User-friendly interface to all major QM packages (e.g. GAUSSIAN, FIREFLY, GAMESS-US, TURBOMOLE, MOLDEN), allows calculation of QM electron density and electrostatic potential 3D maps from within Open3DGRID •User-friendly interface to Molecular Discovery GRID to compute GRID MIFs from within Open3DGRID Open3DGRID is controlled through a command line interface; commands can be either entered interactively from a command prompt or read from a batch script. If PyMOL is installed on the system while Open3DGRID is being operated interactively, the setup of 3D grid computations can be followed in real time on PyMOL's viewport, allowing one to tweak grid size and training/test set composition very easily. The main output is arranged as human-readable plain ASCII text, while a number of additional files are generated to store data and to export the results of computations for further analysis and visualization with third party tools. In particular, Open3DGRID can export 3D maps for visualization in PyMOL, MOE, Maestro and SYBYL. Open3DGRID is written in C; while pre-built binaries are available for mainstream operating systems (Windows 32/64-bit, Linux 32/64-bit, Solaris x86 32/64-bit, FreeBSD 32/64-bit, Intel Mac OS X 32/64-bit), source code is portable and can be compiled under any *NIX platform supporting POSIX threads. The modular nature of the code allows for easy implementation of new features, so that the core application can be customized to meet individual needs. A detailed ChangeLog is kept to keep track of the additions and modifications during Open3DGRID's development.

Tosco, Paolo; Balle, Thomas

276

Herramienta software para el análisis de canasta de mercado sin selección de candidatos / Software tool for analysing the family shopping basket without candidate generation  

Scientific Electronic Library Online (English)

Full Text Available SciELO Colombia | Language: Spanish (abstract translated to English) Tools that yield useful knowledge to support marketing decision-making are currently needed in the e-commerce environment. This requires a process that applies a series of data-processing techniques; among them is data mining, which enables automatic knowledge discovery. This work presents association rules as a suitable technique for discovering how customers buy from a company offering a business-to-consumer (B2C) e-commerce service, with the aim of supporting decisions on developing offers for existing customers or attracting new ones. Several algorithms are available for implementing association rules, such as Apriori, DHP, Partition, FP-Growth and Eclat; to select the most suitable one, a set of criteria was defined (Danger and Berlanga, 2001), including database insertions, computational cost, execution time and performance, which were analysed for each algorithm. The work also presents the development of a software tool that follows the CRISP-DM methodology and consists of four sub-modules: data pre-processing, data mining, analysis of results and application of results. The application design uses a three-layer architecture: presentation logic, business logic and service logic. Data warehouse design and algorithm design were included in the construction of this data-mining tool. The tool was tested with a database of the FoodMart company; the tests covered performance, functionality and reliability of the results, and allowed association rules to be found. The results led to the conclusion, among other aspects, that association rules as a data-mining technique make it possible to analyse volumes of data for B2C e-commerce services, which is a competitive advantage for companies using the Internet as their sales medium.
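
As a rough illustration of the association-rule technique described in the abstract, the sketch below mines frequent itemsets and rules from a toy basket by brute-force enumeration. The transactions, thresholds and item names are invented for the example; the cited tool's actual Apriori-style implementation, CRISP-DM workflow and FoodMart data are not reproduced here.

    # Illustrative sketch only: brute-force frequent-itemset and association-rule mining.
    # A production implementation (e.g. Apriori or FP-Growth) would prune candidates instead.
    from itertools import combinations

    transactions = [                       # hypothetical shopping baskets
        {"bread", "milk"},
        {"bread", "beer", "eggs"},
        {"milk", "beer", "cola"},
        {"bread", "milk", "beer"},
        {"bread", "milk", "cola"},
    ]
    min_support, min_confidence = 0.4, 0.7

    def support(itemset):
        """Fraction of transactions that contain every item of `itemset`."""
        return sum(itemset <= t for t in transactions) / len(transactions)

    items = sorted(set().union(*transactions))
    frequent = {
        frozenset(c): support(frozenset(c))
        for k in range(1, len(items) + 1)
        for c in combinations(items, k)
        if support(frozenset(c)) >= min_support
    }

    # Derive rules A -> B with confidence = support(A u B) / support(A).
    for itemset, sup in frequent.items():
        if len(itemset) < 2:
            continue
        for k in range(1, len(itemset)):
            for antecedent in map(frozenset, combinations(itemset, k)):
                confidence = sup / frequent[antecedent]   # subsets of frequent sets are frequent
                if confidence >= min_confidence:
                    print(f"{set(antecedent)} -> {set(itemset - antecedent)} "
                          f"(support={sup:.2f}, confidence={confidence:.2f})")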

Roberto Carlos, Naranjo Cuervo; Luz Marina, Sierra Martínez.

2009-04-01

277

GENERACIÓN AUTOMÁTICA DE APLICACIONES SOFTWARE A PARTIR DEL ESTANDAR MDA BASÁNDOSE EN LA METODOLOGÍA DE SISTEMAS EXPERTOS E INTELIGENCIA ARTIFICIAL / AUTOMATIC GENERATION OF SOFTWARE APPLICATIONS FROM STANDARD MDA STANDARD BASED ON THE METHOD OF ARTIFICIAL INTELLIGENCE AND EXPERT SYSTEMS  

Directory of Open Access Journals (Sweden)

Full Text Available (translated from Spanish) Many studies have been presented on the automatic generation of lines of code. This article presents a solution to the limitations of the well-known MDA (Model-Driven Architecture) approach by making use of technological advances in artificial intelligence and expert systems. It covers the principles of the MDA framework, transforming the models used and adding characteristics to them that make this working methodology more efficient. The proposed model covers the phases of the software life cycle, following the business rules that are an essential part of a real software project. It is with the business rules that the transformation of the MDA standard begins, and the aim is to make a contribution towards automating the business rules in such a way that they serve to define the application throughout the life cycle that generates it.
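
To make the idea of model-to-text transformation concrete, here is a minimal, hypothetical sketch that turns a toy platform-independent model into Python class skeletons while carrying the business rules along as documentation. It is not the transformation engine, expert system or rule-automation approach proposed in the article; every model element and name below is invented.

    # Illustrative sketch only: a toy MDA-style model-to-code transformation.
    pim = {  # hypothetical platform-independent model
        "Customer": {"attributes": {"name": "str", "email": "str"},
                     "rules": ["email must be unique"]},
        "Order": {"attributes": {"total": "float", "customer_id": "int"},
                  "rules": ["total must be >= 0"]},
    }

    def generate_class(name, spec):
        """Transform one model element into Python source text."""
        lines = [f"class {name}:"]
        for rule in spec["rules"]:                  # business rules preserved as comments
            lines.append(f"    # business rule: {rule}")
        args = ", ".join(f"{a}: {t}" for a, t in spec["attributes"].items())
        lines.append(f"    def __init__(self, {args}):")
        for attr in spec["attributes"]:
            lines.append(f"        self.{attr} = {attr}")
        return "\n".join(lines)

    print("\n\n".join(generate_class(n, s) for n, s in pim.items()))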

IVÁN MAURICIO RUEDA CÁCERES

2011-04-01

278

Proceedings of the 2nd international advisory committee on biomolecular dynamics instrument DNA in MLF at J-PARC  

International Nuclear Information System (INIS)

The 2nd International Advisory Committee (IAC) on the 'Biomolecular Dynamics Backscattering Spectrometer DNA' was held on November 12th - 13th, 2008 at the J-PARC Center, Japan Atomic Energy Agency. This IAC was organized with the aim of realizing an innovative neutron backscattering instrument in the Materials and Life Science Experimental Facility (MLF) at J-PARC, and four leading scientists in the field of neutron backscattering instruments were therefore selected as its members (Dr. Dan Neumann (Chair); Prof. Ferenc Mezei; Dr. Hannu Mutka; Dr. Philip Tregenna-Piggott); the 1st IAC had been held on February 27th - 29th, 2008. This report includes the executive summary and the materials of the presentations given at the 2nd IAC. (author)

279

On the possibility of low-threshold anomalous absorption in tokamak 2nd-harmonic electron cyclotron resonance heating experiments  

Science.gov (United States)

The parametric decay of an extraordinary electron cyclotron pump wave into an electron Bernstein (EB) wave and an ion Bernstein wave, leading to anomalous absorption in 2nd-harmonic electron cyclotron resonance heating (ECRH) experiments, is analyzed. It is shown that in an axisymmetric system, such as a tokamak, excitation of the low-threshold absolute parametric decay instability (PDI) is possible in the case of non-monotonic radial behaviour of the plasma density and due to poloidal magnetic-field inhomogeneity. The threshold of the absolute PDI, determined by linear dissipation of the EB wave, is shown to be substantially exceeded in present-day 2nd-harmonic ECRH experiments in tokamaks, which provides an explanation for the mysterious ion acceleration effect observed there.

Gusakov, E.; Popov, A.

2012-07-01

280

2nd PEGS Annual Symposium on Antibodies for Cancer Therapy: April 30–May 1, 2012, Boston, USA  

Digital Repository Infrastructure Vision for European Research (DRIVER)

The 2nd Annual Antibodies for Cancer Therapy symposium, organized again by Cambridge Healthtech Institute as part of the Protein Engineering Summit, was held in Boston, USA from April 30th to May 1st, 2012. Since the approval of the first cancer antibody therapeutic, rituximab, fifteen years ago, eleven have been approved for cancer therapy, although one, gemtuzumab ozogamicin, was withdrawn from the market.  The first day of the symposium started with a historical review of early work for l...

Ho, Mitchell; Royston, Ivor; Beck, Alain

2012-01-01

 
 
 
 
281

Exploration of performance limitation of 9-cell cavity processed in KEK AR East 2nd experimental hall  

International Nuclear Information System (INIS)

So far, our 9-cell cavity performance has often suffered from field emission. We are investigating our facilities at the KEK AR East 2nd experimental hall. We examined two points of view: post-EP/BCP cleaning and particle contamination. Particle contamination problems have been found in our HPR system, cavity assembly, and vacuum evacuation procedure. We have taken countermeasures against these problems. We report on these problems and the effect of the countermeasures on cavity performance in this paper. (author)

282

Teachers' Spatial Anxiety Relates to 1st- and 2nd-Graders' Spatial Learning

Science.gov (United States)

Teachers' anxiety about an academic domain, such as math, can impact students' learning in that domain. We asked whether this relation held in the domain of spatial skill, given the importance of spatial skill for success in math and science and its malleability at a young age. We measured 1st- and 2nd-grade teachers' spatial anxiety…

Gunderson, Elizabeth A.; Ramirez, Gerardo; Beilock, Sian L.; Levine, Susan C.

2013-01-01

283

Report on the 2nd International Consortium on Hallucination Research: Evolving Directions and Top-10 “Hot Spots” in Hallucination Research  

Digital Repository Infrastructure Vision for European Research (DRIVER)

This article presents a report on the 2nd meeting of the International Consortium on Hallucination Research, held on September 12th and 13th 2013 at Durham University, UK. Twelve working groups involving specialists in each area presented their findings and sought to summarize the available knowledge, inconsistencies in the field, and ways to progress. The 12 working groups reported on the following domains of investigation: cortical organisation of hallucinations, nonclinical hallucinations,...

Waters, Flavie; Woods, Angela; Fernyhough, Charles

2013-01-01

284

On possibility of low-threshold two-plasmon decay instability in 2nd harmonic ECRH experiments at toroidal devices  

Digital Repository Infrastructure Vision for European Research (DRIVER)

The effects of the parametric decay of the 2nd harmonic X-mode into two short-wavelength UH plasmons propagating in opposite directions are considered. The possibility of absolute instability excitation is demonstrated in the case of a density profile possessing a local maximum slightly exceeding the UH resonance value. The threshold of the absolute instability is shown to be substantially smaller than that provided by the standard theory for a monotonic density profile.

Gusakov E. Z.; Yu, Popov A.

2012-01-01

285

Investigations of near IR photoluminescence properties in TiO2:Nd,Yb materials using hyperspectral imaging methods  

International Nuclear Information System (INIS)

TiO2 and TiO2:Nd,Yb films were deposited by a doctor blade deposition technique from pastes prepared by a sol–gel process, and characterized by electron microscopy and spectroscopic techniques. Near infrared (NIR) photoluminescence (PL) properties upon 808 nm excitation were also examined. The rutile TiO2:Nd,Yb samples exhibited the strongest NIR PL signal. The relationship between the morphological properties, annealing temperature and the optical behavior of TiO2:Nd,Yb films is discussed. Furthermore, the study showed that hyperspectral imaging spectroscopy can be used as a rapid and nondestructive macroscopic characterization technique for the identification of spectral features and evaluation of luminescent surfaces of oxides. -- Highlights: • Films and powders of Nd and Yb-doped titania have been synthesized. • Three modifications (anatase, rutile and Degussa P25) have been studied. • The NIR photoluminescence properties were studied by hyperspectral imaging. • Emission at 978, 1008, 1029, 1064 and 1339 nm was obtained. • The structural properties and their influence on the optical behavior are discussed

286

Investigations of near IR photoluminescence properties in TiO{sub 2}:Nd,Yb materials using hyperspectral imaging methods  

Energy Technology Data Exchange (ETDEWEB)

TiO{sub 2} and TiO{sub 2}:Nd,Yb films were deposited by a doctor blade deposition technique from pastes prepared by a sol–gel process, and characterized by electron microscopy and spectroscopic techniques. Near infrared (NIR) photoluminescence (PL) properties upon 808 nm excitation were also examined. The rutile TiO{sub 2}:Nd,Yb samples exhibited the strongest NIR PL signal. The relationship between the morphological properties, annealing temperature and the optical behavior of TiO{sub 2}:Nd,Yb films is discussed. Furthermore, the study showed that hyperspectral imaging spectroscopy can be used as a rapid and nondestructive macroscopic characterization technique for the identification of spectral features and evaluation of luminescent surfaces of oxides. -- Highlights: • Films and powders of Nd and Yb-doped titania have been synthesized. • Three modifications (anatase, rutile and Degussa P25) have been studied. • The NIR photoluminescence properties were studied by hyperspectral imaging. • Emission at 978, 1008, 1029, 1064 and 1339 nm was obtained. • The structural properties and their influence on the optical behavior are discussed.

Garskaite, Edita; Flø, Andreas S. [Department of Mathematical Sciences and Technology, Norwegian University of Life Sciences, N-1432 Aas (Norway); Helvoort, Antonius T.J. van [Department of Physics, Norwegian University of Science and Technology, 7491 Trondheim (Norway); Kareiva, Aivaras [Department of General and Inorganic Chemistry, Vilnius University, Naugarduko 24, LT-03225 Vilnius (Lithuania); Olsen, Espen, E-mail: espen.olsen@umb.no [Department of Mathematical Sciences and Technology, Norwegian University of Life Sciences, N-1432 Aas (Norway)

2013-08-15

287

An open-source software tool for the generation of relaxation time maps in magnetic resonance imaging  

Directory of Open Access Journals (Sweden)

Full Text Available Abstract Background In magnetic resonance (MR) imaging, T1, T2 and T2* relaxation times represent characteristic tissue properties that can be quantified with the help of specific imaging strategies. While there are basic software tools for specific pulse sequences, until now no universal software program has been available to automate pixel-wise mapping of relaxation times from various types of images or MR systems. Such a software program would allow researchers to test and compare new imaging strategies and thus would significantly facilitate research in the area of quantitative tissue characterization. Results After defining requirements for a universal MR mapping tool, a software program named MRmap was created using a high-level graphics language. Additional features include a manual registration tool for source images with motion artifacts and a tabular DICOM viewer to examine pulse sequence parameters. MRmap was successfully tested on three different computer platforms with image data from three different MR system manufacturers and five different sorts of pulse sequences: multi-image inversion recovery T1; Look-Locker/TOMROP T1; modified Look-Locker (MOLLI) T1; single-echo T2/T2*; and multi-echo T2/T2*. Computing times varied between 2 and 113 seconds. Estimates of relaxation times compared favorably to those obtained from non-automated curve fitting. Completed maps were exported in DICOM format and could be read in standard software packages used for analysis of clinical and research MR data. Conclusions MRmap is a flexible cross-platform research tool that enables accurate mapping of relaxation times from various pulse sequences. The software allows researchers to optimize quantitative MR strategies in a manufacturer-independent fashion. The program and its source code were made available as open-source software on the internet.
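
The pixel-wise curve fitting that such mapping tools automate can be illustrated with a short sketch. The following is not MRmap code; it is a generic mono-exponential T2 fit applied to a synthetic multi-echo stack, with echo times and image data invented for the example.

    # Illustrative sketch only: pixel-wise mono-exponential T2 fitting of multi-echo data.
    import numpy as np
    from scipy.optimize import curve_fit

    echo_times = np.array([10.0, 20.0, 40.0, 60.0, 80.0])        # TE in ms (hypothetical)

    def mono_exp(te, s0, t2):
        return s0 * np.exp(-te / t2)

    # Synthetic 8x8 multi-echo stack with a true T2 of 50 ms plus noise, shape (n_echoes, ny, nx).
    rng = np.random.default_rng(0)
    truth = 1000.0 * np.exp(-echo_times[:, None, None] / 50.0)
    stack = truth + rng.normal(0.0, 5.0, size=(len(echo_times), 8, 8))

    t2_map = np.zeros(stack.shape[1:])
    for iy in range(stack.shape[1]):
        for ix in range(stack.shape[2]):
            popt, _ = curve_fit(mono_exp, echo_times, stack[:, iy, ix], p0=(1000.0, 40.0))
            t2_map[iy, ix] = popt[1]          # fitted T2 for this pixel

    print("mean fitted T2 [ms]:", t2_map.mean())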

Kühne Titus

2010-07-01

288

Public Health Genomics European Network: Report from the 2nd Network Meeting in Rome  

Directory of Open Access Journals (Sweden)

Full Text Available

Dear Sirs,

The Public Health Genomics European Network (PHGEN) is a mapping exercise for the responsible and effective integration of genome-based knowledge and technologies into public policy and health services for the benefit of population health. In 2005, the European Commission called for a “networking exercise…to lead to an inventory report on genetic determinants relevant for public health” [1]; this led to the funding of a three-year PHGEN project (EC project 2005313). The project started in early 2006 with a kick-off meeting in Bielefeld, Germany. In line with the public health triad, the project work comprises three one-year periods of assessment, policy development and assurance. At the end of the assessment phase, a network meeting was held in Rome from January 31st to February 2nd 2007, with over 90 network members and network observers in attendance. The participants represented different organisations throughout the European Union with expertise in areas such as human genetics and other medical disciplines, epidemiology, public health, law, ethics, and political and social sciences. The aim of the meeting was to wrap up the past year's assessment period and to herald the policy development phase. The assessment period of PHGEN was characterised by several activities: contact and cooperation with other European and internationally funded networks and projects on public health genomics or related issues (e.g. EuroGenetest, EUnetHTA, Orphanet, IPTS, PHOEBE, GRaPHInt, P3G); identification of key experts in public health genomics in the European member states, applicant countries and EFTA/EEA countries from different disciplines (e.g. human genetics and other medical disciplines, public health, law, philosophy, epidemiology, political and social sciences); building up national task forces on public health genomics in the above-mentioned countries; establishing and working in three working groups (public health genomics definitions, genetic exceptionalism, and public health genomics issues and priorities); participation in the development process of OECD and European Council documents on genetic testing; and dissemination of results in journals, on websites and at conferences.

Nicole Rosenkötter

2007-03-01

289

PREFACE: 1st-2nd Young Researchers Meetings in Rome - Proceedings  

Science.gov (United States)

Students in science, particularly in physics, face a fascinating and challenging future. Scientists have proposed very interesting theories, which describe the microscopic and macroscopic world fairly well, trying to match the quantum regime with cosmological scales. Between the extremes of this scenario, biological phenomena in all their complexity take place, challenging the laws we observe in the atomic and sub-atomic world. More and more accurate and complex experiments have been devised, and these are now going to test the paradigms of physics. Notable experiments include: the Large Hadron Collider (LHC), which is going to shed light on the physics of the Standard Model of particles and its extensions; the Planck and Herschel satellites, which target a very precise measurement of the properties of our Universe; and the Free Electron Laser facilities, which produce high-brilliance, ultrafast X-ray pulses, allowing the investigation of the fundamental processes of solid state physics, chemistry, and biology. These projects are the result of huge collaborations spread across the world, involving scientists belonging to different and complementary research fields: physicists, chemists, biologists and others, keen to make the best of these extraordinary laboratories. Even though each branch of science is experiencing a process of growing specialization, it is very important to keep an eye on the global picture, remaining aware of the deep interconnections between related fields. This is even more crucial for students who are beginning their research careers. These considerations motivated PhD students and young post-docs connected to the Roman scientific research area to organize a conference, to establish the background and the network for interactions and collaborations. This resulted in the 1st and 2nd Young Researchers Meetings in Rome (http://ryrm.roma2.infn.it), one-day conferences aimed primarily at graduate students and post-docs working in physics in Italy and abroad. In its first two editions, the meeting was held at the Universities of Roma "Tor Vergata" (July 2009) and "La Sapienza" (February 2010), and organized in sections dedicated to up-to-date topics spanning broad research fields: Astrophysics-Cosmology, Soft-Condensed Matter Physics, Theoretical-Particle Physics, and Medical Physics. In these proceedings we have collected some of the contributions which were presented during the meetings.

YRMR Organizing Committee; Cannuccia, E.; Mazzaferro, L.; Migliaccio, M.; Pietrobon, D.; Stellato, F.; Veneziani, M.

2011-03-01

290

Hanbury Brown and Twiss Interferometry with respect to 2nd- and 3rd-order event planes in Au+Au collisions at √sNN = 200 GeV

Science.gov (United States)

The azimuthal-angle dependence of HBT interferometry has been measured with respect to 2nd- and 3rd-order event planes in Au+Au collisions at √sNN = 200 GeV at the PHENIX experiment. The 3rd-order dependence of the Gaussian source radii was clearly observed, as well as the 2nd-order dependence. The result for the 2nd order indicates that the initial source eccentricity is diluted but the initial shape is still retained at freeze-out, while the result for the 3rd order implies that the initial triangularity vanishes during the medium evolution, which is supported by a Gaussian source model and Monte Carlo simulation.

Niida, Takafumi; Phenix Collaboration

2014-09-01

291

Diva-Fit: A step-by-step manual for generating high-resolution graphs and histogram overlays of flow cytometry data obtained with FACSDiva software  

Directory of Open Access Journals (Sweden)

Full Text Available In recent years, flow cytometry has been revolutionized via the introduction of digital data acquisition and analysis tools that facilitate simultaneous investigation of ten or more parameters. At the same time, some data presentation tools offered by commercial suppliers have remained surprisingly “antiquated”. This leads to the ironic fact that high-quality data is often represented by poor-quality illustrations: namely pixelated plots. In particular, data obtained using FACSDiva software is frequently exported into figures as low-resolution pixel graphics resulting in low-quality images. Additionally, even the newest version, Diva_6.1.2, is still unable to generate histogram overlays, a popular and convincing tool for direct data comparison. We hereby present an easy and down-to-earth Diva-Figure-improvement toolbox (Diva-Fit, which facilitates the generation of high-resolution graphs based on data acquired with FACSDiva software. Moreover, Diva-Fit allows the easy removal of unwanted quadrant labels without impairing the quality of FACS plots. Finally, we show that Diva-Fit supports straightforward composition of histogram overlays. All software tools necessary are freely available. We believe that the proposed toolbox may be very useful for many researchers working with flow cytometry.
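
A histogram overlay of the kind the toolbox produces can be illustrated with generic plotting tools. The sketch below is not Diva-Fit itself; it uses matplotlib on synthetic event data standing in for an exported FACSDiva dataset, and the file handling and channel scaling of a real workflow are omitted.

    # Illustrative sketch only: overlaying two flow-cytometry histograms with matplotlib.
    import numpy as np
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(1)
    control = rng.lognormal(mean=2.0, sigma=0.4, size=5000)   # placeholder for exported events
    stained = rng.lognormal(mean=3.0, sigma=0.4, size=5000)

    bins = np.logspace(0, 4, 200)                              # log-spaced, FACS-style axis
    plt.hist(control, bins=bins, histtype="step", label="unstained control")
    plt.hist(stained, bins=bins, histtype="step", label="stained sample")
    plt.xscale("log")
    plt.xlabel("fluorescence intensity (a.u.)")
    plt.ylabel("events")
    plt.legend()
    plt.savefig("histogram_overlay.png", dpi=300)              # high-resolution output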

Kristoffer Weber

2009-08-01

292

The whiteStar development project: Westinghouse's next generation core design simulator and core monitoring software to power the nuclear renaissance  

International Nuclear Information System (INIS)

The WhiteStar project has undertaken the development of the next-generation core analysis and monitoring system for Westinghouse Electric Company. This ongoing project focuses on the development of the ANC core simulator, the BEACON core monitoring system and the NEXUS nuclear data generation system. This system contains many functional upgrades to the ANC core simulator and BEACON core monitoring products, as well as the release of the NEXUS family of codes. The NEXUS family of codes is an automated once-through cross-section generation system designed for use in both PWR and BWR applications. ANC is a multi-dimensional nodal code for all nuclear core design calculations at a given condition. ANC predicts core reactivity, assembly power, rod power, detector thimble flux, and other relevant core characteristics. BEACON is an advanced core monitoring and support system which uses existing instrumentation data in conjunction with an analytical methodology for on-line generation and evaluation of 3D core power distributions. This new system is needed to design and monitor the Westinghouse AP1000 PWR. This paper provides an overview of the software system and the software development methodologies used, as well as some initial results. (authors)

293

Conceptual design and optimization of a 1-1/2 generation PFBC plant task 14. Topical report  

Energy Technology Data Exchange (ETDEWEB)

The economics and performance of advanced pressurized fluidized bed (PFBC) cycles developed for utility applications during the last 10 years (especially the 2nd-Generation PFBC cycle) are projected to be favorable compared to conventional pulverized coal power plants. However, the improved economics of 2nd-Generation PFBC cycles are accompanied by the perception of increased technological risk related to the pressurized carbonizer and its associated gas cleanup systems. A PFBC cycle that removed the uncertainties of the carbonizer while retaining the high efficiency and low cost of a 2nd-Generation PFBC cycle could improve the prospects for early commercialization and pave the way for the introduction of the complete 2nd-Generation PFBC cycle at some later date. One such arrangement is a PFBC cycle with natural gas topping combustion, referred to as the 1.5-Generation PFBC cycle. This cycle combines the advantages of the 2nd-Generation PFBC plant with the reduced risk associated with a gas turbine burning natural gas, and can potentially be part of a phased approach leading to the commercialization of utility 2nd-Generation PFBC cycles. The 1.5-Generation PFBC may also introduce other advantages over the more complicated 2nd-Generation PFBC system. This report describes the technical and economic evaluation of 1.5-Generation PFBC cycles for utility or industrial power generation.

White, J.S.; Witman, P.M.; Harbaugh, L.; Rubow, L.N.; Horazak, D.A.

1994-12-01

294

Experiments and Demonstrations in Physics: Bar-Ilan Physics Laboratory (2nd Edition)  

Science.gov (United States)

The following sections are included: * Data-acquisition systems from PASCO * ScienceWorkshop 750 Interface and DataStudio software * 850 Universal Interface and Capstone software * Mass on spring * Torsional pendulum * Hooke's law * Characteristics of DC source * Digital storage oscilloscope * Charging and discharging a capacitor * Charge and energy stored in a capacitor * Speed of sound in air * Lissajous patterns * I-V characteristics * Light bulb * Short time intervals * Temperature measurements * Oersted's great discovery * Magnetic field measurements * Magnetic force * Magnetic braking * Curie's point I * Electric power in AC circuits * Faraday's law of induction I * Self-inductance and mutual inductance * Electromagnetic screening * LCR circuit I * Coupled LCR circuits * Probability functions * Photometric laws * Kirchhoff's rule for thermal radiation * Malus' law * Infrared radiation * Irradiance and illuminance

Kraftmakher, Yaakov

2014-08-01

295

Development of Hydrologic Characterization Technology of Fault Zones -- Phase I, 2nd Report  

Energy Technology Data Exchange (ETDEWEB)

This is the year-end report of the 2nd year of the NUMO-LBNL collaborative project: Development of Hydrologic Characterization Technology of Fault Zones under the NUMO-DOE/LBNL collaboration agreement, the task description of which can be found in Appendix 3. A literature survey of published information on the relationship between geologic and hydrologic characteristics of faults was conducted. The survey concluded that it may be possible to classify faults by indicators based on various geometric and geologic attributes that may indirectly relate to the hydrologic property of faults. Analysis of existing information on the Wildcat Fault and its surrounding geology was performed. The Wildcat Fault is thought to be a strike-slip fault with a thrust component that runs along the eastern boundary of the Lawrence Berkeley National Laboratory. It is believed to be part of the Hayward Fault system but is considered inactive. Three trenches were excavated at carefully selected locations mainly based on the information from the past investigative work inside the LBNL property. At least one fault was encountered in all three trenches. Detailed trench mapping was conducted by CRIEPI (Central Research Institute for Electric Power Industries) and LBNL scientists. Some intriguing and puzzling discoveries were made that may contradict previously published work. Predictions are made regarding the hydrologic property of the Wildcat Fault based on the analysis of fault structure. Preliminary conceptual models of the Wildcat Fault were proposed. The Wildcat Fault appears to have multiple splays, and some low-angle faults may be part of the flower structure. In parallel, surface geophysical investigations were conducted using electrical resistivity survey and seismic reflection profiling along three lines on the north and south of the LBNL site. Because of the steep terrain, it was difficult to find optimum locations for survey lines as it is desirable for them to be as straight as possible. One interpretation suggests that the Wildcat Fault is westerly dipping. This could imply that the Wildcat Fault may merge with the Hayward Fault at depth. However, due to the complex geology of the Berkeley Hills, multiple interpretations of the geophysical surveys are possible. An effort to construct a 3D GIS model is under way. The model will be used not so much for visualization of the existing data, because only surface data are available thus far, but to investigate possible abutment relations of the buried formations offset by the fault. A 3D model would be useful to conduct 'what if' scenario testing to aid the selection of borehole drilling locations and configurations. Based on the information available thus far, a preliminary plan for borehole drilling is outlined. The basic strategy is to first drill boreholes on both sides of the fault without penetrating it. Borehole tests will be conducted in these boreholes to estimate the property of the fault. Possibly a slanted borehole will be drilled later to intersect the fault to confirm the findings from the boreholes that do not intersect the fault. Finally, the lessons learned from conducting the trenching and geophysical surveys are listed. It is believed that these lessons will be invaluable information for NUMO when it conducts preliminary investigations at yet-to-be-selected candidate sites in Japan.

Karasaki, Kenzi; Onishi, Tiemi; Black, Bill; Biraud, Sebastien

2009-03-31

296

FOREWORD: 2nd International Workshop on New Computational Methods for Inverse Problems (NCMIP 2012)  

Science.gov (United States)

Conference logo This volume of Journal of Physics: Conference Series is dedicated to the scientific contributions presented during the 2nd International Workshop on New Computational Methods for Inverse Problems, (NCMIP 2012). This workshop took place at Ecole Normale Supérieure de Cachan, in Cachan, France, on 15 May 2012, at the initiative of Institut Farman. The first edition of NCMIP also took place in Cachan, France, within the scope of the ValueTools Conference, in May 2011 (http://www.ncmip.org/2011/). The NCMIP Workshop focused on recent advances in the resolution of inverse problems. Indeed inverse problems appear in numerous scientific areas such as geophysics, biological and medical imaging, material and structure characterization, electrical, mechanical and civil engineering, and finance. The resolution of inverse problems consists of estimating the parameters of the observed system or structure from data collected by an instrumental sensing or imaging device. Its success firstly requires the collection of relevant observation data. It also requires accurate models describing the physical interactions between the instrumental device and the observed system, as well as the intrinsic properties of the solution itself. Finally, it requires the design of robust, accurate and efficient inversion algorithms. Advanced sensor arrays and imaging devices provide high rate and high volume data; in this context, the efficient resolution of the inverse problem requires the joint development of new models and inversion methods, taking computational and implementation aspects into account. During this one-day workshop, researchers had the opportunity to bring to light and share new techniques and results in the field of inverse problems. The topics of the workshop were: algorithms and computational aspects of inversion, Bayesian estimation, kernel methods, learning methods, convex optimization, free discontinuity problems, metamodels, proper orthogonal decomposition, reduced models for the inversion, non-linear inverse scattering, image reconstruction and restoration, applications (bio-medical imaging, non-destructive evaluation etc). NCMIP 2012 was a one-day workshop. Each of the submitted papers was reviewed by 2 to 4 reviewers. Among the accepted papers, there are 8 oral presentations and 5 posters. Three international speakers were invited for a long talk. This second edition attracted 60 registered attendees in May 2012. NCMIP 2012 was supported by Institut Farman (ENS Cachan) and endorsed by the following French research networks (GDR ISIS, GDR Ondes, GDR MOA, GDR MSPC). The program committee acknowledges the following laboratories CMLA, LMT, LSV, LURPA, SATIE, as well as DIGITEO Network. 
Laure Blanc-Féraud and Pierre-Yves Joubert, Workshop Co-chairs
Workshop Co-chairs: Laure Blanc-Féraud, I3S laboratory, CNRS, France; Pierre-Yves Joubert, IEF laboratory, Paris-Sud University, CNRS, France
Technical Program Committee: Alexandre Baussard, ENSTA Bretagne, Lab-STICC, France; Marc Bonnet, ENSTA, ParisTech, France; Jerôme Darbon, CMLA, ENS Cachan, CNRS, France; Oliver Dorn, School of Mathematics, University of Manchester, UK; Mário Figueiredo, Instituto Superior Técnico, Lisbon, Portugal; Laurent Fribourg, LSV, ENS Cachan, CNRS, France; Marc Lambert, L2S Laboratory, CNRS, SupElec, Paris-Sud University, France; Anthony Quinn, Trinity College, Dublin, Ireland; Christian Rey, LMT, ENS Cachan, CNRS, France; Joachim Weickert, Saarland University, Germany
Local Chair: Alejandro Mottini, Morpheme group I3S-INRIA; Sophie Abriet, SATIE, ENS Cachan, CNRS, France; Béatrice Bacquet, SATIE, ENS Cachan, CNRS, France
Reviewers: Gilles Aubert, J-A Dieudonné Laboratory, CNRS and University of Nice-Sophia Antipolis, France; Alexandre Baussard, ENSTA Bretagne, Lab-STICC, France; Laure Blanc-Féraud, I3S laboratory, CNRS, France; Marc Bonnet, ENSTA, ParisTech, France; Jerôme Darbon, CMLA, ENS Cachan, CNRS, France; Oliver Dorn, School of Mathematics, University of Manchester, UK; Gérard Favier, I3S laboratory, CNRS, France; Mário Figueiredo, Instituto Superior Técnico, Lisb

Blanc-Féraud, Laure; Joubert, Pierre-Yves

2012-09-01

297

Development of Hydrologic Characterization Technology of Fault Zones: Phase I, 2nd Report  

International Nuclear Information System (INIS)

This is the year-end report of the 2nd year of the NUMO-LBNL collaborative project: Development of Hydrologic Characterization Technology of Fault Zones under the NUMO-DOE/LBNL collaboration agreement, the task description of which can be found in Appendix 3. A literature survey of published information on the relationship between geologic and hydrologic characteristics of faults was conducted. The survey concluded that it may be possible to classify faults by indicators based on various geometric and geologic attributes that may indirectly relate to the hydrologic property of faults. Analysis of existing information on the Wildcat Fault and its surrounding geology was performed. The Wildcat Fault is thought to be a strike-slip fault with a thrust component that runs along the eastern boundary of the Lawrence Berkeley National Laboratory. It is believed to be part of the Hayward Fault system but is considered inactive. Three trenches were excavated at carefully selected locations mainly based on the information from the past investigative work inside the LBNL property. At least one fault was encountered in all three trenches. Detailed trench mapping was conducted by CRIEPI (Central Research Institute for Electric Power Industries) and LBNL scientists. Some intriguing and puzzling discoveries were made that may contradict previously published work. Predictions are made regarding the hydrologic property of the Wildcat Fault based on the analysis of fault structure. Preliminary conceptual models of the Wildcat Fault were proposed. The Wildcat Fault appears to have multiple splays, and some low-angle faults may be part of the flower structure. In parallel, surface geophysical investigations were conducted using electrical resistivity survey and seismic reflection profiling along three lines on the north and south of the LBNL site. Because of the steep terrain, it was difficult to find optimum locations for survey lines as it is desirable for them to be as straight as possible. One interpretation suggests that the Wildcat Fault is westerly dipping. This could imply that the Wildcat Fault may merge with the Hayward Fault at depth. However, due to the complex geology of the Berkeley Hills, multiple interpretations of the geophysical surveys are possible. An effort to construct a 3D GIS model is under way. The model will be used not so much for visualization of the existing data, because only surface data are available thus far, but to investigate possible abutment relations of the buried formations offset by the fault. A 3D model would be useful to conduct 'what if' scenario testing to aid the selection of borehole drilling locations and configurations. Based on the information available thus far, a preliminary plan for borehole drilling is outlined. The basic strategy is to first drill boreholes on both sides of the fault without penetrating it. Borehole tests will be conducted in these boreholes to estimate the property of the fault. Possibly a slanted borehole will be drilled later to intersect the fault to confirm the findings from the boreholes that do not intersect the fault. Finally, the lessons learned from conducting the trenching and geophysical surveys are listed. It is believed that these lessons will be invaluable information for NUMO when it conducts preliminary investigations at yet-to-be-selected candidate sites in Japan.

298

2nd Radio and Antenna Days of the Indian Ocean (RADIO 2014)  

Science.gov (United States)

It was an honor and a great pleasure for all those involved in its organization to welcome the participants to the ''Radio and Antenna Days of the Indian Ocean'' (RADIO 2014) international conference that was held from 7th to 10th April 2014 at the Sugar Beach Resort, Wolmar, Flic–en–Flac, Mauritius. RADIO 2014 is the second of a series of conferences organized in the Indian Ocean region. The aim of the conference is to discuss recent developments, theories and practical applications covering the whole scope of radio-frequency engineering, including radio waves, antennas, propagation, and electromagnetic compatibility. The RADIO international conference emerged from discussions with engineers and scientists from the countries of the Indian Ocean as well as from other parts of the world, where a need was felt for the organization of such an event in this region. Following numerous requests, the Island of Mauritius, known worldwide for its white sandy beaches and pleasant tropical atmosphere, was again chosen for the organization of the 2nd RADIO international conference. The conference was organized by the Radio Society, Mauritius, and the Local Organizing Committee consisted of scientists from SUPELEC, France, the University of Mauritius, and the University of Technology, Mauritius. We would like to take this opportunity to thank all the people, institutions and companies that made the event such a success. We are grateful to our gold sponsors CST and FEKO as well as URSI for their generous support, which enabled us to partially support one PhD student and two scientists to attend the conference. We would also like to thank IEEE-APS and URSI for providing technical co-sponsorship. More than one hundred and thirty abstracts were submitted to the conference. They were peer-reviewed by an international scientific committee and, based on the reviews, either accepted (after revision where necessary) or rejected. RADIO 2014 brought together participants from twenty countries spanning five continents: Australia, Botswana, Brazil, Canada, China, Denmark, France, India, Italy, Mauritius, Poland, Reunion Island, Russia, South Africa, South Korea, Spain, Switzerland, The Netherlands, United Kingdom, and USA. The conference featured eleven oral sessions and one poster session on state-of-the-art research themes. Three internationally recognized scientists delivered keynote speeches during the conference. Prizes for the first and second Best Student Papers were awarded during the closing ceremony. Following the call for extended contributions for publication as a volume in the IOP Conference Series: Materials Science and Engineering (MSE), both on-line and in print, we received thirty-two full papers. All submitted contributions were then peer-reviewed, revised whenever necessary, and accepted or rejected based on the recommendations of the reviewers of the editorial board. At the end of the procedure, twenty-five of them were accepted for publication in this volume.

2014-10-01

299

Impacts of European Biofuel Policies on Agricultural Markets and Environment under Consideration of 2nd Generation Technologies and international Trade  

Digital Repository Infrastructure Vision for European Research (DRIVER)

Even though recent discussions on food prices and indirect land use change point at potential conflicts associated with the production of biofuels the appraisal of biofuels as an effective instrument to slow down climate change and reduce energy dependency still prevails. The EU Renewable Energy Directive (EUROPEAN COMMISSION, 2009) underlines this trend by setting a target of 10% share of energy from renewable sources in the transport sector by 2020. As economic competitiveness of biofuel pr...

Becker, A.; Adenäuer, Marcel; Blanco Fonseca, Maria

2010-01-01

300

Comparison of elution efficiency of 99Mo/99mTc generator using theoretical and a free web based software method  

International Nuclear Information System (INIS)

Full text: A generator is constructed on the principle of the decay-growth relationship between a long-lived parent radionuclide and a short-lived daughter radionuclide. The difference in chemical properties between the daughter and parent radionuclides allows efficient separation of the two. Aim and Objectives: The present study was designed to calculate the elution efficiency of the generator using the traditional formula-based method and a free web-based software method. Materials and Methods: A 99Mo/99mTc MON.TEK (Monrol, Gebze) generator, a sterile 0.9% NaCl vial and a vacuum vial in a lead shield were used for the elution. A new 99Mo/99mTc generator (calibrated activity 30 GBq), calibrated for Thursday, was received on Monday morning in our department. The generator was placed behind lead bricks in a fume hood. The rubber plugs of both the vacuum and the 0.9% NaCl vials were wiped with 70% isopropyl alcohol swabs. The vacuum vial placed inside the lead shield was inserted in the vacuum position and, simultaneously, the 10 ml NaCl vial was inserted in the second slot. After 1-2 min the vacuum vial was removed without moving the emptied 0.9% NaCl vial. The vacuum slot was covered with another sterile vial to maintain sterility. The RAC was measured in the calibrated dose calibrator (Capintec, 15 CRC). The elution efficiency was calculated theoretically and using free web-based software (Apache web server (www.apache.org) and PHP (www.php.net)) hosted on the web site of the Italian Association of Nuclear Medicine and Molecular Imaging (www.aimn.it). Results: The mean elution efficiency calculated by the theoretical method was 93.95% ± 0.61. The mean elution efficiency calculated by the software was 92.85% ± 0.89. There was no statistically significant difference between the two methods. Conclusion: The free web-based software provides precise and reproducible results and thus saves time and mathematical calculation steps. This enables a rational use of the available activity and a selection of the type and number of procedures to perform in a busy nuclear medicine department.
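
The theoretical side of that calculation rests on the parent-daughter decay-growth (Bateman) relationship. The sketch below illustrates it using standard literature values for the half-lives and branching fraction; the 99Mo activity, the time since the previous elution and the measured reading are hypothetical, and the exact formula used by the cited web tool is not reproduced.

    # Illustrative sketch only: decay-growth estimate of available 99mTc and elution efficiency.
    import math

    T_HALF_MO99 = 65.94       # h, 99Mo half-life (literature value)
    T_HALF_TC99M = 6.01       # h, 99mTc half-life (literature value)
    BRANCHING = 0.875         # approximate fraction of 99Mo decays feeding 99mTc

    def available_tc99m(a_mo, hours_since_last_elution):
        """Theoretical 99mTc activity grown in since a previous complete elution."""
        lam_mo = math.log(2) / T_HALF_MO99
        lam_tc = math.log(2) / T_HALF_TC99M
        t = hours_since_last_elution
        return (BRANCHING * a_mo * lam_tc / (lam_tc - lam_mo)
                * (math.exp(-lam_mo * t) - math.exp(-lam_tc * t)))

    a_mo = 30_000.0                                   # MBq of 99Mo on the column (hypothetical)
    theoretical = available_tc99m(a_mo, hours_since_last_elution=24.0)
    measured = 0.93 * theoretical                     # hypothetical dose-calibrator reading
    print(f"elution efficiency: {100 * measured / theoretical:.1f} %")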

 
 
 
 
301

Improving the performance of E-beam 2nd writing in mask alignment accuracy and pattern faultless for CPL technology  

Science.gov (United States)

Chromeless phase lithography (CPL) is a potential technology for low-k1 optical imaging. With CPL technology, the local transmission rate can be controlled to obtain optimized through-pitch imaging performance. CPL uses a zebra pattern to manipulate the local pattern transmission as a tri-tone structure in mask manufacturing. A 2nd-level writing step is needed to create the zebra pattern. The zebra pattern must be small enough not to be printed out, and the overlay accuracy of the 2nd writing must be kept within 40 nm. This requirement is a challenge for the E-beam 2nd-writing function. The focus of this paper is on how to improve the overlay accuracy and obtain a precise pattern that forms an accurate pattern transmission. Several items were addressed to fulfill this work. To check the possibility of contamination of the E-beam chamber by the conductive layer coating, we monitored the particle count in the E-beam chamber before and after loading and unloading a coated blank. The conductivity of our conductive layer was checked, and the charging effect was eliminated by optimizing the film thickness. The dimension of the alignment mark was also optimized through experimentation. Finally, we checked the remaining photoresist to ensure a sufficient process window in our etching process. To verify the performance of our process we examined 3D SEM pictures. We also used AIMS to demonstrate the resolution improvement capability of CPL compared with the traditional methods, binary mask and halftone mask. The achieved overlay accuracy and process can provide a promising approach for NGL reticle manufacturing with CPL technology.

Lee, Booky; Hung, Richard; Lin, Orson; Wu, Yuan-Hsun; Kozuma, Makoto; Shih, Chiang-Lin; Hsu, Michael; Hsu, Stephen D.

2005-01-01

302

Interaction between Short-Term Heat Pretreatment and Fipronil on 2nd Instar Larvae of Diamondback Moth, Plutella Xylostella (Linn)  

Digital Repository Infrastructure Vision for European Research (DRIVER)

Based on the cooperative virulence index (c.f.) and LC50 of fipronil, the interaction effect between short-term heat pretreatment and fipronil on 2nd instar larvae of diamondback moth (DBM), Plutella xylostella (Linnaeus), was assessed. The results suggested that pretreatment of the tested insects at 30 °C for 2, 4 and 8h could somewhat decrease the toxicity of fipronil at all set concentrations. The LC50 values of fipronil increased after heat pretreatment and c.f. values in all these treat...

Gu, Xiaojun; Tian, Sufen; Wang, Dehui; Gao, Fei; Wei, Hui

2010-01-01

303

Virtual Visit to the ATLAS Control Room by 2nd High School of Eleftherio–Kordelio in Thessaloniki  

CERN Multimedia

Our school is the 2nd High School of Eleftherio – Kordelio. It is located in the western suburbs of Thessaloniki, Greece, and our students are between 15 and 17 years old. Thessaloniki is the second largest city in Greece, with a port that plays a major role in trade in the southern Balkans. Recently our students have heard a great deal about CERN and the great discoveries that have taken place there, and they are really keen to visit and to learn more about it.

2013-01-01

304

[In search of the ideal surgical treatment for lymphedema. Report of 2nd European Conference on supermicrosurgery (Barcelona - March 2012)].  

Science.gov (United States)

For more than 50 years, surgeons all around the world have tried to find the ideal surgical technique to treat limb lymphedema. Decongestive physiotherapy associated with the use of a compressive garment has been the primary choice for lymphedema treatment. Many different surgical techniques have been developed; however, to date, there is no consensus on the surgical procedure. Most surgical experts on lymphedema met at the second European Conference on supermicrosurgery, organized on March 1st and 2nd 2012 at San Pau Hospital, Barcelona. Together they tried to clarify these different options and, ideally, to define a strategy for using these techniques. PMID:23063020

Rausky, J; Robert, N; Binder, J-P; Revol, M

2012-12-01

305

Synthetic CO, H2 and HI surveys of the Galactic 2nd Quadrant, and the properties of molecular gas  

CERN Document Server

We present CO, H2, HI and HISA distributions from a set of simulations of grand design spirals including stellar feedback, self-gravity, heating and cooling. We replicate the emission of the 2nd Galactic Quadrant by placing the observer inside the modelled galaxies and post process the simulations using a radiative transfer code, so as to create synthetic observations. We compare the synthetic datacubes to observations of the 2nd Quadrant of the Milky Way to test the ability of the current models to reproduce the basic chemistry of the Galactic ISM, as well as to test how sensitive such galaxy models are to different recipes of chemistry and/or feedback. We find that models which include feedback and self-gravity can reproduce the production of CO with respect to H2 as observed in our Galaxy, as well as the distribution of the material perpendicular to the Galactic plane. While changes in the chemistry/feedback recipes do not have a huge impact on the statistical properties of the chemistry in the simulated g...

Duarte-Cabral, A; Dobbs, C L; Mottram, J C; Gibson, S J; Brunt, C M; Douglas, K A

2014-01-01

306

3D Modeling of Headstones of the 2nd and 3rd Century by Low Cost Photogrammetric Techniques

Science.gov (United States)

As a dozen headstones were discovered during excavations in southern Alsace, archaeologists stored them in the Regional Directorate of Cultural Affairs in Strasbourg. In order to complete the survey they usually carry out by hand on the steles, they asked INSA Strasbourg to reconstruct at least the 7 figured sandstones in 3D. The high accuracy required by the archaeologists can be reached with an expensive technique using a laser scanning system. The aim of the current work is to look for an alternative method and (if appropriate) low-cost software able to provide similar quality and a sufficient level of detail. The 3D reconstruction of the headstones based exclusively on multiple-image processing is presented. The point cloud generation step is detailed because it determines the quality of the final product. Therefore, an assessment of the produced point cloud was performed through comparison with a reference point cloud obtained by a laser scanning technique. The steps leading to the photo-realistic textured 3D models of the headstones are presented and the software used for them is evaluated. The final product meets the accuracy requirement of 1 mm set by the archaeologists.
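
The comparison of the photogrammetric point cloud to the laser-scan reference can be sketched as a nearest-neighbour distance computation. The example below assumes the two clouds are already registered in a common coordinate system and uses synthetic arrays in place of the real data; it illustrates the assessment step rather than the software evaluated in the paper.

    # Illustrative sketch only: cloud-to-cloud deviation via nearest-neighbour search.
    import numpy as np
    from scipy.spatial import cKDTree

    rng = np.random.default_rng(0)
    reference = rng.random((100_000, 3))                        # stand-in for the laser-scan cloud [m]
    photogrammetric = reference + rng.normal(0.0, 0.0005, reference.shape)  # stand-in test cloud

    tree = cKDTree(reference)
    distances, _ = tree.query(photogrammetric, k=1)             # distance to closest reference point

    print(f"mean deviation : {distances.mean() * 1000:.2f} mm")
    print(f"95th percentile: {np.percentile(distances, 95) * 1000:.2f} mm")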

Landes, T.; Waton, M.-D.; Alby, E.; Gourvez, S.; Lopes, B.

2013-07-01

307

2nd International Conference on INformation Systems Design and Intelligent Applications  

CERN Document Server

The second international conference on INformation Systems Design and Intelligent Applications (INDIA – 2015) was held in Kalyani, India during January 8-9, 2015. The book covers all aspects of information system design, computer science and technology, general sciences, and educational research. Following a double-blind review process, a number of high-quality papers were selected and collected in the book, which is composed of two volumes and covers a variety of topics, including natural language processing, artificial intelligence, security and privacy, communications, wireless and sensor networks, microelectronics, circuits and systems, machine learning, soft computing, mobile computing and applications, cloud computing, software engineering, graphics and image processing, rural engineering, e-commerce, e-governance, business computing, molecular computing, nano computing, chemical computing, intelligent computing for GIS and remote sensing, bio-informatics and bio-computing. These fields are not only ...

Satapathy, Suresh; Sanyal, Manas; Sarkar, Partha; Mukhopadhyay, Anirban

2015-01-01

308

Generating statements at whole-body imaging with a workflow-optimized software tool - first experiences with multireader analysis  

International Nuclear Information System (INIS)

Introduction: Due to technical innovations in sectional diagram methods, whole-body imaging has increased in importance for clinical radiology, particularly for the diagnosis of systemic tumor disease. Large numbers of images have to be evaluated in increasingly shorter time periods. The aim was to create and evaluate a new software tool to assist and automate the process of diagnosing whole-body datasets. Material and Methods: Thirteen whole-body datasets were evaluated by 3 readers using the conventional system and the new software tool. The times for loading the datasets, examining 5 different regions (head, neck, thorax, abdomen and pelvis/skeletal system) and retrieving a relevant finding for demonstration were acquired. Additionally a Student T-Test was performed. For qualitative analysis the 3 readers used a scale from 0 - 4 (0 = bad, 4 = very good) to assess dataset loading convenience, lesion location assistance, and ease of use. Additionally a kappa value was calculated. Results: The average loading time was 39.7 s (± 5.5) with the conventional system and 6.5 s (± 1.4) (p 0.9). The qualitative analysis showed a significant advantage with respect to convenience (p 0.9). (orig.)

309

Early prediction for necessity of 2nd I-131 ablation therapy with serum thyroglobulin levels in patients with differentiated thyroid cancer  

Energy Technology Data Exchange (ETDEWEB)

The aim of our study was to evaluate the predictive value of serum thyroglobulin (Tg) levels, measured preoperatively and just before the 1st I-131 ablation therapy with high serum TSH, for the necessity of a 2nd I-131 ablation therapy in differentiated thyroid cancer (DTC) patients. 111 patients with DTC who underwent total or near-total thyroidectomy followed by immediate I-131 ablation therapy were enrolled in this study. TSH, Tg and anti-Tg autoantibody were measured before thyroidectomy (TSHpreop, Tgpreop and Anti-Tgpreop) and just before the 1st I-131 ablation therapy (TSHabl, Tgabl and Anti-Tgabl). All TSHabl levels were above 30 mU/liter. ΔTg [(Tgpreop - Tgabl) x 100 / Tgpreop] was calculated. 29 patients (26.1%, 29/111) had to receive a 2nd I-131 ablation therapy. Of the 70 patients whose Tgabl was under 10 ng/ml, only 11 received a 2nd I-131 ablation therapy (15.7%). Patients with Tgabl greater than or equal to 10 ng/ml received a 2nd I-131 ablation therapy more frequently (18/41, 43.9%) than patients with a lower Tgabl level. There was a disparity in the necessity of a 2nd I-131 ablation therapy between the two groups (Tgabl < 10 ng/ml and Tgabl >= 10 ng/ml, two-by-two chi-square test, p = 0.0016). Of the 41 patients with Tgabl greater than or equal to 10 ng/ml, 19 showed increased Tg levels (ΔTg < 0). Patients with negative ΔTg and Tgabl greater than or equal to 10 ng/ml showed a strikingly high necessity of a 2nd I-131 ablation therapy (11/19, 57.9%). There was also a significant disparity in the necessity of a 2nd I-131 ablation therapy between the two groups (ΔTg < 0 plus Tgabl >= 10 ng/ml versus the others, two-by-two chi-square test, p = 0.0012). These results suggest that a high Tgabl level just before the 1st I-131 ablation therapy can forecast the necessity of a 2nd I-131 ablation therapy. Moreover, the difference in Tg level between preoperative status and just before the 1st I-131 ablation therapy could also suggest the necessity of a 2nd I-131 ablation therapy early in the surveillance of DTC patients.

Bae, Jin Ho; Seo, Ji Hyoung; Jeong, Shin Young; Yoo, Jeong Soo; Ahn, Byeong Cheol; Lee, Jae Tae; Lee, Kyu Bo [Kyungpook National University Hospital, Daegu (Korea, Republic of)

2005-07-01

310

A computer software system for the generation of global ocean tides including self-gravitation and crustal loading effects  

Science.gov (United States)

A computer software system is described which computes global numerical solutions of the integro-differential Laplace tidal equations, including dissipation terms and ocean loading and self-gravitation effects, for arbitrary diurnal and semidiurnal tidal constituents. The integration algorithm features a successive approximation scheme for the integro-differential system, with time stepping by forward differences in the time variable and central differences in the spatial variables. Solutions for the M2, S2, N2, K2, K1, O1 and P1 tidal constituents neglecting the effects of ocean loading and self-gravitation, and a converged M2 solution including ocean loading and self-gravitation effects, are presented in the form of cotidal and corange maps.
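
The differencing strategy described above (forward differences in time, central differences in space) can be illustrated on a much simpler model problem. The sketch below applies the same forward-time, centred-space stencil to a 1D diffusion equation; it only illustrates the stencil, not the Laplace tidal equation solver itself, and the grid size, diffusivity and time step are arbitrary assumptions chosen to satisfy the usual stability condition.

    import numpy as np

    # Illustrative forward-time, centred-space stepping for du/dt = nu * d2u/dx2.
    # All parameters are invented for the example; the real solver integrates the
    # Laplace tidal equations with dissipation, loading and self-gravitation terms.
    nx, nu = 200, 0.1
    dx = 1.0 / nx
    dt = 0.4 * dx * dx / nu                      # satisfies dt <= dx^2 / (2*nu)
    x = np.linspace(0.0, 1.0, nx, endpoint=False)
    u = np.exp(-200.0 * (x - 0.5) ** 2)          # initial bump

    for _ in range(500):
        # central difference in space (periodic boundaries), forward step in time
        d2udx2 = (np.roll(u, -1) - 2.0 * u + np.roll(u, 1)) / dx ** 2
        u = u + nu * dt * d2udx2

    print(round(float(u.max()), 4))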

Estes, R. H.

1977-01-01

311

CaF2:Mn thermoluminescence: a single glow peak not described by 1st or 2nd order kinetics  

International Nuclear Information System (INIS)

The thermoluminescence (TL) of CaF2:Mn has been studied using photon counting and digital recording. For doses of 10 rad or less the TL glow curves appear to consist of a single glow peak. However, there are indications - which are pronounced at larger doses - that one additional low intensity peak (area less than or equal to one percent) is superimposed on each side of the central peak. The intense peak is not described by 1st or 2nd order kinetics but is well described by the more general kinetics from which these kinetics are derived. These observations, and the results of additional kinetic analysis, demonstrate that retrapping is not negligible and may involve all three peaks. In such systems, which are likely to include other dosimeter materials and minerals, peak height will not increase linearly with dose; an important factor for dosimetry and dating applications.
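
For reference, the 'more general kinetics' from which first- and second-order kinetics derive is commonly written in the general-order form below (a standard textbook expression, not a result of this study); it reduces to 1st-order kinetics for b = 1 and to 2nd-order kinetics for b = 2:

    I(t) = -\frac{dn}{dt} = s' \, n^{b} \, e^{-E/kT}

where n is the trapped-charge concentration, E the trap depth, T the temperature, k Boltzmann's constant, s' an effective pre-exponential factor and b the kinetic order.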

312

Use of 2nd and 3rd Level Correlation Analysis for Studying Degradation in Polycrystalline Thin-Film Solar Cells  

Energy Technology Data Exchange (ETDEWEB)

The correlation of stress-induced changes in the performance of laboratory-made CdTe solar cells with various 2nd and 3rd level metrics is discussed. The overall behavior of aggregated data showing how cell efficiency changes as a function of open-circuit voltage (Voc), short-circuit current density (Jsc), and fill factor (FF) is explained using a two-diode, PSpice model in which degradation is simulated by systematically changing model parameters. FF shows the highest correlation with performance during stress, and is subsequently shown to be most affected by shunt resistance, recombination, and in some cases voltage-dependent collection. Large decreases in Jsc as well as increasing rates of Voc degradation are related to voltage-dependent collection effects and catastrophic shunting, respectively. Large decreases in Voc in the absence of catastrophic shunting are attributed to increased recombination. The relevance of capacitance-derived data correlated with both Voc and FF is discussed.
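
For orientation, a two-diode description of the kind used in such PSpice simulations is conventionally written as follows (a generic textbook form with ideality factors n1 ≈ 1 and n2 ≈ 2; the authors' exact parameterisation is not given in the abstract):

    J(V) = J_{ph} - J_{01}\left[e^{\,q(V + J R_s)/(n_1 kT)} - 1\right] - J_{02}\left[e^{\,q(V + J R_s)/(n_2 kT)} - 1\right] - \frac{V + J R_s}{R_{sh}}

In such a model, lowering the shunt resistance R_sh or raising the recombination current J_02 depresses the fill factor before it noticeably affects Jsc or Voc, which is consistent with the correlations reported above.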

Albin, D. S.; del Cueto, J. A.; Demtsu, S. H.; Bansal, S.

2011-03-01

313

Results of the independent verification of radiological remedial action at 433 South 2nd East Street, Monticello, Utah (MS00103)  

International Nuclear Information System (INIS)

In 1980 the site of a vanadium and uranium mill at Monticello, Utah, was accepted into the US Department of Energy's (DOE's) Surplus Facilities Management Program, with the objectives of restoring the government-owned mill site to safe levels of radioactivity, disposing of or containing the tailings in an environmentally safe manner, and performing remedial actions on off-site (vicinity) properties that had been contaminated by radioactive material resulting from mill operations. During 1984 and 1985, UNC Geotech, the remedial action contractor designated by DOE, performed remedial action on the vicinity property at 433 South 2nd East Street, Monticello, Utah. The Pollutant Assessments Group (PAG) of Oak Ridge National Laboratory was assigned the responsibility of verifying the data supporting the adequacy of remedial action and confirming the site's compliance with DOE guidelines. The PAG found that the site successfully meets the DOE remedial action objectives. Procedures used by PAG are described. 2 tabs., 3 refs

314

Testing the 2nd degree local polynomial approximation method using the calculations of fast power reactor two-dimensional models  

International Nuclear Information System (INIS)

A 2nd order local polynomial approximation method (LPA2) is compared with the method that is at present used by UJV as a standard tool for fast breeder reactor neutron calculations. The comparison of the results of two-dimensional cylindrical fast reactor benchmark calculations by both methods leads to the following conclusions: a) from the point of view of computational accuracy, LPA2 and the standard method are equivalent; b) from the point of view of machine time consumption, the LPA2 method is clearly superior. In actual situations the LPA2 method is 2.5 to 5 times faster for few-group (approximately 4) and 10 to 20 times faster for multi-group (approximately 26) calculations than the standard method. (author)

315

Results of the independent verification of radiological remedial action at 396 South 2nd East Street, Monticello, Utah (MS00085)  

International Nuclear Information System (INIS)

In 1980 the site of a vanadium and uranium mill at Monticello, Utah, was accepted into the US Department of Energy's (DOE's) Surplus Facilities Management Program, with the objectives of restoring the government-owned mill site to safe levels of radioactivity, disposing of or containing the tailings in an environmentally safe manner, and performing remedial actions on off-site (vicinity) properties that had been contaminated by radioactive material resulting from mill operations. During 1985, UNC Geotech, the remedial action contractor designated by DOE, performed remedial action on the vicinity property at 396 South 2nd East Street, Monticello, Utah. The Pollutant Assessments Group (PAG) of Oak Ridge National Laboratory was assigned the responsibility of verifying the data supporting the adequacy of remedial action and confirming the site's compliance with DOE guidelines. The PAG found that the site successfully meets the DOE remedial action objectives. Procedures used by PAG are described. 34 refs., 2 tabs

316

Results of the independent verification of radiological remedial action at 384 South 2nd East Street, Monticello, Utah (MS00084)  

International Nuclear Information System (INIS)

In 1980 the site of a vanadium and uranium mill at Monticello, Utah, was accepted into the US Department of Energy's (DOE's) Surplus Facilities Management Program, with the objectives of restoring the government-owned mill site to safe levels of radioactivity, disposing of or containing the tailings in an environmentally safe manner, and performing remedial actions on off-site (vicinity) properties that had been contaminated by radioactive material resulting from mill operations. During 1984, UNC Geotech, the remedial action contractor designated by DOE, performed remedial action on the vicinity property at 384 South 2nd East Street, Monticello, Utah. The Pollutant Assessments Group (PAG) of Oak Ridge National Laboratory was assigned the responsibility of verifying the data supporting the adequacy of remedial action and confirming the site's compliance with DOE guidelines. The PAG found that the site successfully meets the DOE remedial action objectives. Procedures used by PAG are described. 3 refs., 2 tabs

317

Software Engineering  

Science.gov (United States)

CSC 450. Software Engineering (3) Prerequisite: CSC 332 and senior standing. Study of the design and production of large and small software systems. Topics include systems engineering, software life-cycle and characterization; use of software tools. Substantial software project required.

Tagliarini, Gene

2003-04-21

318

Directional fidelity of nanoscale motors and particles is limited by the 2nd law of thermodynamics--via a universal equality.  

Science.gov (United States)

Directional motion of nanoscale motors and driven particles in an isothermal environment costs a finite amount of energy despite zero work as decreed by the 2nd law, but quantifying this general limit remains difficult. Here we derive a universal equality linking directional fidelity of an arbitrary nanoscale object to the least possible energy driving it. The fidelity-energy equality depends on the environmental temperature alone; any lower energy would violate the 2nd law in a thought experiment. Real experimental proof for the equality comes from force-induced motion of biological nanomotors by three independent groups - for translational as well as rotational motion. Interestingly, the natural self-propelled motion of a biological nanomotor (F1-ATPase) known to have nearly 100% energy efficiency evidently pays the least energy cost decreed by the 2nd law for producing directionality. PMID:23883059

Wang, Zhisong; Hou, Ruizheng; Efremov, Artem

2013-07-21

319

Operation of Finnish nuclear power plants. Quarterly report, 2nd quarter 1998  

International Nuclear Information System (INIS)

Quarterly reports on the operation of Finnish NPPs describe events and observations relating to nuclear and radiation safety that the Radiation and Nuclear Safety Authority (STUK) considers safety significant. Safety improvements at the plants are also described. The report includes a summary of the radiation safety of plant personnel and the environment and tabulated data on the plants' production and load factors. The Loviisa plant units were in power operation for the whole second quarter of 1998. The Olkiluoto units discontinued electricity generation for annual maintenance and also briefly for tests pertaining to the power upratings of the units. In addition, there were breaks in power generation at Olkiluoto 2 due to a low electricity demand in Midsummer and turbine balancing. The Olkiluoto units were in power operation in this quarter with the exception of the aforementioned breaks. The load factor average of the four plant units was 87.7%. The events in this quarter had no bearing on the nuclear or radiation safety. Occupational doses and radioactive releases off-site were below authorised limits. Radioactive substances were measurable in samples collected around the plants in such quantities only as have no bearing on the radiation exposure of the population. (orig.)

320

The 2nd DBCLS BioHackathon: interoperable bioinformatics Web services for integrated applications  

Directory of Open Access Journals (Sweden)

Full Text Available Abstract Background The interaction between biological researchers and the bioinformatics tools they use is still hampered by incomplete interoperability between such tools. To ensure interoperability initiatives are effectively deployed, end-user applications need to be aware of, and support, best practices and standards. Here, we report on an initiative in which software developers and genome biologists came together to explore and raise awareness of these issues: BioHackathon 2009. Results Developers in attendance came from diverse backgrounds, with experts in Web services, workflow tools, text mining and visualization. Genome biologists provided expertise and exemplar data from the domains of sequence and pathway analysis and glyco-informatics. One goal of the meeting was to evaluate the ability to address real world use cases in these domains using the tools that the developers represented. This resulted in (i) a workflow to annotate 100,000 sequences from an invertebrate species; (ii) an integrated system for analysis of the transcription factor binding sites (TFBSs) enriched based on differential gene expression data obtained from a microarray experiment; (iii) a workflow to enumerate putative physical protein interactions among enzymes in a metabolic pathway using protein structure data; (iv) a workflow to analyze glyco-gene-related diseases by searching for human homologs of glyco-genes in other species, such as fruit flies, and retrieving their phenotype-annotated SNPs. Conclusions Beyond deriving prototype solutions for each use-case, a second major purpose of the BioHackathon was to highlight areas of insufficiency. We discuss the issues raised by our exploration of the problem/solution space, concluding that there are still problems with the way Web services are modeled and annotated, including: (i) the absence of several useful data or analysis functions in the Web service "space"; (ii) the lack of documentation of methods; (iii) lack of compliance with the SOAP/WSDL specification among and between various programming-language libraries; and (iv) incompatibility between various bioinformatics data formats. Although it was still difficult to solve real world problems posed to the developers by the biological researchers in attendance because of these problems, we note the promise of addressing these issues within a semantic framework.

Katayama Toshiaki

2011-08-01

 
 
 
 
321

User Manual for Beta Version of TURBO-GRD: A Software System for Interactive Two-Dimensional Boundary/Field Grid Generation, Modification, and Refinement  

Science.gov (United States)

TURBO-GRD is a software system for interactive two-dimensional boundary/field grid generation, modification, and refinement. Its features allow users to explicitly control grid quality locally and globally. The grid control can be achieved interactively by using control points that the user picks and moves on the workstation monitor or by direct stretching and refining. The techniques used in the code are the control point form of algebraic grid generation, a damped cubic spline for edge meshing and parametric mapping between physical and computational domains. It also performs elliptic grid smoothing and free-form boundary control for boundary geometry manipulation. Internal block boundaries are constructed and shaped by using Bezier curves. Because TURBO-GRD is a highly interactive code, users can read in an initial solution, display its solution contour in the background of the grid and control net, and exercise grid modification using the solution contour as a guide. This process can be called an interactive solution-adaptive grid generation.
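
As an illustration of the kind of curve evaluation involved when internal block boundaries are shaped with Bezier curves, the sketch below evaluates a cubic Bezier curve with de Casteljau's algorithm. The control points are invented for the example and have nothing to do with TURBO-GRD's actual data structures or file formats.

    import numpy as np

    def bezier_point(ctrl, t):
        """Evaluate a Bezier curve at parameter t by de Casteljau's algorithm."""
        pts = np.asarray(ctrl, dtype=float)
        while len(pts) > 1:
            # repeated linear interpolation between successive control points
            pts = (1.0 - t) * pts[:-1] + t * pts[1:]
        return pts[0]

    # Hypothetical control polygon for one internal block boundary.
    ctrl = [(0.0, 0.0), (0.3, 0.8), (0.7, 0.8), (1.0, 0.0)]
    boundary = np.array([bezier_point(ctrl, t) for t in np.linspace(0.0, 1.0, 21)])
    print(boundary[0], boundary[10], boundary[-1])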

Choo, Yung K.; Slater, John W.; Henderson, Todd L.; Bidwell, Colin S.; Braun, Donald C.; Chung, Joongkee

1998-01-01

322

Promoting concrete algorithm for implementation in computer system and data movement in terms of software reuse to generate actual values suitable for different access  

Directory of Open Access Journals (Sweden)

Full Text Available The construction of functional algorithms through well-structured code and good programming opens new routes and at the same time increases the capability to use them in mechatronic systems, with specific reliability requirements for any practical implementation; it can also be justified in the economic context and in terms of maintenance, making the system more stable. This flexibility genuinely enables a new approach, since it turns the program code into an easy vehicle for updating data. In many cases a quick access method is needed, which is specified here in the context of generating appropriate values for digital systems. This opens new scope for better management of the respective values of a program code and for software reuse, because such a solution reduces costs and has a positive effect in terms of a digital economy.

Nderim Zeqiri

2013-04-01

323

A study on trait anger – anger expression and friendship commitment levels of primary school 2nd stage students who play – do not play sports  

Digital Repository Infrastructure Vision for European Research (DRIVER)

The aim of this research was to investigate trait anger-anger expression and friendship commitment levels according to whether 2nd-stage primary school students played sports or not. A Personal Information Form, the Trait Anger-Anger Expression Scales and the Inventory of Peer Attachment were used to collect the data. The population of the research consisted of the students who studied in the 2nd stage of 40 primary state schools that belonged to the National Education Dire...

Hüseyin Kırımoğlu; Yunus Yıldırım; Ali Temiz

2010-01-01

324

Development of a radioactive waste treatment equipment utilizing microwave heating, 2nd report  

International Nuclear Information System (INIS)

The objective of the present study is to establish an incineration technique utilizing microwave heating which enables a high volume reduction of spent ion-exchange resins and filtering media generated at nuclear facilities. The past three years from 1982 to 1985, with financial aid from the Agency of Science and Technology, brought great and rapid progress to this project when the heating technique was switched from direct microwave heating to indirect heating employing a bed of silicon carbide beads. This material was also used to build a secondary furnace, walls and roster bars, to treat the obnoxious gases and soot arising in the primary incineration process by the radiating heat of this material heated to above 1000 deg C, again by microwave energy, rather than by the originally applied direct plasma torch combustion. The incinerator and the secondary furnace were integrated into one unit as the principal treating equipment. This novel approach made possible a well stabilized continuous incineration operation. Further, developmental efforts toward industrial applications were made by setting up a pilot plant with microwave generators, 2 sets of 5 kW at 2450 MHz and 1 set of 25 kW at 915 MHz, and tests were carried out which proved a remarkably high volume reduction capability, well above roughly 200 on a weight basis. For hot test runs, a one-tenth scale pilot test setup was installed at the TOKAI Laboratory of the Japan Atomic Energy Research Institute and tested with materials spiked with radioisotopes and also with spent ion-exchange resins stored there. Very satisfactory results were obtained in these proving tests, showing the efficient capability of high volume reduction treatment of otherwise stable radioactive waste materials such as spent ion-exchange resins. (author)

325

Programed oil generation of the Zubair Formation, Southern Iraq oil fields: Results from Petromod software modeling and geochemical analysis  

Science.gov (United States)

1D petroleum system modeling was performed on wells in each of four oil fields in South Iraq: Zubair (well Zb-47), Nahr Umr (well NR-9), West Qurna (wells WQ-15 and 23), and Majnoon (well Mj-8). In each of these fields, deposition of the Zubair Formation was followed by continuous burial, reaching maximum temperatures of 100 °C (equivalent to 0.70% Ro) at depths of 3,344-3,750 m in well Zb-47 and 3,081.5-3,420 m in well WQ-15, and 120 °C (equivalent to 0.78% Ro) at depths of 3,353-3,645 m in well NR-9 and 3,391-3,691.5 m in well Mj-8. Generation of petroleum in the Zubair Formation began in the late Tertiary, 10 million years ago. At present day, modeled transformation ratios (TR) indicate that 65% of its generation potential has been reached in well Zb-47, 75% in well NR-9, 55-85% in the West Qurna oil field (wells WQ-15 and WQ-23) and up to 95% in well Mj-8. In contrast, younger source rocks are immature to early mature (oil field in Hilla region of the western Euphrates River), whereas the Zubair Formation is immature within a temperature range of 65-70 °C (0.50% Ro equivalent) with up to 12% (TR = 12%) hydrocarbon generation efficiency, and hence poor generation could be assessed in this last location. The Zubair Formation was deposited in a deltaic environment and consists of interbedded shales and porous and permeable sandstones. In the Basrah region, the shales have total organic carbon of 0.5-7.0 wt%, Tmax of 430-470 °C and hydrogen indices of up to 466 with S2 = 0.4-9.4 of kerogen type II & III and a petroleum potential of 0.4-9.98 of good hydrocarbon generation, which is consistent with 55-95% hydrocarbon efficiency. These generated hydrocarbons had charged (in part) the Cretaceous and Tertiary reservoirs, especially the Zubair Formation itself, in the traps formed by the Alpine collision that closed the Tethys Ocean between the Arabian and Eurasian Plates and developed folds in the Mesopotamian Basin 15-10 million years ago. These traps are mainly stratigraphic facies of sandstones with the shale that formed during the deposition of the Zubair Formation in transgression and regression phases within the main structural folds of the Zubair, Nahr Umr, West Qurna and Majnoon oil fields. Oil biomarkers of the Zubair Formation reservoirs show source affinity with mixed oil from the Upper Jurassic and Lower Cretaceous strata, including Zubair Formation organic matter, based on presentation of GC and GC-MS results on diagrams of global petroleum systems. © 2010 Saudi Society for Geosciences.

Al-Ameri, T. K.; Pitman, J.; Naser, M.E.; Zumberge, J.; Al-Haydari, H. A.

2011-01-01

326

MYOB software for dummies  

CERN Document Server

Your complete guide to MYOB® AccountRight software. Now in its seventh edition, MYOB® Software For Dummies walks you through everything you need to know, from starting your MYOB® file from scratch and recording payments and receipts, to tracking profit and analysing sales. This new edition includes all the information you need on the new generation of MYOB® AccountRight software, including the new cloud computing features. Set up MYOB® software - understand how to make it work the first time. Keep track of purchases and sales - monitor customer accounts and ensure you get paid.

Curtis, Veechi

2012-01-01

327

GENII [Generation II]: The Hanford Environmental Radiation Dosimetry Software System: Volume 3, Code maintenance manual: Hanford Environmental Dosimetry Upgrade Project  

International Nuclear Information System (INIS)

The Hanford Environmental Dosimetry Upgrade Project was undertaken to incorporate the internal dosimetry models recommended by the International Commission on Radiological Protection (ICRP) in updated versions of the environmental pathway analysis models used at Hanford. The resulting second generation of Hanford environmental dosimetry computer codes is compiled in the Hanford Environmental Dosimetry System (Generation II, or GENII). This coupled system of computer codes is intended for analysis of environmental contamination resulting from acute or chronic releases to, or initial contamination of, air, water, or soil, on through the calculation of radiation doses to individuals or populations. GENII is described in three volumes of documentation. This volume is a Code Maintenance Manual for the serious user, including code logic diagrams, global dictionary, worksheets to assist with hand calculations, and listings of the code and its associated data libraries. The first volume describes the theoretical considerations of the system. The second volume is a Users' Manual, providing code structure, users' instructions, required system configurations, and QA-related topics. 7 figs., 5 tabs

328

GENII (Generation II): The Hanford Environmental Radiation Dosimetry Software System: Volume 3, Code maintenance manual: Hanford Environmental Dosimetry Upgrade Project  

Energy Technology Data Exchange (ETDEWEB)

The Hanford Environmental Dosimetry Upgrade Project was undertaken to incorporate the internal dosimetry models recommended by the International Commission on Radiological Protection (ICRP) in updated versions of the environmental pathway analysis models used at Hanford. The resulting second generation of Hanford environmental dosimetry computer codes is compiled in the Hanford Environmental Dosimetry System (Generation II, or GENII). This coupled system of computer codes is intended for analysis of environmental contamination resulting from acute or chronic releases to, or initial contamination of, air, water, or soil, on through the calculation of radiation doses to individuals or populations. GENII is described in three volumes of documentation. This volume is a Code Maintenance Manual for the serious user, including code logic diagrams, global dictionary, worksheets to assist with hand calculations, and listings of the code and its associated data libraries. The first volume describes the theoretical considerations of the system. The second volume is a Users' Manual, providing code structure, users' instructions, required system configurations, and QA-related topics. 7 figs., 5 tabs.

Napier, B.A.; Peloquin, R.A.; Strenge, D.L.; Ramsdell, J.V.

1988-09-01

329

Report of 2nd workshop on particle process. A report of the Yayoi study meeting  

International Nuclear Information System (INIS)

At the Nuclear Engineering Research Laboratory, Faculty of Engineering, University of Tokyo, short-term research meetings of the Yayoi Research Group, a joint application research activity of the nuclear reactor (Yayoi) and the electron linac in Japan, are held more than 10 times a year. This report collects the summaries of one of them, 'Research on Particle Method', held on August 7, 1996. The particle method, which describes and calculates fluids and powders as a group of particles, is better suited than the conventional lattice-based methods to problems involving boundary surfaces and large deformations of the fluid, and further development of it is expected. The report contains the following studies: 1) Stress analysis without the need for element breakdown, 2) Local interpolation differential operator method and non-structural lattice, 3) Self-organized simulation of the dynamical construction, 4) A lattice BGK solution of laminar flow over a backward-facing step, 5) Numerical analysis of solid-gas two-phase flow using the discrete element method, 6) Application of flow analysis techniques to power generation plant equipment, 7) Corrision wave-captured flow calculation using the particle method, and 8) Analysis of complex thermal flow problems using the particle (MPS) method. (G.K.)

330

2nd Workshop on Jet Modification in the RHIC and LHC Era  

CERN Document Server

A workshop organized jointly by the Wayne State Heavy Ion Group and the JET Collaboration. The goal of this 2 1/2 day meeting is to review the most important new experimental measurements and theoretical breakthroughs that have occurred in the past year and to thoroughly explore the limits of perturbative QCD based approaches to the description of hard processes in heavy-ion collisions. Over the period of three days, topics covered will include new experimental observables that may discern between different perturbative approaches, the inevitable transformation of analytic schemes to Monte-Carlo event generators, and the progress made towards Next to Leading Order calculations of energy loss. The workshop is intended to be slow paced: We envision a mixture of longer invited talks and shorter contributed talks, allowing sufficient time for discussion, as well as time to follow up on more technical aspects of the data analysis and theoretical calculations. One of the outcomes of this workshop will be a ...

2013-01-01

331

Software Engineering Program: Software Process Improvement Guidebook  

Science.gov (United States)

The purpose of this document is to provide experience-based guidance in implementing a software process improvement program in any NASA software development or maintenance community. This guidebook details how to define, operate, and implement a working software process improvement program. It describes the concept of the software process improvement program and its basic organizational components. It then describes the structure, organization, and operation of the software process improvement program, illustrating all these concepts with specific NASA examples. The information presented in the document is derived from the experiences of several NASA software organizations, including the SEL, the SEAL, and the SORCE. Their experiences reflect many of the elements of software process improvement within NASA. This guidebook presents lessons learned in a form usable by anyone considering establishing a software process improvement program within his or her own environment. This guidebook attempts to balance general and detailed information. It provides material general enough to be usable by NASA organizations whose characteristics do not directly match those of the sources of the information and models presented herein. It also keeps the ideas sufficiently close to the sources of the practical experiences that have generated the models and information.

1996-01-01

332

Incretinas, incretinomiméticos, inhibidores de DPP IV: (2ª parte) / Incretins, Incretinmimetics, Inhibitors (2nd part)  

Directory of Open Access Journals (Sweden)

Full Text Available Two main pathophysiological mechanisms are currently involved in Type 2 Diabetes (T2DM): insulin resistance and impairment of beta cell function. However, in recent years a new mechanism was reported: a significant decrease in incretin production and/or action. Incretins are gastrointestinal hormones whose main action is stimulating insulin secretion in response to nutrients. The best known incretins are glucagon-like peptide-1 (GLP-1) and gastric insulinotropic peptide (GIP). GLP-1 and GIP not only increase insulin secretion, but also decrease glucagon secretion, slow gastric emptying and reduce appetite, generating weight loss. Both incretins are rapidly cleaved by the enzyme dipeptidyl peptidase 4 (DPP-4). In order to emulate incretin action, several drugs were developed: GLP-1 receptor agonists, GLP-1 mimetics, and DPP-4 inhibitors. All of them appear to be a very promising tool for the treatment of T2DM. Conflict of interest: Dr. León Litwak - Member of the Latin American Board of Eli Lilly and Sanofi Aventis - Member of the National Board of the laboratories Novo Nordisk, Novartis, GlaxoSmithKline, Sanofi Aventis, Boehringer Ingelheim, Bristol Myers, Astra Zeneca - Principal Investigator of protocols from Eli Lilly, Novo Nordisk, Novartis, GlaxoSmithKline, Takeda, PPDF, Pfizer, Merck Sharp and Dohme, Amgen, Roche, Minimed, Quintiles - Lecturer for the aforementioned laboratories.

Claudia Bayón

2010-09-01

333

Incretinas, incretinomiméticos, inhibidores de DPP IV: (2ª parte) / Incretins, Incretinmimetics, Inhibitors (2nd part)  

Scientific Electronic Library Online (English)

Full Text Available SciELO Argentina | Language: Spanish. Abstract in English: Two main pathophysiological mechanisms are currently involved in Type 2 Diabetes (T2DM): insulin resistance and impairment of beta cell function. However, in recent years a new mechanism was reported: a significant decrease in incretin production and/or action. Incretins are gastrointestinal hormones whose main action is stimulating insulin secretion in response to nutrients. The best known incretins are glucagon-like peptide-1 (GLP-1) and gastric insulinotropic peptide (GIP). GLP-1 and GIP not only increase insulin secretion, but also decrease glucagon secretion, slow gastric emptying and reduce appetite, generating weight loss. Both incretins are rapidly cleaved by the enzyme dipeptidyl peptidase 4 (DPP-4). In order to emulate incretin action, several drugs were developed: GLP-1 receptor agonists, GLP-1 mimetics, and DPP-4 inhibitors. All of them appear to be a very promising tool for the treatment of T2DM. Financial interests: Dr. León Litwak - Member of the Latin American Board of Eli Lilly and Sanofi Aventis - Member of the National Board of the following laboratories: Novo Nordisk, Novartis, GlaxoSmithKline, Sanofi Aventis, Boehringer Ingelheim, Bristol Myers, Astra Zeneca - Principal Investigator of protocols from: Eli Lilly, Novo Nordisk, Novartis, GlaxoSmithKline, Takeda, PPDF, Pfizer, Merck Sharp and Dohme, Amgen, Roche, Minimed, Quintiles - Lecturer for the former laboratories.

Claudia, Bayón; Mercedes Araceli, Barriga; León, Litwak.

2010-09-01

334

The 2008—2013 Crisis as Metastasis. A Preview of the 2nd edition of The Cancer Stage of Capitalism by Pluto Press  

Directory of Open Access Journals (Sweden)

Full Text Available By means of a selection of relevant excerpts, a preview is hereby offered of the 2nd edition of John McMurtry's prophetic 1999 book "The Cancer Stage of Capitalism", published by Pluto Press and entitled "The Cancer Stage of Capitalism and Its Cure".

John McMurtry

2013-03-01

335

The 2nd International Conference on Nuclear Physics in Astrophysics Refereed and selected contributions Debrecen, Hungary May 16–20, 2005  

CERN Document Server

Launched in 2004, "Nuclear Physics in Astrophysics" has established itself in a successful topical conference series addressing the forefront of research in the field. This volume contains the selected and refereed papers of the 2nd conference, held in Debrecen in 2005 and reprinted from "The European Physical Journal A - Hadrons and Nuclei".

Fülöp, Zsolt; Somorjai, Endre

2006-01-01

336

Multi-atom resonant photoemission and the development of next-generation software and high-speed detectors for electron spectroscopy  

International Nuclear Information System (INIS)

This dissertation has involved the exploration of a new effect in photoelectron emission, multi-atom resonant photoemission (MARPE), as well as the development of new software, data analysis techniques, and detectors of general use in such research. We present experimental and theoretical results related to MARPE, in which the photoelectron intensity from a core level on one atom is influenced by a core-level absorption resonance on another. We point out that some of our and others' prior experimental data has been strongly influenced by detector non-linearity and that the effects seen in new corrected data are smaller and of different form. Corrected data for the MnO(001) system with resonance between the O 1s and Mn 2p energy levels are found to be well described by an extension of well-known intraatomic resonant photoemission theory to the interatomic case, provided that interactions beyond the usual second-order Kramers-Heisenberg treatment are included. This theory is also found to simplify under certain conditions so as to yield results equivalent to a classical x-ray optical approach, with the latter providing an accurate and alternative, although less detailed and general, physical picture of these effects. Possible future applications of MARPE as a new probe of near-neighbor identities and bonding and its relationship to other known effects are also discussed. We also consider in detail specially written data acquisition software that has been used for most of the measurements reported here. This software has been used with an existing experimental system to develop the method of detector characterization and then data correction required for the work described above. The development of a next generation one-dimensional, high-speed, electron detector is also discussed. Our goal has been to design, build and test a prototype high-performance, one-dimensional pulse-counting detector that represents a significant advancement in detector technology and is well matched to modern high-brightness synchrotron radiation sources and high-transmission electron-energy analyzers as typically used in photoelectron spectroscopy experiments. The general design of the detector and the results of initial tests are discussed and the acquisition of photoelectron spectra with the first test detector is described.

337

Multi-atom resonant photoemission and the development of next-generation software and high-speed detectors for electron spectroscopy  

Energy Technology Data Exchange (ETDEWEB)

This dissertation has involved the exploration of a new effect in photoelectron emission, multi-atom resonant photoemission (MARPE), as well as the development of new software, data analysis techniques, and detectors of general use in such research. We present experimental and theoretical results related to MARPE, in which the photoelectron intensity from a core level on one atom is influenced by a core-level absorption resonance on another. We point out that some of our and others' prior experimental data has been strongly influenced by detector non-linearity and that the effects seen in new corrected data are smaller and of different form. Corrected data for the MnO(001) system with resonance between the O 1s and Mn 2p energy levels are found to be well described by an extension of well-known intraatomic resonant photoemission theory to the interatomic case, provided that interactions beyond the usual second-order Kramers-Heisenberg treatment are included. This theory is also found to simplify under certain conditions so as to yield results equivalent to a classical x-ray optical approach, with the latter providing an accurate and alternative, although less detailed and general, physical picture of these effects. Possible future applications of MARPE as a new probe of near-neighbor identities and bonding and its relationship to other known effects are also discussed. We also consider in detail specially written data acquisition software that has been used for most of the measurements reported here. This software has been used with an existing experimental system to develop the method of detector characterization and then data correction required for the work described above. The development of a next generation one-dimensional, high-speed, electron detector is also discussed. Our goal has been to design, build and test a prototype high-performance, one-dimensional pulse-counting detector that represents a significant advancement in detector technology and is well matched to modern high-brightness synchrotron radiation sources and high-transmission electron-energy analyzers as typically used in photoelectron spectroscopy experiments. The general design of the detector and the results of initial tests are discussed and the acquisition of photoelectron spectra with the first test detector is described.

Kay, Alexander William

2000-09-01

338

Contractions of 2D 2nd Order Quantum Superintegrable Systems and the Askey Scheme for Hypergeometric Orthogonal Polynomials  

Directory of Open Access Journals (Sweden)

Full Text Available We show explicitly that all 2nd order superintegrable systems in 2 dimensions are limiting cases of a single system: the generic 3-parameter potential on the 2-sphere, S9 in our listing. We extend the Wigner-Inönü method of Lie algebra contractions to contractions of quadratic algebras and show that all of the quadratic symmetry algebras of these systems are contractions of that of S9. Amazingly, all of the relevant contractions of these superintegrable systems on flat space and the sphere are uniquely induced by the well known Lie algebra contractions of e(2 and so(3. By contracting function space realizations of irreducible representations of the S9 algebra (which give the structure equations for Racah/Wilson polynomials to the other superintegrable systems, and using Wigner's idea of ''saving'' a representation, we obtain the full Askey scheme of hypergeometric orthogonal polynomials. This relationship directly ties the polynomials and their structure equations to physical phenomena. It is more general because it applies to all special functions that arise from these systems via separation of variables, not just those of hypergeometric type, and it extends to higher dimensions.

Ernest G. Kalnins

2013-10-01

339

2nd dimensional GC-MS analysis of sweat volatile organic compounds prepared by solid phase micro-extraction.  

Science.gov (United States)

The characteristics of an individual's odor from sweat, breath and skin provide important information for criminal tracking in the field of forensic science. Solid phase micro-extraction gas chromatography/mass spectrometry (SPME-GC/MS) was used to determine human sweat volatile organic compound (VOC) profiles. Mass spectrometric analysis (in electron impact mode), following 2nd dimensional separation with two different GC columns (one polar and one relatively nonpolar) connected in parallel, was used to identify 574 compounds from sweat samples. The components included alcohols, aldehydes, aliphatics/aromatics, carboxylic acids, esters, ketones, and other organic compounds (amides/amines, thio/thioesters, oxides, sulfides, nitro compounds). Of these compounds, 1-tridecanol, 1,3-bis(1,1-dimethylethyl)-benzene, 4,4'-(1-methylethylidene)bis-phenol and 7-acetyl-6-ethyl-1,1,4,4-tetramethyl-tetraline were common components in all donors' sweat volatile samples. Age-related specific compounds were also detected. The results suggest that characteristic volatile profiles of human sweat emanations could provide valuable information to forensic scientists. PMID:24763202

Choi, Mi-Jung; Oh, Chang-Hwan

2014-01-01

340

Comparison and analysis on specification of X80 line pipe for 2nd West-East gas transmission pipeline project  

Energy Technology Data Exchange (ETDEWEB)

China has put the 2nd West-East Gas Pipeline project into practice. The pipeline has a total length of 8082 km, consists of one main pipeline and 6 branch pipelines, and passes through 13 provinces. The second west-east gas pipeline has a transmission capacity of 30 billion cubic metres of gas per year and requires high-pressure transmission. After numerous discussions and comparisons among professional experts, the plan to use grade X80 pipe at a transmission pressure of 12 MPa was adopted. During the establishment of the technical specification of grade X80 tubular goods for the project, prior research was carried out and, based on this, 16 technical specifications were issued. In drawing these up, the balance of line pipe strength, roughness, toughness and ductility was considered comprehensively. The mass-produced X80 SSAW pipe is of better quality, and each of its property indices exceeds the API standard requirements and meets the technical standards of the second west-east gas pipeline project.

Qiurong, Ma; Chunyong, Huo [Tubular Goods Research Center of CNPC (China)

2010-07-01

 
 
 
 
341

Report on the 2nd International Consortium on Hallucination Research: evolving directions and top-10 "hot spots" in hallucination research.  

Science.gov (United States)

This article presents a report on the 2nd meeting of the International Consortium on Hallucination Research, held on September 12th and 13th 2013 at Durham University, UK. Twelve working groups involving specialists in each area presented their findings and sought to summarize the available knowledge, inconsistencies in the field, and ways to progress. The 12 working groups reported on the following domains of investigation: cortical organisation of hallucinations, nonclinical hallucinations, interdisciplinary approaches to phenomenology, culture and hallucinations, subtypes of auditory verbal hallucinations, a Psychotic Symptoms Rating Scale multisite study, visual hallucinations in the psychosis spectrum, hallucinations in children and adolescents, Research Domain Criteria behavioral constructs and hallucinations, new methods of assessment, psychological therapies, and the Hearing Voices Movement approach to understanding and working with voices. This report presents a summary of this meeting and outlines 10 hot spots for hallucination research, which include the in-depth examination of (1) the social determinants of hallucinations, (2) translation of basic neuroscience into targeted therapies, (3) different modalities of hallucination, (4) domain convergence in cross-diagnostic studies, (5) improved methods for assessing hallucinations in nonclinical samples, (6) using humanities and social science methodologies to recontextualize hallucinatory experiences, (7) developmental approaches to better understand hallucinations, (8) changing the memory or meaning of past trauma to help recovery, (9) hallucinations in the context of sleep and sleep disorders, and (10) subtypes of hallucinations in a therapeutic context. PMID:24282321

Waters, Flavie; Woods, Angela; Fernyhough, Charles

2014-01-01

342

Report on the 2nd International Consortium on Hallucination Research: Evolving Directions and Top-10 “Hot Spots” in Hallucination Research  

Science.gov (United States)

This article presents a report on the 2nd meeting of the International Consortium on Hallucination Research, held on September 12th and 13th 2013 at Durham University, UK. Twelve working groups involving specialists in each area presented their findings and sought to summarize the available knowledge, inconsistencies in the field, and ways to progress. The 12 working groups reported on the following domains of investigation: cortical organisation of hallucinations, nonclinical hallucinations, interdisciplinary approaches to phenomenology, culture and hallucinations, subtypes of auditory verbal hallucinations, a Psychotic Symptoms Rating Scale multisite study, visual hallucinations in the psychosis spectrum, hallucinations in children and adolescents, Research Domain Criteria behavioral constructs and hallucinations, new methods of assessment, psychological therapies, and the Hearing Voices Movement approach to understanding and working with voices. This report presents a summary of this meeting and outlines 10 hot spots for hallucination research, which include the in-depth examination of (1) the social determinants of hallucinations, (2) translation of basic neuroscience into targeted therapies, (3) different modalities of hallucination, (4) domain convergence in cross-diagnostic studies, (5) improved methods for assessing hallucinations in nonclinical samples, (6) using humanities and social science methodologies to recontextualize hallucinatory experiences, (7) developmental approaches to better understand hallucinations, (8) changing the memory or meaning of past trauma to help recovery, (9) hallucinations in the context of sleep and sleep disorders, and (10) subtypes of hallucinations in a therapeutic context. PMID:24282321

Waters, Flavie

2014-01-01

343

Universe (2nd edition)  

International Nuclear Information System (INIS)

A general text on astronomy is presented. The foundations of the science are reviewed, including descriptions of naked-eye observations of eclipses and planetary motions and such basic tools as Kepler's laws, the fundamental properties of light, and the optics of telescopes. The formation of the solar system is addressed, and the planets and their satellites are discussed individually. Solar science is treated in detail. Stellar evolution is described chronologically from birth to death. Molecular clouds, star clusters, nebulae, neutron stars, black holes, and various other phenomena that occur in the life of a star are examined in the sequence in which they naturally occur. A survey of the Milky Way introduces galactic astronomy. Quasars and cosmology are addressed, including the most recent developments in research. 156 references

344

FragVLib a free database mining software for generating "Fragment-based Virtual Library" using pocket similarity search of ligand-receptor complexes  

Directory of Open Access Journals (Sweden)

Full Text Available Abstract Background With the exponential increase in the number of available ligand-receptor complexes, researchers are becoming more dedicated to mine these complexes to facilitate the drug design and development process. Therefore, we present FragVLib, free software which is developed as a tool for performing similarity search across database(s) of ligand-receptor complexes for identifying binding pockets which are similar to that of a target receptor. Results The search is based on 3D-geometric and chemical similarity of the atoms forming the binding pocket. For each match identified, the ligand's fragment(s) corresponding to that binding pocket are extracted, thus forming a virtual library of fragments (FragVLib) that is useful for structure-based drug design. Conclusions An efficient algorithm is implemented in FragVLib to facilitate the pocket similarity search. The resulting fragments can be used for structure-based drug design tools such as Fragment-Based Lead Discovery (FBLD). They can also be used for finding bioisosteres and as an idea generator.

Khashan Raed

2012-08-01

345

Home and away: hybrid perspective on identity formation in 1.5 and second generation adolescent immigrants in Israel.  

Digital Repository Infrastructure Vision for European Research (DRIVER)

Immigration is not only about changing countries, but also about shifting identities. This change is especially important for adolescents. This article examines identity formation among 1.5 and 2nd generation adolescent immigrants to Israel. A survey of 125 children of immigrants aged 12-19 examined the role of social structures such as pace of life, culture, religion and language on identity formation in 1.5 and 2nd generational groups. We have identified several significant factors affectin...

Harper, Robin A.; Hani Zubida; Liron Lavi; Ora Nakash; Anat Shoshani

2013-01-01

346

Clinical impact of dose reductions and interruptions of second-generation tyrosine kinase inhibitors in patients with chronic myeloid leukaemia.  

Digital Repository Infrastructure Vision for European Research (DRIVER)

Second (2nd)-generation tyrosine kinase inhibitors (TKI) (dasatinib, nilotinib) are effective in patients with all phases of chronic myeloid leukaemia (CML). Dose reductions and treatment interruptions are frequently required due to toxicity, but their significance is unknown. We analysed the impact of dose reductions/interruptions and dose intensity of 2nd-generation TKI on response and survival. A total of 280 patients with CML (all phases) were analysed. Dose reductions were considered whe...

Santos, Fabio P. S.; Kantarjian, Hagop; Fava, Carmen; O’brien, Susan; Garcia-manero, Guillermo; Ravandi, Farhad; Wierda, William; Thomas, Deborah; Shan, Jianquin; Cortes, Jorge

2010-01-01

347

ENABLE -- A systolic 2nd level trigger processor for track finding and e/π discrimination for ATLAS/LHC  

International Nuclear Information System (INIS)

The Enable Machine is a systolic 2nd level trigger processor for the transition radiation detector (TRD) of ATLAS/LHC. It is developed within the EAST/RD-11 collaboration at CERN. The task of the processor is to find electron tracks and to reject pion tracks according to the EAST benchmark algorithm in less than 10 µs. Tracks are identified by template matching in a (?,z) region of interest (RoI) selected by a 1st level trigger. In the (?,z) plane tracks of constant curvature are straight lines. The relevant lines form mask templates. Track identification is done by histogramming the coincidences of the templates and the RoI data for each possible track. The Enable Machine is an array processor that handles tracks of the same slope in parallel, and tracks of different slope in a pipeline. It is composed of two units, the Enable histogrammer unit and the Enable z/?-board. The interface daughter board is equipped with a HIPPI interface developed at JINR/Dubna, and Xilinx 'corner turning' data converter chips. Enable uses programmable gate arrays (XILINX) for histogramming and synchronous SRAMs for pattern storage. With a clock rate of 40 MHz the trigger decision time is 6.5 µs and the latency 7.0 µs. The Enable machine is scalable in the RoI size as well as in the number of tracks processed. It can be adapted to different recognition tasks and detector setups. The prototype of the Enable Machine has been tested in a beam time of the RD6 collaboration at CERN in October 1993.
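
The template-matching-by-histogramming idea described above can be sketched generically: for every candidate track (template), count how many of the RoI hits fall on the template's mask and accept tracks whose coincidence count clears a threshold. The sketch below is only a software illustration of that scheme; the masks, RoI size and threshold are invented for the example and do not reflect the Enable hardware.

    import numpy as np

    def find_tracks(roi_hits, templates, threshold):
        """Histogram coincidences between RoI hits and each track template.

        roi_hits  : boolean 2-D array of detector cells that fired
        templates : list of boolean masks, one per candidate track (same shape)
        threshold : minimum number of coincidences to accept a track
        """
        scores = [int(np.sum(roi_hits & mask)) for mask in templates]
        return [i for i, s in enumerate(scores) if s >= threshold], scores

    # Toy example: a 4x8 RoI with hits along one straight-line track hypothesis.
    roi = np.zeros((4, 8), dtype=bool)
    roi[np.arange(4), np.arange(4) * 2] = True          # hypothetical hit pattern
    template = np.zeros_like(roi)
    template[np.arange(4), np.arange(4) * 2] = True     # matching template mask
    accepted, scores = find_tracks(roi, [template], threshold=3)
    print(accepted, scores)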

348

Does the Application of Instructional Mathematics Software Have Enough Efficiency?  

Directory of Open Access Journals (Sweden)

Full Text Available Modern tools as new educational systems can improve teaching-learning procedures in schools. Teaching mathematics is one of the main and difficult components in educational systems. Informing methods is essential for teachers and instructors. It seems that did not forget usual teaching method and using software or media considered as remedial teaching. Teachers always follow dynamic methods for teaching and learning. The aim of this study is that views of students studied regard to math software and its efficiency. Twenty two girl students of 2nd grade are chosen at high schools. Through standard questionnaire and survey method, views of students are collected. Data are studied via Kolmogorov-Smirnov and one sample sign tests. The results of tests indicated that students have positive views toward co-instructional software of math learning. Therefore it seems that mathematical software can advantages for teaching and learning at high schools.

Zahra Kalantarnia

2013-12-01

349

Software reliability  

CERN Document Server

Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. This book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion on failure rate in software reliability growth models.

Bendell, A

2014-01-01

350

Comparative analysis of effectiveness of treatment with anti-TB drugs of the 1st and 2nd lines for children and adolescents with multidrug resistant tuberculosis  

Directory of Open Access Journals (Sweden)

Full Text Available The paper shows the results of a study of comparative treatment effectiveness in children and adolescents with multidrug-resistant tuberculosis (MDR TB) (2000-2008) treated with anti-TB drugs of the 2nd line (80 patients) and 1st line (80 patients) in Kazakhstan. In patients with MDR TB, treatment outcomes were successful in 91.2%, but relapse of TB disease occurred in 12.7% of cases, and 5 (6.2%) patients died (P≤0.05). Thus, patients with MDR TB need to be treated with anti-TB drugs of the 2nd line according to their DST.

Tleukhan Abildaev

2012-05-01

351

The Effects of Star Strategy of Computer-Assisted Mathematics Lessons on the Achievement and Problem Solving Skills in 2nd Grade Courses  

Directory of Open Access Journals (Sweden)

Full Text Available The aim of this research is to determine the effect of the STAR strategy on 2nd grade students' academic achievement and problem solving skills in computer-assisted mathematics instruction. Thirty students attending the 2nd grade of a primary school in Aydın in the 2010-2011 academic year formed the study group. The research took place over 7 weeks. Three instruments were used for data collection: an "Academic Achievement Test", a "Problem Solving Achievement Test" and "The Evaluation Form of Problem Solving Skills". At the end of the research, students' views about computer-assisted mathematics instruction were evaluated. It was examined whether the differences between pre-test and post-test scores were statistically meaningful. According to the results, a positive increase in academic achievement and problem solving skills was determined at the end of the instruction carried out with the STAR strategy.

Jale İPEK

2013-12-01

352

Microstructure, mechanical properties and fracture behavior of peak-aged Mg-4Y-2Nd-1Gd alloys under different aging conditions  

Energy Technology Data Exchange (ETDEWEB)

The morphology of precipitates and grain boundaries of peak-aged Mg-4Y-2Nd-1Gd alloys under different aging conditions was analyzed by transmission electron microscopy (TEM), and the mechanical properties and fracture behavior of the studied alloys at both room and elevated temperatures were investigated. The β″ and β′ phases are the main precipitates of the alloys peak-aged at 200 °C and 225 °C, while the alloy peak-aged at 250 °C mainly contains β1 and β phases. Discussion of the relationship between precipitates, mechanical properties and fracture behavior reveals that the density, type and arrangement of the precipitates are the dominant factors influencing the mechanical properties, and the combined influence of grain boundary structure and precipitation hardening determines the fracture mechanism of Mg-4Y-2Nd-1Gd alloys.

Liu, Zhijie [National Engineering Research Center of Light Alloy Net Forming, School of Materials Science and Engineering, Shanghai Jiao Tong University, Shanghai 200240 (China); Wu, Guohua, E-mail: ghwu@sjtu.edu.cn [National Engineering Research Center of Light Alloy Net Forming, School of Materials Science and Engineering, Shanghai Jiao Tong University, Shanghai 200240 (China); Key State Laboratory of Metal Matrix Composite, School of Materials Science and Engineering, Shanghai Jiao Tong University, Shanghai 200240 (China); Liu, Wencai; Pang, Song [National Engineering Research Center of Light Alloy Net Forming, School of Materials Science and Engineering, Shanghai Jiao Tong University, Shanghai 200240 (China); Ding, Wenjiang [National Engineering Research Center of Light Alloy Net Forming, School of Materials Science and Engineering, Shanghai Jiao Tong University, Shanghai 200240 (China); Key State Laboratory of Metal Matrix Composite, School of Materials Science and Engineering, Shanghai Jiao Tong University, Shanghai 200240 (China)

2013-01-20

353

Effect of Mg substitution on photoluminescence of MgxCa1-xAl2O4: Eu2+, Nd3+  

International Nuclear Information System (INIS)

Rare earth ion-doped calcium aluminate (CaAl2O4) is an efficient blue phosphor. The compositions in the series MgxCa1-xAl2O4: Eu2+, Nd3+ (x=0.05-0.25) codoped with 1 mol% Eu and 3 mol% Nd were prepared by the solid state synthesis method. Crystalline phase, morphology and structural details were investigated by powder X-ray diffraction (XRD), scanning electron microscopy (SEM) and transmission electron microscopy (TEM) techniques. The effect of Mg substitution on structure and photoluminescence characteristics was investigated. Photoluminescence characteristics show intense emission for MgCaAl2O4: Eu2+, Nd3+ in the blue region (λmax=440 nm) with long persistence. The blue emission corresponds to transitions from 4f6 5d1 to 4f7 of the Eu2+ ion

354

Software Complexity Methodologies & Software Security  

Digital Repository Infrastructure Vision for European Research (DRIVER)

It is broadly clear that complexity is one of software's natural features. Software complexity and software requirement functionality are two inseparable aspects, each with its own range. In this paper, complexity measurement is explained using the McCabe and Halstead models, and software complexity is discussed with an example. The Henry-Kafura information flow metric, the Agresti-Card-Glass system complexity metric and design metrics at the item level are then compared and reviewed...
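As a concrete illustration of one of the metrics mentioned above, the sketch below estimates McCabe cyclomatic complexity for a Python function by counting decision points in its AST. This is a simplified counting rule for illustration, not the measurement procedure used in the paper.

```python
# Simplified McCabe cyclomatic complexity: 1 + number of decision points in the AST.
import ast

DECISION_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler, ast.BoolOp, ast.IfExp)

def cyclomatic_complexity(source: str) -> int:
    tree = ast.parse(source)
    return 1 + sum(isinstance(node, DECISION_NODES) for node in ast.walk(tree))

example = """
def classify(x):
    if x < 0:
        return "negative"
    elif x == 0:
        return "zero"
    return "positive"
"""
print(cyclomatic_complexity(example))  # 3: one linear path plus two branch points
```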

Masoud Rafighi; Nasser Modiri

2011-01-01

355

Solid solutions in the HfO2-Nd2O3(Pr2O3, Tb2O3) systems with mixed conductivity  

International Nuclear Information System (INIS)

X-ray diffraction and electric conductivity methods were used to investigate solid solutions of monoclinic structure, cubic fluorite-type and pyrochlore-type solid solutions in the HfO2-Nd2O3 (Pr2O3, Tb2O3) systems. Tetragonal solid solutions based on HfO2 were also revealed at temperatures above 1650 deg C

356

Report on the 2nd Florence International Symposium on Advances in Cardiomyopathies: 9th meeting of the European Myocardial and Pericardial Diseases WG of the ESC  

Digital Repository Infrastructure Vision for European Research (DRIVER)

A bridge between clinical and basic science aiming at cross-fertilization, with leading experts presenting alongside junior investigators, is the key feature of the “2nd Florence International Symposium on Advances in Cardiomyopathies”, 9th Meeting of the Myocardial and Pericardial Diseases Working Group of the European Society of Cardiology, which was held in Florence, Italy on 26-28 September 2012. Patients with cardiomyopathies, with an estimated 3 per thousand prevalence in the ...

Franco Cecchi; Iacopo Olivotto; Robert Bonow; Magdi Yacoub

2012-01-01

357

Absolutely oil-free vacuum - news in vacuum technology abroad. (Based on the proceedings of the 2nd and 3rd European vacuum conferences)  

International Nuclear Information System (INIS)

In this paper a detailed description is given of the working principles of the new oil-free vacuum pumps that were presented and exhibited during the 2nd and 3rd European Vacuum Conferences. These are diaphragm pumps, oil-free piston pumps, mechanical pumps with claw-type rotors and oil-free molecular drag (spiro-)pumps. Their technical characteristics, typical pumping curves and operating costs are given. The fields of their recommended applications are also discussed

358

Comparison of the 2nd-order and 4th-order Staggered-Grid Finite-Difference Implementations of the TSN Method for Rupture Propagation  

Science.gov (United States)

The TSN (Traction-at-Split-Nodes) method has been developed independently by Andrews (1973, 1976, 1999) and Day (1977, 1982). Andrews implemented his TSN formulation in a finite-difference scheme in which spatial differentiation is equivalent to the 2nd-order finite-element method. Day implemented his slightly different formulation of the TSN method in the 2nd-order partly-staggered finite-difference scheme. Dalguer and Day (2006) adapted the TSN method to the velocity-stress staggered-grid finite-difference scheme. Whereas the 4th-order spatial differencing is applied outside the fault, the 2nd-order differencing is applied along the fault plane. We present two implementations of Day's TSN formulation in the velocity-stress staggered-grid finite-difference scheme for a 3D viscoelastic medium. In the first one we apply the 2nd-order spatial differencing everywhere in the grid, including derivatives at the fault in the direction perpendicular to the fault plane. In the second implementation we similarly apply the 4th-order spatial differencing. In both cases we use the adjusted finite-difference approximations (AFDA, Kristek et al. 2002, Moczo et al. 2004) for derivatives in the direction perpendicular to the fault plane in order to have the same order of approximation everywhere. We numerically investigate convergence rates of both implementations with respect to rupture-time, final-slip, and peak-slip-rate metrics. Moreover, we compare the numerical solutions to those obtained by the finite-element implementation of the TSN method.
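To make the difference between the two differencing orders concrete, the sketch below compares the standard 2nd-order and 4th-order staggered-grid approximations of a first derivative on a smooth test function. It is a toy numerical check, not the authors' rupture code; the 9/8 and -1/24 coefficients are the usual 4th-order staggered-grid values.

```python
# Toy check: 2nd- vs 4th-order staggered-grid finite-difference first derivatives.
import numpy as np

def d1_staggered_2nd(f, x, h):
    return (f(x + h/2) - f(x - h/2)) / h

def d1_staggered_4th(f, x, h):
    return (9/8 * (f(x + h/2) - f(x - h/2)) - 1/24 * (f(x + 3*h/2) - f(x - 3*h/2))) / h

x = 0.7
for h in (0.1, 0.05, 0.025):
    exact = np.cos(x)
    e2 = abs(d1_staggered_2nd(np.sin, x, h) - exact)
    e4 = abs(d1_staggered_4th(np.sin, x, h) - exact)
    print(f"h={h:0.3f}  2nd-order error={e2:.2e}  4th-order error={e4:.2e}")
# Halving h reduces the 2nd-order error ~4x and the 4th-order error ~16x.
```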

Kristek, J.; Moczo, P.; Galis, M.

2006-12-01

359

Induced antiferromagnetism and large magnetoresistances in RuSr2(Nd,Y,Ce)2Cu2O10-d ruthenocuprates  

Digital Repository Infrastructure Vision for European Research (DRIVER)

RuSr2(Nd,Y,Ce)2Cu2O10-d ruthenocuprates have been studied by neutron diffraction, magnetotransport and magnetisation measurements and the electronic phase diagram is reported. Separate Ru and Cu spin ordering transitions are observed, with spontaneous Cu antiferromagnetic order for low hole doping levels p, and a distinct, induced-antiferromagnetic Cu spin phase in the 0.02 < p < 0.06 pseudogap region. This ordering gives rise to large negative magnetoresistances which vary ...

Mclaughlin, A. C.; Sher, F.; Kimber, S. A. J.; Attfield, J. P.

2007-01-01

360

2nd Annual Workshop Proceedings of the Collaborative Project "Redox Phenomena Controlling Systems" (7th EC FP CP RECOSY) (KIT Scientific Reports ; 7557)  

Digital Repository Infrastructure Vision for European Research (DRIVER)

These are proceedings of the 2nd Annual Workshop of the EURATOM FP7 Collaborative Project "Redox Phenomena Controlling System", held in Larnaca (Cyprus) 16th to 19th March 2010. The project deals with the impact of redox processes on the long-term safety of nuclear waste disposal. The proceedings have six workpackage overview contributions, and 21 reviewed scientific-technical short papers. The proceedings document the scientific-technical progress of the second project year.

Buckau, Gunnar; Kienzler, Bernhard; Duro, Lara; Grivé, Mireia; Montoya, Vanessa

2010-01-01

 
 
 
 
361

An overview of existing RCM procedures, software and databases used in various industrial segments. A brief description of RCMCost software. A brief description of RCM Workstation 2.5 software  

International Nuclear Information System (INIS)

The report is structured as follows: (1) Brief history (1st generation, 2nd generation, 3rd generation); (2) Application of RCM in various industrial segments (Aircraft industry, nuclear industry, chemical industry, petroleum and gas processing and transport, services and other industrial segments); (3) RCM standards; (4) RCM tools; (5) Databases usable for RCM; and (6) A brief description of selected codes for RCM analysis (RCMCost v3.0, RCM Workstation 2.5). (P.A.)

362

Software methodologies for the SSC  

International Nuclear Information System (INIS)

This report describes some of the considerations that will determine how software is developed for the SSC. It begins with a review of the general computing problem for SSC experiments and recent experiences in software engineering for the present generation of experiments. This leads to a discussion of the software technologies that will be critical for the SSC experiments. The report describes the emerging software standards and commercial products that may be useful in addressing the SSC needs, and concludes with some comments on how collaborations and the SSC Lab should approach the software development issue

363

New glycoproteomics software, GlycoPep Evaluator, generates decoy glycopeptides de novo and enables accurate false discovery rate analysis for small data sets.  

Science.gov (United States)

Glycoproteins are biologically significant large molecules that participate in numerous cellular activities. In order to obtain site-specific protein glycosylation information, intact glycopeptides, with the glycan attached to the peptide sequence, are characterized by tandem mass spectrometry (MS/MS) methods such as collision-induced dissociation (CID) and electron transfer dissociation (ETD). While several automated tools are emerging, there is no consensus in the field about the best way to determine the reliability of the tools and/or provide the false discovery rate (FDR). A common approach to calculate FDRs for glycopeptide analysis, adopted from the target-decoy strategy in proteomics, employs a decoy database that is created based on the target protein sequence database. Nonetheless, this approach is not optimal in measuring the confidence of N-linked glycopeptide matches, because the glycopeptide data set is considerably smaller compared to that of peptides, and the requirement of a consensus sequence for N-glycosylation further limits the number of possible decoy glycopeptides tested in a database search. To address the need to accurately determine FDRs for automated glycopeptide assignments, we developed GlycoPep Evaluator (GPE), a tool that helps to measure FDRs in identifying glycopeptides without using a decoy database. GPE generates decoy glycopeptides de novo for every target glycopeptide, in a 1:20 target-to-decoy ratio. The decoys, along with target glycopeptides, are scored against the ETD data, from which FDRs can be calculated accurately based on the number of decoy matches and the ratio of the number of targets to decoys, for small data sets. GPE is freely accessible for download and can work with any search engine that interprets ETD data of N-linked glycopeptides. The software is provided at https://desairegroup.ku.edu/research. PMID:25137014
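The FDR arithmetic described above (decoys generated in a fixed ratio to targets, decoy matches rescaled by that ratio) can be written down in a few lines. The sketch below is a generic illustration of that calculation under the stated 1:20 target-to-decoy ratio, not GPE's actual implementation; the match counts are hypothetical.

```python
# Generic target-decoy FDR estimate for a 1:20 target-to-decoy design (illustrative).
def estimate_fdr(n_target_hits: int, n_decoy_hits: int, decoy_per_target: int = 20) -> float:
    """Scale decoy matches down by the decoy:target ratio, then divide by target matches."""
    if n_target_hits == 0:
        return 0.0
    expected_false = n_decoy_hits / decoy_per_target
    return expected_false / n_target_hits

# Hypothetical counts above a score cutoff: 95 target matches, 40 decoy matches
print(f"estimated FDR = {estimate_fdr(95, 40):.3f}")   # ~0.021, i.e. about 2%
```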

Zhu, Zhikai; Su, Xiaomeng; Go, Eden P; Desaire, Heather

2014-09-16

364

A software engineering process for safety-critical software application  

International Nuclear Information System (INIS)

Application of computer software to safety-critical systems is on the increase. To be successful, the software must be designed and constructed to meet the functional and performance requirements of the system. For safety reasons, the software must be demonstrated not only to meet these requirements, but also to operate safely as a component within the system. For longer-term cost considerations, the software must be designed and structured to ease future maintenance and modifications. This paper presents a software engineering process for the production of safety-critical software for a nuclear power plant. The presentation is expository in nature of a viable high-quality safety-critical software development. It is based on the ideas of a rational design process and on the experience of adapting such a process in the production of the safety-critical software for the shutdown system number two of the Wolsung 2, 3 and 4 nuclear power generation plants. This process is significantly different from a conventional process in terms of rigorous software development phases and software design techniques. The process covers documentation, design, verification and testing, using mathematically precise notations and a highly reviewable tabular format to specify software requirements and software design, and verifying the code against the software design using static analysis. The software engineering process described in this paper applies the principle of information-hiding decomposition in software design using a modular design technique so that when a change is required or an error is detected, the affected scope can be readily and confidently located. It also facilitates a high degree of confidence in the 'correctness' of the software production, and provides a relatively simple and straightforward code implementation effort. 1 figs., 10 refs. (Author)

365

2nd State of the Onion: Larry Wall's Keynote Address at the Second Annual O'Reilly Perl Conference  

Science.gov (United States)

This page, part of publisher O'Reilly & Associates' Website devoted to the Perl language, contains a transcript of Larry Wall's keynote address at the second annual O'Reilly Perl Conference, which was held August 17-20, 1998, in San Jose, California. In his keynote address, Larry Wall, the original author of the Perl programming language, provides a thought-provoking (and entertaining) mix of philosophy and technology. Wall's talk touches on the future of the Perl language, the relationship of the free software community to commercial software developers, chaos, complexity, and human symbology. The page also includes copies of graphics used during the keynote.

366

EDITORIAL: Selected Papers from OMS'07, the 2nd Topical Meeting of the European Optical Society on Optical Microsystems (OMS)  

Science.gov (United States)

OMS'07 was the 2nd Topical Meeting of the European Optical Society (EOS) on Optical Microsystems (OMS). It was organized by the EOS in the frame of its international topical meeting activity and, after the success of the inaugural meeting, was once again held in Italy, 30 September to 3 October 2007, amidst the wonderful scenery of the Island of Capri. The local organizing committee was composed of researchers from 'La Sapienza' University in Rome and the National Council of Research (CNR) in Naples, Italy. A selected group of leading scientists in the field formed the international scientific committee. The conference was fully dedicated to the most recent advances in the field of optical microsystems. More than 150 scientists coming from five continents attended the conference and more than 100 papers were presented, organized into the following sessions: Photonic crystals and metamaterials; Optofluidic microsystems and devices; Optical microsystems and devices; New characterization methods for materials and devices; Application of optical systems; Optical sources and photodetectors; Optical resonators; Nonlinear optic devices; Micro-optical devices. Four keynote lecturers were invited for the plenary sessions: Federico Capasso, Harvard University, USA; Bahram Javidi, University of Connecticut, USA (Distinguished Lecturer, Emeritus of LEOS-IEEE Society); Demetri Psaltis, EPFL, Lausanne, Switzerland; Amnon Yariv, California Institute of Technology, USA. Furthermore, 21 invited speakers opened the sessions of the conference with their talks. In addition a special session was organized to celebrate eighty years of the Istituto Nazionale di Ottica Applicata (INOA) of CNR. The special invited speaker for this session was Professor Theodor W Hänsch (Nobel Prize in Physics, 2005), who gave a lecture entitled 'What can we do with optical frequency combs?' In this special issue of Journal of Optics A: Pure and Applied Optics, a selection of the most interesting papers presented at OMS'07 has been collected, reporting progress in the different aspects of microsystem design, production, characterization and application. The papers embrace most of the various topics that were debated during the conference. Abstracts for the presentations given at the conference can be found on the OMS'07 website at http://www.inoa.it/oms07/. We would like to thank all the members of the scientific and industrial committees of OMS'07 for the high scientific content of the meeting, the European Optical Society for the irreplaceable support given to the conference organization and the editorial staff at Journal of Optics A for the invaluable work done in preparing the special issue.

Rendina, Ivo; Fazio, Eugenio; Ferraro, Pietro

2008-06-01

367

Software Radio  

Directory of Open Access Journals (Sweden)

Full Text Available This paper aims to provide an overview of a rapidly growing technology in the radio domain which overcomes the drawbacks suffered by conventional analog radio. This is the age of Software radio – the technology which tries to transform hardware radio transceivers into smart programmable devices which can fit into the various devices available in today’s rapidly evolving wireless communication industry. This new technology has some or all of the physical-layer functions defined in software. All of the waveform processing of a wireless device, including the physical layer, moves into software. An ideal Software Radio provides improved device flexibility, software portability, and reduced development costs. This paper tries to get into the details of all this. It takes one through a brief history of conventional radios, analyzes the drawbacks and then focuses on how Software radio overcomes these shortcomings.

Varun Sharma

2010-05-01

368

Software engineering  

CERN Document Server

The capability to design quality software and implement modern information systems is at the core of economic growth in the 21st century. Nevertheless, exploiting this potential is only possible when adequate human resources are available and when modern software engineering methods and tools are used. The recent years have witnessed rapid evolution of software engineering methodologies, including the creation of new platforms and tools which aim to shorten the software design process, raise its quality and cut down its costs. This evolution is made possible through ever-increasing knowledge of software design strategies as well as through improvements in system design and code testing procedures. At the same time, the need for broad access to high-performance and high-throughput computing resources necessitates the creation of large-scale, interactive information systems, capable of processing millions of transactions per seconds. These systems, in turn, call for new, innovative distributed software design a...

Zielinski, K

2005-01-01

369

Software Economies  

Digital Repository Infrastructure Vision for European Research (DRIVER)

Software construction has typically drawn on engineering metaphors like building bridges or cathedrals, which emphasize architecture, specification, central planning, and determinism. Approaches to correctness have drawn on metaphors from mathematics, like formal proofs. However, these approaches have failed to scale to modern software systems, and the problem keeps getting worse. We believe that the time has come to completely re-imagine the creation of complex software, drawing on systems i...

Bacon, David F.; Bokelberg, Eric; Chen, Yiling; Kash, Ian; Parkes, David C.; Rao, Malvika; Sridharan, Manu

2010-01-01

370

GREEN SOFTWARE ENGINEERING PROCESS : MOVING TOWARDS SUSTAINABLE SOFTWARE PRODUCT DESIGN  

Digital Repository Infrastructure Vision for European Research (DRIVER)

The Software development lifecycle (SDLC) currently focuses on systematic execution and maintenance of software by dividing the software development process into various phases that include requirements-gathering, design, implementation, testing, deployment and maintenance. The problem here is that certain important decisions taken in these phases, like the use of paper, generation of e-Waste, power consumption and an increased carbon footprint by means of travel, air-conditioning etc., may harm the e...

Shantanu Ray

2013-01-01

371

Software Engineering for Tagging Software  

Directory of Open Access Journals (Sweden)

Full Text Available Tagging is integrated into web applications to ease maintenance of the large amount of information stored in a web application. With no mention of a requirement specification or design document for tagging software, academically or otherwise, integrating tagging software in a web application is a tedious task. In this paper, a framework has been created for the integration of tagging software in a web application. The framework follows the software development life cycle paradigms and is to be used during its different stages. The requirement component of the framework presents a weighted requirement checklist that aids the user in deciding requirements for the tagging software in a web application from among popular ones. The design component facilitates the developer in understanding the design of existing tagging software, modifying it or developing a new one. Also, the framework helps in verification and validation of tagging software integrated in a web application.
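The weighted requirement checklist mentioned above amounts to a simple weighted-sum score over candidate tagging packages. The sketch below shows the idea with entirely hypothetical criteria, weights and candidate names; it is not the checklist from the paper.

```python
# Hypothetical weighted requirement checklist for choosing tagging software.
weights = {"tag_cloud": 0.30, "multi_user": 0.25, "rest_api": 0.25, "auto_suggest": 0.20}

candidates = {
    "TagLibA": {"tag_cloud": 1, "multi_user": 1, "rest_api": 0, "auto_suggest": 1},
    "TagLibB": {"tag_cloud": 1, "multi_user": 0, "rest_api": 1, "auto_suggest": 0},
}

def weighted_score(features: dict) -> float:
    """Sum the weights of the requirements a candidate satisfies."""
    return sum(w for name, w in weights.items() if features.get(name))

for name, feats in sorted(candidates.items(), key=lambda kv: -weighted_score(kv[1])):
    print(name, f"{weighted_score(feats):.2f}")   # TagLibA 0.75, TagLibB 0.55
```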

Karan Gupta

2013-07-01

372

Software management issues  

International Nuclear Information System (INIS)

The difficulty of managing the software in large HEP collaborations appears to be becoming progressively worse with each new generation of detector. If one were to extrapolate to the SSC, it will become a major problem. This paper explores the possible causes of the difficulty and makes suggestions on what corrective actions should be taken

373

Analysis of Polish writing on the history of physical education and sports in the North-Eastern borderlands of the 2nd republic  

Directory of Open Access Journals (Sweden)

Full Text Available The aim of this paper is to present the up-to-date state of research on physical education and sports in the North-Eastern Borderlands of the 2nd Republic based on an analysis of the Polish literature on the subject. In terms of territorial scope, the paper covers the areas of the Polesie, Novogrodek and Vilnius voivodeships. As for the scope of studies on the history of physical education and sports in the North-Eastern Borderlands of the 2nd Republic, the most cognitively significant is the work by Laskiewicz, "Kultura fizyczna na Wileńszczyźnie w latach 1900–1939. Zarys monograficzny dziejów" (Physical Culture in the Region of Vilnius in the Years 1900–1939. An Outline of Monographic History). The history of physical culture in rural areas is fairly well covered. In terms of historiography, there are publications presenting physical education and sports in urban areas. The publications mainly refer to physical activity in larger towns and cities, e.g. in Baranowicze, Brest-on-Bug, Lida, Novogrodek and Vilnius. In terms of voivodeships, papers on physical education and sports in the Region of Vilnius significantly predominate. The presented analysis of the state of research - in reference to Polish writings - shows the need for supplementary preliminary archival research of the sources in order to prepare a monograph on "Dziejów wychowania fizycznego i sportu na Kresach Północno-Wschodnich II Rzeczypospolitej" (the History of Physical Education and Sports in the North-Eastern Borderlands of the 2nd Republic). Preliminary archival research should also be conducted in the archives kept by Byelorussia and Lithuania.

Eligiusz Małolepszy

2013-05-01

374

Generation of handbook of multi-group cross sections of WIMS-D libraries by using the XnWlup2.0 software  

International Nuclear Information System (INIS)

A project to prepare an exhaustive handbook of WIMS-D cross section libraries for thermal reactor applications, comparing different WIMS-D compatible nuclear data libraries originating from various countries, has been successfully designed. To meet the objectives of this project, a computer software package with a graphical user interface for MS Windows has been developed at BARC, India. This article summarizes the salient features of this new software and presents significant improvements and extensions in relation to its first version [Ann. Nucl. Energy 29 (2002) 1735].

375

Software requirements  

CERN Document Server

Without formal, verifiable software requirements-and an effective system for managing them-the programs that developers think they've agreed to build often will not be the same products their customers are expecting. In SOFTWARE REQUIREMENTS, Second Edition, requirements engineering authority Karl Wiegers amplifies the best practices presented in his original award-winning text, now a mainstay for anyone participating in the software development process. In this book, you'll discover effective techniques for managing the requirements engineering process all the way through the development cy

Wiegers, Karl E

2003-01-01

376

2nd (final) IAEA research co-ordination meeting on 'charge exchange cross section data for fusion plasma studies'. Summary report  

International Nuclear Information System (INIS)

The proceedings and conclusions of the 2nd Research Co-ordination Meeting on 'Charge Exchange Cross Section Data for Fusion Plasma Studies', held on September 25 and 26, 2000 at the IAEA Headquarters in Vienna, are briefly described. This report includes a summary of the presentations made by the meeting participants and a review of the accomplishments of the Co-ordinated Research Project (CRP). In addition, short summaries from the participants are included indicating the specific research completed in support of this CRP. (author)

377

Comparative analysis of effectiveness of treatment with anti-TB drugs of the 1st and 2nd lines for children and adolescents with multidrug resistant tuberculosis  

Digital Repository Infrastructure Vision for European Research (DRIVER)

The paper shows the results of a study of comparative treatment effectiveness in children and adolescents with multidrug-resistant tuberculosis (MDR TB) (2000-2008) treated with anti-TB drugs of the 2nd line (80 patients) and 1st line (80 patients) in Kazakhstan. In patients with MDR TB, treatment outcomes were successful in 91.2%, but relapse of TB disease occurred in 12.7% of cases, and 5 (6.2%) patients died (P≤0.05). Thus, patients with MDR TB need t...

Tleukhan Abildaev; Gulbadan Bekembayeva; Larisa Kastykpaeva

2012-01-01

378

Proceedings of the 2nd NUCEF international symposium NUCEF'98. Safety research and development of base technology on nuclear fuel cycle  

International Nuclear Information System (INIS)

This volume contains 68 papers presented at the 2nd NUCEF International Symposium NUCEF'98 held on 16-17 November 1998, in Hitachinaka, Japan, following the 1st symposium NUCEF'95 (Proceeding: JAERI-Conf 96-003). The theme of this symposium was 'Safety Research and Development of Base Technology on Nuclear Fuel Cycle'. The papers were presented in oral and poster sessions on following research fields: (1) Criticality Safety, (2) Reprocessing and Partitioning, (3) Radioactive Waste Management. The 68 papers are indexed individually. (J.P.N.)

379

Proceedings of the 2nd NUCEF international symposium NUCEF'98. Safety research and development of base technology on nuclear fuel cycle  

Energy Technology Data Exchange (ETDEWEB)

This volume contains 68 papers presented at the 2nd NUCEF International Symposium NUCEF'98 held on 16-17 November 1998, in Hitachinaka, Japan, following the 1st symposium NUCEF'95 (Proceeding: JAERI-Conf 96-003). The theme of this symposium was 'Safety Research and Development of Base Technology on Nuclear Fuel Cycle'. The papers were presented in oral and poster sessions on following research fields: (1) Criticality Safety, (2) Reprocessing and Partitioning, (3) Radioactive Waste Management. The 68 papers are indexed individually. (J.P.N.)

NONE

1999-03-01

380

2nd IAEA research coordination meeting on collection and evaluation of reference data for thermo-mechanical properties of fusion reactor plasma facing materials. Summary report  

International Nuclear Information System (INIS)

The proceedings and results of the 2nd IAEA Research Coordination Meeting on ''Collection and Evaluation of Reference Data for Thermo-mechanical Properties of Fusion Reactor Plasma Facing Materials'' held on March 25, 26 and 27, 1996 at the IAEA Headquarters in Vienna are briefly described. This report includes a summary of presentations made by the meeting participants, the results of discussions amongst the participants regarding the status of data, publication of a multi-author review paper and recommendations regarding future work. (author). 1 tab

 
 
 
 
381

Software Reviews.  

Science.gov (United States)

Reviewed are six computer software packages including "Lunar Greenhouse,""Dyno-Quest,""How Weather Works,""Animal Trackers,""Personal Science Laboratory," and "The Skeletal and Muscular Systems." Availability, functional, and hardware requirements are discussed. (CW)

Wulfson, Stephen, Ed.

1990-01-01

382

Silverlight 4 Business Intelligence Software  

CERN Document Server

Business Intelligence (BI) software allows you to view different components of a business using a single visual platform, which makes comprehending mountains of data easier. BI is everywhere. Applications that include reports, analytics, statistics, and historical and predictive modeling are all examples of BI. Currently, we are in the second generation of BI software - called BI 2.0 - which is focused on writing BI software that is predictive, adaptive, simple, and interactive. As computers and software have evolved, more data can be presented to end users with increasingly visually rich tech

Czernicki, Bart

2010-01-01

383

Preseismic oscillating electric field "strange attractor like" precursor, of T = 6 months, triggered by Ssa tidal wave. Application on large (Ms > 6.0R) EQs in Greece (October 1st, 2006 - December 2nd, 2008)  

CERN Document Server

In this work the preseismic "strange attractor like" precursor is studied, in the domain of the Earth's oscillating electric field for T = 6 months. It is assumed that the specific oscillating electric field is generated by the corresponding lithospheric oscillation, triggered by the Ssa tidal wave of the same wavelength (6 months) under excess strain load conditions met in the focal area of a future large earthquake. The analysis of the recorded Earth's oscillating electric field by the two distant monitoring sites of PYR and HIO and for a period of time of 26 months (October 1st, 2006 - December 2nd, 2008) suggests that the specific precursor can successfully resolve the predictive time window in terms of months and for a "swarm" of large EQs (Ms > 6.0R), in contrast to the resolution obtained by the use of electric fields of shorter (T = 1, 14 days, single EQ identification) wavelength. Moreover, the fractal character of the "strange attractor like" precursor in the frequency domain is pointed out. Fina...

Thanassoulas, C; Verveniotis, G; Zymaris, N

2009-01-01

384

Nuclear application software package  

International Nuclear Information System (INIS)

The Nuclear Application Software Package generates a full-core distribution and power peaking analysis every six minutes during reactor operation. Information for these calculations is provided by a set of fixed incore, self-powered rhodium detectors whose signals are monitored and averaged to obtain input for the software. Following the calculation of a power distribution and its normalization to a core heat balance, the maximum power peaks in the core and minimum DNBR are calculated. Additional routines are provided to calculate the core reactivity, future xenon concentrations, critical rod positions, and assembly isotopic concentrations
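As a rough illustration of the normalization-and-peaking step described above, the sketch below scales a detector-derived relative power map to a core heat balance and reports a simple peaking factor. It is a hypothetical toy, not the plant software; the node layout and power figures are invented.

```python
# Hypothetical toy: normalize a relative nodal power map to the core heat balance
# and report the peak node and a simple peaking factor (peak over average).
import numpy as np

def normalize_to_heat_balance(relative_power: np.ndarray, core_power_mw: float) -> np.ndarray:
    """Scale relative nodal powers so that they sum to the measured core thermal power."""
    return relative_power * core_power_mw / relative_power.sum()

rng = np.random.default_rng(0)
relative = 1.0 + 0.2 * rng.random((10, 10))            # invented relative nodal powers
power = normalize_to_heat_balance(relative, 2700.0)    # invented 2700 MW heat balance
peaking = power.max() / power.mean()
print(f"peak node {power.max():.1f} MW, peaking factor {peaking:.3f}")
```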

385

Inventory of safeguards software  

International Nuclear Information System (INIS)

This survey will serve as a basis for determining what needs may exist in this arena for the development of next-generation safeguards systems and approaches. 23 software tools are surveyed by JAEA and NMCC. Exchanging information regarding existing software tools for safeguards, and discussing a future R and D program for developing a general-purpose safeguards tool, should be beneficial to safeguards system design and is indispensable for evaluating a safeguards system for future nuclear fuel facilities. (author)

386

A feasible study of docetaxel/nedaplatin combined chemotherapy for relapsed or refractory esophageal cancer patients as a 2nd-line chemotherapy  

International Nuclear Information System (INIS)

As a 2nd-line treatment for relapsed or refractory esophageal cancer patients after chemoradiotherapy, we performed combination chemotherapy with docetaxel (DOC)/nedaplatin (CDGP) for 11 patients. Intravenous drip infusion of DOC 30 mg/m2 and CDGP 30 mg/m2 was given on days 1, 8 and 15, with 4 weeks of treatment regarded as 1 cycle. We treated 8 of 11 patients with more than 2 cycles, and 4 of the 8 patients were also treated with radiation therapy (RT). Response by the Response Evaluation Criteria In Solid Tumors (RECIST) was partial response (PR) in 2 patients (50%), stable disease (SD) in 1 patient and progressive disease (PD) in 1 patient without RT, and PR in 3 patients and no response in 1 patient with RT. There was no treatment-related death nor adverse event of grade 4. Grade 3 hematological toxicity (leukopenia) was observed in 3 patients. Non-hematological toxicities above grade 3 were not observed. The combination chemotherapy of DOC/CDGP was concluded to be safe and effective for relapsed or refractory esophageal cancer patients as a 2nd-line treatment. (author)

387

A study on trait anger – anger expression and friendship commitment levels of primary school 2nd stage students who play – do not play sports  

Directory of Open Access Journals (Sweden)

Full Text Available The aim of this research was to investigate trait anger, anger expression and friendship commitment levels depending on whether 2nd-stage students in primary schools played sports or not. A Personal Information Form, the Trait Anger-Anger Expression Scales and the Inventory of Peer Attachment were used to collect the data. The population of the research consisted of students who studied in the 2nd stage of 40 primary state schools belonging to the National Education Directorate of Hatay Province during the 2009-2010 academic year. The sample group was made up of 853 students of 21 primary schools selected from the population (262 boy students and 149 girl students who played sports as registered players; 233 boy students and 209 girl students who did not play sports). To sum up, the comparison of the trait anger and external anger scores of the participant students who played sports yielded a statistically significant difference in terms of the sex variable (p<0.05). As for the sedentary group, boys had higher scores of internal anger and external anger than girls. In the comparison of friendship commitment scores of sedentary students in terms of the sex variable, there was a statistically significant difference between girls and boys, in favour of boys (p<0.05).

Hüseyin Kırımoğlu

2010-09-01

388

BioTfueL Project: Targeting the Development of Second-Generation Biodiesel and Biojet Fuels Le projet BioTfueL : un projet de développement de biogazole et biokérosène de 2 génération  

Digital Repository Infrastructure Vision for European Research (DRIVER)

2nd generation biofuels will have an important part to play in the energy transition as far as fuels are concerned. Using non-edible biomass, they avoid any direct competition with food usage. Within 2nd generation biofuels, the BTL route consists in the production of middle distillates (diesel and jet fuel) via gasification and Fischer-Tropsch (FT) synthesis. These fuels are called “drop-in” fuels; this means that to be used they technically do not require any modification in t...

Viguié J.-C.; Ullrich N.; Porot P.; Bournay L.; Hecquet M.; Rousseau J.

2013-01-01

389

ISE-SPL: uma abordagem baseada em linha de produtos de software aplicada à geração automática de sistemas para educação médica na plataforma E-learning / ISE-SPL: a software product line approach applied to automatic generation of systems for medical education in E-learning platform  

Scientific Electronic Library Online (English)

Full Text Available SciELO Brazil | Language: Portuguese (English abstract) INTRODUCTION: E-learning, which refers to the use of Internet-related technologies to improve knowledge and learning, has emerged as a complementary form of education, bringing advantages such as increased accessibility to information, personalized learning, democratization of education and ease of update, distribution and standardization of the content. In this sense, this paper aims to present a tool, named ISE-SPL, whose purpose is the automatic generation of E-learning systems for medical education, making use of ISE systems (Interactive Spaced-Education) and concepts of Software Product Lines. METHODS: The tool consists of an innovative methodology for medical education that aims to assist professors of healthcare in their teaching through the use of educational technologies, all based on computing applied to healthcare (Informatics in Health). RESULTS: The tests performed to validate the ISE-SPL were divided into two stages: the first was made by using a software analysis tool similar to ISE-SPL, called S.P.L.O.T, and the second was performed through usability questionnaires given to healthcare professors who used ISE-SPL. CONCLUSION: Both tests showed positive results, allowing us to conclude that ISE-SPL is an efficient tool for the generation of E-learning software and useful for teachers in healthcare.

Túlio de Paiva Marques, Carvalho; Bruno Gomes de, Araújo; Ricardo Alexsandro de Medeiros, Valentim; Jose, Diniz Junior; Francis Solange Vieira, Tourinho; Rosiane Viana Zuza, Diniz.

2013-12-01

390

Parametric Estimation of Software Systems  

Directory of Open Access Journals (Sweden)

Full Text Available Software is characterised by software metrics, and effort estimation is one such metric. Software effort estimation plays a vital role in the development of software. In recent years, software has become the most expensive component of computer system projects. The major part of the cost of software development is due to human effort, and most cost estimation methods focus on this aspect and give estimates in terms of person-months. In this paper, the effort required for the development of a software project is estimated using a genetic algorithm approach. Software systems are becoming complex and demand new, effective and optimized techniques that work with limited resources. A solution to this problem lies in nature, where complex species have evolved from simple organisms and constantly adapt to changes in the environment. For biological species this takes hundreds of generations and many years, which is impractical in software engineering; with a genetic algorithm it can be done almost instantly by simulating the results using genetic algorithm tools.
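A stripped-down version of the idea above (evolving the coefficients of an effort model against historical projects) is sketched below. It assumes a COCOMO-style power law effort = a * KLOC^b, uses a mutation-only evolutionary loop rather than a full GA with crossover, and the project data are hypothetical.

```python
# Hypothetical sketch: evolve (a, b) in effort = a * KLOC**b to minimise MMRE.
import numpy as np

rng = np.random.default_rng(1)
size_kloc = np.array([10.0, 25.0, 40.0, 60.0, 90.0])     # hypothetical project sizes
effort_pm = np.array([28.0, 80.0, 140.0, 225.0, 360.0])  # hypothetical person-months

def mmre(params):
    a, b = params
    pred = a * size_kloc ** b
    return np.mean(np.abs(pred - effort_pm) / effort_pm)  # mean magnitude of relative error

pop = rng.uniform([0.5, 0.8], [5.0, 1.5], size=(30, 2))   # initial population of (a, b)
for _ in range(200):
    children = pop + rng.normal(0.0, 0.05, size=pop.shape) # Gaussian mutation
    children = np.clip(children, [0.01, 0.1], [10.0, 2.0])
    merged = np.vstack([pop, children])
    merged = merged[np.argsort([mmre(p) for p in merged])]
    pop = merged[:30]                                      # keep the fittest (mu + lambda)

a, b = pop[0]
print(f"fitted effort ~ {a:.2f} * KLOC^{b:.2f}, MMRE = {mmre(pop[0]):.3f}")
```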

Kavita Choudhary

2011-05-01

391

MUSE instrument software  

Science.gov (United States)

MUSE Instrument Software is the software devoted to the control of the Multi-Unit Spectroscopic Explorer (MUSE), a second-generation VLT panoramic integral-field spectrograph installed at Paranal in January 2014. It includes an advanced and user-friendly GUI to display the raw data of the 24 detectors, as well as on-line reconstructed images of the field of view, allowing users to assess the quality of the data in quasi-real time. Furthermore, it implements the slow guiding system used to remove the effects of possible differential drifts between the telescope guide probe and the instrument and to reach high image stability. We present the software design and describe the developed tools that efficiently support astronomers while operating this complex instrument at the telescope.

Zins, Gérard; Pécontal, Arlette; Larrieu, Marie; Girard, Nathalie; Jarno, Aurélien; Cumani, Claudio; Baksai, Pedro; Comin, Mauro; Kiekebusch, Mario; Knudstrup, Jens; Popovic, Dan; Bacon, Roland; Richard, Johan; Stuik, Remko; Vernet, Joel

2014-07-01

392

Proceedings of the Fifth Triennial Software Quality Forum 2000, Software for the Next Millennium, Software Quality Forum  

Energy Technology Data Exchange (ETDEWEB)

The Software Quality Forum is a triennial conference held by the Software Quality Assurance Subcommittee for the Department of Energy's Quality Managers. The forum centers on key issues, information, and technology important in software development for the Nuclear Weapons Complex. This year it will be opened up to include local information technology companies and software vendors presenting their solutions, ideas, and lessons learned. The Software Quality Forum 2000 will take on a more hands-on, instructional tone than those previously held. There will be an emphasis on providing information, tools, and resources to assist developers in their goal of producing next generation software.

Scientific Software Engineering Group, CIC-12

2000-04-01

393

Comparison of Strong Gravitational Lens Model Software II. HydraLens: Computer-Assisted Strong Gravitational Lens Model Generation and Translation  

CERN Document Server

The behavior of strong gravitational lens model software in the analysis of lens models is not necessarily consistent among the various software packages available, suggesting that the use of several models may enhance the understanding of the system being studied. Among the publicly available codes, the model input files are heterogeneous, making the creation of multiple models tedious. An enhanced method of creating model files, and a method to easily create multiple models, may increase the number of comparison studies. HydraLens simplifies the creation of model files for four strong gravitational lens model software packages, including Lenstool, Gravlens/Lensmodel, glafic and PixeLens, using a custom-designed GUI for each of the four codes that simplifies the entry of the model for each of these codes, obviating the need for user manuals to set the values of the many flags and data fields. HydraLens is designed in a modular fashion, which simplifies the addition of other strong gravitational lens codes in th...

Lefor, Alan T

2015-01-01

394

Software survey  

Energy Technology Data Exchange (ETDEWEB)

This article presented a guide to new software applications designed to facilitate exploration, drilling and production activities. Oil and gas producers can use the products for a range of functions, including reservoir characterization and accounting. In addition to a description of the software application, this article listed the names of software providers and the new features available in products. The featured software of Calgary-based providers included: PetroLOOK by Alcaro Softworks Inc.; ProphetFM and MasterDRIL by Advanced Measurements Inc.; the EDGE screening tool by Canadian Discovery Ltd.; Emission Manager and Regulatory Document Manager by Envirosoft Corporation; ResSurveil, ResBalance and ResAssist by Epic Consulting Services Ltd; FAST WellTest and FAST RTA by Fekete Associates Inc.; OMNI 3D and VISTA 2D/3D by Gedco; VisualVoxAT, SBED and SBEDStudio by Geomodeling Technology Corporation; geoSCOUT, petroCUBE and gDC by GeoLOGIC Systems Ltd.; IHS Enerdeq Desktop and PETRA by IHS; DataVera by Intervera Data Solutions; FORGAS, PIPEFLO and WELLFLO by Neotechnology Consultants Ltd.; E and P Workflow Solutions by Neuralog Inc.; Oil and Gas Solutions by the RiskAdvisory division of SAS; Petrel, GeoFrame, ECLIPSE, OFM, Osprey Risk and Avocet modeler, PIPESIM and Merak by Schlumberger Information Solutions; esi.manage and esi.executive by 3esi; and dbAFE and PROSPECTOR by Winfund Corporation. Tower Management and Maintenance System, OverSite and Safety Orientation Management System software by Edmonton-based 3C Information Solutions Inc. were also highlighted, along with PowerSHAPE, PowerMILL and FeatureCAM software by Windsor, Ontario-based Delcam. Software products by Texas-based companies featured in this article included the HTRI Xchanger Suite by Heat Transfer Research Inc.; Drillworks by Knowledge Systems; GeoProbe, PowerView, GeoGraphix, AssetPlanner, Nexus software, Decision Management System, AssetSolver, and OpenWorks by Landmark; and eVIN, Rig-Hand, and OVS by Merrick Systems Inc.

Anon.

2007-07-15

395

Molten carbonate fuel cell product design & improvement - 2nd quarter, 1996. Quarterly report, April 1--June 30, 1996  

Energy Technology Data Exchange (ETDEWEB)

The main objective of this project is to establish the commercial readiness of a molten carbonate fuel cell power plant for distributed power generation, cogeneration, and compressor station applications. This effort includes marketing, systems design and analysis, packaging and assembly, test facility development, and technology development, improvement, and verification.

NONE

1997-05-01

396

A Discrete Mechanical Model of 2D Carbon Allotropes Based on a 2nd-Generation REBO Potential: Geometry and Prestress of Single-Walled CNTs  

Digital Repository Infrastructure Vision for European Research (DRIVER)

Predicting the natural equilibrium radius of a single - walled Carbon NanoTube (CNT) of given chirality, and evaluating - if any - the accompanying prestress state, are two important issues, the first of which has been repeatedly taken up in the last decade or so. In this paper, we work out both such a prediction and such an evaluation for achiral (that is, armchair and zigzag) CNTs, modeled as discrete elastic structures whose shape and volume changes are governed by a Reac...

Favata, Antonino; Micheletti, Andrea; Podio-guidugli, Paolo; Pugno, Nicola

2014-01-01

397

SHARK (System for coronagraphy with High order Adaptive optics from R to K band): a proposal for the LBT 2nd generation instrumentation  

Science.gov (United States)

This article presents a proposal aimed at investigating the technical feasibility and the scientific capabilities of high contrast cameras to be implemented at LBT. Such an instrument will fully exploit the unique LBT capabilities in Adaptive Optics (AO), as demonstrated by the First Light Adaptive Optics (FLAO) system, which is obtaining excellent results in terms of performance and reliability. The aim of this proposal is to show the scientific interest of such a project, together with a conceptual opto-mechanical study which shows its technical feasibility, taking advantage of the already existing AO systems, which are delivering the highest Strehl ratios experienced at existing telescopes today. Two channels are foreseen for SHARK, a near-infrared channel (2.5-0.9 um) and a visible one (0.9-0.6 um), both providing imaging and coronagraphic modes. The visible channel is equipped with a very fast and low-noise detector running at 1.0 kfps and an IFU spectroscopic port to provide low and medium resolution spectra of 1.5 x 1.5 arcsec fields. The search for extrasolar giant planets is the main science case and the driver for the technical choices of SHARK, while leaving room for several other interesting scientific topics, which are briefly depicted here.

Farinato, Jacopo; Pedichini, Fernando; Pinna, Enrico; Baciotti, Francesca; Baffa, Carlo; Baruffolo, Andrea; Bergomi, Maria; Bruno, Pietro; Cappellaro, Enrico; Carbonaro, Luca; Carlotti, Alexis; Centrone, Mauro; Close, Laird; Codona, Johanan; Desidera, Silvano; Dima, Marco; Esposito, Simone; Fantinel, Daniela; Farisato, Giancarlo; Fontana, Adriano; Gaessler, Wolfgang; Giallongo, Emanuele; Gratton, Raffaele; Greggio, Davide; Guerra, Juan Carlos; Guyon, Olivier; Hinz, Philip; Leone, Francesco; Lisi, Franco; Magrin, Demetrio; Marafatto, Luca; Munari, Matteo; Pagano, Isabella; Puglisi, Alfio; Ragazzoni, Roberto; Salasnich, Bernardo; Sani, Eleonora; Scuderi, Salvo; Stangalini, Marco; Testa, Vincenzo; Verinaud, Christophe; Viotto, Valentina

2014-08-01

398

On factors contributing to quality of nuclear control computer software  

International Nuclear Information System (INIS)

Safety related computer software has increasingly come into focus in the software engineering field over the past decade. This paper describes how Ontario Hydro has addressed the software industry concerns in the methodology used for designing and implementing the unit control computer software for the new Darlington Generating Station. The cornerstone of the methodology is a software quality assurance (SQA) program, which was initially set up to cover only the software development portion of the software life cycle, but which is now being extended to cover the entire software life cycle, including commissioning, operation and maintenance of the software. (author). 3 refs., 2 figs

399

A neutron diffraction study of structural distortion and magnetic ordering in the cation-ordered perovskites Ba2Nd1−xYxMoO6  

International Nuclear Information System (INIS)

The cation ordered perovskites Ba2Nd1−xYxMoO6 (0.04≤x≤0.35) have been synthesised by solid-state techniques under reducing conditions at temperatures up to 1350 °C. Rietveld analyses of X-ray and neutron powder diffraction data show that these compounds adopt a tetragonally distorted perovskite structure. The tetragonal distortion is driven by the bonding requirements of the Ba2+ cation that occupies the central interstice of the perovskite; this cation would be underbonded if these compounds retained the cubic symmetry exhibited by the prototypical structure. The size and charge difference between the lanthanides and Mo5+ lead to complete ordering of the cations to give a rock-salt ordering of Nd3+/Y3+O6 and MoO6 octahedra. The I4/m space group symmetry is retained on cooling the x=0.1, 0.2 and 0.35 samples to low temperature, ca. 2 K. Ba2Nd0.90Y0.10MoO6 undergoes a gradual distortion of the MoO6 units on cooling from room temperature to give two long trans bonds (2.001(2) Å) along the z-direction and four shorter apical bonds (1.9563(13) Å) in the xy-plane. This distortion of the MoO6 units stabilises the 4d1 electron in the dxz and dyz orbitals whilst the dxy orbital is increased in energy due to the contraction of the Mo-O bonds in the xy-plane. This bond extension along z is propagated through the structure and gives a negative thermal expansion of −13×10−6 K−1 along c. The overall volumetric thermal expansion is positive due to conventional expansion along the other two crystallographic axes. With increasing Y3+ content this distortion is reduced in x=0.2 and eliminated in x=0.35, which contains largely regular MoO6 octahedra. The x=0.1 and x=0.2 samples show small peaks in the neutron diffraction profile due to long-range antiferromagnetic order arising from ordered moments of ca. 2 μB. - Graphical Abstract: The distortion in the molybdenum crystal field is continuously adjusted by the chemical composition of the perovskite. Highlights: • Introducing Y3+ into Ba2NdMoO6 stabilises tetragonal symmetry to 2 K. • A distortion of the ligand field around the Mo5+ 4d1 cation confers electronic stabilisation. • The size of the distortion is progressively reduced with increasing Y3+ content. • Distortion gives negative thermal expansion along z and antiferromagnetic order at T≈15 K.

400

PREFACE: 2nd Russia-Japan-USA Symposium on the Fundamental and Applied Problems of Terahertz Devices and Technologies (RJUS TeraTech - 2013)  

Science.gov (United States)

The 2nd Russia-Japan-USA Symposium 'The Fundamental & Applied Problems of Terahertz Devices & Technologies' (RJUS TeraTech - 2013), Bauman Moscow State Technical University, Moscow, Russia, 3-6 June 2013. The 2nd Russia-Japan-USA Symposium 'The Fundamental & Applied Problems of Terahertz Devices & Technologies' (RJUS TeraTech - 2013) was held at Bauman Moscow State Technical University on 3-6 June 2013 and was devoted to modern problems of terahertz optical technologies. RJUS TeraTech 2013 was organized by Bauman Moscow State Technical University in cooperation with Tohoku University (Sendai, Japan) and the University of Buffalo (The State University of New York, USA). The Symposium was supported by Bauman Moscow State Technical University (Moscow, Russia) and the Russian Foundation for Basic Research (grant number 13-08-06100-g). RJUS TeraTech - 2013 became a foundation for sharing and discussing modern and promising achievements in fundamental and applied problems of terahertz optical technologies, devices based on graphene and graphene structures, and condensed matter of different natures. Among the participants of RJUS TeraTech - 2013 there were more than 100 researchers and students from different countries. This volume contains the proceedings of the 2nd Russia-Japan-USA Symposium 'The Fundamental & Applied Problems of Terahertz Devices & Technologies'. Valeriy Karasik, Viktor Ryzhii and Stanislav Yurchenko, Bauman Moscow State Technical University. Symposium chair: Anatoliy A Aleksandrov, Rector of BMSTU. Symposium co-chair: Valeriy E Karasik, Head of the Research and Educational Center 'PHOTONICS AND INFRARED TECHNOLOGY' (Russia). Invited Speakers: Taiichi Otsuji, Research Institute of Electrical Communication, Tohoku University, Sendai, Japan; Akira Satou, Research Institute of Electrical Communication, Tohoku University, Sendai, Japan; Michael Shur, Electrical, Computer and System Engineering and Physics, Applied Physics, and Astronomy, Rensselaer Polytechnic Institute, NY, USA; Natasha Kirova, University Paris-Sud, France; Andrei Sergeev, Department of Electrical Engineering, The University of Buffalo, The State University of New York, Buffalo, NY, USA; Magnus Willander, Linkoping University (LIU), Department of Science and Technology, Linkoping, Sweden; Dmitry R Khohlov, Physical Faculty, Lomonosov Moscow State University, Russia; Vladimir L Vaks, Institute for Physics of Microstructures of the Russian Academy of Sciences, Russia

Karasik, Valeriy; Ryzhii, Viktor; Yurchenko, Stanislav

2014-03-01