WorldWideScience
 
 
1

STARS 2.0: 2nd-generation open-source archiving and query software  

Science.gov (United States)

The Subaru Telescope is in the process of developing an open-source alternative to the 1st-generation software and databases (STARS 1) used for archiving and query. For STARS 2, we have chosen PHP and Python for scripting and MySQL as the database software. We have collected feedback from staff and observers and used this feedback to significantly improve the design and functionality of our future archiving and query software. Archiving - We identified two weaknesses in the 1st-generation STARS archiving software: a complex and inflexible table structure, and uncoordinated system administration for our business model of taking pictures at the summit and archiving them in both Hawaii and Japan. We adopted a simplified and normalized table structure with passive keyword collection, and we are designing an archive-to-archive file transfer system that automatically reports real-time status and error conditions and permits error recovery. Query - We identified several weaknesses in the 1st-generation STARS query software: inflexible query tools, poor sharing of calibration data, and no automatic file transfer mechanisms to observers. We are developing improved query tools, better sharing of calibration data, and multi-protocol unassisted file transfer mechanisms for observers. In the process, we have redefined a 'query': from an invisible search result that could be transferred only once, only in-house, and only immediately, with little status and error reporting and no error recovery - to a stored search result that can be monitored and transferred to different locations with multiple protocols, with status and error reporting and error recovery.
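The 'stored query' idea above lends itself to a simple persistent record whose transfers can be monitored and retried. The following is a minimal, hypothetical Python sketch (table and field names invented; SQLite is used here only for self-containedness, whereas STARS 2 itself is built on MySQL):

```python
# Hypothetical sketch of a persistent "stored query" with transfer status tracking.
# Table/field names are invented; STARS 2 itself uses MySQL, not SQLite.
import sqlite3

SCHEMA = """
CREATE TABLE IF NOT EXISTS stored_query (
    query_id   INTEGER PRIMARY KEY,
    criteria   TEXT NOT NULL,            -- serialized search criteria
    created_at TEXT DEFAULT CURRENT_TIMESTAMP
);
CREATE TABLE IF NOT EXISTS transfer (
    transfer_id INTEGER PRIMARY KEY,
    query_id    INTEGER REFERENCES stored_query(query_id),
    destination TEXT NOT NULL,           -- e.g. an observer's host
    protocol    TEXT NOT NULL,           -- e.g. 'http', 'ftp', 'scp'
    status      TEXT DEFAULT 'pending',  -- pending / running / done / error
    last_error  TEXT
);
"""

def start_transfer(db, query_id, destination, protocol):
    """Register a transfer of a stored query result; it can be retried on error."""
    cur = db.execute(
        "INSERT INTO transfer (query_id, destination, protocol) VALUES (?, ?, ?)",
        (query_id, destination, protocol))
    db.commit()
    return cur.lastrowid

def report(db, transfer_id, status, error=None):
    """Update real-time status so operators can monitor and recover from errors."""
    db.execute("UPDATE transfer SET status = ?, last_error = ? WHERE transfer_id = ?",
               (status, error, transfer_id))
    db.commit()

db = sqlite3.connect(":memory:")
db.executescript(SCHEMA)
db.execute("INSERT INTO stored_query (criteria) VALUES (?)", ("object=M31 AND filter=r",))
tid = start_transfer(db, 1, "observer-host.example", "http")
report(db, tid, "error", "connection timed out")  # a later retry would reuse the same record
```

Because the query and each transfer attempt live in the database rather than in memory, status can be inspected at any time and a failed transfer can simply be restarted against the same stored record.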

Winegar, Tom

2008-07-01

2

BASE - 2nd generation software for microarray data management and analysis  

Directory of Open Access Journals (Sweden)

Background: Microarray experiments are increasing in size and samples are collected asynchronously over long periods of time. Available data are re-analysed as more samples are hybridized. Systematic use of collected data requires tracking of biomaterials, array information, raw data, and assembly of annotations. To meet the information tracking and data analysis challenges in microarray experiments we reimplemented and improved BASE version 1.2. Results: The new BASE presented in this report is a comprehensive, annotatable, local microarray data repository and analysis application providing researchers with an efficient information management and analysis tool. The information management system tracks all material from biosource, via sample, and through extraction and labelling to raw data and analysis. All items in BASE can be annotated and the annotations can be used as experimental factors in downstream analysis. BASE stores all microarray experiment related data regardless of whether analysis tools for specific techniques or data formats are readily available. The BASE team is committed to continuing to improve and extend BASE to make it usable for even more experimental setups and techniques, and we encourage other groups to target their specific needs by leveraging the infrastructure provided by BASE. Conclusion: BASE is a comprehensive management application for information, data, and analysis of microarray experiments, available as free open source software at http://base.thep.lu.se under the terms of the GPLv3 license.

Nordborg Nicklas

2009-10-01

3

BASE - 2nd generation software for microarray data management and analysis  

Science.gov (United States)

Background: Microarray experiments are increasing in size and samples are collected asynchronously over long periods of time. Available data are re-analysed as more samples are hybridized. Systematic use of collected data requires tracking of biomaterials, array information, raw data, and assembly of annotations. To meet the information tracking and data analysis challenges in microarray experiments we reimplemented and improved BASE version 1.2. Results: The new BASE presented in this report is a comprehensive, annotatable, local microarray data repository and analysis application providing researchers with an efficient information management and analysis tool. The information management system tracks all material from biosource, via sample, and through extraction and labelling to raw data and analysis. All items in BASE can be annotated and the annotations can be used as experimental factors in downstream analysis. BASE stores all microarray experiment related data regardless of whether analysis tools for specific techniques or data formats are readily available. The BASE team is committed to continuing to improve and extend BASE to make it usable for even more experimental setups and techniques, and we encourage other groups to target their specific needs by leveraging the infrastructure provided by BASE. Conclusion: BASE is a comprehensive management application for information, data, and analysis of microarray experiments, available as free open source software at http://base.thep.lu.se under the terms of the GPLv3 license. PMID:19822003

2009-01-01

4

2nd & 3rd Generation Vehicle Subsystems  

Science.gov (United States)

This paper contains a viewgraph presentation on the "2nd & 3rd Generation Vehicle Subsystems" project. The objective of this project is to design, develop and test advanced avionics, power systems, power control and distribution components and subsystems for insertion into a highly reliable and low-cost system for Reusable Launch Vehicles (RLVs). The project is divided into two sections: 3rd Generation Vehicle Subsystems and 2nd Generation Vehicle Subsystems. The following topics are discussed under the first section, 3rd Generation Vehicle Subsystems: supporting the NASA RLV program; high-performance guidance & control adaptation for future RLVs; a description of Evolvable Hardware (EHW) for 3rd generation avionics; the Scaleable, Fault-tolerant Intelligent Network of X(trans)ducers (SFINIX); advanced electric actuation devices and subsystem technology; hybrid power sources and regeneration technology for electric actuators; and intelligent internal thermal control. Topics discussed in the 2nd Generation Vehicle Subsystems program include: design, development and test of robust, low-maintenance avionics with no active cooling requirements, and autonomous rendezvous and docking systems; design and development of low-maintenance, high-reliability, intelligent power systems (fuel cells and batteries); and design of low-cost, low-maintenance, high-horsepower actuation systems (actuators).

2000-01-01

5

2nd Generation Alkaline Electrolysis : Final report  

DEFF Research Database (Denmark)

This report provides the results of the 2nd Generation Alkaline Electrolysis project which was initiated in 2008. The project has been conducted from 2009-2012 by a consortium comprising Århus University Business and Social Science – Centre for Energy Technologies (CET, former HIRC), Technical University of Denmark – Mechanical Engineering (DTU-ME), Technical University of Denmark – Energy Conversion (DTU-EC), FORCE Technology and GreenHydrogen.dk. The project has been supported by EUDP.

Kjartansdóttir, Cecilia Kristin; Allebrod, Frank

2013-01-01

6

2nd Generation RLV Risk Definition Program  

Science.gov (United States)

The 2nd Generation RLV Risk Reduction Mid-Term Report summarizes the status of Kelly Space & Technology's activities during the first two and one half months of the program. This report was presented to the cognizant Contracting Officer's Technical Representative (COTR) and selected Marshall Space Flight Center staff members on 26 September 2000. The report has been approved and is distributed on CD-ROM (as a PowerPoint file) in accordance with the terms of the subject contract, and contains information and data addressing the following: (1) Launch services demand and requirements; (2) Architecture, alternatives, and requirements; (3) Costs, pricing, and business case analysis; (4) Commercial financing requirements, plans, and strategy; (5) System engineering processes and derived requirements; and (6) RLV system trade studies and design analysis.

Davis, Robert M.; Stucker, Mark (Technical Monitor)

2000-01-01

7

2nd Generation alkaline electrolysis. Final report  

Energy Technology Data Exchange (ETDEWEB)

The overall purpose of this project has been to contribute to this load management by developing a 2nd generation alkaline electrolysis system characterized by being compact, reliable, inexpensive and energy efficient. The specific targets for the project have been to: 1) Increase cell efficiency to more than 88% (relative to the higher heating value (HHV)) at a current density of 200 mA/cm2; 2) Increase operation temperature to more than 100 °C to make the cooling energy more valuable; 3) Obtain an operation pressure of more than 30 bar, thereby minimizing the need for further compression of hydrogen for storage; 4) Improve stack architecture, decreasing the price of the stack by at least 50%; 5) Develop a modular design making it easy to customize plants in the size range from 20 to 200 kW; 6) Demonstrate a 20 kW 2nd generation stack in H2College at the campus of Aarhus University in Herning. The project has included research and development on three different electrode technology tracks: electrochemical plating, atmospheric plasma spray (APS), and a high temperature and pressure (HTP) track with operating temperature around 250 °C and pressure around 40 bar. The results show that all three electrode tracks have reached high energy efficiencies. In the electrochemical plating track a stack efficiency of 86.5% at a current density of 177 mA/cm2 and a temperature of 74.4 °C has been shown. The APS track showed cell efficiencies of 97%; however, coatings for the anode side still need to be developed. The HTP cell has reached 100% electric efficiency operating at 1.5 V (the thermoneutral voltage) with a current density of 1.1 A/cm2. This track only tested small cells in an externally heated laboratory set-up, and thus the thermal loss to the surroundings cannot be given. The goal set for the 2nd generation electrolyser system has been to generate 30 bar pressure in the cell stack. An obstacle to be overcome has been to ensure equalisation of the H2 and O2 pressures so that mixing of the gases cannot occur. To solve this problem, a special equilibrium valve has been developed to mechanically ensure that the pressure on the H2 side at all times equals that on the O2 side. The developments have resulted in a stack design which is a cylindrical pressure vessel, with each cell having a cell 'wall' sufficiently thick to resist the high pressure and sealed with O-rings for perfect sealing at high pressures. The stack has in tests proved to resist a pressure of 45 bar, though some adjustment is still needed to optimize the pressure resistance and efficiency. When deciding on the new stack design both a 'zero gap' and a 'non-zero gap' design were considered. The zero gap design is more efficient than non-zero gap, but it is more complex and very costly, primarily because of the additional materials and production costs for zero gap electrodes. From these considerations, the concept of a 'low gap', low diameter, high pressure, high cell number electrolyser stack was born, which could offer an improved efficiency of the electrolyser without causing the same high material and production costs as a zero gap solution. As a result the low gap design and pressurized stack have reduced the price of the total system by 60%, as well as reducing the system footprint.
The progress of the project required a special focus on corrosion testing and examination of polymers in order to find alternative durable membrane and gasket materials. The initial literature survey and the first tests indicated that the chemical resistance of polymers presented a greater challenge than anticipated, and that test data from commercial suppliers were insufficient to model the conditions in the electrolyser. The alkali resistant polymers (e.g. Teflon) are costly and the search for cheaper alternatives turned into a major aim. A number of different tests were run under accelerated conditions and the degradation mechani
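For context, the efficiency figures quoted above can be related to cell voltage through the standard HHV-based definition (this relation is general electrochemistry, not a formula taken from the report):

```latex
\[
  \eta_{\mathrm{HHV}} \;=\; \frac{V_{\mathrm{tn}}}{V_{\mathrm{cell}}},
  \qquad V_{\mathrm{tn}} \approx 1.48\ \mathrm{V}\ \text{at } 25\,^{\circ}\mathrm{C},
\]
```

so a cell operating near the thermoneutral voltage (V_cell ≈ 1.48-1.5 V) has an HHV efficiency of about 100%, consistent with the HTP result above, while the 88% target corresponds to a cell voltage of roughly 1.48/0.88 ≈ 1.68 V.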

Yde, L. [Aarhus Univ. Business and Social Science - Centre for Energy Technologies (CET), Aarhus (Denmark); Kjartansdottir, C.K. [Technical Univ. of Denmark. DTU Mechanical Engineering, Kgs. Lyngby (Denmark); Allebrod, F. [Technical Univ. of Denmark. DTU Energy Conversion, DTU Risoe Campus, Roskilde (Denmark)] [and others]

2013-03-15

8

The 2nd Generation Real Time Mission Monitor (RTMM) Development  

Science.gov (United States)

The NASA Real Time Mission Monitor (RTMM) is a visualization and information system that fuses multiple Earth science data sources to enable real-time decision-making for airborne and ground validation experiments. Developed at the National Aeronautics and Space Administration (NASA) Marshall Space Flight Center, RTMM is a situational awareness, decision-support system that integrates satellite imagery and orbit data, radar and other surface observations (e.g., lightning location network data), airborne navigation and instrument data sets, model output parameters, and other applicable Earth science data sets. The integration and delivery of this information is made possible using data acquisition systems, network communication links, network server resources, and visualizations through the Google Earth virtual globe application. In order to improve the usefulness and efficiency of the RTMM system, capabilities are being developed to allow the end-user to easily configure RTMM applications based on their mission-specific requirements and objectives. This second generation RTMM is being redesigned to take advantage of the Google plug-in capabilities to run multiple applications in a web browser rather than the original single-application Google Earth approach. Currently RTMM employs a limited Service Oriented Architecture approach to enable discovery of mission-specific resources. We are expanding the RTMM architecture such that it will more effectively utilize the Open Geospatial Consortium Sensor Web Enablement services and other new technology software tools and components. These modifications and extensions will result in a robust, versatile RTMM system that will greatly increase the user's flexibility to choose which science data sets and support applications to view and/or use. The improvements brought about by the RTMM 2nd generation system will provide mission planners and airborne scientists with enhanced decision-making tools and capabilities to more efficiently plan, prepare and execute missions, as well as to play back and review past mission data. To paraphrase the old television commercial: RTMM doesn't make the airborne science, it makes the airborne science easier.

Blakeslee, Richard; Goodman, Michael; Meyer, Paul; Hardin, Danny; Hall, John; He, Yubin; Regner, Kathryn; Conover, Helen; Smith, Tammy; Lu, Jessica; Garrett, Michelle

2009-01-01

9

Performance and validation of COMPUCEA 2nd generation for uranium measurements in physical inventory verifications  

International Nuclear Information System (INIS)

Full text: In order to somewhat alleviate the kind of logistical problems encountered in the in-field measurements with the current COMPUCEA equipment (COMbined Product for Uranium Content and Enrichment Assay), and with the expected benefits of saving some time and costs for the missions in mind, ITU is presently developing a 2nd generation of the COMPUCEA device. This new development also forms a task in the support programme of the Joint Research Centre of the European Commission to the IAEA. To validate the in-field performance of the newly developed 2nd generation COMPUCEA, a prototype has been tested together with the 1st generation equipment during physical inventory verification (PIV) measurements in different uranium fuel fabrication plants in Europe. In this paper we will present the prototype of COMPUCEA 2nd generation, its hardware as well as the software developed for the evaluation of the U content and 235U enrichment. We will show a comparison of the performance of the 2nd generation with the 1st generation on a larger number of uranium samples measured during the in-field PIVs. The observed excellent performance of the new COMPUCEA represents an important step in the validation of this new instrument. (author)

10

Colchicine treatment of jute seedlings in the 1st and 2nd generation after irradiation  

International Nuclear Information System (INIS)

Colchicine treatment (0.05% for 12 h) to 15 day old seedlings in the 1st generation after X-ray or gamma-ray exposure was lethal. In contrast the same colchicine treatment to 15 day old seedlings in the 2nd generation was effective in inducing polyploids. (author)

11

Intergenerational Transmission and the School-to-work Transition for 2nd Generation Immigrants  

DEFF Research Database (Denmark)

We analyse the extent of intergenerational transmission through parental capital, ethnic capital and neighbourhood effects on several aspects of the school-to-work transition of 2nd generation immigrants and young ethnic Danes. The main findings are that parental capital has strong positive effects on the probability of completing a qualifying education and on the entry into the labour market, but it has a much smaller impact on the duration of the first employment spell and on the wage level. Growing up in neighbourhoods with a high concentration of immigrants is associated with negative labour market prospects both for young natives and 2nd generation immigrants.

Nielsen, Helena Skyt; Rosholm, Michael

2001-01-01

12

A 2nd Generation Interfacial X-ray Microscope  

Science.gov (United States)

Understanding and controlling the physical and chemical processes occurring at the interface of materials is a central theme in many of today's scientific inquiries and technological advancements. Experimental investigation of interfaces has benefited from a large set of imaging techniques such as probe microscopy and electron microscopy. Yet numerous systems comprising buried interfaces that are of immense significance remain out of the reach of these methods because of their lack of depth penetration or their inoperability in extreme conditions of pressure and/or temperature. Such systems can benefit from the development of complementary x-ray based imaging techniques that can operate in the above-cited conditions. Combining the surface sensitivity of x-ray scattering with the well-established methodology and instrumentation of transmission x-ray microscopy, a second generation interfacial x-ray microscope (IXM) is currently under development at Argonne's Advanced Photon Source with the aim of achieving a lateral resolution of 50 nm and collection times on the order of seconds. The IXM has been used to image the surface topography of solid/gas and solid/liquid interfaces with sub-nanometer height sensitivity. These scientific results as well as the instrumentation will be presented.

Laanait, Nouamane; Zhang, Zhan; Fenter, Paul

2013-03-01

13

The 1997 Protocol and the European Union (European Union and '2nd generation' responsibility conventions)  

International Nuclear Information System (INIS)

The issue of accession of the Eastern European Member States to the 1997 Protocol is discussed with focus on the European Union's authority and enforcement powers. Following up the article published in the preceding issue of this journal, the present contribution analyses the relations of the '2nd generation' responsibility conventions to the law of the European Union. (orig.)

14

Sustainable Production of Fuel : A Study for Customer Adoption of 2nd Generation of Biofuel  

Digital Repository Infrastructure Vision for European Research (DRIVER)

Abstract: Finding a new fuel to substitute for gasoline, which is being depleted rapidly every year, is an urgent problem in the world. In this situation, biofuel is considered to be one kind of new fuel that produces no pollution. Nowadays, 1st generation biofuel is familiar to people and adopted by customers, which gives it a stable market share. Since it also brings new problems, 2nd generation biofuel has appeared to solve these problems. In the thesis, I compared the pros and cons between the 1st and 2n...

Jin, Ying

2010-01-01

15

Life Cycle Systems Engineering Approach to NASA's 2nd Generation Reusable Launch Vehicle  

Science.gov (United States)

The overall goal of the 2nd Generation RLV Program is to substantially reduce the technical and business risks associated with developing a new class of reusable launch vehicles. NASA's specific goals are to improve the safety of a 2nd-generation system by two orders of magnitude - equivalent to a crew risk of 1-in-10,000 missions - and decrease the cost tenfold, to approximately $1,000 per pound of payload launched. Architecture definition is being conducted in parallel with the maturation of key technologies specifically identified to improve safety and reliability while reducing operational costs. An architecture broadly includes an Earth-to-orbit reusable launch vehicle, on-orbit transfer vehicles and upper stages, mission planning, ground and flight operations, and support infrastructure, both on the ground and in orbit. The systems engineering approach ensures that the technologies developed - such as lightweight structures, long-life rocket engines, reliable crew escape, and robust thermal protection systems - will synergistically integrate into the optimum vehicle. Given a candidate architecture that possesses credible physical processes and realistic technology assumptions, the next set of analyses address the system's functionality across the spread of operational scenarios characterized by the design reference missions. The safety/reliability and cost/economics associated with operating the system will also be modeled and analyzed to answer the questions "How safe is it?" and "How much will it cost to acquire and operate?" The systems engineering review process factors in comprehensive budget estimates, detailed project schedules, and business and performance plans, against the goals of safety, reliability, and cost, in addition to overall technical feasibility. This approach forms the basis for investment decisions in the 2nd Generation RLV Program's risk-reduction activities. Through this process, NASA will continually refine its specialized needs and identify where Defense and commercial requirements overlap those of civil missions.

Thomas, Dale; Smith, Charles; Safie, Fayssal; Kittredge, Sheryl

2002-01-01

16

Effects of Thermal Cycling on Control and Irradiated EPC 2nd Generation GaN FETs  

Science.gov (United States)

The power systems for use in NASA space missions must work reliably under harsh conditions including radiation, thermal cycling, and exposure to extreme temperatures. Gallium nitride semiconductors show great promise, but information pertaining to their performance is scarce. Gallium nitride N-channel enhancement-mode field effect transistors made by EPC Corporation in a 2nd generation of manufacturing were exposed to radiation followed by long-term thermal cycling in order to address their reliability for use in space missions. Results of the experimental work are presented and discussed.

Patterson, Richard L.; Scheick, Leif; Lauenstein, Jean-Marie; Casey, Megan; Hammoud, Ahmad

2013-01-01

17

The New 2nd-Generation SRF R&D Facility at Jefferson Lab: TEDF  

Energy Technology Data Exchange (ETDEWEB)

The US Department of Energy has funded a near-complete renovation of the SRF-based accelerator research and development facilities at Jefferson Lab. The project to accomplish this, the Technical and Engineering Development Facility (TEDF) Project, has completed the first of two phases. An entirely new 3,100 m2 purpose-built SRF technical work facility has been constructed and was occupied in the summer of 2012. All SRF work processes, with the exception of cryogenic testing, have been relocated into the new building. All cavity fabrication, processing, thermal treatment, chemistry, cleaning, and assembly work is collected conveniently into a new LEED-certified building. An innovatively designed 800 m2 cleanroom/chemroom suite provides long-term flexibility for support of multiple R&D and construction projects as well as continued process evolution. The characteristics of this first 2nd-generation SRF facility are described.

Reece, Charles E.; Reilly, Anthony V.

2012-09-01

18

The New 2nd-Generation SRF R&D Facility at Jefferson Lab: TEDF  

International Nuclear Information System (INIS)

The US Department of Energy has funded a near-complete renovation of the SRF-based accelerator research and development facilities at Jefferson Lab. The project to accomplish this, the Technical and Engineering Development Facility (TEDF) Project, has completed the first of two phases. An entirely new 3,100 m2 purpose-built SRF technical work facility has been constructed and was occupied in the summer of 2012. All SRF work processes, with the exception of cryogenic testing, have been relocated into the new building. All cavity fabrication, processing, thermal treatment, chemistry, cleaning, and assembly work is collected conveniently into a new LEED-certified building. An innovatively designed 800 m2 cleanroom/chemroom suite provides long-term flexibility for support of multiple R&D and construction projects as well as continued process evolution. The characteristics of this first 2nd-generation SRF facility are described.

19

Conceptual design study of Nb3Sn low-beta quadrupoles for 2nd generation LHC IRs  

International Nuclear Information System (INIS)

Conceptual designs of 90-mm aperture high-gradient quadrupoles based on the Nb3Sn superconductor are being developed at Fermilab for possible 2nd generation IRs with similar optics to the current low-beta insertions. Magnet designs and results of magnetic, mechanical, thermal and quench protection analyses for these magnets are presented and discussed.

20

A preliminary study investigating class characteristics in the Gurmukhi handwriting of 1st and 2nd generation Punjabis.  

Science.gov (United States)

Gurmukhi is a written script of the Punjabi language, spoken by 104 million people worldwide. It has previously been shown, in a study of Punjabi residents, to contain several unique class characteristics. In this paper these class characteristics and others were analysed in both 1st generation and 2nd generation individuals of Punjabi descent residing in the United Kingdom. Using the Pearson chi-squared test, eight characteristic features were found to be statistically different in the Gurmukhi handwriting of the two populations (p < 0.01). Additionally, there are several changes in previously identified class characteristics, such as script type and angularity of characters, between the 1st generation and 2nd generation Punjabi populations. These class characteristics may be of value to forensic document examiners and allow them to identify the population and the generation of the writer of a suspect document. PMID:18953800
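As an illustration of the statistical comparison described above, a Pearson chi-squared test of a feature's frequency in two writer populations can be set up as follows; the counts are invented placeholders, not the study's data:

```python
# Hypothetical example: test whether a handwriting feature occurs with different
# frequency in 1st- vs 2nd-generation writers (the counts below are made up).
from scipy.stats import chi2_contingency

#                 feature present, feature absent
observed = [[34, 16],   # 1st generation writers
            [12, 38]]   # 2nd generation writers

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_value:.4f}")
if p_value < 0.01:
    print("Feature frequency differs significantly between the two populations.")
```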

Turner, Ian J; Sidhu, Rajvinder K; Love, Julian M

2008-09-01

 
 
 
 
21

1st or 2nd generation bioethanol-impacts of technology integration & on feed production and land use  

DEFF Research Database (Denmark)

"1st or 2nd generation bioethanol-impacts of technology integration & on feed production and land use" Liquid bio fuels are perceived as a means of mitigating CO2 emissions from transport and thus climate change, but much concern has been raised to the energy consumption from refining biomass to liquid fuels. Integrating technologies such that waste stream can be used will reduce energy consumption in the production of bioethanol from wheat. We show that the integration of bio refining and combined heat an power generation reduces process energy requirements with 30-40 % and makes bioethanol production comparable to gasoline production in terms of energy loss. Utilisation of biomass in the energy sector is inevitably linked to the utilisation of land. This is a key difference between fossil and bio based energy systems. Thus evaluations of bioethanol production based on energy balances alone are inadequate. 1st and 2nd generation bioethanol production exhibits major differences when evaluated on characteristics as feed energy and feed protein production and subsequently on land use changes. 1st generation bioethanol production based on wheat grain in Denmark may in fact reduce the pressure on agricultural land on a global scale, but increase the pressure on local/national scale. In contrast to that 2nd generation bioethanol based on wheat straw exhibits a poorer energy balance than 1st generation, but the induced imbalances on feed energy are smaller. Proteins are some of the plant components with the poorest bio synthesis efficiency and as such the area demand for their production is relatively high. Preservation of the proteins in the biomass such as in feed by-products from bioethanol production is of paramount importance in developing sustainable utilisation of biomass in the energy sector.

Bentsen, Niclas Scott; Felby, Claus

2009-01-01

22

Self-assembling software generator  

Science.gov (United States)

A technique to generate an executable task includes inspecting a task specification data structure to determine what software entities are to be generated to create the executable task, inspecting the task specification data structure to determine how the software entities will be linked after generating the software entities, inspecting the task specification data structure to determine logic to be executed by the software entities, and generating the software entities to create the executable task.
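A minimal sketch of the idea in this abstract, under the assumption of a simple dictionary-based specification (the format and entity model are invented for illustration and are not the patented implementation):

```python
# Hypothetical sketch: generate an executable task from a task-specification
# data structure (entities, links between them, and per-entity logic).
task_spec = {
    "entities": ["read", "double", "report"],
    "links": [("read", "double"), ("double", "report")],   # who feeds whom
    "logic": {
        "read":   lambda _: 21,                 # source entity produces a value
        "double": lambda x: 2 * x,              # intermediate entity transforms it
        "report": lambda x: print("result:", x),
    },
}

def generate_task(spec):
    """Inspect the specification: find the source entity, then follow the links
    to assemble the entities into one executable pipeline."""
    targets = {dst for _, dst in spec["links"]}
    current = next(e for e in spec["entities"] if e not in targets)  # entity nobody feeds
    successor = dict(spec["links"])                                  # src -> dst
    logic = spec["logic"]

    def task():
        value, name = None, current
        while name is not None:
            value = logic[name](value)
            name = successor.get(name)
        return value

    return task

executable = generate_task(task_spec)
executable()   # prints "result: 42"
```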

Bouchard, Ann M. (Albuquerque, NM); Osbourn, Gordon C. (Albuquerque, NM)

2011-11-25

23

Using 2nd generation tyrosine kinase inhibitors in frontline management of chronic phase chronic myeloid leukemia.  

Science.gov (United States)

Choices in medicine come with responsibility. With several TKIs (tyrosine kinase inhibitors) available for front-line management of CML (chronic myeloid leukemia), an astute clinician has to personalise, rationalise and take a pragmatic approach towards selecting the best drug for the 'patient in question'. Though it is hotly debated which TKI will triumph, the truth of this debate lies in individualising treatment rather than a general 'one size fits all' approach with imatinib. I personally believe that the second generation TKIs will suit most patient clinical profiles rather than prescribing imatinib to all, and I have strived to make a strong case for them in front-line treatment of CML. Though imatinib may remain the first-line choice for some patients, my efforts in this debate are mainly geared towards breaking the myth that imatinib is the sole 'blockbuster' on the CML landscape. PMID:24665456

Jayakar, Vishal

2014-01-01

24

Strategies for 2nd generation biofuels in EU - Co-firing to stimulate feedstock supply development and process integration to improve energy efficiency and economic competitiveness  

International Nuclear Information System (INIS)

The present biofuel policies in the European Union primarily stimulate 1st generation biofuels that are produced based on conventional food crops. They may be a distraction from lignocellulose based 2nd generation biofuels - and also from biomass use for heat and electricity - by keeping farmers' attention and significant investments focusing on first generation biofuels and the cultivation of conventional food crops as feedstocks. This article presents two strategies that can contribute to the development of 2nd generation biofuels based on lignocellulosic feedstocks. The integration of gasification-based biofuel plants in district heating systems is one option for increasing the energy efficiency and improving the economic competitiveness of such biofuels. Another option, biomass co-firing with coal, generates high-efficiency biomass electricity and reduces CO2 emissions by replacing coal. It also offers a near-term market for lignocellulosic biomass, which can stimulate development of supply systems for biomass also suitable as feedstock for 2nd generation biofuels. Regardless of the long-term priorities of biomass use for energy, the stimulation of lignocellulosic biomass production by development of near term and cost-effective markets is judged to be a no-regrets strategy for Europe. Strategies that induce a relevant development and exploit existing energy infrastructures in order to reduce risk and reach lower costs are proposed as an attractive complement to the present and prospective biofuel policies. (author)

25

Strategies for 2nd generation biofuels in EU - Co-firing to stimulate feedstock supply development and process integration to improve energy efficiency and economic competitiveness  

International Nuclear Information System (INIS)

The present biofuel policies in the European Union primarily stimulate 1st generation biofuels that are produced based on conventional food crops. They may be a distraction from lignocellulose based 2nd generation biofuels - and also from biomass use for heat and electricity - by keeping farmers' attention and significant investments focusing on first generation biofuels and the cultivation of conventional food crops as feedstocks. This article presents two strategies that can contribute to the development of 2nd generation biofuels based on lignocellulosic feedstocks. The integration of gasification-based biofuel plants in district heating systems is one option for increasing the energy efficiency and improving the economic competitiveness of such biofuels. Another option, biomass co-firing with coal, generates high-efficiency biomass electricity and reduces CO2 emissions by replacing coal. It also offers a near-term market for lignocellulosic biomass, which can stimulate development of supply systems for biomass also suitable as feedstock for 2nd generation biofuels. Regardless of the long-term priorities of biomass use for energy, the stimulation of lignocellulosic biomass production by development of near term and cost-effective markets is judged to be a no-regrets strategy for Europe. Strategies that induce a relevant development and exploit existing energy infrastructures in order to reduce risk and reach lower costs are proposed as an attractive complement to the present and prospective biofuel policies.

26

Development of WWER-440 fuel. Use of fuel assemblies of 2-nd and 3-rd generations with increased enrichment  

International Nuclear Information System (INIS)

The problem of increasing the power of units at NPPs with WWER-440 reactors is of current importance. All the necessary prerequisites for solving this problem exist as a result of updates to the design of fuel assemblies and to the codes. The decrease of the power peaking factor in the core is achieved by using profiled fuel assemblies, fuel-integrated burnable absorber and FAs with a modernized docking unit, together with modern codes, which allow decreasing the conservatism of the RP safety substantiation. A wide range of experimental studies of fuel behaviour at burn-ups of 50-60 MW·day/kgU in transient and emergency conditions has been performed, and post-reactor studies of fuel assemblies, fuel rods and fuel pellets with a 5-year operating period have been carried out; these prove the high reliability of the fuel and the presence of a large margin in the fuel column, which supports reactor operation at increased power. The results of the work performed on the introduction of 5- and 6-year fuel cycles show that the ultimate fuel state with respect to operability in WWER-440 reactors is far from being reached. The neutron-physical and thermal-hydraulic characteristics of the cores of operating power units with RP V-213 are such that the actual (design and measured) power peaking factors on fuel assemblies and fuel rods are, as a rule, smaller than the maximum design values. This factor is a real reserve for power uprating. There is experience of operating Units 1, 2 and 4 of the Kola NPP and Unit 2 of the Rovno NPP at increased power. The units of the Loviisa NPP are operated at 109% power. During the transition to operation at increased power it is reasonable to use fuel assemblies with an increased fuel column height, which allows decreasing the average linear heat rate. Further development of the 2nd generation fuel assembly design and the consequent transition to 3rd generation working fuel assemblies provides a significant improvement of fuel utilisation under the conditions of WWER-440 reactor operation with longer fuel cycles and increased power.

27

CA4-05: Electronic Health Record Phenotyping to Define Rate of Extreme Weight Gain Associated with the Use of 2nd/3rd Generation Antipsychotic Medications  

Science.gov (United States)

Background/Aims Weight gain is an undesirable side effect of treatment with 2nd/3rd generation antipsychotic drugs which may have genetic determinants. A number of candidate genes have been analyzed but no genome-wide association studies (GWAS) have yet been reported. The purpose of this study was to determine the rate of extreme weight gain (EWG) among adult users of 2nd/3rd generation antipsychotic medications for future GWAS. Methods A standardized dataset was extracted on antipsychotic medications users at Group Health and Geisinger Health System from 2004–2010 that included demographics, enrollment, vitals, and pharmacy data. Electronic health record search algorithms were used to identify adult subjects with body weight increases of 15%+ within one year of the initiation of new 2nd/3rd generation antipsychotic drugs. To adjudicate the association of EWG with drug treatment, SAS was used to plot and visually inspect a single graph for each episode that encompassed all weight measurements, antipsychotic orders/fills, pregnancy- and cancer-related visits, and steroid and insulin orders/fills over time. The rate of EWG was then compared among the common types of antipsychotic medications. Results There were 7128 episodes of antipsychotic use that qualified for analysis (5251 patients). The most common medication types were quetiapine (n=2543), risperidone (n=1817), aripiprazole (n=1315), olanzapine (n=971), and ziprasidone (n=413). The least common medications were clozapine (n=37), olanzapine/fluoxetine (n=23), and paliperidone (n=9). EWG was identified for 275 of the 7128 episodes (3.86 per 100 episodes). EWG was significantly different among common medication types (p-value<0.0001). EWG was highest for olanzapine (5.46 per 100 episodes, 95% CI=[4.77, 6.24]), followed by quetiapine (3.93 per 100 episodes, 95% CI=[3.25, 4.75]), risperidone (3.58 per 100 episodes, 95% CI=[3.14, 4.07]), and aripiprazole (3.50 per 100 episodes, 95% CI=[2.65, 4.62]). For ziprasidone, EWG was 2–3 fold lower (1.69 per 100 episodes, 95% CI=[1.46, 1.96]). EWG was not significantly different between Geisinger and Group Health (4.43 per 100 episodes versus 3.65 per 100 episodes, p=0.157). Discussion Rates of EWG differed significantly by type of 2nd/3rd generation antipsychotic medication, which was consistent in two geographically diverse populations. Future GWAS may help determine if genetic determinants of EWG are drug specific.
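The rates above are episode proportions scaled to 'per 100 episodes'. A sketch of the calculation for the overall figure (275 events in 7128 episodes, about 3.86 per 100) is shown below; the Wilson interval is an assumption, since the abstract does not state which confidence-interval method was used:

```python
# Reproduce the overall EWG rate (275 events in 7128 episodes -> ~3.86 per 100)
# and attach a 95% Wilson confidence interval (interval method assumed, not from the paper).
from statsmodels.stats.proportion import proportion_confint

events, episodes = 275, 7128
rate = 100 * events / episodes
low, high = proportion_confint(events, episodes, alpha=0.05, method="wilson")
print(f"EWG rate: {rate:.2f} per 100 episodes "
      f"(95% CI {100*low:.2f}-{100*high:.2f})")
```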

Wood, G. Craig; Arterburn, David; Westbrook, Emily; Theis, Kay; Boscarino, Joseph; Rukstalis, Margaret; Still, Christopher; Gerhard, Glenn

2012-01-01

28

2nd Avenue Online  

Science.gov (United States)

Over a century ago, Yiddish theater was all the rage in New York and other major American cities with a sizable Jewish population. A wide range of well known performers (such as Paul Muni and Leonard Nimoy) cut their teeth on these stages. Of course, the 2nd Avenue corridor in New York City held many of these Yiddish theaters and this site from the New York University Libraries seeks "to capture the memory and to convey the feel of 2nd Avenue as a living part of the history and culture of New York and America." Visitors to the site can browse around the Multimedia area to listen to oral histories or check out some video clips. The Photos area includes a history of Yiddish theater in New York along with several family photo albums. The site is rounded out by a collection of related radio programs and stations.

29

Efficient 2nd and 4th harmonic generation of a single-frequency, continuous-wave fiber amplifier.  

Science.gov (United States)

We demonstrate efficient cavity-enhanced second and fourth harmonic generation of an air-cooled, continuous-wave (cw), single-frequency 1064 nm fiber-amplifier system. The second harmonic generator achieves up to 88% total external conversion efficiency, generating more than 20 W of power at 532 nm wavelength in a diffraction-limited beam (M2 [...]) with a crystal operated at 25 degrees C. The fourth harmonic generator is based on an AR-coated, Czochralski-grown beta-BaB2O4 (BBO) crystal optimized for low loss and high damage threshold. Up to 12.2 W of 266-nm deep-UV (DUV) output is obtained using a 6-mm long critically phase-matched BBO operated at 40 degrees C. This power level is more than two times higher than previously reported for cw 266-nm generation. The total external conversion efficiency from the fundamental at 1064 nm to the fourth harmonic at 266 nm is >50%. PMID:18542230
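As a rough consistency check of the quoted efficiencies (assuming the fundamental power is roughly 23 W, as implied by 20 W at 532 nm at 88% conversion but not stated explicitly):

```latex
\[
  P_{1064} \approx \frac{P_{532}}{\eta_{\mathrm{SHG}}} \approx \frac{20\ \mathrm{W}}{0.88} \approx 23\ \mathrm{W},
  \qquad
  \eta_{1064 \to 266} \approx \frac{P_{266}}{P_{1064}} \approx \frac{12.2\ \mathrm{W}}{23\ \mathrm{W}} \approx 53\% \;>\; 50\%.
\]
```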

Sudmeyer, Thomas; Imai, Yutaka; Masuda, Hisashi; Eguchi, Naoya; Saito, Masaki; Kubota, Shigeo

2008-02-01

30

Performance Evaluation of Electrochem's PEM Fuel Cell Power Plant for NASA's 2nd Generation Reusable Launch Vehicle  

Science.gov (United States)

NASA's Next Generation Launch Technology (NGLT) program is being developed to meet national needs for civil and commercial space access, with goals of reducing launch costs, increasing reliability, and reducing maintenance and operating costs. To this end, NASA is considering an all-electric capability for NGLT vehicles, requiring advanced electrical power generation technology at a nominal 20 kW level with peak power capabilities six times the nominal power. The proton exchange membrane (PEM) fuel cell has been identified as a viable candidate to supply this electrical power; however, several technology aspects need to be assessed. Electrochem, Inc., under contract to NASA, has developed a breadboard power generator to address these technical issues, with the goal of maximizing system reliability while minimizing cost and system complexity. This breadboard generator operates with dry hydrogen and oxygen gas, using eductors to recirculate the gases and thereby eliminating gas humidification and blowers from the system. Except for a coolant pump, the system design incorporates passive components, allowing the fuel cell to readily follow a duty-cycle profile and to operate at 6:1 peak power levels for 30-second durations. Performance data of the fuel cell stack along with system performance are presented to highlight the benefits of the fuel cell stack design and system design for NGLT vehicles.

Kimble, Michael C.; Hoberecht, Mark

2003-01-01

31

FT-IR Investigation of Hoveyda-Grubbs' 2nd Generation Catalyst in Self-Healing Epoxy Mixtures  

International Nuclear Information System (INIS)

The development of smart composites capable of self-repair on aeronautical structures is still at the planning stage owing to complex issues to overcome. A very important issue to solve concerns the stability of the components of the proposed composites, which is compromised at the cure temperatures necessary for good performance of the composite. In this work we analyzed the possibility of applying Hoveyda-Grubbs' second generation catalyst (HG2) to develop self-healing systems. Our experimental results have shown critical issues in the use of epoxy precursors in conjunction with the Hoveyda-Grubbs II metathesis catalyst. However, an appropriate curing cycle of the self-healing mixture overcomes these critical issues, making high curing temperatures possible without deactivating the self-repair activity.

32

Control system for the 2nd generation Berkeley automounters (BAM2) at GM/CA-CAT macromolecular crystallography beamlines  

Energy Technology Data Exchange (ETDEWEB)

GM/CA-CAT at Sector 23 of the Advanced Photon Source (APS) is an NIH funded facility for crystallographic structure determination of biological macromolecules by X-ray diffraction. A second-generation Berkeley automounter is being integrated into the beamline control system at the 23BM experimental station. This new device replaces the previous all-pneumatic gripper motions with a combination of pneumatics and XYZ motorized linear stages. The latter adds a higher degree of flexibility to the robot including auto-alignment capability, accommodation of a larger capacity sample Dewar of arbitrary shape, and support for advanced operations such as crystal washing, while preserving the overall simplicity and efficiency of the Berkeley automounter design.

Makarov, O., E-mail: makarov@anl.gov [GM/CA-CAT, Biosciences Division, Argonne National Laboratory, Argonne, IL 60439 (United States); Hilgart, M.; Ogata, C.; Pothineni, S. [GM/CA-CAT, Biosciences Division, Argonne National Laboratory, Argonne, IL 60439 (United States); Cork, C. [Physical Biosciences Division, Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States)

2011-09-01

33

FT-IR Investigation of Hoveyda-Grubbs' 2nd Generation Catalyst in Self-Healing Epoxy Mixtures  

Science.gov (United States)

The development of smart composites capable of self-repair on aeronautical structures is still at the planning stage owing to complex issues to overcome. A very important issue to solve concerns the stability of the components of the proposed composites, which is compromised at the cure temperatures necessary for good performance of the composite. In this work we analyzed the possibility of applying Hoveyda-Grubbs' second generation catalyst (HG2) to develop self-healing systems. Our experimental results have shown critical issues in the use of epoxy precursors in conjunction with the Hoveyda-Grubbs II metathesis catalyst. However, an appropriate curing cycle of the self-healing mixture overcomes these critical issues, making high curing temperatures possible without deactivating the self-repair activity.

Guadagno, Liberata; Longo, Pasquale; Raimondo, Marialuigia; Naddeo, Carlo; Mariconda, Annaluisa; Vittoria, Vittoria; Iannuzzo, Generoso; Russo, Salvatore

2010-06-01

34

Space Ops 2002: Bringing Space Operations into the 21st Century. Track 3: Operations, Mission Planning and Control. 2nd Generation Reusable Launch Vehicle-Concepts for Flight Operations  

Science.gov (United States)

With the successful implementation of the International Space Station (ISS), the National Aeronautics and Space Administration (NASA) enters a new era of opportunity for scientific research. The ISS provides a working laboratory in space, with tremendous capabilities for scientific research. Utilization of these capabilities requires a launch system capable of routinely transporting crew and logistics to/from the ISS, as well as supporting ISS assembly and maintenance tasks. The Space Shuttle serves as NASA's launch system for performing these functions. The Space Shuttle also serves as NASA's launch system for supporting other science and servicing missions that require a human presence in space. The Space Shuttle provides proof that reusable launch vehicles are technically and physically implementable. However, a couple of problems faced by NASA are the prohibitive cost of operating and maintaining the Space Shuttle and its relative inability to support high launch rates. The 2nd Generation Reusable Launch Vehicle (2nd Gen RLV) is NASA's solution to this problem. The 2nd Gen RLV will provide a robust launch system with increased safety, improved reliability and performance, and lower cost. The improved performance and reduced costs of the 2nd Gen RLV will free up resources currently spent on launch services. These resource savings can then be applied to scientific research, which in turn can be supported by the higher launch rate capability of the 2nd Gen RLV. The result is a win-win situation for science and NASA. While meeting NASA's needs, the 2nd Gen RLV also provides the United States aerospace industry with a commercially viable launch capability. One of the keys to achieving the goals of the 2nd Gen RLV is to develop and implement new technologies and processes in the area of flight operations. NASA's experience in operating the Space Shuttle and the ISS has brought to light several areas where automation can be used to augment or eliminate functions performed by crew and ground controllers. This experience has also identified the need for new approaches to staffing and training for both crew and ground controllers. This paper provides a brief overview of the mission capabilities provided by the 2nd Gen RLV, a description of NASA's approach to developing the 2nd Gen RLV, a discussion of operations concepts, and a list of challenges to implementing those concepts.

Hagopian, Jeff

2002-01-01

35

Sistema especialista de 2ª geração para diagnose técnica: modelo e procedimento / 2nd generation expert system for technical diagnosis: a model and a procedure  

Directory of Open Access Journals (Sweden)

This paper deals with the diagnosis of industrial equipment through the use of Expert Systems. Intending to develop procedures that result in diagnosis knowledge bases for Industrial Maintenance, we have considered 2nd Generation Expert Systems. We propose a modified model and a diagnosis procedure. For the diagnosis strategy we use a "top-down best-first" search that combines two types of uncertainty treatment: (i) entropy, to decide on the best path through the knowledge structures, and (ii) belief in the symptoms, to validate the resulting diagnoses. This proposal has the following advantages: a more complete knowledge base, and better explanation and presentation of the final diagnoses. We have developed a prototype based on real information about centrifugal pumps.

Néocles Alves Pereira

1994-04-01

36

Anaerobic digestion in combination with 2nd generation ethanol production for maximizing biofuels yield from lignocellulosic biomass – testing in an integrated pilot-scale biorefinery plant.  

DEFF Research Database (Denmark)

An integrated biorefinery concept for 2nd generation bioethanol production together with biogas production from the fermentation effluent was tested at pilot scale. The pilot plant comprised pretreatment, enzymatic hydrolysis, hexose and pentose fermentation into ethanol, and anaerobic digestion of the fermentation effluent in a UASB (upflow anaerobic sludge blanket) reactor. Operation of the 770 liter UASB reactor was tested under both mesophilic (38°C) and thermophilic (53°C) conditions with increasing loading rates of the liquid fraction of the effluent from ethanol fermentation. At an OLR of 3.5 kg-VS/(m3·d) a methane yield of 340 L/kg-VS was achieved for thermophilic operation, while 270 L/kg-VS was obtained under mesophilic conditions. Thermophilic operation was, however, less robust towards further increases of the loading rate, and for loading rates higher than 5 kg-VS/(m3·d) the yield was higher for mesophilic than for thermophilic operation. The effluent from the ethanol fermentation showed no signs of toxicity to the anaerobic microorganisms. Implementation of the biogas production from the fermentation effluent accounted for about 30% higher biofuel yield in the biorefinery compared to a system with only bioethanol production.
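For orientation, the quoted loading rate and methane yields combine into an approximate volumetric methane production (a standard relation, not a figure reported by the study):

```latex
\[
  r_{\mathrm{CH_4}} \;=\; \mathrm{OLR} \times Y_{\mathrm{CH_4}}
  \;=\; 3.5\ \tfrac{\mathrm{kg\,VS}}{\mathrm{m^3\,d}} \times 340\ \tfrac{\mathrm{L}}{\mathrm{kg\,VS}}
  \;\approx\; 1.2\ \tfrac{\mathrm{m^3\,CH_4}}{\mathrm{m^3\,d}}
\]
```

for the thermophilic case, and correspondingly about 0.95 m3 CH4/(m3·d) at the mesophilic yield of 270 L/kg-VS.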

Uellendahl, Hinrich; Ahring, Birgitte Kiær

37

Experimental Investigation of 2nd Generation Bioethanol Derived from Empty-Fruit-Bunch (EFB) of Oil-Palm on Performance and Exhaust Emission of SI Engine  

Directory of Open Access Journals (Sweden)

The experimental investigation of 2nd generation bioethanol derived from EFB of oil-palm, blended with gasoline at 10, 20 and 25% by volume, and of pure gasoline was conducted through performance and exhaust emission tests on an SI engine. A four-stroke, four-cylinder, programmed fuel injection (PGMFI), 16-valve, variable valve timing and electronic lift control (VTEC), single overhead camshaft (SOHC), 1497 cm3 SI engine (Honda/L15A) was used in this investigation. The engine performance test was carried out for brake torque, power, and fuel consumption. The exhaust emission was analyzed for carbon monoxide (CO) and hydrocarbon (HC). The engine was operated over a speed range from 1500 to 4500 rev/min at 85% throttle opening position. The results showed that the highest brake torque of the bioethanol blends was achieved with 10% bioethanol content at 3000 to 4500 rpm, the brake power was greater than that of pure gasoline at 3500 to 4500 rpm for 10% bioethanol, and the bioethanol-gasoline blends of 10 and 20% resulted in greater brake specific fuel consumption (bsfc) than pure gasoline at low speeds from 1500 to 3500 rpm. The CO and HC emissions tended to decrease as the engine speed increased.

Yanuandri Putrasari

2014-07-01

38

Power plant intake quantification of wheat straw composition for 2nd generation bioethanol optimization : A Near Infrared Spectroscopy (NIRS) feasibility study  

DEFF Research Database (Denmark)

Optimization of 2nd generation bioethanol production from wheat straw requires comprehensive knowledge of the plant intake feedstock composition. Near Infrared Spectroscopy is evaluated as a potential method for instantaneous quantification of the salient fermentation-related wheat straw components: cellulose (glucan), hemicelluloses (xylan, arabinan), and lignin. Aiming at chemometric multivariate calibration, 44 pre-selected samples were subjected to spectroscopy and reference analysis. For glucan and xylan, prediction accuracies (slope: 0.89, 0.94) and precisions (r2: 0.87) were obtained, corresponding to error of prediction levels of 8-9%. Models for arabinan and lignin were marginally poorer, and especially for lignin a further expansion of the feasibility dataset was deemed necessary. The results are affected by significant influences from sub-sampling/mass reduction errors in the laboratory regimen. A relatively high proportion of outliers excluded from the present models (10-20%) may indicate that comminution sample preparation is most likely always needed. Different solutions to these issues are suggested.
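A minimal sketch of the kind of chemometric multivariate calibration described above, using partial least squares (PLS) regression with cross-validation; the data are random placeholders and PLS is an assumed (typical) model choice rather than necessarily the exact method of the study:

```python
# Hypothetical PLS calibration of a wheat-straw component (e.g. glucan) from NIR spectra.
# Random data stand in for real spectra/reference values; PLS is an assumed method choice.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
X = rng.normal(size=(44, 700))             # 44 samples x 700 spectral variables
y = rng.normal(loc=40, scale=3, size=44)   # reference glucan content, % of dry matter

pls = PLSRegression(n_components=5)
y_cv = cross_val_predict(pls, X, y, cv=10).ravel()   # cross-validated predictions

slope = np.polyfit(y, y_cv, 1)[0]            # prediction "accuracy" (slope)
r2 = np.corrcoef(y, y_cv)[0, 1] ** 2         # prediction "precision" (r^2)
rmsep = np.sqrt(np.mean((y - y_cv) ** 2))    # error of prediction
print(f"slope={slope:.2f}, r2={r2:.2f}, "
      f"RMSEP={rmsep:.2f} ({100 * rmsep / np.mean(y):.1f}% of mean)")
```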

Thomsen, Mette Hedegaard; Jensen, Erik Steen

2010-01-01

39

Polymer-assisted synthesis of giant, hollow and spherical polyoxomolybdate nanomolecules: the first example of 2nd-generation of C60-structure in nature?  

Science.gov (United States)

Giant, hollow and spherical polyoxomolybdate nanomolecules, which represent the largest inorganic molecules reported so far, were prepared from the slow decomposition of an unstable precursor compound, MoO2(OH)(OOH), in the presence of PEO-containing triblock copolymer gels or concentrated polyethylene oxide (PEO) homopolymer solutions. SAXS and TEM measurements of the 1-mm size crystals revealed an extremely ordered primitive cubic (pc), zeolite-like structure made of 5-nm polyoxomolybdate spheres. The function of the PEO-containing polymer network was found to be very subtle and complex. PEO acted simultaneously as a very weak reducing agent and as a network to ensure sufficient time for the formation of long-range ordered structures, resulting in the growth of extremely uniform, hollow nanospheres. EXAFS and electrochemistry studies further suggested that these giant nanomolecules have a geometrical structure very similar to that of the 2nd generation of C60 (C240), which itself has never been observed. It is estimated that there are 60 Mo(V) and 432 Mo(VI) centres in each molecule, with each Mo surrounded on average by 3.5 other Mo atoms and 6.1 O atoms. The single nanomolecules can be stabilized by cationic surfactants in solvents, making them promising candidates as nanocontainers and nanoreactors, as well as a model system for electron transfer studies. Furthermore, this material has very interesting electronic and magnetic properties. The novel synthesis method can also be extended to prepare other giant nanoclusters containing mixed-valence metals.

Liu, Tianbo; Wan, Quan; Christian, Burger; Chu, Benjamin

2002-03-01

40

The value of CT-examinations in case of prolapse of a lumbar disc: A comparison of the CT-equipment of the 2nd and 3rd generation  

International Nuclear Information System (INIS)

In 117 patients with a tentative diagnosis of lumbar disc prolapse, myelography was carried out after a CT examination performed with 2nd-generation equipment; 60 of these patients were found to need an operation. Using 3rd-generation equipment, the CT examination results of 98 patients were compared with their myelographies; 65 of these patients were found to need an operation. The rate of agreement between the results obtained with 2nd-generation equipment and the operative findings was 78.2%; where 3rd-generation equipment had been used, the rate was 94.2%. In view of these results, only equipment of the 3rd and 4th generations should be used for routine examinations of the lumbar region. Compared with myelography, computerized tomography has the advantage that it can be carried out on out-patients and is free of risks. (orig.)

 
 
 
 
41

Formal Verification of Object-Oriented Software. Papers presented at the 2nd International Conference, October 5-7, 2011, Turin, Italy  

Digital Repository Infrastructure Vision for European Research (DRIVER)

This volume contains the invited papers, research papers, case studies, and position papers presented at the International Conference on Formal Verification of Object-Oriented Software (FoVeOOS 2011), that was held October 5-7, 2011 in Torino, Italy. Post-conference proceedings with revised versions of selected papers will be published within Springer’s Lecture Notes in Computer Science series after the conference. Formal software verification has outgrown the area of academic case studies,...

Beckert, Bernhard; Damiani, Ferruccio; Gurov, Dilian

2011-01-01

42

Sistema especialista de 2ª geração para diagnose técnica: modelo e procedimento / 2nd generation expert system for technical diagnosis: a model and a procedure  

Scientific Electronic Library Online (English)

Full Text Available SciELO Brazil | Language: Portuguese Abstract in portuguese Este trabalho trata da diagnose em equipamentos industriais através do uso de Sistemas Especialistas. Com o objetivo de desenvolver procedimentos que contribuam na construção de Sistemas Especialistas para diagnose em Manutenção Industrial, consideramos os chamados Sistemas Especialistas de 2ª Geração. Propomos um modelo modificado e um procedimento de diagnose. Na estratégia de diagnose utilizamos uma busca "top-down best-first", que combina dois tipos de tratamento de incerteza: (i) entropia, para decidir pelo melhor caminho nas estruturas de conhecimento, e (ii) crença nos sintomas, para validar os diagnósticos obtidos. Esta proposta traz as seguintes vantagens: base de conhecimento mais completa, melhores explicação e apresentação de diagnósticos finais. Desenvolvemos um protótipo com base em informações reais sobre bombas centrífugas. Abstract in english This paper deals with the diagnosis of industrial equipment through the use of Expert Systems. Intending to develop procedures that help in building Expert Systems for diagnosis in Industrial Maintenance, we considered the so-called 2nd Generation Expert Systems. We propose a modified model and a diagnosis procedure. For the diagnosis strategy we use a "top-down best-first" search that combines two types of uncertainty treatment: (i) entropy, to find the best path through the knowledge structures, and (ii) belief in the symptoms, to validate the resulting diagnoses. This proposal has the following advantages: a more complete knowledge base, and better explanation and presentation of the final diagnoses. We developed a prototype based on real information about centrifugal pumps.
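
A minimal sketch of the diagnosis strategy described in the abstract: a top-down best-first search over a fault tree, ordering branches by entropy and validating candidate diagnoses by belief in the symptoms. The tree, probabilities and threshold are invented for illustration and are not taken from the paper:

```python
import heapq, math

def entropy(probs):
    """Shannon entropy of a discrete fault-probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# node -> (children, fault probabilities of those children); leaves have no children
tree = {
    "pump fault": (["bearing wear", "seal leak"], [0.7, 0.3]),
    "bearing wear": ([], []),
    "seal leak": ([], []),
}
belief = {"bearing wear": 0.8, "seal leak": 0.4}   # belief in the observed symptoms

def diagnose(root, belief_threshold=0.6):
    # frontier ordered by (subtree entropy, -branch probability): the least
    # uncertain and then most probable branch is expanded first
    frontier = [(0.0, 0.0, root)]
    diagnoses = []
    while frontier:
        _, _, node = heapq.heappop(frontier)
        children, probs = tree.get(node, ([], []))
        if not children:                                    # leaf: candidate diagnosis
            if belief.get(node, 0.0) >= belief_threshold:   # validate by symptom belief
                diagnoses.append(node)
            continue
        for child, p in zip(children, probs):
            child_entropy = entropy(tree.get(child, ([], []))[1])
            heapq.heappush(frontier, (child_entropy, -p, child))
    return diagnoses

print(diagnose("pump fault"))   # -> ['bearing wear']
```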

Néocles Alves, Pereira.

43

A novel magnetic material: 5-nm giant polyoxomolybdate nanomolecule with very possible 2nd-generation of C60-like structure  

Science.gov (United States)

A novel magnetic material: giant, hollow and spherical polyoxomolybdate nanomolecules, which represent the largest inorganic molecules reported so far, were prepared from the slow decomposition of an unstable precursor compound MoO2(OH)(OOH) in the presence of PEO-containing triblock copolymer gels or concentrated polyethylene oxide (PEO) homopolymer solutions. SAXS and TEM measurements of the 1-mm size crystals revealed an extremely ordered primitive cubic (pc), zeolite-like structure made of 5-nm polyoxomolybdate spheres. The function of the PEO-containing polymer network was found to be very subtle and complex. PEO acted simultaneously as a very weak reducing agent and as a network ensuring sufficient time for the formation of long-range ordered structures, resulting in the growth of extremely uniform and hollow nanospheres. EXAFS and electrochemistry studies further suggested that these giant nanomolecules have a geometrical structure very similar to that of the 2nd generation of C60 (C240), which itself has never been observed. It is estimated that there are 60 Mo(V) and 432 Mo(VI) in each molecule, with each Mo surrounded on average by 3.5 other Mo atoms and 6.1 O atoms. The single nanomolecules can be stabilized by cationic surfactants in solvents, making them promising candidates as nanocontainers and nanoreactors, as well as a model system for electron transfer studies. Furthermore, this material has very interesting electronic and magnetic properties. The novel synthesis method can also be extended to prepare other giant nanoclusters containing mixed-valence metals.

Liu, Tianbo; Kim, Young-June; Chu, Benjamin; Frenkel, Anatoly; Creutz, Carol; Moodenbough, Arnold

2002-03-01

44

Development of the 2nd generation z(Redshift) and early universe spectrometer & the study of far-IR fine structure emission in high-z galaxies  

Science.gov (United States)

The 2nd generation z (Redshift) and Early Universe Spectrometer (ZEUS-2) is a long-slit echelle-grating spectrometer (R~1000) for observations at submillimeter wavelengths from 200 to 850 μm. Its design is optimized for the detection of redshifted far-infrared spectral lines from galaxies in the early universe. Combining exquisite sensitivity, broad wavelength coverage, and large (~2.5%) instantaneous bandwidth, ZEUS-2 is uniquely suited for studying galaxies between z~0.2 and 5, spanning the peaks in both the star formation rate and the number of AGN in the universe. ZEUS-2 saw first light at the Caltech Submillimeter Observatory (CSO) in the spring of 2012 and was commissioned on the Atacama Pathfinder Experiment (APEX) in November 2012. Here we detail the design and performance of ZEUS-2; first, however, we discuss important science results that exemplify the science enabled by ZEUS-2. Using the first generation z (Redshift) and Early Universe Spectrometer (ZEUS-1) we made the first high-z detections of the [NII] 122 μm and [OIII] 88 μm lines. We detect these lines from starburst galaxies between z~2.5 and 4, demonstrating the utility of these lines for characterizing the properties of early galaxies. Specifically, we are able to determine the most massive star still on the main sequence, the number of those stars, and a lower limit on the mass of ionized gas in the source. Next we present ZEUS-2's first science result. Using ZEUS-2 on APEX we have detected the [CII] 158 μm line from the z = 1.78 galaxy H-ATLAS J091043.1-000322 with a line flux of (6.44 ± 0.42) × 10^-18 W m^-2. Combined with its far-infrared luminosity and a new Herschel-PACS detection of the [OI] 63 μm line, we are able to conclude that H-ATLAS J091043.1-000322 is a high-redshift analogue of a local ultra-luminous infrared galaxy, i.e. it is likely the site of a compact starburst due to a major merger. This detection, combined with the ZEUS-1 observations of the [NII] and [OIII] lines, represents an example of the work we plan to continue with ZEUS-2. As such, they demonstrate the potential of ZEUS-2 for increasing our understanding of galaxies and galaxy evolution over cosmic time.

Ferkinhoff, Carl

45

Experimental and numerical validation of the effective medium theory for the B-term band broadening in 1st and 2nd generation monolithic silica columns.  

Science.gov (United States)

Effective medium theory (EMT) expressions for the B-term band broadening in monolithic silica columns are presented at the whole-column as well as at the mesoporous skeleton level. Given the bi-continuous nature of the monolithic medium, regular as well as inverse formulations of the EMT expressions have been established. The established expressions were validated by applying them to a set of experimental effective diffusion (Deff) data obtained via peak parking on a number of 1st and 2nd generation monolithic silica columns, as well as to a set of numerical diffusion simulations in a simplified monolithic column representation (tetrahedral skeleton model) with different external porosities and internal diffusion coefficients. The numerically simulated diffusion data can be very closely represented over a very broad range of zone retention factors (up to k″ = 80) using the established EMT expressions, especially when using the inverse variant. The expressions also allow the experimentally measured effective diffusion data to be represented very closely. The measured Deff/Dmol values were found to decrease significantly with increasing retention factor, in general going from about Deff/Dmol = 0.55 to 0.65 at low k″ (k″ ≈ 1.5-3.8) to Deff/Dmol = 0.25 at very high k″ (k″ ≈ 40-80). These values are significantly larger than observed in fully-porous and core-shell particles. The intra-skeleton diffusion coefficient (Dpz) was typically found to be of the order of Dpz/Dmol = 0.4, compared to Dpz/Dmol = 0.2-0.35 observed in most particle-based columns. These higher Dpz/Dmol values are the cause of the higher Deff/Dmol values observed. In addition, it also appears that the higher internal diffusion is linked to the higher porosity of the mesoporous skeleton, which has a relatively open structure with relatively wide pores. The observed (weak) relation between Dpz/Dmol and the zone retention factor appears to be in good agreement with that predicted when applying the regular variant of the EMT expression directly at the mesoporous skeleton level. PMID:24909439
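
For orientation only, the classical Maxwell(-Garnett) effective-medium expression below illustrates the type of relation being fitted; the paper's monolith-specific regular and inverse formulations differ in detail, so treat this purely as a generic sketch:

```python
def maxwell_effective_diffusion(d_continuous, d_dispersed, phi_dispersed):
    """Effective diffusion coefficient of a two-zone medium (classical Maxwell form)."""
    beta = (d_dispersed - d_continuous) / (d_dispersed + 2.0 * d_continuous)
    return d_continuous * (1.0 + 2.0 * phi_dispersed * beta) / (1.0 - phi_dispersed * beta)

# e.g. mobile-zone diffusivity 1.0, skeleton Dpz/Dmol = 0.4, skeleton volume fraction 0.6
print(maxwell_effective_diffusion(1.0, 0.4, 0.6))   # ~0.61, the same order as the low-retention values above
```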

Deridder, Sander; Vanmessen, Alison; Nakanishi, Kazuki; Desmet, Gert; Cabooter, Deirdre

2014-07-18

46

TOWARDS TEST CASES GENERATION FROM SOFTWARE SPECIFICATIONS  

Directory of Open Access Journals (Sweden)

Full Text Available Verification and Validation of software systems often consumes up to 70% of the development resources. Testing is one of the most frequently used Verification and Validation techniques for verifying systems. Many agencies that certify software systems for use require that the software be tested to certain specified levels of coverage. Currently, developing test cases to meet these requirements takes a major portion of the resources. Automating this task would result in significant time and cost savings. This testing research is aimed at the generation of such test cases. In the proposed approach, a formal model of the required software behavior (a formal specification) is used for test-case generation and as an oracle to determine whether the implementation produced the correct output during testing. This is referred to as Specification Based Testing. Specification based testing offers several advantages over traditional code-based testing. The formal specification can be used as the source artifact to generate functional tests for the final product, and since the test cases are produced at an earlier stage in the software development, they are available before the implementation is completed. Central to this approach is the use of model checkers as test case generation engines. Model checking is a technique for exploring the reachable state space of a system model to verify properties of interest. There are several research challenges that must be addressed to realize this test generation approach.
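
A toy illustration of the central idea: exhaustively explore the reachable state space of a small specification model and return, for each coverage target ("trap property"), the witness trace that becomes a test case. The two-variable model below is invented and does not correspond to any particular specification language:

```python
from collections import deque

def next_states(state):
    """Transition relation of a toy specification model: state = (mode, count)."""
    mode, count = state
    successors = []
    if mode == "idle":
        successors.append(("start", ("run", count)))
    if mode == "run" and count < 2:
        successors.append(("tick", ("run", count + 1)))
    if mode == "run":
        successors.append(("stop", ("idle", count)))
    return successors

def witness_traces(initial, targets):
    """Breadth-first reachability; return one input sequence reaching each target."""
    traces, seen, queue = {}, {initial}, deque([(initial, [])])
    while queue and len(traces) < len(targets):
        state, path = queue.popleft()
        if state in targets and state not in traces:
            traces[state] = path                 # the witness trace is the test case
        for action, successor in next_states(state):
            if successor not in seen:
                seen.add(successor)
                queue.append((successor, path + [action]))
    return traces

tests = witness_traces(("idle", 0), targets={("run", 2), ("idle", 2)})
for target, inputs in tests.items():
    print(target, "<-", inputs)
```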

R. Jeevarathinam,

2010-11-01

47

Test Sequence Generation for Distributed Software System  

Directory of Open Access Journals (Sweden)

Full Text Available This paper considers test case generation for distributed software (a test case contains one or more test sequences). Applying the single finite state machine (FSM) test approach to distributed software, we suffer from several problems: (1) the state combinatorial explosion problem; (2) some unexecutable test cases may be generated; (3) some faults may be masked and cannot be isolated accurately. This paper proposes a new test case generation method based on the FSM net model. Instead of testing the global transitions of the product machine, the generated test cases are used to verify the local transitions. We discuss detailed methods for verifying the outputs and the tail states of the local transitions. Moreover, we prove that if all the local transitions are correct, the transition structure of the distributed software is correct. The tests are generated from the local transition structure of the components, so the state combinatorial explosion problem is avoided. All the outputs of the local transitions are checked, so the fault isolation results can be more accurate.
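
A minimal sketch of transition-level test generation for a single component FSM in the spirit of the approach above: each test is a shortest preamble to the transition's source state, the transition's input, and the expected output and tail state to verify. The FSM is a made-up example:

```python
from collections import deque

# (state, input) -> (expected output, next state)
fsm = {
    ("s0", "a"): ("x", "s1"),
    ("s0", "b"): ("y", "s0"),
    ("s1", "a"): ("y", "s2"),
    ("s2", "b"): ("x", "s0"),
}

def preamble(initial, target):
    """Shortest input sequence driving the FSM from the initial state to `target`."""
    queue, seen = deque([(initial, [])]), {initial}
    while queue:
        state, path = queue.popleft()
        if state == target:
            return path
        for (src, inp), (_, dst) in fsm.items():
            if src == state and dst not in seen:
                seen.add(dst)
                queue.append((dst, path + [inp]))
    return None   # unreachable state

def transition_tests(initial="s0"):
    """One test per local transition: preamble + input, plus what to verify."""
    tests = []
    for (src, inp), (out, dst) in fsm.items():
        tests.append({"inputs": preamble(initial, src) + [inp],
                      "expected_output": out,
                      "expected_tail_state": dst})
    return tests

for test in transition_tests():
    print(test)
```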

Shuai Wang

2011-02-01

48

Monte Carlo generators in ATLAS software  

International Nuclear Information System (INIS)

This document describes how Monte Carlo (MC) generators can be used in the ATLAS software framework (Athena). The framework is written in C++ and uses Python scripts for job configuration. Monte Carlo generators that provide the four-vectors describing the results of LHC collisions are in general written by third parties and are not part of Athena. These libraries are linked from the LCG Generator Services (GENSER) distribution. Generators are run from within Athena and the generated event output is put into a transient store, in HepMC format, using StoreGate. A common interface, implemented via inheritance of a GeneratorModule class, guarantees common functionality for the basic generation steps. The generator information can be accessed and manipulated by helper packages like TruthHelper. The ATLAS detector simulation also accesses the truth information from StoreGate. Steering is done through specific interfaces to allow for flexible configuration using ATLAS Python scripts. Interfaces to most general-purpose generators, including Pythia6, Pythia8, Herwig, Herwig++ and Sherpa, are provided, as well as to more specialized packages, for example Phojet and Cascade. A second type of interface exists for the so-called Matrix Element generators that only generate the particles produced in the hard scattering process and write events in the Les Houches event format. A generic interface to pass these events to Pythia6 and Herwig for parton showering and hadronisation has been written.

49

Next generation lightweight mirror modeling software  

Science.gov (United States)

The advances in manufacturing techniques for lightweight mirrors, such as EXELSIS deep core low temperature fusion, Corning's continued improvements in the Frit bonding process, the ability to cast large complex designs, and water-jet and conventional diamond machining of glasses and ceramics, have created the need for more efficient means of generating finite element models of these structures. Traditional methods of assembling 400,000+ element models can take weeks of effort, severely limiting the range of possible optimization variables. This paper introduces model generation software developed under NASA sponsorship for the design of both terrestrial and space-based mirrors. The software deals with any current mirror manufacturing technique, single substrates, multiple arrays of substrates, and the ability to merge submodels into a single large model. The modeler generates both mirror and suspension system elements; suspensions can be created either for each individual petal or for the whole mirror. A typical model generation of 250,000 nodes and 450,000 elements takes only 3-5 minutes, much of that time being variable input time. The program can create input decks for ANSYS, ABAQUS and NASTRAN. An archive/retrieval system permits creation of complete trade studies, varying cell size, depth, petal size, and suspension geometry, with the ability to recall a particular set of parameters and make small or large changes with ease. The input decks created by the modeler are text files which can be modified by any text editor; all the shell thickness parameters and suspension spring rates are accessible, and comments in the deck identify which groups of elements are associated with these parameters. This again makes optimization easier. With ANSYS decks, the nodes representing support attachments are grouped into components; in ABAQUS these are SETS and in NASTRAN GRIDPOINT SETS, which makes integration of these models into large telescope or satellite models easier.
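
A toy sketch of the underlying idea: turn a parametric description of a mirror faceplate into an editable text input deck in seconds. Only node generation is shown, written as NASTRAN-style free-field GRID cards; a real modeler would also generate core cells, shell elements and suspension elements:

```python
import math

def faceplate_nodes(radius_mm=500.0, spacing_mm=50.0):
    """Nodes of a circular faceplate sampled on a regular grid."""
    nodes, node_id = [], 1
    steps = int(radius_mm // spacing_mm)
    for i in range(-steps, steps + 1):
        for j in range(-steps, steps + 1):
            x, y = i * spacing_mm, j * spacing_mm
            if math.hypot(x, y) <= radius_mm:
                nodes.append((node_id, x, y, 0.0))
                node_id += 1
    return nodes

def write_deck(nodes, path="mirror_faceplate.bdf"):
    """Write the nodes as free-field GRID cards into an editable text deck."""
    with open(path, "w") as deck:
        deck.write("$ faceplate nodes generated parametrically\n")
        for node_id, x, y, z in nodes:
            deck.write(f"GRID,{node_id},,{x:.1f},{y:.1f},{z:.1f}\n")

nodes = faceplate_nodes()
write_deck(nodes)
print(f"wrote {len(nodes)} GRID cards")
```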

Arnold, William R.; Fitzgerald, Matthew; Rosa, Rubin Jaca; Stahl, H. Philip

2013-09-01

50

Next Generation Lightweight Mirror Modeling Software  

Science.gov (United States)

The advances in manufacturing techniques for lightweight mirrors, such as EXELSIS deep core low temperature fusion, Corning's continued improvements in the Frit bonding process, the ability to cast large complex designs, and water-jet and conventional diamond machining of glasses and ceramics, have created the need for more efficient means of generating finite element models of these structures. Traditional methods of assembling 400,000+ element models can take weeks of effort, severely limiting the range of possible optimization variables. This paper introduces model generation software developed under NASA sponsorship for the design of both terrestrial and space-based mirrors. The software deals with any current mirror manufacturing technique, single substrates, multiple arrays of substrates, and the ability to merge submodels into a single large model. The modeler generates both mirror and suspension system elements; suspensions can be created either for each individual petal or for the whole mirror. A typical model generation of 250,000 nodes and 450,000 elements takes only 5-10 minutes, much of that time being variable input time. The program can create input decks for ANSYS, ABAQUS and NASTRAN. An archive/retrieval system permits creation of complete trade studies, varying cell size, depth, petal size, and suspension geometry, with the ability to recall a particular set of parameters and make small or large changes with ease. The input decks created by the modeler are text files which can be modified by any editor; all the key shell thickness parameters are accessible, and comments in the deck identify which groups of elements are associated with these parameters. This again makes optimization easier. With ANSYS decks, the nodes representing support attachments are grouped into components; in ABAQUS these are SETS and in NASTRAN GRIDPOINT SETS, which makes integration of these models into large telescope or satellite models easier.

Arnold, William R., Sr.; Fitzgerald, Mathew; Rosa, Rubin Jaca; Stahl, H. Philip

2013-01-01

51

Next-Generation Lightweight Mirror Modeling Software  

Science.gov (United States)

The advances in manufacturing techniques for lightweight mirrors, such as EXELSIS deep core low temperature fusion, Corning's continued improvements in the Frit bonding process, the ability to cast large complex designs, and water-jet and conventional diamond machining of glasses and ceramics, have created the need for more efficient means of generating finite element models of these structures. Traditional methods of assembling 400,000+ element models can take weeks of effort, severely limiting the range of possible optimization variables. This paper introduces model generation software developed under NASA sponsorship for the design of both terrestrial and space-based mirrors. The software deals with any current mirror manufacturing technique, single substrates, multiple arrays of substrates, and the ability to merge submodels into a single large model. The modeler generates both mirror and suspension system elements; suspensions can be created either for each individual petal or for the whole mirror. A typical model generation of 250,000 nodes and 450,000 elements takes only 5-10 minutes, much of that time being variable input time. The program can create input decks for ANSYS, ABAQUS and NASTRAN. An archive/retrieval system permits creation of complete trade studies, varying cell size, depth, petal size, and suspension geometry, with the ability to recall a particular set of parameters and make small or large changes with ease. The input decks created by the modeler are text files which can be modified by any editor; all the key shell thickness parameters are accessible, and comments in the deck identify which groups of elements are associated with these parameters. This again makes optimization easier. With ANSYS decks, the nodes representing support attachments are grouped into components; in ABAQUS these are SETS and in NASTRAN GRIDPOINT SETS, which makes integration of these models into large telescope or satellite models possible.

Arnold, William R., Sr.; Fitzgerald, Mathew; Rosa, Rubin Jaca; Stahl, Phil

2013-01-01

52

Experimental Stochastics (2nd edition)  

International Nuclear Information System (INIS)

Otto Moeschlin and his co-authors have written a book about simulation of stochastic systems. The book comes with a CD-ROM that contains the experiments discussed in the book, and the text from the book is repeated on the CD-ROM. According to the authors, the aim of the book is to give a quick introduction to stochastic simulation for 'all persons interested in experimental stochastics'. To please this diverse audience, the authors offer a book that has four parts. Part 1, called 'Artificial Randomness', is the longest of the four parts. It gives an overview of the generation, testing and basic usage of pseudo-random numbers in simulation. Although algorithms for generating sequences of random numbers are fundamental to simulation, it is a slightly unusual choice to give them such weight in comparison to other algorithmic topics. The remaining three parts consist of simulation case studies. Part 2, 'Stochastic Models', treats four problems: Buffon's needle, a queuing system, and two problems related to the kinetic theory of gases. Part 3 is called 'Stochastic Processes' and discusses the simulation of discrete-time Markov chains, birth-death processes, Brownian motion and diffusions. The last section of Part 3 is about simulation as a tool to understand the traffic flow in a system controlled by stoplights, an area of research for the authors. Part 4 is called 'Evaluation of Statistical Procedures'. This section contains examples where simulation is used to test the performance of statistical methods. It covers four examples: the Neyman-Pearson lemma, the Wald sequential test, Bayesian point estimation and Hartigan procedures. The CD-ROM contains an easy-to-install software package that runs under Microsoft Windows. The software contains the text and simulations from the book. What I found most enjoyable about this book is the number of topics covered in the case studies. The highly individual selection of applications, which may serve as a source of inspiration for teachers of computational stochastic methods, is the main contribution of this electronic monograph. However, both the book and the software suffer from several severe problems. Firstly, I feel that the structure of the text is weak. Probably this is partly the result of the text from the CD-ROM being put into book format, but the short paragraphs and poorly structured sentences spoil the reading experience. Secondly, although the software is functional, I believe that, like me, many users will be disappointed by the quality of the user interface and the visualizations. The opportunities to interact with the simulations are limited. Thirdly, the presentation is slightly old-fashioned and lacking in pedagogical structure. For example, flow charts and Pascal programs are used to present algorithms. To conclude, I am surprised that this electronic monograph warranted a second edition in this form. Teachers may find the examples useful as a starting point, but students and researchers are advised to look elsewhere. (book review)
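
Buffon's needle, the first case study mentioned above, makes a compact example of the kind of experiment the book simulates; a short Monte Carlo sketch estimating pi from the crossing frequency:

```python
import math, random

def buffon_pi(n_drops=100_000, needle=1.0, spacing=2.0, seed=1):
    """Estimate pi from the fraction of needle drops that cross a line."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_drops):
        y = rng.uniform(0.0, spacing / 2.0)         # centre distance to nearest line
        theta = rng.uniform(0.0, math.pi / 2.0)     # needle angle
        if y <= (needle / 2.0) * math.sin(theta):   # the needle crosses the line
            hits += 1
    return (2.0 * needle * n_drops) / (spacing * hits)

print(buffon_pi())   # close to 3.14 for large n_drops
```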

53

Synthesis and ring-opening metathesis polymerization (ROMP of new N-fluoro-phenylnorbornene dicarboximides by 2nd generation ruthenium alkylidene catalysts  

Directory of Open Access Journals (Sweden)

Full Text Available The synthesis of new N-3,5-bis(trifluoromethylphenyl-endo-norbornene-5,6-dicarboximide (TFMPhNDI, 2a, N-4-fluorophenyl-endo-norbornene-5,6-dicarboximide (FPhNDI, 2b and N-2,2,6,6-tetramethylpiperidyl-endo-norbornene-5,6-dicarboximide (TMPNDI, 2c monomers was carried out. Polynorbornene dicarboximides were obtained via ring opening metathesis polymerization (ROMP using a second generation ruthenium alkylidene catalyst (1,3-dimesityl-4,5-dihydroimidazol-2-ylidene (PCy3Cl2Ru=CHPh (I. Poly-TMPNDI which bears a piperidyl moiety showed the highest Tg and Td compared to the polymers bearing fluoro-aryl moieties. Thermal stability of Poly-TFMPhNDI (3a was enhanced after hydrogenation with Wilkinson´s catalyst.

2007-05-01

54

Automatic Testcase Generation for Flight Software  

Science.gov (United States)

The TacSat3 project is applying Integrated Systems Health Management (ISHM) technologies to an Air Force spacecraft for operational evaluation in space. The experiment will demonstrate the effectiveness and cost of ISHM and vehicle systems management (VSM) technologies through onboard operation for extended periods. We present two approaches to automatic testcase generation for ISHM: 1) A blackbox approach that views the system as a blackbox, and uses a grammar-based specification of the system's inputs to automatically generate *all* inputs that satisfy the specifications (up to prespecified limits); these inputs are then used to exercise the system. 2) A whitebox approach that performs analysis and testcase generation directly on a representation of the internal behaviour of the system under test. The enabling technologies for both these approaches are model checking and symbolic execution, as implemented in the Ames' Java PathFinder (JPF) tool suite. Model checking is an automated technique for software verification. Unlike simulation and testing which check only some of the system executions and therefore may miss errors, model checking exhaustively explores all possible executions. Symbolic execution evaluates programs with symbolic rather than concrete values and represents variable values as symbolic expressions. We are applying the blackbox approach to generating input scripts for the Spacecraft Command Language (SCL) from Interface and Control Systems. SCL is an embedded interpreter for controlling spacecraft systems. TacSat3 will be using SCL as the controller for its ISHM systems. We translated the SCL grammar into a program that outputs scripts conforming to the grammars. Running JPF on this program generates all legal input scripts up to a prespecified size. Script generation can also be targeted to specific parts of the grammar of interest to the developers. These scripts are then fed to the SCL Executive. ICS's in-house coverage tools will be run to measure code coverage. Because the scripts exercise all parts of the grammar, we expect them to provide high code coverage. This blackbox approach is suitable for systems for which we do not have access to the source code. We are applying whitebox test generation to the Spacecraft Health INference Engine (SHINE) that is part of the ISHM system. In TacSat3, SHINE will execute an on-board knowledge base for fault detection and diagnosis. SHINE converts its knowledge base into optimized C code which runs onboard TacSat3. SHINE can translate its rules into an intermediate representation (Java) suitable for analysis with JPF. JPF will analyze SHINE's Java output using symbolic execution, producing testcases that can provide either complete or directed coverage of the code. Automatically generated test suites can provide full code coverage and be quickly regenerated when code changes. Because our tools analyze executable code, they fully cover the delivered code, not just models of the code. This approach also provides a way to generate tests that exercise specific sections of code under specific preconditions. This capability gives us more focused testing of specific sections of code.
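
A minimal sketch of the blackbox idea: enumerate all sentences of a small grammar up to a depth bound and use them as test inputs. The command-script grammar below is invented for illustration and is not the actual SCL grammar:

```python
GRAMMAR = {
    "script": [["cmd"], ["cmd", "script"]],
    "cmd":    [["set", "device", "value"], ["get", "device"]],
    "device": [["heater"], ["valve"]],
    "value":  [["on"], ["off"]],
}

def expand(symbol, depth):
    """All terminal sentences derivable from `symbol` within `depth` expansions."""
    if symbol not in GRAMMAR:                       # terminal symbol
        return [[symbol]]
    if depth == 0:
        return []
    sentences = []
    for production in GRAMMAR[symbol]:
        partials = [[]]
        for sym in production:
            partials = [p + s for p in partials for s in expand(sym, depth - 1)]
        sentences.extend(partials)
    return sentences

scripts = expand("script", depth=4)
print(len(scripts), "generated scripts; first three:")
for script in scripts[:3]:
    print(" ".join(script))
```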

Bushnell, David Henry; Pasareanu, Corina; Mackey, Ryan M.

2008-01-01

55

Development of the 2nd Generation Redshift(z) and Early Universe Spectrometer and the Detailed Study of Far-IR Fine-Structure Lines in High-z Galaxies  

Science.gov (United States)

The 2nd generation Redshift(z) and Early Universe Spectrometer (ZEUS-2) is a long-slit echelle grating spectrometer (R ~ 1000) for observations at submillimeter wavelengths from 200 to 850 μm. Its design is optimized for the detection of redshifted far-infrared spectral lines from galaxies in the early universe. Combining exquisite sensitivity, broad wavelength coverage, and large (~2.5%) instantaneous bandwidth, ZEUS-2 is uniquely suited for studying galaxies between z ~ 0.2 and 5, spanning the peaks in both the star formation rate and AGN activity in the universe. ZEUS-2 saw first light at the Caltech Submillimeter Observatory (CSO) in the spring of 2012 and was commissioned on the Atacama Pathfinder Experiment (APEX) this past November. Here we report on the instrument development and performance as well as initial scientific results from the APEX commissioning. We also discuss our ZEUS-1 (the first generation Redshift(z) and Early Universe Spectrometer) detections of the [NII] 122 μm and [OIII] 88 μm lines from starburst galaxies at redshifts between ~2.5 and 4. These are the first high-z detections of these lines and they are examples of the work we plan to continue with ZEUS-2. As such, they demonstrate the potential of ZEUS-2 for increasing our understanding of galaxies and galaxy evolution over cosmic time.

Ferkinhoff, Carl; Brisbin, D.; Nikola, T.; Parshley, S.; Stacey, G. J.; Hailey-Dunsheath, S. J.; Irwin, K. D.; Cho, H.; Niemack, M.; Benford, D. J.; Staguhn, J.; Phillips, T. G.; Falgarone, E.

2013-01-01

56

Techno-economic evaluation of 2nd generation bioethanol production from sugar cane bagasse and leaves integrated with the sugar-based ethanol process  

Directory of Open Access Journals (Sweden)

Full Text Available Abstract Background Bioethanol produced from the lignocellulosic fractions of sugar cane (bagasse and leaves), i.e. second generation (2G) bioethanol, has a promising market potential as an automotive fuel; however, the process is still under investigation on pilot/demonstration scale. From a process perspective, improvements in plant design can lower the production cost, providing better profitability and competitiveness if the conversion of the whole sugar cane is considered. Simulations have been performed with AspenPlus to investigate how process integration can affect the minimum ethanol selling price of this 2G process (MESP-2G), as well as improve the plant energy efficiency. This is achieved by integrating the well-established sucrose-to-bioethanol process with the enzymatic process for lignocellulosic materials. Bagasse and leaves were steam pretreated using H3PO4 as catalyst and separately hydrolysed and fermented. Results The addition of a steam dryer, doubling of the enzyme dosage in enzymatic hydrolysis, inclusion of leaves as raw material in the 2G process, heat integration and the use of more energy-efficient equipment led to a 37% reduction in MESP-2G compared to the Base case. Modelling showed that the MESP for 2G ethanol was 0.97 US$/L, while in the future it could be reduced to 0.78 US$/L. In this case the overall production cost of 1G + 2G ethanol would be about 0.40 US$/L with an output of 102 L/ton dry sugar cane including 50% leaves. Sensitivity analysis of the future scenario showed that a 50% decrease in the cost of enzymes, electricity or leaves would lower the MESP-2G by about 20%, 10% and 4.5%, respectively. Conclusions According to the simulations, the production of 2G bioethanol from sugar cane bagasse and leaves in Brazil is already competitive (without subsidies) with 1G starch-based bioethanol production in Europe. Moreover, 2G bioethanol could be produced at a lower cost if subsidies were used to compensate for the opportunity cost from the sale of excess electricity and if the cost of enzymes continues to fall.

Macrelli Stefano

2012-04-01

57

The 2nd Generation z(Redshift) and Early Universe Spectrometer Part I: First-light observation of a highly lensed local-ULIRG analog at high-z  

CERN Document Server

We report first science results from our new spectrometer, the 2nd generation z(Redshift) and Early Universe Spectrometer (ZEUS-2), recently commissioned on the Atacama Pathfinder Experiment telescope (APEX). ZEUS-2 is a submillimeter grating spectrometer optimized for detecting the faint and broad lines from distant galaxies that are redshifted into the telluric windows from 200 to 850 microns. It utilizes a focal plane array of transition-edge sensed bolometers, the first use of these arrays for astrophysical spectroscopy. ZEUS-2 promises to be an important tool for studying galaxies in the years to come due to its synergy with ALMA and its capabilities in the short submillimeter windows that are unique in the post Herschel era. Here we report on our first detection of the [CII] 158 $\\mu m$ line with ZEUS-2. We detect the line at z ~ 1.8 from H-ATLAS J091043.1-000322 with a line flux of $(6.44 \\pm 0.42) \\times 10^{-18} W m^{-2}$. Combined with its far-infrared luminosity and a new Herschel-PACS detection of...

Ferkinhoff, Carl; Parshley, Stephen; Nikola, Thomas; Stacey, Gordon J; Schoenwald, Justin; Higdon, James L; Higdon, Sarah J U; Verma, Aprajita; Riechers, Dominik; Hailey-Dunsheath, Steven; Menten, Karl; Güsten, Rolf; Wieß, Axel; Irwin, Kent; Cho, Hsiao M; Niemack, Michael; Halpern, Mark; Amiri, Mandana; Hasselfield, Matthew; Wiebe, D V; Ade, Peter A R; Tucker, Carol E

2013-01-01

58

Generation of test cases from software requirements using combination trees  

Digital Repository Infrastructure Vision for European Research (DRIVER)

Requirements play an important role in ensuring software quality, which is verified and validated through software testing. Usually the software requirements are expressed in natural language such as English. In this paper we present an approach to generate test cases from requirements. Our approach takes requirements expressed in natural language and generates test cases using combination trees. However, until now we have had tabular representations for combination pairs or simply the chart...

Ravi Prakash Verma; Bal Gopal; Md Rizwan Beg

2011-01-01

59

A 2nd generation cosmic axion experiment  

CERN Document Server

An experiment is described to detect dark matter axions trapped in the halo of our galaxy. Galactic axions are converted into microwave photons via the Primakoff effect in a static background field provided by a superconducting magnet. The photons are collected in a high Q microwave cavity and detected by a low noise receiver. The axion mass range accessible by this experiment is 1.3-13 micro-eV. The expected sensitivity will be roughly 50 times greater than achieved by previous experiments in this mass range. The assembly of the detector is well under way at LLNL and data taking will start in mid-1995.

Hagmann, C A; Van Bibber, K; Daw, E J; Kinion, D S; Rosenberg, L J; Sikivie, P; Sullivan, N; Tanner, D B; Moltz, D M; Nezrick, F A; Turner, M; Golubev, N A; Kravchuk, L V

1995-01-01

60

Software for Automated Generation of Density Maps  

Science.gov (United States)

The Laboratory of Cell Biology at the National Cancer Institute (NCI) is seeking parties interested in collaborative research to co-develop software for the automated determination of macromolecular structures using cryo-electron microscopy.

 
 
 
 
61

Generating Protocol Software from CPN Models Annotated with Pragmatics  

DEFF Research Database (Denmark)

Model-driven software engineering (MDSE) provides a foundation for automatically generating software based on models that focus on the problem domain while abstracting from the details of underlying implementation platforms. Coloured Petri Nets (CPNs) have been widely used to formally model and verify protocol software, but limited work exists on using CPN models of protocols as a basis for automated code generation. The contribution of this paper is a method for generating protocol software from a class of CPN models annotated with code generation pragmatics. Our code generation method consists of three main steps: automatically adding so-called derived pragmatics to the CPN model, computing an abstract template tree, which associates pragmatics with code templates, and applying the templates to generate code which can then be compiled. We illustrate our method using a unidirectional data framing protocol.
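
The three-step method can be illustrated in miniature: a model whose nodes carry pragmatics is mapped to code templates and expanded into compilable code. The pragmatics, templates and model below are invented for illustration; the actual method operates on Coloured Petri Net models:

```python
# pragmatic -> code template
TEMPLATES = {
    "<<principal>>": "class {name}:\n{body}",
    "<<service>>":   "    def {name}(self):\n{body}",
    "<<send>>":      "        self.channel.send({name!r})",
    "<<receive>>":   "        msg = self.channel.receive()",
}

# annotated model node: (pragmatic, name, children)
model = ("<<principal>>", "Sender", [
    ("<<service>>", "send_frame", [
        ("<<send>>", "frame", []),
        ("<<receive>>", "ack", []),
    ]),
])

def generate(node):
    """Walk the abstract template tree and expand each pragmatic's template."""
    pragmatic, name, children = node
    body = "\n".join(generate(child) for child in children) or "        pass"
    return TEMPLATES[pragmatic].format(name=name, body=body)

print(generate(model))
```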

Simonsen, Kent Inge; Kindler, Ekkart

2013-01-01

62

Creating the next generation control system software  

Energy Technology Data Exchange (ETDEWEB)

A new 1980's style support package for future accelerator control systems is proposed. It provides a way to create accelerator applications software without traditional programming. Visual Interactive Applications (VIA) is designed to meet the needs of expanded accelerator complexes in a more cost effective way than past experience with procedural languages by using technology from the personal computer and artificial intelligence communities. 4 refs.

Schultz, D.E.

1989-01-01

63

Creating the next generation control system software  

International Nuclear Information System (INIS)

A new 1980's style support package for future accelerator control systems is proposed. It provides a way to create accelerator applications software without traditional programming. Visual Interactive Applications (VIA) is designed to meet the needs of expanded accelerator complexes in a more cost effective way than past experience with procedural languages by using technology from the personal computer and artificial intelligence communities. 4 refs

64

Abstracts: 2nd interventional MRI symposium  

International Nuclear Information System (INIS)

Main topics of the 2nd interventional MRI symposium were: MR compatibility and pulse sequences; MR thermometry, biopsy, musculoskeletal system; laser-induced interstitial thermotherapy, radiofrequency ablations; intraoperative MR; vascular applications, breast, endoscopy; focused ultrasound, cryotherapy, perspectives; poster session with 34 posters described. (AJ)

65

Abstracts: 2nd interventional MRI symposium  

Energy Technology Data Exchange (ETDEWEB)

Main topics of the 2nd interventional MRI symposium were: MR compatibility and pulse sequences; MR thermometry, biopsy, musculoskeletal system; laser-induced interstitial thermotherapy, radiofrequency ablations; intraoperative MR; vascular applications, breast, endoscopy; focused ultrasound, cryotherapy, perspectives; poster session with 34 posters described. (AJ)

Anon.

1997-09-01

66

Computer Software Configuration Item-Specific Flight Software Image Transfer Script Generator  

Science.gov (United States)

A K-shell UNIX script enables the International Space Station (ISS) Flight Control Team (FCT) operators in NASA's Mission Control Center (MCC) in Houston to transfer an entire or partial computer software configuration item (CSCI) from a flight software compact disk (CD) to the onboard Portable Computer System (PCS). The tool is designed to read the content stored on a flight software CD and generate individual CSCI transfer scripts that are capable of transferring the flight software content in a given subdirectory on the CD to the scratch directory on the PCS. The flight control team can then transfer the flight software from the PCS scratch directory to the Electrically Erasable Programmable Read-Only Memory (EEPROM) of an ISS Multiplexer/Demultiplexer (MDM) via the Indirect File Transfer capability. The individual CSCI scripts and the CSCI-Specific Flight Software Image Transfer Script Generator (CFITSG), when executed a second time, will remove all components from their original execution. The tool will identify errors in the transfer process and create logs of the transferred software for the purposes of configuration management.
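
A sketch of the generator's job, re-imagined in Python rather than K-shell: walk the flight-software CD image and, for each CSCI subdirectory, emit a small transfer script. All paths and script contents are hypothetical, not those of the real tool:

```python
import os

CD_ROOT = "/mnt/flight_sw_cd"      # hypothetical mount point of the flight-software CD
SCRATCH = "/pcs/scratch"           # hypothetical PCS scratch directory

def generate_transfer_scripts(cd_root=CD_ROOT, out_dir="."):
    if not os.path.isdir(cd_root):
        print(f"{cd_root} is not mounted; nothing to do")
        return
    for csci in sorted(os.listdir(cd_root)):
        csci_dir = os.path.join(cd_root, csci)
        if not os.path.isdir(csci_dir):
            continue
        script_path = os.path.join(out_dir, f"transfer_{csci}.sh")
        with open(script_path, "w") as script:
            script.write("#!/bin/sh\n")
            script.write(f"# transfer CSCI '{csci}' to the PCS scratch directory\n")
            for name in sorted(os.listdir(csci_dir)):
                src = os.path.join(csci_dir, name)
                dst = f"{SCRATCH}/{csci}/{name}"
                script.write(f"cp {src} {dst} || echo 'ERROR: {name}' >> transfer.log\n")
        print("wrote", script_path)

if __name__ == "__main__":
    generate_transfer_scripts()
```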

Bolen, Kenny; Greenlaw, Ronald

2010-01-01

67

A rule-based software test data generator  

Science.gov (United States)

Rule-based software test data generation is proposed as an alternative to either path/predicate analysis or random data generation. A prototype rule-based test data generator for Ada programs is constructed and compared to a random test data generator. Four Ada procedures are used in the comparison. Approximately 2000 rule-based test cases and 100,000 randomly generated test cases are automatically generated and executed. The success of the two methods is compared using standard coverage metrics. Simple statistical tests are performed, showing that even the primitive rule-based test data generation prototype is significantly better than random data generation. This result demonstrates that rule-based test data generation is feasible and shows great promise in assisting test engineers, especially when the rule base is developed further.
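
A minimal sketch of what "rule-based" means here, as opposed to random generation: each parameter type has rules producing boundary and special values, and the cross product of the rule outputs forms the test cases. The rules and parameter specification are invented for illustration:

```python
import itertools

# parameter type -> rule producing boundary and special values
RULES = {
    "int_range": lambda lo, hi: [lo, lo + 1, (lo + hi) // 2, hi - 1, hi],
    "string":    lambda max_len: ["", "a", "a" * max_len],
}

# parameter specification of a hypothetical procedure under test
parameters = [("count", "int_range", (0, 100)),
              ("name", "string", (8,))]

def rule_based_cases(parameters):
    """Cross product of the values produced by each parameter's rule."""
    value_sets = [RULES[kind](*args) for _, kind, args in parameters]
    names = [name for name, _, _ in parameters]
    return [dict(zip(names, combo)) for combo in itertools.product(*value_sets)]

cases = rule_based_cases(parameters)
print(len(cases), "test cases; first:", cases[0])
```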

Deason, William H.; Brown, David B.; Chang, Kai-Hsiung; Cross, James H., II

1991-01-01

68

Improved ant algorithms for software testing cases generation.  

Science.gov (United States)

Ant colony optimization (ACO) for software test case generation is a very popular topic in software testing engineering. However, traditional ACO has flaws: pheromone is relatively scarce early in the search, search efficiency is low, the search model is too simple, and the positive feedback mechanism easily produces stagnation and premature convergence. This paper introduces improved ACO methods for software test case generation: an improved local pheromone update strategy, an improved pheromone volatilization coefficient (IPVACO), and an improved global path pheromone update strategy (IGPACO). Finally, we put forward a comprehensive improved ant colony optimization (ACIACO) based on all three methods. The proposed technique is compared with a random algorithm (RND) and a genetic algorithm (GA) in terms of both efficiency and coverage. The results indicate that the improved methods can effectively improve search efficiency, restrain premature convergence, increase case coverage, and reduce the number of iterations. PMID:24883391
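
For readers unfamiliar with ACO in this setting, the skeleton below shows the classic algorithm applied to test-path selection on a control-flow graph: paths that add branch coverage deposit more pheromone, and evaporation counteracts stagnation. It is only the generic ACO skeleton, not the IPVACO/IGPACO/ACIACO variants proposed above:

```python
import random

# toy control-flow graph: node -> successor nodes (edges are the branches to cover)
GRAPH = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": ["E", "F"], "E": [], "F": []}
EDGES = [(u, v) for u, vs in GRAPH.items() for v in vs]

def run_aco(n_ants=10, n_iters=30, rho=0.3, seed=0):
    rng = random.Random(seed)
    pheromone = {edge: 1.0 for edge in EDGES}
    test_paths, covered = [], set()
    for _ in range(n_iters):
        for _ in range(n_ants):
            node, path = "A", []
            while GRAPH[node]:                     # walk until an exit node
                weights = [pheromone[(node, nxt)] for nxt in GRAPH[node]]
                nxt = rng.choices(GRAPH[node], weights=weights)[0]
                path.append((node, nxt))
                node = nxt
            newly_covered = set(path) - covered
            if newly_covered:                      # reward paths that add coverage
                covered |= newly_covered
                test_paths.append(path)
                for edge in path:
                    pheromone[edge] += len(newly_covered)
        for edge in pheromone:                     # evaporation counteracts stagnation
            pheromone[edge] *= (1.0 - rho)
    return test_paths, covered

paths, covered = run_aco()
print(f"{len(covered)}/{len(EDGES)} branches covered by {len(paths)} test paths")
```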

Yang, Shunkun; Man, Tianlong; Xu, Jiaqi

2014-01-01

69

Generation of test cases from software requirements using combination trees  

Directory of Open Access Journals (Sweden)

Full Text Available Requirements play an important role in ensuring software quality, which is verified and validated through software testing. Usually the software requirements are expressed in natural language such as English. In this paper we present an approach to generate test cases from requirements. Our approach takes requirements expressed in natural language and generates test cases using combination trees. Until now, combinations of input parameters have typically been represented in tables or simple charts. In this paper we propose the use of combination trees, which are far easier to visualize and handle in the testing process. This also gives the benefit of tracking which combinations of input parameters have been tested and which remain, giving further confidence in the quality of the product to be released.
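
A small sketch of the combination-tree idea: parameters become levels of a tree, each root-to-leaf path is one combination of input values, and marking visited leaves records which combinations have been tested and which remain. Parameters and values are invented for illustration:

```python
from itertools import product

parameters = {"browser": ["firefox", "chrome"],
              "os":      ["linux", "windows"],
              "lang":    ["en", "de"]}

def build_tree(names, values):
    """One tree level per parameter; leaves record whether the combination was tested."""
    if not names:
        return {"tested": False}
    first, rest = names[0], names[1:]
    return {value: build_tree(rest, values) for value in values[first]}

names = list(parameters)
tree = build_tree(names, parameters)

def mark_tested(tree, combination):
    node = tree
    for value in combination:
        node = node[value]
    node["tested"] = True

def remaining(tree, prefix=()):
    """All root-to-leaf paths (combinations) not yet marked as tested."""
    if "tested" in tree:
        return [] if tree["tested"] else [prefix]
    return [c for value, subtree in tree.items()
            for c in remaining(subtree, prefix + (value,))]

mark_tested(tree, ("firefox", "linux", "en"))
total = len(list(product(*parameters.values())))
print(f"{len(remaining(tree))} of {total} combinations still untested")
```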

Ravi Prakash Verma

2011-05-01

70

Beyond the 2nd Fermi Pulsar Catalog  

Science.gov (United States)

Over thirteen times more γ-ray pulsars have now been studied with the Large Area Telescope on NASA's Fermi satellite than the ten seen with the Compton Gamma-Ray Observatory in the nineteen-nineties. The large sample is diverse, allowing better understanding both of the pulsars themselves and of their roles in various cosmic processes. Here we explore the prospects for even more γ-ray pulsars as Fermi enters the 2nd half of its nominal ten-year mission. New pulsars will naturally tend to be fainter than the first ones discovered. Some of them will have unusual characteristics compared to the current population, which may help discriminate between models. We illustrate a vision of the future with a sample of six pulsars discovered after the 2nd Fermi Pulsar Catalog was written.

Hou, X.; Smith, D. A.; Reposeur, T.; Rousseau, R.

2014-03-01

71

Beyond the 2nd Fermi Pulsar Catalog  

CERN Document Server

Over thirteen times more gamma-ray pulsars have now been studied with the Large Area Telescope on NASA's Fermi satellite than the ten seen with the Compton Gamma-Ray Observatory in the nineteen-nineties. The large sample is diverse, allowing better understanding both of the pulsars themselves and of their roles in various cosmic processes. Here we explore the prospects for even more gamma-ray pulsars as Fermi enters the 2nd half of its nominal ten-year mission. New pulsars will naturally tend to be fainter than the first ones discovered. Some of them will have unusual characteristics compared to the current population, which may help discriminate between models. We illustrate a vision of the future with a sample of six pulsars discovered after the 2nd Fermi Pulsar Catalog was written.

Hou, Xian; Reposeur, Thierry; Rousseau, Romain

2013-01-01

72

Software Test Case Automated Generation Algorithm with Extended EDPN Model  

Directory of Open Access Journals (Sweden)

Full Text Available To improve the sufficiency of software testing and the performance of testing algorithms, an improved event-driven Petri net model built with a combination method is proposed, abbreviated as the OEDPN model. It is then applied to the OATS method to extend the implementation of OATS. On the basis of the OEDPN model, the marked associate recursive method of state combination on category is presented to solve problems of combined conflict, as well as the test case explosion caused by redundant test cases and the difficulty of extending the OATS method. Methods for generating interactive test cases with the extended OATS are also presented.

Jinlong Tao

2013-08-01

73

Overview of the next generation of Fermilab collider software  

International Nuclear Information System (INIS)

Fermilab is entering an era of operating a more complex collider facility. In addition, new operator workstations are available that have increased capabilities. The task of providing updated software in this new environment precipitated a project called Colliding Beam Software (CBS). It was soon evident that a new approach was needed for developing console software. Hence CBS, although a common acronym, is too narrow a description. A new generation of the application program subroutine library has been created to enhance the existing programming environment with a set of value added tools. Several key Collider applications were written that exploit CBS tools. This paper will discuss the new tools and the underlying change in methodology in application program development for accelerator control at Fermilab. (author)

74

Oxygen Generation System Laptop Bus Controller Flight Software  

Science.gov (United States)

The Oxygen Generation System Laptop Bus Controller Flight Software was developed to allow the International Space Station (ISS) program to activate specific components of the Oxygen Generation System (OGS) to perform a checkout of key hardware operation in a microgravity environment, as well as to perform preventative maintenance operations of system valves during a long period of what would otherwise be hardware dormancy. The software provides direct connectivity to the OGS Firmware Controller with pre-programmed tasks operated by on-orbit astronauts to exercise OGS valves and motors. The software is used to manipulate the pump, separator, and valves to alleviate the concerns of hardware problems due to long-term inactivity and to allow for operational verification of microgravity-sensitive components early enough so that, if problems are found, they can be addressed before the hardware is required for operation on-orbit. The decision was made to use existing on-orbit IBM ThinkPad A31p laptops and MIL-STD-1553B interface cards as the hardware configuration. The software at the time of this reporting was developed and tested for use under the Windows 2000 Professional operating system to ensure compatibility with the existing on-orbit computer systems.

Rowe, Chad; Panter, Donna

2009-01-01

75

JERICO. Interim Periodic Activity Report. 2nd  

Digital Repository Infrastructure Vision for European Research (DRIVER)

The main objectives of the 2nd period were to finalise the “best practices handbooks” for gliders (with GROOM), ferrybox and fixed platforms. The project also organised a workshop on JRA (Villefranche, October 2013) to present the mid-term results of WP10. JERICO also launched the second and third calls for Trans National Access. The mid-term review was passed successfully in June 2013. The second General Assembly was held in Oslo on the 5th and 6th of May 2014. A dedicated JERIC...

Farcy, Patrick; Puillat, Ingrid; Beaume, Nolwenn

2014-01-01

76

Software Test Case Automated Generation Algorithm with Extended EDPN Model  

Digital Repository Infrastructure Vision for European Research (DRIVER)

To improve the sufficiency of software testing and the performance of testing algorithms, an improved event-driven Petri net model built with a combination method is proposed, abbreviated as the OEDPN model. It is then applied to the OATS method to extend the implementation of OATS. On the basis of the OEDPN model, the marked associate recursive method of state combination on category is presented to solve problems of combined conflict. It also addresses the test case explosion generated by redundant test cases a...

Jinlong Tao; Lirong Chen

2013-01-01

77

Optimized generation of high resolution breast anthropomorphic software phantoms  

International Nuclear Information System (INIS)

Purpose: The authors present an efficient method for generating anthropomorphic software breast phantoms with high spatial resolution. Employing the same region growing principles as in their previous algorithm for breast anatomy simulation, the present method has been optimized for computational complexity to allow for fast generation of the large number of phantoms required in virtual clinical trials of breast imaging. Methods: The new breast anatomy simulation method performs a direct calculation of the Cooper's ligaments (i.e., the borders between simulated adipose compartments). The calculation corresponds to quadratic decision boundaries of a maximum a posteriori classifier. The method is multiscale due to the use of octree-based recursive partitioning of the phantom volume. The method also provides user control of the thickness of the simulated Cooper's ligaments and skin. Results: Using the proposed method, the authors have generated phantoms with voxel sizes in the range of (25-1000 μm)³. The power regression of the simulation time as a function of the reciprocal voxel size yielded a log-log slope of 1.95 (compared to a slope of 4.53 for the previous region growing algorithm). Conclusions: A new algorithm for computer simulation of breast anatomy has been proposed that allows for fast generation of high-resolution anthropomorphic software phantoms.
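
A sketch of octree-based recursive partitioning of the kind described above: a cell is subdivided only while it still straddles more than one simulated compartment, so fine voxels are concentrated near compartment boundaries (the simulated Cooper's ligaments). The compartment labelling below is a stand-in, not the paper's MAP classifier:

```python
def compartment(x, y, z):
    """Toy labelling of which of two tissue compartments a point belongs to."""
    return 0 if (x - 0.5) ** 2 + (y - 0.5) ** 2 + (z - 0.5) ** 2 < 0.09 else 1

def octree_cells(x, y, z, size, min_size):
    """Leaf cells (x, y, z, size, label); only cells straddling a boundary are subdivided."""
    corner_labels = [compartment(x + dx * size, y + dy * size, z + dz * size)
                     for dx in (0, 1) for dy in (0, 1) for dz in (0, 1)]
    if len(set(corner_labels)) == 1 or size <= min_size:
        return [(x, y, z, size, corner_labels[0])]
    half = size / 2.0
    cells = []
    for dx in (0, 1):
        for dy in (0, 1):
            for dz in (0, 1):
                cells.extend(octree_cells(x + dx * half, y + dy * half,
                                          z + dz * half, half, min_size))
    return cells

cells = octree_cells(0.0, 0.0, 0.0, 1.0, min_size=1.0 / 32)
print(len(cells), "adaptive leaf cells instead of", 32 ** 3, "uniform voxels")
```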

78

Next Generation of ECT Software for Data Analysis of Steam Generator Tubes  

International Nuclear Information System (INIS)

Improvements to the existing EddyOne eddy current analysis software are presented. The improvements are geared towards better interaction between the software and the ECT analyst through a better and more fully featured user interface, while keeping industry-standard signal display norms intact to preserve familiarity and ease the transition to the next generation of EddyOne. The improvements presented in this paper thus ease the transition to the new software by reducing training requirements both for existing analysts and for new analysts coming into the industry. Further, by utilizing modern technologies, the next generation of the software is able to further reduce maintenance and deployment costs of the whole system in the years to come. (author)

79

Input language in software generation system for film data processing  

International Nuclear Information System (INIS)

The input language constructed in the course of developing the software generation system for film data processing at JINR is described. The language has made it possible to considerably simplify work within the system framework and to standardize the calling of any program package included in the system. The input language is intended for users who are not programmers. A detailed description of the main directives of the language and their parameters is presented. The directives are of three types: input/output, descriptive and initiation. A particular feature of the language is that it can be used in both batch and interactive modes; this has been realized by means of the extensive capabilities of the CCL language provided by the CDC-6500 system software.

80

Web Style Guide, 2nd Edition  

Science.gov (United States)

The Web Style Guide, 2nd Edition, which is the online version of a book with the same name, demonstrates the step-by-step process involved in designing a Web site. Visitors are assumed to be familiar with whatever Web publishing tool they are using. The guide gives few technical details but instead focuses on the usability, layout, and attractiveness of a Web site, with the goal being to make it as popular with the intended audience as possible. Considerations such as graphics, typography, and multimedia enhancements are discussed. Web site structure, fine-tuned features on individual pages, and almost everything in between is addressed by the guide, making it a handy resource for people who place great importance on the effectiveness of their online creations.

 
 
 
 
81

2nd International Conference on Pattern Recognition  

CERN Document Server

This book contains the extended and revised versions of a set of selected papers from the 2nd International Conference on Pattern Recognition (ICPRAM 2013), held in Barcelona, Spain, from 15 to 18 February, 2013. ICPRAM was organized by the Institute for Systems and Technologies of Information, Control and Communication (INSTICC) and was held in cooperation with the Association for the Advancement of Artificial Intelligence (AAAI). The hallmark of this conference was to encourage theory and practice to meet in a single venue. The focus of the book is on contributions describing applications of Pattern Recognition techniques to real-world problems, interdisciplinary research, experimental and/or theoretical studies yielding new insights that advance Pattern Recognition methods.

Marsico, Maria

2015-01-01

82

Evaluation of the efficiency and reliability of software generated by code generators  

Science.gov (United States)

There are numerous studies which show that CASE Tools greatly facilitate software development. As a result of these advantages, an increasing amount of software development is done with CASE Tools. As more software engineers become proficient with these tools, their experience and feedback lead to further development with the tools themselves. What has not been widely studied, however, is the reliability and efficiency of the actual code produced by the CASE Tools. This investigation considered these matters. Three segments of code generated by MATRIXx, one of many commercially available CASE Tools, were chosen for analysis: ETOFLIGHT, a portion of the Earth to Orbit Flight software, and ECLSS and PFMC, modules for Environmental Control and Life Support System and Pump Fan Motor Control, respectively.

Schreur, Barbara

1994-01-01

83

Software para la Evaluación y Selección de Generadores Independientes / Independent Generator Evaluation and Selection Software  

Scientific Electronic Library Online (English)

Full Text Available SciELO Cuba | Language: Spanish. Abstract (translated from Spanish): In many industries, buildings, enterprises and services, electric power must be generated independently of the National Electric System for production, emergency or reliability reasons. In other cases, power plants are needed that can operate in islanded mode, feeding a particular circuit. To make the most economical and efficient selection of the capacity to be installed, one must consider not only the behaviour of the system into which the unit will be inserted, but also the operational characteristics of the generator under load fluctuations, its operating limits and the resulting stability. This paper presents software that supports the most suitable selection by considering the generator capability curve and its stability with respect to the particular characteristics of the system. With its application it is possible to reduce the costs and economic losses caused by an inappropriate selection with an oversized or underutilized machine. As a case study, its application to a 2500 kVA power plant of a food-packaging factory is presented.

Marcos Alberto, de Armas Teyra; Miguel, Castro Fernández.

2014-04-01

85

Advanced Chemistry Collection, 2nd Edition  

Science.gov (United States)

Software requirements are given in Table 3. Some programs have additional special requirements. Please see the individual program abstracts at JCE Online or the documentation included on the CD-ROM for more specific information. Table 3. General software requirements for the Advanced Chemistry Collection: Mac OS compatible computers require System 7.6.1 or higher and Acrobat Reader (included); one or more programs also require Mathcad, Mathematica, MacMolecule2, QuickTime 4, or HyperCard Player. Windows compatible computers require Windows 2000, 98, 95, or NT 4 and Acrobat Reader (included); one or more programs also require Mathcad, Mathematica, PCMolecule2, QuickTime 4, HyperChem, or Excel. Literature Cited: General Chemistry Collection, 5th ed.; J. Chem. Educ. Software, 2001, SP16. Advanced Chemistry Collection; J. Chem. Educ. Software, 2001, SP28.

2001-11-01

86

2nd International Arctic Ungulate Conference  

Directory of Open Access Journals (Sweden)

Full Text Available The 2nd International Arctic Ungulate Conference was held 13-17 August 1995 on the University of Alaska Fairbanks campus. The Institute of Arctic Biology and the Alaska Cooperative Fish and Wildlife Research Unit were responsible for organizing the conference with assistance from biologists with state and federal agencies and commercial organizations. David R. Klein was chair of the conference organizing committee. Over 200 people attended the conference, coming from 10 different countries. The United States, Canada, and Norway had the largest representation. The conference included invited lectures, panel discussions, and about 125 contributed papers. There were five technical sessions on Physiology and Body Condition; Habitat Relationships; Population Dynamics and Management; Behavior, Genetics and Evolution; and Reindeer and Muskox Husbandry. Three panel sessions discussed Comparative caribou management strategies; Management of introduced, reestablished, and expanding muskox populations; and Health risks in translocation of arctic ungulates. Invited lectures focused on the physiology and population dynamics of arctic ungulates; contaminants in food chains of arctic ungulates and lessons learned from the Chernobyl accident; and ecosystem level relationships of the Porcupine Caribou Herd.

A. Anonymous

1996-01-01

87

Strategic Scene Generation Model: baseline and operational software  

Science.gov (United States)

The Strategic Defense Initiative (SDI) must simulate the detection, acquisition, discrimination and tracking of anticipated targets and predict the effect of natural and man-made background phenomena on optical sensor systems designed to perform these tasks. NRL is developing such a capability using a computerized methodology to provide modeled data in the form of digital realizations of complex, dynamic scenes. The Strategic Scene Generation Model (SSGM) is designed to integrate state-of-science knowledge, data bases and computerized phenomenology models to simulate strategic engagement scenarios and to support the design, development and test of advanced surveillance systems. Multi-phenomenology scenes are produced from validated codes--thereby serving as a traceable standard against which different SDI concepts and designs can be tested. This paper describes the SSGM design architecture, the software modules and databases which are used to create scene elements, the synthesis of deterministic and/or stochastic structured scene elements into composite scenes, the software system to manage the various databases and digital image libraries, and verification and validation by comparison with empirical data. The focus will be on the functionality of the SSGM Phase II Baseline Model (SSGMB), whose implementation is complete. Recent enhancements for Theater Missile Defense will also be presented, as will the development plan for the SSGM Phase III Operational Model (SSGMO), whose development has just begun.

Heckathorn, Harry M.; Anding, David C.

1993-08-01

88

A NEO population generation and observation simulation software tool  

Science.gov (United States)

One of the main targets of ESA's Space Situational Awareness (SSA) program is to build a wide knowledge base about objects that can potentially harm Earth (Near-Earth Objects, NEOs). An important part of this effort is to create the Small Bodies Data Centre (SBDC) which is going to aggregate measurement data from a fully-integrated NEO observation sensor network. Until this network is developed, artificial NEO measurement data is needed in order to validate SBDC algorithms. Moreover, to establish a functioning NEO observation sensor network, it has to be determined where to place sensors, what technical requirements have to be met in order to be able to detect NEOs and which observation strategies work the best. Because of this, sensor simulation software was needed. This paper presents a software tool which allows users to create and analyse NEO populations and to simulate and analyse population observations. It is a console program written in Fortran and comes with a Graphical User Interface (GUI) written in Java and C. The tool is divided into two components, the "Population Generator" and the "Observation Simulator". The Population Generator component is responsible for generating and analysing a NEO population. Users can choose between creating fictitious (random) and synthetic populations. The latter are based on one of two models describing the orbital and size distribution of observed NEOs: the existing so-called "Bottke Model" (Bottke et al. 2000, 2002) and the new "Granvik Model" (Granvik et al. 2014, in preparation) which has been developed in parallel to the tool. Generated populations can be analysed by defining 2D, 3D and scatter plots using various NEO attributes. As a result, the tool creates the appropriate files for the plotting tool "gnuplot". The tool's Observation Simulator component yields the Observation Simulation and Observation Analysis functions. Users can define sensor systems using ground- or space-based locations as well as optical or radar sensors and simulate observation campaigns. The tool outputs field-of-view crossings and actual detections of the selected NEO population objects. Using the Observation Analysis users are able to process and plot the results of the Observation Simulation. In order to enable end-users to handle the tool in a user-intuitive and comfortable way, a GUI has been created based on the modular Eclipse Rich Client Platform (RCP) technology. Through the GUI users can easily enter input data for the tool, execute it and view its output data in a clear way. Additionally, the GUI runs gnuplot to create plot pictures and presents them to the user. Furthermore, users can create projects to organise executions of the tool.

Müller, Sven; Gelhaus, Johannes; Hahn, Gerhard; Franco, Raffaella

89

Automatic Question Generation Using Software Agents for Technical Institutions  

Directory of Open Access Journals (Sweden)

Full Text Available In the attempt of producing quality graduates, a major factor that needs a considerable amount of attention is the institutions evaluation system. Now, in order to produce quality result their examination system must be very effective, questions must be able to assess students in every domain. Preparation of question paper with high standard that really kindles the student’s thinking ability is very challengeable task that need to be performed by the academicians. There arises a need for automatic question generation. For all the existing automatic question generating systems, the problem lies either in eliminating the user’s role or in developing factual questions based on Bloom’s taxonomy. To overcome these issues, in the proposed system, the focus is to take input in form of a text file from user which contains of the text upon which the user desires to fetch questions; the output is produced in form of a text file containing questions based on Bloom’s taxonomy. The entire process is carried out by software agents, which eliminates the major problems of existing systems.

Shivank Pandey,

2013-12-01

90

2nd International technical meeting on small reactors  

International Nuclear Information System (INIS)

The 2nd International Technical Meeting on Small Reactors was held on November 7-9, 2012 in Ottawa, Ontario. The meeting was hosted by Atomic Energy of Canada Limited (AECL) and Canadian Nuclear Society (CNS). There is growing international interest and activity in the development of small nuclear reactor technology. This meeting provided participants with an opportunity to share ideas and exchange information on new developments. This Technical Meeting covered topics of interest to designers, operators, researchers and analysts involved in the design, development and deployment of small reactors for power generation and research. A special session focussed on small modular reactors (SMR) for generating electricity and process heat, particularly in small grids and remote locations. Following the success of the first Technical Meeting in November 2010, which captured numerous accomplishments of low-power critical facilities and small reactors, the second Technical Meeting was dedicated to the achievements, capabilities, and future prospects of small reactors. This meeting also celebrated the 50th Anniversary of the Nuclear Power Demonstration (NPD) reactor which was the first small reactor (20 MWe) to generate electricity in Canada.

91

Elements of the Next Generation Science Standards' (NGSS) New Framework for K-12 Science Education aligned with STEM designed projects created by Kindergarten, 1st and 2nd grade students in a Reggio Emilio project approach setting  

Science.gov (United States)

This paper examines how elements of the Next Generation Science Standards' (NGSS) New Framework for K-12 Science Education standards (National Research Council 2011)---specifically the cross-cutting concept "cause and effect" are aligned with early childhood students' creation of projects of their choice. The study took place in a Reggio Emilio-inspired, K-12 school, in a multi-aged kindergarten, first and second grade classroom with 14 students. Students worked on their projects independently with the assistance of their peers and teachers. The students' projects and the alignment with the Next Generation Science Standards' New Framework were analyzed by using pre and post assessments, student interviews, and discourse analysis. Results indicate that elements of the New Framework for K-12 Science Education emerged through students' project presentation, particularly regarding the notion of "cause and effect". More specifically, results show that initially students perceived the relationship between "cause and effect" to be negative.

Facchini, Nicole

92

GENESIS: Agile Generation of Information Management Oriented Software  

Directory of Open Access Journals (Sweden)

Full Text Available The specification for an information system can be clear from the beginning: it must acquire, display, query and modify data, using a database. The main issue is to decide which information to manage. In the case originating this work, the information was always evolving, even up to the end of the project. This implies the construction of a new system each time the information is redefined. This article presents Genesis, an agile development infrastructure, and proposes an approach for the immediate construction of the required information systems. Experts describe their information needs and queries, and Genesis generates the corresponding application, with the appropriate graphical interfaces and database.

Juan Erasmo Gómez

2010-06-01

93

Pregnancy - 2nd Trimester: Questions to Discuss with Your Doctor  

Science.gov (United States)

... Pregnancy: 2nd Trimester Questions to Discuss with Your Doctor: How do you feel? Have you had any ... planning to breast-feed or bottle-feed? Your Doctor Might Examine the Following Body Structures or Functions: ...

94

The PCMDI software and the next generation internet project  

Energy Technology Data Exchange (ETDEWEB)

One problem facing many scientists is not the absence of tools to analyze data, but rather a shortage of interrelated diagnostic software that is consistent, flexible, portable, adaptable, efficient, sharable, and easy to use. Consequently, many scientists are writing their own programs to ingest, manipulate and display data. Debugging and enhancing special purpose software diverts time that otherwise would be spent on research. The result is often not "friendly", reusable, or portable, nor does it promote standards within the research community. In response to the needs of the scientific community, PCMDI has developed a suite of software tools for the storage, diagnosis, and visualization of data. PCMDI's principal tools are the Climate Data Analysis Tool (CDAT), the Climate Database Management System (CDMS), and the Visualization and Computation System (VCS). The design goal of this suite of software is to reduce the redundancy encountered so often in scientific analysis and to allow researchers to concentrate on their science. One obstacle to sharing analysis software is the wide variety of data file formats that are in use. Programs must be written to convert data to a user's preferred file format and conventions. This data conversion requires additional expenditure of efforts on testing and quality assurance. Modular and interrelated software performs such tasks transparently.

Potter, G L; Williams, D N

1999-10-15

95

Generation and Optimization of Test cases for Object-Oriented Software Using State Chart Diagram  

CERN Document Server

The process of testing any software system is an enormous task which is time consuming and costly. The time and effort required to do sufficient testing grow as the size and complexity of the software grow, which may cause project budget overruns, delays in the development of the software system, or leave some test cases uncovered. During the SDLC (software development life cycle), the software testing phase generally takes around 40-70% of the time and cost. State-based testing is frequently used in software testing. Test data generation is one of the key issues in software testing. A properly generated test suite may not only locate the errors in a software system, but also help in reducing the high cost associated with software testing. It is often desired that test data in the form of test sequences within a test suite can be automatically generated to achieve the required test coverage. This paper proposes an optimization approach to test data generation for state-based software testing. In this paper, ...

Swain, Ranjita Kumari; Mohapatra, Durga Prasad

2012-01-01

96

Basis Principles of Software Development for Eddy Current Inspection of PWR/WWER Steam Generator Tubes  

International Nuclear Information System (INIS)

Extensive inspection of PWR/WWER steam generators, together with the development of its own designs of eddy current inspection systems including manipulators, push-pullers, controllers, probes, etc., influenced INETEC's decision to start developing its own software for EC inspections. In the last year remarkable results were obtained. The main software packages were finished, with increased possibilities compared to other software available on the world market. In this article some basic principles of EC NDT software development are described, including organizational aspects of the software team, a description of the tasks and a description of the main achievements. Associated problems and future development directions are also discussed. (author)

97

2nd Generation Airborne Precipitation Radar (APR-2)  

Science.gov (United States)

Dual-frequency operation with Ku-band (13.4 GHz) and Ka-band (35.6 GHz). Geometry and frequencies chosen to simulate the GPM radar. Measures reflectivity at co- and cross-polarizations, and Doppler. Range resolution is approx. 60 m. Horizontal resolution at the surface is approx. 1 km. Reflectivity calibration is within 1.5 dB, based on a 10 deg sigma-0 at Ku-band and Mie scattering calculations in light rain at Ka-band. LDR measurements are reliable to near -20 dB; LDR lower than this is likely contaminated by system cross-polarization isolation. Velocity is motion-corrected total Doppler, including particle fall speed. Aliasing can be seen in some places; it can usually be dealiased with an algorithm.

Durden, S.; Tanelli, S.; Haddad, Z.; Im, E.

2012-01-01

98

2nd Workshop Energy Generation of PV Systems 2014, 29 ...  

Sep 5, 2014 ... EERA Project Development · EERA Joint Programme Details ... Matt Black (Senior Analyst, Foresight Group): Due Diligence requirements from a UK operator's point of view ... Felix Holz (Vice President Experten Team Greentech, Deutsche ... Tony Sample (Joint Research Centre ESTI): Qualification testing ...

99

2nd Generation Reusable Launch Vehicle (2G RLV). Revised  

Science.gov (United States)

This is a revised final report and addresses all of the work performed on this program. Specifically, it covers vehicle architecture background, definition of six baseline engine cycles, reliability baseline (space shuttle main engine QRAS), and component level reliability/performance/cost for the six baseline cycles, and selection of 3 cycles for further study. This report further addresses technology improvement selection and component level reliability/performance/cost for the three cycles selected for further study, as well as risk reduction plans, and recommendation for future studies.

Matlock, Steve; Sides, Steve; Kmiec, Tom; Arbogast, Tim; Mayers, Tom; Doehnert, Bill

2001-01-01

100

Vendors boot up a new generation of mining software  

Energy Technology Data Exchange (ETDEWEB)

From complete mine management down to the smallest detail of process or machine design, the latest programs are claimed to process and display data easier, faster, and more accurately. MINExpo 2008, held in Las Vegas in September, offered a bonanza of information on everything from complete mine management to specialized financial modules. Gemcom Software introduced new versions of five of its key products - Surpac 6.1, GEMS 6.2, Minex 5.3, MineSched 6.0 and Whittle 4.2. Leica Geosystems launched its Jigsaw 360 mine management suite, encompassing GPS navigation. Products launched by Modular Mining Systems, Carlson Software, DEM Solutions and Logimine are also mentioned. 2 figs.

Carter, R.A.

2008-10-15

101

The 2nd colloquium on process simulation. Computational fluid dynamics coupled with chemical kinetics, combustion and thermodynamics  

Energy Technology Data Exchange (ETDEWEB)

The articles collected in this volume were presented at the 2nd Colloquium on Process Simulation held at Helsinki University of Technology, Espoo, Finland, June 6-8, 1995. The processes for producing chemicals, energy, and materials encounter environmental concern and laws which challenge engineers to develop the processes towards more efficient, economical and safe operation. A more thorough understanding of the processes and phenomena involved is necessary. Formerly, the development of the processes was largely based on trial and error, whereas today, the development of computer performance together with the diversification of modelling software enables simulation of the processes. The increased capacity and possibilities for modelling the processes brought by the improved hardware and software, have generated a strong demand for more accurate mathematical descriptions of the processes. Especially, the coupling of computational fluid dynamics and chemical kinetics, combustion, and thermodynamics is of current interest in process oriented technology. This colloquium attempts to give examples of modelling efforts in operation in different universities, research institutes and companies

Jokilaakso, A. [ed.

1995-09-01

102

2nd ISPRA nuclear electronics symposium, Stresa, Italy May 20-23, 1975  

International Nuclear Information System (INIS)

Two round tables were annexed to the 2nd Ispra Nuclear Electronics Symposium. The first one was concerned with software support for the implementation of microprocessors, MOS and bipolar microprocessors, environmental data systems, and the use of microprocessors and minicomputers in nuclear, biomedical and environmental fields. The future of nuclear electronics and its diversification, gravitational waves and electronics, and environmental measurements of air and water quality were discussed during the second round table, and relevant views were brought out during the discussion on the extension of nuclear electronics techniques to other fields

103

On-line software for the new generation experiments: The model based OBELIX on-line software  

International Nuclear Information System (INIS)

The increasing complexity and scale of modern experiments place major requirements on the on-line systems needed to operate them. The MODEL software has been developed by the Online Computing Group of the CERN Data Handling Division to provide an environment for data acquisition software. LEP and the new generation of experiments were the target of the project. The Obelix on-line system has been developed on the basis of the MODEL framework. We report our experience in the system development and integration

104

Application of a path sensitizing method on automated generation of test specifications for control software  

International Nuclear Information System (INIS)

An automated generation method for test specifications has been developed for sequential control software in plant control equipment. Sequential control software can be represented as sequential circuits, and the control software implemented in control equipment is designed from these circuit diagrams. In logic tests of VLSIs, path sensitizing methods are widely used to generate test specifications, but such a method generates test specifications for a single time frame only and cannot be directly applied to sequential control software. The basic idea of the proposed method is as follows. The specifications of each logic operator in the diagrams are defined in the software design process. Therefore, test specifications for each operator in the control software can be determined from these specifications, and the validity of the software can be judged by inspecting all of the operators in the logic circuit diagrams. Candidates for sensitized paths, along which test data for each operator propagate, can be generated by the path sensitizing method. To confirm the feasibility of the method, it was experimentally applied to control software in digital control equipment. The program could generate test specifications exactly, and the feasibility of the method was confirmed. (orig.) (3 refs., 7 figs.)
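
The idea of sensitizing a path so that a fault on an internal signal becomes visible at the output can be illustrated with a toy example. The three-gate circuit, the stuck-at fault model, and the brute-force search below are assumptions for illustration only; they are not the method applied to the plant control software.

```python
# Toy illustration of the idea behind path sensitisation: find input
# vectors that make a stuck-at fault on an internal signal visible at the
# output. The three-gate circuit and brute-force search are assumptions
# for illustration; a real tool would sensitise paths symbolically.
from itertools import product

def circuit(a, b, c, fault=None):
    """y = (a AND b) OR (NOT c); 'fault' forces signal n1 to 0 or 1."""
    n1 = a & b
    if fault is not None:
        n1 = fault                # stuck-at fault injected on n1
    n2 = 1 - c
    return n1 | n2

def tests_for_stuck_at(value):
    """Return all input vectors that detect n1 stuck-at 'value'."""
    detected = []
    for a, b, c in product((0, 1), repeat=3):
        if circuit(a, b, c) != circuit(a, b, c, fault=value):
            detected.append((a, b, c))
    return detected

if __name__ == "__main__":
    print("n1 stuck-at-0 detected by:", tests_for_stuck_at(0))
    print("n1 stuck-at-1 detected by:", tests_for_stuck_at(1))
```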

105

Development of a Prototype Automation Simulation Scenario Generator for Air Traffic Management Software Simulations  

Science.gov (United States)

A technique for automated development of scenarios for use in the Multi-Center Traffic Management Advisor (McTMA) software simulations is described. The resulting software is designed and implemented to automate the generation of simulation scenarios with the intent of reducing the time it currently takes using an observational approach. The software program is effective in achieving this goal. The scenarios created for use in the McTMA simulations are based on data taken from data files from the McTMA system, and were manually edited before incorporation into the simulations to ensure accuracy. Despite the software's overall favorable performance, several key software issues are identified. Proposed solutions to these issues are discussed. Future enhancements to the scenario generator software may address the limitations identified in this paper.

Khambatta, Cyrus F.

2007-01-01

106

Thermoluminescent characteristics of ZrO2:Nd films  

International Nuclear Information System (INIS)

This work presents the results obtained from analysing the photoluminescent and thermoluminescent characteristics of neodymium-activated zirconium oxide (ZrO2:Nd) and its possible application in UV radiation dosimetry. The experiments were aimed at studying characteristics such as the optimum thermal erasing treatment, the influence of light on the response, the response as a function of wavelength, the fading of the information, the effect of temperature, the response as a function of time, and the reproducibility of the response. The results show that ZrO2:Nd is a promising material for use as a TL dosemeter for UV radiation. (Author)

107

Using Automatic Code Generation in the Attitude Control Flight Software Engineering Process  

Science.gov (United States)

This paper presents an overview of the attitude control subsystem flight software development process, identifies how the process has changed due to automatic code generation, analyzes each software development phase in detail, and concludes with a summary of our lessons learned.

McComas, David; O'Donnell, James R., Jr.; Andrews, Stephen F.

1999-01-01

108

AMON: A Software System for Automatic Generation of Ontology Mappings  

Digital Repository Infrastructure Vision for European Research (DRIVER)

Some of the most outstanding problems in Computer Science (e.g. access to heterogeneous information sources, use of different e-commerce standards, ontology translation, etc.) are often approached through the identification of ontology mappings. A manual mapping generation slows down, or even makes unfeasible, the solution of particular cases of the aforementioned problems via ontology mappings. Some algorithms and formal models for partial tasks of automatic generation of mappings have been ...

Sánchez-Alberca, A.; García-García, R.; Sorzano, C. O. S.; Gutiérrez-Cossío, Celia; Chagoyen, Mónica; Fernández López, Mariano

2005-01-01

109

Automatically generated acceptance test: A software reliability experiment  

Science.gov (United States)

This study presents results of a software reliability experiment investigating the feasibility of a new error detection method. The method can be used as an acceptance test and is solely based on empirical data about the behavior of internal states of a program. The experimental design uses the existing environment of a multi-version experiment previously conducted at the NASA Langley Research Center, in which the launch interceptor problem is used as a model. This allows the controlled experimental investigation of versions with well-known single and multiple faults, and the availability of an oracle permits the determination of the error detection performance of the test. Fault interaction phenomena are observed that have an amplifying effect on the number of error occurrences. Preliminary results indicate that all faults examined so far are detected by the acceptance test. This shows promise for further investigations, and for the employment of this test method on other applications.
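
A minimal sketch of the underlying idea, learning an empirical envelope of internal-state values from reference runs and flagging departures from it, is shown below. The state representation, tolerance, and data are assumed for illustration and do not reproduce the experiment's design.

```python
# Minimal sketch of an empirical acceptance test on internal program
# states: learn per-variable bounds from runs believed to be correct,
# then reject a new run whose recorded states leave those bounds. The
# dict-of-floats state format and the simple min/max envelope are
# illustrative assumptions.
def learn_envelope(reference_runs):
    """reference_runs: list of runs; each run is a list of state dicts."""
    envelope = {}
    for run in reference_runs:
        for state in run:
            for var, value in state.items():
                lo, hi = envelope.get(var, (value, value))
                envelope[var] = (min(lo, value), max(hi, value))
    return envelope

def acceptance_test(run, envelope, tolerance=0.0):
    """Return list of (step, variable, value) violations for one run."""
    violations = []
    for step, state in enumerate(run):
        for var, value in state.items():
            if var not in envelope:
                continue
            lo, hi = envelope[var]
            if value < lo - tolerance or value > hi + tolerance:
                violations.append((step, var, value))
    return violations

if __name__ == "__main__":
    good_runs = [[{"x": 1.0, "v": 0.5}, {"x": 1.5, "v": 0.4}],
                 [{"x": 0.9, "v": 0.6}, {"x": 1.4, "v": 0.5}]]
    env = learn_envelope(good_runs)
    suspect = [{"x": 1.2, "v": 0.5}, {"x": 9.9, "v": 0.4}]
    print(acceptance_test(suspect, env))   # flags the out-of-range x value
```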

Protzel, Peter W.

1988-01-01

110

Learning from examples - Generation and evaluation of decision trees for software resource analysis  

Science.gov (United States)

A general solution method for the automatic generation of decision (or classification) trees is investigated. The approach is to provide insights through in-depth empirical characterization and evaluation of decision trees for software resource data analysis. The trees identify classes of objects (software modules) that had high development effort. Sixteen software systems ranging from 3,000 to 112,000 source lines were selected for analysis from a NASA production environment. The collection and analysis of 74 attributes (or metrics), for over 4,700 objects, captured information about the development effort, faults, changes, design style, and implementation style. A total of 9,600 decision trees were automatically generated and evaluated. The trees correctly identified 79.3 percent of the software modules that had high development effort or faults, and the trees generated from the best parameter combinations correctly identified 88.4 percent of the modules on the average.
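
As a rough, modern illustration of classifying modules by development effort with a decision tree, the sketch below uses scikit-learn on a handful of invented module metrics; the original study used its own tree-generation machinery and a far larger NASA data set.

```python
# Sketch of classifying software modules as high- or low-effort with a
# decision tree. The metric names and the tiny data set are invented for
# illustration, and scikit-learn stands in for the tree-generation
# machinery used in the original study.
from sklearn.tree import DecisionTreeClassifier, export_text

FEATURES = ["source_lines", "num_changes", "fan_out"]

# Each row: metrics for one module; label 1 = high development effort.
X = [
    [12000, 45, 30],
    [  800,  3,  4],
    [ 5600, 20, 15],
    [  300,  1,  2],
    [ 9500, 38, 22],
    [ 1500,  5,  6],
]
y = [1, 0, 1, 0, 1, 0]

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(export_text(tree, feature_names=FEATURES))

# Predict for a new module with hypothetical metrics.
print(tree.predict([[7000, 25, 18]]))   # expected to classify as high effort
```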

Selby, Richard W.; Porter, Adam A.

1988-01-01

111

A Novel Scheme to Design Software-Controllable Vector Microwave Signal Generator and its Application  

Directory of Open Access Journals (Sweden)

Full Text Available With the rapid development of wireless communications, there will be many communication standards in the future, which may cost much to buy the corresponding vector microwave signal generator. Hence, this study investigated a novel vector microwave signal generation method, which modeled the vector baseband signal by the CAD software (Agilent ADS and then control the conventional microwave signal generation hardware to output vector microwave signals. Compared with the specified vector microwave signal generator developed by Agilent, Anritsu, etc., our software-controllable microwave signal source is cheaper, more flexible and more convenient. Moreover, as an application of our method, we model and realize the TD-SCDMA baseband signal corrupted by multipath channel and Additive White Gaussian Noise (AWGN in ADS software and then control the hardware (Agilent E4432B to generate the TD-SCDMA microwave signals. The measurements of the TD-SCDMA microwave signals approve the validity of our method.

L. Meng

2010-01-01

112

Technical Issues Map for the NHI System Interface and Support Systems Area: 2nd Quarter FY07  

Energy Technology Data Exchange (ETDEWEB)

This document provides a mapping of technical issues associated with development of the Next Generation Nuclear Plant (NGNP) intermediate heat transport loop and nuclear hydrogen plant support systems to the work that has been accomplished or is currently underway in the 2nd quarter of FY07.

Steven R. Sherman

2007-03-01

113

Custom software for third generation optical computed tomography  

International Nuclear Information System (INIS)

The advanced radiotherapy techniques based on increasingly complex radiation delivery methods necessitate verification of the computer-calculated dose distribution by an accurate dosimetric method. The only emergent candidate with a true three-dimensional nature is the gel dosimeter. In spite of the three-dimensional nature of Optical Computed Tomography (OCT), most work so far has only used two-dimensional evaluations of the three-dimensional data set. Recently this limitation has been overcome by applying the cone-beam CT imaging principle to optical imaging. A CCD camera based OCT scanner was set up for gel dosimetry using the geometry suggested by J Wolodzko et al. and J Kevin. In an earlier work by the co-author, the iradon function in Matlab was used for reconstruction of mid-slices, assuming the cone angle to be negligible. In this study, software was developed based on the Feldkamp, Davis and Kress (FDK) algorithm to reconstruct the images for the Optical Computed Tomography scanner. Results are compared with the previous work of the co-author
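
A full FDK implementation is beyond a short example, but the filter-then-backproject core it extends can be sketched in two dimensions. The parallel-beam geometry, ramp filter, and scaling below are simplifying assumptions; FDK adds cosine pre-weighting of each cone-beam projection and a distance-weighted 3-D backprojection.

```python
# Simplified 2-D parallel-beam filtered backprojection, shown only to
# illustrate the "ramp-filter then backproject" core that FDK extends to
# cone-beam geometry. Detector geometry and scaling are simplified
# assumptions.
import numpy as np

def fbp_reconstruct(sinogram, angles_deg):
    """sinogram: (num_angles, num_detectors) array of line integrals."""
    n_ang, n_det = sinogram.shape
    # Ramp filter applied along the detector axis in the Fourier domain
    ramp = np.abs(np.fft.fftfreq(n_det))
    filtered = np.real(np.fft.ifft(np.fft.fft(sinogram, axis=1) * ramp, axis=1))
    # Backproject each filtered view over the image grid
    recon = np.zeros((n_det, n_det))
    coords = np.arange(n_det) - n_det / 2.0
    X, Y = np.meshgrid(coords, coords)
    for view, theta in zip(filtered, np.deg2rad(angles_deg)):
        t = X * np.cos(theta) + Y * np.sin(theta) + n_det / 2.0
        t0 = np.clip(np.floor(t).astype(int), 0, n_det - 2)
        w = np.clip(t - t0, 0.0, 1.0)
        recon += (1.0 - w) * view[t0] + w * view[t0 + 1]   # linear interpolation
    return recon * np.pi / n_ang

if __name__ == "__main__":
    # Tiny smoke test: a constant sinogram reconstructs a radially smooth blob.
    demo = fbp_reconstruct(np.ones((180, 64)), np.arange(180))
    print(demo.shape)
```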

114

2nd International Conference on Green Communications and Networks 2012  

CERN Document Server

The objective of the 2nd International Conference on Green Communications and Networks 2012 (GCN 2012) is to facilitate an exchange of information on best practices for the latest research advances in the area of communications, networks and intelligence applications. These mainly involve computer science and engineering, informatics, communications and control, electrical engineering, information computing, and business intelligence and management. Proceedings of the 2nd International Conference on Green Communications and Networks 2012 (GCN 2012) will focus on green information technology and applications, which will provide in-depth insights for engineers and scientists in academia, industry, and government. The book addresses the most innovative research developments including technical challenges, social and economic issues, and presents and discusses the authors’ ideas, experiences, findings, and current projects on all aspects of advanced green information technology and applications. Yuhang Yang is ...

Ma, Maode; GCN 2012

2013-01-01

115

2nd SWITCH-Asia Networking Event - Replicating sustainable SMEs in Asia  

2nd SWITCH-Asia Networking Event - Replicating sustainable SMEs in Asia. All SWITCH-Asia projects promote sustainable consumption or ... The 2nd Networking Event started with an afternoon of interactive sessions to encourage synergies and collaboration between SWITCH-Asia projects. The team leader ...

116

A computer-based physics laboratory apparatus: Signal generator software  

Science.gov (United States)

This paper describes a computer-based physics laboratory apparatus to replace expensive instruments such as high-precision signal generators. This apparatus uses a sound card in a common personal computer to give sinusoidal signals with an accurate frequency that can be programmed to give different frequency signals repeatedly. An experiment on standing waves on an oscillating string uses this apparatus. In conjunction with interactive lab manuals, which have been developed using personal computers in our university, we achieve a complete set of low-cost, accurate, and easy-to-use equipment for teaching a physics laboratory.
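
A minimal sketch of such a sound-card signal generator is shown below: it synthesizes a sine tone of programmable frequency and writes it as a 16-bit WAV file that any sound card can play. The frequencies, duration, and file names are assumptions for illustration.

```python
# Minimal sketch of a sound-card signal generator: synthesise a sine tone
# of programmable frequency and save it as a 16-bit WAV that a standard
# sound card can play. Frequencies, duration and file names are assumed
# for illustration; a standing-wave experiment would step through a list
# of frequencies.
import math
import struct
import wave

def write_sine_wav(path, freq_hz, duration_s=2.0, rate=44100, amplitude=0.8):
    n_samples = int(duration_s * rate)
    with wave.open(path, "wb") as wav:
        wav.setnchannels(1)
        wav.setsampwidth(2)          # 16-bit samples
        wav.setframerate(rate)
        frames = bytearray()
        for n in range(n_samples):
            value = amplitude * math.sin(2.0 * math.pi * freq_hz * n / rate)
            frames += struct.pack("<h", int(value * 32767))
        wav.writeframes(bytes(frames))

if __name__ == "__main__":
    for f in (110.0, 220.0, 440.0):             # assumed test frequencies
        write_sine_wav(f"sine_{int(f)}Hz.wav", f)
```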

Thanakittiviroon, Tharest; Liangrocapart, Sompong

2005-09-01

117

FACTORS GENERATING RISKS DURING REQUIREMENT ENGINEERING PROCESS IN GLOBAL SOFTWARE DEVELOPMENT ENVIRONMENT  

Directory of Open Access Journals (Sweden)

Full Text Available The challenges of Requirements Engineering become greater when it is performed in the global software development paradigm. There can be many reasons behind this challenging nature. "Risks" can be one of them, as there is more risk exposure in the global development paradigm, so risk may be one of the main reasons making Requirements Engineering more challenging. For this, there is first a need to identify the factors which actually generate these risks. This paper therefore not only identifies the factors, but also the risks which these factors may generate. A systematic literature review was done to identify these factors and the risks which may occur during the requirements engineering process in the global software development paradigm. The resulting list leads to progressive enhancement for assisting requirements engineering activities in the global software development paradigm. This work is especially useful for less experienced people working in global software development.

Huma Hayat Khan

2014-03-01

118

Mongolia 360°, 2nd Land Art Biennial, Creating Identities.  

Digital Repository Infrastructure Vision for European Research (DRIVER)

LAM 360º – 2nd Land Art Mongolia Biennial curated by Anna Brietzke, Orna Tsultem, Fumio Nanjo Locations: Ikh Gazriin Chuluu (Dundgobi) (45°29'33.24"N, 107°13'28.50"E) and National Mongolian Modern Art Gallery, Ulaanbataar, Mongolia. International site specific visual art event in the Mongolian Gobi Desert, a seminar on Art and Politics and an exhibition of documents and artifacts related to the works produced in the Gobi desert at Ikh Gazriin Chuluu.

Macleod, Anna

2012-01-01

119

THR Simulator – the software for generating radiographs of THR prosthesis  

Directory of Open Access Journals (Sweden)

Full Text Available Abstract Background Measuring the orientation of the acetabular cup after total hip arthroplasty is important for prognosis. The verification of these measurement methods will be easier and more feasible if we can synthesize prosthesis radiographs under each simulated condition. One reported method used an expensive mechanical device with an indeterminable precision. We therefore developed a program, THR Simulator, to directly synthesize digital radiographs of prostheses for further analysis. Under the Windows platform and using the Borland C++ Builder programming tool, we developed the THR Simulator. We first built a mathematical model of the acetabulum and femoral head. The data of the real dimensions of the prosthesis were adopted to generate the radiograph of the hip prosthesis. Then, with a ray tracing algorithm, we calculated the thickness each X-ray beam passed through, and transformed it to grey scale by a mapping function which was derived by fitting an exponential function to the phantom image. Finally we could generate a simulated radiograph for further analysis. Results Using THR Simulator, the users can incorporate many parameters together for radiograph synthesis. These parameters include thickness, film size, tube distance, film distance, anteversion, abduction, upper wear, medial wear, and posterior wear. These parameters are adequate for any radiographic measurement research. This THR Simulator has been used in two studies, and the errors are within 2° for anteversion and 0.2 mm for wear measurement. Conclusion We designed a program, THR Simulator, that can synthesize prosthesis radiographs. Such a program can be applied in future studies for further analysis and validation of measurement of various parameters of the pelvis after total hip arthroplasty.
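
The two steps named above, ray-tracing the traversed metal thickness and mapping thickness to grey level with a fitted exponential, can be sketched as follows. The spherical femoral-head-only geometry, attenuation coefficient, and image size are illustrative assumptions rather than the THR Simulator's actual model.

```python
# Sketch of the two steps described above: (1) ray-trace the thickness of
# prosthesis metal each X-ray beam crosses, here for a spherical femoral
# head only, and (2) map thickness to grey scale with an exponential
# attenuation law. Geometry, attenuation coefficient and image size are
# illustrative assumptions.
import numpy as np

def chord_through_sphere(px, py, centre, radius):
    """Path length of a ray parallel to z through pixel (px, py)."""
    d2 = (px - centre[0]) ** 2 + (py - centre[1]) ** 2
    return 2.0 * np.sqrt(np.maximum(radius ** 2 - d2, 0.0))

def simulate_radiograph(size=256, head_centre=(128.0, 128.0),
                        head_radius=60.0, mu=0.05, background=0.95):
    ys, xs = np.mgrid[0:size, 0:size]
    thickness = chord_through_sphere(xs, ys, head_centre, head_radius)
    # Beer-Lambert style mapping: more metal attenuates the beam more
    grey = background * np.exp(-mu * thickness)
    return (255 * grey).astype(np.uint8)

if __name__ == "__main__":
    img = simulate_radiograph()
    print(img.shape, img.min(), img.max())
```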

Hou Sheng-Mou

2009-01-01

120

Software module for geometric product modeling and NC tool path generation  

International Nuclear Information System (INIS)

An intelligent CAD/CAM system named VIRTUAL MANUFACTURE has been created. It consists of four intelligent software modules: the module for virtual NC machine creation, the module for geometric product modeling and automatic NC path generation, the module for virtual NC machining, and the module for virtual product evaluation. In this paper the second intelligent software module is presented. This module enables feature-based product modeling, carried out via automatic saving of the designed product's geometric features as knowledge data. The knowledge data are afterwards applied for automatic NC program generation for NC machining of the designed product. (Author)

121

Advanced User Interface Generation in the Software Framework for Magnetic Measurements at CERN  

CERN Document Server

A model-based approach, the Model-View-Interactor Paradigm, for automatic generation of user interfaces in software frameworks for measurement systems is proposed. The Model-View-Interactor Paradigm is focused on the "interaction" typical in a software framework for measurement applications: the final user interacts with the automatic measurement system by executing a suitable high-level script previously written by a test engineer. According to the main design goal of frameworks, the proposed approach allows the user interfaces to be separated easily from the application logic, enhancing the flexibility and reusability of the software. As a practical case study, this approach has been applied to the flexible software framework for magnetic measurements at the European Organization for Nuclear Research (CERN). In particular, experimental results for the scenario of permeability measurements are reported.

Arpaia, P; La Commara, Giuseppe; Arpaia, Pasquale

2010-01-01

122

GEOGRAPHICAL DESCRIPTION OF THE 2 ND DISTRICT OF BUCHAREST  

Directory of Open Access Journals (Sweden)

Full Text Available Located in the north-east of Bucharest, with a population of approx. 400,000 inhabitants, the current territory of the 2nd district was once part of the Vlăsiei forests, crossed by the river Colentina. It is a tabular plain with low declivity in the NW-SE direction; the only major relief variations are those leading down to the Colentina terrace, the tablelands and the anthropic relief. The Colentina Plain covers 36% of the Bucharest Municipality and is characterized by altitudes that vary between 88.9 m in the Free Press Square and 55 m at Cățelu. The Otopeni-Bucharest field overlapping the north of the city (the northern part of the district: Colentina, Băneasa, Pipera) is characterized by altitudes of 85-90 m, a relief fragmentation of 0.5 km/square km, a high frequency of tablelands and increased local slopes (values of 10 degrees are common). The 2nd district is in second place in terms of total area of green spaces (4,187,000 square meters), with an index of green space per capita of 13.6 square meters per inhabitant, although it is unevenly distributed across the sector. The vegetation of the 2nd district is represented in particular by the vegetation in parks (the Circus Park, Plumbuita, Morarilor, Tei), gardens and green spaces around housing blocks. The valleys cut into loess generally have steep sides with intense warping phenomena, and their meadows, with biogenic and mineral deposits, are sometimes covered by lakes or swamps. The largest lakes of the valley, created by dams, are located on the Colentina river. The defining geomorphologic characteristics are the result of the action of erosion, transportation and deposition on the lower course of the Dâmbovița river. The mean altitude of the capital is about 80 m.

Mariana Cârstea

2009-10-01

123

Proceedings Issue No. 1 - 2nd Arctic Ungulate Conference  

Directory of Open Access Journals (Sweden)

Full Text Available The 2nd International Arctic Ungulate Conference was held 13-17 August 1995 on the University of Alaska Fairbanks campus. The Institute of Arctic Biology and the Alaska Cooperative Fish and Wildlife Research Unit were responsible for organizing the conference with assistance from biologists with state and federal agencies and commercial organizations. David R. Klein was chair of the conference organizing committee. Over 200 people attended the conference, coming from 10 different countries. The United States, Canada, and Norway had the largest representation. The conference included invited lectures, panel discussions, and about 125 contributed papers.

Rolf Egil Haugerud (ed.)

1996-01-01

124

GEOGRAPHICAL DESCRIPTION OF THE 2 ND DISTRICT OF BUCHAREST  

Digital Repository Infrastructure Vision for European Research (DRIVER)

Located in the north-east of Bucharest, with a population of approx. 400,000 inhabitants, the current territory of the 2nd district was once part of the Vlăsiei forests, crossed by the river Colentina. It is a tabular plain with low declivity in the NW-SE direction; the only major relief variations are those leading down to the Colentina terrace, the tablelands and the anthropic relief. The Colentina Plain covers 36% of the Bucharest Municipality and it is characterized by altitudes that vary between 88.9 m in the Free...

Mariana Cârstea

2009-01-01

125

2nd International Conference on Computer Science, Applied Mathematics and Applications  

CERN Document Server

The proceedings consist of 30 papers which have been selected and invited from the submissions to the 2nd International Conference on Computer Science, Applied Mathematics and Applications (ICCSAMA 2014) held on 8-9 May, 2014 in Budapest, Hungary. The conference is organized into 7 sessions: Advanced Optimization Methods and Their Applications, Queueing Models and Performance Evaluation, Software Development and Testing, Computational Methods for Mobile and Wireless Networks, Computational Methods for Knowledge Engineering, Logic Based Methods for Decision Making and Data Mining, and Nonlinear Systems and Applications, respectively. All chapters in the book discuss theoretical and practical issues connected with computational methods and optimization methods for knowledge engineering. The editors hope that this volume can be useful for graduate and Ph.D. students and researchers in Computer Science and Applied Mathematics. It is the hope of the editors that readers of this volume can find many inspiring idea...

Thi, Hoai; Nguyen, Ngoc

2014-01-01

126

Beyond Chat--New Generation Software for Real-Time Discussion.  

Science.gov (United States)

This paper discusses the need for and the design requirements of generic chatroom software that is readily configurable to particular domains and that is also extensible. The architecture of a client/server chatroom generation system, ChatterBox, implemented in Java, is presented. The following components of ChatterBox are described: (1) the basic…

Charlton, Colin; Little, Janet; Morris, Simon; Neilson, Irene

127

Afs password expiration starts Feb 2nd 2004  

CERN Document Server

Due to security reasons, and in agreement with CERN management, afs/lxplus passwords will fall into line with Nice/Mail passwords on February 2nd and expire annually. As of the above date afs account holders who have not changed their passwords for over a year will have a 60 day grace period to make a change. Following this date their passwords will become invalid. What does this mean to you? If you have changed your afs password in the past 10 months the only difference is that 60 days before expiration you will receive a warning message. Similar warnings will also appear nearer the time of expiration. If you have not changed your password for more than 10 months, then, as of February 2nd you will have 60 days to change it using the command 'kpasswd'. Help to choose a good password can be found at: http://security.web.cern.ch/security/passwords/ If you have been given a temporary password at any time by the Helpdesk or registration team this will automatically fall into the expiration category ...

2004-01-01

128

Ontology-based Software for Generating Scenarios for Characterizing Searches for Nuclear Materials  

Energy Technology Data Exchange (ETDEWEB)

A software environment was created in which ontologies are used to significantly expand the number and variety of scenarios for special nuclear materials (SNM) detection based on a set of simple generalized initial descriptions. A framework was built that combined advanced reasoning from ontologies with geographical and other data sources to generate a much larger list of specific detailed descriptions from a simple initial set of user-input variables. This presentation shows how basing the scenario generation on a process of inferencing from multiple ontologies, including a new SNM Detection Ontology (DO) combined with data extraction from geodatabases, provided the desired significant variability of scenarios for testing search algorithms, including unique combinations of variables not previously expected. The various components of the software environment and the resulting scenarios generated will be discussed.

Ward, Richard C [ORNL; Sorokine, Alexandre [ORNL; Schlicher, Bob G [ORNL; Wright, Michael C [ORNL; Kruse, Kara L [ORNL

2011-01-01

129

Software for Generating Troposphere Corrections for InSAR Using GPS and Weather Model Data  

Science.gov (United States)

Atmospheric errors due to the troposphere are a limiting error source for spaceborne interferometric synthetic aperture radar (InSAR) imaging. This software generates tropospheric delay maps that can be used to correct atmospheric artifacts in InSAR data. The software automatically acquires all needed GPS (Global Positioning System), weather, and Digital Elevation Map data, and generates a tropospheric correction map using a novel algorithm for combining GPS and weather information while accounting for terrain. Existing JPL software was prototypical in nature, required a MATLAB license, required additional steps to acquire and ingest needed GPS and weather data, and did not account for topography in interpolation. Previous software did not achieve a level of automation suitable for integration in a Web portal. This software overcomes these issues. GPS estimates of tropospheric delay are a source of corrections that can be used to form correction maps to be applied to InSAR data, but the spacing of GPS stations is insufficient to remove short-wavelength tropospheric artifacts. This software combines interpolated GPS delay with weather model precipitable water vapor (PWV) and a digital elevation model to account for terrain, increasing the spatial resolution of the tropospheric correction maps and thus removing short wavelength tropospheric artifacts to a greater extent. It will be integrated into a Web portal request system, allowing use in a future L-band SAR Earth radar mission data system. This will be a significant contribution to its technology readiness, building on existing investments in in situ space geodetic networks, and improving timeliness, quality, and science value of the collected data.
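
One simple way to combine interpolated GPS delays with terrain, in the spirit of the correction described above, is sketched below: station zenith delays are referred to a common level, interpolated horizontally, and rescaled to each pixel's DEM height with an exponential scale height. The 2 km scale height, station values, and grid are assumptions; the actual algorithm also folds in weather-model precipitable water vapor.

```python
# Sketch of combining GPS zenith delays with a DEM: interpolate station
# zenith wet delays horizontally onto the radar grid, then rescale each
# pixel to its DEM height with an exponential scale height (here 2 km, an
# assumed value that a weather model would normally constrain).
import numpy as np
from scipy.interpolate import griddata

def tropo_correction_map(stat_xy, stat_delay_m, stat_height_m,
                         grid_x, grid_y, dem_m, scale_height_m=2000.0):
    # Refer every station delay to sea level, interpolate, then re-apply
    # the height dependence at each DEM pixel.
    delay_sl = stat_delay_m * np.exp(stat_height_m / scale_height_m)
    interp_sl = griddata(stat_xy, delay_sl, (grid_x, grid_y), method="linear")
    return interp_sl * np.exp(-dem_m / scale_height_m)

if __name__ == "__main__":
    stations = np.array([[0.0, 0.0], [50e3, 0.0], [0.0, 50e3], [50e3, 50e3]])
    delays = np.array([0.10, 0.12, 0.11, 0.13])          # metres, assumed
    heights = np.array([100.0, 300.0, 50.0, 800.0])       # metres, assumed
    gx, gy = np.meshgrid(np.linspace(0, 50e3, 100), np.linspace(0, 50e3, 100))
    dem = 500.0 + 200.0 * np.sin(gx / 1e4)                 # synthetic terrain
    corr = tropo_correction_map(stations, delays, heights, gx, gy, dem)
    print(corr.shape, np.nanmin(corr), np.nanmax(corr))
```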

Moore, Angelyn W.; Webb, Frank H.; Fishbein, Evan F.; Fielding, Eric J.; Owen, Susan E.; Granger, Stephanie L.; Bjoerndahl, Fredrik; Loefgren, Johan; Fang, Peng; Means, James D.; Bock, Yehuda; Tong, Xiaopeng

2013-01-01

130

Software architecture for control and data acquisition of linear plasma generator Magnum-PSI  

International Nuclear Information System (INIS)

Highlights: • An architecture based on a modular design. • The design offers flexibility and extendability. • The design covers the overall software architecture. • It also covers its (sub)systems' internal structure. -- Abstract: The FOM Institute DIFFER – Dutch Institute for Fundamental Energy Research has completed the construction phase of Magnum-PSI, a magnetized, steady-state, large area, high-flux linear plasma beam generator to study plasma surface interactions under ITER divertor conditions. Magnum-PSI consists of several hardware subsystems, and a variety of diagnostic systems. The COntrol, Data Acquisition and Communication (CODAC) system integrates these subsystems and provides a complete interface for the Magnum-PSI users. Integrating it all, from the lowest hardware level of sensors and actuators, via the level of networked PLCs and computer systems, up to functions and classes in programming languages, demands a sound and modular software architecture, which is extendable and scalable for future changes. This paper describes this architecture, and the modular design of the software subsystems. The design is implemented in the CODAC system at the level of services and subsystems (the overall software architecture), as well as internally in the software subsystems

131

2nd International Conference on NeuroRehabilitation  

CERN Document Server

The book is the proceedings of the 2nd International Conference on NeuroRehabilitation (ICNR 2014), held 24th-26th June 2014 in Aalborg, Denmark. The conference featured the latest highlights in the emerging and interdisciplinary field of neural rehabilitation engineering and identified important healthcare challenges the scientific community will be faced with in the coming years. Edited and written by leading experts in the field, the book includes keynote papers, regular conference papers, and contributions to special and innovation sessions, covering the following main topics: neuro-rehabilitation applications and solutions for restoring impaired neurological functions; cutting-edge technologies and methods in neuro-rehabilitation; and translational challenges in neuro-rehabilitation. Thanks to its highly interdisciplinary approach, the book will not only be a  highly relevant reference guide for academic researchers, engineers, neurophysiologists, neuroscientists, physicians and physiotherapists workin...

Andersen, Ole; Akay, Metin

2014-01-01

132

Testing Product Generation in Software Product Lines Using Pairwise for Features Coverage  

Science.gov (United States)

A Software Product Line (SPL) is "a set of software-intensive systems sharing a common, managed set of features that satisfy the specific needs of a particular market segment or mission and that are developed from a common set of core assets in a prescribed way". Variability is a central concept that permits the generation of different products of the family by reusing core assets. It is captured through features which, for an SPL, define its scope. Features are represented in a feature model, which is later used to generate the products from the line. From the testing point of view, testing all the possible combinations in feature models is not practical because: (1) the number of possible combinations (i.e., combinations of features for composing products) may be intractable, and (2) some combinations may contain incompatible features. Thus, this paper addresses the problem by implementing combinatorial testing techniques adapted to the SPL context.
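As a toy illustration of the pairwise idea (not the authors' implementation), the sketch below enumerates the valid products of a four-feature model with one hypothetical incompatibility constraint and greedily selects products until every feasible pair of feature decisions is covered.

```python
# Pairwise (2-wise) coverage over SPL feature decisions; the features and
# the constraint are invented for this example.
from itertools import combinations

features = ["audio", "video", "gps", "bluetooth"]
incompatible = {frozenset({"gps", "bluetooth"})}   # hypothetical constraint

def valid(product):
    return not any(pair <= product for pair in incompatible)

def covered_pairs(product):
    decisions = [(f, f in product) for f in features]
    return set(combinations(decisions, 2))

all_products = [set(c) for r in range(len(features) + 1)
                for c in combinations(features, r) if valid(set(c))]
to_cover = set().union(*(covered_pairs(p) for p in all_products))

suite = []
while to_cover:
    best = max(all_products, key=lambda p: len(covered_pairs(p) & to_cover))
    suite.append(best)
    to_cover -= covered_pairs(best)

print(len(suite), "products cover every feasible pair of feature decisions")
```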

Pérez Lamancha, Beatriz; Polo Usaola, Macario

133

Simulation of photovoltaic systems electricity generation using homer software in specific locations in Serbia  

Digital Repository Infrastructure Vision for European Research (DRIVER)

In this paper basic information of Homer software for PV system electricity generation, NASA - Surface meteorology and solar energy database, RETScreen, PVGIS and HMIRS (Hydrometeorological Institute of Republic of Serbia) solar databases are given. The comparison of the monthly average values for daily solar radiation per square meter received by the horizontal surface taken from NASA, RETScreen, PVGIS and HMIRS solar databases for three locations in Serbia (Belgrade, Negotin and Zlati...

Pavlović Tomislav M.; Milosavljević Dragana D.; Pirsl Danica S.

2013-01-01

134

SEMANTIC WEB-BASED SOFTWARE ENGINEERING BY AUTOMATED REQUIREMENTS ONTOLOGY GENERATION IN SOA  

Directory of Open Access Journals (Sweden)

Full Text Available This paper presents an approach for automated generation of a requirements ontology using UML diagrams in service-oriented architecture (SOA). The goal of this paper is to facilitate software engineering processes such as software design, software reuse, and service discovery. The proposed method is based on four conceptual layers. The first layer includes requirements obtained from stakeholders; the second designs service-oriented diagrams from the data in the first layer and extracts their XMI code. The third layer includes a requirement ontology and a protocol ontology to describe the behavior of services and the relationships between them semantically. Finally, the fourth layer standardizes the concepts present in the ontologies of the previous layer. The generated ontology goes beyond a plain domain ontology because it captures the behavior of services in addition to their hierarchical relationships. Experimental results conducted on a set of UML4Soa diagrams in different scopes demonstrate the improvement of the proposed approach from different points of view, such as completeness of the requirements ontology, automatic generation, and consideration of SOA.

Vahid Rastgoo

2014-04-01

135

A Minimized Assumption Generation Method for Component-Based Software Verification  

Science.gov (United States)

An assume-guarantee verification method has been recognized as a promising approach to verify component-based software by model checking. This method is not only suited to component-based software but also has the potential to solve the state space explosion problem in model checking. The method allows us to decompose a verification target into components so that we can model check each of them separately. In this method, assumptions are seen as the environments needed for a component to satisfy a property and for the rest of the system to satisfy it. The number of states of the assumptions should be minimized because the computational cost of model checking is influenced by that number. Thus, we propose a method for generating minimal assumptions for the assume-guarantee verification of component-based software. The key idea of this method is finding the minimal assumptions in the search spaces of the candidate assumptions. The minimal assumptions generated by the proposed method can be used to recheck the whole system at much lower computational cost. We have implemented a tool for generating the minimal assumptions. Experimental results are also presented and discussed.
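For orientation, the premise/conclusion form of the non-circular assume-guarantee rule that such methods instantiate can be written as below; the notation ⟨A⟩ M ⟨P⟩ ("M satisfies P under assumption A") is generic rather than taken from the paper.

```latex
\[
\frac{\langle A \rangle\, M_{1}\, \langle P \rangle
      \qquad
      \langle \mathit{true} \rangle\, M_{2}\, \langle A \rangle}
     {\langle \mathit{true} \rangle\, M_{1} \parallel M_{2}\, \langle P \rangle}
\]
% A is the assumption on M1's environment; the smaller its state space,
% the cheaper the two premise checks performed by the model checker.
```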

Pham, Ngoc Hung; Nguyen, Viet Ha; Aoki, Toshiaki; Katayama, Takuya

136

Analysis of quality raw data of second generation sequencers with Quality Assessment Software  

Directory of Open Access Journals (Sweden)

Full Text Available Abstract Background Second generation technologies have advantages over Sanger; however, they have resulted in new challenges for the genome construction process, especially because of the small size of the reads, despite the high degree of coverage. Independent of the program chosen for the construction process, DNA sequences are superimposed, based on identity, to extend the reads, generating contigs; mismatches indicate a lack of homology and are not included. This process improves our confidence in the sequences that are generated. Findings We developed Quality Assessment Software, with which one can review graphs showing the distribution of quality values from the sequencing reads. This software allows users to adopt more stringent quality standards for sequence data, based on quality-graph analysis and estimated coverage after applying the quality filter, providing acceptable sequence coverage for genome construction from short reads. Conclusions Quality filtering is a fundamental step in the process of constructing genomes, as it reduces the frequency of incorrect alignments that are caused by measuring errors, which can occur during the construction process due to the size of the reads, provoking misassemblies. Application of quality filters to sequence data, using the software Quality Assessment, along with graphing analyses, provided greater precision in the definition of cutoff parameters, which increased the accuracy of genome construction.
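A minimal sketch of the kind of cutoff-based quality filter discussed is given below, assuming well-formed four-line FASTQ records and a mean Phred quality threshold; the file names and threshold value are placeholders, and this is not the Quality Assessment software itself.

```python
def mean_phred(quality_string, offset=33):
    # Convert ASCII-encoded Phred scores and average them.
    scores = [ord(c) - offset for c in quality_string]
    return sum(scores) / len(scores)

def filter_fastq(path_in, path_out, min_mean_quality=20):
    # Keep only reads whose mean quality meets the cutoff.
    with open(path_in) as fin, open(path_out, "w") as fout:
        while True:
            record = [fin.readline() for _ in range(4)]
            if not record[0]:
                break
            if mean_phred(record[3].strip()) >= min_mean_quality:
                fout.writelines(record)
```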

Schneider Maria PC

2011-04-01

137

GENESIS: Agile Generation of Information Management Oriented Software / GENESIS: Generación ágil de software orientado a gestión de información  

Scientific Electronic Library Online (English)

Full Text Available SciELO Colombia | Language: English Abstract in spanish La especificación de un sistema de información puede estar clara desde el principio: debe adquirir, desplegar, consultar y modificar datos, usando una base de datos. El asunto es decidir cuál información manejar. En el caso que origina este trabajo, la información evoluciona permanentemente, incluso [...] hasta final del proyecto. Esto implica la construcción de un nuevo sistema cada vez que se redefine la información. Este artículo presenta Genesis, una infraestructura ágil para la construcción inmediata del sistema de información que sea requerido. Los expertos describen su información y consultas. Genesis produce el software correspondiente, generando las interfaces gráficas y la base de datos apropiados. Abstract in english The specification for an information system can be clear from the beginning: it must acquire, display, query and modify data, using a database. The main issue is to decide which information to manage. In the case originating this work, information was always evolving, even up to the end of the project. This implies the construction of a new system each time the information is redefined. This article presents Genesis, an agile infrastructure for the immediate construction of required information systems. Experts describe their information needs and queries, and Genesis generates the corresponding application, with the appropriate graphical interfaces and database.

Claudia, Jiménez Guarín; Juan Erasmo, Gómez.

2010-05-01

138

GENESIS: Agile Generation of Information Management Oriented Software / GENESIS: Generación ágil de software orientado a gestión de información  

Scientific Electronic Library Online (English)

Full Text Available SciELO Colombia | Language: English Abstract in spanish La especificación de un sistema de información puede estar clara desde el principio: debe adquirir, desplegar, consultar y modificar datos, usando una base de datos. El asunto es decidir cuál información manejar. En el caso que origina este trabajo, la información evoluciona permanentemente, incluso [...] hasta final del proyecto. Esto implica la construcción de un nuevo sistema cada vez que se redefine la información. Este artículo presenta Genesis, una infraestructura ágil para la construcción inmediata del sistema de información que sea requerido. Los expertos describen su información y consultas. Genesis produce el software correspondiente, generando las interfaces gráficas y la base de datos apropiados. Abstract in english The specification for an information system can be clear from the beginning: it must acquire, display, query and modify data, using a database. The main issue is to decide which information to manage. In the case originating this work, information was always evolving, even up to the end of the project. This implies the construction of a new system each time the information is redefined. This article presents Genesis, an agile infrastructure for the immediate construction of required information systems. Experts describe their information needs and queries, and Genesis generates the corresponding application, with the appropriate graphical interfaces and database.

Claudia, Jiménez Guarín; Juan Erasmo, Gómez.

139

New software developments for quality mesh generation and optimization from biomedical imaging data.  

Science.gov (United States)

In this paper we present a new software toolkit for generating and optimizing surface and volumetric meshes from three-dimensional (3D) biomedical imaging data, targeted at image-based finite element analysis of some biomedical activities in a single material domain. Our toolkit includes a series of geometric processing algorithms including surface re-meshing and quality-guaranteed tetrahedral mesh generation and optimization. All methods described have been encapsulated into a user-friendly graphical interface for easy manipulation and informative visualization of biomedical images and mesh models. Numerous examples are presented to demonstrate the effectiveness and efficiency of the described methods and toolkit. PMID:24252469

Yu, Zeyun; Wang, Jun; Gao, Zhanheng; Xu, Ming; Hoshijima, Masahiko

2014-01-01

140

Simulation of photovoltaic systems electricity generation using homer software in specific locations in Serbia  

Directory of Open Access Journals (Sweden)

Full Text Available In this paper basic information of Homer software for PV system electricity generation, NASA - Surface meteorology and solar energy database, RETScreen, PVGIS and HMIRS (Hydrometeorological Institute of Republic of Serbia) solar databases are given. The comparison of the monthly average values for daily solar radiation per square meter received by the horizontal surface taken from NASA, RETScreen, PVGIS and HMIRS solar databases for three locations in Serbia (Belgrade, Negotin and Zlatibor) is given. It was found that the annual average values of daily solar radiation taken from the RETScreen solar database are the closest to the annual average values of daily solar radiation taken from the HMIRS solar database for Belgrade, Negotin and Zlatibor. Monthly and total annual values of electricity production of a fixed on-grid PV system of 1 kW with optimally inclined and south-oriented solar modules, in Belgrade, Negotin and Zlatibor, using HOMER software simulation based on data for daily solar radiation taken from the NASA, RETScreen, PVGIS and HMIRS databases are calculated. The relative deviation of electricity production of a fixed on-grid PV system of 1 kW using HOMER software simulation based on data for daily solar radiation taken from the NASA, RETScreen, and PVGIS databases compared to electricity production of a fixed on-grid PV system of 1 kW using HOMER software simulation based on data for daily solar radiation taken from the HMIRS database in Belgrade, Negotin and Zlatibor is given. [Projekat Ministarstva nauke Republike Srbije, br. TR 33009]

Pavlović Tomislav M.

2013-01-01

 
 
 
 
141

Scoping analysis of the Advanced Test Reactor using SN2ND  

International Nuclear Information System (INIS)

A central focus of the work was whether modern deterministic tools can actually treat complex facilities like the ATR with heterogeneous geometry modeling. SN2ND has been demonstrated to solve problems with upwards of one trillion degrees of freedom, which translates to tens of millions of finite elements, hundreds of angles, and hundreds of energy groups, resulting in a very high-fidelity model of the system unachievable by most deterministic transport codes today. A space-angle convergence study was conducted to determine the meshing and angular cubature requirements for the ATR, and also to demonstrate the feasibility of performing this analysis with a deterministic transport code capable of modeling heterogeneous geometries. The work performed indicates that a minimum of 260,000 linear finite elements combined with an L3T11 cubature (96 angles on the sphere) is required for both eigenvalue and flux convergence of the ATR. A critical finding was that the fuel meat and water channels must each be meshed with at least 3 'radial zones' for accurate flux convergence. A small number of 3D calculations were also performed to show axial mesh and eigenvalue convergence for a full core problem. Finally, a brief analysis was performed with different cross section sets generated from DRAGON and SCALE, and the findings show that more effort will be required to improve the multigroup cross section generation process. The total number of degrees of freedom for a converged 27 group, 2D ATR problem is ≈340 million. This number increases to ≈25 billion for a 3D ATR problem. This scoping study shows that both 2D and 3D calculations are well within the capabilities of the current SN2ND solver, given the availability of a large-scale computing center such as BlueGene/P. However, dynamics calculations are not realistic without the implementation of improvements in the solver.

142

Comparison of HIV-1 drug resistance profiles generated from novel software applications for routine patient care  

Science.gov (United States)

Introduction Clinical laboratories performing routine HIV-1 genotyping antiviral drug resistance (DR) testing need reliable and up-to-date information systems to provide accurate and timely test results to optimize antiretroviral treatment in HIV-1-infected patients. Materials and Methods Three software applications were used to compare DR profiles generated from the analysis of HIV-1 protease (PR) and reverse transcriptase (RT) gene sequences obtained by Sanger sequencing assay in 100 selected clinical plasma samples from March 2013 through May 2014. Interpretative results obtained from the Trugene HIV-1 Genotyping assay (TG; Guidelines v17.0) were compared with a newly FDA-registered data processing module (DPM v1.0) and the research-use-only ViroScore-HIV (VS) software, both of which use the latest versions of Stanford HIVdb (SD v7.0) and geno2pheno (G2P v3.3) interpretive algorithms (IA). Differences among the DR interpretive algorithms were compared according to drug class (NRTI, NNRTI, PI) and each drug. HIV-1 tropism and integrase inhibitor resistance were not evaluated (not available in TG). Results Overall, only 17 of the 100 TG sequences obtained yielded equivalent DR profiles among all 3 software applications for every IA and for all drug classes. DPM and VS generated equivalent results with >99.9% agreement. Excluding AZT, DDI, D4T and rilpivirine (not available in G2P), ranges of agreement in DR profiles among the three IA (using the DPM) are shown in Table 1. Conclusions Substantial discrepancies (<75% agreement) exist among the three interpretive algorithms for ETR, while G2P differed from TG and SD for resistance to TDF and TPV/r. Use of more than one DR interpretive algorithm using well-validated software applications, such as DPM v1.0 and VS, would enable clinical laboratories to provide clinically useful and accurate DR results for patient care needs.

Gonzalez, Dimitri; Digmann, Benjamin; Barralon, Matthieu; Boulme, Ronan; Sayada, Chalom; Yao, Joseph

2014-01-01

143

Book review: Psychology in a work context (2nd Ed.)

Directory of Open Access Journals (Sweden)

Full Text Available Bergh, Z. & Theron, A.L. (Eds) (2003). Psychology in a work context (2nd Ed.). Cape Town: Oxford University Press.

This book is an overview and introduction to Industrial and Organisational Psychology. It is a work of ambitious scope, and it is clear that the contributors have invested a great deal of thought and effort in the planning and execution of the book. The current version is the second edition, and it looks set to become one of those standard textbooks that are revised every few years to keep up with changing times. It is a handsome volume, produced to a high standard of editorial care, pleasingly laid out and organised well enough to be useful as an occasional reference source. An English-Afrikaans glossary, tables of contents for every chapter as well as for the entire book, a comprehensive index and extensive bibliography make it easy to retrieve the information relating to a particular topic. Every chapter ends with a conclusion summarising the gist of the material covered. Quality illustrations lighten the tone and help to bring some of the concepts to life. Learning outcomes and self-assessment exercises and questions for every chapter will be useful to the lecturer using the book as a source for a tutored course, and for the student studying by distance learning. If sold at the suggested retail price, the book represents good value compared to imported textbooks that cover similar ground.

Nanette Tredoux

2003-10-01

144

CELEBRATED APRIL 2nd – INTERNATIONAL DAY OF PERSONS WITH AUTISM  

Directory of Open Access Journals (Sweden)

Full Text Available On April 2nd, the Macedonian Scientific Society for Autism, for the fourth time organized an event on the occasion of the International Day of Persons with Autism. The event with cultural and artistic character was held at the Museum of the Macedonian Struggle under the motto “They are not alone, we are with them”. The huge number of citizens only confirmed the motto. It seemed that the hall of the Museum of the Macedonian Struggle is too small for the warm hearts of the audience. More than 300 guests were present in the hall, among which there were children with autism and their families, prominent professors, doctors, special educators and rehabilitators, psychologists, students and other citizens with glad heart and will who decided to enrich the event with their presence. The event was opened by the violinist Plamenka Trajkovska, which performed one song. After her, the President of the Macedonian Scientific Society for Autism, PhD. Vladimir Trajkovski delivered his speech. The professor told the parents of autistic children, who were present in large number, not to lose hope, to fight for their children, and that the Macedonian Scientific Society for Autism will provide tremendous support and assistance in this struggle.

Manuela KRCHANOSKA

2014-09-01

145

Examples to Accompany "Descriptive Cataloging of Rare Books, 2nd Edition."  

Science.gov (United States)

This book is intended to be used with "Descriptive Cataloging of Rare Books," 2nd edition (DCRB) as an illustrative aid to catalogers and others interested in or needing to interpret rare book cataloging. As such, it is to be used in conjunction with the rules it illustrates, both in DCRB and in "Anglo-American Cataloging Rules," 2nd edition…

Association of Coll. and Research Libraries, Chicago, IL.

146

A proposed metamodel for the implementation of object oriented software through the automatic generation of source code  

Directory of Open Access Journals (Sweden)

Full Text Available During software development, one of the most visible risks and perhaps the biggest implementation obstacle is time management. All delivery deadlines for software versions must be met, but this is not always possible, sometimes due to delays in coding. This paper presents a metamodel for software implementation, which will give rise to a development tool for automatic generation of source code, in order to make any development pattern transparent to the programmer, significantly reducing the time spent coding the artifacts that make up the software.
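The general idea of template-driven source-code generation from a model can be illustrated with the toy sketch below; the model dictionary and the template are invented for the example and are not the metamodel proposed in the paper.

```python
# Generate a Python class from a tiny, hypothetical model description.
CLASS_TEMPLATE = '''class {name}:
    def __init__(self, {args}):
{assignments}
'''

def generate_class(model):
    args = ", ".join(model["attributes"])
    assignments = "\n".join(
        f"        self.{attr} = {attr}" for attr in model["attributes"])
    return CLASS_TEMPLATE.format(name=model["name"], args=args,
                                 assignments=assignments)

print(generate_class({"name": "Invoice", "attributes": ["number", "total"]}))
```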

CARVALHO, J. S. C.

2008-12-01

147

2nd International Open and Distance Learning (IODL) Symposium

Directory of Open Access Journals (Sweden)

Full Text Available This closing remarks prepared and presented by Prof. Dr. Murat BARKAN Anadolu University, Eskisehir, TURKEY DEAR GUESTS, As the 2nd International Open and Distance Learning (IODL Symposium is now drawing to end, I would like to thank you all for your outstanding speeches, distinguished presentations, constructive roundtable and session discussions, and active participation during the last five days. I hope you all share my view that the whole symposium has been a very stimulating and successful experience. Also, on behalf of all the participants, I would like to take this opportunity to thank and congratulate the symposium organization committee for their excellent job in organizing and hosting our 2nd meeting. Throughout the symposium, five workshops, six keynote speeches and 66 papers, which were prepared by more than 150 academicians and practitioners from 23 different countries, reflected remarkable and various views and approaches about open and flexible learning. Besides, all these academic endeavors, 13 educational films were displayed during the symposium. The technology exhibition, hosted by seven companies, was very effective to showcase the current level of the technology and the success of applications of theory into practice. Now I would like to go over what our scholar workshop and keynote presenters shared with us: Prof. Marina McIsaac form Arizona State University dwelled on how to determine research topics worthwhile to be examined and how to choose appropriate research design and methods. She gave us clues on how to get articles published in professional journals. Prof. Colin Latchem from Australia and Prof. Dr. Ali Ekrem Ozkul from Anadolu University pointed to the importance of strategic planning for distance and flexible learning. They highlighted the advantages of strategic planning for policy-makers, planners, managers and staff. Dr. Wolfram Laaser from Fern University of Hagen, presented different multimedia clips and directed interactive exercises using flashmx in his workshop. Jack Koumi from UK, presented a workshop about what to teach on video and when to choose other media. He exemplified 27 added value techniques and teaching functions for TV and video. He later specified different capabilities and limitations of eight different media used in teaching, emphasizing the importance of optimizing media deployment. Dr. Janet Bohren from University of Cincinnati and Jennifer McVay-Dyche from United Theological Seminary, explained their experience with a course management system used to develop dialogue between K-12 teachers in Turkey and the US, on the topics of religion, culture and schools. Their workshop provided an overview of a pilot study. They showed us a good case-study of utilizing “Blackboard” as a mean for getting rid of biases and improving the understanding of the American and Turkish teachers against each other. We had very remarkable key notes as well. Dr Nikitas Kastis representing European Distance and E-Learning Network (EDEN made his speech on distance and e-Learning evolutions and trends in Europe. He informed the audience about the application and assessment criteria at European scale, concerning e-Learning in the education and training systems. Meanwhile, our key note speakers took our attention to different applications of virtual learning. Dr. Piet Kommers from University of Twente exemplified a virtual training environment for acquiring surgical skills. Dr. 
Timothy Shih from Tamkang University presented their project called Hard SCORM (Sharable Content Object Reference Model as an asynchronous distance learning specification. In his speech titled “Engaging and Supporting Problem Solving Online” Prof. David Jonassen from University of Missouri, reflected his vision of the future of education and explained why it should embrace problem solving. Then he showed us examples of incorporating this vision with learning environments for making online problem solving possible. Dr. Wolfram Laaser from Fern University talked on applications of ICT at Europe

Reviewed by Murat BARKAN

2006-10-01

148

A software tool for simulation of surfaces generated by ball nose end milling  

DEFF Research Database (Denmark)

The number of models available for prediction of surface topography is very limited. The main reason is that these models cannot be based on engineering principles like those for elastic deformations. Most knowledge about surface roughness and integrity is empirical, and up to now very few mathematical relationships relating surface parameters to cutting conditions are available. Basic models of kinematical roughness, determined by the tool profile and the pattern of relative motions of tool and workpiece, have so far not been reliable. The actual roughness may be more than five times higher due to error motions, unstable built-up edge and changing tool profile due to wear [1]. Tool chatter also affects surface roughness, but its effect is normally not included in prediction of surface roughness, since machining conditions which generate chatter must be avoided in any case. Finally, reproducibility of experimental results concerning surface roughness requires tight control of all influencing factors, difficult to keep in actual machining workshops. This introduces further complications in surface topography modelling. In the light of these considerations, a simple software tool for prediction of the surface topography of ball nose end milled surfaces was developed. This software tool is based on a simplified model of the ideal tool motion and neglects the effects due to run-out, static and dynamic deflections and error motions, but has the merit of generating as output a file in a format readable by a surface processor software (SPIP [2]), for calculation of a number of surface roughness parameters. In the next paragraph a description of the basic features of ball nose end milled surfaces is given, while in paragraph 3 the model is described.
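As a reference point for what such a kinematic model predicts, the ideal scallop height h left between adjacent passes of a ball nose cutter of radius R at stepover a follows the standard geometric approximation below; as noted above, the measured roughness can be several times larger because of error motions, built-up edge and wear.

```latex
\[
h \;=\; R - \sqrt{R^{2} - \frac{a^{2}}{4}} \;\approx\; \frac{a^{2}}{8R}
\qquad (a \ll R)
\]
```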

Bissacco, Giuliano

2004-01-01

149

Meteosat Second Generation station: processing software and computing architecture; Estacion de Recepcion de Imagenes del Satelite Meteosat Segunda generacion: Arquitectura Informatica y Software de Proceso  

Energy Technology Data Exchange (ETDEWEB)

The Renewable Energy Division of CIEMAT houses a specific station for receiving the Meteosat Second Generation images, which is of interest in the works being carried out concerning the solar radiation derived from satellite images. The complexity, the huge amount of information being received and the particular characteristics of the MSG images encouraged the design and development of a specific computer structure and the associated software as well, for a better and more suitable use of the images. This document describes the mentioned structure and software. (Author) 8 refs.

Martin, L.; Cony, M.; Navarro, A. A.; Zarzalejo, L. F.; Polo, J.

2010-05-01

150

2nd interface between ecology and land development in California  

Science.gov (United States)

The 2nd Interface Between Ecology and Land Development Conference was held in association with Earth Day 1997, five years after the first Interface Conference. Rapid population growth in California has intensified the inevitable conflict between land development and preservation of natural ecosystems. Sustainable development requires wise use of diminishing natural resources and, where possible, restoration of damaged landscapes. These Earth Week Celebrations brought together resource managers, scientists, politicians, environmental consultants, and concerned citizens in an effort to improve the communication necessary to maintain our natural biodiversity, ecosystem processes and general quality of life. As discussed by our keynote speaker, Michael Soule, the best predictor of habitat loss is population growth, and nowhere is this better illustrated than in California. As urban perimeters expand, the interface between wildlands and urban areas increases. Few problems are more vexing than how to manage the fire-prone ecosystems indigenous to California at this urban interface. Today resource managers face increasing challenges of dealing with this problem, and the lead-off section of the proceedings considers both the theoretical basis for making decisions related to prescribed burning and the practical application. Habitat fragmentation is an inevitable consequence of development patterns with significant impacts on animal and plant populations. Managers must be increasingly resourceful in dealing with problems of fragmentation and the often inevitable consequences, including susceptibility to invasive organisms. One approach to dealing with fragmentation problems is through careful land planning. California is the national leader in the integration of conservation and economics. On Earth Day 1991, Governor Pete Wilson presented an environmental agenda that promised to create agreements between landowners and environmentalists that would guarantee the protection of endangered species, and out of this grew the pioneering initiative known as the Natural Communities Conservation Planning (NCCP) program. California's vast expanse of seemingly endless resources has traditionally been viewed as justification for abusive land use practices. The modern-day recognition that resources are finite has led to greater concern, not only for conserving what is left, but for restoring abused landscapes. Ecological restoration is a new science devoted to returning disturbed environments to a semblance of their 'pristine' state. Based on principles of 'revegetation,' restoration goes far beyond simple replanting; rather, the ambition of ecological restoration is to return landscapes to functioning ecosystems, and this is the focus of the last section.

Keeley, Jon E.; Baer-Keeley, Melanie; Fortheringham, C.J.

2000-01-01

151

Real Time Physics Module 1: Mechanics, 2nd Edition  

Science.gov (United States)

This computer-based lab manual contains experiments in mechanics, thermodynamics, E&M, and optics using hardware and software designed to enhance readers' understanding of calculus-based physics concepts. It uses an active learning cycle, including concept overviews, hypothesis-testing, prediction-making, and investigations.

Sokoloff, David; Laws, Priscilla W.; Thornton, Ronald K.

2005-11-28

152

Easy Steps to STAIRS. 2nd Revised Edition.  

Science.gov (United States)

This manual for computer searchers describes the software package--IBM's STAIRS (Storage And Information Retrieval System)--used for searching databases in AUSINET (AUStralian Information NETwork). Whereas the first edition explained STAIRS in the context of the National Library's Online ERIC Project and the ERIC data base, this second edition…

National Library of Australia, Canberra.

153

Quetzalcoatl: Una Herramienta para Generar Contratos de Desarrollo de Software en Entornos de Outsourcing / Quetzalcoatl: A Tool for Generating Software Development Contracts in Outsourcing Environments  

Scientific Electronic Library Online (English)

Full Text Available SciELO Portugal | Language: Spanish Abstract in spanish Actualmente el outsourcing es una de las actividades principales de trabajo. Sin embargo, las relaciones que se dan entre un cliente y un proveedor de servicios no son lo suficientemente fuertes para lograr las expectativas de los acuerdos. El contrato de outsourcing para proyectos en desarrollo de [...] software es una alternativa a este tipo de relaciones. En este artículo se presenta la arquitectura de la herramienta Quetzalcoatl, así como las funciones principales que ofrece la herramienta, todo esto con el objetivo de generar y evaluar contratos para proyectos de desarrollo de software en entornos de outsourcing como apoyo a las PYMEs. Abstract in english Nowadays, outsourcing is one of the most important work activities for software development companies. However, the relationships between a client and a service provider are not strong enough to meet the expectations of the agreements. The outsourcing contract for software development projects is an alternative to this type of relationship. This paper presents the architecture of the tool named Quetzalcoatl, as well as the main functions that this tool offers, in order to generate and evaluate contracts for software development projects in outsourcing environments to support SMEs.

Jezreel, Mejía; Sergio D., Ixmatlahua; Alma I., Sánchez.

154

Quetzalcoatl: Una Herramienta para Generar Contratos de Desarrollo de Software en Entornos de Outsourcing / Quetzalcoatl: A Tool for Generating Software Development Contracts in Outsourcing Environments  

Scientific Electronic Library Online (English)

Full Text Available SciELO Portugal | Language: Spanish Abstract in spanish Actualmente el outsourcing es una de las actividades principales de trabajo. Sin embargo, las relaciones que se dan entre un cliente y un proveedor de servicios no son lo suficientemente fuertes para lograr las expectativas de los acuerdos. El contrato de outsourcing para proyectos en desarrollo de [...] software es una alternativa a este tipo de relaciones. En este artículo se presenta la arquitectura de la herramienta Quetzalcoatl, así como las funciones principales que ofrece la herramienta, todo esto con el objetivo de generar y evaluar contratos para proyectos de desarrollo de software en entornos de outsourcing como apoyo a las PYMEs. Abstract in english Nowadays, outsourcing is one of the most important work activities for software development companies. However, the relationships between a client and a service provider are not strong enough to meet the expectations of the agreements. The outsourcing contract for software development projects is an alternative to this type of relationship. This paper presents the architecture of the tool named Quetzalcoatl, as well as the main functions that this tool offers, in order to generate and evaluate contracts for software development projects in outsourcing environments to support SMEs.

Jezreel, Mejía; Sergio D., Ixmatlahua; Alma I., Sánchez.

2014-03-01

155

Next generation hyper-scale software and hardware systems for big data analytics  

CERN Document Server

Building on foundational technologies such as many-core systems, non-volatile memories and photonic interconnects, we describe some current technologies and future research to create real-time, big-data-analytics IT infrastructure. We will also briefly describe some of our biologically-inspired software and hardware architecture for creating radically new hyper-scale cognitive computing systems. About the speaker Rich Friedrich is the director of Strategic Innovation and Research Services (SIRS) at HP Labs. In this strategic role, he is responsible for research investments in nano-technology, exascale computing, cyber security, information management, cloud computing, immersive interaction, sustainability, social computing and commercial digital printing. Rich's philosophy is to fuse strategy and inspiration to create compelling capabilities for next generation information devices, systems and services. Using essential insights gained from the metaphysics of innovation, he effectively leads ...

CERN. Geneva

2013-01-01

156

Next Generation Astronomical Data Processing using Big Data Technologies from the Apache Software Foundation  

Science.gov (United States)

In this era of exascale instruments for astronomy, we must naturally develop next generation capabilities for the unprecedented data volume and velocity that will arrive due to the veracity of these ground-based sensors and observatories. Integrating scientific algorithms stewarded by scientific groups unobtrusively and rapidly; intelligently selecting data movement technologies; making use of cloud computing for storage and processing; and automatically extracting text and metadata and science from any type of file are all needed capabilities in this exciting time. Our group at NASA JPL has promoted the use of open source data management technologies available from the Apache Software Foundation (ASF) in pursuit of constructing next generation data management and processing systems for astronomical instruments including the Expanded Very Large Array (EVLA) in Socorro, NM and the Atacama Large Millimetre/Submillimetre Array (ALMA); as well as for the KAT-7 project led by SKA South Africa as a precursor to the full MeerKAT telescope. In addition, we are currently funded by the National Science Foundation in the US to work with MIT Haystack Observatory and the University of Cambridge in the UK to construct a Radio Array of Portable Interferometric Devices (RAPID) that will undoubtedly draw from the rich technology advances underway. NASA JPL is investing in a strategic initiative for Big Data that is pulling in these capabilities and technologies for astronomical instruments and also for Earth science remote sensing. In this talk I will describe the above collaborative efforts underway and point to solutions in open source from the Apache Software Foundation that can be deployed and used today and that are already bringing our teams and projects benefits. I will describe how others can take advantage of our experience and point towards future application and contribution of these tools.

Mattmann, Chris

2014-04-01

157

Software for generation and analysis of photoelastic fringes in plates with a single hole subjected to in-plane loads  

International Nuclear Information System (INIS)

A software package for generating and analyzing photoelastic images on infinite rectangular plates, subjected to in-plane loads, is presented. It allows the user to generate photoelastic images as produced in a polariscope fed by monochromatic light. Both circular and plane polariscopes in conditions of dark or light field can be selected. Tools for obtaining light intensity distributions along horizontal and vertical lines and for extracting the darkest regions of photoelastic fringes are also available. The extraction of such regions can be done by digital image processing (DIP). This process produces thin lines, from which the main stresses and the intensity factor used in Fracture Mechanics can be obtained. The software was developed for running in a DOS environment in Super VGA mode. The synthetic photoelastic images are generated in 64 gray levels. This software is a useful tool for teaching the fundamentals of photoelasticity and will help researchers in the development of photoelastic experiments. (author). 6 refs., 7 figs.
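A compact sketch of the underlying idea is given below, assuming a monochromatic dark-field circular polariscope, the classical Kirsch stress field for an infinite plate with a circular hole under uniaxial tension, and an arbitrary fringe-order scaling; it is an illustration, not the package's actual implementation.

```python
# Isochromatic intensity ~ sin^2(pi*N), with fringe order N proportional to
# the principal-stress difference (stress-optic law).
import numpy as np

def kirsch_stress_difference(x, y, a=1.0, S=1.0):
    r = np.hypot(x, y)
    th = np.arctan2(y, x)
    r = np.where(r < a, np.nan, r)                   # mask the hole itself
    k2, k4 = (a / r) ** 2, (a / r) ** 4
    srr = S / 2 * (1 - k2) + S / 2 * (1 - 4 * k2 + 3 * k4) * np.cos(2 * th)
    stt = S / 2 * (1 + k2) - S / 2 * (1 + 3 * k4) * np.cos(2 * th)
    srt = -S / 2 * (1 + 2 * k2 - 3 * k4) * np.sin(2 * th)
    return np.sqrt((srr - stt) ** 2 + 4 * srt ** 2)  # sigma_1 - sigma_2

x, y = np.meshgrid(np.linspace(-5, 5, 400), np.linspace(-5, 5, 400))
N = 4.0 * kirsch_stress_difference(x, y)             # arbitrary fringe scale
intensity = np.sin(np.pi * N) ** 2                   # dark-field isochromatics
# `intensity` can be quantised to 64 grey levels and displayed as an image.
```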

158

Software for generation and analysis of photoelastic fringes in plates with a single hole subjected to in-plane loads  

Energy Technology Data Exchange (ETDEWEB)

A software package for generating and analyzing photoelastic images on infinite rectangular plates, subjected to in-plane loads, is presented. It allows the user to generate photoelastic images as produced in a polariscope fed by monochromatic light. Both circular and plane polariscopes in conditions of dark or light field can be selected. Tools for obtaining light intensity distributions along horizontal and vertical lines and for extracting the darkest regions of photoelastic fringes are also available. The extraction of such regions can be done by digital image processing (DIP). This process produces thin lines, from which the main stresses and the intensity factor used in Fracture Mechanics can be obtained. The software was developed for running in a DOS environment in Super VGA mode. The synthetic photoelastic images are generated in 64 gray levels. This software is a useful tool for teaching the fundamentals of photoelasticity and will help researchers in the development of photoelastic experiments. (author). 6 refs., 7 figs.

Soares, W.A. [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN), Belo Horizonte, MG (Brazil); Andrade, A.H.P. [Instituto de Pesquisas Energeticas e Nucleares (IPEN), Sao Paulo, SP (Brazil)

1995-12-31

159

Neutron flux measurement based on the 2 nd Campbell theorem  

International Nuclear Information System (INIS)

Generally, neutron flux measurement in research and production reactors is carried out in two stages: first, low-level fluxes are measured by counting the pulses produced by fission or boron trifluoride chambers. Second, for high flux levels the parameter measured is the mean current generated in a compensated ionization chamber. A method which shows the feasibility of measuring neutron flux in the second stage with the same counting chamber used in the first stage, without the need to move it from its placement, is presented. (author). 6 refs., 4 figs
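The 2nd (variance) Campbell theorem on which such a method rests can be stated as follows: for detector pulses of shape h(t) arriving at random with mean rate n,

```latex
\[
\langle I \rangle \;=\; n \int_{-\infty}^{\infty} h(t)\,dt ,
\qquad
\operatorname{Var}(I) \;=\; n \int_{-\infty}^{\infty} h^{2}(t)\,dt ,
\]
```

so the variance of the chamber signal is proportional to the event rate, and hence to the neutron flux, even when individual pulses overlap.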

160

Physics design of the DARHT 2nd axis accelerator cell  

Energy Technology Data Exchange (ETDEWEB)

The next generation of radiographic machines based on induction accelerators requires very high brightness electron beams to realize the desired x-ray spot size and intensity. This high brightness must be maintained throughout the beam transport, from source to x-ray converter target. The accelerator for the second axis of the Dual Axis Radiographic Hydrodynamic Test (DARHT) facility is being designed to accelerate a 4-kA, 2-µs pulse of electrons to 20 MeV. After acceleration, the 2-µs pulse will be chopped into a train of four 50-ns pulses with variable temporal spacing by rapidly deflecting the beam between a beam stop and the final transport section. The short beam pulses will be focused onto an x-ray converter target, generating four radiographic pulses within the 2-µs window. Beam instability due to interaction with the accelerator cells can very adversely affect the beam brightness and radiographic pulse quality. This paper describes the various issues considered in the design of the accelerator cell, with emphasis on transverse impedance and minimizing beam instabilities.

Chen, Y J; Houck, T L; Reginato, L J; Shang, C C; Yu, S S

1999-08-19

 
 
 
 
161

Optimizing Software Testing and Test Case Generation by using the concept of Hamiltonian Paths  

Directory of Open Access Journals (Sweden)

Full Text Available Software testing is a trade-off between budget, time and quality. Broadly, software testing can be classified as Unit testing, Integration testing, Validation testing and System testing. By applying the concept of Hamiltonian paths, the software testing of any project can be greatly improved. This paper shows how Hamiltonian paths can be used for requirement specification. They can also be used in the acceptance testing phase to check whether all the user requirements are met. Further, it gives the necessary calculations and algorithms to show the feasibility of its implementation.
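As an illustration of the underlying graph problem (not the paper's own algorithm), the sketch below finds a Hamiltonian path in a toy requirements/transition graph by backtracking; such a path visits every node exactly once and can serve as a single test sequence that touches every requirement.

```python
def hamiltonian_path(graph):
    nodes = list(graph)

    def extend(path, visited):
        if len(path) == len(nodes):
            return path
        for nxt in graph[path[-1]]:
            if nxt not in visited:
                found = extend(path + [nxt], visited | {nxt})
                if found:
                    return found
        return None

    for start in nodes:
        result = extend([start], {start})
        if result:
            return result
    return None

# Toy requirement/transition graph, invented for the example.
toy_graph = {"login": ["search", "profile"],
             "search": ["checkout"],
             "profile": ["search"],
             "checkout": []}
print(hamiltonian_path(toy_graph))   # ['login', 'profile', 'search', 'checkout']
```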

Ankita Bihani

2014-04-01

162

A new analytic solution for 2nd-order Fermi acceleration  

International Nuclear Information System (INIS)

A new analytic solution for 2nd-order Fermi acceleration is presented. In particular, we consider time-dependent rates for stochastic acceleration, diffusive and convective escape as well as adiabatic losses. The power-law index q of the turbulence spectrum is unconstrained and can therefore account for Kolmogorov (q = 5/3) and Kraichnan (q = 3/2) turbulence, Bohm diffusion (q = 1) as well as the hard-sphere approximation (q = 2). This considerably improves on solutions known to date and will prove a useful tool for more realistic modelling of 2nd-order Fermi acceleration in a variety of astrophysical environments.
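For orientation, a generic form of the momentum-diffusion (Fokker-Planck) transport equation that such analytic treatments solve, written here for the isotropic phase-space density f(p, t), is

```latex
\[
\frac{\partial f}{\partial t}
 + \frac{1}{p^{2}}\frac{\partial}{\partial p}\!\left(p^{2}\,\dot{p}\,f\right)
 = \frac{1}{p^{2}}\frac{\partial}{\partial p}
   \!\left(p^{2}\,D_{pp}(p,t)\,\frac{\partial f}{\partial p}\right)
 - \frac{f}{T_{\mathrm{esc}}(t)} ,
\qquad D_{pp} \propto p^{q} ,
\]
```

where the momentum-loss term covers adiabatic losses, T_esc combines diffusive and convective escape, and q takes the values listed above for the different turbulence models.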

163

Proceedings of the 2nd KUR symposium on hyperfine interactions  

International Nuclear Information System (INIS)

Hyperfine interactions between a nuclear spin and an electronic spin discovered from hyperfine splitting in atomic optical spectra have been utilized not only for the determination of nuclear parameters in nuclear physics but also for novel experimental techniques in many fields such as solid state physics, chemistry, biology, mineralogy and for diagnostic methods in medical science. Experimental techniques based on hyperfine interactions yield information about microscopic states of matter so that they are important in material science. Probes for material research using hyperfine interactions have been nuclei in the ground state and radioactive isotopes prepared with nuclear reactors or particle accelerators. But utilization of muons generated from accelerators is recently growing. Such wide spread application of hyperfine interaction techniques gives rise to some difficulty in collaboration among various research fields. In these circumstances, the present workshop was planned after four years since the last KUR symposium on the same subject. This report summarizes the contributions to the workshop in order to be available for the studies of hyperfine interactions. (J.P.N.)

164

GSIMF: a web service based software and database management system for the next generation grids  

International Nuclear Information System (INIS)

To process the vast amount of data from high energy physics experiments, physicists rely on Computational and Data Grids; yet, the distribution, installation, and updating of a myriad of different versions of different programs over the Grid environment is complicated, time-consuming, and error-prone. Our Grid Software Installation Management Framework (GSIMF) is a set of Grid Services that has been developed for managing versioned and interdependent software applications and file-based databases over the Grid infrastructure. This set of Grid services provides a mechanism to install software packages on distributed Grid computing elements, thus automating the software and database installation management process on behalf of the users. This enables users to remotely install programs and tap into the computing power provided by Grids.

165

GSIMF: a web service based software and database management system for the next generation grids  

International Nuclear Information System (INIS)

To process the vast amount of data from high energy physics experiments, physicists rely on Computational and Data Grids; yet, the distribution, installation, and updating of a myriad of different versions of different programs over the Grid environment is complicated, time-consuming, and error-prone. Our Grid Software Installation Management Framework (GSIMF) is a set of Grid Services that has been developed for managing versioned and interdependent software applications and file-based databases over the Grid infrastructure. This set of Grid services provides a mechanism to install software packages on distributed Grid computing elements, thus automating the software and database installation management process on behalf of the users. This enables users to remotely install programs and tap into the computing power provided by Grids.

166

2nd international expert meeting straw power; 2. Internationale Fachtagung Strohenergie  

Energy Technology Data Exchange (ETDEWEB)

Within the 2nd Guelzow expert discussions at 29th to 30th March, 2012 in Berlin (Federal Republic of Germany), the following lectures were held: (1) Promotion of the utilisation of straw in Germany (A. Schuette); (2) The significance of straw in the heat and power generation in EU-27 member states in 2020 and in 2030 under consideration of the costs and sustainability criteria (C. Panoutsou); (3) State of he art of the energetic utilization of hay goods in Europe (D. Thraen); (4) Incineration technological characterisation of straw based on analysis data as well as measured data of large-scale installations (I. Obernberger); (5) Energetic utilization of hay goods in Germany (T. Hering); (6) Actual state of the art towards establishing the first German straw thermal power station (R. Knieper); (7) Straw thermal power plants at agricultural sow farms and poultry farms (H. Heilmann); (8) Country report power from straw in Denmark (A. Evald); (9) Country report power from straw in Poland (J. Antonowicz); (10) Country report power from straw in China (J. Zhang); (11) Energetic utilisation of straw in Czechia (D. Andert); (12) Mobile pelletization of straw (S. Auth); (13) Experiences with the straw thermal power plant from Vattenfall (N. Kirkegaard); (14) Available straw potentials in Germany (potential, straw provision costs) (C. Weiser); (15) Standardization of hay good and test fuels - Classification and development of product standards (M. Englisch); (16) Measures of reduction of emissions at hay good incinerators (V. Lenz); (17) Fermentation of straw - State of the art and perspectives (G. Reinhold); (18) Cellulosis - Ethanol from agricultural residues - Sustainable biofuels (A. Hartmair); (19) Syngas by fermentation of straw (N. Dahmen); (20) Construction using straw (D. Scharmer).

NONE

2012-06-15

167

Summary of UNCCD 2nd Scientific Conference and CST S-3, 9-12 April 2013, Bonn, Germany  

Summary and analysis of the UN Convention to Combat Desertification (UNCCD) 2nd Scientific Conference and the Third Special Session of the Committee on Science and Technology (CST S-3), 9-12 April 2013, Bonn, Germany.

168

mbs: modifying Hudson's ms software to generate samples of DNA sequences with a biallelic site under selection  

Directory of Open Access Journals (Sweden)

Full Text Available Abstract Background The pattern of single nucleotide polymorphisms, or SNPs, contains a tremendous amount of information with respect to the mechanisms of the micro-evolutionary process of a species. The inference of the roles of these mechanisms, including natural selection, relies heavily on computer simulations. A coalescent simulation is extremely powerful in generating a large number of samples of DNA sequences from a population (species) when all mutations are neutral, and Hudson's ms software is frequently used for this purpose. However, it has been difficult to incorporate natural selection into the coalescent framework. Results We herein present a software application to generate samples of DNA sequences when there is a biallelic site targeted by selection. This software application, referred to as mbs, is developed by modifying Hudson's ms. The mbs software is so flexible that it can incorporate any arbitrary histories of population size changes and any mode of selection as long as selection is operating on a biallelic site. Conclusion mbs provides opportunities to investigate the effect of any mode of selection on the pattern of SNPs under various demographies.
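For readers new to coalescent simulation, the minimal neutral Kingman-coalescent sketch below (plain Python, not mbs itself, whose handling of selection is more involved) shows the kind of genealogy-generation step such tools perform.

```python
# Waiting times between coalescent events for a sample of n lineages drawn
# from a diploid population of effective size N (times in generations).
import random

def coalescent_times(n, N):
    times = []
    k = n
    while k > 1:
        # With k lineages there are k*(k-1)/2 pairs, each coalescing with
        # probability ~1/(2N) per generation, so the waiting time is
        # approximately exponential with rate k*(k-1)/(4N).
        times.append(random.expovariate(k * (k - 1) / (4.0 * N)))
        k -= 1
    return times

random.seed(1)
print(coalescent_times(5, 10_000))
```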

Innan Hideki

2009-05-01

169

Methodology and Toolset for Model Verification, Hardware/Software co-simulation, Performance Optimisation and Customisable Source-code generation  

DEFF Research Database (Denmark)

The MODUS project aims to provide a pragmatic and viable solution that will allow SMEs to substantially improve their positioning in the embedded-systems development market. The MODUS tool will provide a model verification and Hardware/Software co-simulation tool (TRIAL) and a performance optimisation and customisable source-code generation tool (TUNE). The concept centres on automated modelling and optimisation of embedded-systems development. The tool will enable model verification by guiding the selection of existing open-source model verification engines, based on automated analysis of system properties, and by producing inputs to be fed into these engines; it will interface with standard (SystemC) simulation platforms for HW/SW co-simulation, offer customisable source-code generation that respects coding standards and conventions, and support software performance-tuning optimisation through automated design transformations.

Berger, Michael Stübert; Soler, José

2013-01-01

170

XML (eXtensible Mark-up Language) Industrial Standard, Determining Architecture of the Next Generation of the Internet Software  

CERN Document Server

The year 1999 saw the establishment of a new Internet technology - XML (eXtensible Mark-up Language), a markup language adopted by the WWW Consortium (http://www.w3.org) as a new industrial standard determining the architecture of the next generation of Internet software. This report presents the results of a study of this technology, the basic capabilities of XML, and rules and recommendations for its application.

Galaktionov, V V

2000-01-01

171

Proceedings of the 2nd symposium on valves for coal conversion and utilization  

Energy Technology Data Exchange (ETDEWEB)

The 2nd symposium on valves for coal conversion and utilization was held October 15 to 17, 1980. It was sponsored by the US Department of Energy, Morgantown Energy Technology Center, in cooperation with the Valve Manufacturers Association. Seventeen papers have been entered individually into EDB and ERA. (LTN)

Maxfield, D.A. (ed.)

1981-01-01

172

Lessons learned from the 2nd Domestic Standard Problem Based on ATLAS Test  

International Nuclear Information System (INIS)

The 2nd Domestic Standard Problem exercise, DSP-02, with a 6-inch cold leg break LOCA test performed with ATLAS, was successfully completed. Lessons learned from this DSP-02 exercise are summarized in this paper, focusing on the findings for code deficiencies and user guidelines.

173

Comparison of HCG and 2nd IRP-HMG standards in LH radioimmunoassay  

International Nuclear Information System (INIS)

The results of a comparison of 62 urine samples investigated by RIA using the HCG and 2nd IRP-HMG standards are shown. It was found that 1 I.U. of the HCG standard for LH corresponds to 2.86 I.U. of the HMG standard, which agrees well with the literature data. (L.O.)

174

8. Book Review: ‘Broken Bones: Anthropological Analysis of Blunt Force Trauma’ 2nd edition, 2014  

Directory of Open Access Journals (Sweden)

Full Text Available 'Broken Bones: Anthropological Analysis of Blunt Force Trauma' 2nd edition, 2014. Editors: Vicki L. Wedel and Alison Galloway; Publisher: Charles C. Thomas, Illinois. pp 479 + xxiii. ISBN: 978-0-398-08768-5 (hardcover); ISBN: 978-0-398-08769-2 (eBook).

R. Gaur

2014-04-01

175

Book of abstracts. 2-nd International Conference on 'Quantum electrodynamics and statistical physics' QEDSP2006  

International Nuclear Information System (INIS)

Abstracts of the talks presented to the 2-nd International Conference 'Quantum electrodynamics and statistical physics' QEDSP2006 (Sept. 19-23, 2006) deal with the up-to-date problems of quantum field theory and elementary particle theory, high-energy electrodynamics in matter, QED processes in strong fields, nonlinear dynamics, kinetic theory, phase transitions in condensed matter, and physics of quantum liquids

176

76 FR 29750 - Filing Dates for the Nevada Special Election in the 2nd Congressional District  

Science.gov (United States)

Nevada has scheduled a Special General Election on September 13, 2011, to fill the U.S. House seat in the 2nd Congressional District formerly held by Senator Dean Heller. Committees required to file reports in connection with the Special General Election on September 13, 2011, shall file a 12-day Pre-General Report, and a 30-day Post-General...

2011-05-23

177

Evaluation of a Hand Washing Program for 2nd-Graders  

Science.gov (United States)

The purpose of this project was to determine if a multiple-week learner-centered hand washing program could improve hand hygiene behaviors of 2nd-graders in a northern Illinois public school system. Volunteers from the Rockford Hand Washing Coalition went into 19 different classrooms for 4 consecutive weeks and taught a learner-centered program.…

Tousman, Stuart; Arnold, Dani; Helland, Wealtha; Roth, Ruth; Heshelman, Nannatte; Castaneda, Oralia; Fischer, Emily; O'Neil, Kristen; Bileto, Stephanie

2007-01-01

178

Generating Variable Strength Covering Array for Combinatorial Software Testing with Greedy Strategy  

Directory of Open Access Journals (Sweden)

Full Text Available Combinatorial testing is a practical and efficient software testing technique that can detect faults triggered by interactions among factors in software. Compared to classic fixed-strength combinatorial testing, variable-strength combinatorial testing usually uses fewer test cases to detect more interaction faults, because it considers the actual interaction relationships in the software more thoroughly. For a model of variable-strength combinatorial testing proposed previously, two heuristic algorithms based on a one-test-at-a-time greedy strategy are proposed in this paper to generate variable-strength covering arrays as test suites for software testing. Experimental results show that, compared to some existing algorithms and tools, the two proposed algorithms have advantages in both execution effectiveness and the optimality of the size of the generated test suite.
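
A minimal sketch of the one-test-at-a-time greedy strategy the abstract refers to (this is not the paper's algorithm; the factor domains, the fixed strength t = 2, and the exhaustive candidate search are illustrative assumptions, and a variable-strength version would track different strengths for different factor groups):

```python
# One-test-at-a-time greedy generation of a strength-2 covering array.
# Exhaustive candidate enumeration keeps the sketch short; real tools
# sample or construct candidates instead.
from itertools import combinations, product

def greedy_covering_array(domains, strength=2):
    """domains: number of values per factor, e.g. [2, 3, 2]."""
    factors = range(len(domains))
    # Every value tuple that must be covered for every factor group.
    uncovered = {
        (group, values)
        for group in combinations(factors, strength)
        for values in product(*(range(domains[f]) for f in group))
    }
    tests = []
    while uncovered:
        best_test, best_gain = None, -1
        # Pick the candidate test covering the most still-uncovered tuples.
        for candidate in product(*(range(d) for d in domains)):
            gain = sum(
                1 for group, values in uncovered
                if tuple(candidate[f] for f in group) == values
            )
            if gain > best_gain:
                best_test, best_gain = candidate, gain
        tests.append(best_test)
        uncovered = {
            (g, v) for g, v in uncovered
            if tuple(best_test[f] for f in g) != v
        }
    return tests

print(greedy_covering_array([2, 3, 2]))  # a handful of tests instead of all 12
```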

Ziyuan Wang

2013-12-01

179

Verification of satellite EGSE test-software using an application generator  

Science.gov (United States)

Electrical Ground Support Equipment (EGSE) is used to perform checkout and testing procedures during the integration and testing of satellite sub-systems. The EGSE is a computer-based system that implements the concepts and standards of satellite integration and testing. The verification of spacecraft software is a well-covered subject area. Less literature is available concerning the Acceptance Test Procedure (ATP) to be performed on the EGSE database itself. This paper poses the problem of EGSE database and special software verification and suggests a solution implemented in the Offeq program.

Gutmanovitz, Avi; Yarchi, Zohar

1990-10-01

180

Automatic Generation of Machine Emulators: Efficient Synthesis of Robust Virtual Machines for Legacy Software Migration  

DEFF Research Database (Denmark)

As older mainframe architectures become obsolete, the corresponding legacy software is increasingly executed via platform emulators running on top of more modern commodity hardware. These emulators are virtual machines that often include a combination of interpreters and just-in-time compilers. Implementing interpreters and compilers for each combination of emulated and target platform independently of each other is a redundant and error-prone task. We describe an alternative approach that automatically synthesizes specialized virtual-machine interpreters and just-in-time compilers, which then execute on top of an existing software portability platform such as Java. The result is a considerably reduced implementation effort.
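
As a toy illustration of the interpreter half of such an emulator (this is not the authors' synthesis framework, which generates interpreters and just-in-time compilers on top of Java; the three-instruction legacy machine below is a made-up example):

```python
# Dispatch-loop interpreter for a hypothetical accumulator machine --
# the kind of component an emulator generator would emit from an
# instruction-set description rather than have a developer hand-write.
def run(program, memory):
    pc, acc = 0, 0
    while pc < len(program):
        op, arg = program[pc]
        if op == "LOAD":        # acc <- memory[arg]
            acc = memory[arg]
        elif op == "ADD":       # acc <- acc + memory[arg]
            acc += memory[arg]
        elif op == "STORE":     # memory[arg] <- acc
            memory[arg] = acc
        else:
            raise ValueError(f"unknown opcode {op}")
        pc += 1
    return memory

print(run([("LOAD", 0), ("ADD", 1), ("STORE", 2)], [2, 3, 0]))  # -> [2, 3, 5]
```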

Franz, Michael; Gal, Andreas

2006-01-01

 
 
 
 
181

Software concept of in-service diagnostic systems for nuclear steam generating facilities  

International Nuclear Information System (INIS)

The concept of software systems of in-service diagnostics is presented for the primary circuits of WWER-440 and WWER-1000 reactors. The basic and supplementary systems and user software are described for the collection, processing and evaluation of diagnostic signals from the primary circuits of the Dukovany and Bohunice nuclear power plants and the design is presented of the hierarchical structure of computers in the diagnostic systems of the Mochovce and Temelin nuclear power plants. The systems are operated using computers of Czechoslovak make of the ADT production series with operating systems RTE-II or DOS IV. (J.B.)

182

Modeling of wind turbines with doubly fed generator system  

CERN Document Server

Jens Fortmann describes the deduction of models for the grid integration of variable speed wind turbines and the reactive power control design of wind plants. The modeling part is intended as background to understand the theory, capabilities and limitations of the generic doubly fed generator and full converter wind turbine models described in IEC 61400-27-1 and as 2nd generation WECC models that are used as standard library models of wind turbines for grid simulation software. The focus of the reactive power control part is a deduction of the origin and theory behind the reactive current requ

Fortmann, Jens

2014-01-01

183

3rd harmonic electron cyclotron resonant heating absorption enhancement by 2nd harmonic heating at the same frequency in a tokamak  

International Nuclear Information System (INIS)

The fundamental mechanisms responsible for the interplay and synergy between the absorption dynamics of extraordinary-mode electron cyclotron waves at two different harmonic resonances (the 2nd and 3rd) are investigated in the TCV tokamak. An enhanced 3rd harmonic absorption in the presence of suprathermal electrons generated by 2nd harmonic heating is predicted by Fokker–Planck simulations, subject to complex alignment requirements in both physical space and momentum space. The experimental signature for the 2nd/3rd harmonic synergy is sought through the suprathermal bremsstrahlung emission in the hard x-ray range of photon energy. Using a synthetic diagnostic, the emission variation due to synergy is calculated as a function of the injected power and of the radial transport of suprathermal electrons. It is concluded that in the present experimental setup a synergy signature has not been unambiguously detected. The detectability of the synergy is then discussed with respect to variations and uncertainties in the plasma density and effective charge in view of future optimized experiments. (paper)

184

A proposed approach for developing next-generation computational electromagnetics software  

Energy Technology Data Exchange (ETDEWEB)

Computations have become a tool coequal with mathematics and measurements as a means of performing electromagnetic analysis and design. This is demonstrated by the volume of articles and meeting presentations in which computational electromagnetics (CEM) is routinely employed to address an increasing variety of problems. Yet, in spite of the substantial resources invested in CEM software over the past three decades, little real progress seems to have been made towards providing the EM engineer software tools having a functionality equivalent to that expected of hardware instrumentation. Furthermore, the bulk of CEM software now available is generally of limited applicability to large, complex problems because most modeling codes employ a single field propagator, or analytical form, of Maxwell's Equations. The acknowledged advantages of hybrid models, i.e., those which employ different propagators in differing regions of a problem, are relatively unexploited. The thrust of this discussion is to propose a new approach designed to address both problems outlined above, integrating advances being made in both software and hardware development. After briefly reviewing the evolution of modeling CEM software to date and pointing out the deficiencies thereof, we describe an approach for making CEM tools more truly "user friendly" called EMSES (Electromagnetic Modeling and Simulation Environment for Systems). This will be achieved through two main avenues. One is developing a common problem-description language implemented in a visual programming environment working together with a translator that produces the specific model description needed by various numerical treatments, in order to optimize user efficiency. The other is to employ a new modeling paradigm based on the idea of field propagators to expedite the development of the hybrid models that are needed to optimize computation efficiency.

Miller, E.K.; Kruger, R.P. (Los Alamos National Lab., NM (United States)); Moraites, S. (Simulated Life Systems, Inc., Chambersburg, PA (United States))

1993-01-01

185

A proposed approach for developing next-generation computational electromagnetics software  

Energy Technology Data Exchange (ETDEWEB)

Computations have become a tool coequal with mathematics and measurements as a means of performing electromagnetic analysis and design. This is demonstrated by the volume of articles and meeting presentations in which computational electromagnetics (CEM) is routinely employed to address an increasing variety of problems. Yet, in spite of the substantial resources invested in CEM software over the past three decades, little real progress seems to have been made towards providing the EM engineer software tools having a functionality equivalent to that expected of hardware instrumentation. Furthermore, the bulk of CEM software now available is generally of limited applicability to large, complex problems because most modeling codes employ a single field propagator, or analytical form, of Maxwell's Equations. The acknowledged advantages of hybrid models, i.e., those which employ different propagators in differing regions of a problem, are relatively unexploited. The thrust of this discussion is to propose a new approach designed to address both problems outlined above, integrating advances being made in both software and hardware development. After briefly reviewing the evolution of modeling CEM software to date and pointing out the deficiencies thereof, we describe an approach for making CEM tools more truly "user friendly" called EMSES (Electromagnetic Modeling and Simulation Environment for Systems). This will be achieved through two main avenues. One is developing a common problem-description language implemented in a visual programming environment working together with a translator that produces the specific model description needed by various numerical treatments, in order to optimize user efficiency. The other is to employ a new modeling paradigm based on the idea of field propagators to expedite the development of the hybrid models that are needed to optimize computation efficiency.

Miller, E.K.; Kruger, R.P. [Los Alamos National Lab., NM (United States); Moraites, S. [Simulated Life Systems, Inc., Chambersburg, PA (United States)

1993-02-01

186

Makahiki+WattDepot: An open source software stack for next generation energy research and education  

DEFF Research Database (Denmark)

The accelerating world-wide growth in demand for energy has led to the conceptualization of a “smart grid”, where a variety of decentralized, intermittent, renewable energy sources (for example, wind, solar, and wave) would provide most or all of the power required by small-scale “micro-grids” servicing hundreds to thousands of consumers. Such a smart grid will require consumers to transition from passive to active participation in order to optimize the efficiency and effectiveness of the grid’s electrical capabilities. This paper presents a software stack comprised of two open source software systems, Makahiki and WattDepot, which together are designed to engage consumers in energy issues through a combination of education, real-time feedback, incentives, and game mechanics. We detail the novel features of Makahiki and WattDepot, along with our initial experiences using them to implement an energy challenge called the Kukui Cup.

Johnson, Philip M.; Xu, Yongwen

2013-01-01

187

PRODUCTION OF 2ND GENERATION BIOETHANOL FROM LUCERNE – OPTIMIZATION OF HYDROTHERMAL PRETREATMENT  

Directory of Open Access Journals (Sweden)

Full Text Available Lucerne (Medicago sativa) has many qualities associated with sustainable agriculture such as nitrogen fixation and high biomass yield. Therefore, there is interest in whether lucerne is a suitable biomass substrate for bioethanol production, and if hydrothermal pretreatment (HTT) of lucerne improves enzymatic convertibility, providing sufficient enzymatic conversion of carbohydrate to simple sugars for ethanol production. The HTT process was optimised for lucerne hay, and the pretreated biomass was assessed by carbohydrate analysis, inhibitor characterisation of liquid phases, and by simultaneous saccharification and fermentation (SSF) of the whole slurry with Cellubrix enzymes and Saccharomyces cerevisiae yeast. The optimal HTT conditions were 205°C for 5 minutes, resulting in pentose recovery of 81%, and an enzymatic convertibility of glucan to monomeric glucose of 74%, facilitating a conversion of 6.2% w/w of untreated material into bioethanol in SSF, which is equivalent to 1,100 litres of ethanol per hectare per year
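
A rough consistency check of the two reported figures (the lucerne dry-matter yield of about 14 t/ha per year is an assumption introduced here, not stated in the abstract; the ethanol density is taken as 0.789 kg/L):

$$ 14\,000\ \text{kg DM/ha·yr} \times 0.062 \approx 868\ \text{kg ethanol/ha·yr}, \qquad \frac{868\ \text{kg}}{0.789\ \text{kg/L}} \approx 1\,100\ \text{L ethanol/ha·yr} $$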

Sune Tjalfe Thomsen,

2012-02-01

188

2nd generation lignocellulosic bioethanol: is torrefaction a possible approach to biomass pretreatment?  

Energy Technology Data Exchange (ETDEWEB)

Biomass pretreatment is a key and energy-consuming step for lignocellulosic ethanol production; it is largely responsible for the energy efficiency and economic sustainability of the process. A new approach to biomass pretreatment for the lignocellulosic bioethanol chain could be mild torrefaction. Among other effects, biomass torrefaction improves the grindability of fibrous materials, thus reducing energy demand for grinding the feedstock before hydrolysis, and opens the biomass structure, making it more accessible to enzymes for hydrolysis. The aim of the preliminary experiments carried out was to achieve a first understanding of the possibility of combining torrefaction and hydrolysis for lignocellulosic bioethanol processes, and to evaluate it in terms of sugar and ethanol yields. In addition, the possibility of hydrolyzing the torrefied biomass had not yet been proven. Biomass from olive pruning was torrefied at different conditions, namely 180-280 °C for 60-120 min, ground and then used as substrate in hydrolysis experiments. The bioconversion was carried out at flask scale using a mixture of cellulolytic, hemicellulolytic and β-glucosidase enzymes, and a commercial strain of Saccharomyces cerevisiae. The experiments demonstrated that torrefied biomass can be enzymatically hydrolyzed and fermented into ethanol, with yields comparable to ground untreated biomass while saving electrical energy. The comparison between the bioconversion yields achieved using only raw ground biomass or torrefied and ground biomass highlighted that: (1) mild torrefaction conditions limit sugar degradation to 5-10%; and (2) torrefied biomass does not lead to enzymatic and fermentation inhibition. Energy consumption for ethanol production has been preliminarily estimated, and three different pretreatment chains, i.e., raw biomass grinding, biomass torrefaction plus grinding, and steam explosion, were compared. Based on these preliminary results, steam explosion still has a significant advantage compared to the other two process chains. (orig.)

Chiaramonti, David; Rizzo, Andrea Maria; Prussi, Matteo [University of Florence, CREAR - Research Centre for Renewable Energy and RE-CORD, Florence (Italy); Tedeschi, Silvana; Zimbardi, Francesco; Braccio, Giacobbe; Viola, Egidio [ENEA - Laboratory of Technology and Equipment for Bioenergy and Solar Thermal, Rotondella (Italy); Pardelli, Paolo Taddei [Spike Renewables s.r.l., Florence (Italy)

2011-03-15

189

Generation of high brightness X-ray source and its medical applications (2nd report)  

International Nuclear Information System (INIS)

A laser produced plasma X-ray is one of the most feasible sources to be used for medical applications: Angiography, Protein crystallography and X-ray microscopy, etc. In the present paper, we have carried out fundamental experiments to investigate X-ray characterization, spectroscopy of X-rays from laser produced plasmas and the effect of debris from a target. Human lymphocyte cells have been observed with X-ray contact microscopy and atomic force microscopy. (author)

190

Utilisation of 2nd generation web technologies in master level vocational teacher training  

Directory of Open Access Journals (Sweden)

Full Text Available The Masters level Opportunities and Technological Innovation in Vocational Teacher Education project (project site: http://motivate.tmpk.bmf.hu/) aims to develop the use and management of virtual learning environments in the area of vocational teacher training, drawing on a well established international partnership of institutions providing both technical and educational expertise. This paper gives an overall picture of the first results and products of the collaboration. We touch upon the goals, the assessments and the learning process of using the “Multimedia and e-Learning: e-learning methods and tools” module in detail. The main cooperative and collaborative devices are presented in the virtual learning environment. The communication during collaborative learning, the structured debate on the forum, and the benefits of collaborative learning in a VLE are interpreted at the end of this paper.

Péter Tóth

2009-03-01

191

Reed canary grass as a feedstock for 2nd generation bioethanol production.  

Science.gov (United States)

The enzymatic hydrolysis and fermentation of reed canary grass, harvested in the spring or autumn, and of barley straw were studied. Steam pretreated materials were efficiently hydrolysed by commercial enzymes at a dosage of 10-20 FPU/g d.m. Reed canary grass harvested in the spring was hydrolysed more efficiently than the autumn-harvested reed canary grass. Additional β-glucosidase improved the release of glucose and xylose during the hydrolysis reaction. The hydrolysis rate and extent for reed canary grass with a commercial Trichoderma reesei cellulase could be improved by supplementation with purified enzymes. The addition of CBH II improved the hydrolysis level by 10% over 48 hours of hydrolysis. Efficient mixing was shown to be important for hydrolysis even at 10% dry matter consistency. The highest ethanol concentration (20 g/l) and yield (82%) were obtained with reed canary grass at 10% d.m. consistency. PMID:22939601

Kallioinen, Anne; Uusitalo, Jaana; Pahkala, Katri; Kontturi, Markku; Viikari, Liisa; Weymarn, Niklas von; Siika-Aho, Matti

2012-11-01

192

Development of Seismic Safety Assessment Technology for Containment Structure ( 2nd intermediate Report )  

Energy Technology Data Exchange (ETDEWEB)

This intermediate report is based on the research results in the field of seismic analysis and seismic margin assessment, carried out during the 2nd stage (2000.4.1 ~ 2001.3.31) with the financial support of MOST (Ministry of Science and Technology). The objective of this research is to develop a soil-structure interaction analysis technique with high reliability. The main research subjects performed during the 2nd stage are as follows: 1) Verification of the SSI analysis program developed in the 1st stage; 2) Improvement of the soil nonlinear characteristic evaluation technique using seismic data recorded at a downhole array; 3) Evaluation of the spatial variation of earthquake ground motion; 4) Database construction of Hualien earthquake recorded data. (author). 43 refs., 49 figs., 10 tabs.

Jang, J.B.; Suh, Y.P.; Lee, J.R. [Korea Electric Power Research Institute, Taejon (Korea)

2001-07-01

193

Twist decomposition of nonlocal light-cone operators II: general tensors of 2nd rank  

CERN Document Server

A group theoretical procedure, introduced in an earlier work to decompose bilocal light-ray operators into (harmonic) operators of definite twist, is applied to the case of arbitrary 2nd rank tensors. As a generic example the bilocal gluon operator is considered, which receives contributions of twist-2 up to twist-6 from four different symmetry classes characterized by corresponding Young tableaux; the twist decomposition of the related vector and scalar operators is also considered. In addition, we extend these results to various trilocal light-ray operators, such as the Shuryak-Vainshtein, three-gluon and four-quark operators, which are required for the consideration of higher-twist distribution amplitudes. The present results rely on the knowledge of harmonic tensor polynomials of any order n, which have been determined up to the case of a 2nd rank symmetric tensor for arbitrary space-time dimension.

Geyer, B

2000-01-01

194

Twist decomposition of nonlocal light-cone operators II: general tensors of 2nd rank  

International Nuclear Information System (INIS)

A group theoretical procedure, introduced in an earlier work to decompose bilocal light-ray operators into (harmonic) operators of definite twist, is applied to the case of arbitrary 2nd rank tensors. As a generic example the bilocal gluon operator is considered, which receives contributions of twist-2 up to twist-6 from four different symmetry classes characterized by corresponding Young tableaux; the twist decomposition of the related vector and scalar operators is also considered. In addition, we extend these results to various trilocal light-ray operators, such as the Shuryak-Vainshtein, three-gluon and four-quark operators, which are required for the consideration of higher-twist distribution amplitudes. The present results rely on the knowledge of harmonic tensor polynomials of any order n, which have been determined up to the case of a 2nd rank symmetric tensor for arbitrary space-time dimension

195

Crystal structures and phase transformation of deuterated lithium imide, Li2ND  

International Nuclear Information System (INIS)

We have investigated the crystal structure of deuterated lithium imide, Li2ND, by means of neutron and X-ray diffraction. An order-disorder transition occurs near 360 K. Below that temperature Li2ND can be described to the same level of accuracy as a disordered cubic (Fd-3m) structure with partially occupied Li 32e sites or as a fully occupied orthorhombic (Ima2 or Imm2) structure. The high temperature phase is best characterized as disordered cubic (Fm-3m) with D atoms randomized over the 192l sites. Density functional theory calculations complement and support the diffraction analyses. We compare our findings in detail with previous studies

196

Primary School 2nd Grade Students’ Perceptions Towards Standard Unit Of Length Measurement  

Directory of Open Access Journals (Sweden)

Full Text Available The purpose of this study was to determine primary school 2nd grade students’ perceptions of the standard unit of length measurement. The sample of the study consists of six 2nd grade students in a primary school in Eskişehir. These students were selected according to their success levels, with one girl and one boy at each level. In order to obtain data, semi-structured interview forms were used and each subject was interviewed individually. The content analysis method was used for analyzing the data. According to the results of the study, students perceive the standard unit of length measurement, the “meter”, as both a measuring unit and a measuring instrument.

Kürşat YENİLMEZ

2008-08-01

197

Technical Background Material for the Wave Generation Software AwaSys 5  

DEFF Research Database (Denmark)

"Les Appareils Generateurs de Houle en Laboratorie" presented by Bi¶esel and Suquet in 1951 discussed and solved the analytical problems concerning a number of di®erent wave generator types. For each wave maker type the paper presented the transfer function between wave maker displacement and wave amplitude in those cases where the analytical problem could be solved. The article therefore represented a giant step in wave generation techniques and found the basis for today's wave generation in hydraulics laboratories.

Frigaard, Peter; Andersen, Thomas Lykke

2010-01-01

198

Study of Nd2O3-GeO2-NdPO4 system  

International Nuclear Information System (INIS)

Phase ratios in the system Nd2O3-GeO2-NdPO4 at the temperatures of 1300-1400 deg C were ascertained by the method of X-ray phase analysis. Three compounds are formed in the system: Nd3GePO9, Nd8GeP2O19, Nd7Ge2PO17. Roentgenometric data on Nd7Ge2PO17 are given

199

Production of artificial ionospheric layers by frequency sweeping near the 2nd gyroharmonic  

Digital Repository Infrastructure Vision for European Research (DRIVER)

Artificial ionospheric plasmas descending from the background F-region have been observed on multiple occasions at the High Frequency Active Auroral Research Program (HAARP) facility since it reached full 3.6 MW power. Proximity of the transmitter frequency to the 2nd harmonic of the electron gyrofrequency (2fce) has been noted as a requirement for their occurrence, and their disappearance after only a few minutes has been attributed to the increasing...

Pedersen, T.; Mccarrick, M.; Reinisch, B.; Watkins, B.; Hamel, R.; Paznukhov, V.

2011-01-01

200

Report of the 2nd through 7th conferences of Special Committee on Nuclear Criticality Safety  

International Nuclear Information System (INIS)

The Special Committee on Nuclear Criticality Safety was established as a public committee of the Atomic Energy Society of Japan in November 1988. The main objective of this committee is to contribute to reasonable criticality safety design and control through extensive discussions among specialists in reactor physics, fuel treatment processes, radiation surveillance techniques and so on. The conferences were held seven times in total. This report covers the activities of this committee from the 2nd (1989) through the 7th (1992) conferences. (author)

 
 
 
 
201

Software tool for analysing the family shopping basket without candidate generation  

Directory of Open Access Journals (Sweden)

Full Text Available Tools that yield useful knowledge for supporting marketing decisions are currently needed in the e-commerce environment. This requires a process that uses a series of data-processing techniques; data mining is one such technique, enabling automatic information discovery. This work presents association rules as a suitable technique for discovering how customers buy from a company offering business-to-consumer (B2C) e-business, aimed at supporting decision-making in supplying its customers or capturing new ones. Many algorithms such as Apriori, DHP, Partition, FP-Growth and Eclat are available for implementing association rules; the following criteria were defined for selecting the appropriate algorithm: database insert, computational cost, performance and execution time. The development of a software tool is also presented, which followed the CRISP-DM approach; this software tool was formed by the following four sub-modules: data pre-processing, data mining, results analysis and results application. The application design used three-layer architecture: presentation logic, business logic and service logic. Data warehouse design and algorithm design were included in developing this data-mining software tool. It was tested using a FoodMart company database; the tests included performance, functionality and results’ validity, thereby allowing association rules to be found. The results led to the conclusion that using association rules as a data-mining technique facilitates analysing volumes of information for B2C e-business services, which represents a competitive advantage for those companies using the Internet as their sales medium.
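
A minimal sketch of the association-rule idea described above (this is a from-scratch toy, not the CRISP-DM tool or any of the named algorithms such as Apriori or FP-Growth; the transactions and the support/confidence thresholds are made-up examples):

```python
# Tiny frequent-itemset and association-rule miner over a toy basket list.
from itertools import combinations

transactions = [
    {"bread", "milk"},
    {"bread", "beer", "eggs"},
    {"milk", "beer", "bread"},
    {"milk", "beer"},
]
min_support, min_confidence = 0.5, 0.6

def support(itemset):
    """Fraction of transactions containing every item in itemset."""
    return sum(itemset <= t for t in transactions) / len(transactions)

# Frequent item pairs (size 2 is enough to illustrate the idea).
items = sorted({i for t in transactions for i in t})
frequent_pairs = [
    frozenset(c) for c in combinations(items, 2) if support(set(c)) >= min_support
]

# Rules X -> Y with confidence = support(X u Y) / support(X).
for itemset in frequent_pairs:
    for antecedent in itemset:
        conf = support(itemset) / support({antecedent})
        if conf >= min_confidence:
            consequent = next(iter(itemset - {antecedent}))
            print(f"{antecedent} -> {consequent}  "
                  f"support={support(itemset):.2f}  confidence={conf:.2f}")
```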

Roberto Carlos Naranjo Cuervo

2010-05-01

202

An overview of the Software Development Process for the NASA Langley Atmospheric Data Center Archive Next Generation system  

Science.gov (United States)

The Atmospheric Science Data Center (ASDC) at NASA Langley Research Center is responsible for the archive and distribution of Earth science data in the areas of radiation budget, clouds, aerosols, and tropospheric chemistry. The ASDC has developed and implemented the Archive Next Generation (ANGe) system, a state-of-the-art data ingest, archival, and distribution system to serve the atmospheric sciences data provider and user communities. The ANGe project follows a software development process that covers the full life-cycle of the system, from initial requirements to deployment to production to long-term maintenance of the software. The project uses several tools to support the different stages of the process, such as Subversion for source code control, JIRA for change management, Confluence for documentation and collaboration, and Bamboo for continuous integration. Based on our experience with developing ANGe and other projects at the ASDC, we also provide support for local science projects by setting up Subversion repositories and tools such as Trac, and providing training and support on their use. An overview of the software development process and the tools used to support it will be presented.

Piatko, P.; Perez, J.; Kinney, J. B.

2013-12-01

203

A luminescence spectroscopy study of SrI2:Nd3+ single crystals  

Energy Technology Data Exchange (ETDEWEB)

The paper presents the results of a study on the luminescence of SrI2:Nd3+ single crystals grown by the vertical Bridgman method. The photoluminescence (PL) spectra of SrI2:Nd crystals show characteristic lines corresponding to transitions in trivalent Nd3+ ions; the most intense line at ca. 1070 nm is due to the radiative 4F3/2 → 4I11/2 transitions. Efficient PL excitation in the crystal transparency band occurs due to optical 4f-4f transitions from the ground 4I9/2 state, or the charge transfer transitions I- → Nd3+. The paper discusses two new PL emission bands at 2.8 and 3.8 eV, associated with the lattice defects formed in the SrI2 crystal when introducing Nd3+ impurity ions; two intense narrow PL excitation bands at 5.52 and 5.31 eV, originating from free and defect-bound excitons, respectively; and an efficient channel of exciton energy transfer between the host lattice and defects. We calculated the H(k) functions of the distribution of the elementary relaxations over the reaction rate constants and explained on this basis the nonexponential PL decay kinetics in the SrI2:Nd3+ crystal. -- Author Highlights: • Time-resolved luminescence spectroscopy study of SrI2:Nd3+ single crystals. • The PL emission band at 1070 nm is due to 4F3/2 → 4I11/2 transitions in Nd3+ ions. • PL bands at 2.8 and 3.8 eV are due to lattice defects in SrI2:Nd3+ crystals. • An efficient channel of exciton energy transfer exists between the host lattice and defects. • The nonexponential PL decay kinetics is explained in terms of the Kohlrausch function.
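
The Kohlrausch (stretched-exponential) function mentioned in the final highlight has the standard form below; only the functional form is quoted here, the parameter values are not taken from the paper:

$$ I(t) = I_0 \exp\!\left[-\left(t/\tau\right)^{\beta}\right], \qquad 0 < \beta \le 1, $$

where τ is an effective relaxation time and β measures the width of the underlying distribution of relaxation rates H(k) (β = 1 recovers a single-exponential decay).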

Ogorodnikov, I.N., E-mail: i.n.ogorodnikov@gmail.com [Experimental Physics Department, Ural Federal University, 19, Mira Street, 620002 Ekaterinburg (Russian Federation); Pustovarov, V.A. [Experimental Physics Department, Ural Federal University, 19, Mira Street, 620002 Ekaterinburg (Russian Federation); Goloshumova, A.A.; Isaenko, L.I.; Yelisseyev, A.P.; Pashkov, V.M. [Institute of Geology and Mineralogy of Siberian Branch of RAS, 43, Russkaya Street, 630058 Novosibirsk (Russian Federation)

2013-11-15

204

Sustainable development - a role for nuclear power? 2nd scientific forum  

International Nuclear Information System (INIS)

The 2nd Scientific Forum of the International Atomic Energy Agency (IAEA) was held during the 43rd General Conference. This paper summarizes the deliberations of the two-day Forum. The definition of 'sustainable development' of the 1987 Brundtland Commission - 'development that meets the needs of the present without compromising the ability of future generations to meet their own needs' - provided the background for the Forum's debate on whether and how nuclear power could contribute to sustainable energy development. The framework for this debate comprises different perspectives on economic, energy, environmental, and political considerations. Nuclear power, along with all energy generating systems, should be judged on these considerations using a common set of criteria (e.g., emission levels, economics, public safety, wastes, and risks). First and foremost, there is a growing political concern over the possible adverse impact of increasing emissions of greenhouse gases from fossil fuel combustion. However, there is debate as to whether this would have any material impact on the predominantly economic criteria currently used to make investment decisions on energy production. According to the views expressed, the level of safety of existing nuclear power plants is no longer a major concern - a view not yet fully shared by the general public. The need to maintain the highest standards of safety in operation remains, especially under the mounting pressure of competitiveness in deregulated and liberalized energy markets. The industry must continuously reinforce a strong safety culture among reactor designers, builders, and operators. Furthermore, a convincing case for safety will have to be made for any new reactor designs. Of greater concern to the public and politicians are the issues of radioactive waste and proliferation of nuclear weapons. There is a consensus among technical experts that radioactive wastes from nuclear power can be disposed of safely and economically in deep geologic formations. However, the necessary political decisions to select sites for repositories need public support and understanding about what the industry is doing and what can be done. As to nuclear weapons proliferation, the existing safeguards system must be fully maintained and strengthened and inherently proliferation-resistant fuel cycles should be explored. Overviews of the future global energy demand and of the prospects for nuclear power in various economic regions of the world indicate that, in the case of the OECD countries, the dominant issue is economics in an increasingly free market system for electricity. For the so-called transition economies, countries of the Former Soviet Union and Central and Eastern Europe, the issue is one of managing nuclear power plant operations safely. In the case of developing countries, the dominant concern is effective management of technology, in addition to economics and finance. The prospects for nuclear power depend on the resolution of two cardinal issues. The first is economic competitiveness, and in particular, reduced capital cost. The second is public confidence in the ability of the industry to manage plant operations and its high level waste safely. There is a continuing need for dialogue and communication with all sectors of the public: economists, investors, social scientists, politicians, regulators, unions, and environmentalists. 
Of help in this dialogue would be nuclear power's relevance to and comparative advantages in addressing environmental issues, such as global climate change, local air quality, and regional acidification. Suggestions have been made for a globalized approach to critical nuclear power issues, such as waste management, innovative and proliferation-resistant reactors and fuel cycles, and international standards for new generation nuclear reactor designs.The conclusion seems to be that there is a role for nuclear energy in sustainable development, especially if greenhouse gas emissions are to be limited. Doubts persist in the minds of many energy experts over the pote

205

Software Reviews.  

Science.gov (United States)

Provides reviews of six computer software programs designed for use in elementary science education programs. Provides the title, publisher, grade level, and descriptions of courseware on ant farms, drugs, genetics, beachcombing, matter, and test generation. (TW)

Wulfson, Stephen, Ed.

1987-01-01

206

Development of Data Analysis Software for Diagnostic Eddy Current Probe (D-probe) for Steam Generator Tube Inspection  

International Nuclear Information System (INIS)

Occurrences of stress corrosion cracking in the steam generator tubes of nuclear power plants are closely related to the residual stress existing in regions of geometric change, that is, expansion transitions, u-bends, dings, dents, bulges, etc. Therefore, information on the location, type and quantitative size of a geometric anomaly existing in a tube is a prerequisite for non-destructive inspection activities aimed at root cause analysis, early detection of cracks, and prediction of further crack evolution. KAERI developed an innovative eddy current probe, the D-probe, equipped with the simultaneous dual functions of crack detection and 3-dimensional quantitative profile measurement. Its excellent performance has been verified through sampling inspections in several domestic nuclear power plants where various types of steam generator tube cracking were observed in operation. Qualified data analysis software should be furnished in order to deploy the D-probe for pre- and in-service inspection of commercial power plants. This paper introduces the PC-Windows based eddy current data analysis software which is being developed for the D-probe in cooperation with Zetec Inc

207

Software tools for automatic generation of finite element mesh and application of biomechanical calculation in medicine  

Directory of Open Access Journals (Sweden)

Full Text Available Cardiovascular diseases are common and a special difficulty in curing them is diagnostics. Modern medical instruments can provide data that are much more adequate for computer modeling. Computer simulations of blood flow through the cardiovascular organs give powerful advantages to scientists today. The motivation for this work is raw data that our Center recently received from a multislice CT scanner at the University Clinical Center in Heidelberg. In this work raw data from the CT scanner were used for creating a 3D model of the aorta. In this process we used Gmsh and TetGen (Hang Si) as well as our own software tools, and the result was the 8-node (brick) mesh on which the calculation was run. The results obtained were very satisfactory so...
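
For illustration only (this is not the toolchain built around Gmsh and TetGen described above): a few lines of Python showing what an 8-node brick mesh amounts to in memory, i.e. node coordinates plus per-element corner-node connectivity on a structured grid. Grid resolution and box dimensions are arbitrary assumptions.

```python
import numpy as np

def brick_mesh(nx, ny, nz, lx=1.0, ly=1.0, lz=1.0):
    """Structured 8-node hexahedral mesh on a box of size lx x ly x lz."""
    xs = np.linspace(0.0, lx, nx + 1)
    ys = np.linspace(0.0, ly, ny + 1)
    zs = np.linspace(0.0, lz, nz + 1)
    nodes = np.array([[x, y, z] for z in zs for y in ys for x in xs])

    def nid(i, j, k):  # global node index from grid position
        return i + j * (nx + 1) + k * (nx + 1) * (ny + 1)

    # Each element lists its 8 corner nodes: bottom face, then top face.
    elements = np.array([
        [nid(i, j, k), nid(i + 1, j, k), nid(i + 1, j + 1, k), nid(i, j + 1, k),
         nid(i, j, k + 1), nid(i + 1, j, k + 1),
         nid(i + 1, j + 1, k + 1), nid(i, j + 1, k + 1)]
        for k in range(nz) for j in range(ny) for i in range(nx)
    ])
    return nodes, elements

nodes, elements = brick_mesh(2, 2, 2)
print(nodes.shape, elements.shape)  # (27, 3) (8, 8)
```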

Milašinović Danko Z.

2008-01-01

208

MaRRS: A Software System for Generating Multimedia Radiology Reports using Adobe Acrobat  

Directory of Open Access Journals (Sweden)

Full Text Available Despite the proliferation of mature multimedia software technologies, radiology reports continue to lack image content and structure that would improve the ability of referring clinicians to fully interpret and analyze radiological findings. This paper introduces an intuitive and interactive radiology report authoring system that provides enhanced visual multimedia capabilities, structured content, and reduced report production time, using a well-known PDF program, Adobe Acrobat. The system, which we call the Multimedia Radiology Report System, or MaRRS, allows radiologists to quickly and simply create and deliver effective interactive multimedia medical reports. This paper will introduce MaRRS, outline some related radiology report systems, and describe the unique structure and functionality of MaRRS in order to demonstrate its advantages for both radiologists and referring clinicians.

Kristy Moniz

2010-06-01

209

Software and Algorithms for Solving Computational Geodynamic Problems using Next Generation Hardware  

Science.gov (United States)

Numerical geodynamic modeling is typically based on solving a series of partial differential equations which describe the long-term behavior of the solid visco-elasto-brittle/plastic Earth as a highly viscous incompressible fluid with strongly variable non-Newtonian viscosity. Coding for solving geodynamic equations is catching up with the advance of modern high performance computing. In the past five years, newly developed many-core computing technology, including GPU (Graphics Processing Unit) and MIC (Many Integrated Core), has also been utilized for geodynamic modeling. However, the lack of easy-to-expand or easy-to-use geo-computing toolkits limits the high performance software catching up with the endless updating of high performance hardware. In this presentation, we will firstly show two examples of the implementation of solving geodynamic problems based on Stokes and continuity equations with strongly variable viscosity using many-core hardware, with a specific focus on the GPU. The first example is a geometric multi-grid (GMG) solver, which solves a synthetic sinking cube problem using a staggered grid finite difference discretization. The second example is a preconditioned minimal residual (MINRES) solver for an incompressible Stokes flow problem with many viscous inclusions which is discretized using the finite element method. Through these two implementation examples, we will analyze the coding and runtime advantages and disadvantages of the two kinds of coding methodologies, in the hope of discussing a potential general coding flowchart for solving geodynamic equations using many-core devices. Finally, a software-stack-based many-core computing framework oriented to geodynamic modeling is proposed for the future.
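
As a hedged, CPU-only sketch of the second example's ingredients (this is not the presenters' GPU code; the 1-D variable-viscosity problem, the viscosity contrast, and the Jacobi preconditioner are illustrative assumptions chosen so the fragment stays self-contained):

```python
# Solve a 1-D variable-viscosity diffusion problem -(eta u')' = f with a
# Jacobi-preconditioned MINRES iteration, the Krylov method named above.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import minres, LinearOperator

n = 200                               # interior grid points
h = 1.0 / (n + 1)
x = np.linspace(h, 1.0 - h, n)
eta = np.where(x < 0.5, 1.0, 1.0e3)   # strong viscosity jump at x = 0.5

# Symmetric finite-difference operator with harmonic-mean face viscosities.
eta_face = np.empty(n + 1)
eta_face[1:-1] = 2.0 * eta[:-1] * eta[1:] / (eta[:-1] + eta[1:])
eta_face[0], eta_face[-1] = eta[0], eta[-1]
main = (eta_face[:-1] + eta_face[1:]) / h**2
off = -eta_face[1:-1] / h**2
A = sp.diags([off, main, off], [-1, 0, 1], format="csr")

b = np.ones(n)                        # unit forcing
M = LinearOperator((n, n), matvec=lambda r: r / main)  # Jacobi preconditioner

u, info = minres(A, b, M=M)
print("converged" if info == 0 else f"minres info = {info}",
      "max |u| =", np.abs(u).max())
```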

Zheng, Liang; Gerya, Taras

2014-05-01

210

Radiation protection for repairs of reactor's internals at the 2nd Unit of the Nuclear Power Plant Temelin  

International Nuclear Information System (INIS)

This presentation describes the process and extent of the repairs of reactor internals at the 2nd unit of the Nuclear Power Plant Temelin during the reactor shutdown. All work was optimized in terms of radiation protection of the workers.

211

Standardizing the next generation of bioinformatics software development with BioHDF (HDF5).  

Science.gov (United States)

Next Generation Sequencing technologies are limited by the lack of standard bioinformatics infrastructures that can reduce data storage, increase data processing performance, and integrate diverse information. HDF technologies address these requirements and have a long history of use in data-intensive science communities. They include general data file formats, libraries, and tools for working with the data. Compared to emerging standards, such as the SAM/BAM formats, HDF5-based systems demonstrate significantly better scalability, can support multiple indexes, store multiple data types, and are self-describing. For these reasons, HDF5 and its BioHDF extension are well suited for implementing data models to support the next generation of bioinformatics applications. PMID:20865556
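
A minimal sketch of the kind of usage the abstract advocates (this is generic h5py code, not the BioHDF schema or API; the file name, group layout, and read data are made-up assumptions):

```python
# Store a handful of short reads and their quality scores in an HDF5 file,
# showing the chunked, compressed, self-describing datasets argued for above.
import numpy as np
import h5py

reads = np.array([b"ACGTACGTAC", b"TTGCAGGTCA", b"GGATCCATGC"])
quals = np.array([[30] * 10, [28] * 10, [35] * 10], dtype=np.uint8)

with h5py.File("example_reads.h5", "w") as f:
    grp = f.create_group("example/reads")
    grp.create_dataset("sequence", data=reads, compression="gzip")
    grp.create_dataset("quality", data=quals, compression="gzip")
    grp.attrs["read_length"] = 10          # self-describing metadata

with h5py.File("example_reads.h5", "r") as f:
    seqs = f["example/reads/sequence"][:]
    print(len(seqs), "reads, first:", seqs[0].decode())
```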

Mason, Christopher E; Zumbo, Paul; Sanders, Stephan; Folk, Mike; Robinson, Dana; Aydt, Ruth; Gollery, Martin; Welsh, Mark; Olson, N Eric; Smith, Todd M

2010-01-01

212

Group field theory as the 2nd quantization of Loop Quantum Gravity  

CERN Document Server

We construct a 2nd quantized reformulation of canonical Loop Quantum Gravity at both kinematical and dynamical level, in terms of a Fock space of spin networks, and show in full generality that it leads directly to the Group Field Theory formalism. In particular, we show the correspondence between canonical LQG dynamics and GFT dynamics leading to a specific GFT model from any definition of quantum canonical dynamics of spin networks. We exemplify the correspondence of dynamics in the specific example of 3d quantum gravity. The correspondence between canonical LQG and covariant spin foam models is obtained via the GFT definition of the latter.

Oriti, Daniele

2013-01-01

213

PREFACE: 2nd International Meeting for Researchers in Materials and Plasma Technology  

Science.gov (United States)

These proceedings present the written contributions of the participants of the 2nd International Meeting for Researchers in Materials and Plasma Technology, 2nd IMRMPT, which was held from February 27 to March 2, 2013 at the Pontificia Bolivariana - Bucaramanga (UPB) and Industrial de Santander (UIS) universities, Bucaramanga, Colombia, organized by the research groups GINTEP-UPB and FITEK-UIS. The IMRMPT was the second of the biennial meetings that began in 2011. The three-day scientific program of the 2nd IMRMPT consisted of 14 keynote lectures, 42 oral presentations and 48 poster presentations, with the participation of undergraduate and graduate students, professors, researchers and entrepreneurs from Colombia, Russia, France, Venezuela, Brazil, Uruguay, Argentina, Peru, Mexico and the United States, among others. The objective of IMRMPT was to bring together national and international researchers in order to establish scientific cooperation in the field of materials science and plasma technology; introduce new techniques of surface treatment of materials to improve properties of metals in terms of deterioration due to corrosion, hydrogen embrittlement, abrasion, hardness, among others; and establish cooperation agreements between universities and industry. The topics covered in the 2nd IMRMPT include New Materials, Surface Physics, Laser and Hybrid Processes, Characterization of Materials, Thin Films and Nanomaterials, Surface Hardening Processes, Wear and Corrosion / Oxidation, Modeling, Simulation and Diagnostics, Plasma Applications and Technologies, Biomedical Coatings and Surface Treatments, Non Destructive Evaluation and Online Process Control, Surface Modification (Ion Implantation, Ion Nitriding, PVD, CVD). The editors hope that those interested in the area of materials science and plasma technology enjoy the reading, which reflects a wide range of topics. It is a pleasure to thank the sponsors and all the participants and contributors for making this international meeting of researchers possible. It should be noted that the event organized by the UIS and UPB universities, through their research groups FITEK and GINTEP, was a very significant contribution to the national and international scientific community, achieving the interaction of different research groups from academia and the business sector. On behalf of the research groups GINTEP-UPB and FITEK-UIS, we greatly appreciate the support provided by the sponsors, who allowed us to continue with the dream of research. Ely Dannier V-Niño, The Editor. The PDF file also contains a list of committees and sponsors.

Niño, Ely Dannier V.

2013-11-01

214

TF insert experiment log book. 2nd Experiment of CS model coil  

International Nuclear Information System (INIS)

The cool down of the CS model coil and TF insert was started on August 20, 2001. It took almost one month, and coil charging started on September 17, 2001. The charge test of the TF insert and CS model coil was completed on October 19, 2001. In this campaign, the total number of shots was 88 and the size of the data file in the DAS (Data Acquisition System) was about 4 GB. This report is a database that consists of the log list and the log sheets of every shot. This is an experiment logbook for the 2nd experiment of the CS model coil and TF insert charge test. (author)

215

Future Control and Automation : Proceedings of the 2nd International Conference on Future Control and Automation  

CERN Document Server

This volume, Future Control and Automation - Volume 1, includes the best papers selected from the 2012 2nd International Conference on Future Control and Automation (ICFCA 2012), held on July 1-2, 2012, in Changsha, China. Future control and automation is the use of control systems and information technologies to reduce the need for human work in the production of goods and services. This volume can be divided into five sessions on the basis of the classification of the manuscripts considered, listed as follows: Identification and Control; Navigation, Guidance and Sensors; Simulation Technology; Future Telecommunications and Control

2012-01-01

216

Groundwater Management in Mining Areas. Proceedings of the 2nd Image-Train Advanced Study Course  

International Nuclear Information System (INIS)

Innovative Management of Groundwater Resources in Europe - training and RTD coordination (IMAGE-TRAIN) has the ambition to improve cooperation and interaction between ongoing research projects in the field of soil and groundwater contamination and to communicate new technology achievements to young scientists by means of training courses. The 2nd IMAGE-TRAIN advanced study course focussed on mine water management. This report includes review papers of the keynote lectures dealing with flooded mines, mine water pollution, in-situ remediation technologies (uranium mine), and mine water regulation. Those reviews within the INIS database scope are indexed separately. (nevyjel)

217

jMHC: software assistant for multilocus genotyping of gene families using next-generation amplicon sequencing.  

Science.gov (United States)

Genotyping of multilocus gene families, such as the major histocompatibility complex (MHC), may be challenging because of problems with assigning alleles to loci and copy number variation among individuals. Simultaneous amplification and genotyping of multiple loci may be necessary, and in such cases, next-generation deep amplicon sequencing offers great promise as a genotyping method of choice. Here, we describe jMHC, a computer program developed for analysing and assisting in the visualization of deep amplicon sequencing data. The software operates on FASTA files; therefore, output from any sequencing technology may be used. jMHC was designed specifically for MHC studies but it may be useful for analysing amplicons derived from other multigene families or for genotyping other polymorphic systems. The program is written in Java with a user-friendly graphical user interface (GUI) and can be run on Microsoft Windows, Linux OS and Mac OS. PMID:21676201
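
A rough sketch of the first-pass summary such a tool builds on (this is unrelated to jMHC's actual Java implementation; the FASTA path and the ">individual|read_id" header convention are assumptions):

```python
# Read amplicon sequences from a FASTA file and tally identical variants
# per individual, the kind of per-amplicon count table genotyping starts from.
from collections import defaultdict

def read_fasta(path):
    header, seq = None, []
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if line.startswith(">"):
                if header is not None:
                    yield header, "".join(seq)
                header, seq = line[1:], []
            elif line:
                seq.append(line.upper())
    if header is not None:
        yield header, "".join(seq)

variant_counts = defaultdict(lambda: defaultdict(int))
for header, sequence in read_fasta("amplicons.fasta"):
    individual = header.split("|")[0]          # assumed header convention
    variant_counts[individual][sequence] += 1

for individual, counts in variant_counts.items():
    top = sorted(counts.items(), key=lambda kv: kv[1], reverse=True)[:5]
    print(individual, [(seq[:12] + "...", n) for seq, n in top])
```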

Stuglik, Michał T; Radwan, Jacek; Babik, Wiesław

2011-07-01

218

Development of new generation software tools for simulation of electron beam formation in novel high power gyrotrons  

International Nuclear Information System (INIS)

Computer aided design (CAD) based on numerical experiments performed by using adequate physical models and efficient simulation codes is an indispensable tool for development, investigation, and optimization of gyrotrons used as radiation sources for electron cyclotron resonance heating (ECRH) of fusion plasmas. In this paper, we review briefly the state-of-the-art in the field of modelling and simulation of intense, relativistic, helical electron beams formed in the electron-optical systems (EOS) of powerful gyrotrons. We discuss both the limitations of the known computer codes and the requirements for increasing their capabilities for solution of various design problems that are being envisaged in the development of the next generation gyrotrons for ECRH. Moreover, we present the concept followed by us in an attempt to unite the advantages of the modern programming techniques with self-consistent, first-principles 3D physical models in the creation of a new highly efficient and versatile software package for simulation of powerful gyrotrons

219

Development of new generation software tools for simulation of electron beam formation in novel high power gyrotrons  

Energy Technology Data Exchange (ETDEWEB)

Computer aided design (CAD) based on numerical experiments performed by using adequate physical models and efficient simulation codes is an indispensable tool for development, investigation, and optimization of gyrotrons used as radiation sources for electron cyclotron resonance heating (ECRH) of fusion plasmas. In this paper, we review briefly the state-of-the-art in the field of modelling and simulation of intense, relativistic, helical electron beams formed in the electron-optical systems (EOS) of powerful gyrotrons. We discuss both the limitations of the known computer codes and the requirements for increasing their capabilities for solution of various design problems that are being envisaged in the development of the next generation gyrotrons for ECRH. Moreover, we present the concept followed by us in an attempt to unite the advantages of the modern programming techniques with self-consistent, first-principles 3D physical models in the creation of a new highly efficient and versatile software package for simulation of powerful gyrotrons.

Sabchevski, S [Institute of Electronics, Bulgarian Academy of Sciences, BG-1784 Sofia (Bulgaria); Zhelyazkov, I [Faculty of Physics, Sofia University, BG-1164 Sofia (Bulgaria); Benova, E [Faculty of Physics, Sofia University, BG-1164 Sofia (Bulgaria); Atanassov, V [Institute of Electronics, Bulgarian Academy of Sciences, BG-1784 Sofia (Bulgaria); Dankov, P [Faculty of Physics, Sofia University, BG-1164 Sofia (Bulgaria); Thumm, M [Forschungszentrum Karlsruhe, Association EURATOM-FZK, Institute for Pulsed Power and Microwave Technology, D-76021 Karlsruhe (Germany); Dammertz, G [University of Karlsruhe, Institute of High Frequency Techniques and Electronics, D-76128 Karlsruhe (Germany); Piosczyk, B [Forschungszentrum Karlsruhe, Association EURATOM-FZK, Institute for Pulsed Power and Microwave Technology, D-76021 Karlsruhe (Germany); Illy, S [Forschungszentrum Karlsruhe, Association EURATOM-FZK, Institute for Pulsed Power and Microwave Technology, D-76021 Karlsruhe (Germany); Tran, M Q [Centre de Recherches en Physique des Plasmas, Association EURATOM-CRPP, Ecole Polytechnique Federale de Lausanne, CH-1015 Lausanne (Switzerland); Alberti, S [Centre de Recherches en Physique des Plasmas, Association EURATOM-CRPP, Ecole Polytechnique Federale de Lausanne, CH-1015 Lausanne (Switzerland); Hogge, J-Ph [Centre de Recherches en Physique des Plasmas, Association EURATOM-CRPP, Ecole Polytechnique Federale de Lausanne, CH-1015 Lausanne (Switzerland)

2006-07-15

220

Development of new generation software tools for simulation of electron beam formation in novel high power gyrotrons  

Science.gov (United States)

Computer aided design (CAD) based on numerical experiments performed by using adequate physical models and efficient simulation codes is an indispensable tool for development, investigation, and optimization of gyrotrons used as radiation sources for electron cyclotron resonance heating (ECRH) of fusion plasmas. In this paper, we review briefly the state-of-the-art in the field of modelling and simulation of intense, relativistic, helical electron beams formed in the electron-optical systems (EOS) of powerful gyrotrons. We discuss both the limitations of the known computer codes and the requirements for increasing their capabilities for solution of various design problems that are being envisaged in the development of the next generation gyrotrons for ECRH. Moreover, we present the concept followed by us in an attempt to unite the advantages of the modern programming techniques with self-consistent, first-principles 3D physical models in the creation of a new highly efficient and versatile software package for simulation of powerful gyrotrons.

Sabchevski, S.; Zhelyazkov, I.; Benova, E.; Atanassov, V.; Dankov, P.; Thumm, M.; Dammertz, G.; Piosczyk, B.; Illy, S.; Tran, M. Q.; Alberti, S.; Hogge, J.-Ph

2006-07-01

 
 
 
 
221

Roles of doping ions in afterglow properties of blue CaAl2O4:Eu2+,Nd3+ phosphors  

Science.gov (United States)

Eu2+ doped and Nd3+ co-doped calcium aluminate (CaAl2O4:Eu2+,Nd3+) phosphor was prepared by a urea-nitrate solution combustion method at furnace temperatures as low as 500 °C. The produced CaAl2O4:Eu2+,Nd3+ powder was investigated in terms of phase composition, morphology and luminescence by X-ray diffraction (XRD), Scanning Electron Microscopy (SEM), Fourier Transform Infrared spectroscopy (FTIR) and Photoluminescence (PL) techniques, respectively. XRD analysis depicts a dominant monoclinic phase that indicates no change in the crystalline structure of the phosphor with varying concentration of Eu2+ and Nd3+. SEM results show agglomerates with non-uniform shapes and sizes with a number of irregular network structures having lots of voids and pores. The Energy Dispersive X-ray Spectroscopy (EDS) and FTIR spectra confirm the expected chemical components of the phosphor. PL measurements indicated one broad excitation band from 200 to 300 nm centered around 240 nm, corresponding to the crystal field splitting of the Eu2+ d-orbital, and an emission spectrum in the blue region with a maximum at 440 nm. This is a strong indication that there was dominantly one luminescence center, Eu2+, which represents emission from transitions between the 4f7 ground state and the 4f6-5d1 excited state configuration. High concentrations of Eu2+ and Nd3+ generally reduce both the intensity and the lifetime of the phosphor powders. The optimized content of Eu2+ is 1 mol% and of Nd3+ is 1 mol% for the obtained phosphors with excellent optical properties. The phosphor also emits visible light at around 587 and 616 nm. Such emissions can be ascribed to the 5D0-7F1 and 5D0-7F2 intrinsic transitions of Eu3+, respectively. The decay characteristics exhibit a significant rise in initial intensity with increasing Eu2+ doping concentration, while the decay time increased with Nd3+ co-doping. The observed afterglow can be ascribed to the generation of suitable traps due to the presence of the Nd3+ ions.

Wako, A. H.; Dejene, B. F.; Swart, H. C.

2014-04-01

222

Characterization of the 1st and 2nd EF-hands of NADPH oxidase 5 by fluorescence, isothermal titration calorimetry, and circular dichroism  

Directory of Open Access Journals (Sweden)

Full Text Available Abstract Background Superoxide generated by non-phagocytic NADPH oxidases (NOXs) is of growing importance for physiology and pathobiology. The calcium binding domain (CaBD) of NOX5 contains four EF-hands, each binding one calcium ion. To better understand the metal binding properties of the 1st and 2nd EF-hands, we characterized the N-terminal half of CaBD (NCaBD) and its calcium-binding knockout mutants. Results The isothermal titration calorimetry measurements for NCaBD reveal that calcium binding at the two EF-hands is loosely coupled and can be treated as independent binding events. However, the Ca2+ binding studies on NCaBD(E31Q) and NCaBD(E63Q) showed their binding constants to be 6.5 × 10^5 and 5.0 × 10^2 M^-1 with ΔH values of -14 and -4 kJ/mol, respectively, suggesting that intrinsic calcium binding for the 1st non-canonical EF-hand is largely enhanced by the binding of Ca2+ to the 2nd canonical EF-hand. The fluorescence quenching and CD spectra support a conformational change upon Ca2+ binding, which shifts the Trp residues toward a more non-polar and exposed environment and also increases the α-helix secondary structure content. All measurements exclude Mg2+ binding in NCaBD. Conclusions We demonstrated that the 1st non-canonical EF-hand of NOX5 has very weak Ca2+ binding affinity compared with the 2nd canonical EF-hand. Both EF-hands interact with each other in a cooperative manner to enhance their Ca2+ binding affinity. Our characterization reveals that the two EF-hands in the N-terminal NOX5 are Ca2+ specific.

Wei Chin-Chuan

2012-04-01

223

Comparison of CO2, Nd:YAG and high power diode lasers for the ablation of tile grout  

Science.gov (United States)

The feasibility of a laser-based grout removal process and the effects of the processing parameters have been successfully examined using three different lasers: CO2, Nd:YAG and HPDL (high power diode laser). The grout removal rate was found to increase linearly with laser power level. In contrast, it was affected very little by laser traverse speed. The optimum removal rate was obtained with the CO2 laser, and this was believed to be due to the CO2 laser's wavelength. Surface morphological and material characterisation of the laser-removed grout was carried out. Depending on the laser used, the colour ranged from white to dark grey. Significant differences between the untreated and laser-treated samples, as well as between the samples treated by the different lasers, were observed. The CO2 laser produced relatively larger particles, with the Nd:YAG and HPDL generating finer particles of similar appearance. Energy dispersive X-ray (EDX) and X-ray diffraction (XRD) analysis of the epoxy grout chemical composition before and after the laser treatment revealed that CaCO3 (limestone) was decomposed to give CaO and CO2 during laser interaction. The existence of SiO2, Ti and dolomite was also found in the original grout material. Thermogravimetric and differential thermal analysis (TG-DTA) identified a sequence of three stages in the thermal history of the epoxy grout.

Minami, K.; Lawrence, J.; Li, L.; Edwards, R. E.; Gale, A. W.

2002-01-01

224

Proceedings of the 2nd JAERI symposium on HTGR technologies, October 21-23, 1992, Oarai, Japan  

International Nuclear Information System (INIS)

The Japan Atomic Energy Research Institute (JAERI) held the 2nd JAERI Symposium on HTGR Technologies on October 21 to 23, 1992, at the Oarai Park Hotel, Oarai-machi, Ibaraki-ken, Japan, with the support of the International Atomic Energy Agency (IAEA), the Science and Technology Agency of Japan and the Atomic Energy Society of Japan, on the occasion that construction of the High Temperature Engineering Test Reactor (HTTR), the first high temperature gas-cooled reactor (HTGR) in Japan, was proceeding smoothly. In this symposium, the present worldwide status of research and development (R and D) of HTGRs and the future perspectives of HTGR development were discussed in 47 papers, including 3 invited lectures, focusing on the present status of HTGR projects and perspectives of HTGR development, safety, operation experience, fuel and heat utilization. A panel discussion was also organized on how HTGRs can contribute to the preservation of the global environment. About 280 participants attended the symposium from Japan, Bangladesh, Germany, France, Indonesia, People's Republic of China, Poland, Russia, Switzerland, United Kingdom, United States of America, Venezuela and the IAEA. This volume was edited as the proceedings of the 2nd JAERI Symposium on HTGR Technologies, collecting the 47 papers presented in the oral and poster sessions along with 11 panel exhibitions on the results of research and development associated with the HTTR. (author)

225

Efficient FPGA implementation of 2nd order digital controllers using Matlab/Simulink  

Directory of Open Access Journals (Sweden)

Full Text Available This paper explains a method for the design and implementation of a digital controller based on a Field Programmable Gate Array (FPGA) device. It is more compact, more power efficient and provides higher speed than software-based PID controllers. The proposed method is based on implementing the digital controller as a digital filter using DSP architectures. The PID controller is designed using MATLAB and Simulink to generate a set of coefficients associated with the desired controller characteristics. The controller coefficients are then included in the VHDL code that implements the PID controller on the FPGA. A MATLAB program is used to design the PID controller and to calculate and plot the time response of the control system. The synthesis report summarizes the resource utilization of the selected FPGA.
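
To make the coefficient-generation step concrete, the sketch below shows one common way of turning continuous PID gains into the three constants of an incremental difference equation and quantizing them to fixed point, which is the form typically hard-coded as constants in VHDL. This is a generic illustration, not the paper's exact design flow: the backward-difference discretization, the sample time, the gains and the Q15 format are all assumptions, and the example is written in Python rather than MATLAB.

```python
def discrete_pid_coeffs(kp, ki, kd, ts):
    """Backward-difference discretization of a PID controller as the
    incremental (2nd-order) difference equation
        u[n] = u[n-1] + a0*e[n] + a1*e[n-1] + a2*e[n-2]."""
    a0 = kp + ki * ts + kd / ts
    a1 = -kp - 2.0 * kd / ts
    a2 = kd / ts
    return a0, a1, a2

def to_fixed_point(coeffs, frac_bits=15):
    """Quantize coefficients to signed fixed-point integers (e.g. Q15 constants for VHDL)."""
    return [int(round(c * (1 << frac_bits))) for c in coeffs]

# Hypothetical gains and a 1 kHz sample rate, purely for illustration
coeffs = discrete_pid_coeffs(kp=1.2, ki=0.5, kd=0.05, ts=1e-3)
print(coeffs, to_fixed_point(coeffs))
```

The floating-point coefficients are what a MATLAB design step would produce; the fixed-point values are what would end up as generics or constants in the synthesizable filter.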

Vikas gupta

2011-08-01

226

PCID and ASPIRE 2.0 - The Next Generation of AMOS Image Processing Software  

Science.gov (United States)

One of the missions of the Air Force Maui Optical and Supercomputing (AMOS) site is to generate high-resolution images of space objects using the Air Force telescopes located on Haleakala. Because atmospheric turbulence greatly reduces the resolution of space object images collected with ground-based telescopes, methods for overcoming atmospheric blurring are necessary. One such method is the use of adaptive optics systems to measure and compensate for atmospheric blurring in real time. A second method is to use image restoration algorithms on one or more short-exposure images of the space object under consideration. At AMOS, both methods are used routinely. In the case of adaptive optics, rarely can all atmospheric turbulence effects be removed from the imagery, so image restoration algorithms are useful even for adaptive-optics-corrected images. Historically, the bispectrum algorithm has been the primary image restoration algorithm used at AMOS. It has the advantages of being extremely fast (processing times of less than one second) and insensitive to atmospheric phase distortions. In addition, multi-frame blind deconvolution (MFBD) algorithms have also been used for image restoration. It has been observed empirically and with the use of computer simulation studies that MFBD algorithms produce higher-resolution image restorations than does the bispectrum algorithm. MFBD algorithms also do not need separate measurements of a star in order to work. However, in the past, MFBD algorithms have been factors of one hundred or more slower than the bispectrum algorithm, limiting their use to non-time-critical image restorations. Recently, with the financial support of AMOS and the High-Performance Computing Modernization Office, an MFBD algorithm called Physically-Constrained Iterative Deconvolution (PCID) has been efficiently parallelized and is able to produce image restorations in only a few seconds. In addition, with the financial support of AFOSR, it has been shown that PCID achieves or closely approaches the theoretical limits to image restoration quality for a variety of scenarios. For these reasons, PCID is now being transitioned to being the site-wide image restoration algorithm. Because the algorithm can be complicated to use, a GUI is being developed to be the front end to the PCID algorithm. This interface, called the Advanced SPeckle Image Reconstruction Environment (ASPIRE) version 2.0, is the next generation of the current ASPIRE GUI used as a front end to the bispectrum algorithm. ASPIRE 2.0 will be the front-end GUI to PCID, the bispectrum algorithm, and the AMOSphere database. In this presentation we describe ASPIRE 2.0 and PCID and how to use them to obtain high-resolution images.
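
As a rough illustration of the multi-frame deconvolution idea discussed above, the sketch below applies a non-blind multi-frame Richardson-Lucy update, in which the multiplicative corrections from several short-exposure frames are averaged at each iteration and the estimate is kept non-negative. This is only a simplified stand-in: PCID is a physically constrained blind deconvolution algorithm that also estimates the per-frame PSFs, whereas here the PSFs are assumed to be known.

```python
import numpy as np
from scipy.signal import fftconvolve

def multiframe_richardson_lucy(frames, psfs, n_iter=30, eps=1e-12):
    """Non-blind multi-frame Richardson-Lucy deconvolution (illustrative only).

    frames : list of 2-D observed images, one per short exposure
    psfs   : list of 2-D point-spread functions, one per frame (assumed known)
    """
    estimate = np.full_like(frames[0], frames[0].mean(), dtype=float)
    for _ in range(n_iter):
        correction = np.zeros_like(estimate)
        for d, h in zip(frames, psfs):
            blurred = fftconvolve(estimate, h, mode="same")
            ratio = d / np.maximum(blurred, eps)
            # correlation with the PSF = convolution with the flipped PSF
            correction += fftconvolve(ratio, h[::-1, ::-1], mode="same")
        estimate *= correction / len(frames)
        estimate = np.clip(estimate, 0, None)  # physical non-negativity constraint
    return estimate
```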

Matson, C.; Soo Hoo, T.; Murphy, M.; Calef, B.; Beckner, C.; You, S.

227

Move Table: An Intelligent Software Tool for Optimal Path Finding and Halt Schedule Generation  

Directory of Open Access Journals (Sweden)

Full Text Available This study aims to help army officials take decisions before war by finding the optimal path for troops moving between two points on a real-world digital terrain, considering factors such as travelled distance, terrain type, terrain slope, and the road network. There can optionally be one or more enemies (obstacles) located on the terrain which should be avoided. A tile-based A* search strategy with diagonal-distance and tie-breaker heuristics is proposed for finding the optimal path between source and destination nodes across a real-world 3-D terrain. A performance comparison (time analysis, search-space analysis, and accuracy) has been made between multiresolution A* search and the proposed tile-based A* search for large-scale digital terrain maps. Different heuristics, which are used by the algorithms to guide the search toward the goal node, are presented and compared to overcome some of the computational constraints associated with path finding on large digital terrains. Finally, a halt schedule is generated using the optimal path, weather conditions, moving time, and the priority and type of a column, so that senior military planners can decide in advance the times and locations where troops have to halt or overtake other troops, depending on their priority, as well as the time of reaching the destination.
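
A minimal sketch of a tile-based A* search with a diagonal-distance heuristic and a small tie-breaking weight is given below. It is a toy on an 8-connected cost grid, not the paper's terrain model: the cost values, the tie-break factor of 1.001 and the grid encoding (None for blocked tiles such as enemy positions) are all illustrative choices.

```python
import heapq

def diagonal_distance(a, b):
    # Octile/diagonal-distance heuristic for 8-connected grids
    dx, dy = abs(a[0] - b[0]), abs(a[1] - b[1])
    return max(dx, dy) + (2 ** 0.5 - 1) * min(dx, dy)

def astar(cost, start, goal, tie_break=1.001):
    """Tile-based A* over a 2-D cost grid; tie_break > 1 slightly inflates the
    heuristic so that, among paths of equal f, tiles nearer the goal are
    expanded first (fewer expansions)."""
    rows, cols = len(cost), len(cost[0])
    moves = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]
    g = {start: 0.0}
    parent = {start: None}
    open_heap = [(tie_break * diagonal_distance(start, goal), start)]
    closed = set()
    while open_heap:
        _, node = heapq.heappop(open_heap)
        if node == goal:
            path = []
            while node is not None:
                path.append(node)
                node = parent[node]
            return path[::-1]
        if node in closed:
            continue
        closed.add(node)
        for dr, dc in moves:
            r, c = node[0] + dr, node[1] + dc
            if not (0 <= r < rows and 0 <= c < cols) or cost[r][c] is None:
                continue
            step = (2 ** 0.5 if dr and dc else 1.0) * cost[r][c]
            ng = g[node] + step
            if ng < g.get((r, c), float("inf")):
                g[(r, c)] = ng
                parent[(r, c)] = node
                heapq.heappush(open_heap,
                               (ng + tie_break * diagonal_distance((r, c), goal), (r, c)))
    return None

# Toy terrain: uniform cost, two blocked tiles in the middle row
grid = [[1, 1, 1, 1],
        [1, None, None, 1],
        [1, 1, 1, 1]]
print(astar(grid, (0, 0), (2, 3)))
```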

Anupam Agrawal

2007-09-01

228

GONe: Software for estimating effective population size in species with generational overlap  

Science.gov (United States)

GONe is a user-friendly, Windows-based program for estimating effective size (Ne) in populations with overlapping generations. It uses the Jorde-Ryman modification to the temporal method to account for age structure in populations. This method requires estimates of age-specific survival and birth rate and allele frequencies measured in two or more consecutive cohorts. Allele frequencies are acquired by reading in genotypic data from files formatted for either GENEPOP or TEMPOFS. For each interval between consecutive cohorts, Ne is estimated at each locus and over all loci. Furthermore, Ne estimates are output for three different genetic drift estimators (Fs, Fc and Fk). Confidence intervals are derived from a chi-square distribution with degrees of freedom equal to the number of independent alleles. GONe has been validated over a wide range of Ne values, and for scenarios where survival and birth rates differ between sexes, sex ratios are unequal and reproductive variances differ. GONe is freely available for download. © 2011 Blackwell Publishing Ltd.
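
For orientation, the sketch below implements the classic discrete-generation temporal method (a Nei-Tajima Fc drift estimator with Waples' sampling correction and a chi-square confidence interval on F), which is the simpler ancestor of the Jorde-Ryman estimator GONe actually uses; it does not account for overlapping generations or age structure. The plan-II sampling assumption, the single-locus scope and the example numbers are all assumptions for illustration.

```python
import numpy as np
from scipy.stats import chi2

def ne_from_f(f, t, s0, st):
    """Plan-II temporal-method Ne estimate from a drift estimate F (Waples-style correction)."""
    f_drift = f - 1.0 / (2 * s0) - 1.0 / (2 * st)
    return t / (2.0 * f_drift) if f_drift > 0 else float("inf")

def temporal_ne(x, y, t, s0, st, alpha=0.05):
    """Single-locus temporal Ne estimate for two samples taken t generations apart.

    x, y   : allele-frequency arrays for the two samples
    s0, st : numbers of diploid individuals genotyped at each time point
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    # Nei-Tajima standardized variance of allele-frequency change (Fc)
    fc = np.mean((x - y) ** 2 / ((x + y) / 2.0 - x * y))
    ne = ne_from_f(fc, t, s0, st)
    # chi-square interval on Fc with (K - 1) independent alleles, propagated to Ne
    n = len(x) - 1
    f_lo = n * fc / chi2.ppf(1 - alpha / 2, n)
    f_hi = n * fc / chi2.ppf(alpha / 2, n)
    ci = sorted([ne_from_f(f_hi, t, s0, st), ne_from_f(f_lo, t, s0, st)])
    return ne, tuple(ci)

# Toy example: one locus, three alleles, cohorts sampled two generations apart
print(temporal_ne([0.6, 0.3, 0.1], [0.4, 0.4, 0.2], t=2, s0=50, st=50))
```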

Coombs, J.A.; Letcher, B.H.; Nislow, K.H.

2012-01-01

229

2nd INTERNATIONAL CONFERENCE ON ELECTRICAL SYSTEMS (ICES 2006), 8-10 May 2006  

Directory of Open Access Journals (Sweden)

Full Text Available The 2nd International Conference on Electrical Systems (ICES 2006) was held 8-10 May 2006 at Larbi Ben M’Hidi University, Oum El-Bouaghi, Algeria. This conference provides opportunities for professional engineers, particularly young engineers, from both industry and academia to share ideas and explore recent developments, current practices and future trends in all aspects of electrical systems and related fields. ICES 2006 was of similar standing to the previous conference (PCSE’05) in the high quality of the presentations, the technical content of the papers, and the number of delegates attending. As in PCSE’05, it had a broad theme, covering all aspects of electrical power engineering, and was attended by academics, researchers, consultants and members of the manufacturing and electrical supply industries. During the sessions, 86 papers selected from 300 submissions from 13 countries were presented and discussed.

Tarek Bouktir

2006-12-01

230

Nonlinear Dynamics of Memristor Based 2nd and 3rd Order Oscillators  

This thesis illustrates exceptional behaviours of the memristor in memristor-based second-order (Wien oscillator) and third-order (phase-shift oscillator) systems. Conventional concepts about sustained oscillation are challenged by demonstrating the possibility of sustained oscillation with an oscillating resistance and dynamic poles. Mathematical models are proposed for analysis, and simulations are presented to support the surprising characteristics of the memristor-based oscillator systems. The thesis also describes a comparative study among the Wien-family oscillators with one memristor. In the case of the phase-shift oscillator, one-memristor and three-memristor systems are illustrated and compared to generalize the nonlinear dynamics observed for both the 2nd order and 3rd order systems. Detailed explanations with analytical models are provided to clarify the unconventional properties of memristor-based oscillatory systems.

Talukdar, Abdul Hafiz

2011-05-01

231

Language and pragmatic profile in children with ADHD measured by Children's Communication Checklist 2nd edition.  

Science.gov (United States)

Abstract Objective. The aim of this study was to explore whether children with attention deficit hyperactivity disorder (ADHD) have language and/or pragmatic difficulties compared to typically developing children. Methods. Nineteen children with ADHD (age 5-12 years) and nineteen typically developing children (age 5-8 years) were evaluated using the Finnish version of Children's Communication Checklist 2nd edition (CCC-2). The CCC-2 questionnaire was filled in by their parents. Results. According to the CCC-2 questionnaire, differences between the groups were found in linguistic abilities, pragmatics skills, and social interaction. Conclusion. According to the CCC-2 profiles, many children with ADHD may have various kinds of communication difficulties, even if they do not have a diagnosed language disorder. PMID:24580020

Väisänen, Raija; Loukusa, Soile; Moilanen, Irma; Yliherva, Anneli

2014-12-01

232

2nd Canada-China joint workshop on supercritical-water-cooled reactors (CCSC-2010)  

International Nuclear Information System (INIS)

The 2nd Canada-China Joint Workshop on Supercritical-Water-Cooled Reactors (CCSC-2010) was held in Toronto, Ontario, Canada on April 25-25, 2010. This joint workshop aimed at providing a forum for discussion of advancements and issues, sharing information and technology transfer, and establishing future collaborations on research and developments for supercritical water-cooled reactors (SCWR) between Canadian and Chinese research organizations. Participants were those involved in research and development of SCWR core design, materials, chemistry, corrosion, thermalhydraulics, and safety analysis at organizations in Canada and China. Papers related to the following topics were of interest to the workshop: reactor core and fuel designs; materials, chemistry and corrosion; thermalhydraulics and safety analysis; balance of plant; and other applications.

233

Book Review: The Communicating Leader: The key to strategic alignment (2nd Ed)  

Directory of Open Access Journals (Sweden)

Full Text Available Title: The Communicating Leader: The key to strategic alignment (2nd Ed) Author: Gustav Puth Publisher: Van Schaik Publishers Reviewer: XC Birkenbach

The aim of the book, according to the author, is "meant to be a usable tool, an instrument in the toolbox of the real leader and leadership student". The book is written in a conversational style (as intended by the author), and the 219 pages of the 10 chapters are logically packaged into three parts. While the main emphasis is naturally on leadership and communication, the coverage includes topics typically encountered in Organisational Behaviour or Management texts, e.g., organizational culture, managing change, motivation, conflict management and strategic management.

X. C. Birkenbach

2003-10-01

234

Simultaneous vibrant soundbridge implantation and 2nd stage auricular reconstruction for microtia with aural atresia  

Directory of Open Access Journals (Sweden)

Full Text Available Aural atresia and severe microtia are associated malformations that result in problems with hearing and cosmesis, associated speech and language difficulties and diminished self-esteem. In cases where middle ear ossiculoplasty and aural atresia canalplasty are expected to give poor hearing outcomes that would eventually require the use of hearing aids, bone anchored hearing aids or active middle ear implants may be better options. This case report describes a simultaneous Vibrant Soundbridge implantation and 2nd stage auricular reconstruction with rib graft cartilage for an 11-year-old boy with grade III microtia and aural atresia 8 months after the 1st stage reconstruction. Audiometric results of the Vibrant Soundbridge aided ear were comparable to that of the contralateral hearing aid aided ear.

Jocelynne del Prado

2011-07-01

235

2nd FP7 Conference and International Summer School Nanotechnology : From Fundamental Research to Innovations  

CERN Document Server

This book presents some of the latest achievements in nanotechnology and nanomaterials from leading researchers in Ukraine, Europe, and beyond. It features contributions from participants in the 2nd International Summer School “Nanotechnology: From Fundamental Research to Innovations” and International Research and Practice Conference “Nanotechnology and Nanomaterials”, NANO-2013, which were held in Bukovel, Ukraine on August 25-September 1, 2013. These events took place within the framework of the European Commission FP7 project Nanotwinning, and were organized jointly by the Institute of Physics of the National Academy of Sciences of Ukraine, University of Tartu (Estonia), University of Turin (Italy), and Pierre and Marie Curie University (France). Internationally recognized experts from a wide range of universities and research institutions share their knowledge and key results on topics ranging from nanooptics, nanoplasmonics, and interface studies to energy storage and biomedical applications. Pr...

Yatsenko, Leonid

2015-01-01

236

Black Hole Evaporation and Generalized 2nd Law with Nonequilibrium Thermodynamics  

CERN Document Server

In general, when a black hole evaporates, a net energy flow arises from the black hole into its outside environment due to Hawking radiation and energy accretion onto the black hole. The existence of this energy flow means that the thermodynamic state of the whole system, which consists of a black hole and its environment, is in a nonequilibrium state. To understand the details of the evaporation process, the nonequilibrium effects of the energy flow should be taken into account. The nonequilibrium nature of black hole evaporation is a challenging topic involving issues not only of black hole physics but also of nonequilibrium physics. Using the nonequilibrium thermodynamics which has been formulated recently, this report shows: (1) the self-gravitational effect of the black hole, which appears as its negative heat capacity, guarantees the validity of the generalized 2nd law without entropy production inside the outside environment, (2) the nonequilibrium effect of the energy flow tends to shorten the evaporation time (life time) of the black hole, an...

Saida, Hiromi

2007-01-01

237

Textile Tectonics : 2nd Ventulett Symposium, Georgia Tech University, School of Architecture, Atlanta, November 2008  

DEFF Research Database (Denmark)

The meeting of architecture and textiles is a continuous but too often forgotten story of intimate exchange. The 2nd Ventulett Symposium, hosted by the College of Architecture within the Georgia Institute of Technology, Atlanta, GA, was one of those precious moments celebrating such a marriage. Organized by Lars Spuybroeck, principal of Nox, Rotterdam, and current Thomas W. Ventulett III distinguished chair of Architectural Design, the event embraced textile tectonics as its core topic, praising textiles as the key component of architecture, relying on Gottfried Semper’s understanding of the discipline. It was an inspiring time gathering some of the most exciting architects of the moment: Lars Spuybroeck, Mark Burry, Evan Douglis, Michael Hensel and Cecil Balmond were invited to discuss their understanding of tectonics. Full text available at http://textilefutures.co.uk/exchange/bin/view/TextileFutures/TextileTectonics

Mossé, Aurélie

238

Summary of the 2nd workshop on ion beam-applied biology  

International Nuclear Information System (INIS)

Induction of novel plant resources by ion beam irradiation has been investigated in JAERI. To share knowledge of the present status of the field and to find out future plants, the 1st Workshop on ion beam-applied biology, titled ''Development of breeding technique for ion beams'', was held last year. To further improve research cooperation and to exchange useful information in the field, researchers from within JAERI met with researchers from outside, such as those from agricultural experiment stations, companies, and universities, at the 2nd workshop on ion beam-applied biology, titled ''Future development of breeding technique for ion beams''. People from RIKEN, the Institute of Radiation Breeding, the Wakasa Wan Energy Research Center and the National Institute of Radiological Science also participated in this workshop. Twelve of the presented papers are indexed individually. (J.P.N.)

239

Micro-texture Analysis of 2nd Pilgered Zirconium Alloys by Electron Backscatter Diffraction  

International Nuclear Information System (INIS)

Texture control of crystallographic orientation is one of the important techniques for producing seamless zirconium tubes for nuclear fuel cladding materials. The texture of zirconium alloys can be analyzed by several methods such as X-ray, neutron and electron diffraction, each of which has its own advantages and disadvantages for texture analysis. Since thin seamless zirconium tubing is usually prepared from thick TREX by the pilgering process, the grains near the outer and inner surfaces experience different deformation forces during pilgering, which consequently makes the crystallographic orientation depend critically on the position of the grains in the tube. In this study, micro-texture analysis of pilgered zirconium alloys was carried out by electron backscatter diffraction (EBSD) to provide information for optimum fabrication conditions. Emphasis is on the analysis of crystallographic orientation with respect to position in 2nd pilgered zirconium tubings

240

2nd Canada-China joint workshop on supercritical-water-cooled reactors (CCSC-2010)  

Energy Technology Data Exchange (ETDEWEB)

The 2nd Canada-China Joint Workshop on Supercritical-Water-Cooled Reactors (CCSC-2010) was held in Toronto, Ontario, Canada on April 25-25, 2010. This joint workshop aimed at providing a forum for discussion of advancements and issues, sharing information and technology transfer, and establishing future collaborations on research and developments for supercritical water-cooled reactors (SCWR) between Canadian and Chinese research organizations. Participants were those involved in research and development of SCWR core design, materials, chemistry, corrosion, thermalhydraulics, and safety analysis at organizations in Canada and China. Papers related to the following topics were of interest to the workshop: reactor core and fuel designs; materials, chemistry and corrosion; thermalhydraulics and safety analysis; balance of plant; and other applications.

NONE

2010-07-01

 
 
 
 
241

2. slovenski MoodleMoot = 2nd Slovenian MoodleMoot  

Directory of Open Access Journals (Sweden)

Full Text Available Moodle, an open source learning management system, is becoming widely used and recognised all over the world. Slovenian Moodle users have been participating and sharing their experience in the Moodle.si community since 2006. The initiator of the Moodle.si community – the Faculty of Management Koper organised the first Slovenian MoodleMoot Conference last year. The event was organised again in May 2008. The conference was organised by the Centre for E-Learning of the Faculty of Management Koper in co-operation with the Open Source Centre – Slovenia, Artesia and the National School for Leadership in Education. This paper presents the 2nd International Moodle.si Conference.

Viktorija Sulcic

2008-09-01

242

Knowledge grows when shared : The Launch of OpenAIRE, 2nd December in Ghent  

DEFF Research Database (Denmark)

Knowledge is one of the few commodities that do not devalue when used. Actually, knowledge grows when shared, and free online access to peer-reviewed scientific publications is a potent ingredient in the process of sharing. The sharing of knowledge is facilitated by the Open Access movement. However, Open Access is much more than downloading the PDF. Vice President of the European Commission and European Digital Agenda Commissioner Neelie Kroes boldly presented this message in the opening session of the OpenAIRE launch. On 2nd December 2010, OpenAIRE, the European infrastructure for Open Access, was officially launched in Ghent, Belgium. This project and initiative facilitates the success of the Open Access Pilot in FP7, as presented earlier in this journal. In this brief article I present some of the most interesting issues that were discussed during the first session of the day.

Elbæk, Mikael Karstensen

2010-01-01

243

Using integrating spheres with wavelength modulation spectroscopy: effect of pathlength distribution on 2nd harmonic signals  

Science.gov (United States)

We have studied the effect on 2nd harmonic wavelength modulation spectroscopy of the use of integrating spheres as multipass gas cells. The gas lineshape becomes distorted at high concentrations, as a consequence of the exponential pathlength distribution of the sphere, introducing nonlinearity beyond that expected from the Beer-Lambert law. We have modelled this numerically for methane absorption at 1.651 µm, with gas concentrations in the range of 0-2.5 %vol in air. The results of this model compare well with experimental measurements. The nonlinearity for the 2f WMS measurements is larger than that for direct scan measurements; if this additional effect were not accounted for, the resulting error would be approximately 20 % of the reading at a concentration of 2.5 %vol methane.
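
The extra nonlinearity can already be seen from the exponential pathlength distribution alone: averaging the Beer-Lambert transmission over p(L) = (1/L̄)·exp(-L/L̄) gives T = 1/(1 + αL̄), which bends away from the single-path value exp(-αL̄) as absorbance grows. The short sketch below compares the two; the mean pathlength and absorption-coefficient values are purely illustrative and are not taken from the paper.

```python
import numpy as np

def transmission_single_path(alpha, L_mean):
    """Beer-Lambert transmission for a single path equal to the mean pathlength."""
    return np.exp(-alpha * L_mean)

def transmission_exponential_paths(alpha, L_mean):
    """Transmission averaged over an exponential pathlength distribution
    p(L) = (1/L_mean)*exp(-L/L_mean), as for an idealized integrating sphere:
    integral of p(L)*exp(-alpha*L) dL = 1/(1 + alpha*L_mean)."""
    return 1.0 / (1.0 + alpha * L_mean)

# Hypothetical numbers: 50 cm mean pathlength, absorbances up to alpha*L = 1
L_mean = 50.0                       # cm
for a in np.linspace(0.0, 0.02, 6):  # cm^-1
    print(f"alpha*L = {a * L_mean:4.1f}  "
          f"Beer-Lambert: {transmission_single_path(a, L_mean):.3f}  "
          f"sphere: {transmission_exponential_paths(a, L_mean):.3f}")
```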

Hodgkinson, J.; Masiyano, D.; Tatam, R. P.

2013-02-01

244

Belief Functions: Theory and Applications - Proceedings of the 2nd International Conference on Belief Functions  

CERN Document Server

The theory of belief functions, also known as evidence theory or Dempster-Shafer theory, was first introduced by Arthur P. Dempster in the context of statistical inference, and was later developed by Glenn Shafer as a general framework for modeling epistemic uncertainty. These early contributions have been the starting points of many important developments, including the Transferable Belief Model and the Theory of Hints. The theory of belief functions is now well established as a general framework for reasoning with uncertainty, and has well understood connections to other frameworks such as probability, possibility and imprecise probability theories.   This volume contains the proceedings of the 2nd International Conference on Belief Functions that was held in Compiègne, France on 9-11 May 2012. It gathers 51 contributions describing recent developments both on theoretical issues (including approximation methods, combination rules, continuous belief functions, graphical models and independence concepts) an...

Masson, Marie-Hélène

2012-01-01

245

Estimation of 2nd-order derivative thermodynamic properties using the crossover lattice equation of state  

International Nuclear Information System (INIS)

We apply the crossover lattice equation of state (xLF EOS) [M.S. Shin, Y. Lee, H. Kim, J. Chem. Thermodyn. 40 (2007) 174-179] to the calculation of thermodynamic 2nd-order derivative properties (isochoric heat capacity, isobaric heat capacity, isothermal compressibility, thermal expansion coefficient, Joule-Thomson coefficient, and sound speed). The equation of state is used to calculate these properties for pure systems (carbon dioxide and the normal alkanes from methane to propane). We show that, over a wide range of states, the equation of state yields properties with better accuracy than the lattice equation of state (LF EOS), and that near the critical region it represents the singular behavior well.
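
As a reminder of how such 2nd-order derivative properties follow from any pressure-explicit equation of state, the sketch below evaluates the isothermal compressibility, the isobaric expansivity and Cp - Cv by central finite differences, using a van der Waals model with CO2-like constants purely as a stand-in for the xLF EOS; the constants, the state point and the step sizes are assumptions for illustration.

```python
R = 8.314462618  # J mol^-1 K^-1

def p_vdw(T, V, a=0.3658, b=4.29e-5):
    """van der Waals pressure (CO2-like constants, SI molar units), used as a stand-in EOS."""
    return R * T / (V - b) - a / V**2

def second_order_props(p_eos, T, V, dT=1e-3, dV=1e-9):
    """Central finite differences of a pressure-explicit EOS p(T, V) giving
    isothermal compressibility, isobaric expansivity and Cp - Cv (per mole)."""
    dPdV = (p_eos(T, V + dV) - p_eos(T, V - dV)) / (2 * dV)
    dPdT = (p_eos(T + dT, V) - p_eos(T - dT, V)) / (2 * dT)
    kappa_T = -1.0 / (V * dPdV)          # -1/V (dV/dP)_T
    alpha_P = -dPdT / (V * dPdV)         # 1/V (dV/dT)_P via the triple-product rule
    cp_minus_cv = T * V * alpha_P**2 / kappa_T
    return kappa_T, alpha_P, cp_minus_cv

# Gas-like state point: 300 K, 2 L/mol
print(second_order_props(p_vdw, T=300.0, V=2.0e-3))
```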

246

Performance evaluation of Enhanced 2nd Order Gray Edge Color Constancy Algorithm Using Bilateral Filter  

Directory of Open Access Journals (Sweden)

Full Text Available Color constancy is an important pre-processing technique that reduces the effect of the light source on a given image or scene. The light source strongly affects a scene, and this effect can significantly degrade the performance of applications such as face recognition, object detection and lane detection. Color constancy is the ability to perceive color independently of the light source; it is a characteristic of the human color perception system which ensures that the apparent color of objects remains relatively constant under changing illumination conditions. The overall goal of this paper is to propose a new 2nd order gray edge based color constancy algorithm using a bilateral filter, with the aim of further enhancing color constancy performance. Histogram stretching is also used to improve the results. The comparison shows a significant improvement over the available techniques.
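
For reference, the general gray-edge framework estimates the illuminant from the Minkowski norm of (here 2nd-order) spatial derivatives of each color channel and then applies a diagonal von Kries correction. The sketch below is a minimal version of that framework; it assumes a simplified derivative magnitude and omits the paper's bilateral pre-filtering and histogram stretching, and the sigma, norm order and test image are illustrative.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def gray_edge_illuminant(img, order=2, p=6, sigma=2.0):
    """Gray-edge-style illuminant estimate.

    img   : float RGB image, shape (H, W, 3), values in [0, 1]
    order : derivative order (2 -> 2nd-order gray edge)
    p     : Minkowski norm
    The per-channel derivative magnitude is simplified to sqrt(f_xx^2 + f_yy^2).
    """
    e = np.zeros(3)
    for c in range(3):
        fxx = gaussian_filter(img[..., c], sigma, order=(order, 0))
        fyy = gaussian_filter(img[..., c], sigma, order=(0, order))
        mag = np.sqrt(fxx**2 + fyy**2)
        e[c] = np.mean(mag**p) ** (1.0 / p)
    return e / np.linalg.norm(e)

def correct_image(img, e):
    """Von Kries-style diagonal correction toward a neutral illuminant."""
    return np.clip(img / (np.sqrt(3.0) * e), 0, 1)

# Toy usage with a random image standing in for a photograph
rng = np.random.default_rng(0)
img = rng.random((64, 64, 3))
e = gray_edge_illuminant(img)
print("estimated illuminant direction:", e)
corrected = correct_image(img, e)
```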

Richa Dogra*1

2014-04-01

247

Analysis and implementation of reactor protection system circuits - case study: Egypt's 2nd research reactor  

International Nuclear Information System (INIS)

This work presents a way to design and implement the trip unit of a reactor protection system (RPS) using a field programmable gate array (FPGA). Instead of the traditional embedded microprocessor-based interface design method, a proposed tailor-made FPGA-based circuit is built to substitute the trip unit (TU) used in Egypt's 2nd research reactor, ETRR-2. The existing embedded system is built around the STD32 field computer bus, which is used in industrial and process control applications. It is modular, rugged, reliable and easy to use, is able to support a large mix of I/O cards, and can easily change its configuration in the future; therefore, the same bus is still used in the proposed design. The state machine of this bus is designed based on its timing diagrams and implemented in VHDL to interface with the designed TU circuit.

248

THINKLET: ELEMENTO CLAVE EN LA GENERACIÓN DE MÉTODOS COLABORATIVOS PARA EVALUAR USABILIDAD DE SOFTWARE / THINKLET: KEY ELEMENT IN THE COLLABORATIVE METHODS GENERATION FOR EVALUATE SOFTWARE USABILITY  

Scientific Electronic Library Online (English)

Full Text Available SciELO Colombia | Language: Spanish Abstract in spanish En la actualidad, la usabilidad es un atributo fundamental para el éxito de un producto software. La competitividad entre organizaciones obliga a mejorar el nivel de usabilidad de los productos, debido al riesgo que existe de perder clientes, si el producto no es fácil de usar y/o fácil de aprender. [...] Aunque se han establecido métodos para evaluar la usabilidad de productos software, la mayoría de estos métodos no consideran la posibilidad de involucrar a varias personas trabajando de forma colaborativa en el proceso de evaluación. Por esta razón, convendría utilizar la Metodología para el Diseño de Métodos de Evaluación de Usabilidad Colaborativos, de tal forma que se diseñen métodos que permitan a varias personas de diversas áreas de conocimiento, trabajar de forma colaborativa en el proceso de evaluación. Este artículo presenta de forma general, la metodología mencionada y hace especial énfasis en los thinklets, como elementos clave para el diseño de procesos colaborativos. Abstract in english Currently, usability is a critical attribute to success of software. The competition among organizations forces to improve the level of product usability due to the risk of losing customers if product would not be easy to use and/or easy to learn. Methods have been established to evaluate the usabil [...] ity of software products; however, most of these methods don't take into account the possibility to involve several people working collaboratively in the evaluation process. Therefore, Methodology for Design of Collaborative Usability Evaluation Methods should be used to design methods that allow several people from a range of knowledge areas to work collaboratively in the evaluation process. This paper presents the methodology mentioned and gives special emphasis on Thinklets, as key elements for design of collaborative processes.

Andrés, Solano Alegría; Yenny, Méndez Alegría; César, Collazos Ordóñez.

249

A software tool for 2D/3D visualization and analysis of phase-space data generated by Monte Carlo modelling of medical linear accelerators  

Energy Technology Data Exchange (ETDEWEB)

A computer program has been developed for novel 2D/3D visualization and analysis of the phase-space parameters of Monte Carlo simulations of medical accelerator radiation beams. The software is written in the IDL language and reads the phase-space data generated in the BEAMnrc/BEAM Monte Carlo code format. Contour and colour-wash plots of the fluence, mean energy, energy fluence, mean angle, spectra distribution, energy fluence distribution, angular distribution, and slices and projections of the 3D ZLAST distribution can be calculated and displayed. Based on our experience of using it at Massachusetts General Hospital, the software has proven to be a useful tool for analysis and verification of the Monte Carlo generated phase-space files. The software is in the public domain. (note)
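
To give a feel for the kind of maps such a tool produces, the sketch below builds a 2-D fluence map and a mean-energy map from phase-space particle arrays using weighted histograms. It is not the IDL tool and does not parse BEAMnrc phase-space files; the particle arrays are assumed to have been decoded elsewhere, and the field extent, bin count and synthetic test particles are illustrative.

```python
import numpy as np
import matplotlib.pyplot as plt

def fluence_and_mean_energy(x, y, energy, weight, extent=10.0, bins=128):
    """2-D fluence and mean-energy maps from phase-space particle arrays.

    x, y   : particle positions at the scoring plane (cm)
    energy : kinetic energies (MeV)
    weight : statistical weights
    """
    rng = [[-extent, extent], [-extent, extent]]
    fluence, xe, ye = np.histogram2d(x, y, bins=bins, range=rng, weights=weight)
    energy_sum, _, _ = np.histogram2d(x, y, bins=bins, range=rng, weights=weight * energy)
    mean_energy = np.divide(energy_sum, fluence, out=np.zeros_like(energy_sum),
                            where=fluence > 0)
    return fluence, mean_energy, (xe, ye)

# Toy usage with synthetic particles standing in for a decoded phase-space file
rng_ = np.random.default_rng(1)
n = 100_000
x, y = rng_.normal(0, 3, n), rng_.normal(0, 3, n)
e, w = rng_.uniform(0.1, 6.0, n), np.ones(n)
fluence, mean_e, _ = fluence_and_mean_energy(x, y, e, w)
plt.imshow(fluence.T, origin="lower", extent=[-10, 10, -10, 10])  # colour-wash fluence plot
plt.colorbar(); plt.show()
```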

Neicu, Toni; Aljarrah, Khaled M; Jiang, Steve B [Department of Radiation Oncology, Massachusetts General Hospital and Harvard Medical School, Boston, MA 02114 (United States)

2005-10-21

250

NOTE: A software tool for 2D/3D visualization and analysis of phase-space data generated by Monte Carlo modelling of medical linear accelerators  

Science.gov (United States)

A computer program has been developed for novel 2D/3D visualization and analysis of the phase-space parameters of Monte Carlo simulations of medical accelerator radiation beams. The software is written in the IDL language and reads the phase-space data generated in the BEAMnrc/BEAM Monte Carlo code format. Contour and colour-wash plots of the fluence, mean energy, energy fluence, mean angle, spectra distribution, energy fluence distribution, angular distribution, and slices and projections of the 3D ZLAST distribution can be calculated and displayed. Based on our experience of using it at Massachusetts General Hospital, the software has proven to be a useful tool for analysis and verification of the Monte Carlo generated phase-space files. The software is in the public domain.

Neicu, Toni; Aljarrah, Khaled M.; Jiang, Steve B.

2005-10-01

251

Herramienta software para el análisis de canasta de mercado sin selección de candidatos / Software tool for analysing the family shopping basket without candidate generation  

Scientific Electronic Library Online (English)

Full Text Available SciELO Colombia | Language: Spanish Abstract in spanish Actualmente en el entorno del comercio electrónico es necesario contar con herramientas que permitan obtener conocimiento útil que brinde soporte a la toma de decisiones de marketing; para ello se necesita de un proceso que utiliza una serie de técnicas para el procesamiento de los datos, entre ella [...] s se encuentra la minería de datos, que permite llevar a cabo un proceso de descubrimiento de información automático. Este trabajo tiene como objetivo presentar la técnica de reglas de asociación como la adecuada para descubrir cómo compran los clientes en una empresa que ofrece un servicio de comercio electrónico tipo B2C, con el fin de apoyar la toma de decisiones para desarrollar ofertas hacia sus clientes o cautivar nuevos. Para la implementación de las reglas de asociación existe una variedad de algoritmos como: A priori, DHP, Partition, FP-Growth y Eclat y para seleccionar el más adecuado se define una serie de criterios (Danger y Berlanga, 2001), entre los que se encuentran: inserciones a la base de datos, costo computacional, tiempo de ejecución y rendimiento, los cuales se analizaron en cada algoritmo para realizar la selección. Además, se presenta el desarrollo de una herramienta software que contempla la metodología CRISP-DM constituida por cuatro submódulos, así: Preprocesamiento de datos, Minería de datos, Análisis de resultados y Aplicación de resultados. El diseño de la aplicación utiliza una arquitectura de tres capas: Lógica de presentación, Lógica del Negocio y Lógica de servicios; dentro del proceso de construcción de la herramienta se incluye el diseño de la bodega de datos y el diseño de algoritmo como parte de la herramienta de minería de datos. Las pruebas hechas a la herramienta de minería de datos desarrollada se realizaron con una base de datos de la compañía FoodMart3. Estas pruebas fueron de: rendimiento, funcionalidad y confiabilidad en resultados, las cuales permiten encontrar reglas de asociación igualmente. Los resultados obtenidos facilitaron concluir, entre otros aspectos, que las reglas de asociación como técnica de minería de datos permiten analizar volúmenes de datos para servicios de comercio electrónico tipo B2C, lo cual es una ventaja competitiva para las empresas. Abstract in english Tools leading to useful knowledge being obtained for supporting marketing decisions being taken are currently needed in the ecommerce environment. A process is needed for this which uses a series of techniques for data-processing; data-mining is one such technique enabling automatic information disc [...] overy. This work presents the association rules as a suitable technique for discovering how customers buy from a company offering business to consumer (B2C) e-business, aimed at supporting decision-making in supplying its customers or capturing new ones. Many algorithms such as A priori, DHP, Partition, FP-Growth and Eclat are available for implementing association rules; the following criteria were defined for selecting the appropriate algorithm: database insert, computational cost, performance and execution time. The development of a software tool is also presented which involved the CRISP-DM approach; this software tool was formed by the following four sub-modules: data pre-processing, data-mining, results analysis and results application. The application design used three-layer architecture: presentation logic, business logic and service logic. 
Data warehouse design and algorithm design were included in developing this data-mining software tool. It was tested by using a FoodMart company database; the tests included performance, functionality and results’ validity, thereby allowing association rules to be found. The results led to concluding that using association rules as a data mining technique facilitates analysing volumes of information for B2C e-business services which represents a competitive advantage for those companies using Internet as their sales’ media.
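
To make the rule-mining step concrete, the sketch below computes support and confidence for association rules over a few toy shopping baskets with a brute-force frequent-itemset enumeration. It is only an illustration of the support/confidence idea, not an implementation of Apriori, DHP, Partition, FP-Growth or Eclat, and the thresholds and baskets are invented for the example.

```python
from itertools import combinations
from collections import defaultdict

def association_rules(transactions, min_support=0.3, min_confidence=0.6, max_len=3):
    """Tiny frequent-itemset / rule miner for illustration.

    transactions : list of sets of items (one set per shopping basket)
    Returns rules as (antecedent, consequent, support, confidence).
    """
    n = len(transactions)
    counts = defaultdict(int)
    for t in transactions:
        for size in range(1, min(max_len, len(t)) + 1):
            for itemset in combinations(sorted(t), size):
                counts[frozenset(itemset)] += 1
    frequent = {s: c / n for s, c in counts.items() if c / n >= min_support}
    rules = []
    for itemset, supp in frequent.items():
        if len(itemset) < 2:
            continue
        for k in range(1, len(itemset)):
            for antecedent in combinations(itemset, k):
                antecedent = frozenset(antecedent)
                conf = supp / frequent.get(antecedent, float("inf"))
                if conf >= min_confidence:
                    rules.append((set(antecedent), set(itemset - antecedent), supp, conf))
    return rules

baskets = [{"bread", "milk"}, {"bread", "beer", "eggs"},
           {"milk", "beer", "bread"}, {"bread", "milk", "eggs"}]
for a, c, s, cf in association_rules(baskets):
    print(f"{a} -> {c}  support={s:.2f}  confidence={cf:.2f}")
```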

Roberto Carlos, Naranjo Cuervo; Luz Marina, Sierra Martínez.

252

Herramienta software para el análisis de canasta de mercado sin selección de candidatos / Software tool for analysing the family shopping basket without candidate generation  

Scientific Electronic Library Online (English)

Full Text Available SciELO Colombia | Language: Spanish Abstract in spanish Actualmente en el entorno del comercio electrónico es necesario contar con herramientas que permitan obtener conocimiento útil que brinde soporte a la toma de decisiones de marketing; para ello se necesita de un proceso que utiliza una serie de técnicas para el procesamiento de los datos, entre ella [...] s se encuentra la minería de datos, que permite llevar a cabo un proceso de descubrimiento de información automático. Este trabajo tiene como objetivo presentar la técnica de reglas de asociación como la adecuada para descubrir cómo compran los clientes en una empresa que ofrece un servicio de comercio electrónico tipo B2C, con el fin de apoyar la toma de decisiones para desarrollar ofertas hacia sus clientes o cautivar nuevos. Para la implementación de las reglas de asociación existe una variedad de algoritmos como: A priori, DHP, Partition, FP-Growth y Eclat y para seleccionar el más adecuado se define una serie de criterios (Danger y Berlanga, 2001), entre los que se encuentran: inserciones a la base de datos, costo computacional, tiempo de ejecución y rendimiento, los cuales se analizaron en cada algoritmo para realizar la selección. Además, se presenta el desarrollo de una herramienta software que contempla la metodología CRISP-DM constituida por cuatro submódulos, así: Preprocesamiento de datos, Minería de datos, Análisis de resultados y Aplicación de resultados. El diseño de la aplicación utiliza una arquitectura de tres capas: Lógica de presentación, Lógica del Negocio y Lógica de servicios; dentro del proceso de construcción de la herramienta se incluye el diseño de la bodega de datos y el diseño de algoritmo como parte de la herramienta de minería de datos. Las pruebas hechas a la herramienta de minería de datos desarrollada se realizaron con una base de datos de la compañía FoodMart3. Estas pruebas fueron de: rendimiento, funcionalidad y confiabilidad en resultados, las cuales permiten encontrar reglas de asociación igualmente. Los resultados obtenidos facilitaron concluir, entre otros aspectos, que las reglas de asociación como técnica de minería de datos permiten analizar volúmenes de datos para servicios de comercio electrónico tipo B2C, lo cual es una ventaja competitiva para las empresas. Abstract in english Tools leading to useful knowledge being obtained for supporting marketing decisions being taken are currently needed in the ecommerce environment. A process is needed for this which uses a series of techniques for data-processing; data-mining is one such technique enabling automatic information disc [...] overy. This work presents the association rules as a suitable technique for discovering how customers buy from a company offering business to consumer (B2C) e-business, aimed at supporting decision-making in supplying its customers or capturing new ones. Many algorithms such as A priori, DHP, Partition, FP-Growth and Eclat are available for implementing association rules; the following criteria were defined for selecting the appropriate algorithm: database insert, computational cost, performance and execution time. The development of a software tool is also presented which involved the CRISP-DM approach; this software tool was formed by the following four sub-modules: data pre-processing, data-mining, results analysis and results application. The application design used three-layer architecture: presentation logic, business logic and service logic. 
Data warehouse design and algorithm design were included in developing this data-mining software tool. It was tested by using a FoodMart company database; the tests included performance, functionality and results’ validity, thereby allowing association rules to be found. The results led to concluding that using association rules as a data mining technique facilitates analysing volumes of information for B2C e-business services which represents a competitive advantage for those companies using Internet as their sales’ media.

Roberto Carlos, Naranjo Cuervo; Luz Marina, Sierra Martínez.

2009-04-01

253

Perspectives on Art Therapy: The Proceedings of the Pittsburgh Conference on Art Therapy (2nd, Pittsburgh, Pennsylvania, May 20, 1977).  

Science.gov (United States)

The proceedings of the 2nd annual Pittsburgh Conference on Art Therapy (with handicapped persons) consists of 44 items including full length papers, summaries of previously published papers, descriptions of workshops, and a limited number of abstracts (submitted by those who chose not to present a paper or workshop description). The papers are…

Roth, Ellen A., Ed.; Rubin, Judith A., Ed.

254

Mesocosm soil ecological risk assessment tool for GMO 2nd tier studies  

DEFF Research Database (Denmark)

Ecological Risk Assessment (ERA) of GMO is basically identical to ERA of chemical substances, when it comes to assessing specific effects of the GMO plant material on the soil ecosystem. The tiered approach always includes the option of studying more complex but still realistic ecosystem level effects in 2nd tier caged experimental systems, cf. the new GMO ERA guidance: EFSA Journal 2010; 8(11):1879. We propose to perform a trophic structure analysis, TSA, and include the trophic structure as an ecological endpoint to gain more direct insight into the change in interactions between species, i.e. the food-web structure, instead of relying only on the indirect evidence from population abundances. The approach was applied for effect assessment in the agro-ecosystem where we combined factors of elevated CO2, viz. global climate change, and GMO plant effects. A multi-species (Collembola, Acari and Enchytraeidae) mesocosm factorial experiment was set up in a greenhouse at ambient CO2 and 450 ppm CO2 with a GM barley variety and conventional varieties. The GM barley differed concerning the composition of amino acids in the grain (antisense C-hordein line). The fungicide carbendazim acted as a positive control. After 5 and 11 weeks, data on populations, plants and soil organic matter decomposition were evaluated. Natural abundances of stable isotopes, 13C and 15N, of animals, soil, plants and added organic matter (crushed maize leaves) were used to describe the soil food web structure.

D'Annibale, Alessandra; Maraldo, Kristine

255

DRS // CUMULUS Oslo 2013. The 2nd International Conference for Design Education Researchers  

Directory of Open Access Journals (Sweden)

Full Text Available 14-17 May 2013, Oslo, NorwayWe have received more than 200 full papers for the 2nd International Conference for Design Education Researchers in Oslo.This international conference is a springboard for sharing ideas and concepts about contemporary design education research. Contributors are invited to submit research that deals with different facets of contemporary approaches to design education research. All papers will be double-blind peer-reviewed. This conference is open to research in any aspect and discipline of design educationConference themeDesign Learning for Tomorrow - Design Education from Kindergarten to PhDDesigned artefacts and solutions influence our lives and values, both from a personal and societal perspective. Designers, decision makers, investors and consumers hold different positions in the design process, but they all make choices that will influence our future visual and material culture. To promote sustainability and meet global challenges for the future, professional designers are dependent on critical consumers and a design literate general public.  For this purpose design education is important for all. We propose that design education in general education represents both a foundation for professional design education and a vital requirement for developing the general public’s competence for informed decision making.REGISTRATION AT http://www.hioa.no/DRScumulus

Liv Merete Nielsen

2013-01-01

256

Multiferroicity in La1/2Nd1/2FeO3 nanoparticles  

Science.gov (United States)

Nano-sized La1/2Nd1/2FeO3 (LNF) powder is synthesized by the sol-gel citrate method. The Rietveld refinement of the X-ray diffraction profile of the sample at room temperature (303 K) shows the orthorhombic phase with Pbnm symmetry. The particle size is obtained by transmission electron microscope. The antiferromagnetic nature of the sample is explained using zero field cooled and field cooled magnetisation and the corresponding hysteresis loop. A signature of weak ferromagnetic phase is observed in LNF at low temperature which is explained on the basis of spin glass like behaviour of surface spins. The dielectric relaxation of the sample has been investigated using impedance spectroscopy in the frequency range from 42 Hz to 1 MHz and in the temperature range from 303 K to 513 K. The Cole-Cole model is used to analyse the dielectric relaxation of LNF. The frequency dependent conductivity spectra follow the power law. The magneto capacitance measurement of the sample confirms its multiferroic behaviour.
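
For reference, the Cole-Cole model used in the dielectric-relaxation analysis expresses the complex permittivity as eps*(w) = eps_inf + (eps_s - eps_inf)/(1 + (i*w*tau)^(1-alpha)), reducing to the Debye model for alpha = 0. The sketch below evaluates it over the reported 42 Hz-1 MHz window; the parameter values are hypothetical placeholders, not the fitted values for La1/2Nd1/2FeO3.

```python
import numpy as np

def cole_cole(omega, eps_inf, delta_eps, tau, alpha):
    """Cole-Cole complex permittivity:
        eps*(w) = eps_inf + delta_eps / (1 + (i*w*tau)**(1 - alpha))"""
    return eps_inf + delta_eps / (1.0 + (1j * omega * tau) ** (1.0 - alpha))

# Hypothetical parameters, frequency range matching the measurement window (42 Hz - 1 MHz)
f = np.logspace(np.log10(42.0), 6.0, 200)
eps = cole_cole(2 * np.pi * f, eps_inf=20.0, delta_eps=300.0, tau=1e-4, alpha=0.2)
print(eps.real[:3], (-eps.imag)[:3])  # real part and dielectric loss at the lowest frequencies
```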

Chanda, Sadhan; Saha, Sujoy; Dutta, Alo; Mahapatra, A. S.; Chakrabarti, P. K.; Kumar, Uday; Sinha, T. P.

2014-11-01

257

Consensus report of the 2nd International Forum for Liver MRI.  

Science.gov (United States)

Discussion at the 2nd Forum for Liver MRI: The International Primovist User Meeting on the use of the hepatocyte-specific contrast agent gadolinium-ethoxybenzyl-diethylene triamine penta-acetic acid (Gd-EOB-DTPA) is reported. Changes to the currently recommended Gd-EOB-DTPA imaging protocol were identified that can reduce the overall examination time. The potential benefits of 3-T MR imaging using Gd-EOB-DTPA have yet to be fully explored. Data show that Gd-EOB-DTPA-enhanced MRI allows identification of liver lesions and provides a differential diagnosis of hepatocellular nodules in the noncirrhotic and cirrhotic liver, based on vascularity, during the dynamic arterial, portal-venous and late phases, and during the hepatocyte-specific phase. Current European, American and Japanese guidelines for the diagnosis of hepatocellular carcinoma need to take into account the recent rapid advances in liver imaging. Based on published clinical trials and the experience of the attendees in the use of Gd-EOB-DTPA in liver imaging, a new simplified, non-invasive diagnostic algorithm was proposed that would be applicable to both Eastern and Western clinical practice in the evaluation of hepatocarcinogenesis and hepatocellular carcinoma. Preliminary clinical experience suggests that Gd-EOB-DTPA may also provide an innovative and cost-effective one-stop approach for staging rectal cancer using whole-body imaging. PMID:19851766

Tanimoto, Akihiro; Lee, Jeong Min; Murakami, Takamichi; Huppertz, Alexander; Kudo, Masatoshi; Grazioli, Luigi

2009-10-01

258

Reservoir Characteristics of 2nd Member of Jialingjiang Formation in Fuchengzhai Structure of East Sichuan  

Directory of Open Access Journals (Sweden)

Full Text Available Based on core description, thin-section identification, and physical property data analysis, the reservoir characteristics of the 2nd member of the Jialingjiang Formation in the Fuchengzhai area, eastern Sichuan, are studied in detail. The results show that the rock types of T1j2 are limestone, dolostone and cream rock, and the reservoir rocks are mainly crystalline dolostone and grain dolostone. The combined analysis of casting thin sections and scanning electron microscopy shows that the reservoir space mainly consists of secondary pores (inter-grain pores, inter-crystal pores, inter-crystal dissolved pores) and fractures. The bulk-property analysis indicates that the reservoir property of T1j2 is poor, belonging to fracture-pore and pore-fracture reservoirs. Favourable reservoirs are mainly developed in T1j22, less in T1j21, and least in T1j23. Moreover, the reservoirs of T1j2 are mainly controlled by rock types, the distribution of sedimentary facies, diagenesis, tectonic action, and so on, among which dolomite bank and tidal flat are favourable reservoir facies, dissolution and dolomitization contribute to diagenesis, and microfractures formed during tectonic activity are conducive to forming high-quality reservoirs and developing their permeability.

???

2013-08-01

259

Proteomics Data Collection - 2nd ProDaC Workshop 5 October 2007, Seoul, Korea.  

Science.gov (United States)

Proteomics Data Collection (ProDaC) is an EU-funded "Coordination Action" within the 6th framework programme. It aims to simplify the publication, dissemination and utilization of proteomics data by establishing standards that will support broad data collection from the research community. As a part of ProDaC, regular workshops are organized on a half-yearly basis to enable communication and discussion of the involved partners and to report on project progress. After the kick-off meeting (October 2006) in Long Beach, CA, USA and the 1st workshop in Lyon, France (April 2007), the 2nd ProDaC workshop took place at the COEX InterContinental Hotel in Seoul, Korea, on 5th October 2007, shortly before the HUPO World Congress. The progress achieved within the first year was presented by the leaders of the work packages. Additionally, a Journal's representative talked about his experiences and future plans concerning Proteomics standards; and two further external speakers presented their research related to data handling and Proteomics repositories. PMID:18318011

Eisenacher, Martin; Hardt, Tanja; Hamacher, Michael; Martens, Lennart; Häkkinen, Jari; Levander, Fredrik; Apweiler, Rolf; Meyer, Helmut E; Stephan, Christian

2008-04-01

260

Academic Training - 2nd Term: 08.01.2007 - 31.03.2007  

CERN Multimedia

2006 - 2007 ACADEMIC TRAINING PROGRAMME 2nd Term : 08.01.2007 - 31.03.2007 LECTURE SERIES Applied Superconductivity by V. Palmieri, INFN, Padova, It. 17, 18, 19 January 11:00-12:00 - Auditorium, Bldg 500 String Theory for Pedestrians by B. Zwiebach, M.I.T. Cambridge, USA 29, 30, 31 January 11:00-12:00 - Auditorium, Bldg 500 on 29, 30 January TH Auditorium on 31 January Introduction to Supersymmetry by D. Kaplan, Johns Hopkins University, Baltimore, USA 12, 13, 14, 15 February 11:00-12:00 - Auditorium, Bldg 500 The Hunt for the Higgs Particle by F. Zwirner, University of Padova, It 27, 28 February, 1st March 11:00-12:00 - Auditorium, Bldg 500 From Evolution Theory to Parallel and Distributed Genetic Programming by F. Fernandez de Vega 15, 16 March 11:00-12:00 - Auditorium, Bldg 500 The lectures are open to all those interested, without application. The abstract of the lectures, as well as any change to the above information (title, dates, time, place etc.) will be published in the CERN bulletin, the WWW, an...

2006-01-01

 
 
 
 
261

Academic Training - 2nd Term: 08.01.2007 - 31.03.2007  

CERN Multimedia

2006 - 2007 ACADEMIC TRAINING PROGRAMME 2nd Term : 08.01.2007 - 31.03.2007 LECTURE SERIES Applied Superconductivity by V. Palmieri, INFN, Padova, It. 17, 18, 19 January 11:00-12:00 - Auditorium, bldg. 500 String Theory for Pedestrians by B. Zwiebach, M.I.T. Cambridge, USA 29, 30, 31 January 11:00-12:00 - Auditorium, bldg. 500 on 29, 30 January TH Auditorium on 31 January Introduction to Supersymmetry by D. Kaplan, Johns Hopkins University, Baltimore, USA 12, 13, 14, 15 February 11:00-12:00 - Auditorium, bldg. 500 The Hunt for the Higgs Particle by F. Zwirner, University of Padova, It 27, 28 February, 1st March 11:00-12:00 - Auditorium, bldg. 500 From Evolution Theory to Parallel and Distributed Genetic Programming by F. Fernandez de Vega 15, 16 March 11:00-12:00 - Auditorium, bldg. 500 The lectures are open to all those interested, without application. The abstract of the lectures, as well as any change to the above information (title, dates, time, place etc.) will be published in the WWW, and ...

2006-01-01

262

The 2nd and 3rd lower molars development of in utero gamma irradiated mouse fetus and neonates  

International Nuclear Information System (INIS)

Pregnant mice were irradiated with a single dose of gamma rays (0, 2, 4 or 6 Gy, cobalt-60) on days 10, 12, 14, 16 or 18 of pregnancy. The heads of the embryos and of the neonates were collected at consecutive intervals after irradiation, from day 16 of pregnancy until the 3rd day after delivery. The effect of irradiation on the development of the 2nd and 3rd lower molars was investigated on serial tissue sections covering consecutive periods of their organogenesis. Irradiation led to growth deficiency in the 2nd and 3rd molars and caused a delay in their development, to a degree depending on the dose, the time of irradiation, and the time elapsed after irradiation. The delayed development was manifested in morphogenesis, histogenesis, and the cyto- and functional differentiation of odontoblasts and ameloblasts. The study showed that the two-day lag in the developmental stages of the 2nd lower molar, in controls, relative to the same process in the 1st lower molar does not diminish the delayed irradiation effect on the 2nd molar when compared with the immediate irradiation effect on the 1st molar (demonstrated in a previous study by Osman and Al-Achkar, 2001). On the contrary, the present study showed that the 2nd lower molar is more radiosensitive to the various doses than the 1st lower molar. It also showed that irradiation with the two doses of 4 and 6 Gy delays the formation of the 3rd lower molar's bud, and that the effect does not go beyond the lower molar. (Author)

263

GENERACIÓN AUTOMÁTICA DE APLICACIONES SOFTWARE A PARTIR DEL ESTANDAR MDA BASÁNDOSE EN LA METODOLOGÍA DE SISTEMAS EXPERTOS E INTELIGENCIA ARTIFICIAL / AUTOMATIC GENERATION OF SOFTWARE APPLICATIONS FROM STANDARD MDA STANDARD BASED ON THE METHOD OF ARTIFICIAL INTELLIGENCE AND EXPERT SYSTEMS  

Directory of Open Access Journals (Sweden)

Full Text Available Many studies have been presented on the automatic generation of lines of code. This article presents a solution to the limitations of a well-known tool, MDA (Model-Driven Architecture), by making use of technological advances in artificial intelligence and expert systems. It covers the principles of the MDA framework, transforming the models used and adding characteristics to them that make this working methodology more efficient. The proposed model covers the phases of the software life cycle, following the business rules that are an essential part of a real software project. It is with the business rules that the transformation of the MDA standard begins, and the aim is to contribute to automating the business rules in such a way that they serve to define the applications throughout the life cycle that generates them.

IVÁN MAURICIO RUEDA CÁCERES

2011-04-01

264

The influence of the 1st AlN and the 2nd GaN layers on properties of AlGaN/2nd AlN/2nd GaN/1st AlN/1st GaN structure  

Energy Technology Data Exchange (ETDEWEB)

This is a theoretical study of the effects of the 1st AlN interlayer and the 2nd GaN layer on the properties of the Al{sub 0.3}Ga{sub 0.7}N/2nd AlN/2nd GaN/1st AlN/1st GaN HEMT structure, carried out by self-consistently solving the coupled Schroedinger and Poisson equations. Our calculation shows that by increasing the 1st AlN thickness from 1.0 nm to 3.0 nm, the 2DEG, which is originally confined totally in the 2nd channel, gradually decreases there, begins to turn up in the 1st channel and eventually concentrates in the 1st one. The total 2DEG (2DEG in both channels) sheet density increases nearly linearly with increasing 1st AlN thickness. The slope of the potential profile of the AlGaN barrier changes with the 1st AlN thickness, causing the unusual dependence of the total 2DEG sheet density on the thickness of the AlGaN barrier. The variations of the 2DEG distribution, the total 2DEG sheet density and the conduction band profiles as a function of the 2nd GaN thickness have also been discussed. Their physical mechanisms have been investigated on the basis of the surface state theory. The confinement of the 2DEG can be further enhanced by the double-AlN interlayer, compared with an InGaN back-barrier. (orig.)
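The central computational step described above is a self-consistent Schroedinger-Poisson loop. As a rough illustration of how such a loop is typically organized (this is not the authors' code; the layer step, effective mass, permittivity, doping profile and damping factor below are invented placeholder values), a minimal 1D sketch in Python might look like this:

    import numpy as np
    from scipy.linalg import eigh_tridiagonal
    from scipy.sparse import diags
    from scipy.sparse.linalg import spsolve

    q, hbar, m0, eps0 = 1.602e-19, 1.055e-34, 9.109e-31, 8.854e-12
    kT = 0.0259                                    # eV at 300 K
    N, dz = 300, 0.25e-9                           # 75 nm grid
    z = np.arange(N) * dz
    m = 0.2 * m0                                   # effective mass, taken constant
    eps = 9.5 * eps0                               # permittivity, taken constant
    Ec0 = np.where(z < 25e-9, 0.5, 0.0)            # crude barrier/channel band step, eV
    Nd = 1e24 * np.exp(-((z - 20e-9) / 2e-9)**2)   # hypothetical donor profile, m^-3

    t = hbar**2 / (2 * m * dz**2) / q              # kinetic hopping term, eV
    phi = np.zeros(N)                              # electrostatic potential, V
    for it in range(300):
        Ec = Ec0 - phi                             # band edge seen by electrons, eV
        # lowest three bound states of the 1D Schroedinger equation
        E, psi = eigh_tridiagonal(Ec + 2*t, -t*np.ones(N-1),
                                  select='i', select_range=(0, 2))
        psi = psi / np.sqrt(dz)                    # normalize so sum(|psi|^2) dz = 1
        dos2d = m / (np.pi * hbar**2)              # 2D density of states, 1/(J m^2)
        n = sum(dos2d * kT*q * np.log1p(np.exp(-Ei / kT)) * psi[:, i]**2
                for i, Ei in enumerate(E))         # electron density, m^-3 (E_F = 0)
        # Poisson equation: eps * phi'' = -q*(Nd - n), with phi = 0 at both ends
        A = diags([np.ones(N-1), -2*np.ones(N), np.ones(N-1)], [-1, 0, 1]) * (eps / dz**2)
        phi_new = spsolve(A.tocsc(), -q * (Nd - n))
        if np.max(np.abs(phi_new - phi)) < 1e-6:
            break
        phi = phi + 0.05 * (phi_new - phi)         # damped update keeps the iteration stable

The damped mixing of the old and new potentials is the usual way to keep such a fixed-point iteration from oscillating; production solvers use more elaborate update schemes.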

Bi, Yang; Peng, EnChao; Lin, DeFeng [Chinese Academy of Sciences, Materials Science Center, Institute of Semiconductors, P.O. Box 912, Beijing (China); Wang, XiaoLiang; Yang, CuiBai; Xiao, HongLing; Wang, CuiMei; Feng, Chun; Jiang, LiJuan [Chinese Academy of Sciences, Materials Science Center, Institute of Semiconductors, P.O. Box 912, Beijing (China); Chinese Academy of Sciences, Key Laboratory of Semiconductor Materials Science, Institute of Semiconductors, P.O. Box 912, Beijing (China)

2011-09-15

265

Development of the Monte Carlo event generator tuning software package Lagrange and its application to tune the PYTHIA model to the LHCb data  

CERN Document Server

One of the general problems of modern high energy physics is the problem of comparing experimental data, measurements of observables in high energy collisions, to theory, which is represented by Monte Carlo simulations. This work is dedicated to further development of the tuning methodology and implementation of software tools for tuning the PYTHIA Monte Carlo event generator for the LHCb experiment. The aim of this thesis is to create a fast analytical model of the Monte Carlo event generator and then to fit the model to the experimental data recorded by the LHCb detector, considering statistical and computational uncertainties and estimating the best values for the tuned parameters, by simultaneously tuning a group of phenomenological parameters in a many-dimensional parameter space. The fitting algorithm is interfaced to the LHCb software framework, which models the response of the LHCb detector. Typically, the tunings are done to measurements which are corrected for detector effects. These correctio...
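The general idea of replacing the expensive generator with a fast analytical surrogate and then minimizing a chi-square against data can be sketched in a few lines. The toy "generator", the per-bin quadratic surrogate, the parameter ranges and the uncertainties below are assumptions for illustration, not the PYTHIA parameters or LHCb observables actually tuned in the thesis:

    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(1)
    n_par, n_bin, n_anchor = 2, 10, 30

    # stand-in "generator": each histogram bin responds smoothly to the parameters
    true_p = np.array([0.6, 0.3])
    def run_generator(p, stat=2000):
        base = np.linspace(1.0, 2.0, n_bin)
        mean = base * (1.0 + 0.8 * p[0]) + 0.5 * p[1]
        return mean + rng.normal(0, mean / np.sqrt(stat))   # MC statistical noise

    # 1) sample anchor points and build a per-bin quadratic surrogate in the parameters
    anchors = rng.uniform(0, 1, size=(n_anchor, n_par))
    def features(p):
        p = np.atleast_2d(p)
        return np.hstack([np.ones((len(p), 1)), p, p**2, (p[:, 0] * p[:, 1])[:, None]])
    Y = np.array([run_generator(p) for p in anchors])        # (n_anchor, n_bin)
    coef, *_ = np.linalg.lstsq(features(anchors), Y, rcond=None)
    def surrogate(p):
        return (features(p) @ coef)[0]

    # 2) "data" with uncertainties, then a chi-square fit of the surrogate to the data
    data = run_generator(true_p, stat=50000)
    sigma = 0.02 * data
    def chi2(p):
        r = (surrogate(p) - data) / sigma
        return np.sum(r**2)
    best = minimize(chi2, x0=np.full(n_par, 0.5), bounds=[(0, 1)] * n_par)
    print("tuned parameters:", best.x)

Because the surrogate is cheap to evaluate, the minimizer can explore the many-dimensional parameter space without rerunning the full simulation at every step, which is the point of this class of tuning methods.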

Popov, Dmitry; Hofmann, Werner

266

An open-source software tool for the generation of relaxation time maps in magnetic resonance imaging  

Directory of Open Access Journals (Sweden)

Full Text Available Abstract Background In magnetic resonance (MR) imaging, T1, T2 and T2* relaxation times represent characteristic tissue properties that can be quantified with the help of specific imaging strategies. While there are basic software tools for specific pulse sequences, until now there is no universal software program available to automate pixel-wise mapping of relaxation times from various types of images or MR systems. Such a software program would allow researchers to test and compare new imaging strategies and thus would significantly facilitate research in the area of quantitative tissue characterization. Results After defining requirements for a universal MR mapping tool, a software program named MRmap was created using a high-level graphics language. Additional features include a manual registration tool for source images with motion artifacts and a tabular DICOM viewer to examine pulse sequence parameters. MRmap was successfully tested on three different computer platforms with image data from three different MR system manufacturers and five different sorts of pulse sequences: multi-image inversion recovery T1; Look-Locker/TOMROP T1; modified Look-Locker (MOLLI) T1; single-echo T2/T2*; and multi-echo T2/T2*. Computing times varied between 2 and 113 seconds. Estimates of relaxation times compared favorably to those obtained from non-automated curve fitting. Completed maps were exported in DICOM format and could be read in standard software packages used for analysis of clinical and research MR data. Conclusions MRmap is a flexible cross-platform research tool that enables accurate mapping of relaxation times from various pulse sequences. The software allows researchers to optimize quantitative MR strategies in a manufacturer-independent fashion. The program and its source code were made available as open-source software on the internet.
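As a rough illustration of the pixel-wise fitting such a mapping tool performs (this is not MRmap's implementation, which is written in a high-level graphics language; the echo times, matrix size and noise level below are invented), a mono-exponential T2 fit over a small synthetic multi-echo stack could look like:

    import numpy as np
    from scipy.optimize import curve_fit

    echo_times = np.array([10., 20., 40., 60., 80.])          # ms (assumed values)
    def signal(te, s0, t2):
        return s0 * np.exp(-te / t2)                           # mono-exponential decay

    # synthetic 8x8 multi-echo stack standing in for the DICOM input
    rng = np.random.default_rng(0)
    true_t2 = rng.uniform(40, 120, size=(8, 8))                # ms
    stack = signal(echo_times[:, None, None], 1000.0, true_t2)
    stack += rng.normal(0, 5, stack.shape)                     # acquisition noise

    t2_map = np.zeros((8, 8))
    for i in range(8):
        for j in range(8):
            popt, _ = curve_fit(signal, echo_times, stack[:, i, j],
                                p0=(stack[0, i, j], 50.0))     # per-pixel curve fit
            t2_map[i, j] = popt[1]                             # fitted T2 in ms

T1 mapping from inversion-recovery or Look-Locker data follows the same per-pixel pattern with a different signal model.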

Kühne Titus

2010-07-01

267

La sorpresiva congruencia democrática del 2 de diciembre / The Surprising Democratic Congruence of December 2nd  

Scientific Electronic Library Online (English)

Full Text Available SciELO Venezuela | Language: Spanish. The article begins by underlining, and valuing positively, the fact that in today's polarized Venezuela the results of the referendum on the constitutional reform could be processed democratically and without violence. It then evaluates the political conditions prevailing in Venezuela after the overwhelming electoral victory of the President in December 2006. Prominent among these conditions are the so-called "five engines of the revolution" ("los cinco motores de la revolución"), one of which is "the constitutional reform". The author then points out the contents he considers most salient in the President's original reform proposal. He evaluates the reasons for the electoral results unfavourable to the reform proposal, considering the contents of the proposal itself, the weaknesses of the pro-government sector in that debate, and the strengths of the opposition. The article concludes by presenting the principal consequences of the December 2nd electoral results for Venezuela's socio-political reality.

Nikken, Pedro.

268

Conference Report on the 2nd International Symposium on Lithium Applications for Fusion Devices  

Science.gov (United States)

The 2nd International Symposium on Lithium Applications for Fusion Devices (ISLA-2011) was held on 27-29 April 2011 at the Princeton Plasma Physics Laboratory (PPPL) with broad participation from the community working on aspects of lithium research for fusion energy development. This community is expanding rapidly in many areas including experiments in magnetic confinement devices and a variety of lithium test stands, theory and modeling and developing innovative approaches. Overall, 53 presentations were given representing 26 institutions from 10 countries. The latest experimental results from nine magnetic fusion devices were given in 24 presentations, from NSTX (PPPL, USA), LTX (PPPL, USA), FT-U (ENEA, Italy), T-11M (TRINITY, RF), T-10 (Kurchatov Institute, RF), TJ-II (CIEMAT, Spain), EAST (ASIPP, China), HT-7 (ASIPP, China), and RFX (Padova, Italy). Sessions were devoted to: I. Lithium in magnetic confinement experiments (facility overviews), II. Lithium in magnetic confinement experiments (topical issues), III. Special session on liquid lithium technology, IV. Lithium laboratory test stands, V. Lithium theory/modeling/comments, VI. Innovative lithium applications and VII. Panel discussion on lithium PFC viability in magnetic fusion reactors. There was notable participation from the fusion technology communities, including the IFE, IFMIF and TBM communities providing productive exchanges with the physics oriented magnetic confinement lithium research groups. It was agreed to continue future exchanges of ideas and data to help develop attractive liquid lithium solutions for very challenging magnetic fusion issues, such as development of a high heat flux steady-state divertor concept and acceptable plasma disruption mitigation techniques while improving plasma performance with lithium. The next workshop will be held at ENEA, Frascati, Italy in 2013.

Ono, M.; Bell, M. G.; Hirooka, Y.; Kaita, R.; Kugel, H. W.; Mazzitelli, G.; Menard, J. E.; Mirnov, S. V.; Shimada, M.; Skinner, C. H.; Tabares, F. L.

2012-03-01

269

Re-fighting the 2nd Anglo-Boer War: historians in the trenches  

Directory of Open Access Journals (Sweden)

Full Text Available Some one hundred years ago, South Africa was torn apart by the 2nd Anglo-Boer War (1899-1902). The war was a colossal psychological experience fought at great expense: it cost Britain twenty-two thousand men and £223 million. The social, economic and political cost to South Africa was greater than the statistics immediately indicate: at least ten thousand fighting men in addition to the camp deaths, where a combination of indifference and incompetence resulted in the deaths of 27 927 Boers and at least 14 154 Black South Africans. Yet these numbers belie the consequences. It was easy for the British to 'forget' the pain of the war, which seemed so insignificant after the losses sustained in 1914-18. With a long history of far-off battles and foreign wars, the British casualties of the Anglo-Boer War became increasingly insignificant, as opposed to the lesser numbers held in the collective Afrikaner mind. This impact may be stated somewhat more candidly in terms of the war participation ratio for the belligerent populations. After all, not all South Africans fought in uniform. For the Australian colonies these varied between 4½ per thousand (New South Wales) and 42.3 per thousand (Tasmania); New Zealand 8 per thousand, Britain 8½ per thousand, and Canada 12.3 per thousand; while in parts of South Africa this was perhaps as high as 900 per thousand. The deaths and high South African participation ratio, together with the unjustness of the war in the eyes of most Afrikaners, introduced bitterness, if not a hatred, which has cast long shadows upon twentieth-century South Africa.

Ian Van der Waag

2012-02-01

270

Comparison of strong gravitational lens model software II. HydraLens: Computer-assisted strong gravitational lens model generation and translation  

Science.gov (United States)

The behavior of strong gravitational lens model software in the analysis of lens models is not necessarily consistent among the various software available, suggesting that the use of several models may enhance the understanding of the system being studied. Among the publicly available codes, the model input files are heterogeneous, making the creation of multiple models tedious. An enhanced method of creating model files and a method to easily create multiple models may increase the number of comparison studies. HydraLens simplifies the creation of model files for four strong gravitational lens model software packages, including Lenstool, Gravlens/Lensmodel, glafic and PixeLens, using a custom-designed GUI for each of the four codes that simplifies the entry of the model for each of these codes, obviating the need to consult user manuals to set the values of the many flags and data fields. HydraLens is designed in a modular fashion, which simplifies the addition of other strong gravitational lens codes in the future. HydraLens can also translate a model generated for any of these four software packages into any of the other three. Models created using HydraLens may require slight modifications, since some information may be lost in the translation process. However, the computer-generated model greatly simplifies the process of developing multiple lens models. HydraLens may enhance the number of direct software comparison studies and also assist in the education of young investigators in gravitational lens modeling. Future development of HydraLens will further enhance its capabilities.
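The design idea, keeping one neutral internal description of the lens model and emitting it in each code's input syntax, can be sketched abstractly. The field names and the two output formats below are invented placeholders, not the real Lenstool, Gravlens/Lensmodel, glafic or PixeLens file formats:

    # one neutral description of a single lens component (all values illustrative)
    model = {
        "profile": "SIE",          # singular isothermal ellipsoid
        "x": 0.10, "y": -0.20,     # centre, arcsec
        "ellipticity": 0.30,
        "position_angle": 45.0,    # degrees
        "sigma_v": 250.0,          # velocity dispersion, km/s
    }

    def write_keyword_style(m):
        """Writer for a hypothetical keyword-per-line input format."""
        return "\n".join(f"{key} {value}" for key, value in m.items())

    def write_single_line_style(m):
        """Writer for a hypothetical one-line, positional input format."""
        order = ["profile", "sigma_v", "x", "y", "ellipticity", "position_angle"]
        return "lens " + " ".join(str(m[k]) for k in order)

    print(write_keyword_style(model))
    print(write_single_line_style(model))

Translation between codes then amounts to parsing one format into the neutral dictionary and writing it back out with another writer, with any fields that have no counterpart flagged for manual review.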

Lefor, A. T.

2014-07-01

271

FACSGen 2.0 animation software: generating three-dimensional FACS-valid facial expressions for emotion research.  

Science.gov (United States)

In this article, we present FACSGen 2.0, new animation software for creating static and dynamic three-dimensional facial expressions on the basis of the Facial Action Coding System (FACS). FACSGen permits total control over the action units (AUs), which can be animated at all levels of intensity and applied alone or in combination to an infinite number of faces. In two studies, we tested the validity of the software for the AU appearance defined in the FACS manual and the conveyed emotionality of FACSGen expressions. In Experiment 1, four FACS-certified coders evaluated the complete set of 35 single AUs and 54 AU combinations for AU presence or absence, appearance quality, intensity, and asymmetry. In Experiment 2, lay participants performed a recognition task on emotional expressions created with FACSGen software and rated the similarity of expressions displayed by human and FACSGen faces. Results showed good to excellent classification levels for all AUs by the four FACS coders, suggesting that the AUs are valid exemplars of FACS specifications. Lay participants' recognition rates for nine emotions were high, and comparisons of human and FACSGen expressions were very similar. The findings demonstrate the effectiveness of the software in producing reliable and emotionally valid expressions, and suggest its application in numerous scientific areas, including perception, emotion, and clinical and neuroscience research. PMID:22251045

Krumhuber, Eva G; Tamarit, Lucas; Roesch, Etienne B; Scherer, Klaus R

2012-04-01

272

Interaction Between Short-Term Heat Pretreatment and Avermectin On 2nd Instar Larvae of Diamondback Moth, Plutella Xylostella (Linn)  

Digital Repository Infrastructure Vision for European Research (DRIVER)

Based on the cooperative virulence index (c.f.), the interaction effect between short-term heat pretreatment and avermectin on 2nd instar larvae of diamondback moth (DBM), Plutella xylostella (Linnaeus), was assessed. The results suggested that the interaction results between short-term heat pretreatment and avermectin on the tested insects varied with temperature level as well as its duration and avermectin concentration. Interaction between heat pretreatment at 30°C and avermectin mainly r...

Gu, Xiaojun; Tian, Sufen; Wang, Dehui; Gao, Fei

2009-01-01

273

Polythermal of solubility of Mg(NO3)2-Nd(NO3)3-H2O system  

International Nuclear Information System (INIS)

Phase equilibria in the system magnesium nitrate-neodymium nitrate-water have been studied by the isothermal method at 25, 50 and 65 deg C. Formation of the ternary compound 3Mg(NO3)2·2Nd(NO3)3·24H2O has been detected. The individual character of magnesium-neodymium nitrate has been confirmed by chemical and X-ray phase methods of analysis. 11 refs.; 2 tabs

274

On possibility of low-threshold two-plasmon decay instability in 2nd harmonic ECRH experiments at toroidal devices  

Digital Repository Infrastructure Vision for European Research (DRIVER)

The effects of the parametric decay of the 2nd harmonic X-mode into two short-wavelength UH plasmons propagating in opposite directions are considered. The possibility of absolute instability excitation is demonstrated in the case of a density profile possessing a local maximum slightly exceeding the UH resonance value. The threshold of the absolute instability is shown to be substantially smaller than that provided by the standard theory for a monotonous density profile.

Gusakov, E. Z.; Popov, A. Yu.

2012-01-01

275

Physical properties of double perovskite-type barium neodymium osmate Ba2NdOsO6  

International Nuclear Information System (INIS)

The crystal, magnetic structures and physical properties of the double perovskite-type barium neodymium osmate Ba2NdOsO6 are investigated through powder X-ray and neutron diffraction, electrical conductivity, magnetic susceptibility, and specific heat measurements. The Rietveld analysis reveals that the Nd and Os ions are arranged with regularity over the six-coordinate B sites in a distorted perovskite ABO3 framework. The monoclinic crystal structure described by space group P21/n (tilt system a⁻a⁻c⁺) becomes more distorted with decreasing temperature from 300 K down to 2.5 K. This compound shows a long-range antiferromagnetic ordering of Os5+ below 65 K. An antiferromagnetic ordering of Nd3+ also occurs at lower temperatures (~20 K). The magnetic structure is of Type I and the magnetic moments of Nd3+ and Os5+ ions are in the same direction in the ab-plane. - Graphical Abstract: The magnetic structure of Ba2NdOsO6 is of Type I, and the magnetic moments of the Nd3+ and Os5+ ions are in the same direction in the ab-plane. Highlights: • Crystal structures of Ba2NdOsO6 are determined to be monoclinic below 300 K. • Its electrical resistivity shows a Mott variable-range hopping behavior with localized carriers. • An antiferromagnetic ordering of the Os5+ moment occurs at 65 K. • The magnetic structure of Ba2NdOsO6 is determined to be of Type I.

276

Physical properties of double perovskite-type barium neodymium osmate Ba{sub 2}NdOsO{sub 6}  

Energy Technology Data Exchange (ETDEWEB)

The crystal, magnetic structures and physical properties of the double perovskite-type barium neodymium osmate Ba{sub 2}NdOsO{sub 6} are investigated through powder X-ray and neutron diffraction, electrical conductivity, magnetic susceptibility, and specific heat measurements. The Rietveld analysis reveals that the Nd and Os ions are arranged with regularity over the six-coordinate B sites in a distorted perovskite ABO{sub 3} framework. The monoclinic crystal structure described by space group P2{sub 1}/n (tilt system a{sup -}a{sup -}c{sup +}) becomes more distorted with decreasing temperature from 300 K down to 2.5 K. This compound shows a long-range antiferromagnetic ordering of Os{sup 5+} below 65 K. An antiferromagnetic ordering of Nd{sup 3+} also occurs at lower temperatures ({approx}20 K). The magnetic structure is of Type I and the magnetic moments of Nd{sup 3+} and Os{sup 5+} ions are in the same direction in the ab-plane. - Graphical Abstract: The Magnetic structure of Ba{sub 2}NdOsO{sub 6} is of Type I, and the magnetic moments of the Nd{sup 3+} and Os{sup 5+} ions are in the same direction in the ab-plane. Highlights: Black-Right-Pointing-Pointer Crystal structures of Ba{sub 2}NdOsO{sub 6} are determined to be monoclinic below 300 K. Black-Right-Pointing-Pointer Its electrical resistivity shows a Mott variable-range hopping behavior with localized carriers. Black-Right-Pointing-Pointer An antiferromagnetic ordering of the Os{sup 5+}moment occurs at 65 K. Black-Right-Pointing-Pointer The magnetic structure of Ba{sub 2}NdOsO{sub 6} is determined to be of Type I.

Wakeshima, Makoto, E-mail: wake@sci.hokudai.ac.jp [Hokkaido University, Division of Chemistry, Graduate School of Science, Sapporo-shi, Hokkaido 060-0810 (Japan); Hinatsu, Yukio [Hokkaido University, Division of Chemistry, Graduate School of Science, Sapporo-shi, Hokkaido 060-0810 (Japan); Ohoyama, Kenji [Institute of Materials Research, Tohoku University, Sendai 980-8577 (Japan)

2013-01-15

277

On possibility of low-threshold two-plasmon decay instability in 2nd harmonic ECRH experiments at toroidal devices  

Directory of Open Access Journals (Sweden)

Full Text Available The effects of the parametric decay of the 2nd harmonic X-mode into two short-wavelength UH plasmons propagating in opposite directions are considered. The possibility of absolute instability excitation is demonstrated in the case of a density profile possessing a local maximum slightly exceeding the UH resonance value. The threshold of the absolute instability is shown to be substantially smaller than that provided by the standard theory for a monotonous density profile.

Gusakov E. Z.

2012-09-01

278

Investigations of near IR photoluminescence properties in TiO{sub 2}:Nd,Yb materials using hyperspectral imaging methods  

Energy Technology Data Exchange (ETDEWEB)

TiO{sub 2} and TiO{sub 2}:Nd,Yb films were deposited by a doctor blade deposition technique from pastes prepared by a sol–gel process, and characterized by electron microscopy and spectroscopic techniques. Near infrared (NIR) photoluminescence (PL) properties upon 808 nm excitation were also examined. The rutile TiO{sub 2}:Nd,Yb samples exhibited the strongest NIR PL signal. The relationship between the morphological properties, annealing temperature and the optical behavior of TiO{sub 2}:Nd,Yb films is discussed. Furthermore, the study showed that hyperspectral imaging spectroscopy can be used as a rapid and nondestructive macroscopic characterization technique for the identification of spectral features and evaluation of luminescent surfaces of oxides. -- Highlights: • Films and powders of Nd and Yb-doped titania have been synthesized. • Three modifications (anatase, rutile and Degussa P25) have been studied. • The NIR photoluminescence properties were studied by hyperspectral imaging. • Emission at 978, 1008, 1029, 1064 and 1339 nm was obtained. • The structural properties and their influence on the optical behavior are discussed.

Garskaite, Edita; Flø, Andreas S. [Department of Mathematical Sciences and Technology, Norwegian University of Life Sciences, N-1432 Aas (Norway); Helvoort, Antonius T.J. van [Department of Physics, Norwegian University of Science and Technology, 7491 Trondheim (Norway); Kareiva, Aivaras [Department of General and Inorganic Chemistry, Vilnius University, Naugarduko 24, LT-03225 Vilnius (Lithuania); Olsen, Espen, E-mail: espen.olsen@umb.no [Department of Mathematical Sciences and Technology, Norwegian University of Life Sciences, N-1432 Aas (Norway)

2013-08-15

279

Investigations of near IR photoluminescence properties in TiO2:Nd,Yb materials using hyperspectral imaging methods  

International Nuclear Information System (INIS)

TiO2 and TiO2:Nd,Yb films were deposited by a doctor blade deposition technique from pastes prepared by a sol–gel process, and characterized by electron microscopy and spectroscopic techniques. Near infrared (NIR) photoluminescence (PL) properties upon 808 nm excitation were also examined. The rutile TiO2:Nd,Yb samples exhibited the strongest NIR PL signal. The relationship between the morphological properties, annealing temperature and the optical behavior of TiO2:Nd,Yb films is discussed. Furthermore, the study showed that hyperspectral imaging spectroscopy can be used as a rapid and nondestructive macroscopic characterization technique for the identification of spectral features and evaluation of luminescent surfaces of oxides. -- Highlights: • Films and powders of Nd and Yb-doped titania have been synthesized. • Three modifications (anatase, rutile and Degussa P25) have been studied. • The NIR photoluminescence properties were studied by hyperspectral imaging. • Emission at 978, 1008, 1029, 1064 and 1339 nm was obtained. • The structural properties and their influence on the optical behavior are discussed.

280

Development of China Hydrogeology Exploring Techniques in 30 Years --Comparison of Handbook of Hydrogeology of 1st and 2nd Edition  

Science.gov (United States)

Handbook of Hydrogeology (2nd edition) is supported by one program from the China Geological Survey (CGS): Research of Technical Methods of Hydrogeological Survey and Revision of Handbook of Hydrogeology. It is a reference book for those who are engaged in hydrogeological survey and research in China and covers fundamental principles, theories, survey and exploring techniques, and traditional experiences and achievements in hydrogeology. By comparing the 1st (1978) and 2nd (2012) editions of the Handbook of Hydrogeology (in Chinese), this paper analyses the development of China's hydrogeological survey and exploring techniques in the last 30 years, especially the great change and progress in survey techniques of hydro-remote sensing and hydro-geophysical prospecting. In the first edition of the Handbook of Hydrogeology, hydro-remote sensing was only mentioned as an interpretation of aerial pictures in a hydrogeological way, and had not yet formed an independent system and discipline. In the second edition, hydro-remote sensing is an important and independent chapter as one of the hydrogeological techniques. In it, various survey techniques of hydro-remote sensing and the types and features of remote sensing data are classified. General systems of interpretation marks of remote sensing images are established, including marks of landform and Quaternary sediment, bedrock, structure types, water yield property, environmental elements of hydrogeology, aquifer group and so on. A systematic workflow is constructed, especially in remote sensing image mapping and interpretation techniques. GPS and GIS are integrated into remote sensing. Remote sensing exploring instruments and interpreting software packages are also introduced and classified. Although hydro-geophysical prospecting was an independent chapter in the first edition of the Handbook of Hydrogeology, it covered only 10 exploring techniques. Equipment and instruments were simple and lagged behind those in the second edition. The precision and depth were limited. In the last 30 years, geophysical exploring techniques have been widely used in oil and mineral exploration, and have laid a solid foundation for hydro-geophysics. In the second edition, the systems of hydro-geophysical techniques are more complete and there are 26 techniques of 2 types. Combination of various geophysical techniques plays a much more effective role in solving hydrogeological problems and extends the range, depth and types of groundwater exploration. Since its publication, the Handbook of Hydrogeology has been popular in the field of hydrogeology in China. It is a necessary reference book for hydrogeologists and those in related fields.

Tong, Y.

2013-12-01

 
 
 
 
281

Proceedings of the 2nd International Workshop on e-learning and Virtual and Remote Laboratories  

Digital Repository Infrastructure Vision for European Research (DRIVER)

Contents:
Session 1 - Architecture of Virtual & Remote Laboratory Infrastructures (I): An Internet-Based Laboratory Course in Chemical Reaction Engineering and Unit Operations; Internet Based Laboratory for Experimentation with Multilevel Medium-Power Converters.
Session 2 - Architecture of Virtual & Remote Laboratory Infrastructures (II): Content management and architectural issues of a remote learning laboratory; Distributed Software Architecture and Applications for Remote Lab...

2008-01-01

282

Pod generated by Monte Carlo simulation using a meta-model based on the simSUNDT software  

Science.gov (United States)

A recently developed numerical procedure for simulation of POD is used to identify the most influential parameters and to test the effect of their interaction and variability with different statistical distributions. With a multi-parameter prediction model based on the NDT simulation software simSUNDT, a qualified ultrasonic inspection procedure used by personnel within Swedish nuclear power plants is investigated. The stochastic computations are compared to experimentally based POD and conclusions are drawn for both fatigue and stress corrosion cracks.
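The idea of generating POD curves by Monte Carlo sampling of a cheap meta-model can be illustrated with a toy signal-response model; the linear response, scatter, threshold and size range below are arbitrary placeholders, not the simSUNDT meta-model or the qualified procedure studied here:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 20000
    size = rng.uniform(0.5, 5.0, n)                   # defect size, mm (assumed range)
    # meta-model stand-in: predicted response grows linearly with size, plus scatter
    response = 0.8 * size + rng.normal(0.0, 0.6, n)   # arbitrary amplitude units
    threshold = 2.0                                   # detection threshold (assumed)
    detected = response > threshold

    # empirical POD curve: fraction of detected defects per size bin
    edges = np.linspace(0.5, 5.0, 10)
    centers = 0.5 * (edges[:-1] + edges[1:])
    pod = np.array([detected[(size >= lo) & (size < hi)].mean()
                    for lo, hi in zip(edges[:-1], edges[1:])])
    a90 = centers[np.argmax(pod >= 0.9)]              # first bin reaching 90 % POD
    print(dict(zip(np.round(centers, 2), np.round(pod, 2))), "a90 ~", a90)

In practice the distributions of the influential parameters and the response model come from the physics-based simulation rather than the simple linear relation used here, but the sampling and binning logic is the same.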

Persson, G.; Hammersberg, P.; Wirdelius, H.

2012-05-01

283

Hanbury Brown and Twiss Interferometry with respect to 2nd and 3rd-order event planes in Au+Au collisions at √sNN = 200 GeV  

Science.gov (United States)

The azimuthal angle dependence of HBT interferometry has been measured with respect to the 2nd- and 3rd-order event planes in Au+Au collisions at √sNN = 200 GeV in the PHENIX experiment. The 3rd-order dependence of the Gaussian source radii was clearly observed, as well as the 2nd-order dependence. The result for the 2nd order indicates that the initial source eccentricity is diluted but that the source still retains the initial shape at freeze-out, while the result for the 3rd order implies that the initial triangularity vanishes during the medium evolution, which is supported by a Gaussian source model and a Monte Carlo simulation.
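The azimuthal dependence is conventionally quantified by fitting the squared radii with a leading-order cosine oscillation relative to the event plane. The sketch below fits that standard form to synthetic numbers (they are not PHENIX data, and the radii, uncertainties and binning are invented for illustration):

    import numpy as np
    from scipy.optimize import curve_fit

    # illustrative "measured" squared sideward radii versus angle to the 2nd-order plane
    rng = np.random.default_rng(0)
    dphi = np.linspace(0.0, np.pi, 8, endpoint=False)                         # rad
    R2_side = 18.0 - 2.5 * np.cos(2 * dphi) + rng.normal(0, 0.2, dphi.size)   # fm^2

    def model(phi, R0_sq, R2_sq):
        # leading-order oscillation of R_side^2 relative to the 2nd-order event plane
        return R0_sq + 2.0 * R2_sq * np.cos(2.0 * phi)

    par, cov = curve_fit(model, dphi, R2_side)
    rel_amp = 2.0 * par[1] / par[0]   # relative amplitude tracking freeze-out eccentricity

For the 3rd-order analysis the same fit is performed with cos(3Δφ) relative to the 3rd-order plane, and the size of the extracted relative amplitude is what distinguishes a surviving initial shape from one that has vanished by freeze-out.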

Niida, Takafumi; Phenix Collaboration

2014-09-01

284

Free Open Source Software: FOSS Based GIS for Spatial Retrievals of Appropriate Locations for Ocean Energy Utilizing Electric Power Generation Plants  

Directory of Open Access Journals (Sweden)

Full Text Available A Free Open Source Software (FOSS)-based Geographic Information System (GIS) for spatial retrieval of appropriate locations for ocean wind and tidal motion utilizing electric power generation plants is proposed. Using scatterometers onboard earth observation satellites, strong-wind coastal areas are retrieved with the FOSS/GIS software PostgreSQL/PostGIS. PostGIS has to be modified together with the altimeter and scatterometer database. These modifications and the database creation would be a good reference for users who would like to create a GIS system together with a database using FOSS.
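A hedged sketch of the kind of spatial retrieval described, selecting high-wind cells near the coast from a PostGIS-enabled database via Python, is shown below. The database name, table and column names, credentials and thresholds are hypothetical; only standard PostGIS functions are used:

    import psycopg2

    # connection parameters are placeholders for a local PostGIS-enabled database
    conn = psycopg2.connect(dbname="ocean_energy", user="gis",
                            password="gis", host="localhost")
    cur = conn.cursor()
    cur.execute("""
        SELECT s.id, ST_AsText(s.geom), s.mean_wind_speed
        FROM   scatterometer_cells AS s, coastline AS c
        WHERE  s.mean_wind_speed > %s
          AND  ST_DWithin(s.geom::geography, c.geom::geography, %s)
        ORDER BY s.mean_wind_speed DESC;
    """, (12.0, 20000))          # wind > 12 m/s within 20 km of the coast
    for row in cur.fetchall():
        print(row)
    cur.close(); conn.close()

Casting the geometries to geography lets ST_DWithin take its distance argument in metres, which is convenient when screening satellite-derived cells against a coastline layer.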

Kohei Arai

2012-09-01

285

2nd Radio and Antenna Days of the Indian Ocean (RADIO 2014)  

Science.gov (United States)

It was an honor and a great pleasure for all those involved in its organization to welcome the participants to the "Radio and Antenna Days of the Indian Ocean" (RADIO 2014) international conference that was held from 7th to 10th April 2014 at the Sugar Beach Resort, Wolmar, Flic-en-Flac, Mauritius. RADIO 2014 is the second of a series of conferences organized in the Indian Ocean region. The aim of the conference is to discuss recent developments, theories and practical applications covering the whole scope of radio-frequency engineering, including radio waves, antennas, propagation, and electromagnetic compatibility. The RADIO international conference emerged following discussions with engineers and scientists from the countries of the Indian Ocean as well as from other parts of the world, and a need was felt for the organization of such an event in this region. Following numerous requests, the Island of Mauritius, known worldwide for its white sandy beaches and pleasant tropical atmosphere, was again chosen for the organization of the 2nd RADIO international conference. The conference was organized by the Radio Society, Mauritius, and the Local Organizing Committee consisted of scientists from SUPELEC, France, the University of Mauritius, and the University of Technology, Mauritius. We would like to take the opportunity to thank all people, institutions and companies that made the event such a success. We are grateful to our gold sponsors CST and FEKO as well as URSI for their generous support, which enabled us to partially support one PhD student and two scientists to attend the conference. We would also like to thank IEEE-APS and URSI for providing technical co-sponsorship. More than one hundred and thirty abstracts were submitted to the conference. They were peer-reviewed by an international scientific committee and, based on the reviews, either accepted, possibly after revision, or rejected. RADIO 2014 brought together participants from twenty countries spanning five continents: Australia, Botswana, Brazil, Canada, China, Denmark, France, India, Italy, Mauritius, Poland, Reunion Island, Russia, South Africa, South Korea, Spain, Switzerland, The Netherlands, United Kingdom, and USA. The conference featured eleven oral sessions and one poster session on state-of-the-art research themes. Three internationally recognized scientists delivered keynote speeches during the conference. Prizes for the first and second Best Student Papers were awarded during the closing ceremony. Following the call for extended contributions for publication as a volume in the IOP Conference Series: Materials Science and Engineering (MSE), both on-line and in print, we received thirty-two full papers. All submitted contributions were then peer-reviewed, revised whenever necessary, and accepted or rejected based on the recommendations of the reviewers of the editorial board. At the end of the procedure, twenty-five of them were accepted for publication in this volume.

2014-10-01

286

Brain order disorder 2nd group report of f-EEG  

Science.gov (United States)

Since the Brain Order Disorder (BOD) group reported on a high density Electroencephalogram (EEG) to capture the neuronal information using EEG to wirelessly interface with a Smartphone [1,2], a larger BOD group has been assembled, including the Obama BRAIN program, CUA Brain Computer Interface Lab and the UCSD Swartz Computational Neuroscience Center. We can implement the pair-electrode correlation functions in order to operate in a real-time daily environment, which is of computational complexity O(N³) for N = 10²-10³, known as functional f-EEG. The daily monitoring requires two areas of focus. Area #1: to quantify the neuronal information flow under arbitrary daily stimuli-response sources. Approach to #1: (i) We have asserted that the sources contained in the EEG signals may be discovered by an unsupervised learning neural network called blind source separation (BSS) of independent entropy components, based on the irreversible Boltzmann cellular thermodynamics (ΔS) of the human's effortless brain at constant temperature; we can solve for the minimum of the Helmholtz free energy (H = E - TS) by computing BSS, and then their pairwise-entropy source correlation function. (ii) Although the entropy itself is not the information per se, the concurrence of the entropy sources is the information flow as a functional-EEG, sketched in this 2nd BOD report. Area #2: applying EEG bio-feedback will improve collective decision making (TBD). Approach to #2: We introduce a novel performance quality metric, in terms of the throughput rate of faster (Δt) and more accurate (ΔA) decision making, which applies to individual as well as team brain dynamics. Following Nobel Laureate Daniel Kahneman's book "Thinking, Fast and Slow", through brainwave biofeedback we can first identify an individual's "anchored cognitive bias sources". This is done in order to remove the biases by means of individually tailored pre-processing. Then the training effectiveness can be maximized by the collective product Δt * ΔA. For Area #1, we compute a spatiotemporally windowed EEG in vitro average using adaptive time-window sampling. The sampling rate depends on the type of neuronal responses, which is what we seek. The averaged traditional EEG measurements are further improved by BSS decomposition into a finer stimulus-response source mixing matrix [A] having finer and faster spatial grids with rapid temporal updates. Then, the functional EEG is the second-order covariance matrix defined as the electrode-pair fluctuation correlation function C(s~, s~') of independent thermodynamic source components. (1) We define a 1-D space-filling curve as a spiral curve without origin. This pattern is historically known as the Peano-Hilbert arc length a. By taking the most significant bits of the Cartesian product a ~ O(x * y * z), it represents the arc length in the numerical size with values that map the 3-D neighborhood proximity into a 1-D neighborhood arc length representation. (2) The 1-D Fourier coefficient spectrum has no spurious high frequency contents, which typically arise from the lexicographical (zig-zag scanning) discontinuity [Hsu & Szu, "Peano-Hilbert curve," SPIE 2014]. A simple Fourier spectrum histogram fits nicely with the Compressive Sensing CRDT Mathematics. (3) A stationary power spectral density is a reasonable approximation of EEG responses in striate layers in resonance feedback loops capable of producing a 100,000-neuron collective Impulse Response Function (IRF). The striate brain layer architecture represents an ensemble
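A minimal sketch of the "blind source separation followed by pairwise source correlation" step is given below. The synthetic sources, channel count and window length are placeholders, and a generic FastICA stands in for whatever separation algorithm the group actually uses:

    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(0)
    fs, seconds, n_channels = 250, 4, 16                      # arbitrary toy dimensions
    t = np.arange(fs * seconds) / fs
    sources = np.vstack([np.sin(2 * np.pi * 10 * t),          # alpha-like component
                         np.sign(np.sin(2 * np.pi * 3 * t)),  # slow square-wave component
                         rng.standard_normal(t.size)])        # broadband noise source
    mixing = rng.standard_normal((n_channels, sources.shape[0]))
    eeg = mixing @ sources + 0.05 * rng.standard_normal((n_channels, t.size))

    ica = FastICA(n_components=3, random_state=0)             # blind source separation step
    activations = ica.fit_transform(eeg.T).T                  # estimated source time courses

    window = fs                                               # 1-second windows
    n_win = activations.shape[1] // window
    corrs = [np.corrcoef(activations[:, w * window:(w + 1) * window])
             for w in range(n_win)]                           # pairwise source correlations

The windowed correlation matrices play the role of the "functional EEG" described above: they summarize how the separated source components co-fluctuate over time rather than the raw electrode voltages themselves.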

Lalonde, Francois; Gogtay, Nitin; Giedd, Jay; Vydelingum, Nadarajen; Brown, David; Tran, Binh Q.; Hsu, Charles; Hsu, Ming-Kai; Cha, Jae; Jenkins, Jeffrey; Ma, Lien; Willey, Jefferson; Wu, Jerry; Oh, Kenneth; Landa, Joseph; Lin, C. T.; Jung, T. P.; Makeig, Scott; Morabito, Carlo Francesco; Moon, Qyu; Yamakawa, Takeshi; Lee, Soo-Young; Lee, Jong-Hwan; Szu, Harold H.; Kaur, Balvinder; Byrd, Kenneth; Dang, Karen; Krzywicki, Alan; Familoni, Babajide O.; Larson, Louis; Harkrider, Susan; Krapels, Keith A.; Dai, Liyi

2014-05-01

287

Development of Hydrologic Characterization Technology of Fault Zones: Phase I, 2nd Report  

International Nuclear Information System (INIS)

This is the year-end report of the 2nd year of the NUMO-LBNL collaborative project: Development of Hydrologic Characterization Technology of Fault Zones under the NUMO-DOE/LBNL collaboration agreement, the task description of which can be found in Appendix 3. A literature survey of published information on the relationship between geologic and hydrologic characteristics of faults was conducted. The survey concluded that it may be possible to classify faults by indicators based on various geometric and geologic attributes that may indirectly relate to the hydrologic property of faults. Analysis of existing information on the Wildcat Fault and its surrounding geology was performed. The Wildcat Fault is thought to be a strike-slip fault with a thrust component that runs along the eastern boundary of the Lawrence Berkeley National Laboratory. It is believed to be part of the Hayward Fault system but is considered inactive. Three trenches were excavated at carefully selected locations mainly based on the information from the past investigative work inside the LBNL property. At least one fault was encountered in all three trenches. Detailed trench mapping was conducted by CRIEPI (Central Research Institute of Electric Power Industry) and LBNL scientists. Some intriguing and puzzling discoveries were made that may contradict the published work in the past. Predictions are made regarding the hydrologic property of the Wildcat Fault based on the analysis of fault structure. Preliminary conceptual models of the Wildcat Fault were proposed. The Wildcat Fault appears to have multiple splays, and some low-angle faults may be part of the flower structure. In parallel, surface geophysical investigations were conducted using electrical resistivity survey and seismic reflection profiling along three lines on the north and south of the LBNL site. Because of the steep terrain, it was difficult to find optimum locations for survey lines as it is desirable for them to be as straight as possible. One interpretation suggests that the Wildcat Fault is westerly dipping. This could imply that the Wildcat Fault may merge with the Hayward Fault at depth. However, due to the complex geology of the Berkeley Hills, multiple interpretations of the geophysical surveys are possible. An effort to construct a 3D GIS model is under way. The model will be used not so much for visualization of the existing data, because only surface data are available thus far, but to conduct investigation of possible abutment relations of the buried formations offset by the fault. A 3D model would be useful to conduct 'what if' scenario testing to aid the selection of borehole drilling locations and configurations. Based on the information available thus far, a preliminary plan for borehole drilling is outlined. The basic strategy is to first drill boreholes on both sides of the fault without penetrating it. Borehole tests will be conducted in these boreholes to estimate the property of the fault. Possibly a slanted borehole will be drilled later to intersect the fault to confirm the findings from the boreholes that do not intersect the fault. Finally, the lessons learned from conducting the trenching and geophysical surveys are listed. It is believed that these lessons will be invaluable information for NUMO when it conducts preliminary investigations at yet-to-be selected candidate sites in Japan.

288

Development of Hydrologic Characterization Technology of Fault Zones -- Phase I, 2nd Report  

Energy Technology Data Exchange (ETDEWEB)

This is the year-end report of the 2nd year of the NUMO-LBNL collaborative project: Development of Hydrologic Characterization Technology of Fault Zones under the NUMO-DOE/LBNL collaboration agreement, the task description of which can be found in Appendix 3. A literature survey of published information on the relationship between geologic and hydrologic characteristics of faults was conducted. The survey concluded that it may be possible to classify faults by indicators based on various geometric and geologic attributes that may indirectly relate to the hydrologic property of faults. Analysis of existing information on the Wildcat Fault and its surrounding geology was performed. The Wildcat Fault is thought to be a strike-slip fault with a thrust component that runs along the eastern boundary of the Lawrence Berkeley National Laboratory. It is believed to be part of the Hayward Fault system but is considered inactive. Three trenches were excavated at carefully selected locations mainly based on the information from the past investigative work inside the LBNL property. At least one fault was encountered in all three trenches. Detailed trench mapping was conducted by CRIEPI (Central Research Institute of Electric Power Industry) and LBNL scientists. Some intriguing and puzzling discoveries were made that may contradict the published work in the past. Predictions are made regarding the hydrologic property of the Wildcat Fault based on the analysis of fault structure. Preliminary conceptual models of the Wildcat Fault were proposed. The Wildcat Fault appears to have multiple splays, and some low-angle faults may be part of the flower structure. In parallel, surface geophysical investigations were conducted using electrical resistivity survey and seismic reflection profiling along three lines on the north and south of the LBNL site. Because of the steep terrain, it was difficult to find optimum locations for survey lines as it is desirable for them to be as straight as possible. One interpretation suggests that the Wildcat Fault is westerly dipping. This could imply that the Wildcat Fault may merge with the Hayward Fault at depth. However, due to the complex geology of the Berkeley Hills, multiple interpretations of the geophysical surveys are possible. An effort to construct a 3D GIS model is under way. The model will be used not so much for visualization of the existing data, because only surface data are available thus far, but to conduct investigation of possible abutment relations of the buried formations offset by the fault. A 3D model would be useful to conduct 'what if' scenario testing to aid the selection of borehole drilling locations and configurations. Based on the information available thus far, a preliminary plan for borehole drilling is outlined. The basic strategy is to first drill boreholes on both sides of the fault without penetrating it. Borehole tests will be conducted in these boreholes to estimate the property of the fault. Possibly a slanted borehole will be drilled later to intersect the fault to confirm the findings from the boreholes that do not intersect the fault. Finally, the lessons learned from conducting the trenching and geophysical surveys are listed. It is believed that these lessons will be invaluable information for NUMO when it conducts preliminary investigations at yet-to-be selected candidate sites in Japan.

Karasaki, Kenzi; Onishi, Tiemi; Black, Bill; Biraud, Sebastien

2009-03-31

289

FOREWORD: 2nd International Workshop on New Computational Methods for Inverse Problems (NCMIP 2012)  

Science.gov (United States)

Conference logo This volume of Journal of Physics: Conference Series is dedicated to the scientific contributions presented during the 2nd International Workshop on New Computational Methods for Inverse Problems, (NCMIP 2012). This workshop took place at Ecole Normale Supérieure de Cachan, in Cachan, France, on 15 May 2012, at the initiative of Institut Farman. The first edition of NCMIP also took place in Cachan, France, within the scope of the ValueTools Conference, in May 2011 (http://www.ncmip.org/2011/). The NCMIP Workshop focused on recent advances in the resolution of inverse problems. Indeed inverse problems appear in numerous scientific areas such as geophysics, biological and medical imaging, material and structure characterization, electrical, mechanical and civil engineering, and finance. The resolution of inverse problems consists of estimating the parameters of the observed system or structure from data collected by an instrumental sensing or imaging device. Its success firstly requires the collection of relevant observation data. It also requires accurate models describing the physical interactions between the instrumental device and the observed system, as well as the intrinsic properties of the solution itself. Finally, it requires the design of robust, accurate and efficient inversion algorithms. Advanced sensor arrays and imaging devices provide high rate and high volume data; in this context, the efficient resolution of the inverse problem requires the joint development of new models and inversion methods, taking computational and implementation aspects into account. During this one-day workshop, researchers had the opportunity to bring to light and share new techniques and results in the field of inverse problems. The topics of the workshop were: algorithms and computational aspects of inversion, Bayesian estimation, kernel methods, learning methods, convex optimization, free discontinuity problems, metamodels, proper orthogonal decomposition, reduced models for the inversion, non-linear inverse scattering, image reconstruction and restoration, applications (bio-medical imaging, non-destructive evaluation etc). NCMIP 2012 was a one-day workshop. Each of the submitted papers was reviewed by 2 to 4 reviewers. Among the accepted papers, there are 8 oral presentations and 5 posters. Three international speakers were invited for a long talk. This second edition attracted 60 registered attendees in May 2012. NCMIP 2012 was supported by Institut Farman (ENS Cachan) and endorsed by the following French research networks (GDR ISIS, GDR Ondes, GDR MOA, GDR MSPC). The program committee acknowledges the following laboratories CMLA, LMT, LSV, LURPA, SATIE, as well as DIGITEO Network. 
Laure Blanc-Féraud and Pierre-Yves Joubert Workshop Co-chairs Laure Blanc-Féraud, I3S laboratory, CNRS, France Pierre-Yves Joubert, IEF laboratory, Paris-Sud University, CNRS, France Technical Program Committee Alexandre Baussard, ENSTA Bretagne, Lab-STICC, France Marc Bonnet, ENSTA, ParisTech, France Jerôme Darbon, CMLA, ENS Cachan, CNRS, France Oliver Dorn, School of Mathematics, University of Manchester, UK Mário Figueiredo, Instituto Superior Técnico, Lisbon, Portugal Laurent Fribourg, LSV, ENS Cachan, CNRS, France Marc Lambert, L2S Laboratory, CNRS, SupElec, Paris-Sud University, France Anthony Quinn, Trinity College, Dublin, Ireland Christian Rey, LMT, ENS Cachan, CNRS, France Joachim Weickert, Saarland University, Germany Local Chair Alejandro Mottini, Morpheme group I3S-INRIA Sophie Abriet, SATIE, ENS Cachan, CNRS, France Béatrice Bacquet, SATIE, ENS Cachan, CNRS, France Reviewers Gilles Aubert, J-A Dieudonné Laboratory, CNRS and University of Nice-Sophia Antipolis, France Alexandre Baussard, ENSTA Bretagne, Lab-STICC, France Laure Blanc-Féraud, I3S laboratory, CNRS, France Marc Bonnet, ENSTA, ParisTech, France Jerôme Darbon, CMLA, ENS Cachan, CNRS, France Oliver Dorn, School of Mathematics, University of Manchester, UK Gérard Favier, I3S laboratory, CNRS, France Mário Figueiredo, Instituto Superior Técnico, Lisb

Blanc-Féraud, Laure; Joubert, Pierre-Yves

2012-09-01

290

2nd Generation RLV Risk Reduction Definition Program: Pratt & Whitney Propulsion Risk Reduction Requirements Program (TA-3 & TA-4)  

Science.gov (United States)

This is the final report and addresses all of the work performed on this program. Specifically, it covers vehicle architecture background, definition of six baseline engine cycles, reliability baseline (space shuttle main engine QRAS), and component level reliability/performance/cost for the six baseline cycles, and selection of 3 cycles for further study. This report further addresses technology improvement selection and component level reliability/performance/cost for the three cycles selected for further study, as well as risk reduction plans, and recommendation for future studies.

Matlock, Steve

2001-01-01

291

Noise Characteristics of 64-channel 2nd-order DROS Gradiometer System inside a Poorly Magnetically-shielded Room  

International Nuclear Information System (INIS)

We have developed a second-order double relaxation oscillation SQUID (DROS) gradiometer with a baseline of 35 mm, and constructed a poorly magnetically-shielded room (MSR) with an aluminum layer and permalloy layers for magnetocardiography (MCG). The 2nd-order DROS gradiometer has a noise level of 20 fT/Hz at 1 Hz and 8 fT/Hz at 200 Hz inside the heavily-shielded MSR with a shielding factor of 10³ at 1 Hz and 10⁴ - 10⁵ at 100 Hz. The poorly-shielded MSR, built of a 12-mm-thick aluminum layer and 4-6 permalloy layers of 0.35 mm thickness, is 2.4 m x 2.4 m x 2.4 m in size, and has a shielding factor of 40 at 1 Hz and 10⁴ at 100 Hz. Our 64-channel second-order gradiometer MCG system consists of 64 2nd-order DROS gradiometers, flux-locked loop electronics, and analog signal processors. With the 2nd-order DROS gradiometers and flux-locked loop electronics installed inside the poorly-shielded MSR, and with the analog signal processor installed outside it, the noise level was measured to be 20 fT/Hz at 1 Hz and 8 fT/Hz at 200 Hz on average, even though the MSR door is open. This results in a noise level low enough to obtain a human MCG at the same level as that measured in the heavily-shielded MSR. However, filters or active shielding are needed for clear MCG when there is large low-frequency noise from heavy air conditioning or large ac power consumption near the poorly-shielded MSR.

292

Noise Characteristics of 64-channel 2nd-order DROS Gradiometer System inside a Poorly Magnetically-shielded Room  

Energy Technology Data Exchange (ETDEWEB)

We have developed a second-order double relaxation oscillation SQUID (DROS) gradiometer with a baseline of 35 mm, and constructed a poorly magnetically-shielded room (MSR) with an aluminum layer and permalloy layers for magnetocardiography (MCG). The 2nd-order DROS gradiometer has a noise level of 20 fT/Hz at 1 Hz and 8 fT/Hz at 200 Hz inside the heavily-shielded MSR with a shielding factor of 10{sup 3} at 1 Hz and 10{sup 4} - 10{sup 5} at 100 Hz. The poorly-shielded MSR, built of a 12-mm-thick aluminum layer and 4-6 permalloy layers of 0.35 mm thickness, is 2.4 m x 2.4 m x 2.4 m in size, and has a shielding factor of 40 at 1 Hz and 10{sup 4} at 100 Hz. Our 64-channel second-order gradiometer MCG system consists of 64 2nd-order DROS gradiometers, flux-locked loop electronics, and analog signal processors. With the 2nd-order DROS gradiometers and flux-locked loop electronics installed inside the poorly-shielded MSR, and with the analog signal processor installed outside it, the noise level was measured to be 20 fT/Hz at 1 Hz and 8 fT/Hz at 200 Hz on average, even though the MSR door is open. This results in a noise level low enough to obtain a human MCG at the same level as that measured in the heavily-shielded MSR. However, filters or active shielding are needed for clear MCG when there is large low-frequency noise from heavy air conditioning or large ac power consumption near the poorly-shielded MSR.

Kim, J. M.; Lee, Y. H.; Yu, K. K.; Kim, K.; Kwon, H.; Park, Y. K. [Biosignal Research Center, Korea Research Institute of Standards and Science, Daejeon (Korea, Republic of); Sasada, Ichiro [Dept. of Applied Science for Electronics and Materials, Kyushu University, Fukuoka (Japan)

2006-10-15

293

Quantitative nondestructive detection of residual stresses of the 2nd and 3rd order by using micro-magnetic methods  

Digital Repository Infrastructure Vision for European Research (DRIVER)

Micro residual stresses (MRS) of the 2nd and 3rd order play an important role in the lifetime analysis of thermally-cycled materials. The coherent residual stresses (MRS of 3rd order) appear when the lattice parameter of the second phase particles (coherently embedded in the matrix) and the lattice parameter of the matrix are different from each other. Such differences in case of the Fe-Cu alloys between the lattice parameters of coherent Cu precipitates and the lattice parameters of the α-F...

Pirlog, M.; Schnubel, D.; Altpeter, I.; Dobmann, G.; Kröning, M.

2006-01-01

294

Meeting Report: The 2nd Annual Argonne Soils Workshop, Argonne National Laboratory, Chicago Illinois, USA, October 6-8, 2010  

Science.gov (United States)

This report summarizes the proceedings of the 2nd Annual Argonne Soils Workshop held at Argonne National Laboratory October 6–8, 2010. The workshop assembled a diverse group of soil ecologists, microbiologists, molecular biologists, and computational scientists to discuss the challenges and opportunities related to implementation of metagenomics approaches in soil microbial ecology. The overarching theme of the workshop was “designing ecologically meaningful soil metagenomics research”, which encouraged presentations on both ecological and computational topics. The workshop fostered valuable cross-discipline communication and delivered the message that soil metagenomics research must be based on an iterative process between biological inquiry and bioinformatics tools. PMID:22180822

O'Brien, Sarah L.; Glass, Elizabeth M.; Brulc, Jennifer M.; Gilbert, Jack A.; Antonopoulos, Dionysios A.; Meyer, Folker

2011-01-01

295

Virtual Visit to the ATLAS Control Room by 2nd High School of Eleftherio–Kordelio in Thessaloniki  

CERN Multimedia

Our school is the 2nd High School of Eleftherio – Kordelio. It is located in the western suburbs of Thessaloniki, Greece, and our students are between 15 and 17 years old. Thessaloniki is the second largest city in Greece, with a port that plays a major role in trade in the area of the southern Balkans. Over this period our students have heard a great deal about CERN and the great discoveries that have taken place there, and they are really keen on visiting and learning many things about it.

2013-01-01

296

THE 2nd SCHIZOPHRENIA INTERNATIONAL RESEARCH SOCIETY CONFERENCE, 10-14 APRIL 2010, FLORENCE, ITALY: SUMMARIES OF ORAL SESSIONS  

Science.gov (United States)

The 2nd Schizophrenia International Research Society Conference was held in Florence, Italy, April 10–15, 2010. Student travel awardees served as rapporteurs of each oral session and focused their summaries on the most significant findings that emerged from each session and the discussions that followed. The following report is a composite of these reviews. It is hoped that it will provide an overview for those who were present but could not participate in all sessions, and for those who did not have the opportunity to attend but would be interested in an update on current investigations ongoing in the field of schizophrenia research. PMID:20934307

Baharnoori, Moogeh; Bartholomeusz, Cali; Boucher, Aurelie A.; Buchy, Lisa; Chaddock, Christopher; Chiliza, Bonga; Focking, Melanie; Fornito, Alex; Gallego, Juan A.; Hori, Hiroaki; Huf, Gisele; Jabbar, Gul A.; Kang, Shi Hyun; El Kissi, Yousri; Merchan-Naranjo, Jessica; Modinos, Gemma; Abdel-Fadeel, Nashaat A.M.; Neubeck, Anna-Karin; Ng, Hsiao Piau; Novak, Gabriela; Owolabi, Olasunmbo.O.; Prata, Diana P.; Rao, Naren P.; Riecansky, Igor; Smith, Darryl C.; Souza, Renan P.; Thienel, Renate; Trotman, Hanan D.; Uchida, Hiroyuki; Woodberry, Kristen A.; O'Shea, Anne; DeLisi, Lynn E.

2014-01-01

297

Software Engineering  

Science.gov (United States)

CSC 450. Software Engineering (3) Prerequisite: CSC 332 and senior standing. Study of the design and production of large and small software systems. Topics include systems engineering, software life-cycle and characterization; use of software tools. Substantial software project required.

Tagliarini, Gene

2003-04-21

298

Conceptual design and optimization of a 1-1/2 generation PFBC plant task 14. Topical report  

Energy Technology Data Exchange (ETDEWEB)

The economics and performance of advanced pressurized fluidized bed (PFBC) cycles developed for utility applications during the last 10 years (especially the 2nd-Generation PFBC cycle) are projected to be favorable compared to conventional pulverized coal power plants. However, the improved economics of 2nd-Generation PFBC cycles are accompanied by the perception of increased technological risk related to the pressurized carbonizer and its associated gas cleanup systems. A PFBC cycle that removed the uncertainties of the carbonizer while retaining the high efficiency and low cost of a 2nd-Generation PFBC cycle could improve the prospects for early commercialization and pave the way for the introduction of the complete 2nd-Generation PFBC cycle at some later date. One such arrangement is a PFBC cycle with natural gas topping combustion, referred to as the 1.5-Generation PFBC cycle. This cycle combines the advantages of the 2nd-Generation PFBC plant with the reduced risk associated with a gas turbine burning natural gas, and can potentially be part of a phased approach leading to the commercialization of utility 2nd-Generation PFBC cycles. The 1.5-Generation PFBC may also introduce other advantages over the more complicated 2nd-Generation PFBC system. This report describes the technical and economic evaluation of 1.5-Generation PFBC cycles for utility or industrial power generation.

Rubow, L.N.; Horazak, D.A.; White, J.S. [and others

1994-12-01

299

Studies on lattice thermal expansion and XPS of ThO2 -NdO1.5 solid solutions  

International Nuclear Information System (INIS)

The lattice parameter changes with respect to temperature (T) have been measured by the high temperature X-ray diffraction (HTXRD) technique for ThO2-NdO1.5 solid solutions containing 23.8 and 42.5 mol% NdO1.5 in the temperature range from 298 to 2000 K. The temperature versus lattice parameter data have been used to calculate the lattice thermal expansivity. The thermal expansion of the solid solutions was found to increase with increasing neodymium oxide content and temperature. The mean linear thermal expansion coefficients in this temperature range for the ThO2-NdO1.5 solid solutions are 12.28 x 10⁻⁶ and 12.90 x 10⁻⁶ K⁻¹, respectively. The binding energies of the Th 4f7/2 and Nd 3d5/2 energy levels of the solid solutions containing 13.1, 23.8, 31.9, 37.2 and 42.5 mol% NdO1.5 and of two-phase mixtures containing 47.6 and 51.8 mol% NdO1.5 were experimentally determined by X-ray photoelectron spectroscopy (XPS)
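
As a side note on how a mean linear thermal expansion coefficient of this kind is obtained from lattice-parameter data, the short Python sketch below applies the usual definition alpha_mean = (a(T) - a(T_ref)) / (a(T_ref) * (T - T_ref)); the lattice parameters in it are hypothetical, not the measured values of this study.

    def mean_linear_expansion(a_ref, a_T, T_ref, T):
        """Mean linear thermal expansion coefficient between T_ref and T from lattice parameters."""
        return (a_T - a_ref) / (a_ref * (T - T_ref))

    # Hypothetical lattice parameters (in angstrom), chosen only to illustrate the formula
    alpha = mean_linear_expansion(a_ref=5.600, a_T=5.719, T_ref=298.0, T=2000.0)
    print(f"{alpha:.2e} K^-1")  # about 1.2e-05 K^-1, the same order as the values reported above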

300

Determination of Conceptions of Secondary 10th Grade Students About Torque, Angular Momentum and Kepler’s 2nd Law  

Directory of Open Access Journals (Sweden)

Full Text Available Prior knowledge that students bring can be compatible with scientific facts, or it can conflict with them and become an obstacle to learning. It is therefore important to reveal students' prior knowledge and to determine which of their ideas are not compatible with scientific information. In this study, we aimed to reveal the prior knowledge of 133 tenth-grade students about torque, conservation of angular momentum and Kepler's 2nd law. The students in the sample had not yet received formal instruction on these concepts, so their ideas may be naive. The students were asked three open-ended questions whose validity and reliability had been established. While students gave answers consistent with the scientific facts about the torque concept, none of the answers given about the conservation of angular momentum and Kepler's 2nd law fell into the category of scientific answers; for both of these concepts, alternative conceptions were the most common type of answer. In teaching this concept and law, students' prior knowledge that conflicts with scientific ideas should be taken into account; that is, teaching should be organized to bring about conceptual change.

Ayberk Bostan Sarıoğlan

2013-06-01

 
 
 
 
301

Numerical stability of 2nd order Runge-Kutta integration algorithms for use in particle-in-cell codes  

International Nuclear Information System (INIS)

An essential ingredient of particle-in-cell (PIC) codes is a numerically accurate and stable integration scheme for the particle equations of motion. Such a scheme is the well known time-centered leapfrog (LF) method, accurate to 2nd order with respect to the timestep Δt. However, this scheme can only be used for forces independent of velocity unless a simple enough implicit implementation is possible. The LF scheme is therefore inapplicable in Monte-Carlo treatments of particle collisions and/or interactions with radio-frequency fields. We examine here the suitability of the 2nd order Runge-Kutta (RK) method. We find that the basic RK scheme is numerically unstable, but that conditional stability can be attained by an implementation which preserves phase space area. Examples are presented to illustrate the performance of the RK schemes. We compare analytic and computed electron orbits in a traveling nonlinear wave and also show self-consistent PIC simulations describing plasma flow in the vicinity of a lower hybrid antenna. (author)
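
To make the comparison concrete, here is a minimal Python sketch (not the authors' code) of a 2nd-order Runge-Kutta (midpoint) particle step next to a time-centered leapfrog step; the force model and step size are illustrative assumptions, and this plain RK2 step does not include the phase-space-area-preserving modification discussed in the abstract.

    def rk2_step(x, v, accel, dt):
        """One midpoint (2nd-order Runge-Kutta) step for dx/dt = v, dv/dt = accel(x, v)."""
        a1 = accel(x, v)
        x_mid = x + 0.5 * dt * v
        v_mid = v + 0.5 * dt * a1
        x_new = x + dt * v_mid
        v_new = v + dt * accel(x_mid, v_mid)
        return x_new, v_new

    def leapfrog_step(x, v_half, accel, dt):
        """Time-centered leapfrog step; v is staggered by dt/2 and accel must not depend on v."""
        x_new = x + dt * v_half
        v_half_new = v_half + dt * accel(x_new, None)
        return x_new, v_half_new

    # Example: simple harmonic oscillator (velocity-independent force), illustrative step size
    accel = lambda x, v: -x
    x, v = 1.0, 0.0
    for _ in range(1000):
        x, v = rk2_step(x, v, accel, dt=0.01)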

302

Digital Generation of Noise-Signals with Arbitrary Constant or Time-Varying Spectra (A noise generation software package and its application)  

CERN Document Server

Artificial creation of arbitrary noise signals is used in accelerator physics to reproduce a measured perturbation spectrum for simulations but also to generate real-time shaped noise spectra for controlled emittance blow-up giving tailored properties to the final bunch shape. It is demonstrated here how one can produce numerically what is, for all practical purposes, an unlimited quantity of non-periodic noise data having any predefined spectral density. This spectral density may be constant or varying with time. The noise output never repeats and has excellent statistical properties, important for very long-term applications. It is difficult to obtain such flexibility and spectral cleanliness using analogue techniques. This algorithm was applied both in computer simulations of bunch behaviour in the presence of RF noise in the PS, SPS and LHC and also to generate real-time noise, tracking the synchrotron frequency change during the energy ramp of the SPS and producing controlled longitudinal emittance blow-...
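
A common way to realize the technique described above is to shape random-phase Fourier amplitudes with the target spectrum and inverse-transform; the Python sketch below is an illustrative single-buffer version with approximate normalization, not the CERN software package itself.

    import numpy as np

    def shaped_noise(n, dt, target_asd):
        """Return n real samples whose spectral density roughly follows target_asd(f) (approximate normalization)."""
        rng = np.random.default_rng()
        freqs = np.fft.rfftfreq(n, d=dt)
        amps = target_asd(freqs) * np.sqrt(n / (2.0 * dt))          # shape amplitudes by the desired spectrum
        phases = np.exp(1j * 2.0 * np.pi * rng.random(freqs.size))  # random phases give non-repeating noise
        spectrum = amps * phases
        spectrum[0] = 0.0  # remove the DC component
        return np.fft.irfft(spectrum, n=n)

    # Example target: flat below 1 Hz, falling as 1/sqrt(f) above (purely illustrative)
    asd = lambda f: 1.0 / np.sqrt(np.maximum(f, 1.0))
    samples = shaped_noise(n=65536, dt=1e-3, target_asd=asd)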

Tückmantel, Joachim

2008-01-01

303

Temperature dependence of standard Gibbs energy of formation of Al2Nd from near absolute 0 K to room temperature

Energy Technology Data Exchange (ETDEWEB)

The standard Gibbs energy of formation as a function of temperature of Al2Nd, ΔfG°T(Al2Nd), in the temperature range from near absolute 0 K to room temperature was determined by combining the measurement of heat capacities, Cp, of pure aluminium and neodymium using the relaxation method with the thermodynamic data of Al2Nd in our previous work. The temperature dependence of ΔfG°T(Al2Nd) was small below 10 K, and it increased monotonically above 10 K. ΔfG°T(Al2Nd) could be evaluated as follows: ΔfG°T(Al2Nd) = -104.52 - 1.8436x10⁻²T + 1.6273x10⁻²T logT - 5.8437x10⁻⁵T² - 1.5260x10⁻²T⁻¹ ± 27.3 (3-75 K); ΔfG°T(Al2Nd) = -104.95 + 4.4274x10⁻³T + 3.6408x10⁻³T logT - 3.5175x10⁻⁶T² + 13.983T⁻¹ ± 27.3 (75-300 K)
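
For readers who want to evaluate the fitted expressions above numerically, the following Python sketch encodes them directly; it assumes temperature in kelvin and that log denotes the base-10 logarithm, which the abstract does not state explicitly.

    import math

    def delta_f_G_Al2Nd(T):
        """Piecewise fit for the standard Gibbs energy of formation of Al2Nd (T in kelvin, log taken as base 10)."""
        if 3.0 <= T <= 75.0:
            return (-104.52 - 1.8436e-2 * T + 1.6273e-2 * T * math.log10(T)
                    - 5.8437e-5 * T ** 2 - 1.5260e-2 / T)
        if 75.0 < T <= 300.0:
            return (-104.95 + 4.4274e-3 * T + 3.6408e-3 * T * math.log10(T)
                    - 3.5175e-6 * T ** 2 + 13.983 / T)
        raise ValueError("fit is only reported for 3 K to 300 K")

    print(delta_f_G_Al2Nd(298.15))  # value near room temperature; the stated uncertainty of the fit is +/- 27.3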

Yamamoto, Hiroaki [Department of Materials Science and Chemistry, Graduate School of Engineering, University of Hyogo, 2167 shosha, Himeji, Hyogo 671-2201 (Japan)], E-mail: hyama@eng.u-hyogo.ac.jp; Morishita, Masao [Department of Materials Science and Chemistry, Graduate School of Engineering, University of Hyogo, 2167 shosha, Himeji, Hyogo 671-2201 (Japan)

2008-05-29

304

A Software Method for Generating Concurrent Pwm Signal from Pic18f4520 for Biomimetic Robotic Fish Control  

Digital Repository Infrastructure Vision for European Research (DRIVER)

A method of generating multiple pulse width modulated signals with phase differences is presented in this work. The microcontroller used is a PIC18F4520 and its output is used to drive Futaba RC servo motors directly. The concurrency of the pulse width modulated signals in this work is relative, due to the fact that there is a finite time (the microcontroller period) between each instruction toggling the output pins. This finite time is equal to the minimum period of the microcontroller, which is 125 ns i...

Afolayan, M. O.; Yawas, D. S.; Folayan, C. O.; Aku, S. Y.

2013-01-01

305

User Manual for Beta Version of TURBO-GRD: A Software System for Interactive Two-Dimensional Boundary/ Field Grid Generation, Modification, and Refinement  

Science.gov (United States)

TURBO-GRD is a software system for interactive two-dimensional boundary/field grid generation, modification, and refinement. Its features allow users to explicitly control grid quality locally and globally. Grid control can be achieved interactively by using control points that the user picks and moves on the workstation monitor, or by direct stretching and refining. The techniques used in the code are the control point form of algebraic grid generation, a damped cubic spline for edge meshing, and parametric mapping between physical and computational domains. It also performs elliptic grid smoothing and free-form boundary control for boundary geometry manipulation. Internal block boundaries are constructed and shaped by using Bezier curves. Because TURBO-GRD is a highly interactive code, users can read in an initial solution, display its solution contour in the background of the grid and control net, and exercise grid modification using the solution contour as a guide. This process can be called interactive solution-adaptive grid generation.
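
TURBO-GRD's own algorithm is the control point form of algebraic grid generation; as a generic illustration of algebraic grid generation (not the TURBO-GRD implementation), the Python sketch below builds an interior grid from four boundary curves by linear transfinite interpolation.

    import numpy as np

    def tfi_grid(bottom, top, left, right):
        """Linear transfinite interpolation: fill the interior from four boundary curves.
        bottom, top: (ni, 2) arrays; left, right: (nj, 2) arrays; corner points must match."""
        ni, nj = bottom.shape[0], left.shape[0]
        u = np.linspace(0.0, 1.0, ni)[:, None, None]
        v = np.linspace(0.0, 1.0, nj)[None, :, None]
        b, t = bottom[:, None, :], top[:, None, :]
        l, r = left[None, :, :], right[None, :, :]
        corners = ((1 - u) * (1 - v) * bottom[0] + u * (1 - v) * bottom[-1]
                   + (1 - u) * v * top[0] + u * v * top[-1])
        return (1 - v) * b + v * t + (1 - u) * l + u * r - corners

    # Example: unit square whose top edge bulges upward (illustrative boundaries only)
    x = np.linspace(0.0, 1.0, 21)
    y = np.linspace(0.0, 1.0, 11)
    bottom = np.stack([x, np.zeros_like(x)], axis=1)
    top = np.stack([x, 1.0 + 0.1 * np.sin(np.pi * x)], axis=1)
    left = np.stack([np.zeros_like(y), y], axis=1)
    right = np.stack([np.ones_like(y), y], axis=1)
    grid = tfi_grid(bottom, top, left, right)  # shape (21, 11, 2): x-y coordinates of every grid node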

Choo, Yung K.; Slater, John W.; Henderson, Todd L.; Bidwell, Colin S.; Braun, Donald C.; Chung, Joongkee

1998-01-01

306

GENII (Generation II): The Hanford Environmental Radiation Dosimetry Software System: Volume 3, Code maintenance manual: Hanford Environmental Dosimetry Upgrade Project  

Energy Technology Data Exchange (ETDEWEB)

The Hanford Environmental Dosimetry Upgrade Project was undertaken to incorporate the internal dosimetry models recommended by the International Commission on Radiological Protection (ICRP) in updated versions of the environmental pathway analysis models used at Hanford. The resulting second generation of Hanford environmental dosimetry computer codes is compiled in the Hanford Environmental Dosimetry System (Generation II, or GENII). This coupled system of computer codes is intended for analysis of environmental contamination resulting from acute or chronic releases to, or initial contamination of, air, water, or soil, on through the calculation of radiation doses to individuals or populations. GENII is described in three volumes of documentation. This volume is a Code Maintenance Manual for the serious user, including code logic diagrams, global dictionary, worksheets to assist with hand calculations, and listings of the code and its associated data libraries. The first volume describes the theoretical considerations of the system. The second volume is a Users' Manual, providing code structure, users' instructions, required system configurations, and QA-related topics. 7 figs., 5 tabs.

Napier, B.A.; Peloquin, R.A.; Strenge, D.L.; Ramsdell, J.V.

1988-09-01

307

Software Engineering Program: Software Process Improvement Guidebook  

Science.gov (United States)

The purpose of this document is to provide experience-based guidance in implementing a software process improvement program in any NASA software development or maintenance community. This guidebook details how to define, operate, and implement a working software process improvement program. It describes the concept of the software process improvement program and its basic organizational components. It then describes the structure, organization, and operation of the software process improvement program, illustrating all these concepts with specific NASA examples. The information presented in the document is derived from the experiences of several NASA software organizations, including the SEL, the SEAL, and the SORCE. Their experiences reflect many of the elements of software process improvement within NASA. This guidebook presents lessons learned in a form usable by anyone considering establishing a software process improvement program within his or her own environment. This guidebook attempts to balance general and detailed information. It provides material general enough to be usable by NASA organizations whose characteristics do not directly match those of the sources of the information and models presented herein. It also keeps the ideas sufficiently close to the sources of the practical experiences that have generated the models and information.

1996-01-01

308

Early prediction for necessity of 2nd I-131 ablation therapy with serum thyroglobulin levels in patients with differentiated thyroid cancer  

International Nuclear Information System (INIS)

The aim of our study was to evaluate the predictive value of serum thyroglobulin levels, measured preoperatively and just before the 1st I-131 ablation therapy with high serum TSH, for the necessity of a 2nd I-131 ablation therapy in differentiated thyroid cancer patients. 111 patients with DTC who underwent total or near-total thyroidectomy followed by immediate I-131 ablation therapy were enrolled in this study. TSH, Tg and anti-Tg autoantibody were measured before thyroidectomy (TSHpreop, Tgpreop and Anti-Tgpreop) and just before the 1st I-131 ablation therapy (TSHabl, Tgabl and Anti-Tgabl). All TSHabl levels were above 30 mU/liter. ΔTg [(Tgpreop-Tgabl)x100/(Tgpreop)] was calculated. 29 patients (26.1%, 29/111) had to receive a 2nd I-131 ablation therapy. Of 70 patients whose Tgabl was under 10 ng/ml, only 11 patients received a 2nd I-131 ablation therapy (15.7%). Patients with Tgabl greater than or equal to 10 ng/ml received a 2nd I-131 ablation therapy more frequently (18/41, 43.9%) than patients with a lower Tgabl level. There was a disparity in the necessity of a 2nd I-131 ablation therapy between the two groups (Tgabl <10 ng/ml and Tgabl ≥10 ng/ml; two-by-two χ² test, p=0.0016). Of 41 patients with Tgabl greater than or equal to 10 ng/ml, 19 patients showed increased Tg levels (ΔTg<0). Patients with negative ΔTg and Tgabl greater than or equal to 10 ng/ml showed a strikingly high necessity of a 2nd I-131 ablation therapy (11/19, 57.9%). There was also a significant disparity in the necessity of a 2nd I-131 ablation therapy between the two groups (ΔTg<0 with Tgabl ≥10 ng/ml versus the others; two-by-two χ² test, p=0.0012). These results suggest that a high Tgabl level just before the 1st I-131 ablation therapy can forecast the necessity of a 2nd I-131 ablation therapy. Moreover, the difference in Tg level between the preoperative status and just before the 1st I-131 ablation therapy can also suggest the necessity of a 2nd I-131 ablation therapy at an early period of DTC patient surveillance
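
The decision rule sketched in the abstract combines the relative thyroglobulin change, ΔTg = (Tgpreop - Tgabl) x 100 / Tgpreop, with a 10 ng/ml cutoff on Tgabl; a minimal Python sketch of that arithmetic follows, with hypothetical values, purely for illustration and not as a clinical tool.

    def needs_second_ablation_flag(tg_preop, tg_abl):
        """Flag a high risk of needing a 2nd I-131 ablation using the abstract's thresholds (illustration only)."""
        delta_tg = (tg_preop - tg_abl) * 100.0 / tg_preop  # percent decrease; negative means Tg rose
        high_tg_abl = tg_abl >= 10.0                       # ng/ml just before the 1st ablation
        return (high_tg_abl and delta_tg < 0.0), delta_tg

    flag, dtg = needs_second_ablation_flag(tg_preop=25.0, tg_abl=40.0)  # hypothetical values
    print(flag, dtg)  # True, -60.0: Tg rose despite surgery and is above the 10 ng/ml cutoff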

309

A summary of the 2nd workshop on Human Resources Development (HRD) in the nuclear field in Asia. FY2000  

Energy Technology Data Exchange (ETDEWEB)

The Human Resources Development (HRD) Project was added in 1999 as a Cooperation Activity of 'the Forum for Nuclear Cooperation in Asia (FNCA)', which is organized by the Nuclear Committee. The HRD Project helps to solidify the foundation of nuclear development and utilization in Asia by promoting human resources development in Asian countries. The principal activity of the HRD Project is to hold the Workshop on Human Resources Development in the Nuclear Field in Asia once a year. The objective of the Workshop is to clarify the problems and needs of human resources development in each country and to provide mutual support through the exchange of information. The report consists of a summary of the 2nd Workshop on Human Resources Development in the Nuclear Field in Asia, held on November 27 and 28, 2000 at the Tokai Research Establishment of JAERI. (author)

NONE

2001-06-01

310

Anatomy of a 2nd-order unconformity: stratigraphy and facies of the Bakken formation during basin realignment  

Energy Technology Data Exchange (ETDEWEB)

Because classic Laramide compressional structures are relatively rare, the Williston Basin is often considered structurally simple; but because of the presence of numerous sub-basins, simplistic lithofacies generalization is impossible, and detailed facies mapping is necessary to unravel Middle Bakken paleogeography. The unconformity above the Devonian Three Forks is explained by the infilling and destruction of the Devonian Elk Point basin; it prepares the Bakken system and introduces a Mississippian Williston Basin with a very different configuration. Black shales are too often considered deposits that can only be found in deep water, but a very different conclusion must be drawn after a review of stratigraphic geometry and facies successions. The whole Bakken is a 2nd-order lowstand to transgressive systems tract lying below the basal Lodgepole, which represents an interval of maximal flooding. This lowstand to transgressive stratigraphic context explains why the sedimentary process and provenance show high areal variability.

Skinner, Orion; Canter, Lyn; Sonnenfeld, Mark; Williams, Mark [Whiting Oil and Gas Corp., Denver, CO (United States)

2011-07-01

311

Use of 2nd and 3rd Level Correlation Analysis for Studying Degradation in Polycrystalline Thin-Film Solar Cells  

Energy Technology Data Exchange (ETDEWEB)

The correlation of stress-induced changes in the performance of laboratory-made CdTe solar cells with various 2nd and 3rd level metrics is discussed. The overall behavior of aggregated data showing how cell efficiency changes as a function of open-circuit voltage (Voc), short-circuit current density (Jsc), and fill factor (FF) is explained using a two-diode, PSpice model in which degradation is simulated by systematically changing model parameters. FF shows the highest correlation with performance during stress, and is subsequently shown to be most affected by shunt resistance, recombination and in some cases voltage-dependent collection. Large decreases in Jsc as well as increasing rates of Voc degradation are related to voltage-dependent collection effects and catastrophic shunting respectively. Large decreases in Voc in the absence of catastrophic shunting are attributed to increased recombination. The relevance of capacitance-derived data correlated with both Voc and FF is discussed.
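
The 2nd-level analysis described above amounts to correlating efficiency with Voc, Jsc, and FF across stress measurements; the Python sketch below computes such correlation coefficients for a small hypothetical table, only to illustrate the kind of calculation involved, not the authors' data or PSpice model.

    import numpy as np

    # Hypothetical stress-test table: columns are efficiency (%), Voc (V), Jsc (mA/cm^2), FF (%)
    data = np.array([
        [12.1, 0.82, 21.0, 70.2],
        [11.4, 0.81, 20.8, 67.5],
        [10.6, 0.80, 20.7, 63.9],
        [9.8, 0.79, 20.5, 60.4],
    ])
    labels = ["Voc", "Jsc", "FF"]
    corr = np.corrcoef(data, rowvar=False)  # 4 x 4 correlation matrix; efficiency is column 0
    for i, name in enumerate(labels, start=1):
        print(f"corr(efficiency, {name}) = {corr[0, i]:+.3f}")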

Albin, D. S.; del Cueto, J. A.; Demtsu, S. H.; Bansal, S.

2011-03-01

312

Results of the independent verification of radiological remedial action at 396 South 2nd East Street, Monticello, Utah (MS00085)  

International Nuclear Information System (INIS)

In 1980 the site of a vanadium and uranium mill at Monticello, Utah, was accepted into the US Department of Energy's (DOE's) Surplus Facilities Management Program, with the objectives of restoring the government-owned mill site to safe levels of radioactivity, disposing of or containing the tailings in an environmentally safe manner, and performing remedial actions on off-site (vicinity) properties that had been contaminated by radioactive material resulting from mill operations. During 1985, UNC Geotech, the remedial action contractor designated by DOE, performed remedial action on the vicinity property at 396 South 2nd East Street, Monticello, Utah. The Pollutant Assessments Group (PAG) of Oak Ridge National Laboratory was assigned the responsibility of verifying the data supporting the adequacy of remedial action and confirming the site's compliance with DOE guidelines. The PAG found that the site successfully meets the DOE remedial action objectives. Procedures used by PAG are described. 34 refs., 2 tabs

313

COTS software selection process.  

Energy Technology Data Exchange (ETDEWEB)

Today's need for rapid software development has generated great interest in employing Commercial-Off-The-Shelf (COTS) software products as a way of managing cost, development time, and effort. With an abundance of COTS software packages to choose from, the problem now is how to systematically evaluate, rank, and select a COTS product that best meets the software project requirements and at the same time can leverage off the current corporate information technology architectural environment. This paper describes a systematic process for decision support in evaluating and ranking COTS software. Performed right after the requirements analysis, this process provides the evaluators with more concise, structured, and step-by-step activities for determining the best COTS software product with manageable risk. In addition, the process is presented in phases that are flexible to allow for customization or tailoring to meet various projects' requirements.
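
One widely used way to implement the systematic evaluation and ranking step described above is a weighted-score matrix; the Python sketch below uses made-up criteria, weights, and scores purely as an illustration, not the specific process defined in this paper.

    # Hypothetical criteria weights (summing to 1) and candidate scores on a 1-5 scale
    weights = {"fit_to_requirements": 0.40, "architecture_fit": 0.25, "cost": 0.20, "vendor_support": 0.15}
    candidates = {
        "Package A": {"fit_to_requirements": 4, "architecture_fit": 5, "cost": 3, "vendor_support": 4},
        "Package B": {"fit_to_requirements": 5, "architecture_fit": 3, "cost": 4, "vendor_support": 3},
    }

    def weighted_score(scores):
        """Weighted sum of criterion scores for one candidate package."""
        return sum(weights[criterion] * scores[criterion] for criterion in weights)

    for name in sorted(candidates, key=lambda c: weighted_score(candidates[c]), reverse=True):
        print(f"{name}: {weighted_score(candidates[name]):.2f}")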

Watkins, William M. (Strike Wire Technologies, Louisville, CO); Lin, Han Wei; McClelland, Kelly (U.S. Security Associates, Livermore, CA); Ullrich, Rebecca Ann; Khanjenoori, Soheil; Dalton, Karen; Lai, Anh Tri; Kuca, Michal; Pacheco, Sandra; Shaffer-Gant, Jessica

2006-05-01

314

Laparoscopic hepatectomy is theoretically better than open hepatectomy: preparing for the 2nd International Consensus Conference on Laparoscopic Liver Resection.  

Science.gov (United States)

Six years have passed since the first International Consensus Conference on Laparoscopic Liver Resection was held. This comparatively new surgical technique has evolved since then and is rapidly being adopted worldwide. We compared the theoretical differences between open and laparoscopic liver resection, using right hepatectomy as an example. We also searched the Cochrane Library using the keyword "laparoscopic liver resection." The papers retrieved through the search were reviewed, categorized, and applied to the clinical questions that will be discussed at the 2nd Consensus Conference. The laparoscopic hepatectomy procedure is more difficult to master than the open hepatectomy procedure because of the movement restrictions imposed upon us when we operate from outside the body cavity. However, good visibility of the operative field around the liver, which is located beneath the costal arch, and the magnifying effect provide for neat transection of the hepatic parenchyma. Another theoretical advantage is that pneumoperitoneum pressure reduces hemorrhage from the hepatic vein. The literature search turned up 67 papers, 23 of which we excluded, leaving only 44. Two randomized controlled trials (RCTs) are underway, but their results are yet to be published. Most of the studies (n = 15) concerned short-term results, with some addressing long-term results (n = 7), cost (n = 6), energy devices (n = 4), and so on. Laparoscopic hepatectomy is theoretically superior to open hepatectomy in terms of good visibility of the operative field due to the magnifying effect and reduced hemorrhage from the hepatic vein due to pneumoperitoneum pressure. However, there is as yet no evidence from previous studies to back this up in terms of short-term and long-term results. The 2nd International Consensus Conference on Laparoscopic Liver Resection will arrive at a consensus on the basis of the best available evidence, with video presentations focusing on surgical techniques and the publication of guidelines for the standardization of procedures based on the experience of experts. PMID:25130985

Wakabayashi, Go; Cherqui, Daniel; Geller, David A; Han, Ho-Seong; Kaneko, Hironori; Buell, Joseph F

2014-10-01

315

The 2nd DBCLS BioHackathon: interoperable bioinformatics Web services for integrated applications  

Directory of Open Access Journals (Sweden)

Full Text Available Abstract Background The interaction between biological researchers and the bioinformatics tools they use is still hampered by incomplete interoperability between such tools. To ensure interoperability initiatives are effectively deployed, end-user applications need to be aware of, and support, best practices and standards. Here, we report on an initiative in which software developers and genome biologists came together to explore and raise awareness of these issues: BioHackathon 2009. Results Developers in attendance came from diverse backgrounds, with experts in Web services, workflow tools, text mining and visualization. Genome biologists provided expertise and exemplar data from the domains of sequence and pathway analysis and glyco-informatics. One goal of the meeting was to evaluate the ability to address real world use cases in these domains using the tools that the developers represented. This resulted in (i) a workflow to annotate 100,000 sequences from an invertebrate species; (ii) an integrated system for analysis of the transcription factor binding sites (TFBSs) enriched based on differential gene expression data obtained from a microarray experiment; (iii) a workflow to enumerate putative physical protein interactions among enzymes in a metabolic pathway using protein structure data; (iv) a workflow to analyze glyco-gene-related diseases by searching for human homologs of glyco-genes in other species, such as fruit flies, and retrieving their phenotype-annotated SNPs. Conclusions Beyond deriving prototype solutions for each use-case, a second major purpose of the BioHackathon was to highlight areas of insufficiency. We discuss the issues raised by our exploration of the problem/solution space, concluding that there are still problems with the way Web services are modeled and annotated, including: (i) the absence of several useful data or analysis functions in the Web service "space"; (ii) the lack of documentation of methods; (iii) lack of compliance with the SOAP/WSDL specification among and between various programming-language libraries; and (iv) incompatibility between various bioinformatics data formats. Although it was still difficult to solve real world problems posed to the developers by the biological researchers in attendance because of these problems, we note the promise of addressing these issues within a semantic framework.

Katayama Toshiaki

2011-08-01

316

Operation of Finnish nuclear power plants. Quarterly report, 2nd quarter 1998  

International Nuclear Information System (INIS)

Quarterly reports on the operation of Finnish NPPs describe events and observations relating to nuclear and radiation safety that the Radiation and Nuclear Safety Authority (STUK) considers safety significant. Safety improvements at the plants are also described. The report includes a summary of the radiation safety of plant personnel and the environment and tabulated data on the plants' production and load factors. The Loviisa plant units were in power operation for the whole second quarter of 1998. The Olkiluoto units discontinued electricity generation for annual maintenance and also briefly for tests pertaining to the power upratings of the units. In addition, there were breaks in power generation at Olkiluoto 2 due to a low electricity demand in Midsummer and turbine balancing. The Olkiluoto units were in power operation in this quarter with the exception of the aforementioned breaks. The load factor average of the four plant units was 87.7%. The events in this quarter had no bearing on the nuclear or radiation safety. Occupational doses and radioactive releases off-site were below authorised limits. Radioactive substances were measurable in samples collected around the plants in such quantities only as have no bearing on the radiation exposure of the population. (orig.)

317

Operation of Finnish nuclear power plants. Quarterly report, 2nd quarter 1998  

Energy Technology Data Exchange (ETDEWEB)

Quarterly reports on the operation of Finnish NPPs describe events and observations relating to nuclear and radiation safety that the Radiation and Nuclear Safety Authority (STUK) considers safety significant. Safety improvements at the plants are also described. The report includes a summary of the radiation safety of plant personnel and the environment and tabulated data on the plants` production and load factors. The Loviisa plant units were in power operation for the whole second quarter of 1998. The Olkiluoto units discontinued electricity generation for annual maintenance and also briefly for tests pertaining to the power upratings of the units. In addition, there were breaks in power generation at Olkiluoto 2 due to a low electricity demand in Midsummer and turbine balancing. The Olkiluoto units were in power operation in this quarter with the exception of the aforementioned breaks. The load factor average of the four plant units was 87.7%. The events in this quarter had no bearing on the nuclear or radiation safety. Occupational doses and radioactive releases off-site were below authorised limits. Radioactive substances were measurable in samples collected around the plants in such quantities only as have no bearing on the radiation exposure of the population. (orig.)

Tossavainen, K. [ed.

1999-01-01

318

Software Testing  

Directory of Open Access Journals (Sweden)

Full Text Available Software goes through a cycle of software development stages. A software product is envisioned, created, evaluated, fixed and then put to use. To run any software consistently without any failure/bug/error, the most important step is to test the software. This paper describes various types of software testing (manual and automated), various software testing techniques such as black box, white box, gray box, sanity and functional testing, and software test life cycle models (V-model and W-model). This paper tries to correct the misconception of those who think that testing is to be done only after the coding phase; in reality it should be associated with each and every phase of the software life cycle.

Sarbjeet Singh

2010-10-01

319

Operation of Finnish nuclear power plants. Quarterly report, 2nd quarter 1996  

International Nuclear Information System (INIS)

Quarterly Reports on the operation of Finnish nuclear power plants describe events and observations relating to nuclear and radiation safety which the Finnish Centre for Radiation and Nuclear Safety (STUK) considers safety significant. Safety improvements at the plants are also described. The report also includes a summary of the radiation safety of plant personnel and of the environment and tabulated data on the plants' production and load factors. In the second quarter of 1996, the Finnish nuclear power plant units were in power operation except for the annual maintenance outages of TVO plant units and the Midsummer shutdown at TVO II which was due to low electricity demand, a turbine generator inspection and repairs. The load factor average of all plant units was 88.9 %. Events in the second quarter of 1996 were classified level 0 on the International Nuclear Event Scale (INES)

320

LIDAR using fundamental and 2nd harmonic Nd-laser light  

International Nuclear Information System (INIS)

The features of a LIDAR system based on a frequency-doubled Neodymium YAG laser are presented. With a diagnostic set-up using simultaneously the fundamental wavelength and the frequency-doubled output of the laser, a broad range of temperatures is accessible, making the diagnostic system feasible not only for the bulk plasma but for the outer regions as well. Advantage can be taken of the different polarisations of the two wavelengths generated by the Neodymium laser: the detection system can then be divided into two different detection paths, one for high and the other for low temperatures. Based on the expected plasma parameters of a next-step experiment, the expected performance of such a system is simulated. The advantages of a two-wavelength system are discussed and a lay-out for a feasible diagnostic system for a next-step machine is presented. (author). 6 refs, 10 figs, 2 tabs

 
 
 
 
321

A study on trait anger – anger expression and friendship commitment levels of primary school 2nd stage students who play – do not play sports  

Digital Repository Infrastructure Vision for European Research (DRIVER)

The aim of this research was to investigate trait anger-anger expression and friendship commitment levels depending on whether 2nd-stage students in primary schools played sports or not. A Personal Information Form, the Trait Anger-Anger Expression Scales and the Inventory of Peer Attachment were used to collect the data. The population of the research consisted of the students who studied in the 2nd stage of 40 primary state schools that belonged to the National Education Dire...

Hüseyin Kırımoğlu; Yunus Yıldırım; Ali Temiz

2010-01-01

322

The stages of development of a healthy way of life of senior pupils in native pedagogy (2nd part of XX century)

Directory of Open Access Journals (Sweden)

Full Text Available The stages of the formation and development of the problem of a healthy way of life of senior pupils in native pedagogy of the 2nd part of the XX century are defined. The peculiarities of forming a healthy way of life of senior pupils at every stage are analysed and disclosed. The contribution of native scientists and pedagogues of the 2nd part of the XX century to the solution of the problem of forming a healthy way of life of senior pupils is determined. For the stages investigated, the shortcomings that prevent the progressive development of a healthy way of life of senior pupils are defined.

Iermakova T.S.

2010-04-01

323

Generation of a vector of nodal forces produced by loads pre-set by the arbitrary sculpted surface designated for universal stress analysis software

Directory of Open Access Journals (Sweden)

Full Text Available The article presents the concept of a vector of nodal forces produced by loads pre-set over an arbitrary sculpted surface. The concept may be integrated into engineering CAD systems as a preprocessor. Under the proposed methodology, the initial surface load is a geometric object defined as a selection of standard graphic primitives. This technology is easy to use if the pre-processing component of the strength analysis system operates within the CAD environment. Multi-factor strength-related problems were resolved by the Department of Computer-Aided Design of Moscow State University of Roads. Researchers have developed and tested the KATRAN open-architecture strength analysis software, which may be integrated into the AutoCAD processor. A user selects the surface accommodating the simulated arbitrary load; then a point with the pre-set load intensity, specified in the Distributed Load Q field of the Distributed Loads interface window, and a point of zero load intensity are specified. These source data are used to calculate the scale coefficient for the transition from linear distances to the real value of the load intensity generated over the coordinate surface. The point of zero load intensity defines a virtual plane of zero distributed load values. The proposed software for converting arbitrary distributed loads into nodal loads is compact; therefore, it may be integrated into modules capable of exporting the nodal load into other strength analysis systems, while functioning as a problem-oriented geometrical utility of AutoCAD.
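
As a generic illustration of converting a distributed surface load into nodal forces (not the KATRAN algorithm), the Python sketch below computes the consistent nodal forces for a linearly varying pressure over a 3-node triangular element; the element coordinates and pressure values are hypothetical.

    import numpy as np

    def triangle_nodal_forces(coords, p_nodes):
        """Consistent nodal forces for a linearly varying pressure over a 3-node triangle.
        coords: (3, 2) node coordinates; p_nodes: pressure values at the three nodes."""
        (x1, y1), (x2, y2), (x3, y3) = coords
        area = 0.5 * abs((x2 - x1) * (y3 - y1) - (x3 - x1) * (y2 - y1))
        p1, p2, p3 = p_nodes
        return area / 12.0 * np.array([2 * p1 + p2 + p3, p1 + 2 * p2 + p3, p1 + p2 + 2 * p3])

    # Hypothetical element: unit right triangle with pressure falling linearly from 6.0 at one node to 0.0
    forces = triangle_nodal_forces(np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]]), p_nodes=(6.0, 0.0, 0.0))
    print(forces, forces.sum())  # nodal forces sum to area x mean pressure = 0.5 x 2.0 = 1.0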

Shaposhnikov Nikolay Nikolaevich

2012-03-01

324

Multi-atom resonant photoemission and the development of next-generation software and high-speed detectors for electron spectroscopy  

Energy Technology Data Exchange (ETDEWEB)

This dissertation has involved the exploration of a new effect in photoelectron emission, multi-atom resonant photoemission (MARPE), as well as the development of new software, data analysis techniques, and detectors of general use in such research. We present experimental and theoretical results related to MARPE, in which the photoelectron intensity from a core level on one atom is influenced by a core-level absorption resonance on another. We point out that some of our and others' prior experimental data has been strongly influenced by detector non-linearity and that the effects seen in new corrected data are smaller and of different form. Corrected data for the MnO(001) system with resonance between the O 1s and Mn 2p energy levels are found to be well described by an extension of well-known intraatomic resonant photoemission theory to the interatomic case, provided that interactions beyond the usual second-order Kramers-Heisenberg treatment are included. This theory is also found to simplify under certain conditions so as to yield results equivalent to a classical x-ray optical approach, with the latter providing an accurate and alternative, although less detailed and general, physical picture of these effects. Possible future applications of MARPE as a new probe of near-neighbor identities and bonding and its relationship to other known effects are also discussed. We also consider in detail specially written data acquisition software that has been used for most of the measurements reported here. This software has been used with an existing experimental system to develop the method of detector characterization and then data correction required for the work described above. The development of a next generation one-dimensional, high-speed, electron detector is also discussed. Our goal has been to design, build and test a prototype high-performance, one-dimensional pulse-counting detector that represents a significant advancement in detector technology and is well matched to modern high-brightness synchrotron radiation sources and high-transmission electron-energy analyzers as typically used in photoelectron spectroscopy experiments. The general design of the detector and the results of initial tests are discussed and the acquisition of photoelectron spectra with the first test detector is described.

Kay, Alexander William

2000-09-01

325

Multi-atom resonant photoemission and the development of next-generation software and high-speed detectors for electron spectroscopy  

International Nuclear Information System (INIS)

This dissertation has involved the exploration of a new effect in photoelectron emission, multi-atom resonant photoemission (MARPE), as well as the development of new software, data analysis techniques, and detectors of general use in such research. We present experimental and theoretical results related to MARPE, in which the photoelectron intensity from a core level on one atom is influenced by a core-level absorption resonance on another. We point out that some of our and others' prior experimental data has been strongly influenced by detector non-linearity and that the effects seen in new corrected data are smaller and of different form. Corrected data for the MnO(001) system with resonance between the O 1s and Mn 2p energy levels are found to be well described by an extension of well-known intraatomic resonant photoemission theory to the interatomic case, provided that interactions beyond the usual second-order Kramers-Heisenberg treatment are included. This theory is also found to simplify under certain conditions so as to yield results equivalent to a classical x-ray optical approach, with the latter providing an accurate and alternative, although less detailed and general, physical picture of these effects. Possible future applications of MARPE as a new probe of near-neighbor identities and bonding and its relationship to other known effects are also discussed. We also consider in detail specially written data acquisition software that has been used for most of the measurements reported here. This software has been used with an existing experimental system to develop the method of detector characterization and then data correction required for the work described above. The development of a next generation one-dimensional, high-speed, electron detector is also discussed. Our goal has been to design, build and test a prototype high-performance, one-dimensional pulse-counting detector that represents a significant advancement in detector technology and is well matched to modern high-brightness synchrotron radiation sources and high-transmission electron-energy analyzers as typically used in photoelectron spectroscopy experiments. The general design of the detector and the results of initial tests are discussed and the acquisition of photoelectron spectra with the first test detector is described

326

Multi-atom resonant photoemission and the development of next-generation software and high-speed detectors for electron spectroscopy  

Science.gov (United States)

This dissertation has involved the exploration of a new effect in photoelectron emission, multi-atom resonant photoemission (MARPE), as well as the development of new software, data analysis techniques, and detectors of general use in such research. We present experimental and theoretical results related to MARPE, in which the photoelectron intensity from a core level on one atom is influenced by a core-level absorption resonance on another. We point out that some of our and others' prior experimental data has been strongly influenced by detector non-linearity and that the effects seen in new corrected data are smaller and of different form. Corrected data for the MnO(001) system with resonance between the O 1s and Mn 2p energy levels are found to be well described by an extension of well-known intraatomic resonant photoemission theory to the interatomic case, provided that interactions beyond the usual second-order Kramers-Heisenberg treatment are included. This theory is also found to simplify under certain conditions so as to yield results equivalent to a classical x-ray optical approach, with the latter providing an accurate and alternative, although less detailed and general, physical picture of these effects. Possible future applications of MARPE as a new probe of near-neighbor identities and bonding and its relationship to other known effects are also discussed. We also consider in detail specially written data acquisition software that has been used for most of the measurements reported here. This software has been used with an existing experimental system to develop the method of detector characterization and then data correction required for the work described above. The development of a next generation one-dimensional, high-speed, electron detector is also discussed. Our goal has been to design, build and test a prototype high-performance, one-dimensional pulse-counting detector that represents a significant advancement in detector technology and is well matched to modern high-brightness synchrotron radiation sources and high-transmission electron-energy analyzers as typically used in photoelectron spectroscopy experiments. The general design of the detector and the results of initial tests are discussed and the acquisition of photoelectron spectra with the first test detector is described.

Kay, Alexander William

2000-10-01

327

Report of 2nd workshop on particle process. A report of the Yayoi study meeting  

International Nuclear Information System (INIS)

In the Nuclear Engineering Research Laboratory, Faculty of Engineering, University of Tokyo, short-term research meetings named the Yayoi Research Group, a joint application research activity of the nuclear reactor (Yayoi) and the electron linac in Japan, have been held more than 10 times a year. This report arranges the summaries of 'Research on Particle Method', one of these meetings, held on August 7, 1996. The 'particle method', as named here, which describes and calculates fluids and powders as a group of particles, is better suited than the conventional lattice-based methods to problems with boundary interfaces and large deformations of the fluids, and further development of it can be expected. This report contains the following studies: 1) Stress analysis without the need for element breakdown, 2) Local interpolation differential operator method and non-structural lattices, 3) Self-organized simulation of dynamical construction, 4) A lattice BGK solution of laminar flow over a backward-facing step, 5) Numerical analysis of solid-gas two-phase flow using the discrete element method, 6) Application of flow analysis techniques to power generation plant equipment, 7) Corrision wave captured flow calculation using the particle method, and 8) Analysis of complex thermal flow problems using the particle (MPS) method. (G.K.)

328

2nd Workshop on Jet Modification in the RHIC and LHC Era  

CERN Document Server

A workshop organized jointly by the Wayne State Heavy Ion Group and the JET Collaboration. The goal of this 2 1/2 day meeting is to review the most important new experimental measurements and theoretical breakthroughs that have occurred in the past year and to thoroughly explore the limits of perturbative QCD based approaches to the description of hard processes in heavy-ion collisions. Over the period of three days, topics covered will include new experimental observables that may discern between different perturbative approaches, the inevitable transformation of analytic schemes to Monte-Carlo event generators, and the progress made towards Next-to-Leading-Order calculations of energy loss. The workshop is intended to be slow paced: we envision a mixture of longer invited talks and shorter contributed talks, allowing sufficient time for discussion, as well as time to follow up on more technical aspects of the data analysis and theoretical calculations. One of the outcomes of this workshop will be a ...

2013-01-01

329

Incretinas, incretinomiméticos, inhibidores de DPP IV: (2ª parte) / Incretins, Incretinmimetics, Inhibitors (2nd part)  

Scientific Electronic Library Online (English)

Full Text Available SciELO Argentina | Language: Spanish Abstract in spanish En los últimos años se reconoce un nuevo mecanismo involucrado en la fisiopatología de la Diabetes Mellitus tipo 2: el déficit de producción y/o acción de las incretinas. Las incretinas son enterohormonas que estimulan la secreción de insulina en respuesta a la ingesta de nutrientes. Glucagon like p [...] eptide-1 (GLP1) y Polipéptido insulinotrópico glucosa dependiente (GIP) son las principales incretinas descubiertas hasta hoy. Ambas presentan también efecto trófico sobre las células beta de los islotes pancreáticos. GLP-1 presenta otras acciones como son la inhibición de la secreción de glucagón, enlentecimiento del vaciamiento gástrico e inhibición del apetito. Ambas incretinas son rápidamente clivadas por la enzima dipeptidil peptidasa 4 (DPP-4). Nuevas drogas como los incretinomiméticos, análogos y los inhibidores de DPP-4 se presentan como una terapéutica prometedora para los pacientes con diabetes tipo 2. Conflicto de intereses: Dr. León Litwak - Miembro del Board Latinoamericano de Eli Lilly y Sanofi Aventis - Miembro del Board Nacional de los laboratorios Novo Nordisk, Novartis, GlaxoSmithKline, Sanofi Aventis, Boheringer Ingelheim, Bristol Myers, Astra Zeneca - Investigador principal de protocolos pertenecientes a Eli Lilly, Novo Nordisk, Novartis, GlaxoSmithKline, Takeda, PPDF, Pfizer, Merck Sharp and Dôhme, Amger, Roche, Minimed, Quintiles - Conferencista de los laboratorios mencionados. Abstract in english Two main pathophysiological mechanisms are currently involved in Type 2 Diabetes (T2DM), insulin resistance and impairment of beta cell function. However, in recent years a new mechanism was reported: a significant decrease in incretins production and/or action. Incretins are gastrointestinal hormon [...] es whose main action is stimulating insulin secretion in response to nutrients. The best known incretins are glucagon like peptide-1 (GLP-1) and Gastric insulinotropic peptide (GIP). GLP-1 and GIP not only increase insulin secretion, but also decrease glucagon secretion, slow gastric emptying and reduce apetite, generating weight loss. Both incretins are rapidly clived by the enzyme dipeptidil peptidase 4 (DPP4). In order to emulate incretins action, several drugs were developed: GLP-1 receptor agonists, GLP-1 mimetics, and DPP4 inhibitors. All of them seem to became a very promising tool for the treatment of T2DM. Financial Interests: Dr. León Litwak - Member of the Latin American Board of Eli Lilly and Sanofi Aventis - Member of the National Board of the following laboratories: Novo Nordisk, Novartis, GlaxoSmithKlein Sanofi, Aventis, Boheringer Ingelheim, Bristol Myers, Astra Zeneca - Principal Investigator of Protocols from: Eli Lilly, Novo Nordisk, Novartis, GlaxoSmithKlein, Takeda, PPDF, Pfizer, Merck Sharp and Dôhme, Amgen, Roche, Minimed, Quintiles - Lecturer for the former laboratories.

Claudia, Bayón; Mercedes Araceli, Barriga; León, Litwak.

330

Incretinas, incretinomiméticos, inhibidores de DPP IV: (2ª parte) / Incretins, Incretinmimetics, Inhibitors (2nd part)  

Scientific Electronic Library Online (English)

Full Text Available SciELO Argentina | Language: Spanish Abstract in spanish En los últimos años se reconoce un nuevo mecanismo involucrado en la fisiopatología de la Diabetes Mellitus tipo 2: el déficit de producción y/o acción de las incretinas. Las incretinas son enterohormonas que estimulan la secreción de insulina en respuesta a la ingesta de nutrientes. Glucagon like p [...] eptide-1 (GLP1) y Polipéptido insulinotrópico glucosa dependiente (GIP) son las principales incretinas descubiertas hasta hoy. Ambas presentan también efecto trófico sobre las células beta de los islotes pancreáticos. GLP-1 presenta otras acciones como son la inhibición de la secreción de glucagón, enlentecimiento del vaciamiento gástrico e inhibición del apetito. Ambas incretinas son rápidamente clivadas por la enzima dipeptidil peptidasa 4 (DPP-4). Nuevas drogas como los incretinomiméticos, análogos y los inhibidores de DPP-4 se presentan como una terapéutica prometedora para los pacientes con diabetes tipo 2. Conflicto de intereses: Dr. León Litwak - Miembro del Board Latinoamericano de Eli Lilly y Sanofi Aventis - Miembro del Board Nacional de los laboratorios Novo Nordisk, Novartis, GlaxoSmithKline, Sanofi Aventis, Boheringer Ingelheim, Bristol Myers, Astra Zeneca - Investigador principal de protocolos pertenecientes a Eli Lilly, Novo Nordisk, Novartis, GlaxoSmithKline, Takeda, PPDF, Pfizer, Merck Sharp and Dôhme, Amger, Roche, Minimed, Quintiles - Conferencista de los laboratorios mencionados. Abstract in english Two main pathophysiological mechanisms are currently involved in Type 2 Diabetes (T2DM), insulin resistance and impairment of beta cell function. However, in recent years a new mechanism was reported: a significant decrease in incretins production and/or action. Incretins are gastrointestinal hormon [...] es whose main action is stimulating insulin secretion in response to nutrients. The best known incretins are glucagon like peptide-1 (GLP-1) and Gastric insulinotropic peptide (GIP). GLP-1 and GIP not only increase insulin secretion, but also decrease glucagon secretion, slow gastric emptying and reduce apetite, generating weight loss. Both incretins are rapidly clived by the enzyme dipeptidil peptidase 4 (DPP4). In order to emulate incretins action, several drugs were developed: GLP-1 receptor agonists, GLP-1 mimetics, and DPP4 inhibitors. All of them seem to became a very promising tool for the treatment of T2DM. Financial Interests: Dr. León Litwak - Member of the Latin American Board of Eli Lilly and Sanofi Aventis - Member of the National Board of the following laboratories: Novo Nordisk, Novartis, GlaxoSmithKlein Sanofi, Aventis, Boheringer Ingelheim, Bristol Myers, Astra Zeneca - Principal Investigator of Protocols from: Eli Lilly, Novo Nordisk, Novartis, GlaxoSmithKlein, Takeda, PPDF, Pfizer, Merck Sharp and Dôhme, Amgen, Roche, Minimed, Quintiles - Lecturer for the former laboratories.

Claudia, Bayón; Mercedes Araceli, Barriga; León, Litwak.

2010-09-01

331

Incretinas, incretinomiméticos, inhibidores de DPP IV: (2ª parte) / Incretins, Incretinmimetics, Inhibitors (2nd part)

Directory of Open Access Journals (Sweden)

Full Text Available Two main pathophysiological mechanisms have traditionally been invoked in type 2 diabetes (T2DM): insulin resistance and impairment of beta-cell function. In recent years, however, a new mechanism has been recognized: a significant decrease in the production and/or action of the incretins. Incretins are gastrointestinal hormones whose main action is to stimulate insulin secretion in response to nutrient intake. The best known incretins discovered to date are glucagon-like peptide-1 (GLP-1) and glucose-dependent insulinotropic polypeptide (GIP). Both also exert a trophic effect on the beta cells of the pancreatic islets. GLP-1 has further actions, including inhibition of glucagon secretion, slowing of gastric emptying and inhibition of appetite, generating weight loss. Both incretins are rapidly cleaved by the enzyme dipeptidyl peptidase 4 (DPP-4). New drugs such as incretin mimetics, GLP-1 analogues and DPP-4 inhibitors appear to be a promising therapy for patients with type 2 diabetes. Conflict of interest: Dr. León Litwak - Member of the Latin American Board of Eli Lilly and Sanofi Aventis - Member of the National Board of the laboratories Novo Nordisk, Novartis, GlaxoSmithKline, Sanofi Aventis, Boehringer Ingelheim, Bristol Myers, AstraZeneca - Principal investigator of protocols from Eli Lilly, Novo Nordisk, Novartis, GlaxoSmithKline, Takeda, PPDF, Pfizer, Merck Sharp and Dohme, Amgen, Roche, Minimed, Quintiles - Lecturer for the aforementioned laboratories.

Claudia Bayón

2010-09-01

332

Software Complexity Methodologies & Software Security  

Digital Repository Infrastructure Vision for European Research (DRIVER)

Complexity is widely recognized as an inherent feature of software: a system's natural complexity and its required functionality are inseparable, and each has its own characteristic range. This paper explains complexity measurement using the McCabe and Halstead models and discusses software complexity through an example; the Henry-Kafura information flow metric, the Agresti-Card-Glass system complexity metric and item-level design metrics are then compared and examined, then ca...
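Because the abstract leans on the McCabe and Halstead models, a compact illustration may help. The Python sketch below is an assumption for illustration (real tools parse source code; here the operator/operand and decision-point counts are supplied by hand): it computes the basic Halstead measures and a McCabe-style cyclomatic complexity.

```python
import math

# Hedged sketch: Halstead measures from token counts and a McCabe-style
# cyclomatic complexity from decision points. Real tools parse source code;
# here the counts are supplied directly for clarity.
def halstead(n1, n2, N1, N2):
    """n1/n2: distinct operators/operands, N1/N2: total operators/operands."""
    vocabulary = n1 + n2
    length = N1 + N2
    volume = length * math.log2(vocabulary)
    difficulty = (n1 / 2) * (N2 / n2)
    effort = difficulty * volume
    return {"volume": volume, "difficulty": difficulty, "effort": effort}

def mccabe(decision_points):
    """Cyclomatic complexity for a single-entry, single-exit routine."""
    return decision_points + 1

print(halstead(n1=10, n2=7, N1=28, N2=22))
print(mccabe(decision_points=4))  # e.g. 3 ifs + 1 while -> complexity 5
```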

Masoud Rafighi; Nasser Modiri

2011-01-01

333

Workshop report on the 2nd Joint ENCCA/EuroSARC European bone sarcoma network meeting: integration of clinical trials with tumour biology  

Science.gov (United States)

This is the report of the 2nd Joint ENCCA/EuroSARC European Bone Sarcoma Network Meeting held in Leiden, The Netherlands, on 26-27 September 2013, bringing together preclinical and clinical investigators on bone sarcoma. The purpose of this workshop was to present the achievements of biological research and clinical trials in bone sarcomas and to stimulate crosstalk.

2014-01-01

334

The 2008—2013 Crisis as Metastasis. A Preview of the 2nd edition of The Cancer Stage of Capitalism by Pluto Press  

Directory of Open Access Journals (Sweden)

Full Text Available Through a selection of relevant excerpts, this article offers a preview of the 2nd edition of John McMurtry's prophetic 1999 book "The Cancer Stage of Capitalism", published by Pluto Press under the title "The Cancer Stage of Capitalism and Its Cure".

John McMurtry

2013-03-01

335

ENDF/B-5 formats manual. Revised update pages of Nov. 1983. Reprint of B.A. Magurno, BNL-NCS-50496 (ENDF-102) 2nd Edition

International Nuclear Information System (INIS)

The ENDF-5 Format, originally the format of the US Evaluated Nuclear Data File ENDF/B-5, was internationally recommended for the computer storage, processing and exchange of evaluated neutron nuclear data. The pages included in this document serve as an update to the original ENDF-5 Formats Manual BNL-NCS-50496 [ENDF-102] 2nd Edition, October 1979. (author)

336

Design of control system for the 2nd and 3rd charge exchange system in J-PARC 3GeV RCS  

International Nuclear Information System (INIS)

The J-PARC 3 GeV synchrotron accelerator uses the charge-exchange injection method with three carbon foils, and three charge exchange devices are installed in the facility to achieve this injection. These devices are controlled by one control system. The 2nd and 3rd charge exchange devices are being upgraded to improve the maintainability and the pumping capability of the vacuum unit, and the control system has been reconsidered accordingly. The basic policy of the redesign is to separate the 2nd and 3rd devices from the centralized control system of the three devices and to reconstruct their control as a system independent of the centralized one; the upgrade of the 2nd and 3rd charge exchange devices proceeds on this basis. Because the control becomes stand-alone, the safety interlock unit also needs to be redesigned. At present, the error signals of the three charge exchange units are consolidated into a single error signal that operates the Machine Protection System (MPS), so it takes a long time to find out where and why an error occurred. With the MPS operated by the error signal of each individual unit, we expect the cause of an error to be located much more easily. The 2nd and 3rd charge exchange units adopt a simple control system using Yokogawa Electric PLC FA-M3, and we are designing a control system with safety features that integrates the drive unit and the vacuum unit. This report describes the design of the 2nd and 3rd charge exchange unit control system, for which the hardware of these units was reconstructed. (author)

337

Report on the 2nd European conference on computer-aided design (CAD) in small- and medium-size industries (MICAD 82)  

Energy Technology Data Exchange (ETDEWEB)

A summary is presented of the 2nd European conference on computer aided design (CAD) in small- and medium-size industries (MICAD82) held in Paris, France, September 21-23, 1982. The conference emphasized applications of CAD in industries with limited investment resources and which are forced to innovate in order to sustain competition.

Magnuson, W.G. Jr.

1982-10-01

338

FragVLib a free database mining software for generating "Fragment-based Virtual Library" using pocket similarity search of ligand-receptor complexes  

Directory of Open Access Journals (Sweden)

Full Text Available Abstract Background With the exponential increase in the number of available ligand-receptor complexes, researchers are becoming more dedicated to mine these complexes to facilitate the drug design and development process. Therefore, we present FragVLib, free software which is developed as a tool for performing similarity search across database(s) of ligand-receptor complexes for identifying binding pockets which are similar to that of a target receptor. Results The search is based on 3D-geometric and chemical similarity of the atoms forming the binding pocket. For each match identified, the ligand's fragment(s) corresponding to that binding pocket are extracted, thus, forming a virtual library of fragments (FragVLib) that is useful for structure-based drug design. Conclusions An efficient algorithm is implemented in FragVLib to facilitate the pocket similarity search. The resulting fragments can be used for structure-based drug design tools such as Fragment-Based Lead Discovery (FBLD). They can also be used for finding bioisosteres and as an idea generator.
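The geometric half of such a pocket comparison can be sketched briefly. The Python code below is an illustrative assumption rather than FragVLib's actual algorithm: it superposes two binding-pocket atom sets with the Kabsch algorithm and reports the RMSD, assuming a one-to-one correspondence between chemically matching atoms has already been established.

```python
import numpy as np

# Hedged sketch of the geometric part of a pocket comparison: optimal
# superposition (Kabsch algorithm) of two corresponding atom sets, then RMSD.
def kabsch_rmsd(P, Q):
    """P, Q: (n, 3) arrays of matched pocket-atom coordinates."""
    P = P - P.mean(axis=0)
    Q = Q - Q.mean(axis=0)
    H = P.T @ Q                              # covariance of the two clouds
    U, S, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # avoid an improper rotation
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    P_rot = P @ R.T
    return float(np.sqrt(((P_rot - Q) ** 2).sum() / len(P)))

pocket_a = np.random.rand(12, 3) * 10.0
pocket_b = pocket_a + np.random.normal(scale=0.2, size=pocket_a.shape)
print(kabsch_rmsd(pocket_a, pocket_b))       # small RMSD -> similar pockets
```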

Khashan Raed

2012-08-01

339

2nd order spline interpolation of the Abel transformation for use in cylindrically-symmetric radiative source  

International Nuclear Information System (INIS)

Inversion of the observed transverse radiance and transmittance, M(z) and N(z), into the radial emission coefficient J(r) in a cylindrically-symmetric radiation source leads to solving generalized Abel equations of the form S(z) = 2 ∫_z^R J(r) K(z,r) r dr / √(r² − z²), 0 ≤ z ≤ R. These equations can be solved analytically. In 1981, Young proposed an iteration method (YI). After 1990, we successively proposed a piecewise linear interpolation (PLI) and a block-bi-quadric interpolation (BBQI). In this paper, we note that the emission coefficient J(r) is sufficiently smooth and symmetric at r = 0, from which it follows that J'(0) = 0. Using this condition, we propose a 2nd-order spline interpolation (2OSI). The Abel equation is thereby reduced to a system of linear algebraic equations whose coefficient matrix is an upper Hessenberg matrix, so it can be solved easily and rapidly, and the results obtained from the method are smoother than those of other methods; experimenters can therefore apply it easily. We have solved two examples using the present 2OSI. The results show that the computed values converge to the exact solutions as the number of nodes n increases. The method (2OSI) is therefore effective and reasonable.
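As a rough numerical illustration of this kind of inversion (not the paper's 2OSI scheme), the following Python sketch discretizes the simple Abel equation S(z) = 2 ∫_z^R J(r) r dr / √(r² − z²) with a piecewise-constant J on annular shells (onion peeling) and solves the resulting upper-triangular system; the test profile and grid are invented.

```python
import numpy as np

# Hedged sketch: onion-peeling discretization of S(z) = 2 * int_z^R J(r) r dr / sqrt(r^2 - z^2),
# assuming J(r) is constant on each shell [r_i, r_{i+1}]. The shell integral has the
# closed form sqrt(r_{i+1}^2 - z^2) - sqrt(r_i^2 - z^2), giving an upper-triangular system.
def onion_peeling_matrix(r):
    n = len(r) - 1                       # number of shells
    A = np.zeros((n, n))
    z = r[:-1]                           # evaluate S at the inner edge of each shell
    for j in range(n):                   # row: observation at z_j
        for i in range(j, n):            # column: shell i contributes only if r_i >= z_j
            A[j, i] = 2.0 * (np.sqrt(r[i + 1] ** 2 - z[j] ** 2)
                             - np.sqrt(r[i] ** 2 - z[j] ** 2))
    return A, z

R, n = 1.0, 200
r = np.linspace(0.0, R, n + 1)
A, z = onion_peeling_matrix(r)
J_true = 1.0 - z ** 2                    # illustrative emission coefficient per shell
S = A @ J_true                           # synthetic transverse profile
J_rec = np.linalg.solve(A, S)            # inversion (noise-free here)
print(np.max(np.abs(J_rec - J_true)))    # ~1e-12: recovers the assumed profile
```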

340

Effect of CO2, Nd:YAG, and Er:YAG lasers on dentin and pulp tissues in dogs  

Science.gov (United States)

Although there has been interest in lasers in dentistry since lasers were first developed in the early 1960s, this interest was limited until recently. Over the past five years there has been a flurry of interest in finding the most effective wavelength and treatment parameters, and with this interest have come clinical and experimental reports. This project is a pilot study investigating laser effects on dogs' teeth. Multiple teeth from two dogs (n = 40) were treated using either a CO2, Nd:YAG, or Er:YAG laser, or slow-speed rotary instrumentation. One dog died after treatment and was not used in this study. The second dog was sacrificed four days after treatment with the lasers, and the teeth were decalcified and processed for light microscopy. The dentin and pulpal tissues were then evaluated for changes from their normal histologic patterns. The purpose of this study was first to determine whether the dog would be a good model for in-vivo histologic testing of lasers and second to evaluate the histologic effects of different lasers on dogs' teeth. Our findings suggest that each laser causes a different degree of effect in the treated teeth. The specifics of these effects are discussed herein.

Abt, Elliot; Wigdor, Harvey A.; Walsh, Joseph T., Jr.; Brown, Joseph D.

1992-06-01

 
 
 
 
341

Explicit formulas for 2nd-order driving terms due to sextupoles and chromatic effects of quadrupoles.  

Energy Technology Data Exchange (ETDEWEB)

Optimization of nonlinear driving terms has become a useful tool for designing storage rings, especially modern light sources where the strong nonlinearity is dominated by the large chromatic effects of quadrupoles and the strong sextupoles used for chromaticity control. The Lie algebraic method is well known for computing such driving terms. However, it appears that there was a lack of explicit formulas in the public domain for such computation, resulting in uncertainty and/or inconsistency in widely used codes. This note presents explicit formulas for driving terms due to sextupoles and chromatic effects of quadrupoles, which can be considered as thin elements. The computation is accurate to the 4th-order Hamiltonian and 2nd order in terms of magnet parameters. The results given here are the same as the APS internal note AOP-TN-2009-020. This internal note has been revised and published here as a Light Source Note in order to get this information into the public domain, since both ELEGANT and OPA are using these formulas.

Wang, C-X. (Accelerator Systems Division (APS))

2012-04-25

342

Contractions of 2D 2nd Order Quantum Superintegrable Systems and the Askey Scheme for Hypergeometric Orthogonal Polynomials  

Directory of Open Access Journals (Sweden)

Full Text Available We show explicitly that all 2nd order superintegrable systems in 2 dimensions are limiting cases of a single system: the generic 3-parameter potential on the 2-sphere, S9 in our listing. We extend the Wigner-Inönü method of Lie algebra contractions to contractions of quadratic algebras and show that all of the quadratic symmetry algebras of these systems are contractions of that of S9. Amazingly, all of the relevant contractions of these superintegrable systems on flat space and the sphere are uniquely induced by the well known Lie algebra contractions of e(2) and so(3). By contracting function space realizations of irreducible representations of the S9 algebra (which give the structure equations for Racah/Wilson polynomials) to the other superintegrable systems, and using Wigner's idea of ''saving'' a representation, we obtain the full Askey scheme of hypergeometric orthogonal polynomials. This relationship directly ties the polynomials and their structure equations to physical phenomena. It is more general because it applies to all special functions that arise from these systems via separation of variables, not just those of hypergeometric type, and it extends to higher dimensions.

Ernest G. Kalnins

2013-10-01

343

“Software Engineering” is Engineering  

Directory of Open Access Journals (Sweden)

Full Text Available Determining whether "Software Engineering" is engineering is an issue that generates discussion in many universities where, for decades, computer science programs have used the term to describe individual courses and have claimed it as part of their discipline. However, some engineering faculties demand it as a new and necessary specialty among the traditional engineering disciplines. This article analyzes the differences between Computer Science and "Software Engineering" as engineering, and argues the need for a program in the latter that follows the traditional engineering approach, aiming at the professional formation of software engineers.

Edgar Serna M.

2011-12-01

344

Fractally Generated Microstrip Bandpass Filter Designs Basedon Dual-Mode Square Ring Resonator for WirelessCommunication Systems  

Directory of Open Access Journals (Sweden)

Full Text Available A novel fractal design scheme has been introduced in this paper to generate microstrip bandpass filter designs with miniaturized sizes for wireless applications. The presented fractal scheme is based on Minkowski-like prefractal geometry. The space-filling property and self-similarity of this fractal geometry have been found to produce reduced-size symmetrical structures corresponding to the successive iteration levels. The resulting filter designs have sizes suitable for use in modern wireless communication systems. The performance of each of the generated bandpass filter structures up to the 2nd iteration has been analyzed using the method of moments (MoM) based software IE3D, which is widely adopted in microwave research and industry. Results show that these filters possess good transmission and return loss characteristics and miniaturized sizes, meeting the design specifications of most wireless communication systems.
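The iterative construction behind such a Minkowski-like prefractal can be sketched in a few lines. The Python code below is an illustrative assumption (a generic segment-replacement rule, not the exact generator used for the filter above): each iteration replaces every segment of a square ring with five sub-segments forming a rectangular indentation, and the 2nd iteration matches the highest iteration level analyzed in the paper.

```python
import numpy as np

# Hedged sketch: apply a Minkowski-like indentation rule to each side of a
# square ring. Each segment is replaced by 5 sub-segments forming a notch
# whose depth is w times the segment length.
def subdivide(p0, p1, w=0.25):
    p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
    d = p1 - p0                      # segment vector
    n = np.array([-d[1], d[0]])      # normal (same magnitude as d) sets notch depth
    pts = [p0, p0 + d / 3, p0 + d / 3 + w * n, p0 + 2 * d / 3 + w * n, p0 + 2 * d / 3]
    return list(zip(pts, pts[1:] + [p1]))

def minkowski(segments, iterations, w=0.25):
    for _ in range(iterations):
        segments = [s for seg in segments for s in subdivide(*seg, w=w)]
    return segments

# Unit square ring, taken to the 2nd iteration as in the analysis above.
square = [((0, 0), (1, 0)), ((1, 0), (1, 1)), ((1, 1), (0, 1)), ((0, 1), (0, 0))]
print(len(minkowski(square, 2)))     # 4 * 5**2 = 100 segments
```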

Jawad K. Ali

2008-01-01

345

UWB Tracking Software Development  

Science.gov (United States)

An Ultra-Wideband (UWB) two-cluster Angle of Arrival (AOA) tracking prototype system is currently being developed and tested at NASA Johnson Space Center for space exploration applications. This talk discusses the software development efforts for this UWB two-cluster AOA tracking system. The role the software plays in this system is to take waveform data from two UWB radio receivers as an input, feed this input into an AOA tracking algorithm, and generate the target position as an output. The architecture of the software (Input/Output Interface and Algorithm Core) will be introduced in this talk. The development of this software has three phases. In Phase I, the software is mostly Matlab driven and calls C++ socket functions to provide the communication links to the radios. This is beneficial in the early stage when it is necessary to frequently test changes in the algorithm. Phase II of the development is to have the software mostly C++ driven and call a Matlab function for the AOA tracking algorithm. This is beneficial in order to send the tracking results to other systems and also to improve the tracking update rate of the system. The third phase is part of future work and is to have the software completely C++ driven with a graphics user interface. This software design enables the fine resolution tracking of the UWB two-cluster AOA tracking system.
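The geometric core of two-cluster AOA tracking, turning two angle estimates into a position fix, can be illustrated compactly. The following Python sketch is an assumption for illustration (cluster positions, angles and function names are invented, and it is not the JSC implementation): it intersects the two bearing lines defined by the angles measured at each receiver cluster.

```python
import numpy as np

# Hedged sketch: given the AOA estimated at two receiver clusters with known
# positions, the target lies at the intersection of the two bearing lines.
def aoa_fix(p1, theta1, p2, theta2):
    """p1, p2: cluster positions (2-vectors); theta1, theta2: bearings in radians."""
    u1 = np.array([np.cos(theta1), np.sin(theta1)])
    u2 = np.array([np.cos(theta2), np.sin(theta2)])
    # Solve p1 + t1*u1 = p2 + t2*u2 for the range parameters t1, t2.
    A = np.column_stack((u1, -u2))
    t = np.linalg.solve(A, np.asarray(p2, float) - np.asarray(p1, float))
    return np.asarray(p1, float) + t[0] * u1

target = aoa_fix(p1=(0.0, 0.0), theta1=np.deg2rad(45.0),
                 p2=(10.0, 0.0), theta2=np.deg2rad(135.0))
print(target)   # -> [5. 5.]: both bearings point at (5, 5)
```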

Gross, Julia; Arndt, Dickey; Ngo, Phong; Phan, Chau; Dusl, John; Ni, Jianjun; Rafford, Melinda

2006-01-01

346

Universe (2nd edition)  

International Nuclear Information System (INIS)

A general text on astronomy is presented. The foundations of the science are reviewed, including descriptions of naked-eye observations of eclipses and planetary motions and such basic tools as Kepler's laws, the fundamental properties of light, and the optics of telescopes. The formation of the solar system is addressed, and the planets and their satellites are discussed individually. Solar science is treated in detail. Stellar evolution is described chronologically from birth to death. Molecular clouds, star clusters, nebulae, neutron stars, black holes, and various other phenomena that occur in the life of a star are examined in the sequence in which they naturally occur. A survey of the Milky Way introduces galactic astronomy. Quasars and cosmology are addressed, including the most recent developments in research. 156 references

347

PREFACE: Proceedings of the 2nd International Conference on Quantum Simulators and Design (Tokyo, Japan, 31 May-3 June 2008) Proceedings of the 2nd International Conference on Quantum Simulators and Design (Tokyo, Japan, 31 May-3 June 2008)  

Science.gov (United States)

This special issue of Journal of Physics: Condensed Matter comprises selected papers from the proceedings of the 2nd International Conference on Quantum Simulators and Design (QSD2008) held in Tokyo, Japan, between 31 May and 3 June 2008. This conference was organized under the auspices of the Development of New Quantum Simulators and Quantum Design Grant-in-Aid for Scientific Research on Priority Areas, Ministry of Education, Culture, Sports, Science and Technology of Japan (MEXT). The conference focused on the development of first principles electronic structure calculations and their applications. The aim was to provide an opportunity for discussion on the progress in computational materials design and, in particular, the development of quantum simulators and quantum design. Computational materials design is a computational approach to the development of new materials. The essential ingredient is the use of quantum simulators to design a material that meets a given specification of properties and functionalities. For this to be successful, the quantum simulator should be very reliable and be applicable to systems of realistic size. During the conference, new methods of quantum simulation and quantum design were discussed including methods beyond the local density approximation of density functional theory, order-N methods, methods dealing with excitations and reactions, and the application of these methods to the design of novel materials, devices and systems. The conference provided an international forum for experimental and theoretical researchers to exchange ideas. A total of 220 delegates from eight countries participated in the conference. There were 13 invited talks, ten oral presentations and 120 posters. The 3rd International Conference on Quantum Simulators and Design will be held in Germany in the autumn of 2011.

Akai, Hisazumi; Tsuneyuki, Shinji

2009-02-01

348

DNA sequencing - spanning the generations.  

Science.gov (United States)

Nucleic acid sequencing is the mainstay of biological research. There are several generations of DNA sequencing technologies that can be well characterized through their nature and the kind of output they provide. Dideoxy terminator sequencing developed by Sanger dominated for 30 years and was the workhorse used for the Human Genome Project. In 2005 the first 2nd generation sequencer was presented with an output orders of magnitude higher than Sanger sequencing and dramatically decreased cost. We are now at the dawn of 3rd generation with nanopore systems that are being developed for DNA sequencing. Meanwhile the field is also broadening into applications that complement 1st, 2nd and 3rd generation sequencing systems to get high resolution genetic information. The REvolutionary Approaches and Devices for Nucleic Acid analysis (READNA) consortium funded by the European Commission under FP7 has made great contributions to the development of new nucleic acid analysis methodology. PMID:23165096

McGinn, Steven; Gut, Ivo Glynne

2013-05-25

349

TESTING FOR OBJECT ORIENTED SOFTWARE  

Directory of Open Access Journals (Sweden)

Full Text Available This paper deals with design and development of an automated testing tool for Object Oriented Software. By an automated testing tool, we mean a tool that automates a part of the testing process. It can include one or more of the following processes: test strategy generation, test case generation, test case execution, test data generation, reporting and logging results. By object-oriented software we mean software designed using the OO approach and implemented using an OO language. Testing of OO software is different from testing software created using procedural languages. Several new challenges are posed. In the past most of the methods for testing OO software were just a simple extension of existing methods for conventional software. However, they have been shown to be not very appropriate. Hence, new techniques have been developed. This thesis work has mainly focused on testing design specifications for OO software. As described later, there is a lack of specification-based testing tools for OO software. An advantage of testing software specifications as compared to program code is that specifications are generally correct whereas code is flawed. Moreover, with software engineering principles firmly established in the industry, most of the software developed nowadays follows all the steps of the Software Development Life Cycle (SDLC). For this work, UML specifications created in Rational Rose are taken. UML has become the de-facto standard for analysis and design of OO software. Testing is conducted at 3 levels: Unit, Integration and System. At the system level there is no difference between the testing techniques used for OO software and other software created using a procedural language, and hence, conventional techniques can be used. This tool provides features for testing at the Unit (Class) level as well as the Integration level. Further, a maintenance-level component has also been incorporated. Results of applying this tool to sample Rational Rose files have been incorporated, and have been found to be satisfactory.
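One narrow slice of such automation, generating unit-level test case skeletons, can be illustrated directly. The Python sketch below is a simplified assumption (the tool described above works from UML models in Rational Rose, not from source-code reflection): it introspects a class and emits a unittest skeleton with one placeholder test per public method.

```python
import inspect
import unittest  # the emitted skeleton expects this module to be available

# Hedged sketch: generate a unittest skeleton for a class by reflection.
# The real tool described above derives tests from UML design specifications;
# here we only illustrate the "test case generation" step on plain Python code.
class BankAccount:
    def deposit(self, amount): ...
    def withdraw(self, amount): ...

def make_test_skeleton(cls):
    lines = [f"class Test{cls.__name__}(unittest.TestCase):"]
    for name, _ in inspect.getmembers(cls, predicate=inspect.isfunction):
        if not name.startswith("_"):
            lines.append(f"    def test_{name}(self):")
            lines.append(f"        self.fail('TODO: exercise {cls.__name__}.{name}')")
    return "\n".join(lines)

print(make_test_skeleton(BankAccount))
```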

Jitendra S. Kushwah

2011-02-01

350

Software Engineering for Tagging Software  

Directory of Open Access Journals (Sweden)

Full Text Available Tagging is integrated into a web application to ease maintenance of the large amount of information stored in a web application. With no mention of a requirement specification or design document for tagging software, academically or otherwise, integrating tagging software in a web application is a tedious task. In this paper, a framework has been created for integration of tagging software in a web application. The framework follows the software development life cycle paradigms and is to be used during its different stages. The requirement component of the framework presents a weighted requirement checklist that aids the user in deciding requirements for the tagging software in a web application, from among popular ones. The design component facilitates the developer in understanding the design of existing tagging software, modifying it or developing a new one. Also, the framework helps in verification and validation of tagging software integrated in a web application.

Karan Gupta

2013-07-01

351

New glycoproteomics software, GlycoPep Evaluator, generates decoy glycopeptides de novo and enables accurate false discovery rate analysis for small data sets.  

Science.gov (United States)

Glycoproteins are biologically significant large molecules that participate in numerous cellular activities. In order to obtain site-specific protein glycosylation information, intact glycopeptides, with the glycan attached to the peptide sequence, are characterized by tandem mass spectrometry (MS/MS) methods such as collision-induced dissociation (CID) and electron transfer dissociation (ETD). While several emerging automated tools are developed, no consensus is present in the field about the best way to determine the reliability of the tools and/or provide the false discovery rate (FDR). A common approach to calculate FDRs for glycopeptide analysis, adopted from the target-decoy strategy in proteomics, employs a decoy database that is created based on the target protein sequence database. Nonetheless, this approach is not optimal in measuring the confidence of N-linked glycopeptide matches, because the glycopeptide data set is considerably smaller compared to that of peptides, and the requirement of a consensus sequence for N-glycosylation further limits the number of possible decoy glycopeptides tested in a database search. To address the need to accurately determine FDRs for automated glycopeptide assignments, we developed GlycoPep Evaluator (GPE), a tool that helps to measure FDRs in identifying glycopeptides without using a decoy database. GPE generates decoy glycopeptides de novo for every target glycopeptide, in a 1:20 target-to-decoy ratio. The decoys, along with target glycopeptides, are scored against the ETD data, from which FDRs can be calculated accurately based on the number of decoy matches and the ratio of the number of targets to decoys, for small data sets. GPE is freely accessible for download and can work with any search engine that interprets ETD data of N-linked glycopeptides. The software is provided at https://desairegroup.ku.edu/research. PMID:25137014
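The decoy-based FDR arithmetic described here is straightforward to show. In the Python sketch below (an illustrative assumption about the bookkeeping, with invented scores), decoys outnumber targets 20 to 1, so each decoy match above the score threshold is counted as 1/20 of an expected false positive among the target matches.

```python
# Hedged sketch of decoy-based FDR estimation with an asymmetric target:decoy
# ratio (1:20, as described above). Scores and the threshold are made up.
def estimate_fdr(target_scores, decoy_scores, threshold, decoys_per_target=20):
    target_hits = sum(s >= threshold for s in target_scores)
    decoy_hits = sum(s >= threshold for s in decoy_scores)
    if target_hits == 0:
        return 0.0
    # Each decoy hit estimates 1/decoys_per_target of a false target hit.
    expected_false = decoy_hits / decoys_per_target
    return expected_false / target_hits

targets = [0.91, 0.87, 0.74, 0.66, 0.41]
decoys = [0.72, 0.55, 0.52, 0.49, 0.33, 0.31]        # 20x as many in practice
print(estimate_fdr(targets, decoys, threshold=0.6))  # 1 decoy hit -> FDR = 0.05/4
```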

Zhu, Zhikai; Su, Xiaomeng; Go, Eden P; Desaire, Heather

2014-09-16

352

Does the Application of Instructional Mathematics Software Have Enough Efficiency?  

Directory of Open Access Journals (Sweden)

Full Text Available Modern tools such as new educational systems can improve teaching-learning procedures in schools. Teaching mathematics is one of the main and most difficult components of educational systems, so keeping teachers and instructors informed about methods is essential. The usual teaching method should not be forgotten; using software or media is considered remedial teaching, and teachers always follow dynamic methods for teaching and learning. The aim of this study is to examine students' views on mathematics software and its efficiency. Twenty-two female 2nd-grade high-school students were chosen. Their views were collected through a standard questionnaire and a survey method, and the data were analyzed using the Kolmogorov-Smirnov and one-sample sign tests. The results indicated that students have positive views toward co-instructional software for mathematics learning. It therefore appears that mathematical software can offer advantages for teaching and learning in high schools.
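For readers who want to see the two tests in practice, a small Python sketch follows; the survey responses are invented and scipy is assumed to be available. It applies a Kolmogorov-Smirnov test against a fitted normal distribution and a one-sample sign test against the neutral midpoint of the scale, implemented as a binomial test on the signs.

```python
import numpy as np
from scipy import stats

# Hedged sketch: illustrative Likert-style responses (1-5), invented for the example.
responses = np.array([4, 5, 3, 4, 4, 5, 2, 4, 3, 5, 4, 4, 5, 3, 4, 4, 2, 5, 4, 4, 3, 5])

# Kolmogorov-Smirnov test against a normal distribution fitted to the sample.
ks_stat, ks_p = stats.kstest(responses, "norm",
                             args=(responses.mean(), responses.std(ddof=1)))

# One-sample sign test against the neutral value 3: count positive vs. negative
# signs and apply a two-sided binomial test (ties at 3 are discarded).
diffs = responses - 3
n_pos, n_neg = int((diffs > 0).sum()), int((diffs < 0).sum())
sign_p = stats.binomtest(n_pos, n_pos + n_neg, p=0.5, alternative="two-sided").pvalue

print(f"KS: D={ks_stat:.3f}, p={ks_p:.3f}; sign test: p={sign_p:.4f}")
```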

Zahra Kalantarnia

2013-12-01

353

Phase relationship in the TiO2-Nd2O3 pseudo-binary system

Energy Technology Data Exchange (ETDEWEB)

Highlights: • DSC and XRD measurements for the TiO2-Nd2O3 system. • Nd2Ti2O7, Nd2TiO5, Nd2Ti3O9 and Nd4Ti9O24 exist. • Nd2Ti4O11 and Nd4Ti9O24 were the same compound. • Thermodynamic calculation on the TiO2-Nd2O3 system. - Abstract: Phase equilibria in the TiO2-Nd2O3 system have been experimentally investigated via X-ray diffraction (XRD) and differential scanning calorimetry (DSC). Four compounds, Nd2Ti2O7, Nd2TiO5, Nd2Ti3O9 and Nd4Ti9O24, were confirmed to exist. The Nd2Ti4O11 reported in the literature was proved to be the same compound as Nd4Ti9O24, and the reported phase transformation of Nd2Ti4O11 from the α structure to the β structure at 1373 K was not detected. All the phase diagram data from both the literature and the present work were critically reviewed and taken into account during the thermodynamic optimization of the TiO2-Nd2O3 system. A set of consistent thermodynamic parameters, which can explain most of the experimental data for the TiO2-Nd2O3 system, was achieved. The calculated phase diagram of the TiO2-Nd2O3 system is provided.

Gong, Weiping, E-mail: weiping_gong@csu.edu.cn [State Key Laboratory of Powder Metallurgy, Central South University, Changsha 410083, Hunan (China); Laboratory of Electronic Functional Materials, Huizhou University, Huizhou 516001, Guangdong (China); Zhang, Rui [State Key Laboratory of Powder Metallurgy, Central South University, Changsha 410083, Hunan (China)

2013-01-25

354

What difference does a year of schooling make? Maturation of brain response and connectivity between 2nd and 3rd grades during arithmetic problem solving.  

Science.gov (United States)

Early elementary schooling in 2nd and 3rd grades (ages 7-9) is an important period for the acquisition and mastery of basic mathematical skills. Yet, we know very little about neurodevelopmental changes that might occur over a year of schooling. Here we examine behavioral and neurodevelopmental changes underlying arithmetic problem solving in a well-matched group of 2nd (n = 45) and 3rd (n = 45) grade children. Although 2nd and 3rd graders did not differ on IQ or grade- and age-normed measures of math, reading and working memory, 3rd graders had higher raw math scores (effect sizes = 1.46-1.49) and were more accurate than 2nd graders in an fMRI task involving verification of simple and complex two-operand addition problems (effect size = 0.43). In both 2nd and 3rd graders, arithmetic complexity was associated with increased responses in right inferior frontal sulcus and anterior insula, regions implicated in domain-general cognitive control, and in left intraparietal sulcus (IPS) and superior parietal lobule (SPL) regions important for numerical and arithmetic processing. Compared to 2nd graders, 3rd graders showed greater activity in dorsal stream parietal areas right SPL, IPS and angular gyrus (AG) as well as ventral visual stream areas bilateral lingual gyrus (LG), right lateral occipital cortex (LOC) and right parahippocampal gyrus (PHG). Significant differences were also observed in the prefrontal cortex (PFC), with 3rd graders showing greater activation in left dorsal lateral PFC (dlPFC) and greater deactivation in the ventral medial PFC (vmPFC). Third graders also showed greater functional connectivity between the left dlPFC and multiple posterior brain areas, with larger differences in dorsal stream parietal areas SPL and AG, compared to ventral stream visual areas LG, LOC and PHG. No such between-grade differences were observed in functional connectivity between the vmPFC and posterior brain regions. These results suggest that even the narrow one-year interval spanning grades 2 and 3 is characterized by significant arithmetic task-related changes in brain response and connectivity, and argue that pooling data across wide age ranges and grades can miss important neurodevelopmental changes. Our findings have important implications for understanding brain mechanisms mediating early maturation of mathematical skills and, more generally, for educational neuroscience. PMID:21620984

Rosenberg-Lee, Miriam; Barth, Maria; Menon, Vinod

2011-08-01

355

A verification of the high density after contrast enhancement in the 2nd week in cerebroischemic lesion  

International Nuclear Information System (INIS)

To determine the indication, it is necessary to clarify the relation among the Stage (time and course), the Strength, the Pathogenesis and the Effects of the operation in these diseases (the SSPE relation). In this report, we focused on the high density seen on CT after contrast enhancement in cases of ischemic lesions (this high density was named the ''Ribbon H. D.''). Seventeen cases of Ribbon H. D. in fresh infarctions were verified with respect to the time of appearance of the H. D., the features of its location and nature, and the histological findings. The results were as follows. The Ribbon H. D. appeared in the early stage of infarction and had its peak density at the end of the 2nd week after onset. The Ribbon H. D. was mostly located along the cortical line, showing a ribbon-like band. The Ribbon H. D. did not appear in the sharply demarcated coagulation necrosis of the early stage or in the defined Low Density (L. D.) of the late stage of infarction. Although the Ribbon H. D. indicates extravasation of contrast media, it does not necessarily indicate a hemorrhagic infarction. Part of the Ribbon H. D. changes to a well-defined L. D. and the rest becomes relatively isodense in the late stage. This change corresponds to the evolution of the incomplete necrosis, which is afterwards divided into resolution with a cystic cavity and glial replacement in the late stage. In conclusion, it is possible to understand that the Ribbon H. D. corresponds to a lesion of incomplete necrosis, with neovascularization, in the early stage of infarction. Therefore, in addition to the present indications for a by-pass operation (TIA, RIND), this incomplete necrosis (Ribbon H. D.), its surrounding area, and the period just before the appearance of the Ribbon H. D. might be further indications for the operation. (author)

356

Effect of Software Manager, Programmer and Customer over Software Quality  

Directory of Open Access Journals (Sweden)

Full Text Available Several factors might affect the quality of software products. In this study we focus on three significant parameters: the software manager, the programmer and the customer. Our study demonstrates that the quality of the product improves as the information generated by these three parameters increases. The parameters can be viewed as the vertices of a triangle whose centroid represents quality; in this view, if the triangle is equilateral, the optimum quality of the product is achieved. In other words, to generate high-quality software, the abilities of the software manager, programmer and customer must be comparable. Consequently, only a capable manager, or only an expert programmer, even together with a software-aware customer, cannot guarantee high quality of the software products.

Ghrehbaghi Farhad

2013-01-01

357

Standardisation of factor VIII--V. Calibration of the 2nd International Standard for Factor VIII and von Willebrand factor activities in plasma.  

Science.gov (United States)

The proposed 2nd International Standard for Factor VIII and von Willebrand Factor activities in plasma, NIBSC code 87/718, was assayed against the 1st IRP, 80/511, and against fresh normal plasma, in 21 laboratories. There were no significant differences between the various assay methods for factor VIII antigen, von Willebrand factor antigen, and von Willebrand factor ristocetin co-factor activity. For factor VIII clotting activity there was a significant difference between the results of one-stage and two-stage assays. Plasma 87/718 has now been established by the WHO Expert Committee on Biological Standardisation as the 2nd IS for factor VIII and vWF in plasma with the following potencies: VIII:C 0.60 IU/ampoule; VIII:Ag 0.91 IU/ampoule; vWF:Ag 0.91 IU/ampoule; vWF/RCo 0.84 IU/ampoule. PMID:1412160

Heath, A B; Barrowcliffe, T W

1992-08-01

358

Microstructure, mechanical properties and fracture behavior of peak-aged Mg-4Y-2Nd-1Gd alloys under different aging conditions

Energy Technology Data Exchange (ETDEWEB)

The morphology of precipitates and grain boundaries of peak-aged Mg-4Y-2Nd-1Gd alloys under different aging conditions was analyzed by transmission electron microscopy (TEM), and the mechanical properties and fracture behavior of the studied alloys at both room and elevated temperatures were investigated. The β″ and β′ phases are the main precipitates of the alloys peak-aged at 200 °C and 225 °C, while the alloy peak-aged at 250 °C mainly consists of β1 and β phases. Discussion of the relationship between the precipitates and the mechanical properties and fracture behavior reveals that the precipitates' density, type and arrangement are the dominant factors influencing the mechanical properties, and that the combined influence of grain boundary structure and precipitation hardening determines the fracture mechanism of Mg-4Y-2Nd-1Gd alloys.

Liu, Zhijie [National Engineering Research Center of Light Alloy Net Forming, School of Materials Science and Engineering, Shanghai Jiao Tong University, Shanghai 200240 (China); Wu, Guohua, E-mail: ghwu@sjtu.edu.cn [National Engineering Research Center of Light Alloy Net Forming, School of Materials Science and Engineering, Shanghai Jiao Tong University, Shanghai 200240 (China); Key State Laboratory of Metal Matrix Composite, School of Materials Science and Engineering, Shanghai Jiao Tong University, Shanghai 200240 (China); Liu, Wencai; Pang, Song [National Engineering Research Center of Light Alloy Net Forming, School of Materials Science and Engineering, Shanghai Jiao Tong University, Shanghai 200240 (China); Ding, Wenjiang [National Engineering Research Center of Light Alloy Net Forming, School of Materials Science and Engineering, Shanghai Jiao Tong University, Shanghai 200240 (China); Key State Laboratory of Metal Matrix Composite, School of Materials Science and Engineering, Shanghai Jiao Tong University, Shanghai 200240 (China)

2013-01-20

359

Effect of Mg substitution on photoluminescence of MgxCa1-xAl2O4: Eu2+, Nd3+  

International Nuclear Information System (INIS)

Rare earth ion-doped calcium aluminate (CaAl2O4) is an efficient blue phosphor. The compositions in the series MgxCa1-xAl2O4: Eu2+, Nd3+ (x=0.05-0.25) codoped with 1 mol% Eu and 3 mol% Nd were prepared by the solid state synthesis method. Crystalline phase, morphology and structural details were investigated by powder X-ray diffraction (XRD), scanning electron microscopy (SEM) and transmission electron microscopy (TEM) techniques. Effect of Mg substitution on structure and photoluminescence characteristics was investigated. Photoluminescence characteristics show the intense emission for MgCaAl2O4: Eu2+, Nd3+ in the blue region (λmax=440 nm) with long persistence. The blue emission corresponds to transitions from 4f6 5d1 to 4f7 of the Eu2+ ion.

360

Software standardization  

International Nuclear Information System (INIS)

For users of computerized nuclear medicine the software is the main problem. Both in-house and commercial software is used and there is generally a lack of standardization. First efforts towards standardization have been made by NORDFORSK (Nordic Cooperative Organization for Applied Research). These include proposals for standardizing physiologic parameters and procedures for measuring them. While theoretically easy, standardization of software is a difficult problem in practice. Excessive standardization may even impede research and development. The development of clinical program standards for defining protocols/trials with capabilities for user modification of macros would seem to be desirable. (Author)

 
 
 
 
361

Software Smarts  

Science.gov (United States)

Under an SBIR (Small Business Innovative Research) contract with Johnson Space Center, Knowledge Based Systems Inc. (KBSI) developed an intelligent software environment for modeling and analyzing mission planning activities, simulating behavior, and, using a unique constraint propagation mechanism, updating plans with each change in mission planning activities. KBSI developed this technology into a commercial product, PROJECTLINK, a two-way bridge between PROSIm, KBSI's process modeling and simulation software and leading project management software like Microsoft Project and Primavera's SureTrak Project Manager.

1998-01-01

362

Investigation of explosive decomposition of volatile alkyl OMC of elements of the 2nd-6th groups of the Periodic system in the gas phase  

International Nuclear Information System (INIS)

Explosive decomposition of a number of alkyl compounds of elements of the 2nd-6th groups of the Periodic system (Cd, In, Te) in the gas phase has been investigated. On the basis of experimental results and literature data, a correlation has been established in the ability of the lowest alkyl organometallic compounds (OMC) to decompose explosively, depending upon the position of the metal in the Periodic system

363

Fe69B20.2Nd4.2Nb3.3Y2.5Zr0.8 magnets produced by injection casting  

International Nuclear Information System (INIS)

Fe69B20.2Nd4.2Nb3.3Y2.5Zr0.8 hard magnets in rods were produced by the injection casting method. Magnetic properties, phase evolution and microstructure of the Fe69B20.2Nd4.2Nb3.3Y2.5Zr0.8 magnets have been investigated and are presented for the as-cast and annealed states. The magnets possess soft magnetic characteristics in the as-cast state and hard magnetic characteristics after annealing. The Nb refines the grain sizes of the α-Fe, Fe3B and Nd2Fe14B magnetic phases, while Y and Zr improve the glass forming ability. The good magnetic properties of the Fe69B20.2Nd4.2Nb3.3Y2.5Zr0.8 magnets arise from a microstructure composed of magnetically exchange-coupled nanocrystalline α-Fe, Fe3B and Nd2Fe14B phases. Optimal annealed magnets of 2 mm diameter and 30 mm length demonstrate hard magnetic properties, i.e. intrinsic coercivity jHc of 496 kA/m, remanence Br of 0.76 T and maximum energy product (BH)max of 72.0 kJ/m3. - Highlights: • Fe69B20.2Nd4.2Nb3.3Y2.5Zr0.8 magnets are produced by injection casting. • Magnetic properties depend on the intrinsic properties of the phases. • Microstructure is composed of α-Fe, Fe3B and Nd2Fe14B nano-grains. • Magnets demonstrate jHc of 496 kA/m, Br of 0.76 T and (BH)max of 72 kJ/m3

364

An overview of existing RCM procedures, software and databases used in various industrial segments. A brief description of RCMCost software. A brief description of RCM Workstation 2.5 software  

International Nuclear Information System (INIS)

The report is structured as follows: (1) Brief history (1st generation, 2nd generation, 3rd generation); (2) Application of RCM in various industrial segments (Aircraft industry, nuclear industry, chemical industry, petroleum and gas processing and transport, services and other industrial segments); (3) RCM standards; (4) RCM tools; (5) Databases usable for RCM; and (6) A brief description of selected codes for RCM analysis (RCMCost v3.0, RCM Workstation 2.5). (P.A.)

365

2nd State of the Onion: Larry Wall's Keynote Address at the Second Annual O'Reilly Perl Conference  

Science.gov (United States)

This page, part of publisher O'Reilly & Associates' Website devoted to the Perl language, contains a transcript of Larry Wall's keynote address at the second annual O'Reilly Perl Conference, which was held August 17-20, 1998, in San Jose, California. In his keynote address, Larry Wall, the original author of the Perl programming language, provides a thought-provoking (and entertaining) mix of philosophy and technology. Wall's talk touches on the future of the Perl language, the relationship of the free software community to commercial software developers, chaos, complexity, and human symbology. The page also includes copies of graphics used during the keynote.

366

Software Reviews.  

Science.gov (United States)

Presents reviews of six software packages. Includes (1) "Plain Vanilla Statistics"; (2) "MathCAD 2.0"; (3) "GrFx"; (4) "Trigonometry"; (5) "Algebra II"; (6) "Algebra Drill and Practice I, II, and III." (PK)

Mathematics and Computer Education, 1988

1988-01-01

367

Software management issues  

International Nuclear Information System (INIS)

The difficulty of managing the software in large HEP collaborations appears to be becoming progressively worse with each new generation of detector. If one were to extrapolate to the SSC, it would become a major problem. This paper explores the possible causes of the difficulty and makes suggestions on what corrective actions should be taken

368

Thermochemistry of glasses along the 2NdAlO3-3SiO2 join  

International Nuclear Information System (INIS)

Five Nd-aluminosilicate glasses along the 2NdAlO3-3SiO2 join were synthesized using conventional drop-quench techniques. A sixth glass, with the end-member NdAlO3 composition, required synthesis by containerless liquid-phase processing methods to avoid crystallization. Enthalpies of drop solution (ΔHds) and formation (ΔHf) for the Nd-aluminosilicate glasses and the NdAlO3-composition end-member glass were measured in molten 2PbO-B2O3 at 1078 K in a twin Calvet type calorimeter. Values for ΔHds for the Nd-aluminosilicate glasses increase with decreasing silica content from 130.7 ± 1.5 to 149.6 ± 0.6 kJ mol-1. Similarly, values of ΔHf increase with decreasing silica content from 41.0 ± 2.0 to 59.0 ± 1.6 kJ mol-1. Values of ΔHds and ΔHf for NdAlO3-composition glass were measured as 99.3 ± 0.9 and 139.2 ± 2.1 kJ mol-1, respectively. Using transposed temperature drop calorimetry, the enthalpy of vitrification for NdAlO3-composition glass was measured as 69.5 ± 0.9 kJ mol-1 relative to the stable crystalline neodymium aluminium perovskite (NdAlO3) phase. Enthalpies of mixing were calculated based on amorphous end members; the strongly negative values support the absence of immiscibility in this system. Differential scanning calorimetry was used to determine glass transition (Tg) and crystallization (Tx) temperatures, as well as values for the configurational heat capacity (ΔCP(Tg)) and the temperature range of the supercooled liquid interval (ΔT(SCL)). The NdAlO3-composition glass showed no evidence of a glass transition prior to crystallization; only a single exotherm was observed, the onset of which occurred at 1045 K. For the Nd-aluminosilicates, values of Tg and ΔT(SCL) increase with increasing silica content, from 1128 to 1139 K and from ~95 to ~175 K, respectively. Values of ΔCP(Tg) increase with decreasing silica content, from ~27 to ~75 J/(g fw K), reflecting the increasing fragility and decreasing stability of the liquids as the end member composition, NdAlO3, is approached

369

Software reliability report  

Science.gov (United States)

There are many software reliability models which try to predict future performance of software based on data generated by the debugging process. Unfortunately, the models appear to be unable to account for the random nature of the data. If the same code is debugged multiple times and one of the models is used to make predictions, intolerable variance is observed in the resulting reliability predictions. It is believed that data replication can remove this variance in lab type situations and that it is less than scientific to talk about validating a software reliability model without considering replication. It is also believed that data replication may prove to be cost effective in the real world, thus the research centered on verification of the need for replication and on methodologies for generating replicated data in a cost effective manner. The context of the debugging graph was pursued by simulation and experimentation. Simulation was done for the Basic model and the Log-Poisson model. Reasonable values of the parameters were assigned and used to generate simulated data which is then processed by the models in order to determine limitations on their accuracy. These experiments exploit the existing software and program specimens which are in AIR-LAB to measure the performance of reliability models.
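To make the idea of replicated failure data concrete, the Python sketch below simulates replicated failure-time data sets from two classic models, an exponential (basic, Goel-Okumoto style) NHPP and the Musa-Okumoto logarithmic Poisson model, by inverting their mean value functions; all parameter values are invented and this is not the AIRLAB tooling mentioned above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Mean value functions m(t) and their inverses for two classic models.
# Basic (exponential NHPP): m(t) = a * (1 - exp(-b*t)); finite expected total failures a.
# Logarithmic Poisson (Musa-Okumoto): m(t) = (1/theta) * ln(1 + lam0*theta*t).
def simulate_nhpp(inv_mean, m_max, rng):
    """Failure times of one replication: unit-rate Poisson arrivals on the
    m-scale, mapped back to the t-scale through the inverse mean value function."""
    times, m = [], 0.0
    while True:
        m += rng.exponential(1.0)          # next arrival on the cumulative scale
        if m >= m_max:
            return np.array(times)
        times.append(inv_mean(m))

a, b = 100.0, 0.02                          # assumed basic-model parameters
lam0, theta = 5.0, 0.05                     # assumed log-Poisson parameters
basic_inv = lambda m: -np.log(1.0 - m / a) / b
logp_inv  = lambda m: (np.exp(theta * m) - 1.0) / (lam0 * theta)

# Replicated data sets, as argued for in the abstract above.
basic_reps = [simulate_nhpp(basic_inv, m_max=a, rng=rng) for _ in range(10)]
logp_reps  = [simulate_nhpp(logp_inv, m_max=80.0, rng=rng) for _ in range(10)]
print(len(basic_reps[0]), basic_reps[0][:3])
```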

Wilson, Larry

1991-01-01

370

Silverlight 4 Business Intelligence Software  

CERN Document Server

Business Intelligence (BI) software allows you to view different components of a business using a single visual platform, which makes comprehending mountains of data easier. BI is everywhere. Applications that include reports, analytics, statistics, and historical and predictive modeling are all examples of BI. Currently, we are in the second generation of BI software - called BI 2.0 - which is focused on writing BI software that is predictive, adaptive, simple, and interactive. As computers and software have evolved, more data can be presented to end users with increasingly visually rich tech

Czernicki, Bart

2010-01-01

371

Software survey  

Energy Technology Data Exchange (ETDEWEB)

This article presented a guide to new software applications designed to facilitate exploration, drilling and production activities. Oil and gas producers can use the products for a range of functions, including reservoir characterization and accounting. In addition to a description of the software application, this article listed the name of software providers and the new features available in products. The featured software of Calgary-based providers included: PetroLOOK by Alcaro Softworks Inc.; ProphetFM and MasterDRIL by Advanced Measurements Inc.; the EDGE screening tool by Canadian Discovery Ltd.; Emission Manager and Regulatory Document Manager by Envirosoft Corporation; ResSurveil, ResBalance and ResAssist by Epic Consulting Services Ltd.; FAST WellTest and FAST RTA by Fekete Associates Inc.; OMNI 3D and VISTA 2D/3D by Gedco; VisualVoxAT, SBED and SBEDStudio by Geomodeling Technology Corporation; geoSCOUT, petroCUBE and gDC by GeoLOGIC Systems Ltd.; IHS Enerdeq Desktop and PETRA by IHS; DataVera by Intervera Data Solutions; FORGAS, PIPEFLO and WELLFLO by Neotechnology Consultants Ltd.; E and P Workflow Solutions by Neuralog Inc.; Oil and Gas Solutions by the RiskAdvisory division of SAS; Petrel, GeoFrame, ECLIPSE, OFM, Osprey Risk and Avocet modeler, PIPESIM and Merak by Schlumberger Information Solutions; esi.manage and esi.executive by 3esi; and dbAFE and PROSPECTOR by Winfund Corporation. Tower Management and Maintenance System, OverSite and Safety Orientation Management System software by Edmonton-based 3C Information Solutions Inc. were also highlighted along with PowerSHAPE, PowerMILL and FeatureCAM software by Windsor, Ontario-based Delcam. Software products by Texas-based companies featured in this article included the HTRI Xchanger Suite by Heat Transfer Research Inc.; Drillworks by Knowledge Systems; and GeoProbe, PowerView, GeoGraphix, AssetPlanner, Nexus software, Decision Management System, AssetSolver, and OpenWorks by Landmark; and eVIN, Rig-Hand, and OVS by Merrick Systems Inc.

Anon.

2007-07-15

372

Software development for the VLA-GDSCC telemetry array project  

Science.gov (United States)

Software for the VLA-GDSCC Telemetry Array (VGTA) Project is being developed in a new manner. Within the Radio Frequency and Microwave Subsystems Section, most microprocessor software has been developed using Intel hardware and software development systems. The VGTA software, however, is being developed using IBM PCs running consumer-oriented software. Utility software and procedures have been generated which allow the software developed on the IBM PCs to be transferred and run on a multibus 8086 computer.

Cooper, H. W.; Hileman, L. R.

1986-11-01

373

MUSE instrument software  

Science.gov (United States)

MUSE Instrumentation Software is the software devoted to the control of the Multi-Unit Spectroscopic Explorer (MUSE), a second-generation VLT panoramic integral-field spectrograph installed at Paranal in January 2014. It includes an advanced and user-friendly GUI to display the raw data of the 24 detectors, as well as the on-line reconstructed images of the field of view, allowing users to assess the quality of the data in quasi-real time. Furthermore, it implements the slow guiding system used to remove the effects of possible differential drifts between the telescope guide probe and the instrument and to reach high image stability. We present the software design and describe the developed tools that efficiently support astronomers while operating this complex instrument at the telescope.

Zins, Gérard; Pécontal, Arlette; Larrieu, Marie; Girard, Nathalie; Jarno, Aurélien; Cumani, Claudio; Baksai, Pedro; Comin, Mauro; Kiekebusch, Mario; Knudstrup, Jens; Popovic, Dan; Bacon, Roland; Richard, Johan; Stuik, Remko; Vernet, Joel

2014-07-01

374

Software Reviews.  

Science.gov (United States)

Reviews of seven software packages are presented including "The Environment I: Habitats and EcoSystems; II Cycles and Interactions"; "Super Sign Maker"; "The Great Knowledge Race: Substance Abuse"; "Exploring Science: Temperature"; "Fast Food Calculator and RD Aide"; "The Human Body: Circulation and Respiration" and "Forces in Liquids and Gases."…

Science and Children, 1989

1989-01-01

375

Software Directory.  

Science.gov (United States)

A 42-item software directory describes courseware in a variety of skill areas, including alphabetizing, communicating, daily living skills, grammar, mathematics, reading, speech, and vocabulary. An audiovisual listing describes videotapes, motion pictures, and filmstrips that focus primarily on disability awareness and various aspects of learning…

Journal of Reading, Writing, and Learning Disabilities International, 1985

1985-01-01

376

The current status of research into Attention Deficit Hyperactivity Disorder: Proceedings of the 2nd International Congress on ADHD: From Childhood to Adult Disease.  

Science.gov (United States)

Despite being a devastating psychiatric condition with high prevalence, ADHD has traditionally been widely under-researched, specifically in adult patients. Therefore, the recent surge in scientific projects focusing on ADHD is impressive. By reviewing selected research findings presented at the 2nd International Congress on ADHD, this paper gives an overview about current state-of-the art research in such different areas as diagnosis, classification, epidemiology, differential diagnosis and comorbidity, neurobiology (including molecular genetics, proteomics, neuroimaging and electrophysiology), environmental factors, modelling of ADHD, treatment (pharmacological and non-pharmacological), as well as forensic and social aspects. PMID:21432581

Thome, Johannes; Reddy, Duvvoor Prathap

2009-12-01

377

Synthesis, crystal structure and properties of superconducting and non-superconducting RuSr2(Nd,Ce)2Cu2O10 phases  

International Nuclear Information System (INIS)

In this paper we report on the structural, electrical and magnetic properties from a systematic study of the synthesis of RuSr2(Nd,Ce)2Cu2O10-δ (Ru-1222) samples that differ in the number of annealings or the sintering temperature in oxygen. This is done in order to highlight the different behaviours of the properties as a consequence of the treatments. In particular, we deal with the development or loss of superconductivity after oxygen treatment. Further, we discuss the influence of phase composition on superconductivity and magnetic ordering

378

Proceedings of the 2nd NUCEF international symposium NUCEF'98. Safety research and development of base technology on nuclear fuel cycle  

Energy Technology Data Exchange (ETDEWEB)

This volume contains 68 papers presented at the 2nd NUCEF International Symposium NUCEF'98, held on 16-17 November 1998 in Hitachinaka, Japan, following the 1st symposium NUCEF'95 (proceedings: JAERI-Conf 96-003). The theme of this symposium was 'Safety Research and Development of Base Technology on Nuclear Fuel Cycle'. The papers were presented in oral and poster sessions in the following research fields: (1) Criticality Safety, (2) Reprocessing and Partitioning, (3) Radioactive Waste Management. The 68 papers are indexed individually. (J.P.N.)

NONE

1999-03-01

379

2nd IAEA research coordination meeting on collection and evaluation of reference data for thermo-mechanical properties of fusion reactor plasma facing materials. Summary report  

International Nuclear Information System (INIS)

The proceedings and results of the 2nd IAEA Research Coordination Meeting on 'Collection and Evaluation of Reference Data for Thermo-mechanical Properties of Fusion Reactor Plasma Facing Materials', held on March 25-27, 1996 at the IAEA Headquarters in Vienna, are briefly described. This report includes a summary of the presentations made by the meeting participants, the results of discussions among the participants regarding the status of the data, the publication of a multi-author review paper, and recommendations regarding future work. (author). 1 tab

380

ISE-SPL: uma abordagem baseada em linha de produtos de software aplicada à geração automática de sistemas para educação médica na plataforma E-learning / ISE-SPL: a software product line approach applied to automatic generation of systems for medical education in E-learning platform  

Scientific Electronic Library Online (English)

Full Text Available SciELO Brazil | Language: Portuguese (English abstract follows). INTRODUCTION: E-learning, which refers to the use of Internet-related technologies to improve knowledge and learning, has emerged as a complementary form of education, bringing advantages such as increased accessibility to information, personalized learning, democratization of education, and ease of update, distribution and standardization of the content. In this sense, this paper aims to present a tool, named ISE-SPL, whose purpose is the automatic generation of E-learning systems for medical education, making use of ISE systems (Interactive Spaced-Education) and concepts of Software Product Lines. METHODS: The tool consists of an innovative methodology for medical education that aims to assist professors of healthcare in their teaching through the use of educational technologies, all based on computing applied to healthcare (Informatics in Health). RESULTS: The tests performed to validate ISE-SPL were divided into two stages: the first was made by using a software analysis tool similar to ISE-SPL, called S.P.L.O.T, and the second was performed through usability questionnaires applied to healthcare professors who used ISE-SPL. CONCLUSION: Both tests showed positive results, allowing us to conclude that ISE-SPL is an efficient tool for the generation of E-learning software and useful for teachers in healthcare (a minimal product-line sketch follows this record).

Túlio de Paiva Marques, Carvalho; Bruno Gomes de, Araújo; Ricardo Alexsandro de Medeiros, Valentim; Jose, Diniz Junior; Francis Solange Vieira, Tourinho; Rosiane Viana Zuza, Diniz.

2013-12-01
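
A software product line, as used by ISE-SPL above, derives many related products from one reusable asset base by selecting features from a constrained feature model. The following sketch is a toy illustration under assumed names (FeatureModel, generate_product) and invented features; it is not the ISE-SPL code, which generates complete e-learning systems rather than a configuration dictionary.

# Hypothetical sketch of software-product-line style generation; not ISE-SPL itself.
from dataclasses import dataclass

@dataclass
class FeatureModel:
    """Toy feature model: mandatory and optional features plus exclusion rules."""
    mandatory: frozenset = frozenset({"course_content", "spaced_repetition"})
    optional: frozenset = frozenset({"quizzes", "progress_reports",
                                     "discussion_forum", "offline_mode"})
    excludes: tuple = (("discussion_forum", "offline_mode"),)

    def validate(self, selected):
        """Raise ValueError if the selection violates the feature model."""
        unknown = selected - self.mandatory - self.optional
        if unknown:
            raise ValueError(f"unknown features: {unknown}")
        missing = self.mandatory - selected
        if missing:
            raise ValueError(f"missing mandatory features: {missing}")
        for a, b in self.excludes:
            if a in selected and b in selected:
                raise ValueError(f"features {a!r} and {b!r} exclude each other")

def generate_product(model, selected):
    """Return the configuration of one product of the line from a feature selection."""
    model.validate(selected)
    # A real generator would pull in templates and modules per feature; here we
    # only emit a configuration dictionary for the generated e-learning system.
    return {"modules": sorted(selected), "platform": "e-learning"}

if __name__ == "__main__":
    model = FeatureModel()
    config = generate_product(model, {"course_content", "spaced_repetition", "quizzes"})
    print(config)

Feature-model analysers such as S.P.L.O.T, mentioned in the abstract, operate on exactly this kind of model (counting valid configurations, detecting conflicting or dead features) before any generator is built.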

 
 