WorldWideScience

Sample records for 2nd generation software

  1. STARS 2.0: 2nd-generation open-source archiving and query software

    Winegar, Tom

    2008-07-01

    The Subaru Telescope is in the process of developing an open-source alternative to the 1st-generation software and databases (STARS 1) used for archiving and query. For STARS 2, we have chosen PHP and Python for scripting and MySQL as the database software. We have collected feedback from staff and observers and used it to significantly improve the design and functionality of our future archiving and query software. Archiving - We identified two weaknesses in the 1st-generation STARS archiving software: a complex and inflexible table structure, and system administration that was uncoordinated with our business model of taking pictures at the summit and archiving them in both Hawaii and Japan. We adopted a simplified and normalized table structure with passive keyword collection, and we are designing an archive-to-archive file transfer system that automatically reports real-time status and error conditions and permits error recovery. Query - We identified several weaknesses in the 1st-generation STARS query software: inflexible query tools, poor sharing of calibration data, and no automatic file transfer mechanisms for observers. We are developing improved query tools, better sharing of calibration data, and multi-protocol unassisted file transfer mechanisms for observers. In the process, we have redefined a 'query': from an invisible search result that can be transferred only once, in-house and immediately, with little status and error reporting and no error recovery - to a stored search result that can be monitored, transferred to different locations with multiple protocols, reporting status and error conditions and permitting recovery from errors.
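    The redefined 'query' above - a persistent, monitorable search result - can be illustrated with a minimal sketch. The schema below is hypothetical (table and column names are invented for illustration, and STARS 2 uses MySQL, whereas SQLite is used here only for self-containment):

```python
import sqlite3

# Hypothetical sketch of a "stored query" table: a search result becomes a
# persistent row whose transfer status can be monitored and retried.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE stored_query (
        query_id   INTEGER PRIMARY KEY,
        criteria   TEXT NOT NULL,         -- serialized search terms
        protocol   TEXT DEFAULT 'http',   -- http, ftp, scp, ...
        status     TEXT DEFAULT 'queued', -- queued/transferring/done/error
        last_error TEXT
    )
""")
conn.execute(
    "INSERT INTO stored_query (criteria, protocol) VALUES (?, ?)",
    ("object='M31' AND date>'2008-01-01'", "ftp"),
)
# An error is recorded rather than lost, so the transfer can be retried later.
conn.execute(
    "UPDATE stored_query SET status='error', last_error='timeout' WHERE query_id=1"
)
status, err = conn.execute(
    "SELECT status, last_error FROM stored_query WHERE query_id=1"
).fetchone()
print(status, err)  # error timeout
```

    The point of the design is that status and error conditions live in the database rather than in a transient session, which is what enables monitoring and recovery.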

  2. 2nd Generation Alkaline Electrolysis

    Yde, Lars; Kjartansdóttir, Cecilia Kristin; Allebrod, Frank; Mogensen, Mogens Bjerg; Møller, Per; Hilbert, Lisbeth R.; Nielsen, Peter Tommy; Mathiesen, Troels; Jensen, Jørgen; Andersen, Lars; Dierking, Alexander

    This report provides the results of the 2nd Generation Alkaline Electrolysis project which was initiated in 2008. The project has been conducted from 2009-2012 by a consortium comprising Århus University Business and Social Science – Centre for Energy Technologies (CET (former HIRC)), Technical...

  3. 2nd generation biogas. BioSNG

    The substitution of natural gas by a renewable equivalent is an interesting option for reducing the use of fossil fuels and the accompanying greenhouse gas emissions, as well as from the point of view of security of supply. The renewable alternative to natural gas is green natural gas, i.e. gaseous energy carriers produced from biomass, comprising both biogas and Synthetic Natural Gas (SNG). This route benefits from all the advantages of natural gas, such as the existing dense infrastructure, the trade and supply network, and natural gas applications. This presentation addresses the differences between first generation biogas and second generation bioSNG; the market for bioSNG (grid injection vs. transportation fuel); the latest update on the lab- and pilot-scale bioSNG development at ECN; and an overview of ongoing bioSNG activities worldwide

  4. 2nd Generation alkaline electrolysis. Final report

    Yde, L. [Aarhus Univ. Business and Social Science - Centre for Energy Technologies (CET), Aarhus (Denmark)]; Kjartansdottir, C.K. [Technical Univ. of Denmark. DTU Mechanical Engineering, Kgs. Lyngby (Denmark)]; Allebrod, F. [Technical Univ. of Denmark. DTU Energy Conversion, DTU Risoe Campus, Roskilde (Denmark)] [and others]

    2013-03-15

    The overall purpose of this project has been to contribute to this load management by developing a 2nd generation alkaline electrolysis system characterized by being compact, reliable, inexpensive and energy efficient. The specific targets for the project have been to: 1) Increase cell efficiency to more than 88% (according to the higher heating value (HHV)) at a current density of 200 mA/cm²; 2) Increase operation temperature to more than 100 °C to make the cooling energy more valuable; 3) Obtain an operation pressure of more than 30 bar, thereby minimizing the need for further compression of hydrogen for storage; 4) Improve stack architecture, decreasing the price of the stack by at least 50%; 5) Develop a modular design making it easy to customize plants in the size range from 20 to 200 kW; 6) Demonstrate a 20 kW 2nd generation stack in H2College at the campus of Aarhus University in Herning. The project has included research and development on three different electrode technology tracks: electrochemical plating, atmospheric plasma spray (APS), and a high temperature and pressure (HTP) track with operating temperature around 250 °C and pressure around 40 bar. The results show that all three electrode tracks have reached high energy efficiencies. In the electrochemical plating track, a stack efficiency of 86.5% at a current density of 177 mA/cm² and a temperature of 74.4 °C has been shown. The APS track showed cell efficiencies of 97%; however, coatings for the anode side still need to be developed. The HTP cell has reached 100% electric efficiency operating at 1.5 V (the thermoneutral voltage) with a current density of 1.1 A/cm². This track only tested small cells in an externally heated laboratory set-up, and thus the thermal loss to the surroundings cannot be given. The goal set for the 2nd generation electrolyser system has been to generate 30 bar pressure in the cell stack. An obstacle to be
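    As a rough check on the efficiency figures quoted above: cell efficiency on an HHV basis is commonly computed as the thermoneutral voltage divided by the actual cell voltage. The sketch below uses the standard-condition value of about 1.48 V (the report operates at other conditions and works with 1.5 V), so the numbers are illustrative rather than the project's own calculation:

```python
# HHV-basis electrolysis cell efficiency: thermoneutral voltage divided by
# actual cell voltage. 1.48 V is the standard-condition HHV thermoneutral
# voltage for water electrolysis; operating conditions shift it slightly.
V_TN = 1.48  # V, thermoneutral voltage (HHV, ~25 C)

def hhv_efficiency(v_cell: float) -> float:
    """Cell efficiency on an HHV basis for a given cell voltage (V)."""
    return V_TN / v_cell

# A cell running at about 1.71 V corresponds to the reported 86.5%.
print(round(hhv_efficiency(1.71), 3))  # 0.865
```

    This also explains why operating exactly at the thermoneutral voltage reads as 100% electric efficiency: all electrical input goes into the reaction enthalpy.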

  5. Performance and validation of COMPUCEA 2nd generation for uranium measurements in physical inventory verifications

    In order to somewhat alleviate the kind of logistical problems encountered in the in-field measurements with the current COMPUCEA equipment (COMbined Product for Uranium Content and Enrichment Assay), and with the expected benefits of saving some time and costs for the missions in mind, ITU is presently developing a 2nd generation of the COMPUCEA device. This new development also forms a task in the support programme of the Joint Research Centre of the European Commission to the IAEA. To validate the in-field performance of the newly developed 2nd generation COMPUCEA, a prototype has been tested together with the 1st generation equipment during physical inventory verification (PIV) measurements in different uranium fuel fabrication plants in Europe. In this paper we will present the prototype of COMPUCEA 2nd generation, its hardware as well as the software developed for the evaluation of the U content and 235U enrichment. We will show a comparison of the performance of the 2nd generation with the 1st generation on a larger number of uranium samples measured during the in-field PIVs. The observed excellent performance of the new COMPUCEA represents an important step in the validation of this new instrument. (author)

  6. The 2nd Generation Real Time Mission Monitor (RTMM) Development

    Blakeslee, R. J.; Goodman, M.; Hardin, D. M.; Hall, J.; Yubin He, M.; Regner, K.; Conover, H.; Smith, T.; Meyer, P.; Lu, J.; Garrett, M.

    2009-12-01

    The NASA Real Time Mission Monitor (RTMM) is a visualization and information system that fuses multiple Earth science data sources to enable real-time decision-making for airborne and ground validation experiments. Developed at the National Aeronautics and Space Administration (NASA) Marshall Space Flight Center, RTMM is a situational awareness, decision-support system that integrates satellite imagery and orbit data, radar and other surface observations (e.g., lightning location network data), airborne navigation and instrument data sets, model output parameters, and other applicable Earth science data sets. The integration and delivery of this information is made possible using data acquisition systems, network communication links, network server resources, and visualizations through the Google Earth virtual globe application. In order to improve the usefulness and efficiency of the RTMM system, capabilities are being developed to allow the end-user to easily configure RTMM applications based on their mission-specific requirements and objectives. This second-generation RTMM is being redesigned to take advantage of the Google plug-in capabilities to run multiple applications in a web browser rather than the original single-application Google Earth approach. Currently RTMM employs a limited Service Oriented Architecture approach to enable discovery of mission-specific resources. We are expanding the RTMM architecture such that it will more effectively utilize the Open Geospatial Consortium Sensor Web Enablement services and other new technology software tools and components. These modifications and extensions will result in a robust, versatile RTMM system that will greatly increase the flexibility of the user to choose which science data sets and support applications to view and/or use. The improvements brought about by the RTMM 2nd generation system will provide mission planners and airborne scientists with enhanced decision-making tools and capabilities to more

  7. A ZeroDimensional Model of a 2nd Generation Planar SOFC Using Calibrated Parameters

    Brian Elmegaard; Niels Houbak; Thomas Frank Petersen

    2006-01-01

    This paper presents a zero-dimensional mathematical model of a planar 2nd generation coflow SOFC developed for simulation of power systems. The model accounts for the electrochemical oxidation of hydrogen as well as the methane reforming reaction and the water-gas shift reaction. An important part of the paper is the electrochemical sub-model, where experimental data was used to calibrate specific parameters. The SOFC model was implemented in the DNA simulation software which is designed for ...

  8. The 2nd Generation VLTI path to performance

    Woillez, Julien; Berger, Jean-Philippe; Bonnet, Henri; de Wit, Willem-Jan; Egner, Sebastian; Eisenhauer, Frank; Gonté, Frédéric; Guieu, Sylvain; Haguenauer, Pierre; Mérand, Antoine; Pettazzi, Lorenzo; Poupar, Sébastien; Schöller, Markus; Schuhler, Nicolas

    2016-01-01

    The upgrade of the VLTI infrastructure for the 2nd generation instruments is now complete with the transformation of the laboratory, and installation of star separators on both the 1.8-m Auxiliary Telescopes (ATs) and the 8-m Unit Telescopes (UTs). The Gravity fringe tracker has had a full semester of commissioning on the ATs, and a first look at the UTs. The CIAO infrared wavefront sensor is about to demonstrate its performance relative to the visible wavefront sensor MACAO. First astrometric measurements on the ATs and astrometric qualification of the UTs are on-going. Now is a good time to revisit the performance roadmap for VLTI that was initiated in 2014, which aimed at coherently driving the developments of the interferometer, and especially its performance, in support of the new generation of instruments: Gravity and MATISSE.

  9. Super Boiler 2nd Generation Technology for Watertube Boilers

    Mr. David Cygan; Dr. Joseph Rabovitser

    2012-03-31

    This report describes Phase I of a proposed two-phase project to develop and demonstrate an advanced industrial watertube boiler system capable of 94% (HHV) fuel-to-steam efficiency and emissions below 2 ppmv NOx, 2 ppmv CO, and 1 ppmv VOC on natural gas fuel. The boiler design would have the capability to produce >1500 °F, >1500 psig superheated steam, burn multiple fuels, and be 50% smaller/lighter than currently available watertube boilers of similar capacity. This project builds upon the successful Super Boiler project at GTI, which employed a unique two-stage intercooled combustion system and an innovative heat recovery system to reduce NOx below 5 ppmv and demonstrated fuel-to-steam efficiency of 94% (HHV). The project was carried out under the leadership of GTI with project partners Cleaver-Brooks, Inc., Nebraska Boiler, a Division of Cleaver-Brooks, and Media and Process Technology Inc., and project advisors Georgia Institute of Technology, Alstom Power Inc., Pacific Northwest National Laboratory and Oak Ridge National Laboratory. Phase I efforts focused on developing 2nd generation boiler concepts and performance modeling; incorporating multi-fuel (natural gas and oil) capabilities; assessing heat recovery, heat transfer and steam superheating approaches; and developing the overall conceptual engineering boiler design. Based on our analysis, the 2nd generation Industrial Watertube Boiler, when developed and commercialized, could potentially save 265 trillion Btu and $1.6 billion in fuel costs across U.S. industry through increased efficiency. Its ultra-clean combustion could eliminate 57,000 tons of NOx, 460,000 tons of CO, and 8.8 million tons of CO2 emissions annually. Reduction in boiler size will bring cost-effective package boilers into a size range previously dominated by more expensive field-erected boilers, benefiting manufacturers and end users through lower capital costs.
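    The fuel-cost savings claimed above follow from the fact that, for a fixed steam output, fuel input scales inversely with fuel-to-steam efficiency. The 80% baseline in this sketch is an assumed figure for a conventional boiler, not a number from the report:

```python
# Fuel input for a fixed steam output scales as 1/efficiency, so the
# fractional fuel savings from an efficiency upgrade is 1 - eta_old/eta_new.
# The 80% baseline is an assumption for illustration only.
def fuel_savings(eta_old: float, eta_new: float) -> float:
    """Fractional fuel savings when fuel-to-steam efficiency rises."""
    return 1.0 - eta_old / eta_new

# Going from an assumed 80% baseline to the Super Boiler's 94% (HHV):
print(round(fuel_savings(0.80, 0.94), 3))  # 0.149
```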

  10. A ZeroDimensional Model of a 2nd Generation Planar SOFC Using Calibrated Parameters

    Brian Elmegaard

    2006-12-01

    This paper presents a zero-dimensional mathematical model of a planar 2nd generation coflow SOFC developed for simulation of power systems. The model accounts for the electrochemical oxidation of hydrogen as well as the methane reforming reaction and the water-gas shift reaction. An important part of the paper is the electrochemical sub-model, where experimental data was used to calibrate specific parameters. The SOFC model was implemented in the DNA simulation software which is designed for energy system simulation. The result is an accurate and flexible tool suitable for simulation of many different SOFC-based power systems.
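    An electrochemical sub-model of this kind typically starts from the Nernst (open-circuit) voltage for hydrogen oxidation. The sketch below is a generic illustration with invented parameter values (E0 and the operating point), not the calibrated parameters or DNA implementation of the paper:

```python
from math import log, sqrt

# Generic Nernst (open-circuit) voltage for H2 + 1/2 O2 -> H2O, the usual
# starting point of an SOFC electrochemical sub-model. E0 and the operating
# point below are illustrative values only.
R = 8.314    # J/(mol K), gas constant
F = 96485.0  # C/mol, Faraday constant
E0 = 1.0     # V, illustrative standard potential at SOFC temperature

def nernst_voltage(T, p_h2, p_o2, p_h2o):
    """Open-circuit voltage (V) at temperature T (K) and partial pressures (atm)."""
    return E0 + (R * T) / (2 * F) * log(p_h2 * sqrt(p_o2) / p_h2o)

# Fuel-rich inlet conditions at 750 C (1023 K):
print(round(nernst_voltage(1023.0, 0.97, 0.21, 0.03), 3))  # ~1.119
```

    In a full zero-dimensional model this voltage is then reduced by activation, ohmic and concentration losses, which is where the calibrated parameters enter.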

  11. Proceedings 2nd Workshop on Formal Methods in the Development of Software

    Andrés, César; Llana, Luis

    2012-01-01

    This volume contains the proceedings of the 2nd WorkShop on Formal Methods in the Development of Software (WS-FMDS 2012). The workshop was held in Paris, France on August 30th, 2012 as a satellite event to the 18th International Symposium on Formal Methods (FM-2012). The aim of WS-FMDS 2012 is to provide a forum for researchers who are interested in the application of formal methods to systems being developed with a software methodology. In particular, this workshop is intended to ...

  12. From 1st- to 2nd-Generation Biofuel Technologies: Extended Executive Summary

    2008-07-01

    This report looks at the technical challenges facing 2nd-generation biofuels, evaluates their costs and examines related current policies to support their development and deployment. The potential for production of more advanced biofuels is also discussed. Although significant progress continues to be made to overcome the technical and economic challenges, 2nd-generation biofuels still face major constraints to their commercial deployment.

  13. Intergenerational Transmission and the School-to-work Transition for 2nd Generation Immigrants

    Nielsen, Helena Skyt; Rosholm, Michael; Smith, Nina

    2001-01-01

    We analyse the extent of intergenerational transmission through parental capital, ethnic capital and neighbourhood effects on several aspects of the school-to-work transition of 2nd generation immigrants and young ethnic Danes. The main findings are that parental capital has strong positive effects on the probability of completing a qualifying education and on the entry into the labour market, but it has a much smaller impact on the duration of the first employment spell and on the wage level. Growing up in neighbourhoods with a high concentration of immigrants is associated with negative labour market prospects both for young natives and 2nd generation immigrants.

  14. Intergenerational Transmission and the School-to-work Transition for 2nd Generation Immigrants

    Nielsen, Helena Skyt; Rosholm, Michael; Smith, Nina

    2001-01-01

    We analyse the extent of intergenerational transmission through parental capital, ethnic capital and neighbourhood effects on several aspects of the school-to-work transition of 2nd generation immigrants and young ethnic Danes. The main findings are that parental capital has strong positive effects...

  15. The 1997 Protocol and the European Union (European Union and '2nd generation' responsibility conventions)

    The issue of accession of the Eastern European Member States to the 1997 Protocol is discussed with focus on the European Union's authority and enforcement powers. Following up the article published in the preceding issue of this journal, the present contribution analyses the relations of the '2nd generation' responsibility conventions to the law of the European Union. (orig.)

  16. Performance and validation of COMPUCEA 2nd generation for uranium measurements in physical inventory verification

    A new instrumental version of COMPUCEA has been developed with the aim to provide a simplified and more practical instrumentation for in-field use. The main design goals were to eliminate the radioactive sources and the liquid nitrogen-cooled Ge detectors used in the 1st generation of COMPUCEA. This paper describes the major technical features of the 2nd generation of equipment together with typical performance data. The performance tests carried out during first in-field measurements in the course of physical inventory verification campaigns represent an important step in the validation of this new instrument. (author)

  17. Systems Engineering Approach to Technology Integration for NASA's 2nd Generation Reusable Launch Vehicle

    Thomas, Dale; Smith, Charles; Thomas, Leann; Kittredge, Sheryl

    2002-01-01

    The overall goal of the 2nd Generation RLV Program is to substantially reduce technical and business risks associated with developing a new class of reusable launch vehicles. NASA's specific goals are to improve the safety of a 2nd-generation system by 2 orders of magnitude - equivalent to a crew risk of 1-in-10,000 missions - and decrease the cost tenfold, to approximately $1,000 per pound of payload launched. Architecture definition is being conducted in parallel with the maturation of key technologies specifically identified to improve safety and reliability, while reducing operational costs. An architecture broadly includes an Earth-to-orbit reusable launch vehicle, on-orbit transfer vehicles and upper stages, mission planning, ground and flight operations, and support infrastructure, both on the ground and in orbit. The systems engineering approach ensures that the technologies developed - such as lightweight structures, long-life rocket engines, reliable crew escape, and robust thermal protection systems - will synergistically integrate into the optimum vehicle. To best direct technology development decisions, analytical models are employed to accurately predict the benefits of each technology toward potential space transportation architectures as well as the risks associated with each technology. Rigorous systems analysis provides the foundation for assessing progress toward safety and cost goals. The systems engineering review process factors in comprehensive budget estimates, detailed project schedules, and business and performance plans, against the goals of safety, reliability, and cost, in addition to overall technical feasibility. This approach forms the basis for investment decisions in the 2nd Generation RLV Program's risk-reduction activities. Through this process, NASA will continually refine its specialized needs and identify where Defense and commercial requirements overlap those of civil missions.
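    The safety and cost goals reduce to simple arithmetic against the implied baselines; the baseline values in this sketch are inferred from the quoted improvement factors ("2 orders of magnitude", "tenfold"), not stated in the abstract:

```python
# The stated program goals as arithmetic. Baselines are inferred from the
# quoted improvement factors, not given in the abstract.
baseline_risk = 1 / 100   # crew risk implied by a 100x improvement to 1-in-10,000
target_risk = baseline_risk / 100
baseline_cost = 10_000    # $/lb implied by a 10x reduction to $1,000/lb
target_cost = baseline_cost / 10

print(target_risk, target_cost)
```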

  18. Boosting biogas yield of anaerobic digesters by utilizing concentrated molasses from 2nd generation bioethanol plant

    Shiplu Sarker, Henrik Bjarne Møller

    2013-01-01

    Concentrated molasses (C5 molasses) from a 2nd generation bioethanol plant has been investigated for enhancing the productivity of manure-based digesters. A batch study at mesophilic conditions (35 ± 1 °C) showed a maximum methane yield from molasses of 286 L CH4/kg VS, which was approximately 63% of the calculated theoretical yield. In addition to the batch study, co-digestion of molasses with cattle manure in a semi-continuously stirred reactor at thermophilic temperature (50 ± 1 °C) was also performed wi...

  19. Effects of Thermal Cycling on Control and Irradiated EPC 2nd Generation GaN FETs

    Patterson, Richard L.; Scheick, Leif; Lauenstein, Jean-Marie; Casey, Megan; Hammoud, Ahmad

    2013-01-01

    The power systems for use in NASA space missions must work reliably under harsh conditions including radiation, thermal cycling, and exposure to extreme temperatures. Gallium nitride semiconductors show great promise, but information pertaining to their performance is scarce. Gallium nitride N-channel enhancement-mode field effect transistors made by EPC Corporation in a 2nd generation of manufacturing were exposed to radiation followed by long-term thermal cycling in order to address their reliability for use in space missions. Results of the experimental work are presented and discussed.

  20. Advanced Electron Beam Ion Sources (EBIS) for 2-nd generation carbon radiotherapy facilities

    In this work we analyze how advanced Electron Beam Ion Sources (EBIS) can facilitate the progress of carbon therapy facilities. We will demonstrate that advanced ion sources enable operation of 2-nd generation ion beam therapy (IBT) accelerators. These new accelerator concepts with designs dedicated to IBT provide beams better suited for therapy and are more cost-efficient than contemporary IBT facilities. We will give a short overview of the existing new IBT concepts and focus on those where ion source technology is the limiting factor. We will analyse whether this limitation can be overcome in the near future thanks to ongoing EBIS development

  1. Advanced Electron Beam Ion Sources (EBIS) for 2-nd generation carbon radiotherapy facilities

    Shornikov, A.; Wenander, F.

    2016-04-01

    In this work we analyze how advanced Electron Beam Ion Sources (EBIS) can facilitate the progress of carbon therapy facilities. We will demonstrate that advanced ion sources enable operation of 2-nd generation ion beam therapy (IBT) accelerators. These new accelerator concepts with designs dedicated to IBT provide beams better suited for therapy and are more cost-efficient than contemporary IBT facilities. We will give a short overview of the existing new IBT concepts and focus on those where ion source technology is the limiting factor. We will analyse whether this limitation can be overcome in the near future thanks to ongoing EBIS development.

  2. Emotional and Behavioral Disorders in 1.5th Generation, 2nd Generation Immigrant Children, and Foreign Adoptees.

    Tan, Tony Xing

    2016-10-01

    Existing theories (e.g., acculturative stress theory) cannot adequately explain why mental disorders in immigrants are less prevalent than in non-immigrants. In this paper, the culture-gene co-evolutionary theory of mental disorders was utilized to generate a novel hypothesis that connection to heritage culture reduces the risk for mental disorders in immigrant children. Four groups of children aged 2-17 years were identified from the 2007 United States National Survey of Children's Health: 1.5th generation immigrant children (n = 1378), 2nd generation immigrant children (n = 4194), foreign adoptees (n = 270), and non-immigrant children (n = 54,877). The 1.5th generation immigrant children's connection to their heritage culture is stronger than or similar to the 2nd generation immigrants, while the foreign adoptees have little connection to their birth culture. Controlling for age, sex, family type and SES, the odds for having ADD/ADHD, Conduct Disorder, Anxiety Disorder, and Depression diagnosis were the lowest for the 1.5th generation immigrant children, followed by the 2nd generation immigrant children and the foreign adoptees. The foreign adoptees and non-adopted children were similar in the odds of having these disorders. Connection to heritage culture might be the underlying mechanism that explained recent immigrants' lower rates of mental disorders. PMID:26972324

  3. The Planar Optics Phase Sensor: a study for the VLTI 2nd Generation Fringe Tracker

    Blind, Nicolas; Absil, Olivier; Alamir, Mazen; Berger, Jean-Philippe; Defrère, Denis; Feautrier, Philippe; Hénault, François; Jocou, Laurent; Kern, Pierre; Laurent, Thomas; Malbet, Fabien; Mourard, Denis; Rousselet-Perrault, Karine; Sarlette, Alain; Surdej, Jean; Tarmoul, Nassima; Tatulli, Eric; Vincent, Lionel. DOI: 10.1117/12.857114

    2010-01-01

    In a few years, the second generation instruments of the Very Large Telescope Interferometer (VLTI) will routinely provide observations with 4 to 6 telescopes simultaneously. To reach their ultimate performance, they will need a fringe sensor capable of measuring the randomly varying optical path differences in real time. A collaboration between LAOG (PI institute), IAGL, OCA and GIPSA-Lab has proposed the Planar Optics Phase Sensor concept to ESO for the 2nd Generation Fringe Tracker. This concept is based on integrated optics technologies, enabling the design of extremely compact interferometric instruments that naturally provide single-mode spatial filtering. It allows operation with 4 and 6 telescopes by measuring the fringe position with a spectrally dispersed ABCD method. We present here the main analysis which led to the current concept as well as the expected on-sky performance and the proposed design.
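    The ABCD method mentioned above estimates the fringe phase from four intensity samples taken at quadrature (0°, 90°, 180°, 270° of added phase). The sketch below shows one common convention for the estimator; it is illustrative, not the instrument's actual pipeline:

```python
from math import atan2, cos, pi

# ABCD phase estimator: four quadrature samples of the fringe yield the
# fringe phase directly. Convention and names are illustrative only.
def abcd_phase(a, b, c, d):
    """Fringe phase (radians) from four quadrature intensity samples."""
    return atan2(d - b, a - c)

# Simulate samples of I = 1 + cos(phi + k*pi/2) for a true phase of pi/3.
phi = pi / 3
a, b, c, d = (1 + cos(phi + k * pi / 2) for k in range(4))
print(round(abcd_phase(a, b, c, d), 3))  # 1.047, i.e. pi/3
```

    Because D - B = 2 sin(phi) and A - C = 2 cos(phi) for an ideal fringe, the estimator is insensitive to the constant background; dispersing the samples spectrally additionally resolves the 2*pi ambiguity.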

  4. Enabling the 2nd Generation in Space: Building Blocks for Large Scale Space Endeavours

    Barnhardt, D.; Garretson, P.; Will, P.

    Today the world operates within a "first generation" space industrial enterprise, i.e. all industry is on Earth, all value from space is from bits (data essentially), and the focus is Earth-centric, with very limited parts of our population and industry participating in space. We are limited in access, manoeuvring, on-orbit servicing, in-space power, in-space manufacturing and assembly. The transition to a "Starship culture" requires the Earth to progress to a "second generation" space industrial base, which implies the need to expand the economic sphere of activity of mankind outside of an Earth-centric zone and into cislunar space and beyond, with an equal ability to tap the indigenous resources in space (energy, location, materials) that will contribute to an expanding space economy. Right now, there is no comfortable place for space applications that are not discovery science, exploration, military, or established earth bound services. For the most part, space applications leave out -- or at least leave nebulous, unconsolidated, and without a critical mass -- programs and development efforts for infrastructure, industrialization, space resources (survey and process maturation), non-traditional and persistent security situational awareness, and global utilities -- all of which, to a far greater extent than a discovery and exploration program, may help determine the elements of a 2nd generation space capability. We propose a focus to seed the pre-competitive research that will enable global industry to develop the necessary competencies that we currently lack to build large scale space structures on-orbit, that in turn would lay the foundation for long duration spacecraft travel (i.e. key technologies in access, manoeuvrability, etc.). This paper will posit a vision-to-reality for a stepwise approach to the types of activities the US and global space providers could embark upon to lay the foundation for the 2nd generation of Earth in space.

  5. The New 2nd-Generation SRF R&D Facility at Jefferson Lab: TEDF

    Reece, Charles E.; Reilly, Anthony V.

    2012-09-01

    The US Department of Energy has funded a near-complete renovation of the SRF-based accelerator research and development facilities at Jefferson Lab. The project to accomplish this, the Technical and Engineering Development Facility (TEDF) Project, has completed the first of two phases. An entirely new 3,100 m² purpose-built SRF technical work facility has been constructed and was occupied in summer of 2012. All SRF work processes with the exception of cryogenic testing have been relocated into the new building. All cavity fabrication, processing, thermal treatment, chemistry, cleaning, and assembly work is collected conveniently into a new LEED-certified building. An innovatively designed 800 m² cleanroom/chemroom suite provides long-term flexibility for support of multiple R&D and construction projects as well as continued process evolution. The characteristics of this first 2nd-generation SRF facility are described.

  6. Boosting biogas yield of anaerobic digesters by utilizing concentrated molasses from 2nd generation bioethanol plant

    Sarker, Shiplu [Department of Renewable Energy, Faculty of Engineering and Science, University of Agder, Grimstad-4879 (Norway); Moeller, Henrik Bjarne [Department of Biosystems Engineering, Faculty of Science and Technology, Aarhus University, Research center Foulum, Blichers Alle, Post Box 50, Tjele-8830 (Denmark)

    2013-07-01

    Concentrated molasses (C5 molasses) from a 2nd generation bioethanol plant has been investigated for enhancing the productivity of manure-based digesters. A batch study at mesophilic conditions (35 ± 1 °C) showed a maximum methane yield from molasses of 286 L CH4/kg VS, which was approximately 63% of the calculated theoretical yield. In addition to the batch study, co-digestion of molasses with cattle manure in a semi-continuously stirred reactor at thermophilic temperature (50 ± 1 °C) was also performed, with a stepwise increase in molasses concentration. The results from this experiment revealed a maximum average biogas yield of 1.89 L/L/day when 23% VS from molasses was co-digested with cattle manure. However, digesters fed with more than 32% VS from molasses and with a short adaptation period resulted in VFA accumulation and reduced methane productivity, indicating that this level should not be exceeded when using molasses as a biogas booster.
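    The abstract's own numbers allow a quick consistency check: if 286 L CH4/kg VS is about 63% of the theoretical yield, the implied theoretical value follows directly. The figure computed below is inferred from those two numbers, not taken from the paper:

```python
# Consistency check of the reported batch figures: the measured methane
# yield and its stated share of the theoretical yield imply the
# theoretical yield itself.
measured = 286.0   # L CH4 per kg VS (batch test, 35 C)
fraction = 0.63    # reported share of the calculated theoretical yield
theoretical = measured / fraction
print(round(theoretical))  # 454
```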

  7. 2nd International Workshop on Crowd Sourcing in Software Engineering (CSI-SE 2015)

    Fraser, G.; LaToza, T.D.; Mariani, L.

    2015-01-01

    Crowdsourcing is increasingly revolutionizing the ways in which software is engineered. Programmers increasingly crowdsource answers to their questions through Q&A sites. Non-programmers may contribute human intelligence to development projects by, for example, usability-testing software or even playing games with a purpose to implicitly construct formal specifications. Crowdfunding helps to democratize decisions about what software to build. Software engineering researchers may even benefit...

  8. Geodesign from Theory to Practice: From Metaplanning to 2nd Generation of Planning Support Systems

    Michele Campagna

    2014-05-01

    This paper deals with the concept of Geodesign, a new approach to spatial planning and design which is grounded in extensive use of Geographic Information Science methods and tools. As a method, Geodesign is intended to inform projects from their conceptualization, through analysis and diagnosis, to the design of alternatives and impact simulation, and eventually the final choice. This approach appears particularly urgent and timely to many scholars from academia and practitioners from industry and planning practice, for advances in GIScience nowadays offer unprecedented data and tools to manage territorial knowledge for decision-making support. The author argues that research in Geodesign may contribute to solving major pitfalls in sustainable spatial planning: namely, it may offer methods to help planners inform sustainable design alternatives with environmental considerations and contextually assess their impacts; secondly, it may help to ensure more transparent, responsible, and accountable democratic decision-making processes. The argumentation is supported by the author's recent research results regarding the evolution from 1st generation Planning Support Systems (PSS), to metaplanning and 2nd generation PSS.

  9. Improved beam spot measurements in the 2nd generation proton beam writing system

    Nanosized ion beams (especially proton and helium) play a pivotal role in the field of ion beam lithography and ion beam analysis. Proton beam writing has shown lithographic details down to the 20 nm level, limited by the proton beam spot size. Introducing a smaller spot size will allow smaller lithographic features. Smaller probe sizes will also drastically improve the spatial resolution of ion beam analysis techniques. Among many other requirements, an ideal resolution standard for beam focusing and a reliable focusing method are important prerequisites for sub-10 nm beam spot focusing. In this paper we present the fabrication process for a free-standing resolution standard with reduced side-wall projection and high side-wall verticality. The resulting grid is orthogonal (90.0° ± 0.1°) and has smooth edges with less than 6 nm of side-wall projection. The new resolution standard has been used in focusing a 2 MeV H2+ beam in the 2nd generation PBW system at the Center for Ion Beam Applications, NUS. The beam size has been characterized using on- and off-axis scanning transmission ion microscopy (STIM) and ion-induced secondary electron detection, carried out with a newly installed microchannel plate electron detector. The latter has been shown to be a realistic alternative to STIM measurements, as the drawback of PIN diode detector damage is alleviated. With these improvements we show reproducible beam focusing down to 14 nm

  10. Conceptual design study of $Nb_{3} Sn$ low-beta quadrupoles for 2nd generation LHC IRs

    Zlobin, A V; Andreev, N; Barzi, E; Bauer, P; Chichili, D R; Huang, Y; Imbasciati, L; Kashikhin, V V; Lamm, M J; Limon, P; Novitski, I; Peterson, T; Strait, J B; Yadav, S; Yamada, R

    2003-01-01

    Conceptual designs of 90-mm aperture high-gradient quadrupoles based on the Nb3Sn superconductor are being developed at Fermilab for possible 2nd generation IRs with similar optics to the current low-beta insertions. Magnet designs and results of magnetic, mechanical, thermal and quench protection analysis for these magnets are presented and discussed. (10 refs).

  11. Multi-objective Optimization of a Solar Assisted 1st and 2nd Generation Sugarcane Ethanol Production Plant

    Zevenhoven, Ron; Wallerand, Anna Sophia; Queiroz Albarelli, Juliana; Viana Ensinas, Adriano; Ambrosetti, Gianluca; Mian, Alberto; Maréchal, François

    2014-01-01

    Ethanol production sites utilizing sugarcane as feedstock are usually located in regions with high land availability and decent solar radiation. This offers the opportunity to cover parts of the process energy demand with concentrated solar power (CSP) and thereby increase the fuel production and carbon conversion efficiency. A plant is examined that produces 1st and 2nd generation ethanol by fermentation of sugars (from sugarcane) and enzymatic hydrolysis of the lignocellulosic residues (bag...

  12. Generative Software Development

    Rumpe, Bernhard; Schindler, Martin; Völkel, Steven; Weisemöller, Ingo

    2014-01-01

    Generation of software from modeling languages such as UML and domain specific languages (DSLs) has become an important paradigm in software engineering. In this contribution, we present some positions on software development in a model based, generative manner based on home grown DSLs as well as the UML. This includes development of DSLs as well as development of models in these languages in order to generate executable code, test cases or models in different languages. Development of formal...
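The record above describes generating executable code from models written in DSLs. A minimal sketch of that idea, with an invented dictionary-based mini-DSL standing in for the home-grown DSLs mentioned in the abstract, might look like this: a declarative model of data classes is turned into Python source, compiled, and used.

```python
# Minimal model-driven code generation sketch (the MODEL "DSL" is a
# hypothetical illustration, not the tooling from the record above).

MODEL = {
    "Point": {"fields": ["x", "y"]},
    "Segment": {"fields": ["start", "end"]},
}

def generate_class(name, spec):
    """Emit Python source for one class described in the model."""
    args = ", ".join(spec["fields"])
    body = "\n".join(f"        self.{f} = {f}" for f in spec["fields"])
    return (
        f"class {name}:\n"
        f"    def __init__(self, {args}):\n"
        f"{body}\n"
    )

def generate_module(model):
    """Concatenate all generated classes into one module's source."""
    return "\n".join(generate_class(n, s) for n, s in model.items())

source = generate_module(MODEL)
namespace = {}
exec(source, namespace)          # compile and load the generated code
p = namespace["Point"](1, 2)
print(p.x, p.y)                  # -> 1 2
```

Real generators target test cases and other languages as well, but the pipeline shape is the same: model in, source text out, then compilation.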

  13. BMI differences in 1st and 2nd generation immigrants of Asian and European origin to Australia.

    Hauck, Katharina; Hollingsworth, Bruce; Morgan, Lawrie

    2011-01-01

    We estimate assimilation of immigrants' body mass index (BMI) to the host population of Australia over one generation, conducting separate analyses for immigrants from 7 regions of Europe and Asia. We use quantile regressions to allow for a differing impact of generational status across 19 quantiles of BMI, from under-weight to morbidly obese individuals. We find that 1st generation South European immigrants have higher, and South and East Asian immigrants lower, BMI than Australians, but that both have assimilated to the BMI of their hosts by the 2nd generation. There are no or only small BMI differences between Australians and 1st and 2nd generation immigrants from East Europe, North-West Europe, the Middle East and Pacific regions. We conclude that both upward and downward assimilation in some immigrant groups are most likely caused by factors which can change over one generation (such as acculturation), and not factors which would take longer to change (such as genetics). Our results suggest that public health policies targeting the lifestyles of well-educated Asian immigrants may be effective in preventing BMI increase in this subgroup. PMID:20869292
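The design above compares distributions quantile by quantile rather than only at the mean. A simplified empirical illustration of that granularity (synthetic data, not the study's; the paper itself uses quantile regression, which additionally conditions on covariates) can be written with the standard library:

```python
# Illustrative sketch: comparing two BMI distributions at 19 quantiles,
# in the spirit of the quantile-based design described above.
# All numbers are synthetic assumptions, not the study's data.
import random
from statistics import quantiles

random.seed(0)
# Hypothetical samples: host population vs one 1st-generation group.
host = [random.gauss(26.0, 4.0) for _ in range(5000)]
immigrant = [random.gauss(24.5, 3.5) for _ in range(5000)]

# n=20 returns 19 cut points (the 5th, 10th, ..., 95th percentiles),
# matching the "19 quantiles" granularity mentioned in the abstract.
host_q = quantiles(host, n=20)
imm_q = quantiles(immigrant, n=20)

for i, (h, m) in enumerate(zip(host_q, imm_q), start=1):
    print(f"q{5*i:02d}: host={h:5.1f}  immigrant={m:5.1f}  gap={m-h:+.1f}")
```

Looking across all 19 cut points shows whether a group differs only in the tails (e.g. among the obese) or across the whole distribution, which a single mean comparison would hide.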

  14. White paper on perspectives of biofuels in Denmark - with focus on 2nd generation bioethanol; Hvidbog om perspektiver for biobraendstoffer i Danmark - med fokus paa 2. generations bioethanol

    Larsen, Gy.; Foghmar, J.

    2009-11-15

    The white paper presents the perspectives - both options and barriers - for a Danish focus on production and use of biomass, including sustainable 2nd generation bioethanol, for transport. The white paper presents the current knowledge of biofuels and bioethanol and recommendations for a Danish strategy. (ln)

  15. Time resolved 2nd harmonic generation at LaAlO3/SrTiO3 Interfaces

    Adhikari, Sanjay; Eom, Chang-Beom; Ryu, Sangwoo; Cen, Cheng

    2014-03-01

    Ultrafast spectroscopy can produce information on carrier/lattice dynamics, which is especially valuable for understanding phase transitions at LaAlO3/SrTiO3 interfaces. LaAlO3 (LAO) and SrTiO3 (STO) both have wide band gaps, which allow deep penetration of commonly used laser wavelengths and therefore usually lead to an overwhelming bulk signal background. Here we report a time resolved study of a 2nd harmonic generation (SHG) signal resulting from impulsive below-the-band-gap optical pumping. The nonlinear nature of the signal enables us to probe the interface directly. The output of a home-built Ti:Sapphire laser and a BBO crystal were used to generate 30 fs pulses of two colors (405 nm and 810 nm). The 405 nm pulse was used to pump the LAO/STO interfaces, while the 2nd harmonic of the 810 nm pulse generated at the interfaces was probed as a function of the time delay. Signals from samples with varying LAO thicknesses clearly correlate with the metal-insulator transition. Distinct time dependent signals were observed at LAO/STO interfaces grown on different substrates. Experiments performed at different optical polarization geometries, interface electric fields and temperatures allow us to paint a clearer picture of the novel oxide heterostructures under investigation.
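The interface sensitivity claimed above follows from the symmetry of second-harmonic generation. In the electric-dipole approximation the second-harmonic polarization is driven by the second-order susceptibility, which vanishes in centrosymmetric media such as bulk LAO and STO, so the dipole-allowed signal comes from the interface where inversion symmetry is broken:

```latex
% Second-harmonic polarization and intensity (electric-dipole approximation):
P_i(2\omega) = \epsilon_0 \sum_{jk} \chi^{(2)}_{ijk}\, E_j(\omega)\, E_k(\omega),
\qquad
I(2\omega) \propto \bigl|\chi^{(2)}_{\mathrm{eff}}\bigr|^{2}\, I(\omega)^{2}.
% In centrosymmetric bulk LAO and STO, \chi^{(2)}_{ijk} = 0 by symmetry,
% so the SHG signal originates where inversion symmetry is broken: at the
% interface. The quadratic dependence on I(\omega) is why short, intense
% pulses are used as the probe.
```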

  16. Clinical evaluation of the 2nd generation radio-receptor assay for anti-thyrotropin receptor antibodies (TRAb) in Graves' disease

    Full text: Detection of autoantibodies to the TSH receptor by radioreceptor assays (RRA) is widely requested in clinical practice for the diagnosis of Graves' disease and its differentiation from diffuse thyroid autonomy. Additionally, TRAb measurement during antithyroid drug treatment can be useful to evaluate the risk of disease relapse after therapy discontinuation. Nevertheless, some patients affected by Graves' disease are TRAb negative when the 1st generation assay is used. Recently a new RRA method for the TRAb assay was developed using the human recombinant TSH receptor and a solid-phase technique. The aim of our work was to compare 1st and 2nd generation TRAb assays in Graves' disease patients and, particularly, to evaluate the 2nd generation test in a sub-group of patients affected by Graves' disease but with a negative 1st generation TRAb assay. We evaluated the diagnostic performance of a newly developed 2nd generation TRAb assay (DYNOtest(r) TRAK human, BRAHMS Diagnostica GmbH, Germany) in 46 patients affected by Graves' disease with a negative 1st generation TRAb assay (TRAK Assay(r), BRAHMS Diagnostica GmbH, Germany). Control groups of 50 Graves' disease patients with a positive 1st generation TRAb assay, 50 patients affected by Hashimoto's thyroiditis and 50 patients affected by nodular goiter were also examined. 41 out of 46 patients affected by Graves' disease with a negative 1st generation TRAb assay showed a positive 2nd generation test. The overall sensitivity of the 2nd generation test was significantly improved with respect to the 1st generation assay in Graves' disease patients (χ2 = 22.5, p<0.0001). 1 and 3 out of 50 patients affected by Hashimoto's thyroiditis were positive by the 1st and 2nd generation TRAb assays, respectively. All these patients showed primary hypothyroidism. No differences were found in the euthyroid Hashimoto's thyroiditis sub-group or in the nodular goiter control group. The 2nd generation TRAb assay is clearly more sensitive than the 1st generation assay.

  17. Large-aperture $Nb_{3}Sn$ quadrupoles for $2^{nd}$ generation LHC IRs

    Zlobin, A V; Chichili, D R; Huang Yu; Kashikhin, V V; Lamm, M J; Limon, P J; Mokhov, N V; Novitski, I; Peterson, T; Strait, J B; Yadav, S

    2002-01-01

    The 1st generation of low-beta quadrupoles for the LHC interaction region (IR) was designed to achieve the nominal LHC luminosity of 10^34 cm^-2 s^-1. Given that the lifetime of the 1st generation IR quadrupoles is limited by ionizing radiation to 6-7 years, the 2nd generation of IR quadrupoles has to be developed with the goal of achieving the ultimate luminosity of up to 10^35 cm^-2 s^-1. The IR quadrupole parameters such as nominal gradient, dynamic aperture, physical aperture, and operation margins are the main factors limiting the machine performance. Conceptual designs of 90-mm aperture high-gradient quadrupoles, suitable for use in 2nd generation high-luminosity LHC IRs with similar optics, are presented. The issues related to the field gradient, field quality and operation margins are discussed. (5 refs).

  18. Mobile Radio Communications: Second and Third Generation Cellular and WATM Systems: 2nd

    Steele, R; Hanzo, L

    1999-01-01

    This comprehensive all-in-one reference work covers the fundamental physical aspects of mobile communications and explains the latest techniques employed in second and third generation digital cellular mobile radio systems. Mobile radio communications technology has progressed rapidly and it is now capable of the transmission of voice, data and image signals. This new edition reflects the current state-of-the-art by featuring: * Expanded and updated sections on voice compression techniques, i...

  19. Bellman's GAP : a 2nd generation language and system for algebraic dynamic programming

    Sauthoff, Georg

    2010-01-01

    The dissertation describes the new Bellman’s GAP which is a programming system for writing dynamic programming algorithms over sequential data. It is the second generation implementation of the algebraic dynamic programming framework (ADP). The system includes the multi-paradigm language (GAP-L), its compiler (GAP-C), functional modules (GAP-M) and a web site (GAP Pages) to experiment with GAP-L programs. GAP-L includes declarative constructs, e.g. tree grammars to model the search space, and...

  20. Next generation LP system for maintenance in nuclear power reactors (2nd report)

    Laser peening is a surface enhancement process that introduces compressive residual stress in materials by irradiating them with laser pulses under an aqueous environment. The process utilizes the impulsive effect of the high-pressure plasma generated by the ablative interaction of each laser pulse. Around a decade ago, the authors invented a new laser peening (LP) process that requires no surface preparation, whereas conventional types required a coating to prevent the surface from melting. Taking advantage of the new process without surface preparation, we have applied laser peening without coating to nuclear power plants as preventive maintenance against stress corrosion cracking (SCC). Toshiba released the first LP system in 1999, which delivered laser pulses through waterproof pipes with mirrors. In 2002, fiber delivery was attained and significantly extended the applicability. Now the development of a new system has just been accomplished, which is extremely simple, reliable and easy to handle. (author)

  1. Self-assembling software generator

    Bouchard, Ann M. (Albuquerque, NM); Osbourn, Gordon C. (Albuquerque, NM)

    2011-11-25

    A technique to generate an executable task includes inspecting a task specification data structure to determine what software entities are to be generated to create the executable task, inspecting the task specification data structure to determine how the software entities will be linked after generating the software entities, inspecting the task specification data structure to determine logic to be executed by the software entities, and generating the software entities to create the executable task.
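The patent abstract above describes three inspections of a task-specification data structure: which software entities to generate, how to link them, and what logic each executes. A hedged sketch of that flow, with all names and the dictionary layout invented for illustration, could look like:

```python
# Hypothetical sketch of the abstract's idea (the spec format and names
# are invented, not the patented implementation): a task specification is
# inspected for entities, links, and logic, then assembled into one
# executable task.

TASK_SPEC = {
    "entities": ["reader", "doubler", "printer"],
    "links": [("reader", "doubler"), ("doubler", "printer")],
    "logic": {
        "reader": lambda _: 21,          # source entity: produce a value
        "doubler": lambda x: x * 2,      # transform entity
        "printer": lambda x: x,          # sink entity: pass result through
    },
}

def build_task(spec):
    """Generate a callable pipeline from the specification."""
    # 1. inspect which entities to generate
    entities = {name: spec["logic"][name] for name in spec["entities"]}
    # 2. inspect how they are linked (assumes a simple linear chain)
    order = [spec["links"][0][0]] + [dst for _, dst in spec["links"]]
    # 3. compose their logic into one executable task
    def task():
        value = None
        for name in order:
            value = entities[name](value)
        return value
    return task

run = build_task(TASK_SPEC)
print(run())  # -> 42
```

The point of the indirection is that changing the specification alone re-wires the generated task; no hand-written glue code is touched.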

  2. Cogeneration and production of 2nd generation bio fuels using biomass gasification; Cogeneracion y produccion de biocombustibles de 2 generacion mediante gasificacion de biomasa

    Uruena Leal, A.; Diez Rodriguez, D.; Antolin Giraldo, G.

    2011-07-01

    Gasification is a thermochemical decomposition process in which a carbonaceous fuel, under certain conditions of temperature and oxygen deficiency, undergoes a series of reactions that produce a range of gaseous products. It is now widely used because of the high energy performance and versatility of these gaseous products for energy and 2nd generation biofuels, and because it reduces the emission of greenhouse gases. (Author)

  3. Open pit mine planning and design. Vol 1. Fundamentals; Vol. 2. CSMine software package and orebody case examples. 2nd.

    Hustrulid, W.; Kuchta, M. [University of Utah, Salt Lake City, UT (United States)

    2006-04-15

    This book is designed to be both a textbook and a reference book describing the principles involved in the planning and design of open pit mines. Volume 1 deals with the fundamental concepts involved in the planning and design of an open pit mine. The eight chapters cover mine planning, mining revenues and costs, orebody description, geometrical considerations, pit limits, production planning, mineral resources and ore reserves, and responsible mining. There is extensive coverage of environmental considerations and basic economic principles. A large number of examples have been included to illustrate the applications. The second volume is devoted to a mine design and software package, CSMine. CSMine is user-friendly mine planning and design software developed specifically to illustrate the practical application of the principles involved. It also comprises the CSMine tutorial, the CSMine user's manual and eight orebody case examples, including drillhole data sets for performing a complete open pit mine evaluation. 545 ills., 211 tabs.
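One of the fundamental calculations in the planning topics listed above (mining revenues and costs, pit limits) is the economic value of a mining block, which decides whether a block is treated as ore or waste. This is a generic textbook-style sketch with assumed prices and costs, not figures from the book:

```python
# Hedged sketch of a standard block-economics calculation (all prices and
# costs are assumed for illustration).

def block_value(tonnes, grade, price=8000.0, recovery=0.90,
                mining_cost=2.5, processing_cost=12.0):
    """Net value ($) of one block.

    grade           -- metal fraction (e.g. 0.01 = 1% metal)
    price           -- $ per tonne of recovered metal (assumed)
    mining_cost     -- $/tonne moved, paid for ore and waste alike
    processing_cost -- $/tonne milled, paid only if treated as ore
    """
    revenue = tonnes * grade * recovery * price
    as_ore = revenue - tonnes * (mining_cost + processing_cost)
    as_waste = -tonnes * mining_cost
    return max(as_ore, as_waste)      # treat the block the better way

# Breakeven (mill) cutoff grade: revenue just covers processing cost.
cutoff = 12.0 / (0.90 * 8000.0)
print(f"cutoff grade ~ {cutoff:.4f} ({cutoff * 100:.2f}%)")
print(block_value(tonnes=10_000, grade=0.012))   # above cutoff -> ore
print(block_value(tonnes=10_000, grade=0.001))   # below cutoff -> waste
```

Summing such block values along feasible extraction sequences is what pit-limit and production-scheduling algorithms optimize.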

  4. Strategies for 2nd generation biofuels in EU - Co-firing to stimulate feedstock supply development and process integration to improve energy efficiency and economic competitiveness

    The present biofuel policies in the European Union primarily stimulate 1st generation biofuels that are produced from conventional food crops. They may be a distraction from lignocellulose based 2nd generation biofuels - and also from biomass use for heat and electricity - by keeping farmers' attention and significant investments focused on first generation biofuels and the cultivation of conventional food crops as feedstocks. This article presents two strategies that can contribute to the development of 2nd generation biofuels based on lignocellulosic feedstocks. The integration of gasification-based biofuel plants in district heating systems is one option for increasing the energy efficiency and improving the economic competitiveness of such biofuels. Another option, biomass co-firing with coal, generates high-efficiency biomass electricity and reduces CO2 emissions by replacing coal. It also offers a near-term market for lignocellulosic biomass, which can stimulate the development of supply systems for biomass also suitable as feedstock for 2nd generation biofuels. Regardless of the long-term priorities of biomass use for energy, the stimulation of lignocellulosic biomass production by the development of near-term and cost-effective markets is judged to be a no-regrets strategy for Europe. Strategies that induce a relevant development and exploit existing energy infrastructures in order to reduce risk and reach lower costs are proposed as an attractive complement to the present and prospective biofuel policies. (author)

  6. Immobilized High Level Waste (HLW) Interim Storage Alternative Generation and analysis and Decision Report 2nd Generation Implementing Architecture

    CALMUS, R.B.

    2000-09-14

    Two alternative approaches were previously identified to provide second-generation interim storage of Immobilized High-Level Waste (IHLW). One approach was retrofit modification of the Fuel and Materials Examination Facility (FMEF) to accommodate IHLW. The results of the evaluation of the FMEF as the second-generation IHLW interim storage facility and subsequent decision process are provided in this document.

  7. Generative Software Engineering

    Jézéquel, Jean-Marc

    2007-01-01

    Researching ever more abstract and powerful ways of composing programs has been the meat of software engineering for half a century. Important early steps were subroutines (to encapsulate actions) and records (to encapsulate data). A large step forward came with the introduction of object-oriented concepts (classes, subclasses and virtual methods), where classes can encapsulate both data and behaviors in a very powerful, but still flexible, way. For a long time, these concepts dominated the scene...

  9. Advances with the new AIMS fab 193 2nd generation: a system for the 65 nm node including immersion

    Zibold, Axel M.; Poortinga, E.; Doornmalen, H. v.; Schmid, R.; Scherubl, T.; Harnisch, W.

    2005-06-01

    The Aerial Image Measurement System, AIMS, for 193nm lithography emulation is established as a standard for the rapid prediction of wafer printability for critical structures including dense patterns and defects or repairs on masks. The main benefit of AIMS is to save expensive image qualification consisting of test wafer exposures followed by wafer CD-SEM resist or wafer analysis. By adjustment of numerical aperture (NA), illumination type and partial coherence (σ) to match any given stepper/ scanner, AIMS predicts the printability of 193nm reticles such as binary with, or without OPC and phase shifting. A new AIMS fab 193 second generation system with a maximum NA of 0.93 is now available. Improvements in field uniformity, stability over time, measurement automation and higher throughput meet the challenging requirements of the 65nm node. A new function, "Global CD Map" can be applied to automatically measure and analyse the global CD uniformity of repeating structures across a reticle. With the options of extended depth-of-focus (EDOF) software and the upcoming linear polarisation capability in the illumination the new AIMS fab 193 second generation system is able to cover both dry and immersion requirements for NA < 1. Rigorous simulations have been performed to study the effects of polarisation for imaging by comparing the aerial image of the AIMS to the resist image of the scanner.

  10. Experimental Investigation of 2nd Generation Bioethanol Derived from Empty-fruit-bunch (EFB) of Oil-palm on Performance and Exhaust Emission of SI Engine

    Yanuandri Putrasari; Haznan Abimanyu; Achmad Praptijanto; Arifin Nur; Yan Irawan; Sabar Pangihutan Simanungkalit

    2014-01-01

    The experimental investigation of 2nd generation bioethanol derived from EFB of oil-palm blended with gasoline for 10, 20, 25% by volume and pure gasoline were conducted on performance and exhaust emission tests of SI engine. A four stroke, four cylinders, programmed fuel injection (PGMFI), 16 valves variable valve timing and electronic lift control (VTEC), single overhead camshaft (SOHC), and 1,497 cm3 SI engine (Honda/L15A) was used in this investigation. Engine performance test was carried...

  11. Anaerobic digestion in combination with 2nd generation ethanol production for maximizing biofuels yield from lignocellulosic biomass – testing in an integrated pilot-scale biorefinery plant

    Uellendahl, Hinrich; Ahring, Birgitte Kiær

    An integrated biorefinery concept for 2nd generation bioethanol production together with biogas production from the fermentation effluent was tested in pilot-scale. The pilot plant comprised pretreatment, enzymatic hydrolysis, hexose and pentose fermentation into ethanol, and anaerobic digestion of the fermentation effluent in a UASB (upflow anaerobic sludge blanket) reactor. Operation of the 770 liter UASB reactor was tested under both mesophilic (38ºC) and thermophilic (53ºC) conditions with increasing loading rates of the liquid fraction of the effluent from ethanol fermentation. At an OLR of ... was higher for mesophilic than for thermophilic operation. The effluent from the ethanol fermentation showed no signs of toxicity to the anaerobic microorganisms. Implementation of the biogas production from the fermentation effluent accounted for about 30% higher biofuels yield in the biorefinery...
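The operating variable stepped up during the trial above, the organic loading rate (OLR), is the mass of COD fed per unit reactor volume per day. A small sketch of that calculation, where only the 770 L reactor volume comes from the record and the feed values are assumed for illustration:

```python
# OLR sketch for a UASB reactor. Only VOLUME is from the record above;
# flow and COD concentration are assumed example values.

def organic_loading_rate(flow_m3_per_day, cod_kg_per_m3, volume_m3):
    """OLR in kg COD per m^3 of reactor per day."""
    return flow_m3_per_day * cod_kg_per_m3 / volume_m3

VOLUME = 0.770   # m^3, the 770 liter pilot UASB reactor
flow = 0.15      # m^3/day of fermentation effluent fed (assumed)
cod = 25.0       # kg COD/m^3 in the liquid fraction (assumed)

olr = organic_loading_rate(flow, cod, VOLUME)
print(f"OLR = {olr:.2f} kg COD m^-3 d^-1")
```

Stepping the feed flow up over time raises the OLR gradually, which is how such reactors are tested for the maximum load they tolerate before the methanogens are overloaded.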

  12. Methodology for measuring the impact of mobile technology change from 2nd to 3rd generation as perceived by users of SMEs in Barranquilla

    Jairo Polo

    2011-06-01

    This article presents the results of a research project undertaken to obtain a Masters in Business Administration from the Business School at the Universidad del Norte, whose purpose was to identify and test a methodology for measuring the impact exerted by the change from 2nd to 3rd generation mobile technology, based on the perception of users belonging to Barranquilla SMEs. The work was motivated by the influence of technological changes on behavior and knowledge creation among society's members, and by the importance that the adoption of applications for process automation and web-based applications for voice, data and video has taken on for the survival of organizations, allowing the development of competitive advantages based on information and creativity for new and better products or services.

  13. Efficient 2nd and 4th harmonic generation of a single-frequency, continuous-wave fiber amplifier.

    Sudmeyer, Thomas; Imai, Yutaka; Masuda, Hisashi; Eguchi, Naoya; Saito, Masaki; Kubota, Shigeo

    2008-02-01

    We demonstrate efficient cavity-enhanced second and fourth harmonic generation of an air-cooled, continuous-wave (cw), single-frequency 1064 nm fiber-amplifier system. The second harmonic generator achieves up to 88% total external conversion efficiency, generating more than 20 W at 532 nm wavelength in a diffraction-limited beam (M² ...) with the crystal operated at 25 °C. The fourth harmonic generator is based on an AR-coated, Czochralski-grown β-BaB2O4 (BBO) crystal optimized for low loss and high damage threshold. Up to 12.2 W of 266 nm deep-UV (DUV) output is obtained using a 6 mm long critically phase-matched BBO operated at 40 °C. This power level is more than two times higher than previously reported for cw 266 nm generation. The total external conversion efficiency from the fundamental at 1064 nm to the fourth harmonic at 266 nm is >50%. PMID:18542230
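The cascaded efficiencies in the abstract above are simple ratios of stage powers. A short arithmetic sketch: the 88% SHG efficiency and 12.2 W UV output are from the record, while the fundamental power is an assumed value chosen so the numbers are consistent with the reported >50% overall efficiency:

```python
# Cascaded harmonic-conversion arithmetic. eta_shg and p_266 are from the
# record above; p_fundamental is an assumed illustrative value.

p_fundamental = 23.0               # W at 1064 nm (assumed)
eta_shg = 0.88                     # 1064 nm -> 532 nm (from the record)

p_532 = eta_shg * p_fundamental    # power available to the 4th-harmonic stage
p_266 = 12.2                       # W at 266 nm (from the record)

eta_fhg = p_266 / p_532            # 532 nm -> 266 nm stage efficiency
eta_total = p_266 / p_fundamental  # overall 1064 nm -> 266 nm efficiency

print(f"532 nm: {p_532:.1f} W, second-stage efficiency {eta_fhg:.0%}")
print(f"overall 1064 -> 266 nm efficiency {eta_total:.0%}")
```

Because the two stages multiply, a small loss in the first doubler is paid twice; that is why the 88% first stage matters so much for the deep-UV output.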

  14. Lignocellulosic ethanol in Brazil : technical assessment of 1st and 2nd generation sugarcane ethanol in a Brazilian setting

    Stojanovic, M.; Bakker, R.R.C.

    2009-01-01

    Brazil is currently the largest ethanol-biofuel producer worldwide. Ethanol is produced by fermenting the sucrose part of the sugarcane that contains only one third of the sugarcane energy. The rest of the plant is burned to produce energy to run the process and to generate electricity that is sold to the public grid, making the process a net energy producer. This paper evaluates current technology from an energy efficiency point of view and quantifies additional benefits from extra energy ge...

  16. FT-IR Investigation of Hoveyda-Grubbs'2nd Generation Catalyst in Self-Healing Epoxy Mixtures

    The development of smart composites capable of self-repair on aeronautical structures is still at the planning stage owing to complex issues to overcome. A very important issue to solve concerns the stability of the proposed composites' components, which is compromised at the cure temperatures necessary for good performance of the composite. In this work we analyzed the possibility of applying Hoveyda-Grubbs' second generation catalyst (HG2) to develop self-healing systems. Our experimental results have shown critical issues in the use of epoxy precursors in conjunction with the Hoveyda-Grubbs II metathesis catalyst. However, an appropriate curing cycle of the self-healing mixture makes it possible to overcome these critical issues, allowing high curing temperatures without deactivating the self-repair activity.

  17. Contribution of ion beam analysis methods to the development of 2nd generation high temperature superconducting (HTS) wires

    Usov, Igor O [Los Alamos National Laboratory; Arendt, Paul N [Los Alamos National Laboratory; Stan, Liliana [Los Alamos National Laboratory; Holesinger, Terry G [Los Alamos National Laboratory; Foltyn, Steven R [Los Alamos National Laboratory; Depaula, Raymond F [Los Alamos National Laboratory

    2009-01-01

    One of the crucial steps in the second generation high temperature superconducting wire program was the development of the buffer layer architecture. The architecture designed at the Superconductivity Technology Center at Los Alamos National Laboratory consists of several oxide layers wherein each layer plays a specific role, namely: nucleation layer, diffusion barrier, biaxially textured template, and an intermediate layer with a good match to the lattice parameter of the superconducting Y1Ba2Cu3O7 (YBCO) compound. This report demonstrates how a wide range of ion beam analysis techniques (SIMS, RBS, channeling, PIXE, PIGE, NRA, ERD) was employed for analysis of each buffer layer and the YBCO films. These results assisted in the understanding of a variety of physical processes occurring during the buffer layer fabrication and helped to optimize the buffer layer architecture as a whole.

  18. Control system for the 2nd generation Berkeley automounters (BAM2) at GM/CA-CAT macromolecular crystallography beamlines

    GM/CA-CAT at Sector 23 of the Advanced Photon Source (APS) is an NIH funded facility for crystallographic structure determination of biological macromolecules by X-ray diffraction. A second-generation Berkeley automounter is being integrated into the beamline control system at the 23BM experimental station. This new device replaces the previous all-pneumatic gripper motions with a combination of pneumatics and XYZ motorized linear stages. The latter adds a higher degree of flexibility to the robot including auto-alignment capability, accommodation of a larger capacity sample Dewar of arbitrary shape, and support for advanced operations such as crystal washing, while preserving the overall simplicity and efficiency of the Berkeley automounter design.

  19. New approaches for improving the production of the 1st and 2nd generation ethanol by yeast.

    Kurylenko, Olena; Semkiv, Marta; Ruchala, Justyna; Hryniv, Orest; Kshanovska, Barbara; Abbas, Charles; Dmytruk, Kostyantyn; Sibirny, Andriy

    2016-01-01

    An increase in the production of 1st generation ethanol from glucose is possible through a reduction in the production of ethanol co-products, especially biomass. We have developed a method to reduce biomass accumulation of Saccharomyces cerevisiae by manipulating the intracellular ATP level through overexpression of genes for alkaline phosphatase, apyrase or enzymes involved in futile cycles. The strains constructed accumulated up to 10% more ethanol on a cornmeal hydrolysate medium. A similar increase in ethanol accumulation was observed in mutants resistant to toxic inhibitors of glycolysis such as 3-bromopyruvate. A substantial increase in fuel ethanol production will be obtained by the development of new strains of yeasts that ferment sugars of the abundant lignocellulosic feedstocks, especially xylose, a pentose sugar. We have found that xylose can be fermented under elevated temperatures by the thermotolerant yeast Hansenula polymorpha. We combined protein engineering of the gene coding for xylose reductase (XYL1) with overexpression of the other two genes responsible for xylose metabolism in yeast (XYL2, XYL3) and the deletion of the global transcriptional activator CAT8, together with the selection of mutants defective in utilizing ethanol as a carbon source using the anticancer drug 3-bromopyruvate. The resulting strains accumulated 20-25 times more ethanol from xylose at the elevated temperature of 45°C, with up to 12.5 g L(-1) produced. An increase in ethanol yield and productivity from xylose was also achieved by overexpression of genes coding for the peroxisomal enzymes transketolase (DAS1) and transaldolase (TAL2), and deletion of the ATG13 gene. PMID:26619255

  20. Experimental Investigation of 2nd Generation Bioethanol Derived from Empty-fruit-bunch (EFB) of Oil-palm on Performance and Exhaust Emission of SI Engine

    Yanuandri Putrasari

    2014-07-01

    Full Text Available The experimental investigation of 2nd generation bioethanol derived from EFB of oil-palm, blended with gasoline at 10, 20 and 25% by volume, and of pure gasoline was conducted with performance and exhaust emission tests on an SI engine. A four-stroke, four-cylinder, programmed fuel injection (PGMFI), 16-valve variable valve timing and electronic lift control (VTEC), single overhead camshaft (SOHC), 1,497 cm3 SI engine (Honda/L15A) was used in this investigation. The engine performance test measured brake torque, power, and fuel consumption. The exhaust emission was analyzed for carbon monoxide (CO) and hydrocarbon (HC). The engine was operated over a speed range from 1,500 to 4,500 rev/min at 85% throttle opening. The results showed that the highest brake torque of the bioethanol blends was achieved with 10% bioethanol content at 3,000 to 4,500 rpm; the brake power of the 10% blend was greater than that of pure gasoline at 3,500 to 4,500 rpm; and the 10 and 20% bioethanol-gasoline blends showed greater brake-specific fuel consumption (bsfc) than pure gasoline at low speeds from 1,500 to 3,500 rpm. CO and HC emissions tended to decrease as engine speed increased.

  1. Next generation software process improvement

    Turnas, Daniel

    2003-01-01

    Approved for public release; distribution is unlimited Software is often developed under a process that can at best be described as ad hoc. While it is possible to develop quality software under an ad hoc process, formal processes can be developed to help increase the overall quality of the software under development. The application of these processes allows for an organization to mature. The software maturity level, and process improvement, of an organization can be measured with the Cap...

  2. Online Rule Generation Software Process Model

    Sudeep Marwaha

    2013-07-01

    Full Text Available For production systems such as expert systems, rule generation software can facilitate faster deployment. The software process model for rule generation using a decision tree classifier describes the steps required to develop a web-based software model for decision rule generation. Royce's final waterfall model is used in this paper to explain the software development process. The paper presents the specific outputs of the various steps of the modified waterfall model for decision rule generation.

  3. Quantification of left and right ventricular function and myocardial mass: Comparison of low-radiation dose 2nd generation dual-source CT and cardiac MRI

    Objective: To prospectively evaluate the accuracy of left and right ventricular function and myocardial mass measurements based on a dual-step, low radiation dose protocol with prospectively ECG-triggered 2nd generation dual-source CT (DSCT), using cardiac MRI (cMRI) as the reference standard. Materials and methods: Twenty patients underwent 1.5 T cMRI and prospectively ECG-triggered dual-step pulsing cardiac DSCT. This image acquisition mode performs low-radiation (20% tube current) imaging over the majority of the cardiac cycle and applies full radiation only during a single adjustable phase. Full-radiation-phase images were used to assess cardiac morphology, while low-radiation-phase images were used to measure left and right ventricular function and mass. Quantitative CT measurements based on contiguous multiphase short-axis reconstructions from the axial CT data were compared with short-axis SSFP cardiac cine MRI. Contours were manually traced around the ventricular borders for calculation of left and right ventricular end-diastolic volume, end-systolic volume, stroke volume, ejection fraction and myocardial mass for both modalities. Statistical methods included independent t-tests, the Mann–Whitney U test, Pearson correlation statistics, and Bland–Altman analysis. Results: All CT measurements of left and right ventricular function and mass correlated well with those from cMRI: for left/right end-diastolic volume r = 0.885/0.801, left/right end-systolic volume r = 0.947/0.879, left/right stroke volume r = 0.620/0.697, left/right ejection fraction r = 0.869/0.751, and left/right myocardial mass r = 0.959/0.702. Mean radiation dose was 6.2 ± 1.8 mSv. Conclusions: Prospectively ECG-triggered, dual-step pulsing cardiac DSCT accurately quantifies left and right ventricular function and myocardial mass in comparison with cMRI with substantially lower radiation exposure than reported for traditional retrospective ECG-gating.
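Bland-Altman analysis, one of the statistical methods listed above, is simple enough to sketch directly; the function and the paired readings below are a generic illustration with hypothetical values, not the study's actual data or code.

```python
import statistics

def bland_altman(method_a, method_b):
    """Bland-Altman agreement statistics for paired measurements.

    Returns the mean difference (bias) and the 95% limits of
    agreement (bias +/- 1.96 * SD of the differences).
    """
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical CT vs. MRI ejection-fraction readings (%), for illustration only
ct  = [58.0, 61.5, 55.2, 63.0, 59.8]
mri = [57.1, 62.0, 54.0, 64.2, 60.5]
bias, lower, upper = bland_altman(ct, mri)
```

A small bias with narrow limits of agreement indicates the two modalities can be used interchangeably, which is the question the comparison above addresses.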

  4. Stroke Symbol Generation Software for Fighter Aircraft

    G.K. Tripathi

    2013-03-01

    Full Text Available This paper gives an overview of the stroke symbol generation software developed by Hindustan Aeronautics Limited for fighter aircraft. This paper covers the working principle of head-up-display, overview of target hardware on which the developed software has been integrated and tested, software architecture, hardware software interfaces and design details of stroke symbol generation software. The paper also covers the issues related to stroke symbol quality which were encountered by the design team and the details about how the issues were resolved during integration and test phase.Defence Science Journal, 2013, 63(2, pp.153-156, DOI:http://dx.doi.org/10.14429/dsj.63.4257

  5. TOWARDS TEST CASES GENERATION FROM SOFTWARE SPECIFICATIONS

    R. Jeevarathinam,

    2010-11-01

    Full Text Available Verification and validation of software systems often consumes up to 70% of the development resources. Testing is one of the most frequently used verification and validation techniques for verifying systems. Many agencies that certify software systems for use require that the software be tested to certain specified levels of coverage. Currently, developing test cases to meet these requirements takes a major portion of the resources. Automating this task would result in significant time and cost savings. This testing research is aimed at the generation of such test cases. In the proposed approach, a formal model of the required software behavior (a formal specification) is used for test-case generation and as an oracle to determine whether the implementation produced the correct output during testing. This is referred to as specification-based testing. Specification-based testing offers several advantages over traditional code-based testing: the formal specification can be used as the source artifact to generate functional tests for the final product, and since the test cases are produced at an earlier stage in the software development, they are available before the implementation is completed. Central to this approach is the use of model checkers as test-case generation engines. Model checking is a technique for exploring the reachable state space of a system model to verify properties of interest. Several research challenges must be addressed to realize this test generation approach.
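The core idea of using a model as a test-case generation engine can be sketched in miniature: exhaustively explore a toy state-machine specification, bounded-model-checker style, and emit every input sequence together with the model's final state as the oracle. All names below are hypothetical.

```python
from collections import deque

def generate_tests(transitions, start, max_depth):
    """Breadth-first exhaustive exploration of a state-machine model.

    Returns every input sequence up to max_depth, paired with the
    state the model predicts, which serves as the test oracle.
    """
    tests = []
    frontier = deque([(start, [])])
    while frontier:
        state, trace = frontier.popleft()
        if trace:
            tests.append((tuple(trace), state))  # (inputs, expected state)
        if len(trace) < max_depth:
            for inp, nxt in transitions.get(state, []):
                frontier.append((nxt, trace + [inp]))
    return tests

# Toy specification of a door controller (hypothetical example)
spec = {
    "closed": [("open", "opened")],
    "opened": [("close", "closed"), ("lock", "error")],
}
tests = generate_tests(spec, "closed", 3)
```

Unlike sampling-based testing, this enumeration covers the whole reachable behaviour up to the bound, which is exactly the property the abstract attributes to model checking.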

  6. An Impact Motion Generation Support Software

    Tsujita, Teppei; Konno, Atsushi; Nomura, Yuki; Komizunai, Shunsuke; Ayaz, Yasar; Uchiyama, Masaru

    2010-01-01

    The detail of impact motion generation support software is presented in this paper. The developed software supports impact motion design with OpenHRP or OpenHRP3. A preliminary impact motion experiment is performed by a humanoid robot and the analyses of its result are presented. The analysis reveals that the designed motion is not robust against error in the position of the nail since the timing of pulling up the hammer is defined in the designed motion in advance. Therefore, ...

  7. 2nd Tourism Postdisciplinarity Conference

    2016-01-01

    Following the noted success of the 1st international conference on postdisciplinary approaches to tourism studies (held in Neuchatel, Switzerland, 19-22 June, 2013), we are happy to welcome you to the 2nd Tourism Postdisciplinarity Conference. Postdisciplinarity surpasses the boundaries of disciplinary thinking and opens up the possibility to question the established phenomena – touristic or otherwise – we take for granted. It does not claim that disciplinarity is essentially wrong, but it...

  8. 2nd Tourism Postdisciplinarity Conference

    Following the noted success of the 1st international conference on postdisciplinary approaches to tourism studies (held in Neuchatel, Switzerland, 19-22 June, 2013), we are happy to welcome you to the 2nd Tourism Postdisciplinarity Conference. Postdisciplinarity surpasses the boundaries of...... study less embedded in that system of thought. Postdisciplinarity is an epistemological endeavour that speaks of knowledge production and the ways in which the world of physical and social phenomena can be known. It is also an ontological discourse as it concerns what we call ‘tourism...

  9. Monte Carlo generators in ATLAS software

    This document describes how Monte Carlo (MC) generators can be used in the ATLAS software framework (Athena). The framework is written in C++ and uses Python scripts for job configuration. Monte Carlo generators that provide the four-vectors describing the results of LHC collisions are in general written by third parties and are not part of Athena. These libraries are linked from the LCG Generator Services (GENSER) distribution. Generators are run from within Athena and the generated event output is put into a transient store, in HepMC format, using StoreGate. A common interface, implemented via inheritance from a GeneratorModule class, guarantees common functionality for the basic generation steps. The generator information can be accessed and manipulated by helper packages like TruthHelper. The ATLAS detector simulation can also access the truth information from StoreGate. Steering is done through specific interfaces to allow for flexible configuration using ATLAS Python scripts. Interfaces to most general-purpose generators, including Pythia6, Pythia8, Herwig, Herwig++ and Sherpa, are provided, as well as to more specialized packages, for example Phojet and Cascade. A second type of interface exists for the so-called matrix element generators, which only generate the particles produced in the hard scattering process and write events in the Les Houches event format. A generic interface to pass these events to Pythia6 and Herwig for parton showering and hadronisation has been written.
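The common-interface pattern described above (a GeneratorModule base class fixing the basic generation steps via inheritance) can be illustrated with a small sketch. The Python class below mimics the idea only; it is not the actual Athena C++ API, and all method names are assumptions made for illustration.

```python
from abc import ABC, abstractmethod
import random

class GeneratorModule(ABC):
    """Common interface guaranteeing the basic generation steps
    (names illustrative, not the real Athena interface)."""

    @abstractmethod
    def initialize(self):
        """One-time setup before event generation."""

    @abstractmethod
    def fill_event(self):
        """Return one generated event as a list of four-vectors."""

    def run(self, n_events):
        # The shared driver: every concrete generator is steered the same way.
        self.initialize()
        return [self.fill_event() for _ in range(n_events)]

class ToyGenerator(GeneratorModule):
    """Stand-in for a third-party generator such as Pythia or Herwig."""

    def initialize(self):
        self.rng = random.Random(42)

    def fill_event(self):
        # One massless "particle" four-vector (E, px, py, pz) per event.
        px, py, pz = (self.rng.gauss(0, 1) for _ in range(3))
        e = (px * px + py * py + pz * pz) ** 0.5
        return [(e, px, py, pz)]

events = ToyGenerator().run(5)
```

The framework code only ever sees the base-class interface, which is what lets Athena drive Pythia, Herwig, Sherpa and the rest through one steering mechanism.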

  10. Performance Study of Intergenerational Succession and Strategic Transformation Driven by the 2nd Generation of Family Business

    汪祥耀; 金一禾

    2015-01-01

    This paper divides the intergenerational succession of family businesses into three phases: involvement management, co-management, and takeover management. It examines firm performance in each phase and the effect on performance of strategic transformation driven by the 2nd generation. An empirical study of listed family firms in China's A-share main market over 2010-2012 yields the following conclusions: family firms in which the 2nd generation has entered top management and participates in daily operation and strategic decision-making, or co-manages the firm with the 1st generation, show better performance; the relationship between takeover management and firm performance could not be confirmed, owing to the small number of samples; and strategic transformation driven by the 2nd generation negatively moderates the relationship between co-management and firm performance, so implementing a strategic transformation during the co-management phase of intergenerational succession reduces real performance.

  11. "Me - A Different 2nd Generation of the Wealthy"

    Guo Yan

    2010-01-01

    In recent years, phrases like "G2 of the Wealthy" have been frequently mentioned, and the second generation of wealthy Chinese has been a hot topic of discussion. However, people still have the impression that "G2 of the Wealthy" is a generation that lacks nothing but significance and pursuit, a generation without responsibility and with only meaningless individuality.

  12. Enhanced animal productivity and health with improved manure management in 2nd Generation Environmentally Superior Technology in North Carolina: II. Air quality

    The aim of this study was to evaluate the effects of improved manure management on air quality and the beneficial effect of a cleaner environment on animal productivity and health using a second generation of Environmentally Superior Technology. The second generation system combines solid-liquid sep...

  13. 2nd Historic Mortars Conference

    Hughes, John; Groot, Caspar; Historic Mortars : Characterisation, Assessment and Repair

    2012-01-01

    This volume focuses on research and practical issues connected with mortars on historic structures. The book is divided into four sections: Characterisation of Historic Mortars, Repair Mortars and Design Issues, Experimental Research into Properties of Repair Mortars, and Assessment and Testing. The papers present the latest work of researchers in their field. The individual contributions were selected from the contributions to the 2nd Historic Mortars Conference, which took place in Prague, September 22-24, 2010. All papers were reviewed and improved as necessary before publication. This peer review process by the editors resulted in the 34 individual contributions included here. One extra paper, reviewing and summarising state-of-the-art knowledge covered by this publication, was added as a starting and navigational point for the reader. The editors believe that having these papers in print is important and they hope that it will stimulate further research into historic mortars and related subjects.

  14. Experimental Stochastics (2nd edition)

    Otto Moeschlin and his co-authors have written a book about simulation of stochastic systems. The book comes with a CD-ROM that contains the experiments discussed in the book, and the text from the book is repeated on the CD-ROM. According to the authors, the aim of the book is to give a quick introduction to stochastic simulation for 'all persons interested in experimental stochastics'. To please this diverse audience, the authors offer a book that has four parts. Part 1, called 'Artificial Randomness', is the longest of the four parts. It gives an overview of the generation, testing and basic usage of pseudo random numbers in simulation. Although algorithms for generating sequences of random numbers are fundamental to simulation, it is a slightly unusual choice to give them such weight in comparison to other algorithmic topics. The remaining three parts consist of simulation case studies. Part 2, 'Stochastic Models', treats four problems: Buffon's needle, a queuing system, and two problems related to the kinetic theory of gases. Part 3 is called 'Stochastic Processes' and discusses the simulation of discrete time Markov chains, birth-death processes, Brownian motion and diffusions. The last section of Part 3 is about simulation as a tool to understand the traffic flow in a system controlled by stoplights, an area of research for the authors. Part 4 is called 'Evaluation of Statistical Procedures'. This section contains examples where simulation is used to test the performance of statistical methods. It covers four examples: the Neyman-Pearson lemma, the Wald sequential test, Bayesian point estimation and Hartigan procedures. The CD-ROM contains an easy-to-install software package that runs under Microsoft Windows. The software contains the text and simulations from the book. What I found most enjoyable about this book is the number of topics covered in the case studies. The highly individual selection of applications, which may serve as a source of inspiration
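Of the case studies mentioned, Buffon's needle is the easiest to reproduce; a minimal Monte Carlo version (a sketch, not the code on the book's CD-ROM) looks like this:

```python
import math
import random

def buffon_pi(n_throws, needle=1.0, spacing=2.0, seed=1):
    """Estimate pi via Buffon's needle: a needle of length l thrown
    onto lines spaced d apart (l <= d) crosses a line with
    probability 2*l / (pi*d), so pi ~ 2*l*n / (d*crossings)."""
    rng = random.Random(seed)
    crossings = 0
    for _ in range(n_throws):
        centre = rng.uniform(0.0, spacing / 2)  # distance to nearest line
        angle = rng.uniform(0.0, math.pi / 2)   # needle orientation
        if centre <= (needle / 2) * math.sin(angle):
            crossings += 1
    return 2 * needle * n_throws / (spacing * crossings)

estimate = buffon_pi(200_000)
```

With 200,000 throws the relative error of the estimate is typically well under one percent, which illustrates both the appeal and the slow convergence of such simulations.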

  15. Enhanced animal productivity and health with improved manure management in 2nd Generation Environmentally Superior Technology in North Carolina: I. Water quality

    New legislation in North Carolina promotes the replacement of old lagoon technology with new Environmentally Superior Technology. Scientists at ARS Florence Center and industry cooperators completed design and demonstration of a second generation treatment system for swine waste that can achieve hig...

  16. Next generation lightweight mirror modeling software

    Arnold, William R.; Fitzgerald, Matthew; Rosa, Rubin Jaca; Stahl, H. Philip

    2013-09-01

    The advances in manufacturing techniques for lightweight mirrors, such as EXELSIS deep core low temperature fusion, Corning's continued improvements in the Frit bonding process and the ability to cast large complex designs, combined with water-jet and conventional diamond machining of glasses and ceramics, have created the need for more efficient means of generating finite element models of these structures. Traditional methods of assembling 400,000+ element models can take weeks of effort, severely limiting the range of possible optimization variables. This paper will introduce model generation software developed under NASA sponsorship for the design of both terrestrial and space based mirrors. The software deals with any current mirror manufacturing technique: single substrates, multiple arrays of substrates, as well as the ability to merge submodels into a single large model. The modeler generates both mirror and suspension system elements; suspensions can be created either for each individual petal or for the whole mirror. A typical model generation of 250,000 nodes and 450,000 elements takes only 3-5 minutes, much of that being variable input time. The program can create input decks for ANSYS, ABAQUS and NASTRAN. An archive/retrieval system permits creation of complete trade studies, varying cell size, depth, petal size and suspension geometry, with the ability to recall a particular set of parameters and make small or large changes with ease. The input decks created by the modeler are text files which can be modified by any text editor; all the shell thickness parameters and suspension spring rates are accessible, and comments in the deck identify which groups of elements are associated with these parameters. This again makes optimization easier.
    With ANSYS decks, the nodes representing support attachments are grouped into components; in ABAQUS these are SETS and in NASTRAN GRIDPOINT SETS. This makes integration of these models into large telescope or satellite
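The archive/retrieval idea, regenerating a plain-text input deck from a small set of stored parameters, can be illustrated with a toy generator. The deck format below is invented for illustration and is not actual ANSYS, ABAQUS or NASTRAN syntax.

```python
def make_deck(n_cells_x, n_cells_y, cell_size, shell_thickness):
    """Emit a toy text 'input deck' for a rectangular grid of mirror
    core cells. Comment lines tag each parameter so a later edit or
    optimization pass can locate it, as the abstract describes."""
    lines = [
        "! toy mirror deck (illustrative format, not a real solver deck)",
        f"! PARAM shell_thickness = {shell_thickness}",
    ]
    for j in range(n_cells_y):
        for i in range(n_cells_x):
            x, y = i * cell_size, j * cell_size
            lines.append(f"CELL {i},{j} ORIGIN {x:.3f},{y:.3f} "
                         f"SIZE {cell_size:.3f} THICK {shell_thickness:.3f}")
    return "\n".join(lines)

# Regenerating a whole trade-study variant is one call with new parameters.
deck = make_deck(4, 3, 25.0, 2.5)
```

Because the deck is plain text and parameter-tagged, a trade study is just a loop over parameter sets, which is the workflow the archive/retrieval system supports.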

  17. Next-Generation Lightweight Mirror Modeling Software

    Arnold, William R., Sr.; Fitzgerald, Mathew; Rosa, Rubin Jaca; Stahl, Phil

    2013-01-01

    The advances in manufacturing techniques for lightweight mirrors, such as EXELSIS deep core low temperature fusion, Corning's continued improvements in the Frit bonding process and the ability to cast large complex designs, combined with water-jet and conventional diamond machining of glasses and ceramics, have created the need for more efficient means of generating finite element models of these structures. Traditional methods of assembling 400,000+ element models can take weeks of effort, severely limiting the range of possible optimization variables. This paper will introduce model generation software developed under NASA sponsorship for the design of both terrestrial and space based mirrors. The software deals with any current mirror manufacturing technique: single substrates, multiple arrays of substrates, as well as the ability to merge submodels into a single large model. The modeler generates both mirror and suspension system elements; suspensions can be created either for each individual petal or for the whole mirror. A typical model generation of 250,000 nodes and 450,000 elements takes only 5-10 minutes, much of that being variable input time. The program can create input decks for ANSYS, ABAQUS and NASTRAN. An archive/retrieval system permits creation of complete trade studies, varying cell size, depth, petal size and suspension geometry, with the ability to recall a particular set of parameters and make small or large changes with ease. The input decks created by the modeler are text files which can be modified by any editor; all the key shell thickness parameters are accessible, and comments in the deck identify which groups of elements are associated with these parameters. This again makes optimization easier. With ANSYS decks, the nodes representing support attachments are grouped into components; in ABAQUS these are SETS and in NASTRAN GRIDPOINT SETS. This makes integration of these models into large telescope or satellite models possible.

  18. Next Generation Lightweight Mirror Modeling Software

    Arnold, William R., Sr.; Fitzgerald, Mathew; Rosa, Rubin Jaca; Stahl, H. Philip

    2013-01-01

    The advances in manufacturing techniques for lightweight mirrors, such as EXELSIS deep core low temperature fusion, Corning's continued improvements in the Frit bonding process and the ability to cast large complex designs, combined with water-jet and conventional diamond machining of glasses and ceramics, have created the need for more efficient means of generating finite element models of these structures. Traditional methods of assembling 400,000+ element models can take weeks of effort, severely limiting the range of possible optimization variables. This paper will introduce model generation software developed under NASA sponsorship for the design of both terrestrial and space based mirrors. The software deals with any current mirror manufacturing technique: single substrates, multiple arrays of substrates, as well as the ability to merge submodels into a single large model. The modeler generates both mirror and suspension system elements; suspensions can be created either for each individual petal or for the whole mirror. A typical model generation of 250,000 nodes and 450,000 elements takes only 5-10 minutes, much of that being variable input time. The program can create input decks for ANSYS, ABAQUS and NASTRAN. An archive/retrieval system permits creation of complete trade studies, varying cell size, depth, petal size and suspension geometry, with the ability to recall a particular set of parameters and make small or large changes with ease. The input decks created by the modeler are text files which can be modified by any editor; all the key shell thickness parameters are accessible, and comments in the deck identify which groups of elements are associated with these parameters. This again makes optimization easier. With ANSYS decks, the nodes representing support attachments are grouped into components; in ABAQUS these are SETS and in NASTRAN GRIDPOINT SETS. This makes integration of these models into large telescope or satellite models easier.

  19. Using Test Generating Software for Assessment

    Singh Aurora, Tarlok

    2007-04-01

    Assessment is an important part of teaching and learning. Designing suitable tests and quizzes for assessment is a time-consuming task. With faculty's many commitments at work, it is not always easy to find enough time to design a good test before the test day. Searching for and modifying older tests can take a considerable amount of time. There is a need to develop a customized test bank that one could use to generate a quiz or test quickly before class time or before a test. A number of commercial e-learning software packages have this capability, among them Test Generator, Examview and Test Pro Developer. The application of Examview software in developing a test bank for physics will be presented. A physics test bank, with applications in other disciplines, can be gradually built over time and used to create a test or quiz quickly. Multiple scrambled versions of a single test (and answer sheets) can be created to discourage cheating in a large class setting. The presentation will show how to build a test bank.
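The scrambled-versions workflow is easy to sketch: sample questions from a bank, shuffle question order and answer choices per version, and keep an answer key for each version. The code below is a generic illustration, not the internals of any of the named products.

```python
import random

def make_versions(bank, n_questions, n_versions, seed=0):
    """bank: list of (question, choices, correct_choice) tuples.
    Returns one (paper, answer_key) pair per scrambled version."""
    rng = random.Random(seed)
    versions = []
    for _ in range(n_versions):
        picked = rng.sample(bank, n_questions)  # draw from the bank
        rng.shuffle(picked)                     # scramble question order
        paper, key = [], []
        for question, choices, correct in picked:
            shuffled = rng.sample(choices, len(choices))  # scramble choices
            paper.append((question, shuffled))
            key.append("ABCD"[shuffled.index(correct)])   # record the key
        versions.append((paper, key))
    return versions

# Tiny hypothetical physics bank, for illustration
bank = [
    ("Unit of force?",  ["newton", "joule", "watt", "pascal"], "newton"),
    ("Unit of energy?", ["newton", "joule", "watt", "pascal"], "joule"),
    ("Unit of power?",  ["newton", "joule", "watt", "pascal"], "watt"),
]
versions = make_versions(bank, 2, 3)
```

Each version gets its own answer key, so grading stays mechanical even though no two papers look alike.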

  20. [Implications of TCGA Network Data on 2nd Generation Immunotherapy Concepts Based on PD-L1 and PD-1 Target Structures].

    Peters, I; Tezval, H; Kramer, M W; Wolters, M; Grünwald, V; Kuczyk, M A; Serth, J

    2015-11-01

    The era of cytokines, given to patients with metastatic renal cell carcinoma (mRCC) as part of an unspecific immunomodulatory treatment concept, seems to have ended with the introduction of targeted therapies. However, preliminary data from studies on treatment with checkpoint inhibitors (e.g. anti-PD-1 and anti-PD-L1) may point the way to second-generation immunotherapy. The rationale of such immunomodulatory treatment is to stop or interrupt the tumour from "escaping" the body's immune defence. Thompson et al. report that increased protein expression of PD-L1 (CD274/B7-H1) in tumour cells and tumour-infiltrating immune cells (TILs; lymphocytes and histiocytes) is associated with unfavourable clinical pathological parameters as well as poor survival. In small pilot groups of mRCC patients it was found that increased PD-L1 protein expression in tumours and TILs may be correlated with the objective response to anti-PD-1 treatment. Sometimes, however, a very wide variety of response rates was observed, which raises the question of whether this can be explained by individual expression levels of PD-L1 (CD274) or PD-1 (PDCD1). Recently published data from the Cancer Genome Atlas (TCGA) Kidney Renal Clear Cell Carcinoma (KIRC) Network now provide a genome-wide database that allows us to review or validate the molecular results obtained in clear cell renal cell carcinomas (ccRCC) to date. In this study, we analysed the TCGA KIRC mRNA expression data for PD-L1 and PD-1 for a possible association with clinical pathological parameters and the survival of 417 ccRCC patients. The mRNA expression of PD-L1 in primary nephrectomy specimens revealed no significant association with unfavourable clinical parameters. Interestingly, though, a positive correlation with patient survival was found (HR = 0.59, p = 0.006). These results, which partly contradict the concept applied to date, point out the necessity to ascertain the characteristics of PD-L1 and PD-1 expression at mRNA and protein

  1. Techno-economic evaluation of 2nd generation bioethanol production from sugar cane bagasse and leaves integrated with the sugar-based ethanol process

    Macrelli Stefano

    2012-04-01

    Full Text Available Abstract Background Bioethanol produced from the lignocellulosic fractions of sugar cane (bagasse and leaves), i.e. second generation (2G) bioethanol, has a promising market potential as an automotive fuel; however, the process is still under investigation on pilot/demonstration scale. From a process perspective, improvements in plant design can lower the production cost, providing better profitability and competitiveness if the conversion of the whole sugar cane is considered. Simulations have been performed with AspenPlus to investigate how process integration can affect the minimum ethanol selling price of this 2G process (MESP-2G), as well as improve the plant energy efficiency. This is achieved by integrating the well-established sucrose-to-bioethanol process with the enzymatic process for lignocellulosic materials. Bagasse and leaves were steam pretreated using H3PO4 as catalyst and separately hydrolysed and fermented. Results The addition of a steam dryer, doubling of the enzyme dosage in enzymatic hydrolysis, inclusion of leaves as raw material in the 2G process, heat integration and the use of more energy-efficient equipment led to a 37% reduction in MESP-2G compared to the Base case. Modelling showed that the MESP for 2G ethanol was 0.97 US$/L, while in the future it could be reduced to 0.78 US$/L. In this case the overall production cost of 1G + 2G ethanol would be about 0.40 US$/L with an output of 102 L/ton dry sugar cane including 50% leaves. Sensitivity analysis of the future scenario showed that a 50% decrease in the cost of enzymes, electricity or leaves would lower the MESP-2G by about 20%, 10% and 4.5%, respectively. Conclusions According to the simulations, the production of 2G bioethanol from sugar cane bagasse and leaves in Brazil is already competitive (without subsidies) with 1G starch-based bioethanol production in Europe.
Moreover 2G bioethanol could be produced at a lower cost if subsidies were used to compensate for the

  2. A Practical GLR Parser Generator for Software Reverse Engineering

    Teng Geng

    2014-03-01

    Full Text Available Traditional parser generators use deterministic parsing methods. These methods cannot meet the parsing requirements of software reverse engineering effectively. A new parser generator is presented which can generate a GLR parser with automatic error recovery. The generated GLR parser has parsing speed comparable with the traditional LALR(1) parser and can be used for parsing in software reverse engineering.

  3. A Practical GLR Parser Generator for Software Reverse Engineering

    Teng Geng; Fu Xu; Han Mei; Wei Meng; Zhibo Chen; Changqing Lai

    2014-01-01

    Traditional parser generators use deterministic parsing methods. These methods cannot meet the parsing requirements of software reverse engineering effectively. A new parser generator is presented which can generate a GLR parser with automatic error recovery. The generated GLR parser has parsing speed comparable with the traditional LALR(1) parser and can be used for parsing in software reverse engineering.
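The essential difference between a deterministic LALR(1) parser and a GLR parser shows up on ambiguous grammars: where LALR(1) hits a shift/reduce conflict and must commit to one action, GLR forks its parse stack and finds every parse. The sketch below (illustrative only, not the paper's generator) counts the parses such an exhaustive strategy would find for the toy ambiguous grammar E -> E '+' E | 'n'.

```python
from functools import lru_cache

def count_parses(tokens):
    """Count distinct parse trees of the ambiguous grammar
        E -> E '+' E | 'n'
    by trying every split point, mimicking the exhaustive
    exploration a GLR parser performs when it forks on a conflict."""
    toks = tuple(tokens)

    @lru_cache(maxsize=None)
    def count(i, j):
        # Parses of the token span toks[i:j] as a single E.
        if j - i == 1:
            return 1 if toks[i] == "n" else 0
        total = 0
        for k in range(i + 1, j - 1):   # candidate '+' splitting E + E
            if toks[k] == "+":
                total += count(i, k) * count(k + 1, j)
        return total

    return count(0, len(toks))

n_trees = count_parses(list("n+n+n"))  # (n+n)+n and n+(n+n)
```

"n+n+n" has two parses; a deterministic parser would report exactly one (or a conflict), which is why ambiguous legacy-language constructs in reverse engineering call for generalized parsing.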

  4. A 2nd generation cosmic axion experiment

    Hagmann, C; Stoeffl, W.; Van Bibber, K.; Daw, E.; Kinion, D.; Rosenberg, L; Sikivie, P.; Sullivan, N.; D. Tanner; Moltz, D.; Nezrick, F.; Turner, M; Golubev, N.; Kravchuk, L.

    1995-01-01

    An experiment is described to detect dark matter axions trapped in the halo of our galaxy. Galactic axions are converted into microwave photons via the Primakoff effect in a static background field provided by a superconducting magnet. The photons are collected in a high Q microwave cavity and detected by a low noise receiver. The axion mass range accessible by this experiment is 1.3-13 micro-eV. The expected sensitivity will be roughly 50 times greater than achieved by previous experiments i...

  5. A 2nd generation cosmic axion experiment

    Hagmann, C A; Van Bibber, K; Daw, E J; Kinion, D S; Rosenberg, L J; Sikivie, P; Sullivan, N; Tanner, D B; Moltz, D M; Nezrick, F A; Turner, M; Golubev, N A; Kravchuk, L V

    1995-01-01

    An experiment is described to detect dark matter axions trapped in the halo of our galaxy. Galactic axions are converted into microwave photons via the Primakoff effect in a static background field provided by a superconducting magnet. The photons are collected in a high Q microwave cavity and detected by a low noise receiver. The axion mass range accessible by this experiment is 1.3-13 micro-eV. The expected sensitivity will be roughly 50 times greater than achieved by previous experiments in this mass range. The assembly of the detector is well under way at LLNL and data taking will start in mid-1995.

  6. Automatic Testcase Generation for Flight Software

    Bushnell, David Henry; Pasareanu, Corina; Mackey, Ryan M.

    2008-01-01

    The TacSat3 project is applying Integrated Systems Health Management (ISHM) technologies to an Air Force spacecraft for operational evaluation in space. The experiment will demonstrate the effectiveness and cost of ISHM and vehicle systems management (VSM) technologies through onboard operation for extended periods. We present two approaches to automatic testcase generation for ISHM: 1) a blackbox approach that views the system as a blackbox and uses a grammar-based specification of the system's inputs to automatically generate *all* inputs that satisfy the specifications (up to prespecified limits); these inputs are then used to exercise the system; 2) a whitebox approach that performs analysis and testcase generation directly on a representation of the internal behaviour of the system under test. The enabling technologies for both approaches are model checking and symbolic execution, as implemented in NASA Ames' Java PathFinder (JPF) tool suite. Model checking is an automated technique for software verification. Unlike simulation and testing, which check only some of the system executions and therefore may miss errors, model checking exhaustively explores all possible executions. Symbolic execution evaluates programs with symbolic rather than concrete values and represents variable values as symbolic expressions. We are applying the blackbox approach to generating input scripts for the Spacecraft Command Language (SCL) from Interface and Control Systems. SCL is an embedded interpreter for controlling spacecraft systems. TacSat3 will be using SCL as the controller for its ISHM systems. We translated the SCL grammar into a program that outputs scripts conforming to the grammar. Running JPF on this program generates all legal input scripts up to a prespecified size. Script generation can also be targeted to specific parts of the grammar of interest to the developers. These scripts are then fed to the SCL Executive.
ICS's in-house coverage tools will be run to
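A toy version of the blackbox generation step can be sketched as follows; the grammar below is an invented stand-in for the real SCL grammar, and the depth bound plays the role of JPF's prespecified size limit. Enumerating exhaustively up to the bound, rather than sampling, is what distinguishes this approach from random testing.

```python
# Exhaustively enumerate every script derivable from a toy grammar
# up to a maximum derivation depth (a stand-in for the SCL grammar).
GRAMMAR = {
    "<script>": [["<cmd>"], ["<cmd>", ";", "<script>"]],
    "<cmd>": [["set", "<var>", "<val>"], ["get", "<var>"]],
    "<var>": [["x"], ["y"]],
    "<val>": [["0"], ["1"]],
}

def expand(symbol, depth):
    """Yield every terminal string derivable from `symbol` within `depth` steps."""
    if symbol not in GRAMMAR:          # terminal symbol
        yield symbol
        return
    if depth == 0:                     # bound the search, as JPF bounds script size
        return
    for production in GRAMMAR[symbol]:
        # Cartesian product of the expansions of each symbol in the production
        partials = [""]
        for sym in production:
            partials = [p + (" " if p else "") + s
                        for p in partials
                        for s in expand(sym, depth - 1)]
        yield from partials

scripts = sorted(set(expand("<script>", 3)))
```

Raising the depth bound admits longer compound scripts (`set x 0 ; get y ; ...`), exactly as raising JPF's size limit would.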

  7. Generating Protocol Software from CPN Models Annotated with Pragmatics

    Simonsen, Kent Inge; Kristensen, Lars M.; Kindler, Ekkart

    Model-driven software engineering (MDSE) provides a foundation for automatically generating software based on models that focus on the problem domain while abstracting from the details of underlying implementation platforms. Coloured Petri Nets (CPNs) have been widely used to formally model and...... verify protocol software, but limited work exists on using CPN models of protocols as a basis for automated code generation. The contribution of this paper is a method for generating protocol software from a class of CPN models annotated with code generation pragmatics. Our code generation method...

  8. Automatic program generation: future of software engineering

    Robinson, J.H.

    1979-01-01

    At this moment software development is still more of an art than an engineering discipline. Each piece of software is lovingly engineered, nurtured, and presented to the world as a tribute to the writer's skill. When will this change? When will the craftsmanship be removed and the programs be turned out like so many automobiles from an assembly line? Sooner or later it will happen: economic necessities will demand it. With the advent of cheap microcomputers and ever more powerful supercomputers doubling in capacity, much more software must be produced. The choices are to double the number of programmers, double the efficiency of each programmer, or find a way to produce the needed software automatically. Producing software automatically is the only logical choice. How will automatic programming come about? Some of the preliminary actions which need to be done, and are being done, are to encourage programmer plagiarism of existing software through public library mechanisms, produce well-understood packages such as compilers automatically, develop languages capable of producing software as output, and learn enough about the whole process of programming to be able to automate it. Clearly, the emphasis must not be on efficiency or size, since ever larger and faster hardware is coming.

  9. Sistema especialista de 2ª geração para diagnose técnica: modelo e procedimento 2nd generation expert system for technical diagnosis: a model and a procedure

    Néocles Alves Pereira

    1994-04-01

    This paper deals with the diagnosis of industrial equipment through the use of Expert Systems. Intending to develop procedures that result in diagnosis knowledge bases for Industrial Maintenance, we have considered 2nd Generation Expert Systems. We propose a modified model and a diagnosis procedure. The diagnosis strategy uses a top-down best-first search that combines two types of uncertainty treatment: (i) entropy, to find the best path through the knowledge structures, and (ii) belief in the symptoms, to validate the resulting diagnoses. This proposal has the following advantages: a more complete knowledge base, and better explanation and presentation of the final diagnoses. We developed a prototype based on real information about centrifugal pumps.
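The entropy-driven step of such a best-first diagnosis strategy can be sketched as follows; the pump faults, prior beliefs, and symptom table are invented for illustration and are not from the paper. The idea is to check first the symptom whose observation is expected to leave the least remaining uncertainty over the candidate faults.

```python
import math

# Hypothetical candidate faults with prior beliefs, and which symptoms
# each fault exhibits (illustrative values, not from the paper).
FAULTS = {"worn_bearing": 0.4, "cavitation": 0.35, "seal_leak": 0.25}
EXHIBITS = {
    "vibration":  {"worn_bearing": True,  "cavitation": True,  "seal_leak": False},
    "noise":      {"worn_bearing": False, "cavitation": True,  "seal_leak": False},
    "fluid_loss": {"worn_bearing": False, "cavitation": False, "seal_leak": True},
}

def entropy(probs):
    """Shannon entropy of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def expected_entropy(symptom):
    """Expected entropy of the fault distribution after observing `symptom`."""
    result = 0.0
    for outcome in (True, False):
        group = {f: p for f, p in FAULTS.items() if EXHIBITS[symptom][f] == outcome}
        mass = sum(group.values())
        if mass > 0:
            result += mass * entropy([p / mass for p in group.values()])
    return result

# Best-first: the symptom that most reduces uncertainty is checked first.
best = min(EXHIBITS, key=expected_entropy)
```

After observing the chosen symptom, the beliefs would be renormalized and the same criterion applied again, descending the knowledge structure.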

  10. A Practical Evaluation of Next Generation Sequencing & Molecular Cloning Software

    Meintjes, Peter; Qaadri, Kashef; Olsen, Christian

    2013-01-01

    Laboratories using Next Generation Sequencing (NGS) technologies and/or high-throughput molecular cloning experiments can spend a significant amount of their research budget on data analysis and data management. The decision to develop in-house software, to rely on combinations of free software packages, or to purchase commercial software can significantly affect productivity and ROI. In this talk, we will describe a practical software evaluation process that was developed to assist core fac...

  11. Automatic generation of hardware/software interfaces

    King, Myron Decker; Dave, Nirav H.; Mithal, Arvind

    2012-01-01

    Enabling new applications for mobile devices often requires the use of specialized hardware to reduce power consumption. Because of time-to-market pressure, current design methodologies for embedded applications require an early partitioning of the design, allowing the hardware and software to be developed simultaneously, each adhering to a rigid interface contract. This approach is problematic for two reasons: (1) a detailed hardware-software interface is difficult to specify until one is de...

  12. Improved Ant Algorithms for Software Testing Cases Generation

    Shunkun Yang; Tianlong Man; Jiaqi Xu

    2014-01-01

    Existing ant colony optimization (ACO) for software testing cases generation is a very popular domain in software testing engineering. However, the traditional ACO has flaws: early search pheromone is relatively scarce, search efficiency is low, the search model is too simple, and the positive feedback mechanism easily produces the phenomenon of stagnation and precocity. This paper introduces improved ACO for software testing cases generation: improved local pheromone update strategy for ant colony...

  13. Search-based software test data generation using evolutionary computation

    Maragathavalli, P.

    2011-01-01

    Search-based Software Engineering has been utilized for a number of software engineering activities. One area where Search-Based Software Engineering has seen much application is test data generation. Evolutionary testing designates the use of metaheuristic search methods for test case generation. The search space is the input domain of the test object, with each individual, or potential solution, being an encoded set of inputs to that test object. The fitness function is tailored to find...

  14. Next-generation business intelligence software with Silverlight 3

    Czernicki, Bart

    2010-01-01

    Business Intelligence (BI) software is the code and tools that allow you to view different components of a business using a single visual platform, making comprehending mountains of data easier. Applications that include reports, analytics, statistics, and historical and predictive modeling are all examples of BI applications. Currently, we are in the second generation of BI software, called BI 2.0. This generation is focused on writing BI software that is predictive, adaptive, simple, and interactive. As computers and software have evolved, more data can be presented to end users with increas

  15. Creating the next generation control system software

    A new 1980s-style support package for future accelerator control systems is proposed. It provides a way to create accelerator applications software without traditional programming. Visual Interactive Applications (VIA) is designed to meet the needs of expanded accelerator complexes in a more cost-effective way than past experience with procedural languages, by using technology from the personal computer and artificial intelligence communities. 4 refs

  16. Using DSL for Automatic Generation of Software Connectors

    Bureš, Tomáš; Malohlava, M.; Hnětynka, P.

    Los Alamitos: IEEE Computer Society, 2008, s. 138-147. ISBN 978-0-7695-3091-8. [ICCBSS 2008. International Conference on Composition-Based Software Systems /7./. Madrid (ES), 25.02.2008-29.02.2008,] R&D Projects: GA AV ČR 1ET400300504 Institutional research plan: CEZ:AV0Z10300504 Keywords : component based systems * software connectors * code generation * domain-specific languages Subject RIV: JC - Computer Hardware ; Software

  17. Abstracts: 2nd interventional MRI symposium

    Anon.

    1997-09-01

    Main topics of the 2nd interventional MRI symposium were: MR compatibility and pulse sequences; MR thermometry, biopsy, musculoskeletal system; laser-induced interstitial thermotherapy, radiofrequency ablations; intraoperative MR; vascular applications, breast, endoscopy; focused ultrasound, cryotherapy, perspectives; poster session with 34 posters described. (AJ)

  18. Abstracts: 2nd interventional MRI symposium

    Main topics of the 2nd interventional MRI symposium were: MR compatibility and pulse sequences; MR thermometry, biopsy, musculoskeletal system; laser-induced interstitial thermotherapy, radiofrequency ablations; intraoperative MR; vascular applications, breast, endoscopy; focused ultrasound, cryotherapy, perspectives; poster session with 34 posters described. (AJ)

  19. Generating Protocol Software from CPN Models Annotated with Pragmatics

    Simonsen, Kent Inge; Kristensen, Lars M.; Kindler, Ekkart

    2013-01-01

    verify protocol software, but limited work exists on using CPN models of protocols as a basis for automated code generation. The contribution of this paper is a method for generating protocol software from a class of CPN models annotated with code generation pragmatics. Our code generation method...... consists of three main steps: automatically adding so-called derived pragmatics to the CPN model, computing an abstract template tree, which associates pragmatics with code templates, and applying the templates to generate code which can then be compiled. We illustrate our method using a unidirectional...

  20. Automating Traceability for Generated Software Artifacts

    Richardson, Julian; Green, Jeffrey

    2004-01-01

    Program synthesis automatically derives programs from specifications of their behavior. One advantage of program synthesis, as opposed to manual coding, is that there is a direct link between the specification and the derived program. This link is, however, not very fine-grained: it can be best characterized as Program is-derived-from Specification. When the generated program needs to be understood or modified, more fine-grained linking is useful. In this paper, we present a novel technique for automatically deriving traceability relations between parts of a specification and parts of the synthesized program. The technique is very lightweight and works -- with varying degrees of success -- for any process in which one artifact is automatically derived from another. We illustrate the generality of the technique by applying it to two kinds of automatic generation: synthesis of Kalman Filter programs from specifications using the AutoFilter program synthesis system, and generation of assembly language programs from C source code using the GCC C compiler. We evaluate the effectiveness of the technique in the latter application.

  1. Generating DEM from LIDAR data - comparison of available software tools

    Korzeniowska, K.; Lacka, M.

    2011-12-01

    In recent years many software tools and applications have appeared that offer procedures, scripts and algorithms to process and visualize ALS data. This variety of software tools and of "point cloud" processing methods contributed to the aim of this study: to assess algorithms available in various software tools that are used to classify LIDAR "point cloud" data, through a careful examination of Digital Elevation Models (DEMs) generated from LIDAR data on the basis of these algorithms. The work focused on the most important available software tools, both commercial and open source. Two sites in a mountain area were selected for the study; the area of each site is 0.645 sq km. DEMs generated with the analysed software tools were compared with a reference dataset, generated using manual methods to eliminate non-ground points. Surfaces were analysed using raster analysis. Minimum, maximum and mean differences between the reference DEM and the DEMs generated with the analysed software tools were calculated, together with the Root Mean Square Error. Differences between DEMs were also examined visually using transects along the grid axes in the test sites.
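The raster comparison described above can be sketched as follows; the 2x2 elevation grids are toy values standing in for the study's DEM rasters, which share a common grid.

```python
# Compare a generated DEM against a reference DEM on the same grid:
# minimum, maximum and mean difference plus RMSE, as in raster analysis.
def dem_stats(reference, generated):
    diffs = [g - r
             for ref_row, gen_row in zip(reference, generated)
             for r, g in zip(ref_row, gen_row)]
    n = len(diffs)
    mean = sum(diffs) / n
    rmse = (sum(d * d for d in diffs) / n) ** 0.5
    return {"min": min(diffs), "max": max(diffs), "mean": mean, "rmse": rmse}

# Toy 2x2 grids (elevations in metres); real DEMs are large rasters.
reference = [[100.0, 101.0], [102.0, 103.0]]
generated = [[100.5, 100.5], [102.0, 104.0]]
stats = dem_stats(reference, generated)
```

The mean difference reveals a systematic bias of the classification algorithm, while the RMSE and the min/max spread capture how far individual cells stray from the reference surface.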

  2. Microcomputers: Instrument Generation Software. Evaluation Guides. Guide Number 11.

    Gray, Peter J.

    Designed to assist evaluators in selecting the appropriate software for the generation of various data collection instruments, this guide discusses such key program characteristics as text entry, item storage and manipulation, item retrieval, and printing. Some characteristics of a good instrument generation program are discussed; these include…

  3. Radioisotope thermoelectric generator transportation system subsystem 143 software development plan

    This plan describes the activities to be performed and the controls to be applied to the process of specifying, developing, and qualifying the data acquisition software for the Radioisotope Thermoelectric Generator (RTG) Transportation System Subsystem 143 Instrumentation and Data Acquisition System (IDAS). This plan will serve as a software quality assurance plan, a verification and validation (V and V) plan, and a configuration management plan

  4. Software Surface Modeling and Grid Generation Steering Committee

    Smith, Robert E. (Editor)

    1992-01-01

    It is a NASA objective to promote improvements in the capability and efficiency of computational fluid dynamics. Grid generation, the creation of a discrete representation of the solution domain, is an essential part of computational fluid dynamics. However, grid generation about complex boundaries requires sophisticated surface-model descriptions of the boundaries. The surface modeling and the associated computation of surface grids consume an extremely large percentage of the total time required for volume grid generation. Efficient and user friendly software systems for surface modeling and grid generation are critical for computational fluid dynamics to reach its potential. The papers presented here represent the state-of-the-art in software systems for surface modeling and grid generation. Several papers describe improved techniques for grid generation.

  5. Development of the software generation method using model driven software engineering tool

    Jang, H. S.; Jeong, J. C.; Kim, J. H.; Han, H. W.; Kim, D. Y.; Jang, Y. W. [KOPEC, Taejon (Korea, Republic of); Moon, W. S. [NEXTech Inc., Seoul (Korea, Republic of)

    2003-10-01

    Methodologies to generate automated software design specifications and source code for nuclear I and C systems software using a model-driven language are developed in this work. For qualitative analysis of the algorithm, the activity diagram is modeled and generated using the Unified Modeling Language (UML), and then the sequence diagram is designed for automated source code generation. For validation of the generated code, code audits and module tests are performed using a test and QA tool. The code coverage and complexity of the example code are examined at this stage. The low pressure pressurizer reactor trip module of the Plant Protection System was programmed as the subject for this task. The test results show that errors were easily detected in the generated source code using the test tool. The accuracy of input/output processing by the execution modules was clearly identified.

  6. Computer Software Configuration Item-Specific Flight Software Image Transfer Script Generator

    Bolen, Kenny; Greenlaw, Ronald

    2010-01-01

    A K-shell UNIX script enables the International Space Station (ISS) Flight Control Team (FCT) operators in NASA s Mission Control Center (MCC) in Houston to transfer an entire or partial computer software configuration item (CSCI) from a flight software compact disk (CD) to the onboard Portable Computer System (PCS). The tool is designed to read the content stored on a flight software CD and generate individual CSCI transfer scripts that are capable of transferring the flight software content in a given subdirectory on the CD to the scratch directory on the PCS. The flight control team can then transfer the flight software from the PCS scratch directory to the Electronically Erasable Programmable Read Only Memory (EEPROM) of an ISS Multiplexer/ Demultiplexer (MDM) via the Indirect File Transfer capability. The individual CSCI scripts and the CSCI Specific Flight Software Image Transfer Script Generator (CFITSG), when executed a second time, will remove all components from their original execution. The tool will identify errors in the transfer process and create logs of the transferred software for the purposes of configuration management.

  7. Search-Based Software Test Data Generation Using Evolutionary Computation

    P. Maragathavalli

    2011-02-01

    Search-based Software Engineering has been utilized for a number of software engineering activities. One area where Search-Based Software Engineering has seen much application is test data generation. Evolutionary testing designates the use of metaheuristic search methods for test case generation. The search space is the input domain of the test object, with each individual, or potential solution, being an encoded set of inputs to that test object. The fitness function is tailored to find test data for the type of test that is being undertaken. Evolutionary Testing (ET) uses optimizing search techniques such as evolutionary algorithms to generate test data. The effectiveness of the GA-based testing system is compared with a Random testing system. For simple programs both testing systems work fine, but as the complexity of the program or the complexity of the input domain grows, the GA-based testing system significantly outperforms Random testing.
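The GA-based scheme can be sketched as follows; the program under test, the condition-counting fitness function, and all parameters are illustrative assumptions rather than details from the paper. The fitness rewards inputs that satisfy each condition of a hard-to-reach branch, so partial solutions can be recombined by crossover.

```python
import random

# Toy program under test: the "hard" branch needs two conditions at once.
def branch(x, y):
    if x > 1000 and y % 5 == 0:
        return "hard"
    return "easy"

# Fitness counts how many of the hard branch's conditions an input meets,
# guiding the search toward the branch (a crude branch-distance surrogate).
def fitness(ind):
    x, y = ind
    return (1 if x > 1000 else 0) + (1 if y % 5 == 0 else 0)

def evolve(pop_size=20, generations=30, seed=1):
    rng = random.Random(seed)
    pop = [(rng.randint(0, 2000), rng.randint(0, 2000)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]          # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            (x1, _), (_, y2) = rng.sample(parents, 2)
            child = (x1, y2)                    # crossover: mix coordinates
            if rng.random() < 0.3:              # mutation: small perturbation
                child = (child[0] + rng.randint(-50, 50),
                         child[1] + rng.randint(-50, 50))
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
```

A random tester would simply draw fresh `(x, y)` pairs each trial; as the conjunction of conditions gets rarer, the guided recombination above wins by ever larger margins, which is the paper's central comparison.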

  8. A rule-based software test data generator

    Deason, William H.; Brown, David B.; Chang, Kai-Hsiung; Cross, James H., II

    1991-01-01

    Rule-based software test data generation is proposed as an alternative to either path/predicate analysis or random data generation. A prototype rule-based test data generator for Ada programs is constructed and compared to a random test data generator. Four Ada procedures are used in the comparison. Approximately 2000 rule-based test cases and 100,000 randomly generated test cases are automatically generated and executed. The success of the two methods is compared using standard coverage metrics. Simple statistical tests showing that even the primitive rule-based test data generation prototype is significantly better than random data generation are performed. This result demonstrates that rule-based test data generation is feasible and shows great promise in assisting test engineers, especially when the rule base is developed further.

  9. Evaluation of the efficiency and fault density of software generated by code generators

    Schreur, Barbara

    1993-01-01

    Flight computers and flight software are used for GN&C (guidance, navigation, and control), engine controllers, and avionics during missions. The software development requires the generation of a considerable amount of code. The engineers who generate the code make mistakes, and the generation of a large body of code with high reliability requires considerable time. Computer-aided software engineering (CASE) tools are available which generate code automatically from inputs through graphical interfaces. These tools are referred to as code generators. In theory, code generators could write highly reliable code quickly and inexpensively. The various code generators offer different levels of reliability checking. Some check only the finished product, while some allow checking of individual modules and combined sets of modules as well. Considering NASA's requirement for reliability, an in-house, manually generated code is needed for comparison. Furthermore, automatically generated code is reputed to be as efficient as the best manually generated code when executed. In-house verification is warranted.

  10. A code generation framework for the ALMA common software

    Troncoso, Nicolás; von Brand, Horst H.; Ibsen, Jorge; Mora, Matias; Gonzalez, Victor; Chiozzi, Gianluca; Jeram, Bogdan; Sommer, Heiko; Zamora, Gabriel; Tejeda, Alexis

    2010-07-01

    Code generation helps in smoothing the learning curve of a complex application framework and in reducing the number of Lines Of Code (LOC) that a developer needs to craft. The ALMA Common Software (ACS) has adopted code generation in specific areas, but we are now exploiting the more comprehensive approach of Model Driven code generation to transform a UML Model directly into a full implementation in the ACS framework. This approach makes it easier for newcomers to grasp the principles of the framework. Moreover, a lower handcrafted LOC reduces the error rate. Additional benefits achieved by model driven code generation are: software reuse, implicit application of design patterns and automatic test generation. A model driven approach to design also makes it possible to use the same model with different frameworks, by generating for different targets. The generation framework presented in this paper uses openArchitectureWare as the model-to-text translator. OpenArchitectureWare provides a powerful functional language that makes it easier to implement the correct mapping of data types, the main difficulty encountered in the translation process. The output is an ACS application readily usable by the developer, including the necessary deployment configuration, thus minimizing any configuration burden during testing. The specific application code is implemented by extending generated classes. Therefore, generated and manually crafted code are kept apart, simplifying the code generation process and aiding the developers by keeping a clean logical separation between the two. Our first results show that code generation dramatically improves code productivity.
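The model-to-text idea and the generated/handcrafted separation can be sketched in miniature as follows; the `Lamp` model, the accessor template, and the base-class naming convention are all invented for illustration and are not the ACS or openArchitectureWare mechanisms.

```python
# A declarative model (stand-in for a UML model) of one component.
MODEL = {"component": "Lamp",
         "properties": [("brightness", "int"), ("state", "str")]}

def generate(model):
    """Translate the model into Python source for a generated base class."""
    lines = [f"class {model['component']}Base:",
             '    """GENERATED - regenerate from the model; do not edit by hand."""']
    for prop, typ in model["properties"]:
        lines.append(f"    def get_{prop}(self):")
        lines.append(f"        return self._{prop}")
        lines.append(f"    def set_{prop}(self, value):")
        lines.append(f"        assert isinstance(value, {typ})")
        lines.append(f"        self._{prop} = value")
    return "\n".join(lines)

source = generate(MODEL)
namespace = {}
exec(source, namespace)          # compile the generated source

# Hand-crafted code extends the generated class, mirroring the clean
# logical separation between generated and manually written code.
class Lamp(namespace["LampBase"]):
    def toggle(self):
        self.set_state("on" if self.get_state() == "off" else "off")

lamp = Lamp()
lamp.set_state("off")
lamp.toggle()
```

Because the specific code lives only in the subclass, the base class can be regenerated from an updated model without touching any handwritten logic, which is the maintenance benefit the abstract describes.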

  11. Improved ant algorithms for software testing cases generation.

    Yang, Shunkun; Man, Tianlong; Xu, Jiaqi

    2014-01-01

    Existing ant colony optimization (ACO) for software testing cases generation is a very popular domain in software testing engineering. However, the traditional ACO has flaws: early search pheromone is relatively scarce, search efficiency is low, the search model is too simple, and the positive feedback mechanism easily produces the phenomenon of stagnation and precocity. This paper introduces improved ACO for software testing cases generation: an improved local pheromone update strategy for ant colony optimization, an improved pheromone volatilization coefficient for ant colony optimization (IPVACO), and an improved global path pheromone update strategy for ant colony optimization (IGPACO). Finally, we put forward a comprehensively improved ant colony optimization (ACIACO), which is based on all three of the above methods. The proposed technique is compared with a random algorithm (RND) and a genetic algorithm (GA) in terms of both efficiency and coverage. The results indicate that the improved method can effectively improve search efficiency, restrain precocity, promote case coverage, and reduce the number of iterations. PMID:24883391
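A generic sketch of ACO-driven test case generation follows; the control-flow graph, the pheromone floor, and all parameters are illustrative assumptions, and the paper's specific IPVACO/IGPACO update rules are not reproduced. The floor on evaporated pheromone is one simple way to restrain the stagnation that pure positive feedback causes.

```python
import random

# Branch coverage targets: every edge of a small control-flow graph.
GRAPH = {"start": ["a", "b"], "a": ["end"], "b": ["c", "end"],
         "c": ["end"], "end": []}
ALL_EDGES = {(u, v) for u, outs in GRAPH.items() for v in outs}

def ant_walk(pheromone, rng):
    """One ant builds a path from entry to exit, biased by pheromone."""
    node, path = "start", []
    while GRAPH[node]:
        weights = [pheromone[(node, v)] for v in GRAPH[node]]
        nxt = rng.choices(GRAPH[node], weights=weights)[0]
        path.append((node, nxt))
        node = nxt
    return path

def aco_cover(iterations=40, ants=5, rho=0.3, floor=0.2, seed=0):
    rng = random.Random(seed)
    pheromone = {e: 1.0 for e in ALL_EDGES}
    covered = set()
    for _ in range(iterations):
        for _ in range(ants):
            path = ant_walk(pheromone, rng)
            newly = [e for e in path if e not in covered]
            covered.update(newly)
            # Evaporation with a floor: unexplored branches stay reachable,
            # restraining stagnation; the deposit rewards new coverage.
            for e in pheromone:
                pheromone[e] = max(pheromone[e] * (1 - rho), floor)
            for e in path:
                pheromone[e] += 1.0 + len(newly)
        if covered == ALL_EDGES:
            break
    return covered

covered = aco_cover()
```

Each completed walk corresponds to one test case; the loop stops as soon as every branch of the graph has been exercised.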

  12. 2nd International Conference on Mobile and Wireless Technology

    Wattanapongsakorn, Naruemon

    2015-01-01

    This book provides a snapshot of the current state-of-the-art in the fields of mobile and wireless technology, security and applications.  The proceedings of the 2nd International Conference on Mobile and Wireless Technology (ICMWT2015), it represents the outcome of a unique platform for researchers and practitioners from academia and industry to share cutting-edge developments in the field of mobile and wireless science technology, including those working on data management and mobile security.   The contributions presented here describe the latest academic and industrial research from the international mobile and wireless community.  The scope covers four major topical areas: mobile and wireless networks and applications; security in mobile and wireless technology; mobile data management and applications; and mobile software.  The book will be a valuable reference for current researchers in academia and industry, and a useful resource for graduate-level students working on mobile and wireless technology...

  13. Computer aided power flow software engineering and code generation

    Bacher, R. [Swiss Federal Inst. of Tech., Zuerich (Switzerland)

    1995-12-31

    In this paper a software engineering concept is described which permits the automatic solution of a non-linear set of network equations. The power flow equation set can be seen as a defined subset of a network equation set. The automated solution process is the numerical Newton-Raphson solution process of the power flow equations where the key code parts are the numeric mismatch and the numeric Jacobian term computation. It is shown that both the Jacobian and the mismatch term source code can be automatically generated in a conventional language such as Fortran or C. Thereby one starts from a high level, symbolic language with automatic differentiation and code generation facilities. As a result of this software engineering process an efficient, very high quality Newton-Raphson solution code is generated which allows easier implementation of network equation model enhancements and easier code maintenance as compared to hand-coded Fortran or C code.
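The two generated ingredients, the mismatch and the Jacobian term, can be illustrated on the smallest possible power flow; the one-unknown lossless line below is an invented reduction for illustration, not the paper's generated code, and the line parameters are arbitrary.

```python
import math

# Newton-Raphson solution of a one-unknown power flow: find the voltage
# angle delta across a lossless line (susceptance b, voltage magnitudes
# v1, v2) that delivers a target active power p_target. The mismatch and
# the Jacobian term are the two code parts the paper generates.
def solve_angle(p_target, v1=1.0, v2=1.0, b=10.0, tol=1e-10):
    delta = 0.0
    for _ in range(50):
        mismatch = v1 * v2 * b * math.sin(delta) - p_target   # f(delta)
        if abs(mismatch) < tol:
            break
        jacobian = v1 * v2 * b * math.cos(delta)              # df/ddelta
        delta -= mismatch / jacobian                          # Newton step
    return delta

delta = solve_angle(5.0)
```

In the full problem the mismatch is a vector over all buses and the Jacobian a sparse matrix; automatically differentiating the symbolic equations and emitting this code in Fortran or C is exactly what makes later model enhancements cheap to maintain.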

  14. Computer aided power flow software engineering and code generation

    Bacher, R. [Swiss Federal Inst. of Tech., Zuerich (Switzerland)

    1996-02-01

    In this paper a software engineering concept is described which permits the automatic solution of a non-linear set of network equations. The power flow equation set can be seen as a defined subset of a network equation set. The automated solution process is the numerical Newton-Raphson solution process of the power flow equations where the key code parts are the numeric mismatch and the numeric Jacobian term computation. It is shown that both the Jacobian and the mismatch term source code can be automatically generated in a conventional language such as Fortran or C. Thereby one starts from a high level, symbolic language with automatic differentiation and code generation facilities. As a result of this software engineering process an efficient, very high quality Newton-Raphson solution code is generated which allows easier implementation of network equation model enhancements and easier code maintenance as compared to hand-coded Fortran or C code.

  15. Open Source Next Generation Visualization Software for Interplanetary Missions

    Trimble, Jay; Rinker, George

    2016-01-01

    Mission control is evolving quickly, driven by the requirements of new missions, and enabled by modern computing capabilities. Distributed operations, access to data anywhere, data visualization for spacecraft analysis that spans multiple data sources, flexible reconfiguration to support multiple missions, and operator use cases, are driving the need for new capabilities. NASA's Advanced Multi-Mission Operations System (AMMOS), Ames Research Center (ARC) and the Jet Propulsion Laboratory (JPL) are collaborating to build a new generation of mission operations software for visualization, to enable mission control anywhere, on the desktop, tablet and phone. The software is built on an open source platform that is open for contributions (http://nasa.github.io/openmct).

  16. Software Test Case Automated Generation Algorithm with Extended EDPN Model

    Jinlong Tao

    2013-08-01

    To improve the sufficiency of software testing and the performance of testing algorithms, an improved event-driven Petri net model using a combination method is proposed, abbreviated as the OEDPN model. It is then applied to the OATS method to extend its implementation. On the basis of the OEDPN model, the marked associative recursive method of state combination by category is presented to solve the problems of combination conflict, of the test case explosion generated by redundant test cases, and of the difficulty of extending the OATS method. Generation methods for the interactive test cases of extended OATS are also presented.

  17. Overview of the next generation of Fermilab collider software

    Fermilab is entering an era of operating a more complex collider facility. In addition, new operator workstations are available that have increased capabilities. The task of providing updated software in this new environment precipitated a project called Colliding Beam Software (CBS). It was soon evident that a new approach was needed for developing console software. Hence CBS, although a common acronym, is too narrow a description. A new generation of the application program subroutine library has been created to enhance the existing programming environment with a set of value added tools. Several key Collider applications were written that exploit CBS tools. This paper will discuss the new tools and the underlying change in methodology in application program development for accelerator control at Fermilab. (author)

  18. 2nd International Conference on Natural Fibers

    Rana, Sohel

    2016-01-01

    This book collects selected high quality articles submitted to the 2nd International Conference on Natural Fibers (ICNF2015). A wide range of topics is covered related to various aspects of natural fibres such as agriculture, extraction and processing, surface modification and functionalization, advanced structures, nano fibres, composites and nanocomposites, design and product development, applications, market potential, and environmental impact. Divided into separate sections on these various topics, the book presents the latest high quality research work addressing different approaches and techniques to improve processing, performance, functionalities and cost-effectiveness of natural fibre and natural based products, in order to promote their applications in various advanced technical sectors. This book is a useful source of information for materials scientists, teachers and students from various disciplines as well as for R&D staff in industries using natural fibre based materials.

  19. PUS Services Software Building Block Automatic Generation for Space Missions

    Candia, S.; Sgaramella, F.; Mele, G.

    2008-08-01

    The Packet Utilization Standard (PUS) has been specified by the European Committee for Space Standardization (ECSS) and issued as ECSS-E-70-41A to define the application-level interface between Ground Segments and Space Segments. The ECSS-E-70-41A complements the ECSS-E-50 and the Consultative Committee for Space Data Systems (CCSDS) recommendations for packet telemetry and telecommand. The ECSS-E-70-41A characterizes the identified PUS Services from a functional point of view, and the ECSS-E-70-31 standard specifies the rules for their mission-specific tailoring. Current on-board software design for a space mission implies the production of several PUS terminals, each providing a specific tailoring of the PUS services. The associated on-board software building blocks are developed independently, leading to very different design choices and implementations even when the mission tailoring requires very similar services (from the Ground operative perspective). In this scenario, automatic production of the PUS services building blocks for a mission would be a way to optimize the overall mission economy and improve the robustness and reliability of the on-board software and of the Ground-Space interactions. This paper presents the Space Software Italia (SSI) activities for the development of an integrated environment to support: the PUS services tailoring activity for a specific mission; the mission-specific PUS services configuration; and the generation of the UML model of the software building block implementing the mission-specific PUS services and the related source code, support documentation (software requirements, software architecture, test plans/procedures, operational manuals), and the TM/TC database. The paper deals with: (a) the project objectives, (b) the tailoring, configuration, and generation process, (c) the description of the environments supporting the process phases, (d) the characterization of the meta-model used for the generation, (e) the

  20. AUTOMATED TEST CASES GENERATION FOR OBJECT ORIENTED SOFTWARE

    A.V.K.SHANTHI

    2011-09-01

    Full Text Available Software testing is an important activity in the software development life cycle. Testing involves executing a program on a set of test cases and comparing the actual results with the expected results. To test a system, the implementation must first be understood, which can be done by creating a suitable model of the system. UML is widely accepted and used by industry for the modeling and design of software systems. This paper proposes a novel method to automatically generate test cases from UML class diagrams. Test case generation from design specifications has the added advantage of making test cases available early in the software development cycle, thereby making test planning more effective. The approach applies data mining concepts: information extracted from the UML class diagram is mapped onto a tree structure, a genetic algorithm serves as the mining technique, with the genetic crossover operator applied to discover patterns, and a depth-first search over the resulting binary trees yields the knowledge, i.e., the test cases. The resulting tool generates automated test cases that are less complex and easier to implement in any testing system. Path coverage is an important criterion in test case generation, and the fully automated scheme presented here generates valid test cases that satisfy the transition path coverage criterion.
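The final step described above — turning the extracted model tree into test cases — can be illustrated in miniature. This is a hypothetical sketch, not the authors' tool: the toy `model` tree and its node names are invented, and a plain depth-first search stands in for the full genetic-algorithm-plus-DFS mining step.

```python
# Hypothetical sketch: enumerate test cases as root-to-leaf paths of a tree
# extracted from a UML class diagram (plain DFS stands in for GA + DFS mining).
def dfs_paths(tree, node, path=None):
    """Return all root-to-leaf paths; each path is one candidate test case."""
    path = (path or []) + [node]
    children = tree.get(node, [])
    if not children:                       # leaf reached: one complete test case
        return [path]
    cases = []
    for child in children:
        cases.extend(dfs_paths(tree, child, path))
    return cases

# Invented toy tree standing in for information mapped from a class diagram:
model = {"Account": ["deposit", "withdraw"], "withdraw": ["ok", "overdraft"]}
cases = dfs_paths(model, "Account")
```

Here `cases` holds three paths (`deposit`, `withdraw → ok`, `withdraw → overdraft`), each a candidate test case covering one path through the model.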

  1. Improved Ant Algorithms for Software Testing Cases Generation

    Shunkun Yang

    2014-01-01

    Full Text Available Ant colony optimization (ACO) for software test case generation is a very popular domain in software testing engineering. However, traditional ACO has flaws: early-search pheromone is relatively scarce, search efficiency is low, the search model is too simple, and the positive feedback mechanism easily produces stagnation and precocity. This paper introduces improved ACO variants for software test case generation: an improved local pheromone update strategy for ant colony optimization, an improved pheromone volatilization coefficient for ant colony optimization (IPVACO), and an improved global path pheromone update strategy for ant colony optimization (IGPACO). Finally, we put forward a comprehensive improved ant colony optimization (ACIACO), which combines all three methods. The proposed technique is compared with a random algorithm (RND) and a genetic algorithm (GA) in terms of both efficiency and coverage. The results indicate that the improved method can effectively improve search efficiency, restrain precocity, promote case coverage, and reduce the number of iterations.
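The basic mechanism shared by these variants — pheromone-guided selection with evaporation and coverage-driven reinforcement — can be sketched minimally. This is an invented illustration, not the paper's ACIACO/IPVACO/IGPACO algorithms: the toy program under test, the input domain, and all parameter values are assumptions.

```python
import random

def branches(x):
    """Toy program under test: returns the ids of the branches x exercises."""
    cov = set()
    cov.add("b1" if x < 0 else "b2")
    cov.add("b3" if x % 2 == 0 else "b4")
    return cov

def aco_test_gen(domain=range(-5, 6), ants=8, iters=30, rho=0.3, seed=1):
    rng = random.Random(seed)
    tau = {x: 1.0 for x in domain}            # pheromone per candidate input
    suite, covered = [], set()
    for _ in range(iters):
        for _ in range(ants):
            r, acc = rng.uniform(0, sum(tau.values())), 0.0
            for x, t in tau.items():          # roulette-wheel selection
                pick = x
                acc += t
                if acc >= r:
                    break
            gain = branches(pick) - covered
            tau[pick] = (1 - rho) * tau[pick] + len(gain)  # evaporate + deposit
            if gain:                          # keep only cases adding coverage
                suite.append(pick)
                covered |= gain
        if len(covered) == 4:                 # all branches of the toy program
            break
    return suite, covered
```

With the fixed seed the run is deterministic; `suite` ends up with at most four inputs, one per newly covered branch.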

  2. Optimized generation of high resolution breast anthropomorphic software phantoms

    Purpose: The authors present an efficient method for generating anthropomorphic software breast phantoms with high spatial resolution. Employing the same region growing principles as in their previous algorithm for breast anatomy simulation, the present method has been optimized for computational complexity to allow for fast generation of the large number of phantoms required in virtual clinical trials of breast imaging. Methods: The new breast anatomy simulation method performs a direct calculation of the Cooper's ligaments (i.e., the borders between simulated adipose compartments). The calculation corresponds to quadratic decision boundaries of a maximum a posteriori classifier. The method is multiscale due to the use of octree-based recursive partitioning of the phantom volume. The method also provides user-control of the thickness of the simulated Cooper's ligaments and skin. Results: Using the proposed method, the authors have generated phantoms with voxel size in the range of (25-1000 μm)3/voxel. The power regression of the simulation time as a function of the reciprocal voxel size yielded a log-log slope of 1.95 (compared to a slope of 4.53 of our previous region growing algorithm). Conclusions: A new algorithm for computer simulation of breast anatomy has been proposed that allows for fast generation of high resolution anthropomorphic software phantoms.
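A much-simplified sketch of the octree idea follows. It is illustrative only: the two "compartment seeds" and the 8-unit volume are invented, and nearest-seed labeling stands in for the paper's maximum a posteriori classifier with quadratic decision boundaries. A cube whose corners all agree is labeled wholesale; otherwise it is split into eight octants, down to single-voxel size.

```python
import itertools

SEEDS = [(2, 2, 2), (6, 6, 6)]         # hypothetical compartment seed points

def nearest(p):
    """Index of the closest seed (stand-in for the MAP classifier)."""
    return min(range(len(SEEDS)),
               key=lambda i: sum((a - b) ** 2 for a, b in zip(p, SEEDS[i])))

def label(origin, size, out):
    """Recursively label a cube; split into octants while corners disagree."""
    corners = [tuple(o + d * size for o, d in zip(origin, dirs))
               for dirs in itertools.product((0, 1), repeat=3)]
    labels = {nearest(c) for c in corners}
    if len(labels) == 1 or size == 1:  # uniform cube, or voxel-size limit
        out.append((origin, size, labels.pop()))
        return
    half = size // 2
    for dirs in itertools.product((0, 1), repeat=3):
        label(tuple(o + d * half for o, d in zip(origin, dirs)), half, out)

cells = []
label((0, 0, 0), 8, cells)             # partition an 8x8x8 toy volume
```

Large uniform regions stay as single cells, so work concentrates near the simulated compartment borders — the intuition behind the reported improvement in the time-versus-resolution scaling.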

  3. 2nd International Arctic Ungulate Conference

    A. Anonymous

    1996-01-01

    Full Text Available The 2nd International Arctic Ungulate Conference was held 13-17 August 1995 on the University of Alaska Fairbanks campus. The Institute of Arctic Biology and the Alaska Cooperative Fish and Wildlife Research Unit were responsible for organizing the conference with assistance from biologists with state and federal agencies and commercial organizations. David R. Klein was chair of the conference organizing committee. Over 200 people attended the conference, coming from 10 different countries. The United States, Canada, and Norway had the largest representation. The conference included invited lectures, panel discussions, and about 125 contributed papers. There were five technical sessions on Physiology and Body Condition; Habitat Relationships; Population Dynamics and Management; Behavior, Genetics and Evolution; and Reindeer and Muskox Husbandry. Three panel sessions discussed Comparative caribou management strategies; Management of introduced, reestablished, and expanding muskox populations; and Health risks in translocation of arctic ungulates. Invited lectures focused on the physiology and population dynamics of arctic ungulates; contaminants in food chains of arctic ungulates and lessons learned from the Chernobyl accident; and ecosystem level relationships of the Porcupine Caribou Herd.

  4. Exogenous attention enhances 2nd-order contrast sensitivity.

    Barbot, Antoine; Landy, Michael S; Carrasco, Marisa

    2011-05-11

    Natural scenes contain a rich variety of contours that the visual system extracts to segregate the retinal image into perceptually coherent regions. Covert spatial attention helps extract contours by enhancing contrast sensitivity for 1st-order, luminance-defined patterns at attended locations, while reducing sensitivity at unattended locations, relative to neutral attention allocation. However, humans are also sensitive to 2nd-order patterns such as spatial variations of texture, which are predominant in natural scenes and cannot be detected by linear mechanisms. We assess whether and how exogenous attention--the involuntary and transient capture of spatial attention--affects the contrast sensitivity of channels sensitive to 2nd-order, texture-defined patterns. Using 2nd-order, texture-defined stimuli, we demonstrate that exogenous attention increases 2nd-order contrast sensitivity at the attended location, while decreasing it at unattended locations, relative to a neutral condition. By manipulating both 1st- and 2nd-order spatial frequency, we find that the effects of attention depend both on 2nd-order spatial frequency of the stimulus and the observer's 2nd-order spatial resolution at the target location. At parafoveal locations, attention enhances 2nd-order contrast sensitivity to high, but not to low 2nd-order spatial frequencies; at peripheral locations attention also enhances sensitivity to low 2nd-order spatial frequencies. Control experiments rule out the possibility that these effects might be due to an increase in contrast sensitivity at the 1st-order stage of visual processing. Thus, exogenous attention affects 2nd-order contrast sensitivity at both attended and unattended locations. PMID:21356228

  5. Next Generation of ECT Software for Data Analysis of Steam Generator Tubes

    Improvements to the existing EddyOne eddy current analysis software are presented. These improvements are geared towards better interaction between the software and the ECT analyst through an improved, more featured user interface, while keeping industry-standard signal display norms intact to preserve familiarity and ease the transition to the next generation of EddyOne. The improvements presented in this paper thus ease the transition to the new software by reducing training requirements for existing analysts and for new analysts entering the industry. Further, by utilizing modern technologies, the next generation of the software is able to further reduce maintenance and deployment costs of the whole system for years to come. (author)

  6. High-Quality Random Number Generation Software for High-Performance Computing Project

    National Aeronautics and Space Administration — Random number (RN) generation is the key software component that permits random sampling. Software for parallel RN generation (RNG) should be based on RNGs that are...
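Although the record is truncated, the core requirement — statistically independent per-worker streams derived from one root seed — can be sketched with the standard library alone. The hash-based derivation below is an invented illustration of the interface, not NASA's method; production HPC codes use purpose-built parallel RNGs (counter-based or jump-ahead designs) with provable stream independence.

```python
import hashlib
import random

def spawn_streams(root_seed, n_workers):
    """Derive one deterministically seeded generator per parallel worker."""
    streams = []
    for i in range(n_workers):
        # Hash (root_seed, worker index) into a 64-bit child seed.
        digest = hashlib.sha256(f"{root_seed}:{i}".encode()).digest()
        streams.append(random.Random(int.from_bytes(digest[:8], "big")))
    return streams

streams = spawn_streams(12345, n_workers=4)
draws = [s.random() for s in streams]      # each worker samples its own stream
```

Because every child seed is a pure function of the root seed and worker index, the whole parallel sampling run is reproducible regardless of worker scheduling.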

  7. Evaluation of the efficiency and reliability of software generated by code generators

    Schreur, Barbara

    1994-01-01

    There are numerous studies which show that CASE Tools greatly facilitate software development. As a result of these advantages, an increasing amount of software development is done with CASE Tools. As more software engineers become proficient with these tools, their experience and feedback lead to further development with the tools themselves. What has not been widely studied, however, is the reliability and efficiency of the actual code produced by the CASE Tools. This investigation considered these matters. Three segments of code generated by MATRIXx, one of many commercially available CASE Tools, were chosen for analysis: ETOFLIGHT, a portion of the Earth to Orbit Flight software, and ECLSS and PFMC, modules for Environmental Control and Life Support System and Pump Fan Motor Control, respectively.

  8. 2nd International technical meeting on small reactors

    The 2nd International Technical Meeting on Small Reactors was held on November 7-9, 2012 in Ottawa, Ontario. The meeting was hosted by Atomic Energy of Canada Limited (AECL) and Canadian Nuclear Society (CNS). There is growing international interest and activity in the development of small nuclear reactor technology. This meeting provided participants with an opportunity to share ideas and exchange information on new developments. This Technical Meeting covered topics of interest to designers, operators, researchers and analysts involved in the design, development and deployment of small reactors for power generation and research. A special session focussed on small modular reactors (SMR) for generating electricity and process heat, particularly in small grids and remote locations. Following the success of the first Technical Meeting in November 2010, which captured numerous accomplishments of low-power critical facilities and small reactors, the second Technical Meeting was dedicated to the achievements, capabilities, and future prospects of small reactors. This meeting also celebrated the 50th Anniversary of the Nuclear Power Demonstration (NPD) reactor which was the first small reactor (20 MWe) to generate electricity in Canada.

  9. Assessment of nursing care using indicators generated by software

    Ana Paula Souza Lima

    2015-04-01

    Full Text Available OBJECTIVE: to analyze the efficacy of the Nursing Process in an Intensive Care Unit using indicators generated by software. METHOD: cross-sectional study using data collected for four months. RNs and students daily registered patients, took histories (at admission), performed physical assessments, established nursing diagnoses and nursing plans/prescriptions, and assessed care delivered to 17 patients using software. Indicators concerning the incidence and prevalence of nursing diagnoses, rate of effectiveness, risk diagnoses, and rate of effective prevention of complications were computed. RESULTS: the Risk for imbalanced body temperature was the most frequent diagnosis (23.53%), while the least frequent was Risk for constipation (0%). The Risk for impaired skin integrity was prevalent in 100% of the patients, while Risk for acute confusion was the least prevalent (11.76%). Risk for constipation and Risk for impaired skin integrity obtained a rate of risk-diagnostic effectiveness of 100%. The rate of effective prevention of acute confusion and falls was 100%. CONCLUSION: the efficacy of the Nursing Process using indicators was analyzed; these indicators reveal how nurses have identified patients' risks and conditions, and planned care in a systematized manner.

  10. Effective Test Case Generation Using Antirandom Software Testing

    Kulvinder Singh,

    2010-11-01

    Full Text Available Random Testing is a primary technique for software testing. Antirandom Testing improves the fault-detection capability of Random Testing by employing the location information of previously executed test cases: it selects each test case to be as different as possible from all previously executed test cases. The implementation is illustrated using basic examples. Compared with Random Testing, test cases generated in Antirandom Testing are more evenly spread across the input domain. Antirandom Testing has conventionally been applied to programs that have only numerical input types, because the distance between numerical inputs is readily measurable, and the vast majority of research involves distance techniques for generating the antirandom test cases. Departing from these techniques, we focus on the concrete values of program inputs by proposing a new method to generate antirandom test cases. The proposed method enables Antirandom Testing to be applied to all kinds of programs regardless of their input types. Empirical studies are further conducted to compare and evaluate the effectiveness of these methods. Experimental results show that, compared with random and Hamming-distance techniques, the proposed method significantly reduces the number of test cases required to detect the first failure. Overall, the proposed antirandom testing gives higher fault coverage than antirandom testing with the Hamming distance method, which in turn gives higher fault coverage than pure random testing.
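The core selection rule — pick the candidate farthest, in total distance, from every test case already executed — can be sketched for binary inputs (the classic setting; the paper's contribution is extending the idea beyond numeric input types). The sizes and the Hamming metric here are illustrative assumptions.

```python
from itertools import product

def hamming(a, b):
    """Number of positions at which two equal-length tuples differ."""
    return sum(x != y for x, y in zip(a, b))

def antirandom(n_bits, n_cases):
    """Greedy antirandom suite: each new case maximizes its summed
    distance to all previously chosen cases."""
    candidates = list(product((0, 1), repeat=n_bits))
    chosen = [candidates[0]]               # start from an arbitrary first case
    while len(chosen) < n_cases:
        best = max((c for c in candidates if c not in chosen),
                   key=lambda c: sum(hamming(c, p) for p in chosen))
        chosen.append(best)
    return chosen

tests = antirandom(3, 4)
```

Starting from `(0, 0, 0)`, the second case selected is its complement `(1, 1, 1)` — the input most different from everything executed so far — which is exactly the even spread across the input domain described above.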

  11. A NEO population generation and observation simulation software tool

    Müller, Sven; Gelhaus, Johannes; Hahn, Gerhard; Franco, Raffaella

    One of the main targets of ESA's Space Situational Awareness (SSA) program is to build a wide knowledge base about objects that can potentially harm Earth (Near-Earth Objects, NEOs). An important part of this effort is to create the Small Bodies Data Centre (SBDC), which is going to aggregate measurement data from a fully-integrated NEO observation sensor network. Until this network is developed, artificial NEO measurement data is needed in order to validate SBDC algorithms. Moreover, to establish a functioning NEO observation sensor network, it has to be determined where to place sensors, what technical requirements have to be met in order to be able to detect NEOs, and which observation strategies work best. Because of this, a sensor simulation software was needed. This paper presents a software tool which allows users to create and analyse NEO populations and to simulate and analyse population observations. It is a console program written in Fortran and comes with a Graphical User Interface (GUI) written in Java and C. The tool consists of two components, the ``Population Generator'' and the ``Observation Simulator''. The Population Generator component is responsible for generating and analysing a NEO population. Users can choose between creating fictitious (random) and synthetic populations. The latter are based on one of two models describing the orbital and size distribution of observed NEOs: the existing so-called ``Bottke Model'' (Bottke et al. 2000, 2002) and the new ``Granvik Model'' (Granvik et al. 2014, in preparation), which has been developed in parallel to the tool. Generated populations can be analysed by defining 2D, 3D and scatter plots using various NEO attributes. As a result, the tool creates the appropriate files for the plotting tool ``gnuplot''. The tool's Observation Simulator component provides the Observation Simulation and Observation Analysis functions. Users can define sensor systems using ground- or space-based locations as well as

  12. Florida Investigates 2nd Possible Local Transmission of Zika Virus

    If confirmed, cases would be first instances of ... Broward County, north of Miami. Infection with the Zika virus, which in most cases is transmitted by mosquitoes, ...

  13. Generation and Optimization of Test cases for Object-Oriented Software Using State Chart Diagram

    Ranjita Kumari Swain

    2012-05-01

    Full Text Available The process of testing any software system is an enormous task which is time consuming and costly. The time and effort required to do sufficient testing grow as the size and complexity of the software grow, which may cause overrun of the project budget, delay in the development of the software system, or leave some test cases uncovered. During the software development life cycle (SDLC), the testing phase generally takes around 40-70% of the time and cost. State-based testing is frequently used in software testing. Test data generation is one of the key issues in software testing. A properly generated test suite may not only locate the errors in a software system, but also help in reducing the high cost associated with software testing. It is often desired that test data in the form of test sequences within a test suite can be automatically generated to achieve required test coverage.

  14. Test case generation based on orthogonal table for software black-box testing

    LIU Jiu-fu; YANG Zhong; YANG Zhen-xing; SUN Lin

    2008-01-01

    Software testing is an important means of assuring software quality. This paper presents a practicable method to generate test cases for software testing which is operational and highly efficient. We discuss the identification of software specification categories and choices and build a classification tree. Based on the orthogonal array, it is then easy to generate test cases. The number of test cases produced by this method is much smaller than that of all combinations of the choices.
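The mapping from categories/choices onto an orthogonal array can be sketched with the standard L4(2³) array and three invented two-level factors. Four rows cover every pairwise combination of choices exactly once, versus 2³ = 8 exhaustive combinations.

```python
# Standard L4(2^3) orthogonal array: any two columns contain each level
# pair (0,0), (0,1), (1,0), (1,1) exactly once.
L4 = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]

# Hypothetical categories and choices identified from a specification:
factors = {
    "os":      ["linux", "windows"],
    "browser": ["firefox", "chrome"],
    "net":     ["wifi", "wired"],
}

names = list(factors)
cases = [{names[i]: factors[names[i]][level] for i, level in enumerate(row)}
         for row in L4]
```

`cases[0]` is `{"os": "linux", "browser": "firefox", "net": "wifi"}`; the four cases together exercise every pair of choices once.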

  15. GENESIS: Agile Generation of Information Management Oriented Software

    Juan Erasmo Gómez

    2010-06-01

    Full Text Available The specification for an information system can be clear from the beginning: it must acquire, display, query and modify data, using a database. The main issue is to decide which information to manage. In the case originating this work, information was always evolving, even up to the end of the project. This implies the construction of a new system each time the information is redefined. This article presents Genesis, an agile development infrastructure, and proposes an approach for the immediate construction of required information systems. Experts describe their information needs and queries, and Genesis generates the corresponding application, with the appropriate graphical interfaces and database.

  16. Software Defined Radio Architecture Contributions to Next Generation Space Communications

    Kacpura, Thomas J.; Eddy, Wesley M.; Smith, Carl R.; Liebetreu, John

    2015-01-01

    systems, as well as those communications and navigation systems operated by international space agencies and civilian and government agencies. In this paper, we review the philosophies, technologies, architectural attributes, mission services, and communications capabilities that form the structure of candidate next-generation integrated communication architectures for space communications and navigation. A key area that this paper explores is the development and operation of the software defined radio for the NASA Space Communications and Navigation (SCaN) Testbed currently on the International Space Station (ISS). Lessons learned from its development and operation feed back into the communications architecture. Reconfigurability changes the way operations are done and must be considered. Quantifying its impact on the NASA Space Telecommunications Radio System (STRS) software defined radio architecture provides feedback to keep the standard useful and up to date. NASA is not the only customer of these radios: software defined radios are developed for other applications, and taking advantage of those developments promotes an architecture that is cost effective and sustainable. Developments in areas such as updated operating environments, higher data rates, networking, and security can be leveraged. The ability to sustain an architecture that uses radios for multiple markets can lower costs and keep new technology infused.

  17. 2nd European Advanced Accelerator Concepts Workshop

    Assmann, Ralph; Grebenyuk, Julia

    2015-01-01

    The European Advanced Accelerator Concepts Workshop has the mission to discuss and foster methods of beam acceleration with gradients beyond state of the art in operational facilities. The most cost effective and compact methods for generating high energy particle beams shall be reviewed and assessed. This includes diagnostics methods, timing technology, special need for injectors, beam matching, beam dynamics with advanced accelerators and development of adequate simulations. This workshop is organized in the context of the EU-funded European Network for Novel Accelerators (EuroNNAc2), that includes 52 Research Institutes and universities.

  18. Model of Next Generation Energy-Efficient Design Software for Buildings

    MA Zhiliang; ZHAO Yili

    2008-01-01

    Energy-efficient design for buildings (EEDB) is a vital step towards building energy savings. In order to greatly improve EEDB, next-generation EEDB software that makes use of the latest technologies needs to be developed. This paper mainly focuses on establishing the model of the next-generation EEDB software. Based on a survey of the literature and interviews with designers, the requirements on the next-generation EEDB software were identified, where lifecycle assessment of both energy consumption and environmental impacts, 3D graphics support, and building information modeling (BIM) support were stressed. Then the workflow for using the next-generation EEDB software was established. Finally, based on the workflow, the framework model for the software was proposed, and the partial models and the corresponding functions were systematically analyzed. The model lays a solid foundation for developing the next-generation EEDB software.

  19. Thermoluminescent characteristics of ZrO2:Nd films

    This work presents the results of analyzing the photoluminescent and thermoluminescent characteristics of neodymium-activated zirconium oxide (ZrO2:Nd) and its possible application in UV radiation dosimetry. The experiments were aimed at studying characteristics such as the optimum thermal erasing treatment, the influence of light on the response, the response as a function of wavelength, the fading of the stored information, the effect of temperature, the response as a function of time, and the reproducibility of the response. The results show that ZrO2:Nd is a promising material for use as a TL dosimeter for UV radiation. (Author)

  20. 2nd Quarter Transportation Report FY 2014

    Gregory, L.

    2014-07-30

    This report satisfies the U.S. Department of Energy (DOE), National Nuclear Security Administration Nevada Field Office (NNSA/NFO) commitment to prepare a quarterly summary report of radioactive waste shipments to the Nevada National Security Site (NNSS) Radioactive Waste Management Complex (RWMC) at Area 5. There were no shipments sent for offsite treatment and returned to the NNSS this quarter. This report summarizes the second quarter of fiscal year (FY) 2014 low-level radioactive waste (LLW) and mixed low-level radioactive waste (MLLW) shipments. This report also includes annual summaries for FY 2014 in Tables 4 and 5. Tabular summaries are provided which include the following: Sources of and carriers for LLW and MLLW shipments to and from the NNSS; Number and external volume of LLW and MLLW shipments; Highway routes used by carriers; and Incident/accident data applicable to LLW and MLLW shipments. In this report shipments are accounted for upon arrival at the NNSS, while disposal volumes are accounted for upon waste burial. The disposal volumes presented in this report do not include minor volumes of non-radioactive materials that were approved for disposal. Volume reports showing cubic feet (ft3) generated using the Low-Level Waste Information System may vary slightly due to differing rounding conventions.

  1. S-Cube: Enabling the Next Generation of Software Services

    Metzger, Andreas; Pohl, Klaus

    The Service Oriented Architecture (SOA) paradigm is increasingly adopted by industry for building distributed software systems. However, when designing, developing and operating innovative software services and service-based systems, several challenges exist. Those challenges include how to manage the complexity of those systems, how to establish, monitor and enforce Quality of Service (QoS) and Service Level Agreements (SLAs), as well as how to build those systems such that they can proactively adapt to dynamically changing requirements and context conditions. Developing foundational solutions for those challenges requires joint efforts of different research communities such as Business Process Management, Grid Computing, Service Oriented Computing and Software Engineering. This paper provides an overview of S-Cube, the European Network of Excellence on Software Services and Systems. S-Cube brings together researchers from leading research institutions across Europe, who join their competences to develop foundations, theories as well as methods and tools for future service-based systems.

  2. 2nd ISPRA nuclear electronics symposium, Stresa, Italy May 20-23, 1975

    Two round tables were annexed to the 2nd Ispra Nuclear Electronics Symposium. The first was concerned with software support for the implementation of microprocessors, MOS and bipolar microprocessors, environmental data systems, and the use of microprocessors and minicomputers in the nuclear, biomedical and environmental fields. The future of nuclear electronics and its diversification, gravitational waves and electronics, and environmental measurements of air and water quality were discussed during the second round table, and relevant views were brought out during the discussion on the extension of nuclear electronics techniques to other fields

  3. 2nd International Conference on Multiscale Computational Methods for Solids and Fluids

    2016-01-01

    This volume contains the best papers presented at the 2nd ECCOMAS International Conference on Multiscale Computations for Solids and Fluids, held June 10-12, 2015. Topics dealt with include multiscale strategy for efficient development of scientific software for large-scale computations, coupled probability-nonlinear-mechanics problems and solution methods, and modern mathematical and computational setting for multi-phase flows and fluid-structure interaction. The papers consist of contributions by six experts who taught short courses prior to the conference, along with several selected articles from other participants dealing with complementary issues, covering both solid mechanics and applied mathematics.

  4. A Handbook for Classroom Instruction That Works, 2nd Edition

    Association for Supervision and Curriculum Development, 2012

    2012-01-01

    Perfect for self-help and professional learning communities, this handbook makes it much easier to apply the teaching practices from the ASCD-McREL best-seller "Classroom Instruction That Works: Research-Based Strategies for Increasing Student Achievement, 2nd Edition." The authors take you through the refined Instructional Planning Guide, so you…

  5. 2nd International Conference on Nuclear Physics in Astrophysics

    Fülöp, Zsolt; Somorjai, Endre; The European Physical Journal A : Volume 27, Supplement 1, 2006

    2006-01-01

    Launched in 2004, "Nuclear Physics in Astrophysics" has established itself in a successful topical conference series addressing the forefront of research in the field. This volume contains the selected and refereed papers of the 2nd conference, held in Debrecen in 2005 and reprinted from "The European Physical Journal A - Hadrons and Nuclei".

  6. 2nd International Conference on Data Management Technologies and Applications

    2013-01-01

    The 2nd International Conference on Data Management Technologies and Applications (DATA) aims to bring together researchers, engineers and practitioners interested on databases, data warehousing, data mining, data management, data security and other aspects of information systems and technology involving advanced applications of data.

  7. The 2nd Seminar on Standardization Cooperation in Northeast Asia

    2004-01-01

    The 2nd Seminar on Standardization Cooperation in Northeast Asia (2003) was held in Beijing from Oct 30th to Oct 31st, succeeding the first seminar held in Korea in 2002, with participants coming from the standardization circles of China, Japan and Korea.

  8. Open-source software for generating electrocardiogram signals

    McSharry, Patrick E.; Clifford, Gari D.

    2004-01-01

    ECGSYN, a dynamical model that faithfully reproduces the main features of the human electrocardiogram (ECG), including heart rate variability, RR intervals and QT intervals, is presented. Details of the underlying algorithm and an open-source software implementation in Matlab, C and Java are described. An example of how this model will facilitate comparisons of signal processing techniques is provided.
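The heart of the model — an ECG morphology built from Gaussian events (P, Q, R, S, T) placed at fixed angles around one heartbeat cycle — can be sketched statically. This is a simplification: ECGSYN actually couples the events to a three-dimensional ODE limit cycle with RR-interval variability, and the amplitudes and widths below only loosely echo the published defaults.

```python
import math

# (angle theta_i, amplitude a_i, width b_i) for the P, Q, R, S, T events;
# values are illustrative approximations, not the exact ECGSYN parameters.
EVENTS = [
    (-math.pi / 3,   1.2,  0.25),  # P wave
    (-math.pi / 12, -5.0,  0.10),  # Q
    (0.0,           30.0,  0.10),  # R peak
    (math.pi / 12,  -7.5,  0.10),  # S
    (math.pi / 2,    0.75, 0.40),  # T wave
]

def ecg_beat(n=500):
    """One synthetic beat: a sum of Gaussian bumps over theta in [-pi, pi)."""
    samples = []
    for k in range(n):
        theta = -math.pi + 2 * math.pi * k / n
        z = sum(a * math.exp(-((theta - t) ** 2) / (2 * b * b))
                for t, a, b in EVENTS)
        samples.append(z)
    return samples

beat = ecg_beat()                  # R peak lands at theta = 0 (sample n // 2)
```

Sweeping theta at a variable rate per beat is how the full model introduces heart rate variability on top of this fixed morphology.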

  9. Automated Software Test Data Generation: Direction of Research

    Hitesh Tahbildar

    2011-02-01

    Full Text Available In this paper we give an overview of automatic test data generation. The basic objective of this paper is to present the basic concepts related to automated test data generation research. The different implementation techniques are described with their relative merits and demerits. The future challenges and problems of test data generation are explained. Finally, we describe the areas where more focus is required to make automatic test data generation more effective in industry.

  10. Minimal TestCase Generation for Object-Oriented Software with State Charts

    Ranjita Kumari Swain; Prafulla Kumar Behera; Durga Prasad Mohapatra

    2012-01-01

    Today statecharts are a de facto standard in industry for modeling system behavior. Test data generation is one of the key issues in software testing. This paper proposes a reduction approach to test data generation for state-based software testing. First, a state transition graph is derived from the state chart diagram. Then, all the required information is extracted from the state chart diagram and test cases are generated. Lastly, the set of test cases is minimized by calcu...
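The minimization step (the sentence is cut off in the record) is commonly cast as greedy set cover over the coverage targets. The sketch below uses that standard formulation with invented test names and transition ids; it is not necessarily the authors' exact calculation.

```python
def minimize(suite):
    """Greedy set-cover minimization: suite maps test name -> covered items."""
    uncovered = set().union(*suite.values())
    kept = []
    while uncovered:
        # Keep the test case covering the most still-uncovered items.
        best = max(suite, key=lambda t: len(suite[t] & uncovered))
        kept.append(best)
        uncovered -= suite[best]
    return kept

suite = {"t1": {"a", "b"}, "t2": {"b", "c"}, "t3": {"a", "b", "c"}}
minimized = minimize(suite)        # "t3" alone covers every transition
```

Greedy set cover is not guaranteed optimal in general, but it is the usual practical choice because exact minimization is NP-hard.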

  11. Development of a Prototype Automation Simulation Scenario Generator for Air Traffic Management Software Simulations

    Khambatta, Cyrus F.

    2007-01-01

    A technique for automated development of scenarios for use in the Multi-Center Traffic Management Advisor (McTMA) software simulations is described. The resulting software is designed and implemented to automate the generation of simulation scenarios with the intent of reducing the time it currently takes using an observational approach. The software program is effective in achieving this goal. The scenarios created for use in the McTMA simulations are based on data taken from data files from the McTMA system, and were manually edited before incorporation into the simulations to ensure accuracy. Despite the software's overall favorable performance, several key software issues are identified. Proposed solutions to these issues are discussed. Future enhancements to the scenario generator software may address the limitations identified in this paper.

  12. 2nd International Conference on Green Communications and Networks 2012

    Ma, Maode; GCN 2012

    2013-01-01

    The objective of the 2nd International Conference on Green Communications and Networks 2012 (GCN 2012) is to facilitate an exchange of information on best practices for the latest research advances in the area of communications, networks and intelligence applications. These mainly involve computer science and engineering, informatics, communications and control, electrical engineering, information computing, and business intelligence and management. Proceedings of the 2nd International Conference on Green Communications and Networks 2012 (GCN 2012) will focus on green information technology and applications, which will provide in-depth insights for engineers and scientists in academia, industry, and government. The book addresses the most innovative research developments including technical challenges, social and economic issues, and presents and discusses the authors’ ideas, experiences, findings, and current projects on all aspects of advanced green information technology and applications. Yuhang Yang is ...

  13. 2nd International Conference on Intelligent Technologies and Engineering Systems

    Chen, Cheng-Yi; Yang, Cheng-Fu

    2014-01-01

    This book includes the original, peer reviewed research papers from the 2nd International Conference on Intelligent Technologies and Engineering Systems (ICITES2013), which took place on December 12-14, 2013 at Cheng Shiu University in Kaohsiung, Taiwan. Topics covered include: laser technology, wireless and mobile networking, lean and agile manufacturing, speech processing, microwave dielectrics, intelligent circuits and systems, 3D graphics, communications, and structure dynamics and control.

  14. Introduction on the 2nd annual general meeting of ARCCNM

    This paper outlines general information on the 2nd annual general meeting of ARCCNM (Asian Regional Cooperative Council for Nuclear Medicine). The international symposium exchanged recent developments in basic and clinical nuclear medicine. The Asian School of Nuclear Medicine is an educational enterprise of ARCCNM whose objective is to organize and coordinate academic and training programs in nuclear medicine. It will promote nuclear medicine in the Asian region by enhancing regional scientific activities and research collaboration

  15. 2nd Interdisciplinary Conference on Production, Logistics and Traffic 2015

    Friedrich, Hanno; Thaller, Carina; Geiger, Christiane

    2016-01-01

    This contributed volume contains the selected and reviewed papers of the 2nd Interdisciplinary Conference on Production, Logistics and Traffic (ICPLT) 2015, Dortmund, Germany. The topical focus lies on economic, ecological and societal issues related to commercial transport. The authors are international experts and the paper collection presents the state-of-the-art in the field, thus making this book a valuable read for both practitioners and researchers.

  16. 2nd International Open and Distance Learning (IODL) Symposium

    Reviewed by Murat BARKAN

    2006-01-01

    These closing remarks were prepared and presented by Prof. Dr. Murat BARKAN, Anadolu University, Eskisehir, TURKEY. DEAR GUESTS, As the 2nd International Open and Distance Learning (IODL) Symposium is now drawing to an end, I would like to thank you all for your outstanding speeches, distinguished presentations, constructive roundtable and session discussions, and active participation during the last five days. I hope you all share my view that the whole symposium has been...

  17. Exogenous attention enhances 2nd-order contrast sensitivity

    Barbot, Antoine; Landy, Michael S.; Carrasco, Marisa

    2011-01-01

    Natural scenes contain a rich variety of contours that the visual system extracts to segregate the retinal image into perceptually coherent regions. Covert spatial attention helps extract contours by enhancing contrast sensitivity for 1st-order, luminance-defined patterns at attended locations, while reducing sensitivity at unattended locations, relative to neutral attention allocation. However, humans are also sensitive to 2nd-order patterns such as spatial variations of texture, which are ...

  18. 2nd International Conference on Electric and Electronics (EEIC 2012)

    Advances in Electric and Electronics

    2012-01-01

    This volume contains 108 full length papers presented at the 2nd International Conference on Electric and Electronics (EEIC 2012), held on April 21-22 in Sanya, China, which brings together researchers working in many different areas of education and learning to foster international collaborations and exchange of new ideas. This volume can be divided into two sections on the basis of the classification of manuscripts considered: the first section deals with Electric and the second section with Electronics.

  19. Specification and Generation of Environment for Model Checking of Software Components

    Pařízek, P.; Plášil, František

    2007-01-01

    Roč. 176, - (2007), s. 143-154. ISSN 1571-0661 R&D Projects: GA AV ČR 1ET400300504 Institutional research plan: CEZ:AV0Z10300504 Keywords : software components * behavior protocols * model checking * automated generation of environment Subject RIV: JC - Computer Hardware ; Software

  20. An application generator for rapid prototyping of Ada real-time control software

    Johnson, Jim; Biglari, Haik; Lehman, Larry

    1990-01-01

    The need to increase engineering productivity and decrease software life cycle costs in real-time system development establishes a motivation for a method of rapid prototyping. The design by iterative rapid prototyping technique is described. A tool which facilitates such a design methodology for the generation of embedded control software is described.

  1. Using Automatic Code Generation in the Attitude Control Flight Software Engineering Process

    McComas, David; O'Donnell, James R., Jr.; Andrews, Stephen F.

    1999-01-01

    This paper presents an overview of the attitude control subsystem flight software development process, identifies how the process has changed due to automatic code generation, analyzes each software development phase in detail, and concludes with a summary of our lessons learned.

  2. Rapid Prototyping Software for Developing Next-Generation Air Traffic Management Algorithms Project

    National Aeronautics and Space Administration — Research on next-generation air traffic management systems is being conducted at several laboratories using custom software. In order to provide a more uniform...

  3. Rapid Prototyping Software for Developing Next-Generation Air Traffic Management Algorithms Project

    National Aeronautics and Space Administration — Research on next-generation air traffic control systems are being conducted at several laboratories. Most of this work is being carried out using custom software....

  4. Automatically generated acceptance test: A software reliability experiment

    Protzel, Peter W.

    1988-01-01

    This study presents results of a software reliability experiment investigating the feasibility of a new error detection method. The method can be used as an acceptance test and is solely based on empirical data about the behavior of internal states of a program. The experimental design uses the existing environment of a multi-version experiment previously conducted at the NASA Langley Research Center, in which the launch interceptor problem is used as a model. This allows the controlled experimental investigation of versions with well-known single and multiple faults, and the availability of an oracle permits the determination of the error detection performance of the test. Fault interaction phenomena are observed that have an amplifying effect on the number of error occurrences. Preliminary results indicate that all faults examined so far are detected by the acceptance test. This shows promise for further investigations, and for the employment of this test method on other applications.
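    The idea of an empirical, internal-state-based acceptance test can be sketched as follows: per-variable value envelopes are learned from known-good runs, and a new run is accepted only if every observed state stays inside them. The data layout (lists of name-to-value dicts) is an assumption for illustration, not the experiment's actual instrumentation.

```python
def learn_state_envelope(reference_runs):
    """Learn per-variable (min, max) envelopes of internal program states
    observed during known-good runs. Each run is a list of dicts mapping
    internal variable names to observed values."""
    envelope = {}
    for run in reference_runs:
        for state in run:
            for var, val in state.items():
                lo, hi = envelope.get(var, (val, val))
                envelope[var] = (min(lo, val), max(hi, val))
    return envelope

def acceptance_check(run, envelope):
    """Accept a new run only if every observed internal state stays inside
    the empirically learned envelope; out-of-range values signal an error."""
    for state in run:
        for var, val in state.items():
            lo, hi = envelope.get(var, (float("-inf"), float("inf")))
            if not (lo <= val <= hi):
                return False
    return True

good = [[{"speed": 1.0}, {"speed": 2.5}], [{"speed": 0.5}]]
env = learn_state_envelope(good)
ok = acceptance_check([{"speed": 2.0}], env)    # inside the envelope
bad = acceptance_check([{"speed": 9.9}], env)   # out of range -> rejected
```

    The experiment's oracle plays the role of ground truth against which such envelope violations are scored as true or false detections.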

  5. Analysis of Software Test Item Generation- Comparison Between High Skilled and Low Skilled Engineers

    Masayuki Hirayama; Osamu Mizuno; Tohru Kikuno

    2005-01-01

    Recent software systems contain many functions to provide various services. As a consequence, it is difficult to ensure software quality and to eliminate crucial faults with conventional software testing methods. Taking the effect of a test engineer's skill on test item generation into consideration, we propose a new test item generation method, which supports the generation of test items for illegal behavior of the system. The proposed method generates test items based on use-case analysis, deviation analysis of legal behavior, and fault tree analysis of system fault situations. From the results of experimental applications of our method, we confirmed that test items for illegal behavior of a system were generated effectively, and also that the proposed method can assist test item generation by an engineer with a low skill level.
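    The deviation-analysis step can be illustrated with a sketch that derives illegal-behavior test items from one legal use-case sequence by dropping a step, swapping adjacent steps, and injecting a foreign event. The event names are invented, and the paper's fault tree analysis step is not shown.

```python
def deviation_test_items(legal_sequence, event_pool):
    """Derive illegal-behavior test items from a legal event sequence via
    three simple deviations: drop a step, swap adjacent steps, and inject
    a foreign event (a simplified form of deviation analysis)."""
    items = []
    n = len(legal_sequence)
    for i in range(n):                         # drop step i
        items.append(legal_sequence[:i] + legal_sequence[i + 1:])
    for i in range(n - 1):                     # swap adjacent steps
        s = list(legal_sequence)
        s[i], s[i + 1] = s[i + 1], s[i]
        items.append(s)
    for ev in event_pool:                      # inject a foreign event up front
        if ev not in legal_sequence:
            items.append([ev] + legal_sequence)
    return items

legal = ["insertCard", "enterPIN", "withdraw"]
items = deviation_test_items(legal, ["reset"])
```

    Each generated sequence probes how the system reacts to behavior outside its legal use cases, which is exactly the class of test items a low-skilled engineer tends to miss.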

  6. Psychosocial Risks Generated By Assets Specific Design Software

    Remus, Furtună; Angela, Domnariu; Petru, Lazăr

    2015-07-01

    Human work activity results from the interaction of psycho-biological, socio-cultural and organizational-occupational factors. Technological development, automation and computerization, which are found in all branches of activity, the speed at which things develop, and their growing complexity demand fewer physical aptitudes and more cognitive qualifications. A person included in the work process is in most cases bound to adapt to the organizational-occupational situations specific to the demands of the job. The role of the programmer is essential in the execution of commissioned software, and truly brilliant ideas can only come from well-rested minds concentrated on their tasks. The actual requirements of these jobs, besides a large number of benefits and opportunities, also create a series of psycho-social risks, which can increase the level of stress during work activity, especially for those who work under pressure.

  7. Learning from examples - Generation and evaluation of decision trees for software resource analysis

    Selby, Richard W.; Porter, Adam A.

    1988-01-01

    A general solution method for the automatic generation of decision (or classification) trees is investigated. The approach is to provide insights through in-depth empirical characterization and evaluation of decision trees for software resource data analysis. The trees identify classes of objects (software modules) that had high development effort. Sixteen software systems ranging from 3,000 to 112,000 source lines were selected for analysis from a NASA production environment. The collection and analysis of 74 attributes (or metrics), for over 4,700 objects, captured information about the development effort, faults, changes, design style, and implementation style. A total of 9,600 decision trees were automatically generated and evaluated. The trees correctly identified 79.3 percent of the software modules that had high development effort or faults, and the trees generated from the best parameter combinations correctly identified 88.4 percent of the modules on the average.
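    The core of decision tree generation over module metrics can be illustrated with a one-level tree (a decision stump) that picks the metric/threshold split minimizing misclassifications. The module data below are invented; a full tree generator would recurse on each resulting partition.

```python
def best_split(rows, labels):
    """Pick the (metric, threshold) split that best separates high-effort
    modules from the rest by counting misclassifications. This one-level
    split is the building block of decision tree generation."""
    best = None
    for m in rows[0].keys():
        for t in sorted({r[m] for r in rows}):
            # predict "high effort" when the metric exceeds the threshold
            errors = sum((r[m] > t) != lab for r, lab in zip(rows, labels))
            if best is None or errors < best[0]:
                best = (errors, m, t)
    return best[1], best[2]

# Toy module data: source lines and change count -> high development effort?
modules = [{"loc": 200, "changes": 3}, {"loc": 1500, "changes": 20},
           {"loc": 900, "changes": 15}, {"loc": 150, "changes": 2}]
high_effort = [False, True, True, False]
metric, threshold = best_split(modules, high_effort)
```

    On this toy data a single split on source lines already classifies every module correctly; the study's trees combine many such splits over 74 attributes.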

  8. Generating Safety-Critical PLC Code From a High-Level Application Software Specification

    2008-01-01

    The benefits of automatic-application code generation are widely accepted within the software engineering community. These benefits include raised abstraction level of application programming, shorter product development time, lower maintenance costs, and increased code quality and consistency. Surprisingly, code generation concepts have not yet found wide acceptance and use in the field of programmable logic controller (PLC) software development. Software engineers at Kennedy Space Center recognized the need for PLC code generation while developing the new ground checkout and launch processing system, called the Launch Control System (LCS). Engineers developed a process and a prototype software tool that automatically translates a high-level representation or specification of application software into ladder logic that executes on a PLC. All the computer hardware in the LCS is planned to be commercial off the shelf (COTS), including industrial controllers or PLCs that are connected to the sensors and end items out in the field. Most of the software in LCS is also planned to be COTS, with only small adapter software modules that must be developed in order to interface between the various COTS software products. A domain-specific language (DSL) is a programming language designed to perform tasks and to solve problems in a particular domain, such as ground processing of launch vehicles. The LCS engineers created a DSL for developing test sequences of ground checkout and launch operations of future launch vehicle and spacecraft elements, and they are developing a tabular specification format that uses the DSL keywords and functions familiar to the ground and flight system users. The tabular specification format, or tabular spec, allows most ground and flight system users to document how the application software is intended to function and requires little or no software programming knowledge or experience. A small sample from a prototype tabular spec application is
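    A toy version of the tabular-spec-to-PLC translation might look like the sketch below, which emits IEC 61131-3 style Structured Text rather than ladder logic for readability. The column names and condition/action syntax are invented for illustration; the LCS tool's actual tabular format is not described in this record.

```python
# A hypothetical tabular spec: each row maps a condition on a sensor
# to a command on an end item (column names invented for illustration).
SPEC = [
    {"step": 1, "condition": "tank_pressure > 100", "action": "open_vent := TRUE"},
    {"step": 2, "condition": "tank_pressure <= 90", "action": "open_vent := FALSE"},
]

def generate_structured_text(spec):
    """Translate the tabular spec into IEC 61131-3 style Structured Text.
    (A production generator targeting a PLC would emit ladder logic; ST is
    used here because it is easy to emit as plain text.)"""
    lines = ["(* auto-generated - do not edit by hand *)"]
    for row in sorted(spec, key=lambda r: r["step"]):
        lines.append(f"IF {row['condition']} THEN")
        lines.append(f"    {row['action']};")
        lines.append("END_IF;")
    return "\n".join(lines)

code = generate_structured_text(SPEC)
```

    The point of such a generator is exactly what the record argues: ground and flight system users edit the table, and the ladder logic (or ST) is produced mechanically and consistently.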

  9. Two 2nd Circuit decisions represent mixed bag on insurance.

    2000-01-21

    The 2nd U.S. Circuit Court of Appeals in New York issued two important rulings within a week on the extent to which the Americans with Disabilities Act (ADA) regulates insurance practices. [Name removed] v. Allstate Life Insurance Co. was a plaintiff-friendly decision, finding that the insurance company illegally refused to sell life insurance to a married couple because of their mental disability, major depression. [Name removed] v. Israel Discount Bank of New York was more defendant friendly and tackled the issue of whether the ADA permits different benefit caps for mental and physical disabilities. PMID:11367226

  10. BIPHASIC TREATMENT OF 2ND CLASS ANGLE ANOMALIES

    C. Romanec

    2011-09-01

    Full Text Available Our approach aims at presenting, based on clinical observations and complementary examinations, the effects of initiating treatment during the mixed dentition period. The objectives include the identification of the optimal time for treatment of Angle Class II/1 and II/2 malocclusions, as well as the therapeutic possibilities for treating Angle Class II malocclusion during the periods of mixed and permanent dentition. The study is based on data collected from 114 clinical cases (69 girls and 45 boys) aged between 7 and 18 years.

  11. 2nd conference on Continuous Media with Microstructure

    Kuczma, Mieczysław

    2016-01-01

    This book presents research advances in the field of Continuous Media with Microstructure and considers the three complementary pillars of mechanical sciences: theory, research and computational simulation. It focuses on the following problems: thermodynamic and mathematical modeling of materials with extensions of classical constitutive laws, single and multicomponent media including modern multifunctional materials, wave propagation, multiscale and multiphysics processes, phase transformations, and porous, granular and composite materials. The book presents the proceedings of the 2nd Conference on Continuous Media with Microstructure, which was held in 2015 in Łagów, Poland, in memory of Prof. Krzysztof Wilmański.

  12. TagGD: fast and accurate software for DNA Tag generation and demultiplexing.

    Paul Igor Costea

    Full Text Available Multiplexing is of vital importance for utilizing the full potential of next generation sequencing technologies. We here report TagGD (DNA-based Tag Generator and Demultiplexor, a fully-customisable, fast and accurate software package that can generate thousands of barcodes satisfying user-defined constraints and can guarantee full demultiplexing accuracy. The barcodes are designed to minimise their interference with the experiment. Insertion, deletion and substitution events are considered when designing and demultiplexing barcodes. 20,000 barcodes of length 18 were designed in 5 minutes and 2 million barcoded Illumina HiSeq-like reads generated with an error rate of 2% were demultiplexed with full accuracy in 5 minutes. We believe that our software meets a central demand in the current high-throughput biology and can be utilised in any field with ample sample abundance. The software is available on GitHub (https://github.com/pelinakan/UBD.git.
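    The barcode-design idea can be sketched with a greedy generator that keeps only candidates at a minimum pairwise substitution distance, plus a nearest-barcode demultiplexer. TagGD itself also handles insertions/deletions and sequence-composition constraints, which this sketch omits.

```python
import random

def hamming(a, b):
    """Substitution distance between two equal-length strings."""
    return sum(x != y for x, y in zip(a, b))

def design_barcodes(n, length=8, min_dist=3, seed=1):
    """Greedily sample DNA barcodes until `n` are found that are pairwise
    separated by at least `min_dist` substitutions, so that up to
    floor((min_dist - 1) / 2) errors per barcode remain correctable."""
    rng = random.Random(seed)
    chosen = []
    while len(chosen) < n:
        cand = "".join(rng.choice("ACGT") for _ in range(length))
        if all(hamming(cand, c) >= min_dist for c in chosen):
            chosen.append(cand)
    return chosen

def demultiplex(read_prefix, barcodes):
    """Assign a read's barcode region to the nearest designed barcode."""
    return min(barcodes, key=lambda b: hamming(read_prefix, b))

codes = design_barcodes(20)
```

    With a minimum distance of 3, any single substitution in a read's barcode is still assigned to the correct sample, which is the guarantee that makes full demultiplexing accuracy possible.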

  13. 2nd International Conference on Computer Science, Applied Mathematics and Applications

    Thi, Hoai; Nguyen, Ngoc

    2014-01-01

    The proceedings consists of 30 papers which have been selected and invited from the submissions to the 2nd International Conference on Computer Science, Applied Mathematics and Applications (ICCSAMA 2014) held on 8-9 May, 2014 in Budapest, Hungary. The conference is organized into 7 sessions: Advanced Optimization Methods and Their Applications, Queueing Models and Performance Evaluation, Software Development and Testing, Computational Methods for Mobile and Wireless Networks, Computational Methods for Knowledge Engineering, Logic Based Methods for Decision Making and Data Mining, and Nonlinear Systems and Applications, respectively. All chapters in the book discuss theoretical and practical issues connected with computational methods and optimization methods for knowledge engineering. The editors hope that this volume can be useful for graduate and Ph.D. students and researchers in Computer Science and Applied Mathematics. It is the hope of the editors that readers of this volume can find many inspiring idea...

  14. FACTORS GENERATING RISKS DURING REQUIREMENT ENGINEERING PROCESS IN GLOBAL SOFTWARE DEVELOPMENT ENVIRONMENT

    Huma Hayat Khan

    2014-03-01

    Full Text Available The challenges of Requirements Engineering become more acute when it is performed in a global software development paradigm. There can be many reasons for this, and "risks" may be chief among them, since there is greater risk exposure in a global development paradigm. To address this, the factors which actually generate these risks must first be identified. This paper therefore identifies not only the factors, but also the risks which these factors may generate. A systematic literature review is conducted to identify these factors and the risks that may occur during the requirements engineering process in a global software development paradigm. The resulting list supports progressive improvement of requirements engineering activities in global software development. This work is especially useful for less experienced people working in global software development.

  15. Afs password expiration starts Feb 2nd 2004

    2004-01-01

    Due to security reasons, and in agreement with CERN management, afs/lxplus passwords will fall into line with Nice/Mail passwords on February 2nd and expire annually. As of the above date afs account holders who have not changed their passwords for over a year will have a 60 day grace period to make a change. Following this date their passwords will become invalid. What does this mean to you? If you have changed your afs password in the past 10 months the only difference is that 60 days before expiration you will receive a warning message. Similar warnings will also appear nearer the time of expiration. If you have not changed your password for more than 10 months, then, as of February 2nd you will have 60 days to change it using the command 'kpasswd'. Help to choose a good password can be found at: http://security.web.cern.ch/security/passwords/ If you have been given a temporary password at any time by the Helpdesk or registration team this will automatically fall into the expiration category ...

  16. A Practical Comparison of De Novo Genome Assembly Software Tools for Next-Generation Sequencing Technologies

    Zhang, Wenyu; Chen, Jiajia; Yang, Yang; Tang, Yifei; Shang, Jing; Shen, Bairong

    2011-01-01

    The advent of next-generation sequencing technologies is accompanied with the development of many whole-genome sequence assembly methods and software, especially for de novo fragment assembly. Due to the poor knowledge about the applicability and performance of these software tools, choosing a befitting assembler becomes a tough task. Here, we provide the information of adaptivity for each program, then above all, compare the performance of eight distinct tools against eight groups of simulat...

  17. Scoping analysis of the Advanced Test Reactor using SN2ND

    Wolters, E.; Smith, M. (NE NEAMS PROGRAM); ( SC)

    2012-07-26

    A detailed set of calculations was carried out for the Advanced Test Reactor (ATR) using the SN2ND solver of the UNIC code which is part of the SHARP multi-physics code being developed under the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program in DOE-NE. The primary motivation of this work is to assess whether high fidelity deterministic transport codes can tackle coupled dynamics simulations of the ATR. The successful use of such codes in a coupled dynamics simulation can impact what experiments are performed and what power levels are permitted during those experiments at the ATR. The advantages of the SN2ND solver over comparable neutronics tools are its superior parallel performance and demonstrated accuracy on large scale homogeneous and heterogeneous reactor geometries. However, it should be noted that virtually no effort from this project was spent constructing a proper cross section generation methodology for the ATR usable in the SN2ND solver. While attempts were made to use cross section data derived from SCALE, the minimal number of compositional cross section sets were generated to be consistent with the reference Monte Carlo input specification. The accuracy of any deterministic transport solver is impacted by such an approach and clearly it causes substantial errors in this work. The reasoning behind this decision is justified given the overall funding dedicated to the task (two months) and the real focus of the work: can modern deterministic tools actually treat complex facilities like the ATR with heterogeneous geometry modeling. SN2ND has been demonstrated to solve problems with upwards of one trillion degrees of freedom which translates to tens of millions of finite elements, hundreds of angles, and hundreds of energy groups, resulting in a very high-fidelity model of the system unachievable by most deterministic transport codes today. A space-angle convergence study was conducted to determine the meshing and angular cubature

  19. Software module for geometric product modeling and NC tool path generation

    An intelligent CAD/CAM system named VIRTUAL MANUFACTURE has been created. It consists of four intelligent software modules: the module for virtual NC machine creation, the module for geometric product modeling and automatic NC path generation, the module for virtual NC machining, and the module for virtual product evaluation. In this paper the second intelligent software module is presented. This module enables feature-based product modeling, carried out via automatic saving of the designed product's geometric features as knowledge data. The knowledge data are afterwards applied for automatic generation of the NC program for machining the designed product. (Author)
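    The automatic NC path generation step can be illustrated with a toy zig-zag pocketing routine that emits G-code-like moves for a rectangular pocket feature; the feature-based modeling and knowledge-data machinery of the actual module are not represented.

```python
def zigzag_pocket_path(width, height, stepover):
    """Generate a zig-zag NC tool path (as G-code moves) covering a
    rectangular pocket of the given size; a minimal stand-in for
    feature-based NC path generation."""
    moves = ["G00 X0.000 Y0.000"]        # rapid move to the pocket corner
    y, direction = 0.0, 1
    while y <= height + 1e-9:
        x_end = width if direction > 0 else 0.0
        moves.append(f"G01 X{x_end:.3f} Y{y:.3f}")       # cutting pass across
        y += stepover
        if y <= height + 1e-9:
            moves.append(f"G01 X{x_end:.3f} Y{y:.3f}")   # step over to next pass
        direction = -direction
    return moves

path = zigzag_pocket_path(width=10.0, height=4.0, stepover=2.0)
```

    In a feature-based system, the pocket's dimensions and the stepover would come from the saved geometric-feature knowledge data rather than from hand-entered parameters.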

  20. 2nd International Open and Distance Learning (IODL Symposium

    Reviewed by Murat BARKAN

    2006-10-01

    Full Text Available These closing remarks were prepared and presented by Prof. Dr. Murat BARKAN, Anadolu University, Eskisehir, TURKEY. DEAR GUESTS, As the 2nd International Open and Distance Learning (IODL) Symposium is now drawing to an end, I would like to thank you all for your outstanding speeches, distinguished presentations, constructive roundtable and session discussions, and active participation during the last five days. I hope you all share my view that the whole symposium has been a very stimulating and successful experience. Also, on behalf of all the participants, I would like to take this opportunity to thank and congratulate the symposium organization committee for their excellent job in organizing and hosting our 2nd meeting. Throughout the symposium, five workshops, six keynote speeches and 66 papers, which were prepared by more than 150 academicians and practitioners from 23 different countries, reflected remarkable and various views and approaches about open and flexible learning. Besides all these academic endeavors, 13 educational films were displayed during the symposium. The technology exhibition, hosted by seven companies, was very effective in showcasing the current level of the technology and the success of applications of theory into practice. Now I would like to go over what our workshop and keynote presenters shared with us: Prof. Marina McIsaac from Arizona State University dwelled on how to determine research topics worthwhile to be examined and how to choose appropriate research design and methods. She gave us clues on how to get articles published in professional journals. Prof. Colin Latchem from Australia and Prof. Dr. Ali Ekrem Ozkul from Anadolu University pointed to the importance of strategic planning for distance and flexible learning. They highlighted the advantages of strategic planning for policy-makers, planners, managers and staff. Dr. Wolfram Laaser from Fern University of Hagen presented different multimedia clips and

  1. 2nd International Conference on NeuroRehabilitation

    Andersen, Ole; Akay, Metin

    2014-01-01

    The book is the proceedings of the 2nd International Conference on NeuroRehabilitation (ICNR 2014), held 24th-26th June 2014 in Aalborg, Denmark. The conference featured the latest highlights in the emerging and interdisciplinary field of neural rehabilitation engineering and identified important healthcare challenges the scientific community will be faced with in the coming years. Edited and written by leading experts in the field, the book includes keynote papers, regular conference papers, and contributions to special and innovation sessions, covering the following main topics: neuro-rehabilitation applications and solutions for restoring impaired neurological functions; cutting-edge technologies and methods in neuro-rehabilitation; and translational challenges in neuro-rehabilitation. Thanks to its highly interdisciplinary approach, the book will not only be a  highly relevant reference guide for academic researchers, engineers, neurophysiologists, neuroscientists, physicians and physiotherapists workin...

  2. 2nd international conference on advanced nanomaterials and nanotechnology

    Goswami, D; Perumal, A

    2013-01-01

    Nanoscale science and technology have occupied centre stage globally in modern scientific research and discourses in the early twenty-first century. The enabling nature of the technology makes it important in modern electronics, computing, materials, healthcare, energy and the environment. This volume contains selected articles presented (as Invited/Oral/Poster presentations) at the 2nd international conference on advanced materials and nanotechnology (ICANN-2011) held recently at the Indian Institute of Technology Guwahati, during Dec 8-10, 2011. The list of topics covered in these proceedings includes: Synthesis and self assembly of nanomaterials, Nanoscale characterisation, Nanophotonics & Nanoelectronics, Nanobiotechnology, Nanocomposites, Nanomagnetism, Nanomaterials for Energy, Computational Nanotechnology, Commercialization of Nanotechnology. The conference was represented by around 400 participants from several countries including delegates invited from USA, Germany, Japan, UK, Taiwan, Italy, Singapor...

  3. Isotope effects on vapour phase 2nd viral coefficients

    Vapor phase 2nd virial coefficient isotope effects (VCIEs) are interpreted. A useful correlation is developed between -Δ(B-b0)/(B-b0) = -VCIE and the reference condensed phase reduced isotopic partition function ratio [ln(fc/fg)]*. B is the second virial coefficient, b0 = 2πσ³/3, σ is the Lennard-Jones size parameter, and Δ is an isotopic difference, light-heavy. [ln(fc/fg)]* can be obtained from vapor pressure isotope effects for T/TCRITICAL ... ln(fp/f2g), where ln(fp/f2g) is the reduced isotopic partition function ratio describing the equilibrium between monomers and interacting pairs. At temperatures well removed from crossovers in ln(fp/f2g) or [ln(fc/fg)]*, ln(fp/f2g) = (0.4±0.2)[ln(fc/fg)]*. (author)
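    Written out in display form (restating only the relations given in the abstract; the elided condition on T/TCRITICAL is left as in the record):

```latex
% B: second virial coefficient; b_0 = 2\pi\sigma^3/3 with \sigma the
% Lennard-Jones size parameter; \Delta: isotopic difference, light - heavy.
-\frac{\Delta\,(B - b_0)}{B - b_0} \;=\; -\mathrm{VCIE},
\qquad
\ln\frac{f_p}{f_{2g}} \;=\; (0.4 \pm 0.2)\left[\ln\frac{f_c}{f_g}\right]^{*}
```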

  4. 2nd International Congress on Neurotechnology, Electronics and Informatics

    Encarnação, Pedro

    2016-01-01

    This book is a timely report on current neurotechnology research. It presents a snapshot of the state of the art in the field, discusses current challenges and identifies new directions. The book includes a selection of extended and revised contributions presented at the 2nd International Congress on Neurotechnology, Electronics and Informatics (NEUROTECHNIX 2014), held October 25-26 in Rome, Italy. The chapters are varied: some report on novel theoretical methods for studying neuronal connectivity or neural system behaviour; others report on advanced technologies developed for similar purposes; while further contributions concern new engineering methods and technological tools supporting medical diagnosis and neurorehabilitation. All in all, this book provides graduate students, researchers and practitioners dealing with different aspects of neurotechnologies with a unified view of the field, thus fostering new ideas and research collaborations among groups from different disciplines.

  5. 2nd International Afro-European Conference for Industrial Advancement

    Wegrzyn-Wolska, Katarzyna; Hassanien, Aboul; Snasel, Vaclav; Alimi, Adel

    2016-01-01

    This volume contains papers presented at the 2nd International Afro-European Conference for Industrial Advancement -- AECIA 2015. The conference aimed at bringing together the foremost experts and excellent young researchers from Africa, Europe and the rest of the world to disseminate the latest results from various fields of engineering, information, and communication technologies. The topics discussed at the conference covered a broad range of domains spanning from ICT and engineering to prediction, modeling, and analysis of complex systems. The 2015 edition of AECIA featured a distinguished special track on prediction, modeling and analysis of complex systems -- Nostradamus, and special sessions on Advances in Image Processing and Colorization and Data Processing, Protocols, and Applications in Wireless Sensor Networks.

  6. 2nd CEAS Specialist Conference on Guidance, Navigation and Control

    Mulder, Bob; Choukroun, Daniel; Kampen, Erik-Jan; Visser, Coen; Looye, Gertjan

    2013-01-01

    Following the successful 1st CEAS (Council of European Aerospace Societies) Specialist Conference on Guidance, Navigation and Control (CEAS EuroGNC) held in Munich, Germany in 2011, Delft University of Technology gladly accepted the invitation to organize the 2nd CEAS EuroGNC in Delft, The Netherlands in 2013. The goal of the conference is to promote new advances in aerospace GNC theory and technologies for enhancing safety, survivability, efficiency, performance, autonomy and intelligence of aerospace systems using on-board sensing, computing and systems. A great push for new developments in GNC is the ever higher safety and sustainability requirements in aviation. Impressive progress was made in new research fields such as sensor and actuator fault detection and diagnosis, reconfigurable and fault tolerant flight control, online safe flight envelope prediction and protection, online global aerodynamic model identification, online global optimization and flight upset recovery. All of these challenges de...

  7. 2nd International Multidisciplinary Microscopy and Microanalysis Congress

    Oral, Ahmet; Ozer, Mehmet

    2015-01-01

    The 2nd International Multidisciplinary Microscopy and Microanalysis Congress & Exhibition (InterM 2014) was held on 16–19 October 2014 in Oludeniz, Fethiye/Mugla, Turkey. The aim of the congress was to gather scientists from various branches and discuss the latest improvements in the field of microscopy. The focus of the congress has been widened in an "interdisciplinary" manner, so as to allow all scientists working on several related subjects to participate and present their work. These proceedings include 33 peer-reviewed technical papers, submitted by leading academic and research institutions from over 17 countries and representing some of the most cutting-edge research available. The papers were presented at the congress in the following sessions: Applications of Microscopy in the Physical Sciences, and Applications of Microscopy in the Biological Sciences.

  8. 2nd International Conference on Communication and Computer Engineering

    Othman, Mohd; Othman, Mohd; Rahim, Yahaya; Pee, Naim

    2016-01-01

    This book covers diverse aspects of advanced computer and communication engineering, focusing specifically on industrial and manufacturing theory and applications of electronics, communications, computing and information technology. Experts in research, industry, and academia present the latest developments in technology, describe applications involving cutting-edge communication and computer systems, and explore likely future trends. In addition, a wealth of new algorithms that assist in solving computer and communication engineering problems are presented. The book is based on presentations given at ICOCOE 2015, the 2nd International Conference on Communication and Computer Engineering. It will appeal to a wide range of professionals in the field, including telecommunication engineers, computer engineers and scientists, researchers, academics and students.

  9. 2nd International Conference on Harmony Search Algorithm

    Geem, Zong

    2016-01-01

    The Harmony Search Algorithm (HSA) is one of the most well-known techniques in the field of soft computing, an important paradigm in the science and engineering community.  This volume, the proceedings of the 2nd International Conference on Harmony Search Algorithm 2015 (ICHSA 2015), brings together contributions describing the latest developments in the field of soft computing with a special focus on HSA techniques. It includes coverage of new methods that have potentially immense application in various fields. Contributed articles cover aspects of the following topics related to the Harmony Search Algorithm: analytical studies; improved, hybrid and multi-objective variants; parameter tuning; and large-scale applications.  The book also contains papers discussing recent advances on the following topics: genetic algorithms; evolutionary strategies; the firefly algorithm and cuckoo search; particle swarm optimization and ant colony optimization; simulated annealing; and local search techniques.   This book ...

  10. 2nd International Conference on Construction and Building Research

    Fernández-Plazaola, Igor; Hidalgo-Delgado, Francisco; Martínez-Valenzuela, María; Medina-Ramón, Francisco; Oliver-Faubel, Inmaculada; Rodríguez-Abad, Isabel; Salandin, Andrea; Sánchez-Grandia, Rafael; Tort-Ausina, Isabel; Construction and Building Research

    2014-01-01

    Many areas of knowledge converge in the building industry and therefore research in this field necessarily involves an interdisciplinary approach. Effective research requires strong relations between a broad variety of scientific and technological domains and more conventional construction or craft processes, while also considering advanced management processes, where all the main actors permanently interact. This publication takes an interdisciplinary approach grouping various studies on the building industry chosen from among the works presented for the 2nd International Conference on Construction and Building Research. The papers examine aspects of materials and building systems; construction technology; energy and sustainability; construction management; heritage, refurbishment and conservation. The information contained within these pages may be of interest to researchers and practitioners in construction and building activities from the academic sphere, as well as public and private sectors.

  11. 2nd Colombian Congress on Computational Biology and Bioinformatics

    Cristancho, Marco; Isaza, Gustavo; Pinzón, Andrés; Rodríguez, Juan

    2014-01-01

    This volume compiles accepted contributions for the 2nd Edition of the Colombian Computational Biology and Bioinformatics Congress CCBCOL, after a rigorous review process in which 54 papers were accepted for publication from 119 submitted contributions. Bioinformatics and Computational Biology are areas of knowledge that have emerged due to advances that have taken place in the Biological Sciences and its integration with Information Sciences. The expansion of projects involving the study of genomes has led the way in the production of vast amounts of sequence data which needs to be organized, analyzed and stored to understand phenomena associated with living organisms related to their evolution, behavior in different ecosystems, and the development of applications that can be derived from this analysis.

  12. Advanced User Interface Generation in the Software Framework for Magnetic Measurements at CERN

    Arpaia, P; La Commara, Giuseppe; Arpaia, Pasquale

    2010-01-01

    A model-based approach, the Model-View-Interactor Paradigm, for automatic generation of user interfaces in software frameworks for measurement systems is proposed. The Model-View-Interactor Paradigm is focused on the "interaction" typical in a software framework for measurement applications: the final user interacts with the automatic measurement system by executing a suitable high-level script previously written by a test engineer. According to the main design goal of frameworks, the proposed approach allows the user interfaces to be separated easily from the application logic, enhancing the flexibility and reusability of the software. As a practical case study, this approach has been applied to the flexible software framework for magnetic measurements at the European Organization for Nuclear Research (CERN). In particular, experimental results for the scenario of permeability measurements are reported.
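The separation the paradigm describes can be sketched in a few lines: the framework derives the user-facing view purely from the script's declared parameters, keeping application logic out of the UI layer. The class and function names below are illustrative assumptions, not the CERN framework's actual API.

```python
# Sketch: a test engineer's script declares its parameters; the framework
# generates the view from that declaration alone (names are hypothetical).
class MeasurementScript:
    parameters = {"current_A": float, "n_cycles": int}

    def run(self, current_A, n_cycles):
        # Application logic lives here, independent of any UI code.
        return f"measured at {current_A} A for {n_cycles} cycles"

def generate_view(script_cls):
    # A real framework would emit GUI widgets; here we emit text prompts
    # derived from the declared parameter names and types.
    return [f"[input:{t.__name__}] {name}"
            for name, t in script_cls.parameters.items()]

view = generate_view(MeasurementScript)   # view generated from the model
result = MeasurementScript().run(5.0, 3)  # logic invoked separately
```

Because the view is derived from the declaration rather than hand-written, swapping in a different script regenerates the interface with no UI code changes.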

  13. Using Genetic Algorithm for Automated Efficient Software Test Case Generation for Path Testing

    Premal B. Nirpal

    2011-05-01

    This paper discusses a genetic algorithm that can automatically generate test cases to exercise a selected path. The algorithm takes a selected path as a target and iteratively applies genetic operators so that test cases evolve; the evolved test case leads the program execution to the target path. Automatic path-oriented test data generation is not only a crucial problem but also a hot issue in software testing research today.
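The loop described above, selecting a target path and evolving inputs with genetic operators, can be sketched as follows. The fitness function, the branch condition and all parameter values are illustrative assumptions, not the paper's actual benchmark.

```python
import random

random.seed(0)  # reproducible for illustration

# Hypothetical target path: the branch `a > b and a + b == 10`.
def fitness(ind):
    a, b = ind
    # Branch-distance style fitness: 0 means the target path is taken.
    return (0 if a > b else b - a + 1) + abs(a + b - 10)

def evolve(pop_size=50, generations=200):
    pop = [(random.randint(-100, 100), random.randint(-100, 100))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        if fitness(pop[0]) == 0:
            return pop[0]                      # test case reaches the path
        parents = pop[: pop_size // 2]         # selection
        children = []
        while len(children) < pop_size - len(parents):
            (a1, b1), (a2, b2) = random.sample(parents, 2)
            child = (a1, b2)                   # one-point crossover
            if random.random() < 0.3:          # mutation
                child = (child[0] + random.randint(-5, 5), child[1])
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)

best = evolve()  # an input pair driven toward the target path
```

Inputs such as (7, 3) satisfy both branch conditions and receive fitness 0, so selection pressure drives the population toward them.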

  14. New Software Developments for Quality Mesh Generation and Optimization from Biomedical Imaging Data

    Yu, Zeyun; Wang, Jun; Gao, Zhanheng; Xu, Ming; Hoshijima, Masahiko

    2013-01-01

    In this paper we present a new software toolkit for generating and optimizing surface and volumetric meshes from three-dimensional (3D) biomedical imaging data, targeted at image-based finite element analysis of some biomedical activities in a single material domain. Our toolkit includes a series of geometric processing algorithms including surface re-meshing and quality-guaranteed tetrahedral mesh generation and optimization. All methods described have been encapsulated into a user-friendly ...

  15. Minimal Testcase Generation for Object-Oriented Software with State Charts

    Ranjita Kumari Swain

    2012-08-01

    Today statecharts are a de facto industry standard for modeling system behavior. Test data generation is one of the key issues in software testing. This paper proposes a reduction approach to test data generation for state-based software testing. First, a state transition graph is derived from the statechart diagram. Then all the required information is extracted from the statechart diagram and test cases are generated. Lastly, the set of test cases is minimized by calculating the node coverage for each test case; it is also determined which test cases are covered by other test cases. The advantage of our test generation technique is that it optimizes test coverage while minimizing time and cost. The present test data generation scheme produces test cases which satisfy the transition path coverage criterion, path coverage criterion and action coverage criterion. A case study on a Railway Ticket Vending Machine (RTVM) has been presented to illustrate our approach.
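The minimization step, dropping test cases whose node coverage is subsumed by others, amounts to a greedy set cover over the covered nodes. A minimal sketch with made-up test names and node IDs, not the paper's actual case study:

```python
# Each test case maps to the set of statechart nodes it covers.
suite = {
    "t1": {"S0", "S1", "S2"},
    "t2": {"S1", "S2"},          # subsumed by t1
    "t3": {"S2", "S3", "S4"},
    "t4": {"S4"},                # subsumed by t3
}

def minimize(suite):
    required = set().union(*suite.values())
    chosen, covered = [], set()
    # Greedily pick the test covering the most still-uncovered nodes.
    while covered != required:
        name = max(suite, key=lambda t: len(suite[t] - covered))
        if not suite[name] - covered:
            break                # no test adds coverage; stop
        chosen.append(name)
        covered |= suite[name]
    return chosen

print(minimize(suite))  # prints ['t1', 't3']
```

The subsumed cases t2 and t4 are dropped because every node they cover is already covered by the selected tests.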

  16. Software R&D for Next Generation of HEP Experiments, Inspired by Theano

    CERN. Geneva

    2015-01-01

    In the next decade, the frontiers of High Energy Physics (HEP) will be explored by three machines: the High Luminosity Large Hadron Collider (HL-LHC) in Europe, the Long-Baseline Neutrino Facility (LBNF) in the US, and the International Linear Collider (ILC) in Japan. These next-generation experiments must address two fundamental problems in the current generation of HEP experimental software: the inability to take advantage of and adapt to the rapidly evolving processor landscape, and the difficulty of developing and maintaining increasingly complex software systems by physicists. I will propose a strategy, inspired by the automatic optimization and code generation in Theano, to address both problems simultaneously. I will describe three R&D projects with short-term physics deliverables aimed at developing this strategy. The first project is to develop a maximally sensitive General Search for New Physics at the LHC by applying the Matrix Element Method running on GPUs of HPCs. The second is to classify and reconstru...

  17. Reliable Mining of Automatically Generated Test Cases from Software Requirements Specification (SRS)

    Raamesh, Lilly

    2010-01-01

    Writing requirements is a two-way process. In this paper we classify Functional Requirements (FR) and Non-Functional Requirements (NFR) statements from Software Requirements Specification (SRS) documents. These are systematically transformed into state charts considering all relevant information. The paper outlines how test cases can be automatically generated from these state charts. Application of the states yields the different test cases as solutions to a planning problem. The test cases can be used for automated or manual software testing at system level. The paper also presents a method for reducing the test suite by using mining methods, thereby facilitating mining and knowledge extraction from test cases.

  18. Photonic generation and independent steering of multiple RF signals for software defined radars.

    Ghelfi, Paolo; Laghezza, Francesco; Scotti, Filippo; Serafino, Giovanni; Pinna, Sergio; Bogoni, Antonella

    2013-09-23

    As the improvement of radar systems calls for digital approaches, photonics is becoming a solution for software-defined, high-frequency and high-stability signal generation. We report on our recent activities on the photonic generation of flexible wideband RF signals, extending the proposed architecture to the independent optical beamforming of multiple signals. The scheme has been tested by generating two wideband signals at 10 GHz and 40 GHz and controlling their independent delays at two antenna elements. Thanks to these multiple functionalities, the proposed scheme makes it possible to improve the effectiveness of the photonic approach, reducing its cost while providing flexibility, extremely wide bandwidth, and high stability. PMID:24104176
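The steering itself reduces to computing a true-time delay per antenna element, which the optical beamformer then realizes in hardware. A minimal sketch of that geometry, with illustrative element spacings rather than the experiment's actual values:

```python
import math

C = 3.0e8  # speed of light, m/s

def element_delays(n_elements, spacing_m, angle_deg):
    """True-time delays (s) steering a uniform linear array to angle_deg.
    Element n is delayed by n * d * sin(theta) / c relative to element 0."""
    theta = math.radians(angle_deg)
    return [n * spacing_m * math.sin(theta) / C for n in range(n_elements)]

# Two signals steered independently at two elements; spacings are
# hypothetical half-wavelength values for 10 GHz and 40 GHz carriers.
delays_10ghz = element_delays(2, spacing_m=0.015, angle_deg=30)
delays_40ghz = element_delays(2, spacing_m=0.00375, angle_deg=-10)
```

For the 10 GHz case above, the second element's delay comes out to roughly 25 ps, i.e. the picosecond-scale precision that motivates an optical implementation.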

  19. The design of real time infrared image generation software based on Creator and Vega

    Wang, Rui-feng; Wu, Wei-dong; Huo, Jun-xiu

    2013-09-01

    To meet an infrared image simulation's requirements for highly realistic, real-time dynamic infrared imagery, a method for designing a real-time infrared image simulation application on the VC++ platform is proposed, based on the visual simulation software Creator and Vega. The functions of Creator are introduced briefly, and the main features of the Vega development environment are analyzed. Methods for infrared modeling and backgrounds are offered; the flow chart of the development process of the real-time IR image generation software and the functions of the TMM Tool, MAT Tool and sensor module are explained; and the software's real-time performance is addressed.

  20. APTWG: 2nd Asia-Pacific Transport Working Group Meeting

    This conference report summarizes the contributions to and discussions at the 2nd Asia-Pacific Transport Working Group Meeting held in Chengdu, China, from 15 to 18 May 2012. The topics of the meeting were organized under five main headings: momentum transport, non-locality in transport, edge turbulence and L–H transition, three-dimensional effects on transport physics, and particle, momentum and heat pinches. It is found that lower hybrid waves and ion cyclotron waves induce co-current rotation while electron cyclotron waves induce counter-current rotation. A four-stage picture of the low (L) to high (H) confinement transition is gradually emerging, and more detailed verification is keenly anticipated. The new edge-localized mode mitigation technique with supersonic molecular beam injection proved to be effective to some extent on HL-2A and KSTAR. It is also found that low collisionality, the trapped electron mode to ion temperature gradient transition (or transition of higher to lower density and temperature gradients), fuelling and lithium coating favour inward pinch of particles in tokamak plasmas. (paper)

  1. Book review: Psychology in a work context (2nd Ed.

    Nanette Tredoux

    2003-10-01

    Bergh, Z. & Theron, A.L. (Eds.) (2003). Psychology in a work context (2nd Ed.). Cape Town: Oxford University Press. This book is an overview and introduction to Industrial and Organisational Psychology. It is a work of ambitious scope, and it is clear that the contributors have invested a great deal of thought and effort in the planning and execution of the book. The current version is the second edition, and it looks set to become one of those standard textbooks that are revised every few years to keep up with changing times. It is a handsome volume, produced to a high standard of editorial care, pleasingly laid out and organised well enough to be useful as an occasional reference source. An English-Afrikaans glossary, tables of contents for every chapter as well as for the entire book, a comprehensive index and an extensive bibliography make it easy to retrieve the information relating to a particular topic. Every chapter ends with a conclusion summarising the gist of the material covered. Quality illustrations lighten the tone and help to bring some of the concepts to life. Learning outcomes and self-assessment exercises and questions for every chapter will be useful to the lecturer using the book as a source for a tutored course, and to the student studying by distance learning. If sold at the suggested retail price, the book represents good value compared to imported textbooks that cover similar ground.

  2. PREFACE: 2nd International Symposium "Optics and its Applications"

    Calvo, Maria L.; Dolganova, Irina N.; Gevorgyan, Narine; Guzman, Angela; Papoyan, Aram; Sarkisyan, Hayk; Yurchenko, Stanislav

    2016-01-01

    The ICTP smr2633: 2nd International Symposium "Optics and its Applications" (OPTICS-2014) http://indico.ictp.it/event/a13253/ was held in Yerevan and Ashtarak, Armenia, on 1-5 September 2014. The Symposium was organized by the Abdus Salam International Center for Theoretical Physics (ICTP) with the collaboration of the SPIE Armenian Student Chapter, the Armenian TC of ICO, the Russian-Armenian University (RAU), the Institute for Physical Research of the National Academy of Sciences of Armenia (IPR of NAS), the Greek-Armenian industrial company LT-Pyrkal, and the Yerevan State University (YSU). The Symposium was co-organized by the BMSTU SPIE & OSA student chapters. The International Symposium OPTICS-2014 was dedicated to the 50th anniversary of the Abdus Salam International Center for Theoretical Physics. This symposium "Optics and its Applications" was the First Official ICTP Scientific Event in Armenia. The presentations at OPTICS-2014 were centered on these topics: optical properties of nanostructures; quantum optics & information; singular optics and its applications; laser spectroscopy; strong field optics; nonlinear & ultrafast optics; photonics & fiber optics; optics of liquid crystals; and mathematical methods in optics.

  3. Next generation of decision making software for nanopatterns characterization: application to semiconductor industry

    Dervilllé, A.; Labrosse, A.; Zimmermann, Y.; Foucher, J.; Gronheid, R.; Boeckx, C.; Singh, A.; Leray, P.; Halder, S.

    2016-03-01

    The dimensional scaling in IC manufacturing strongly drives the demands on CD and defect metrology techniques and their measurement uncertainties. Defect review has become as important as CD metrology, and together they create a new metrology paradigm with a completely new need for flexible, robust and scalable metrology software. Current software architectures and metrology algorithms perform well, but they must be pushed to a higher level in order to keep pace with roadmap speed and requirements. For example: manage defects and CD in a one-step algorithm, customize algorithms and output features for each R&D team environment, and provide software updates every day or every week so that R&D teams can easily explore various development strategies. The final goal is to avoid spending hours and days manually tuning algorithms to analyze metrology data, and to allow R&D teams to stay focused on their expertise. The benefits are drastic cost reductions, more efficient R&D teams and better process quality. In this paper, we propose a new generation of software platform and development infrastructure which can integrate specific metrology business modules. For example, we will show the integration of a chemistry module dedicated to electronics materials such as Directed Self-Assembly features. We will show a new generation of image analysis algorithms which are able to manage defect rates, image classifications, CD and roughness measurements at the same time, with high-throughput performance compatible with HVM. In a second part, we will assess the reliability, the customization of algorithms and the software platform's capability to meet new semiconductor metrology software requirements: flexibility, robustness, high throughput and scalability. Finally, we will demonstrate how such an environment has allowed a drastic reduction of data analysis cycle time.

  4. Software for Generating Troposphere Corrections for InSAR Using GPS and Weather Model Data

    Moore, Angelyn W.; Webb, Frank H.; Fishbein, Evan F.; Fielding, Eric J.; Owen, Susan E.; Granger, Stephanie L.; Bjoerndahl, Fredrik; Loefgren, Johan; Fang, Peng; Means, James D.; Bock, Yehuda; Tong, Xiaopeng

    2013-01-01

    Atmospheric errors due to the troposphere are a limiting error source for spaceborne interferometric synthetic aperture radar (InSAR) imaging. This software generates tropospheric delay maps that can be used to correct atmospheric artifacts in InSAR data. The software automatically acquires all needed GPS (Global Positioning System), weather, and Digital Elevation Model data, and generates a tropospheric correction map using a novel algorithm for combining GPS and weather information while accounting for terrain. Existing JPL software was prototypical in nature, required a MATLAB license, required additional steps to acquire and ingest needed GPS and weather data, and did not account for topography in interpolation. Previous software did not achieve a level of automation suitable for integration in a Web portal. This software overcomes these issues. GPS estimates of tropospheric delay are a source of corrections that can be used to form correction maps to be applied to InSAR data, but the spacing of GPS stations is insufficient to remove short-wavelength tropospheric artifacts. This software combines interpolated GPS delay with weather model precipitable water vapor (PWV) and a digital elevation model to account for terrain, increasing the spatial resolution of the tropospheric correction maps and thus removing short-wavelength tropospheric artifacts to a greater extent. It will be integrated into a Web portal request system, allowing use in a future L-band SAR Earth radar mission data system. This will be a significant contribution to its technology readiness, building on existing investments in in situ space geodetic networks, and improving timeliness, quality, and science value of the collected data.
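The combination step can be sketched roughly as follows: interpolate the station zenith delays horizontally, then rescale to each pixel's DEM height. This is a simplification under assumed parameters (inverse-distance weighting, a 2 km wet-delay scale height), not the JPL algorithm itself:

```python
import math

H_SCALE_M = 2000.0  # assumed wet-delay scale height (illustrative value)

def idw(x, y, stations):
    """Inverse-distance-weighted interpolation of station zenith delays.
    stations: iterable of (x, y, zenith_delay_m) tuples."""
    num = den = 0.0
    for sx, sy, delay in stations:
        d2 = (x - sx) ** 2 + (y - sy) ** 2
        if d2 == 0:
            return delay           # query point sits on a station
        w = 1.0 / d2
        num += w * delay
        den += w
    return num / den

def corrected_delay(x, y, height_m, stations, station_height_m=0.0):
    """Map the interpolated reference-level delay to the pixel's DEM height
    with a simple exponential decay of the tropospheric delay."""
    z0 = idw(x, y, stations)
    return z0 * math.exp(-(height_m - station_height_m) / H_SCALE_M)

# Three hypothetical stations on a 10 km grid, delays in meters:
stations = [(0.0, 0.0, 2.40), (10.0, 0.0, 2.50), (0.0, 10.0, 2.30)]
d = corrected_delay(5.0, 5.0, 1000.0, stations)
```

Folding in weather-model PWV would refine the vertical profile beyond this single-scale-height assumption, which is the gain the text attributes to combining the two data sources.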

  5. Software architecture for control and data acquisition of linear plasma generator Magnum-PSI

    Highlights: ► An architecture based on a modular design. ► The design offers flexibility and extendability. ► The design covers the overall software architecture. ► It also covers its (sub)systems' internal structure. Abstract: The FOM Institute DIFFER – Dutch Institute for Fundamental Energy Research has completed the construction phase of Magnum-PSI, a magnetized, steady-state, large area, high-flux linear plasma beam generator to study plasma surface interactions under ITER divertor conditions. Magnum-PSI consists of several hardware subsystems, and a variety of diagnostic systems. The COntrol, Data Acquisition and Communication (CODAC) system integrates these subsystems and provides a complete interface for the Magnum-PSI users. Integrating it all, from the lowest hardware level of sensors and actuators, via the level of networked PLCs and computer systems, up to functions and classes in programming languages, demands a sound and modular software architecture, which is extendable and scalable for future changes. This paper describes this architecture, and the modular design of the software subsystems. The design is implemented in the CODAC system at the level of services and subsystems (the overall software architecture), as well as internally in the software subsystems.

  6. Resolution 519/012. R DEL SUR S.A. is authorized to generate electricity from wind at a power plant located in the 2nd and 4th cadastral sections of Maldonado province, and to connect to the National Interconnected System

    Resolution 519 accords with the Electric Wholesale Market regulation and authorizes power generation using wind as the primary source. The project was presented by the company R DEL SUR S.A. with the aim of installing a wind power plant in Maldonado province.

  7. Aptaligner: Automated Software for Aligning Pseudorandom DNA X-Aptamers from Next-Generation Sequencing Data

    Lu, Emily; Elizondo-Riojas, Miguel-Angel; Chang, Jeffrey T.; Volk, David E.

    2014-01-01

    Next-generation sequencing results from bead-based aptamer libraries have demonstrated that traditional DNA/RNA alignment software is insufficient. This is particularly true for X-aptamers containing specialty bases (W, X, Y, Z, ...) that are identified by special encoding. Thus, we sought an automated program that uses the inherent design scheme of bead-based X-aptamers to create a hypothetical reference library and Markov modeling techniques to provide improved alignments. Aptaligner provid...

  8. Simulation of photovoltaic systems electricity generation using homer software in specific locations in Serbia

    Pavlović Tomislav M.; Milosavljević Dragana D.; Pirsl Danica S.

    2013-01-01

    This paper gives basic information on the HOMER software for simulating PV system electricity generation and on the NASA Surface Meteorology and Solar Energy, RETScreen, PVGIS and HMIRS (Hydrometeorological Institute of the Republic of Serbia) solar databases. The monthly average values of daily solar radiation per square meter received by a horizontal surface, taken from the NASA, RETScreen, PVGIS and HMIRS databases, are compared for three locations in Serbia (Belgrade, Negotin and Zlati...
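Tools of this kind ultimately automate estimates of the form E = A × η × H × PR (array area, module efficiency, daily irradiation, performance ratio). A minimal sketch with placeholder irradiation values, not actual data for the Serbian sites:

```python
def monthly_energy_kwh(area_m2, efficiency, h_kwh_m2_day, perf_ratio, days=30):
    """Monthly PV energy estimate: E = A * eta * H * PR * days.
    h_kwh_m2_day is the monthly average daily irradiation on the array."""
    return area_m2 * efficiency * h_kwh_m2_day * perf_ratio * days

# Hypothetical 10 m^2 array with 15% efficient modules and PR = 0.75;
# the two irradiation values are placeholders for a high and a low month.
e_summer = monthly_energy_kwh(10, 0.15, 6.0, 0.75)
e_winter = monthly_energy_kwh(10, 0.15, 1.5, 0.75)
```

The spread between the two months shows why the choice of irradiation database matters: the estimate scales linearly with H, so database disagreements propagate directly into the predicted yield.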

  9. Towards a Pattern-based Automatic Generation of Logical Specifications for Software Models

    Klimek, Radoslaw

    2014-01-01

    The work relates to the automatic generation of logical specifications, considered as sets of temporal logic formulas, extracted directly from developed software models. The extraction process is based on the assumption that the whole developed model is structured using only predefined workflow patterns. A method of automatic transformation of workflow patterns to logical specifications is proposed. Applying the presented concepts enables bridging the gap between the benefits of deductive rea...
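Such a pattern-to-formula transformation can be sketched as a simple mapping from pattern names to temporal-logic formula strings. The pattern names and the exact formula shapes below are illustrative assumptions, not the paper's actual transformation rules:

```python
# Sketch: each predefined workflow pattern maps to an LTL formula template.
def to_ltl(pattern, *activities):
    if pattern == "sequence":        # a then b: whenever a, eventually b
        a, b = activities
        return f"G({a} -> F({b}))"
    if pattern == "parallel":        # every branch eventually completes
        return " & ".join(f"F({act})" for act in activities)
    raise ValueError(f"unknown pattern: {pattern}")

# A model built only from predefined patterns yields a logical
# specification as the set of formulas for its patterns:
spec = [to_ltl("sequence", "receive", "process"),
        to_ltl("parallel", "log", "notify")]
```

Restricting models to predefined patterns is what makes the extraction mechanical: each pattern instance contributes one formula, and the specification is their conjunction.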

  10. NgsRelate: a software tool for estimating pairwise relatedness from next-generation sequencing data

    Korneliussen, Thorfinn Sand; Moltke, Ida

    2015-01-01

    MOTIVATION: Pairwise relatedness estimation is important in many contexts such as disease mapping and population genetics. However, all existing estimation methods are based on called genotypes, which is not ideal for next-generation sequencing (NGS) data of low depth from which genotypes cannot be called with high certainty. RESULTS: We present a software tool, NgsRelate, for estimating pairwise relatedness from NGS data. It provides maximum likelihood estimates that are based on genotype lik...

  11. An automation of design and modelling tasks in NX Siemens environment with original software - generator module

    Zbiciak, M.; Grabowik, C.; Janik, W.

    2015-11-01

    Nowadays the constructional design process is almost exclusively aided by CAD/CAE/CAM systems. It is estimated that nearly 80% of design activities have a routine nature, and these routine design tasks are highly susceptible to automation. Design automation is usually achieved with API tools which allow building original software to support different engineering activities. In this paper, original software worked out to automate engineering tasks at the stage of designing a product's geometrical shape is presented. The software works exclusively in the NX Siemens CAD/CAM/CAE environment and was prepared in Microsoft Visual Studio using the .NET technology and the NX SNAP library. The software allows designing and modelling of spur and helical involute gears; moreover, it is possible to estimate relative manufacturing costs. With the Generator module it is possible to design and model both standard and non-standard gear wheels. The main advantage of a model generated in this way is its better representation of the involute curve in comparison to those drawn in the standard tools of specialized CAD systems. This comes from the fact that in CAD systems an involute curve is usually drawn through 3 points, corresponding to points located on the addendum circle, the reference diameter of the gear and the base circle respectively. In the Generator module the involute curve is drawn through 11 points located on and above the base and addendum circles, so the 3D gear wheel models are highly accurate. Application of the Generator module makes the modelling process very rapid, reducing the gear wheel modelling time to several seconds. During the conducted research, an analysis of the differences between the standard 3-point and 11-point involutes was made. The results and conclusions drawn from this analysis are presented in detail.
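Sampling the involute at 11 parameter values rather than 3 follows directly from the parametric form of the curve, x = r_b(cos t + t sin t), y = r_b(sin t − t cos t). A minimal sketch (the base radius and parameter range are illustrative, not gear-standard values):

```python
import math

def involute_points(base_radius, t_max, n=11):
    """Sample n points of the involute of a circle of radius base_radius.
    Parametric form: x = rb(cos t + t sin t), y = rb(sin t - t cos t)."""
    pts = []
    for i in range(n):
        t = t_max * i / (n - 1)          # evenly spaced parameter values
        x = base_radius * (math.cos(t) + t * math.sin(t))
        y = base_radius * (math.sin(t) - t * math.cos(t))
        pts.append((x, y))
    return pts

# Hypothetical base circle of 20 mm, unwound up to t = 0.9 rad:
pts = involute_points(base_radius=20.0, t_max=0.9)
```

A spline through 11 such points tracks the true curve far more closely than one through 3, which is the accuracy gain the Generator module exploits.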

  12. Development of a Hydrologic Characterization Technology for Fault Zones Phase II 2nd Report

    Karasaki, Kenzi [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Doughty, Christine [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Gasperikova, Erika [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Peterson, John [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Conrad, Mark [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Cook, Paul [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Tiemi, Onishi [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2011-03-31

    This is the 2nd report on the three-year program of the 2nd phase of the NUMO-LBNL collaborative project: Development of Hydrologic Characterization Technology for Fault Zones under NUMO-DOE/LBNL collaboration agreement. As such, this report is a compendium of the results by Kiho et al. (2011) and those by LBNL.

  13. Examples to Accompany "Descriptive Cataloging of Rare Books, 2nd Edition."

    Association of Coll. and Research Libraries, Chicago, IL.

    This book is intended to be used with "Descriptive Cataloging of Rare Books," 2nd edition (DCRB) as an illustrative aid to catalogers and others interested in or needing to interpret rare book cataloging. As such, it is to be used in conjunction with the rules it illustrates, both in DCRB and in "Anglo-American Cataloging Rules," 2nd edition…

  14. SEMANTIC WEB-BASED SOFTWARE ENGINEERING BY AUTOMATED REQUIREMENTS ONTOLOGY GENERATION IN SOA

    Vahid Rastgoo

    2014-04-01

    This paper presents an approach for the automated generation of a requirements ontology from UML diagrams in service-oriented architecture (SOA). The goal is to facilitate software engineering processes such as software design, software reuse and service discovery. The proposed method is based on four conceptual layers. The first layer comprises the requirements obtained from stakeholders; the second designs service-oriented diagrams from the data in the first layer and extracts their XMI code. The third layer contains a requirements ontology and a protocol ontology that semantically describe the behaviour of services and the relationships between them. Finally, the fourth layer standardizes the concepts present in the ontologies of the previous layer. The generated ontology goes beyond a pure domain ontology because it captures the behaviour of services in addition to their hierarchical relationships. Experimental results on a set of UML4Soa diagrams from different domains demonstrate the improvement achieved by the proposed approach from several points of view, such as completeness of the requirements ontology, automatic generation and suitability for SOA.
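
    The XMI-to-ontology step of such a pipeline can be illustrated with a toy extractor. The tags, attribute names and predicates below are assumptions for illustration; real XMI exported from UML tools is namespace-qualified and far richer:

```python
import xml.etree.ElementTree as ET

# A toy XMI-like fragment; the element and attribute names are assumed.
XMI = """
<XMI>
  <Class name="OrderService">
    <Operation name="placeOrder"/>
    <Operation name="cancelOrder"/>
  </Class>
  <Class name="PaymentService">
    <Operation name="charge"/>
  </Class>
</XMI>
"""

def xmi_to_triples(xmi_text):
    """Extract (subject, predicate, object) triples that could seed a
    requirements ontology: each class becomes a Service concept and each
    of its operations a hasOperation relation."""
    root = ET.fromstring(xmi_text)
    triples = []
    for cls in root.iter("Class"):
        triples.append((cls.get("name"), "rdf:type", "Service"))
        for op in cls.iter("Operation"):
            triples.append((cls.get("name"), "hasOperation", op.get("name")))
    return triples
```

    The resulting triples would then be serialized into a proper ontology language (e.g. OWL) and enriched with the behavioural relations the paper describes.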

  15. Analysis of quality raw data of second generation sequencers with Quality Assessment Software

    2011-01-01

    Background Second generation technologies have advantages over Sanger; however, they have resulted in new challenges for the genome construction process, especially because of the small size of the reads, despite the high degree of coverage. Independent of the program chosen for the construction process, DNA sequences are superimposed, based on identity, to extend the reads, generating contigs; mismatches indicate a lack of homology and are not included. This process improves our confidence in the sequences that are generated. Findings We developed Quality Assessment Software, with which one can review graphs showing the distribution of quality values from the sequencing reads. This software allows us to adopt more stringent quality standards for sequence data, based on quality-graph analysis and estimated coverage after applying the quality filter, providing acceptable sequence coverage for genome construction from short reads. Conclusions Quality filtering is a fundamental step in the process of constructing genomes, as it reduces the frequency of incorrect alignments that are caused by measuring errors, which can occur during the construction process due to the size of the reads, provoking misassemblies. Application of quality filters to sequence data, using the software Quality Assessment, along with graphing analyses, provided greater precision in the definition of cutoff parameters, which increased the accuracy of genome construction. PMID:21501521
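
    The quality-filtering step the abstract describes can be sketched minimally: compute each read's mean Phred score and keep reads above a cutoff. This is an illustrative reconstruction, not the Quality Assessment Software itself; function names and the tuple layout are assumptions:

```python
def mean_phred(qual_line, offset=33):
    """Mean Phred score of one FASTQ quality string (Sanger/Illumina 1.8+
    encoding, ASCII offset 33)."""
    return sum(ord(c) - offset for c in qual_line) / len(qual_line)

def filter_reads(records, min_mean_q=20):
    """Keep reads whose mean quality passes the cutoff.  `records` is an
    iterable of (header, sequence, quality) tuples."""
    return [r for r in records if mean_phred(r[2]) >= min_mean_q]
```

    In practice the cutoff would be chosen from the quality-distribution graphs, balancing stringency against the coverage that remains after filtering.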

  16. PREFACE: 2nd National Conference on Nanotechnology 'NANO 2008'

    Czuba, P.; Kolodziej, J. J.; Konior, J.; Szymonski, M.

    2009-03-01

    This issue of Journal of Physics: Conference Series contains selected papers presented at the 2nd National Conference on Nanotechnology 'NANO2008', that was held in Kraków, Poland, 25-28 June 2008. It was organized jointly by the Polish Chemical Society, Polish Physical Society, Polish Vacuum Society, and the Centre for Nanometer-scale Science and Advanced Materials (NANOSAM) of the Jagiellonian University. The meeting presentations were categorized into the following topics: 1. Nanomechanics and nanotribology 2. Characterization and manipulation in nanoscale 3. Quantum effects in nanostructures 4. Nanostructures on surfaces 5. Applications of nanotechnology in biology and medicine 6. Nanotechnology in education 7. Industrial applications of nanotechnology, presentations of the companies 8. Nanoengineering and nanomaterials (international sessions shared with the fellows of Maria-Curie Host Fellowships within the 6th FP of the European Community Project 'Nano-Engineering for Expertise and Development, NEED') 9. Nanopowders 10. Carbon nanostructures and nanosystems 11. Nanoelectronics and nanophotonics 12. Nanomaterials in catalysis 13. Nanospintronics 14. Ethical, social, and environmental aspects of nanotechnology The Conference was attended by 334 participants. The presentations were delivered as 7 invited plenary lectures, 25 invited topical lectures, 78 oral and 108 poster contributions. Only 1/6 of the contributions presented during the Conference were submitted for publication in this Proceedings volume. From the submitted material, this volume of Journal of Physics: Conference Series contains 37 articles that were positively evaluated by independent referees. The Organizing Committee gratefully acknowledges all these contributions. We also thank all the referees of the papers submitted for the Proceedings for their timely and thorough work. We would like to thank all members of the National Program Committee for their work in the selection process of

  17. Library perceptions of using social software as blogs in the idea generation phase of service innovations

    Scupola, Ada; Nicolajsen, Hanne Westh

    2013-01-01

    This article investigates the use of social software such as blogs to communicate with and to involve users in the idea generation process of service innovations. After a theoretical discussion of user involvement, and more specifically user involvement using web-tools with specific focus on blogs, the article reports findings and lessons from a field experiment at a university library. In the experiment, a blog was established to collect service innovation ideas from the library users. The experiment shows that a blog may engage a limited number of users in the idea generation process and…

  18. Pragmatics Annotated Coloured Petri Nets for Protocol Software Generation and Verification

    Fagerland Simonsen, Kent Inge; Kristensen, Lars Michael; Kindler, Ekkart

    PetriCode is a tool that supports automated generation of protocol software from a restricted class of Coloured Petri Nets (CPNs) called Pragmatics Annotated Coloured Petri Nets (PA-CPNs). PetriCode and PA-CPNs have been designed with five main requirements in mind, which include the same model being used for verification and code generation. The PetriCode approach has been discussed and evaluated in earlier papers already. In this paper, we give a formal definition of PA-CPNs and demonstrate how the specific structure of PA-CPNs can be exploited for verification purposes.

  19. 2nd interface between ecology and land development in California

    Keeley, Jon E.; Baer-Keeley, Melanie; Fortheringham, C.J.

    2000-01-01

    The 2nd Interface Between Ecology and Land Development Conference was held in association with Earth Day 1997, five years after the first Interface Conference. Rapid population growth in California has intensified the inevitable conflict between land development and preservation of natural ecosystems. Sustainable development requires wise use of diminishing natural resources and, where possible, restoration of damaged landscapes. These Earth Week Celebrations brought together resource managers, scientists, politicians, environmental consultants, and concerned citizens in an effort to improve the communication necessary to maintain our natural biodiversity, ecosystem processes and general quality of life. As discussed by our keynote speaker, Michael Soule, the best predictor of habitat loss is population growth, and nowhere is this better illustrated than in California. As urban perimeters expand, the interface between wildlands and urban areas increases. Few problems are more vexing than how to manage the fire-prone ecosystems indigenous to California at this urban interface. Today resource managers face increasing challenges in dealing with this problem, and the lead-off section of the proceedings considers both the theoretical basis for making decisions related to prescribed burning and the practical application. Habitat fragmentation is an inevitable consequence of development patterns, with significant impacts on animal and plant populations. Managers must be increasingly resourceful in dealing with problems of fragmentation and its often inevitable consequences, including susceptibility to invasive organisms. One approach to dealing with fragmentation problems is through careful land planning. California is the national leader in the integration of conservation and economics. On Earth Day 1991, Governor Pete Wilson presented an environmental agenda that promised to create, between landowners and environmentalists, agreements that would guarantee the protection of

  20. Pragmatics Annotated Coloured Petri Nets for Protocol Software Generation and Verification

    Simonsen, Kent Inge; Kristensen, Lars Michael; Kindler, Ekkart

    This paper presents the formal definition of Pragmatics Annotated Coloured Petri Nets (PA-CPNs). PA-CPNs represent a class of Coloured Petri Nets (CPNs) that are designed to support automated code generation of protocol software. PA-CPNs restrict the structure of CPN models and allow Petri net elements to be annotated with so-called pragmatics, which are exploited for code generation. The approach and tool for generating code is called PetriCode and has been discussed and evaluated in earlier work already. The contribution of this paper is to give a formal definition for PA-CPNs; in addition, we show how the structural restrictions of PA-CPNs can be exploited for making the verification of the modelled protocols more efficient. This is done by automatically deriving progress measures for the sweep-line method, and by introducing so-called service testers, that can be used to control the...
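
    To make the verification side concrete, here is a minimal sketch of state-space exploration for a plain place/transition net (an ordinary Petri net, not the coloured PA-CPNs of the paper); the dict-based encoding is an assumption for illustration. Techniques such as the sweep-line method operate on exactly this kind of reachability graph, pruning it with progress measures:

```python
from collections import deque

def enabled(marking, transition):
    """A transition is enabled when every input place holds enough tokens."""
    return all(marking.get(p, 0) >= n for p, n in transition["in"].items())

def fire(marking, transition):
    """Return the marking obtained by consuming input and producing output tokens."""
    m = dict(marking)
    for p, n in transition["in"].items():
        m[p] -= n
    for p, n in transition["out"].items():
        m[p] = m.get(p, 0) + n
    return m

def reachable_markings(initial, transitions):
    """Breadth-first exploration of all markings reachable from `initial`."""
    seen = {tuple(sorted(initial.items()))}
    queue = deque([initial])
    while queue:
        m = queue.popleft()
        for t in transitions:
            if enabled(m, t):
                m2 = fire(m, t)
                key = tuple(sorted(m2.items()))
                if key not in seen:
                    seen.add(key)
                    queue.append(m2)
    return seen
```

    For a tiny send/acknowledge net (ready → sent → acked) this enumerates the three reachable protocol states.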

  1. Easy Steps to STAIRS. 2nd Revised Edition.

    National Library of Australia, Canberra.

    This manual for computer searchers describes the software package--IBM's STAIRS (Storage And Information Retrieval System)--used for searching databases in AUSINET (AUStralian Information NETwork). Whereas the first edition explained STAIRS in the context of the National Library's Online ERIC Project and the ERIC data base, this second edition…

  2. EVENT GENERATION OF STANDARD MODEL HIGGS DECAY TO DIMUON PAIRS USING PYTHIA SOFTWARE

    Yusof, Adib

    2015-01-01

    My project for the CERN Summer Student Programme 2015 is on Event Generation of Standard Model Higgs Decay to Dimuon Pairs using the Pythia Software. Briefly, Pythia, or specifically Pythia 8.1, is a program for the generation of high-energy physics events that is able to describe the collisions at any given energies between elementary particles such as electrons, positrons, protons and antiprotons. It contains theory and models for a number of physics aspects, including hard and soft interactions, parton distributions, initial-state and final-state parton showers, multiparton interactions, fragmentation and decay. All programming code is written in C++ for this version (the previous version uses FORTRAN) and can be linked to the ROOT software for displaying output in the form of histograms. For my project, I need to generate events for standard model Higgs boson decay into muon and anti-muon pairs (H → μ+μ-) to study the expected significance value for this particular process at a centre-of-mass energy of 13 TeV...
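
    The dimuon analysis behind such a study ultimately reduces to computing invariant masses from the generated four-momenta. The following sketch is not Pythia code; it is a minimal, self-contained check of the H → μ+μ- kinematics, with function names chosen for illustration:

```python
import math

MUON_MASS = 0.1056584  # GeV (PDG value)

def invariant_mass(p4_list):
    """Invariant mass of a set of four-momenta (E, px, py, pz) in GeV:
    m^2 = E_tot^2 - |p_tot|^2."""
    E = sum(p[0] for p in p4_list)
    px = sum(p[1] for p in p4_list)
    py = sum(p[2] for p in p4_list)
    pz = sum(p[3] for p in p4_list)
    return math.sqrt(max(E * E - px * px - py * py - pz * pz, 0.0))

def muon_p4(energy, direction):
    """Four-momentum of an on-shell muon with given energy along a unit direction."""
    p = math.sqrt(energy ** 2 - MUON_MASS ** 2)
    return (energy, p * direction[0], p * direction[1], p * direction[2])
```

    Two back-to-back 62.5 GeV muons reconstruct to 125 GeV, the Higgs mass; in a real study the same quantity would be histogrammed (e.g. via ROOT) over many generated events.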

  3. PREFACE: 2nd Workshop on Germanium Detectors and Technologies

    Abt, I.; Majorovits, B.; Keller, C.; Mei, D.; Wang, G.; Wei, W.

    2015-05-01

    The 2nd workshop on Germanium (Ge) detectors and technology was held at the University of South Dakota on September 14-17th 2014, with more than 113 participants from 8 countries, 22 institutions, 15 national laboratories, and 8 companies. The participants represented the following big projects: (1) GERDA and Majorana for the search for neutrinoless double-beta decay (0νββ); (2) SuperCDMS, EDELWEISS, CDEX, and CoGeNT for the search for dark matter; (3) TEXONO for sub-keV neutrino physics; (4) AGATA and GRETINA for gamma tracking; (5) AARM and others for low background radiation counting; (6) as well as PNNL and LBNL for applications of Ge detectors in homeland security. All participants expressed a strong desire for a better understanding of Ge detector performance and for advancing Ge technology for large-scale applications. The purpose of this workshop was to leverage the unique aspects of the underground laboratories in the world and the germanium (Ge) crystal growing infrastructure at the University of South Dakota (USD) by bringing researchers from several institutions taking part in the Experimental Program to Stimulate Competitive Research (EPSCoR) together with key leaders from international laboratories and prestigious universities working at the forefront of underground physics, focusing on the searches for dark matter, neutrinoless double-beta decay (0νββ), and neutrino properties. The goal of the workshop was to develop opportunities for EPSCoR institutions to play key roles in the planned world-class research experiments. The workshop was to integrate individual talents and existing research capabilities, from multiple disciplines and multiple institutions, to develop research collaborations, including EPSCoR institutions from South Dakota, North Dakota, Alabama, Iowa, and South Carolina, to support multi-ton scale experiments for the future. The topic areas covered in the workshop were: 1) science related to Ge

  4. Evolution of a Reconfigurable Processing Platform for a Next Generation Space Software Defined Radio

    Kacpura, Thomas J.; Downey, Joseph A.; Anderson, Keffery R.; Baldwin, Keith

    2014-01-01

    The National Aeronautics and Space Administration (NASA)/Harris Ka-Band Software Defined Radio (SDR) is the first fully reprogrammable space-qualified SDR operating in the Ka-Band frequency range. Providing exceptionally higher data communication rates than previously possible, this SDR offers in-orbit reconfiguration, multi-waveform operation, and fast deployment due to its highly modular hardware and software architecture. Currently in operation on the International Space Station (ISS), this new paradigm of reconfigurable technology is enabling experimenters to investigate navigation and networking in the space environment. The modular SDR and the NASA-developed Space Telecommunications Radio System (STRS) architecture standard are the basis for Harris' reusable digital signal processing space platform, trademarked as AppSTAR. As a result, two new space radio products are a synthetic aperture radar payload and an Automatic Dependent Surveillance-Broadcast (ADS-B) receiver. In addition, Harris is currently developing many new products similar to the Ka-Band software defined radio for other applications. For NASA's next generation flight Ka-Band radio development, leveraging these advancements could lead to a more robust and more capable software defined radio. The space environment has special considerations, different from terrestrial applications, that must be taken into account for any system operated in space. Each space mission has unique requirements that can make these systems unique, and such requirements can result in products that are expensive and limited in reuse. Space systems put a premium on size, weight and power. A key trade is the amount of reconfigurability in a space system: the more reconfigurable the hardware platform, the easier it is to adapt the platform to the next mission, which reduces the amount of non-recurring engineering costs. However, more reconfigurable platforms often use more spacecraft resources. Software has similar considerations

  5. NEXT GENERATION ANALYSIS SOFTWARE FOR COMPONENT EVALUATION - Results of Rotational Seismometer Evaluation

    Hart, D. M.; Merchant, B. J.; Abbott, R. E.

    2012-12-01

    The Component Evaluation project at Sandia National Laboratories supports the Ground-based Nuclear Explosion Monitoring program by performing testing and evaluation of the components that are used in seismic and infrasound monitoring systems. In order to perform this work, Component Evaluation maintains a testing facility called the FACT (Facility for Acceptance, Calibration, and Testing) site, a variety of test bed equipment, and a suite of software tools for analyzing test data. Recently, Component Evaluation has successfully integrated several improvements to its software analysis tools and test bed equipment that have substantially improved our ability to test and evaluate components. The software tool that is used to analyze test data is called TALENT: Test and AnaLysis EvaluatioN Tool. TALENT is designed to be a single, standard interface to all test configuration, metadata, parameters, waveforms, and results that are generated in the course of testing monitoring systems. It provides traceability by capturing everything about a test in a relational database that is required to reproduce the results of that test. TALENT provides a simple, yet powerful, user interface to quickly acquire, process, and analyze waveform test data. The software tool has also been expanded recently to handle sensors whose output is proportional to rotation angle, or rotation rate. As an example of this new processing capability, we show results from testing the new ATA ARS-16 rotational seismometer. The test data was collected at the USGS ASL. Four datasets were processed: 1) 1 Hz with increasing amplitude, 2) 4 Hz with increasing amplitude, 3) 16 Hz with increasing amplitude and 4) twenty-six discrete frequencies between 0.353 Hz and 64 Hz. The results are compared to manufacturer-supplied data sheets.
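
    Evaluating a component at a discrete test frequency typically reduces to estimating the amplitude of a recorded sinusoid. The sketch below is a generic least-squares estimator, not TALENT itself; the function and parameter names are assumptions:

```python
import math

def fit_amplitude(times, samples, freq_hz):
    """Least-squares fit of y ~ a*cos(wt) + b*sin(wt) at a known test
    frequency; returns the amplitude sqrt(a^2 + b^2).  The 2x2 normal
    equations are solved directly."""
    w = 2.0 * math.pi * freq_hz
    scc = scs = sss = sy_c = sy_s = 0.0
    for t, y in zip(times, samples):
        c, s = math.cos(w * t), math.sin(w * t)
        scc += c * c
        scs += c * s
        sss += s * s
        sy_c += y * c
        sy_s += y * s
    det = scc * sss - scs * scs
    a = (sy_c * sss - sy_s * scs) / det
    b = (sy_s * scc - sy_c * scs) / det
    return math.hypot(a, b)
```

    Comparing the fitted amplitude of the device under test against a reference sensor at each discrete frequency yields the amplitude-response curve that is then checked against the data sheet.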

  6. A proposed metamodel for the implementation of object oriented software through the automatic generation of source code

    CARVALHO, J. S. C.

    2008-12-01

    During the development of software, one of the most visible risks, and perhaps the biggest implementation obstacle, relates to time management. All delivery deadlines for software versions must be met, but this is not always possible, sometimes due to delays in coding. This paper presents a metamodel for software implementation, which gives rise to a development tool for automatic generation of source code, in order to make any development pattern transparent to the programmer, significantly reducing the time spent coding the artifacts that make up the software.
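
    A minimal template-based generator in the spirit described can be sketched as follows. The metamodel dict, template and function names are assumptions for illustration; the paper's metamodel is considerably richer:

```python
CLASS_TEMPLATE = '''class {name}:
    """Generated from the {name} model element."""

    def __init__(self{args}):
{assigns}
'''

def generate_class(spec):
    """Render Python source for one class described by a small metamodel
    dict of the form {"name": ..., "attributes": [...]}.  A fuller
    generator would also emit associations, operations and
    design-pattern scaffolding."""
    attrs = spec.get("attributes", [])
    args = "".join(", " + a for a in attrs)
    assigns = "\n".join(f"        self.{a} = {a}" for a in attrs) or "        pass"
    return CLASS_TEMPLATE.format(name=spec["name"], args=args, assigns=assigns)
```

    Because the generator owns the boilerplate, the chosen development pattern stays transparent to the programmer, which is exactly the time saving the abstract argues for.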

  7. 2nd U.S. Case of Bacteria Resistant to Last-Resort Antibiotic

    2nd U.S. Case of Bacteria Resistant to Last-Resort Antibiotic Scientists concerned it ... the United States who was infected with a bacteria that is resistant to an antibiotic of last ...

  8. Optimized Pump Power Ratio on 2nd Order Pumping Discrete Raman Amplifier

    Renxiang Huang; Youichi Akasaka; David L. Harris; James Pan

    2003-01-01

    By optimizing the pump power ratio between the 1st-order backward pump and the 2nd-order forward pump in a discrete Raman amplifier, we demonstrated over 2 dB of noise figure improvement without excessive non-linearity degradation.

  9. Combustion synthesis and characterization of Ba2NdSbO6 nanocrystals

    V T Kavitha; R Jose; S Ramakrishna; P R S Wariar; J Koshy

    2011-07-01

    Nanocrystalline Ba2NdSbO6 powders, a complex cubic perovskite metal oxide, were synthesized by a self-sustained combustion method employing citric acid. The product was characterized by X-ray diffraction, differential thermal analysis, thermogravimetric analysis, Fourier transform infrared spectroscopy, transmission electron microscopy and scanning electron microscopy. The as-prepared powders were single-phase Ba2NdSbO6 and consisted of a mixture of polycrystalline spheroidal particles and single-crystalline nanorods. The Ba2NdSbO6 sample sintered at 1500°C for 4 h has high density (∼95% of theoretical density). Sintered nanocrystalline Ba2NdSbO6 had a dielectric constant of ∼21 and a dielectric loss of 8 × 10⁻³ at 5 MHz.

  10. 76 FR 29750 - Filing Dates for the Nevada Special Election in the 2nd Congressional District

    2011-05-23

    ... General Election on September 13, 2011, to fill the U.S. House seat in ] the 2nd Congressional District... report, the first report must cover all activity that occurred before the committee registered as...

  11. 77 FR 75161 - Filing Dates for the Illinois Special Election in the 2nd Congressional District

    2012-12-19

    ... February 26, 2013, and April 9, 2013, to fill the U.S. House seat in the 2nd Congressional District vacated... not previously filed a report, the first report must cover all activity that occurred before...

  12. File list: His.Lar.50.AllAg.2nd_instar [Chip-atlas[Archive

    Full Text Available His.Lar.50.AllAg.2nd_instar dm3 Histone Larvae 2nd instar SRX013015,SRX013042,SRX01...3112,SRX013043,SRX013087,SRX013096 http://dbarchive.biosciencedbc.jp/kyushu-u/dm3/assembled/His.Lar.50.AllAg.2nd_instar.bed ...

  13. File list: ALL.Lar.10.AllAg.2nd_instar [Chip-atlas[Archive

    Full Text Available ALL.Lar.10.AllAg.2nd_instar dm3 All antigens Larvae 2nd instar SRX013087,SRX013015,...SRX013112,SRX013042,SRX013043,SRX013096,SRX013113,SRX013016,SRX013114 http://dbarchive.biosciencedbc.jp/kyushu-u/dm3/assembled/ALL.Lar.10.AllAg.2nd_instar.bed ...

  14. File list: His.Lar.05.AllAg.2nd_instar [Chip-atlas[Archive

    Full Text Available His.Lar.05.AllAg.2nd_instar dm3 Histone Larvae 2nd instar SRX013087,SRX013096,SRX01...3043,SRX013015,SRX013112,SRX013042 http://dbarchive.biosciencedbc.jp/kyushu-u/dm3/assembled/His.Lar.05.AllAg.2nd_instar.bed ...

  15. File list: His.Lar.20.AllAg.2nd_instar [Chip-atlas[Archive

    Full Text Available His.Lar.20.AllAg.2nd_instar dm3 Histone Larvae 2nd instar SRX013015,SRX013042,SRX01...3112,SRX013043,SRX013096,SRX013087 http://dbarchive.biosciencedbc.jp/kyushu-u/dm3/assembled/His.Lar.20.AllAg.2nd_instar.bed ...

  16. File list: ALL.Lar.20.AllAg.2nd_instar [Chip-atlas[Archive

    Full Text Available ALL.Lar.20.AllAg.2nd_instar dm3 All antigens Larvae 2nd instar SRX013015,SRX013042,...SRX013112,SRX013043,SRX013016,SRX013114,SRX013096,SRX013087,SRX013113 http://dbarchive.biosciencedbc.jp/kyushu-u/dm3/assembled/ALL.Lar.20.AllAg.2nd_instar.bed ...

  17. File list: ALL.Lar.50.AllAg.2nd_instar [Chip-atlas[Archive

    Full Text Available ALL.Lar.50.AllAg.2nd_instar dm3 All antigens Larvae 2nd instar SRX013015,SRX013042,...SRX013112,SRX013016,SRX013114,SRX013043,SRX013087,SRX013096,SRX013113 http://dbarchive.biosciencedbc.jp/kyushu-u/dm3/assembled/ALL.Lar.50.AllAg.2nd_instar.bed ...

  18. ANT: Software for Generating and Evaluating Degenerate Codons for Natural and Expanded Genetic Codes.

    Engqvist, Martin K M; Nielsen, Jens

    2015-08-21

    The Ambiguous Nucleotide Tool (ANT) is a desktop application that generates and evaluates degenerate codons. Degenerate codons are used to represent DNA positions that have multiple possible nucleotide alternatives. This is useful for protein engineering and directed evolution, where primers specified with degenerate codons are used as a basis for generating libraries of protein sequences. ANT is intuitive and can be used in a graphical user interface or by interacting with the code through a defined application programming interface. ANT comes with full support for nonstandard, user-defined, or expanded genetic codes (translation tables), which is important because synthetic biology is being applied to an ever widening range of natural and engineered organisms. The Python source code for ANT is freely distributed so that it may be used without restriction, modified, and incorporated in other software or custom data pipelines. PMID:25901796
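
    The core operation ANT performs, expanding a degenerate codon into its concrete codons, can be sketched independently. This is an illustration of the standard IUPAC expansion for the nucleotide alphabet, not ANT's actual source code:

```python
from itertools import product

# IUPAC ambiguity codes: each degenerate base maps to its alternatives.
IUPAC = {
    "A": "A", "C": "C", "G": "G", "T": "T",
    "R": "AG", "Y": "CT", "S": "CG", "W": "AT", "K": "GT", "M": "AC",
    "B": "CGT", "D": "AGT", "H": "ACT", "V": "ACG", "N": "ACGT",
}

def expand_codon(degenerate):
    """All concrete codons encoded by a degenerate codon such as 'NNK'."""
    return ["".join(c) for c in product(*(IUPAC[b] for b in degenerate.upper()))]
```

    For example, the popular library codon NNK expands to 32 codons; evaluating such an expansion against a (possibly user-defined) translation table is then a matter of mapping each concrete codon to its amino acid.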

  19. Meteosat Second Generation station: processing software and computing architecture; Estacion de Recepcion de Imagenes del Satelite Meteosat Segunda generacion: Arquitectura Informatica y Software de Proceso

    Martin, L.; Cony, M.; Navarro, A. A.; Zarzalejo, L. F.; Polo, J.

    2010-05-01

    The Renewable Energy Division of CIEMAT houses a specific station for receiving Meteosat Second Generation images, which is of interest for the work being carried out on solar radiation derived from satellite images. The complexity, the huge amount of information received and the particular characteristics of the MSG images encouraged the design and development of a specific computing structure, and the associated software, for a better and more suitable use of the images. This document describes the mentioned structure and software. (Author) 8 refs.

  20. Severe weather phenomena: SQUALL LINES The case of July 2nd 2009

    Paraschivescu, Mihnea; Tanase, Adrian

    2010-05-01

    Among dangerous meteorological phenomena, wind intensity plays an important role in producing negative effects on the economy and social activities, particularly when the wind is about to turn into a storm. During the past years one can notice an increase in the frequency and intensity of wind due to climate change and, consequently, of extreme meteorological phenomena, not only at the planetary level but also at the regional one. Although dangerous meteorological phenomena cannot be avoided, since they are natural, they can nevertheless be anticipated, and decision-making institutions and the mass media can be informed. This is the reason why, in this paper, we set out to identify the synoptic conditions that led to the occurrence of the severe storm in Bucharest on July 2nd, 2009, as well as the matrices that generate such cases. At the same time, we sought to identify indications, especially from radar data, that could improve the time interval between the nowcasting warning and the actual occurrence of the phenomenon.

  1. 2nd International Conference on Computer and Communication Technologies

    Raju, K; Mandal, Jyotsna; Bhateja, Vikrant

    2016-01-01

    The book covers all aspects of computing, communication, general sciences and educational research presented at the Second International Conference on Computer & Communication Technologies, held during 24-26 July 2015 at Hyderabad and hosted by CMR Technical Campus in association with Division V (Education & Research) CSI, India. After a rigorous review, only quality papers were selected and included in this book. The work is divided into three volumes, which cover a variety of topics including medical imaging, networks, data mining, intelligent computing, software design, image processing, mobile computing, digital signals and speech processing, video surveillance and processing, web mining, wireless sensor networks, circuit analysis, fuzzy systems, antenna and communication systems, biomedical signal processing and applications, cloud computing, embedded systems applications and cyber security and digital forensic. The readers of these volumes will be highly benefited from the te...

  2. Next generation hyper-scale software and hardware systems for big data analytics

    CERN. Geneva

    2013-01-01

    Building on foundational technologies such as many-core systems, non-volatile memories and photonic interconnects, we describe some current technologies and future research to create real-time, big data analytics, IT infrastructure. We will also briefly describe some of our biologically-inspired software and hardware architecture for creating radically new hyper-scale cognitive computing systems. About the speaker Rich Friedrich is the director of Strategic Innovation and Research Services (SIRS) at HP Labs. In this strategic role, he is responsible for research investments in nano-technology, exascale computing, cyber security, information management, cloud computing, immersive interaction, sustainability, social computing and commercial digital printing. Rich's philosophy is to fuse strategy and inspiration to create compelling capabilities for next generation information devices, systems and services. Using essential insights gained from the metaphysics of innovation, he effectively leads ...

  3. Next Generation Astronomical Data Processing using Big Data Technologies from the Apache Software Foundation

    Mattmann, Chris

    2014-04-01

    In this era of exascale instruments for astronomy, we must naturally develop next generation capabilities for the unprecedented data volume and velocity that will arrive due to the veracity of these ground-based sensors and observatories. Integrating scientific algorithms stewarded by scientific groups unobtrusively and rapidly; intelligently selecting data movement technologies; making use of cloud computing for storage and processing; and automatically extracting text and metadata and science from any type of file are all needed capabilities in this exciting time. Our group at NASA JPL has promoted the use of open source data management technologies available from the Apache Software Foundation (ASF) in pursuit of constructing next generation data management and processing systems for astronomical instruments including the Expanded Very Large Array (EVLA) in Socorro, NM and the Atacama Large Millimetre/Submillimetre Array (ALMA); as well as for the KAT-7 project led by SKA South Africa as a precursor to the full MeerKAT telescope. In addition, we are currently funded by the National Science Foundation in the US to work with MIT Haystack Observatory and the University of Cambridge in the UK to construct a Radio Array of Portable Interferometric Devices (RAPID) that will undoubtedly draw from the rich technology advances underway. NASA JPL is investing in a strategic initiative for Big Data that is pulling in these capabilities and technologies for astronomical instruments and also for Earth science remote sensing. In this talk I will describe the above collaborative efforts underway and point to solutions in open source from the Apache Software Foundation that can be deployed and used today and that are already bringing our teams and projects benefits. I will describe how others can take advantage of our experience and point towards future application and contribution of these tools.

  4. A Soliton Hierarchy Associated with a Spectral Problem of 2nd Degree in a Spectral Parameter and Its Bi-Hamiltonian Structure

    Yuqin Yao

    2016-01-01

    Associated with so(3,R), a new matrix spectral problem of 2nd degree in a spectral parameter is proposed, and its corresponding soliton hierarchy is generated within the zero curvature formulation. Bi-Hamiltonian structures of the presented soliton hierarchy are furnished by using the trace identity; thus, all presented equations possess infinitely many commuting symmetries and conservation laws, which implies their Liouville integrability.

  5. Computer-generated holograms (CGH) realization: the integration of dedicated software tool with digital slides printer

    Guarnieri, Vittorio; Francini, Franco

    1997-12-01

    The latest generation of digital printers is usually characterized by a spatial resolution high enough to allow the designer to realize a binary CGH directly on transparent film, avoiding photographic reduction techniques. These devices are able to produce slides or offset prints. Furthermore, the services supplied by commercial printing companies provide an inexpensive method to rapidly verify the validity of a design by means of a test-and-trial process. Notably, this low-cost approach appears to be suitable for a didactical environment. On the basis of these considerations, a set of software tools able to design CGHs has been developed. The guidelines inspiring the work have been the following: (1) a ray-tracing approach, considering the object to be reproduced as a source of spherical waves; (2) optimization and speed-up of the algorithms used, in order to produce portable code, runnable on several hardware platforms. In this paper, calculation methods to obtain some fundamental geometric functions (points, lines, curves) are described. Furthermore, by the juxtaposition of these primitive functions it is possible to produce the holograms of more complex objects. Many examples of generated CGHs are presented.
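
    The ray-tracing guideline above, treating the object as a source of spherical waves, can be illustrated for the simplest case of a single on-axis point: interfering its spherical wave with an on-axis plane reference wave and binarizing the result yields a Fresnel-zone-plate-like binary CGH. This is a hedged sketch, not the paper's software; function and parameter names are assumed:

```python
import math

def binary_cgh(width, height, pixel_pitch, wavelength, source_z):
    """Binary transmission pattern from interfering a spherical wave
    emitted by an on-axis point source at distance source_z with an
    on-axis plane reference wave: transmit (1) where the two waves are
    within a quarter period of being in phase.  Returns rows of 0/1."""
    k = 2.0 * math.pi / wavelength
    cx, cy = (width - 1) / 2.0, (height - 1) / 2.0
    rows = []
    for j in range(height):
        row = []
        for i in range(width):
            x = (i - cx) * pixel_pitch
            y = (j - cy) * pixel_pitch
            # Optical path difference between spherical and plane wave.
            opd = math.sqrt(x * x + y * y + source_z * source_z) - source_z
            row.append(1 if math.cos(k * opd) >= 0.0 else 0)
        rows.append(row)
    return rows
```

    Each row of 0/1 values maps directly to opaque/transparent printer pixels; more complex objects follow by summing the complex fields of many such point sources before binarizing.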

  6. Automatic Generation of Just-in-Time Online Assessments from Software Design Models

    Zualkernan, Imran A.; El-Naaj, Salim Abou; Papadopoulos, Maria; Al-Amoudi, Budoor K.; Matthews, Charles E.

    2009-01-01

    Computer software is pervasive in today's society. The rate at which new versions of computer software products are released is phenomenal when compared to the release rate of new products in traditional industries such as aircraft building. This rapid rate of change can partially explain why most certifications in the software industry are…

  7. Software for evaluating magnetic induction field generated by power lines: implementation of a new algorithm

    The Regional Environment Protection Agency of Friuli Venezia Giulia (A.R.P.A. F.V.G., Italy) has performed an analysis of existing software designed to calculate the magnetic induction field generated by power lines. As far as the agency's requirements are concerned, the tested programs display some difficulties in the immediate processing of electrical and geometrical data supplied by plant owners, and in certain cases turn out to be inadequate for representing complex configurations of power lines. Furthermore, none of them is preset for cyclic calculation to determine the time evolution of induction in a given exposure area. Finally, the output data are not immediately importable by ArcView, the G.I.S. used by A.R.P.A. F.V.G., and it is not always possible to account for the orography of the territory to determine the field at specified heights above the ground. P.h.i.d.e.l., an innovative software package, tackles and works out all the above-mentioned problems. The power line conductors involved in its implementation are represented by polylines, and the field is calculated analytically, with no further approximation, not even when several power lines are concerned. Therefore, the obtained results, when compared with those of other programs, are the closest to experimental measurements. The output data can be employed in both G.I.S. and Excel environments, allowing the immediate overlaying of digital cartography and the determination of the 3 and 10 μT bands, in compliance with the Italian Decree of the President of the Council of Ministers of 8 July 2003. (authors)

  8. Software for evaluating magnetic induction field generated by power lines: implementation of a new algorithm

    Comelli, M.; Benes, M.; Bampo, A.; Villalta, R. [Regional Environment Protection Agency of Friuli Venezia Giulia (ARPA FVG), Environmental Physics, Udine (Italy)

    2006-07-01

    The Regional Environment Protection Agency of Friuli Venezia Giulia (A.R.P.A. F.V.G., Italy) has performed an analysis of existing software designed to calculate the magnetic induction field generated by power lines. As far as the agency's requirements are concerned, the tested programs display some difficulties in the immediate processing of electrical and geometrical data supplied by plant owners, and in certain cases turn out to be inadequate for representing complex configurations of power lines. Furthermore, none of them is preset for cyclic calculation to determine the time evolution of induction in a given exposure area. Finally, the output data are not immediately importable by ArcView, the G.I.S. used by A.R.P.A. F.V.G., and it is not always possible to account for the orography of the territory to determine the field at specified heights above the ground. P.h.i.d.e.l., an innovative software package, tackles and works out all the above-mentioned problems. The power line conductors involved in its implementation are represented by polylines, and the field is calculated analytically, with no further approximation, not even when several power lines are concerned. Therefore, the obtained results, when compared with those of other programs, are the closest to experimental measurements. The output data can be employed in both G.I.S. and Excel environments, allowing the immediate overlaying of digital cartography and the determination of the 3 and 10 μT bands, in compliance with the Italian Decree of the President of the Council of Ministers of 8 July 2003. (authors)
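
The analytic field calculation for polyline conductors can be sketched with the standard finite-segment Biot-Savart solution. This is a generic illustration, not the P.h.i.d.e.l. implementation: three-phase superposition, phase angles, and time evolution are omitted, and all geometry and current values are invented.

```python
import math

MU0_OVER_4PI = 1e-7  # T·m/A

def segment_field(a, b, p, current):
    """Magnetic flux density at point p from a straight segment a -> b
    carrying `current`, using the closed-form finite-wire solution."""
    L = [b[i] - a[i] for i in range(3)]
    r1 = [p[i] - a[i] for i in range(3)]
    r2 = [p[i] - b[i] for i in range(3)]
    cx = [L[1] * r1[2] - L[2] * r1[1],      # L x r1 gives the field direction
          L[2] * r1[0] - L[0] * r1[2],
          L[0] * r1[1] - L[1] * r1[0]]
    norm = lambda v: math.sqrt(sum(c * c for c in v))
    dot = lambda u, v: sum(u[i] * v[i] for i in range(3))
    d = norm(cx) / norm(L)                   # perpendicular distance to the line
    if d == 0.0:
        return [0.0, 0.0, 0.0]               # point lies on the wire axis
    cos1 = dot(L, r1) / (norm(L) * norm(r1))
    cos2 = dot(L, r2) / (norm(L) * norm(r2))
    mag = MU0_OVER_4PI * current / d * (cos1 - cos2)
    return [mag * c / norm(cx) for c in cx]

def polyline_field(vertices, p, current):
    """Sum the contributions of each segment of a polyline conductor."""
    total = [0.0, 0.0, 0.0]
    for a, b in zip(vertices, vertices[1:]):
        total = [t + s for t, s in zip(total, segment_field(a, b, p, current))]
    return total
```

As a sanity check, a single very long segment reproduces the infinite-wire result B = μ0·I/(2πd) at a point 1 m from its midpoint.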

  9. Power system economics : the Nordic electricity market. 2nd ed.

    This book, written as a textbook for students of engineering, is designed for the Norwegian Power Markets course, which is part of the Energy and Environment Master's Program and the recently established international MSc program in Electric Power Engineering. As the title indicates, the book deals with both power system economics in general and the practical implementation of, and experience from, the Nordic market. Areas of coverage include: restructuring/deregulation of the power supply system; grid access, including tariffs and congestion management; generation planning; market modeling; ancillary services; and regulation of grid monopolies. Although Power System Economics is written primarily as a textbook for students, other readers will also find the book interesting. It deals with problems that have been the subject of considerable attention in the power sector for some years, and it addresses issues that are still relevant and important. (au)

  10. 2nd International Conference on Robot Intelligence Technology and Applications

    Matson, Eric; Myung, Hyun; Xu, Peter; Karray, Fakhri

    2014-01-01

    We are facing a new technological challenge: how to store and retrieve knowledge and manipulate intelligence for autonomous services by intelligent systems, which should be capable of carrying out real-world tasks autonomously. To address this issue, robot researchers have been developing intelligence technology (InT) for “robots that think”, which is the focus of this book. The book covers all aspects of intelligence, from perception at the sensor level and reasoning at the cognitive level to behavior planning at the execution level for each low-level segment of the machine. It also presents the technologies for cognitive reasoning, social interaction with humans, behavior generation, the ability to cooperate with other robots, ambience awareness, and an artificial genome that can be passed on to other robots. These technologies are to materialize cognitive intelligence, social intelligence, behavioral intelligence, collective intelligence, ambient intelligence and genetic intelligence. The book aims at serving resear...

  11. Generation and Optimization of Test cases for Object-Oriented Software Using State Chart Diagram

    Ranjita Kumari Swain; Prafulla Kumar Behera; Durga Prasad Mohapatra

    2012-01-01

    The process of testing any software system is an enormous task, which is time-consuming and costly. The time and effort required to do sufficient testing grow as the size and complexity of the software grow, which may cause overrun of the project budget, delay in the development of the software system, or leave some test cases uncovered. During the SDLC (software development life cycle), the software testing phase generally takes around 40-70% of the time and cost. State-based testing is frequentl...
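
State-based test generation of the kind the abstract describes can be sketched with one common strategy (not necessarily the paper's algorithm): walk the state machine derived from the state chart and emit, for every transition, an event sequence that drives the system from the start state through that transition. The ATM-like state chart below is a hypothetical example.

```python
from collections import deque

def transition_coverage_tests(transitions, start):
    """Generate event sequences from `start` that together cover every
    transition of a state machine given as (state, event) -> next_state."""
    # Shortest event sequence reaching each state (BFS from start).
    reach = {start: []}
    queue = deque([start])
    while queue:
        s = queue.popleft()
        for (src, ev), dst in transitions.items():
            if src == s and dst not in reach:
                reach[dst] = reach[s] + [ev]
                queue.append(dst)
    # One test per transition: shortest prefix to its source, then the event.
    return [reach[src] + [ev]
            for (src, ev), dst in transitions.items() if src in reach]

# Hypothetical ATM-like state chart
fsm = {
    ("idle", "insert_card"): "authenticating",
    ("authenticating", "pin_ok"): "ready",
    ("authenticating", "pin_bad"): "idle",
    ("ready", "withdraw"): "dispensing",
    ("dispensing", "done"): "idle",
}
suite = transition_coverage_tests(fsm, "idle")
```

Optimization steps discussed in such work typically then minimize this suite, e.g. by dropping sequences that are prefixes of others.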

  12. Monitoring North Korea Explosions: Status and Result of 1st and 2nd Tests (Invited)

    Chi, H.; Lee, H.; Shin, J.; Park, J.; Sheen, D.; Kim, G.; Che, I.; Lim, I.; Kim, T.

    2009-12-01

    Through data exchange with China, Russia and Japan, KIGAM could monitor the North Korean explosion tests in near real time with full azimuthal coverage of the test site. Except for the East Sea (Japan Sea) side, the seismic stations are distributed uniformly along the boundaries of North Korea and adjacent countries, and only stations at distances of 200 to 550 km from the test site were considered. Irrespective of the azimuthal directions of stations from the test site, the conventional discriminant, the Pn/Lg spectral ratio, clearly showed that both tests were explosions. But the mb-Ms discriminant did not apparently show the known pattern of explosions for either test. The body wave magnitude of the 2nd test, mb(Pn), which was evaluated as 4.5 by KIGAM, varies widely with directional location of stations, from 4.1 to 5.2. The magnitude obtained from Lg, mb(Lg), showed narrow variation between 4.3 and 4.7, with an average of 4.5. In the case of the 1st test, both mb(Pn) and mb(Lg) showed equally large variation with directional station location. The error ellipses of the epicentral determination of the test site for the 1st and 2nd tests showed an almost identical pattern when they were calculated separately with the same configuration of stations. But the combined use of the 1st and 2nd test data showed that the 2nd test site was moved approximately 2 km westward from the 1st site. The cut-off frequencies of the P waves of the 1st and 2nd tests showed no or negligible difference, even though the estimated yield of the 2nd test was much larger than that of the 1st one. The ratio of the 1st and 2nd P-wave amplitudes ranged from 2 to 3.1. Correspondingly, the estimated energy or yield ranged from 4 to roughly 10 times. KIGAM evaluated the yield of the 2nd test to be, on average, 8 times larger than that of the 1st one.

  13. Regional Observations of North Korea Explosions: 1st and 2nd Tests

    Chi, Heon Cheol; Shin, Jin Soo; Lee, Hee-Il; Park, Jung Ho; Sheen, Dong-Hoon; Kim, Geunyoung; Kim, Tea Sung; Che, Il-Young; Lim, In-Seub

    2010-05-01

    Through data exchange with China, Russia and Japan, KIGAM could monitor the North Korean explosion tests in near real time with full azimuthal coverage of the test site. Except for the East Sea (Japan Sea) side, the seismic stations are distributed uniformly along the boundaries of North Korea and adjacent countries. The error ellipses of the epicentral determination of the test site for the 1st and 2nd tests showed an almost identical pattern when they were calculated separately with the same configuration of stations. But the combined use of the 1st and the 2nd test data showed that the 2nd test site was moved approximately 2 km westward from the 1st site. The Pn/Lg spectral ratio clearly discriminated these events from two nearby natural earthquakes above 4 Hz. Full moment tensor inversion also indicated that the 2nd test had a very large isotropic component. But the mb-Ms discriminant, which has been considered one of the most reliable discriminants for separating explosions and earthquakes, did not apparently show the known pattern of explosions for either test. The body wave magnitude of the 2nd test, mb(Pn), which was evaluated as 4.5 by KIGAM, varies widely with directional location of stations, from 4.1 to 5.2. The magnitude obtained from Lg, mb(Lg), showed narrow variation between 4.3 and 4.7, with an average of 4.5. For both the 1st and 2nd tests, mb(Pn) and mb(Lg) showed equally large variation with directional station location. These variations are mainly due to lateral variation of the crustal structures surrounding the test site. Remarkably, mb(Lg) showed a very linear relationship with mb(Pn). By considering attenuation characteristics according to the propagation path, the variations could be effectively reduced. The cut-off frequencies of the P waves of both tests showed no or negligible difference, even though the estimated yield of the 2nd test was much larger than that of the 1st one. The ratio of the P-wave amplitudes of the two tests ranged from 2 to 3.1. Correspondingly, the estimated energy or yield ranged from 4 to roughly 10 times.

  14. Advances in Sustainability: Contributions and Outcomes of the 2nd World Sustainability Forum

    Sylvie Flämig

    2013-03-01

    After a successful start in 2011, the 2nd World Sustainability Forum (WSF) was held on sciforum.net from 1-30 November 2012. More than 80 papers were presented and over 180 authors contributed to the multidisciplinary conference. The objective of this short report is to sum up the contributions and discussions of the 2nd World Sustainability Forum. It is organized as follows. First, some general information on the Forum is given; then a summary of the contributions to the different sections, as well as an overview of the discussions, is provided. A final section, including an outlook to the 3rd World Sustainability Forum, concludes the article.

  15. Advances in Sustainability: Contributions and Outcomes of the 2nd World Sustainability Forum

    Sylvie Flämig; Marc A. Rosen

    2013-01-01

    After a successful start in 2011, the 2nd World Sustainability Forum (WSF) was held on sciforum.net from 1–30 November 2012. More than 80 papers were presented and over 180 authors contributed to the multidisciplinary conference. The objective of this short report is to sum up the contributions and discussions of the 2nd World Sustainability Forum. It is organized as follows. First, some general information on the Forum is given, then a summary of the contributions to the different sections...

  16. Proceedings of the 2nd KUR symposium on hyperfine interactions

    Hyperfine interactions between a nuclear spin and an electronic spin, discovered from hyperfine splitting in atomic optical spectra, have been utilized not only for the determination of nuclear parameters in nuclear physics but also for novel experimental techniques in many fields such as solid state physics, chemistry, biology and mineralogy, and for diagnostic methods in medical science. Experimental techniques based on hyperfine interactions yield information about microscopic states of matter, so they are important in materials science. Probes for materials research using hyperfine interactions have been nuclei in the ground state and radioactive isotopes prepared with nuclear reactors or particle accelerators, but the utilization of muons generated by accelerators has recently been growing. Such widespread application of hyperfine interaction techniques gives rise to some difficulty in collaboration among the various research fields. In these circumstances, the present workshop was planned four years after the last KUR symposium on the same subject. This report summarizes the contributions to the workshop in order to make them available for studies of hyperfine interactions. (J.P.N.)

  17. DOE performance indicators for 2nd quarter CY 1993

    1993-11-01

    The Department of Energy (DOE) has established a Department-wide Performance Indicator (PI) Program for trending and analysis of operational data as directed by DOE Order 5480.26. The PI Program was established to provide a means for monitoring the environment, safety, and health (ES&H) performance of the DOE at the Secretary and other management levels. This is the tenth in a series of quarterly reports generated for the Department of Energy Idaho Operations Office (DOE-ID) by EG&G Idaho, Inc. to meet the requirements of the PI Program as directed by the DOE Standard (DOE-STD-1048-92). The information in this tenth quarterly report, while contributing to a historical database for supporting future trending analysis, does not at this time provide a sound basis for developing trend-related conclusions. In the future, it is expected that trending and analysis of operational data will enhance the safety culture in both DOE and contractor organizations by providing an early warning of deteriorating environment, safety, and health conditions. DOE-STD-1048-92 identifies four general areas of PIs. They are: Personnel Safety, Operational Incidents, Environment, and Management. These four areas have been subdivided into 26 performance indicators. Approximately 115 performance indicator control and distribution charts comprise the body of this report. A brief summary of PIs contained in each of these general areas is provided. The four EG&G facilities whose performance is charted herein are as follows: (1) The Advanced Test Reactor (ATR), (2) The Radioactive Waste Management Complex (RWMC), (3) The Waste Experimental Reduction Facility (WERF), and (4) The Test Reactor Area (TRA) Hot Cells.

  18. 2nd international expert meeting straw power; 2. Internationale Fachtagung Strohenergie

    NONE

    2012-06-15

    Within the 2nd Guelzow expert discussions, held on 29-30 March 2012 in Berlin (Federal Republic of Germany), the following lectures were given: (1) Promotion of the utilisation of straw in Germany (A. Schuette); (2) The significance of straw in heat and power generation in the EU-27 member states in 2020 and 2030, under consideration of costs and sustainability criteria (C. Panoutsou); (3) State of the art of the energetic utilization of hay goods in Europe (D. Thraen); (4) Incineration-technological characterisation of straw, based on analysis data as well as measured data from large-scale installations (I. Obernberger); (5) Energetic utilization of hay goods in Germany (T. Hering); (6) Actual state of the art towards establishing the first German straw thermal power station (R. Knieper); (7) Straw thermal power plants at agricultural sow farms and poultry farms (H. Heilmann); (8) Country report on power from straw in Denmark (A. Evald); (9) Country report on power from straw in Poland (J. Antonowicz); (10) Country report on power from straw in China (J. Zhang); (11) Energetic utilisation of straw in Czechia (D. Andert); (12) Mobile pelletization of straw (S. Auth); (13) Experiences with the straw thermal power plant of Vattenfall (N. Kirkegaard); (14) Available straw potentials in Germany (potential, straw provision costs) (C. Weiser); (15) Standardization of hay goods and test fuels - classification and development of product standards (M. Englisch); (16) Measures for the reduction of emissions at hay good incinerators (V. Lenz); (17) Fermentation of straw - state of the art and perspectives (G. Reinhold); (18) Cellulosic ethanol from agricultural residues - sustainable biofuels (A. Hartmair); (19) Syngas by fermentation of straw (N. Dahmen); (20) Construction using straw (D. Scharmer).

  19. Highlights of the 2nd session of the General Conference

    The debates of the Conference were based on the 'First Annual Report to the General Conference' (GC(n)/39), covering the period 23 October 1957 to 30 June 1958, and the 'Programme and Budget for 1959' (GC(H)/36), both submitted by the Board of Governors, and on the statement made by the Director General on 22 September 1958 (GC(II)OR. 14), which brought the survey of the Agency's activities up to date. Delegates appraised the first year's achievements, and many speakers emphasized the importance of close international co-operation in the field of atomic energy and dealt with the role the IAEA was called upon to play. Referring to the offers of various materials, in particular fissionable materials, several delegates supported the statement made by the Director General in his opening address that some preferential treatment must be given to the Agency by the offering countries, thereby providing some inducement for governments to utilize the channels of true international co-operation. Issues concerning safeguards and reactors were discussed. The Conference recommended that the Board of Governors should give earnest and early consideration to initiating action for a survey of the needs of the less developed countries in the matter of nuclear power generation plants, to the adoption of measures for continuing study of the development of the technology and economics of small and medium scale nuclear power reactors best suited for less developed countries, and to assisting them in planning and implementing their training programmes in that connection. The Conference also voted in favour of the appropriations necessary for the setting up of laboratory facilities. Practically all delegates agreed, although with varying emphasis, on the importance of technical assistance and other activities of the IAEA which would soonest benefit the less advanced countries. The General Conference finally approved, by 59 votes, none against and one abstention, the Board of Governors

  20. Generation of controller of an underwater robot for constant altitude cruising by self-training. 2nd Report. ; Modification of forward model and adaptation process. Jiko kunren ni yoru kaichu robot no teikodo koko. 2. ; Forward model to controller no chosei hoho no kairyo

    Suto, T.; Ura, T. (The University of Tokyo, Tokyo (Japan). Institute of Industrial Science)

    1993-12-01

    As a guidance system to be applied to constant-altitude cruising of an autonomous underwater robot, an improvement of the SONCS (composed of a controller network and a forward model network) proposed in the authors' previous paper is reported. The forward model network was divided into three modules, respectively holding a function representing the dynamics of the robot and deriving the state quantities at the next time step, a function deriving the distance measurement data at the next step, and a function calculating the altitude from the distance measurement data. A difference-type network, which represents the output as the increment from the input, and a learning method which generates a temporary instruction data train from signals propagating backwards through the forward network, and thereby adjusts the controller network, were introduced. The effectiveness of these three technical improvements was demonstrated by numerical simulation. 4 refs., 12 figs.
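
The core idea of adjusting a controller through a forward model can be shown with a deliberately simplified 1-D analogue (not the SONCS implementation): here the forward model of the altitude dynamics is assumed known analytically, the controller is a single proportional gain, and the gain is updated with the gradient of the altitude error propagated back through the model. In SONCS both the model and the controller are neural networks trained from data; every constant below is invented.

```python
def train_controller(c=0.5, target=5.0, k=0.0, lr=0.05, episodes=200):
    """Tune feedback gain k for the toy plant alt' = alt + c*u,
    using gradients of the squared altitude error obtained by
    differentiating through the (here analytic) forward model."""
    for _ in range(episodes):
        alt = 0.0
        for _ in range(20):            # one short cruising episode
            e = alt - target
            u = -k * e                 # proportional controller
            nxt = alt + c * u          # forward model prediction
            # d/dk of (nxt - target)^2, propagated through the model
            grad = 2.0 * (nxt - target) * c * (-e)
            k -= lr * grad
            alt = nxt
    return k

gain = train_controller()
```

For this plant the error is removed in one step when k = 1/c, so the training should drive the gain toward 2.0 under the default constants.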

  1. The Effect of Using Computer Edutainment on Developing 2nd Primary Graders' Writing Skills

    Mohammed Abdel Raheem, Azza Ashraf

    2011-01-01

    The present study attempted to examine the effect of using computer edutainment on developing 2nd graders' writing skills. The study comprised thirty second-year primary stage pupils enrolled in Bani Hamad primary governmental school, Minia governorate. The study adopted the quasi-experimental design. Thirty participants were randomly assigned to one…

  2. Proceedings of the 2nd symposium on valves for coal conversion and utilization

    Maxfield, D.A. (ed.)

    1981-01-01

    The 2nd symposium on valves for coal conversion and utilization was held October 15 to 17, 1980. It was sponsored by the US Department of Energy, Morgantown Energy Technology Center, in cooperation with the Valve Manufacturers Association. Seventeen papers have been entered individually into EDB and ERA. (LTN)

  3. Introductory statement to the 2nd scientific forum on sustainable development: A role for nuclear power?

    In his Introductory Statement to the 2nd Scientific Forum on 'Sustainable Development - A Role for Nuclear Power?' (Vienna, 28 September 1999), the Director General of the IAEA focused on the main aspects concerning the development of nuclear power: safety, competitiveness, and public support

  4. Proceedings of the 2nd Mediterranean Conference on Information Technology Applications (ITA '97)

    This is the proceedings of the 2nd Mediterranean Conference on Information Technology Applications, held in Nicosia, Cyprus, on 6-7 November 1997. It contains 16 papers. Two of these fall within the scope of INIS and deal with Telemetry, Radiation Monitoring, Environment Monitoring, Radiation Accidents, Air Pollution Monitoring, Diagnosis, Computers, Radiology and Data Processing

  5. The 2nd Global Space Development Summit Held In Washington DC

    Bian Ji

    2009-01-01

    The 2nd Global Space Development Summit, organized by the Center for Strategic and International Studies in partnership with the American Institute of Aeronautics and Astronautics, the Space Foundation and the Chinese Society of Astronautics (CSA), took place in Washington, D.C., on November 12-13.

  6. 2nd International Congress on Economics and Business – New Economic Trends and Business Opportunities

    ARIK, Nazlı

    2016-01-01

    Abstract. In this study, the evaluation of the 2nd International Congress on Economics and Business: New Economic Trends and Business Opportunities, held on May 30 - June 3, 2016 in Sarajevo, will be presented. Keywords. Economics, Economic Trends, Business Opportunities, Labour Relations, Financial Economics. JEL. M10, M20, O10.

  7. Stem cells and cancer immunotherapy: Arrowhead’s 2nd annual cancer immunotherapy conference

    Bot, Adrian; Chiriva-Internati, Maurizio; Cornforth, Andrew; Brian J Czerniecki; Ferrone, Soldano; Geles, Kenneth; Greenberg, Philip D.; Hurt, Elaine; Koya, Richard C.; Masoud H Manjili; Matsui, William; Morgan, Richard A.; Palena, Claudia M; Powell Jr, Daniel J; Restifo, Nicholas P

    2014-01-01

    Investigators from academia and industry gathered on April 4 and 5, 2013, in Washington DC at the Arrowhead’s 2nd Annual Cancer Immunotherapy Conference. Two complementary concepts were discussed: cancer “stem cells” as targets and therapeutic platforms based on stem cells.

  8. Mash-Up Personal Learning Environments. Proceedings of the 2nd Workshop MUPPLE’09

    Wild, Fridolin; Kalz, Marco; Palmér, Matthias; Müller, Daniel

    2009-01-01

    Wild, F., Kalz, M., Palmér, M., & Müller, D. (Eds.). (2009). Mash-Up Personal Learning Environments. Proceedings of the 2nd Workshop MUPPLE’09. September, 29, 2009, Nice, France: CEUR Workshop Proceedings, online http://sunsite.informatik.rwth-aachen.de/Publications/CEUR-WS/Vol-506/

  9. Managing Complexity in Next Generation Robotic Spacecraft: From a Software Perspective

    Reinholtz, Kirk

    2008-01-01

    This presentation highlights the challenges in the design of software to support robotic spacecraft. Robotic spacecraft offer a higher degree of autonomy; however, more capabilities are currently required, primarily in the software, while providing the same or a higher degree of reliability. The complexity of designing such an autonomous system is great, particularly while attempting to address the needs for increased capability and high reliability without increased time or budget. Efforts to develop programming models for the new hardware and to integrate the software architecture are highlighted.

  10. Conceptual schema generation from organizational models in an automatic software production process.

    Martínez Rebollar, Alicia

    2008-01-01

    Currently, software engineering has proposed multiple techniques to improve software development; however, the final goal has not been satisfied. In many cases, the software product does not satisfy the real needs of the final customers of the business where the system will operate. One of the main problems of current work is the lack of a systematic approach for mapping each modeling concept of the problem domain (organizational mod...

  11. Individual Differences In The School Performance of 2nd-Grade Children Born to Low-Income Adolescent Mothers

    Apiwattanalunggarn, Kunlakarn Lekskul; Luster, Tom

    2005-01-01

    The purpose of this study was to investigate factors that contribute to individual differences in the school performance of 2nd-grade children born to adolescent mothers. The sample of this study was 90 low-income adolescent mothers and their children. Data were collected from the adolescent mothers and their first-born children, now in 2nd grade,…

  12. GSIMF: a web service based software and database management system for the next generation grids

    To process the vast amount of data from high energy physics experiments, physicists rely on Computational and Data Grids; yet the distribution, installation, and updating of a myriad of different versions of different programs over the Grid environment is complicated, time-consuming, and error-prone. Our Grid Software Installation Management Framework (GSIMF) is a set of Grid Services developed for managing versioned and interdependent software applications and file-based databases over the Grid infrastructure. These Grid services provide a mechanism to install software packages on distributed Grid computing elements, thus automating the software and database installation management process on behalf of the users. This enables users to remotely install programs and tap into the computing power provided by Grids
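
At its core, installing "versioned and interdependent" packages means ordering installations so that every dependency precedes its dependents. A minimal sketch using Python's standard-library topological sort is shown below; the package names are invented, and GSIMF's actual web-service interface and database handling are not represented.

```python
from graphlib import TopologicalSorter

def install_order(packages):
    """Return an installation order in which every package comes after
    all of its dependencies. `packages` maps name -> set of dependency
    names; a CycleError is raised for circular dependencies."""
    return list(TopologicalSorter(packages).static_order())

# Hypothetical versioned software stack on a Grid computing element
stack = {
    "analysis-app-2.1": {"root-6.30", "geant4-11.2"},
    "geant4-11.2": {"clhep-2.4"},
    "root-6.30": set(),
    "clhep-2.4": set(),
}
order = install_order(stack)
```

A deployment service would then walk `order`, installing each package on the target computing element in turn.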

  13. System and Component Software Specification, Run-time Verification and Automatic Test Generation Project

    National Aeronautics and Space Administration — This proposal is for the creation of a system-level software specification and verification tool. This proposal suggests a major leap-forward in usability of...

  14. A 2nd generation static model of greenhouse energy requirements (horticern) : a comparison with dynamic models

    Jolliet, O; Munday, G L

    1989-01-01

    Optimisation of a greenhouse and its components requires a suitable model permitting precise determination of its energy requirements. Existing static models are simple but lack precision; dynamic models, though more precise, are unsuitable for use over long periods and difficult to handle in practice. A theoretical study and measurements from the CERN trial greenhouse have allowed the development of a new static model named "HORTICERN", precise and easy to use for predicting energy consumption, which takes into account the effects of solar energy, wind and radiative loss to the sky. This paper compares the HORTICERN model with the dynamic models of Bot, Takakura, Van Bavel and Gembloux, and demonstrates that its precision is comparable; differences are on average less than 5%, independent of the type of greenhouse (e.g. single or double glazing, Hortiplus, etc.) and climate. The HORTICERN method has been developed for PC use and is proving to be a powerful tool for greenhouse optimisation by research work...
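
The general shape of a static energy model is a steady-state balance of conductive losses against solar gains, integrated over long periods. The sketch below is a generic single-node balance with invented parameter values, not the published HORTICERN formulation, which also treats wind-dependent losses and radiative loss to the sky explicitly.

```python
def static_heating_demand(area_m2, u_value, t_inside, t_outside_mean,
                          solar_gain_w, hours):
    """Very simplified static greenhouse energy balance: conductive loss
    through the cover minus usable solar gain, clipped at zero, then
    integrated over the period. Returns heating demand in kWh."""
    loss_w = u_value * area_m2 * (t_inside - t_outside_mean)
    net_w = max(loss_w - solar_gain_w, 0.0)
    return net_w * hours / 1000.0

# e.g. a 500 m2 single-glazed house (U ~ 6 W/m2K), 18 degC setpoint,
# 5 degC mean outside temperature, 10 kW mean solar input, one month (720 h)
demand = static_heating_demand(500, 6.0, 18.0, 5.0, 10_000, 720)
```

The appeal of the static form is exactly this: monthly or seasonal demand follows from mean values, with no time-stepping of the greenhouse state.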

  15. Validation of the 2nd Generation Proteasome Inhibitor Oprozomib for Local Therapy of Pulmonary Fibrosis

    Semren, Nora; Habel-Ungewitter, Nunja C.; Fernandez, Isis E.; Königshoff, Melanie; Eickelberg, Oliver; Stöger, Tobias; Meiners, Silke

    2015-01-01

    Proteasome inhibition has been shown to prevent development of fibrosis in several organs including the lung. However, effects of proteasome inhibitors on lung fibrosis are controversial and cytotoxic side effects of the overall inhibition of proteasomal protein degradation cannot be excluded. Therefore, we hypothesized that local lung-specific application of a novel, selective proteasome inhibitor, oprozomib (OZ), provides antifibrotic effects without systemic toxicity in a mouse model of lu...

  16. Biomass pyrolysis as an alternative process for the production of 2nd generation biofuels

    Kogdenko, Nadezda

    2010-01-01

    Biofuel production from renewable energy sources is a topic that has been studied by scientists and discussed on the political agenda for a couple of decades. In this period of time, however, it was discovered that approaches and technologies used until

  17. Performance of 2nd Generation BaBar Resistive Plate Chambers

    Anulli, F.; Baldini, R.; Calcaterra, A.; de Sangro, R.; Finocchiaro, G.; Patteri, P.; Piccolo, M.; Zallo, A.; /Frascati; Cheng, C.H.; Lange, D.J.; Wright, D.M.; /LLNL,; Messner, R.; Wisniewski, William J.; /SLAC; Pappagallo, M.; /Bari U. /INFN, Bari; Andreotti, M.; Bettoni, D.; Calabrese, R.; Cibinetto, G.; Luppi, E.; Negrini, M.; /Ferrara; Capra, R.; /Genoa U. /INFN, Genoa /Naples U. /INFN, Naples /Perugia U. /INFN, Perugia /Pisa U. /INFN, Pisa /Rome U. /INFN, Rome /Oregon U. /UC, Riverside

    2005-07-12

    The BaBar detector has operated nearly 200 Resistive Plate Chambers (RPCs), constructed as part of an upgrade of the forward endcap muon detector, for the past two years. The RPCs experience widely different background and luminosity-driven singles rates (0.01-10 Hz/cm²) depending on position within the endcap. Some regions have integrated over 0.3 C/cm². RPC efficiency measured with cosmic rays is high and stable. The average efficiency measured with beam is also high. However, a few of the highest rate RPCs have suffered efficiency losses of 5-15%. Although constructed with improved techniques and minimal use of linseed oil, many of the RPCs, which are operated in streamer mode, have shown increased dark currents and noise rates that are correlated with the direction of the gas flow and the integrated current. Studies of the above aging effects are presented and correlated with detector operating conditions.

  18. Utilisation of 2nd generation web technologies in master level vocational teacher training

    Péter Tóth

    2009-03-01

    Full Text Available The Masters level Opportunities and Technological Innovation in Vocational Teacher Education project (project site: http://motivate.tmpk.bmf.hu/) aims to develop the use and management of virtual learning environments in the area of vocational teacher training, drawing on a well-established international partnership of institutions providing both technical and educational expertise. This paper gives an overall picture of the first results and products of the collaboration. We touch upon the goals, the assessments and the learning process of the “Multimedia and e-Learning: e-learning methods and tools” module in detail. The main cooperative and collaborative devices are presented in the virtual learning environment. The communication during collaborative learning, the structured debate on the forum and the benefits of collaborative learning in a VLE are discussed at the end of this paper.

  19. Power plant intake quantification of wheat straw composition for 2nd generation bioethanol optimization

    Lomborg, Carina J.; Thomsen, Mette Hedegaard; Jensen, Erik Steen;

    2010-01-01

    (glucan), hemicelluloses (xylan, arabinan), and lignin. Aiming at chemometric multivariate calibration, 44 pre-selected samples were subjected to spectroscopy and reference analysis. For glucan and xylan prediction accuracies (slope: 0.89, 0.94) and precisions (r2: 0.87) were obtained, corresponding to...

  20. The new 2nd-generation laser station at Santiago de Cuba

    Masevich, A. G.; Chepurnov, B. D.; Fundora, M.; del Pino, J.; Kautzleben, H.

    The new laser-radar station at Santiago de Cuba was equipped in cooperation between the Academies of Sciences of the USSR, Cuba and the G.D.R. The system is based on a modified satellite-tracking camera (SBG). Its basic concept and technical performance are similar to the laser-radar station of the Central Institute for Physics of the Earth, Potsdam. During a first six-week observation campaign (Dec. 1985 - Jan. 1986), 70 satellite passes (including 40 passes of the geodynamical satellite LAGEOS) were obtained.

  1. Validation of the 2nd Generation Proteasome Inhibitor Oprozomib for Local Therapy of Pulmonary Fibrosis.

    Nora Semren

    Full Text Available Proteasome inhibition has been shown to prevent development of fibrosis in several organs including the lung. However, effects of proteasome inhibitors on lung fibrosis are controversial and cytotoxic side effects of the overall inhibition of proteasomal protein degradation cannot be excluded. Therefore, we hypothesized that local lung-specific application of a novel, selective proteasome inhibitor, oprozomib (OZ), provides antifibrotic effects without systemic toxicity in a mouse model of lung fibrosis. Oprozomib was first tested on the human alveolar epithelial cancer cell line A549 and in primary mouse alveolar epithelial type II cells regarding its cytotoxic effects on alveolar epithelial cells, and compared to the FDA-approved proteasome inhibitor bortezomib (BZ). OZ was less toxic than BZ and provided high selectivity for the chymotrypsin-like active site of the proteasome. In primary mouse lung fibroblasts, OZ showed significant anti-fibrotic effects, i.e. reduction of collagen I and α-smooth muscle actin expression, in the absence of cytotoxicity. When applied locally into the lungs of healthy mice via instillation, OZ was well tolerated and effectively reduced proteasome activity in the lungs. In bleomycin-challenged mice, however, locally applied OZ resulted in accelerated weight loss and increased mortality of treated mice. Further, OZ failed to reduce fibrosis in these mice. While upon systemic application OZ was well tolerated in healthy mice, it augmented rather than attenuated fibrotic remodelling of the lung in bleomycin-challenged mice. To conclude, the low toxicity and antifibrotic effects of OZ in pulmonary fibroblasts could not be confirmed for pulmonary fibrosis of bleomycin-treated mice. In light of these data, the use of proteasome inhibitors as therapeutic agents for the treatment of fibrotic lung diseases should be considered with caution.

  2. Benchmarking of Modern Data Analysis Tools for a 2nd generation Transient Data Analysis Framework

    Goncalves, Nuno

    2016-01-01

    During the past year of operating the Large Hadron Collider (LHC), the amount of transient accelerator data to be persisted and analysed has been steadily growing. Since the startup of the LHC in 2006, the weekly data storage requirement has come to exceed what the system was initially designed to accommodate in a full year of operation. Moreover, it is predicted that data acquisition rates will continue to increase in the future, due to foreseen improvements in the infrastructure within the scope of the High Luminosity LHC project. Despite the efforts to improve and optimize the current data storage infrastructures (CERN Accelerator Logging Service and Post Mortem database), some limitations still persist, and a different approach is required to scale up and provide efficient services for future machine upgrades. This project aims to explore one of the novel solutions proposed to solve the problem of working with large datasets. The configuration is composed of Spark for ...

  3. Crystal structures and phase transformation of deuterated lithium imide, Li{sub 2}ND

    Balogh, Michael P. [Chemical and Environmental Sciences Laboratory, General Motors Research and Development Center, 30500 Mound Road, Warren, MI 48090-9055 (United States)]. E-mail: michael.p.balogh@gm.com; Jones, Camille Y. [NIST Center for Neutron Research, 100 Bureau Drive, Stop 8562, National Institute of Standards and Technology, Gaithersburg, MD 20899-8562 (United States); Herbst, J.F. [Materials and Processes Laboratory, General Motors Research and Development Center, 30500 Mound Road, Warren, MI 48090-9055 (United States); Hector, Louis G. [Materials and Processes Laboratory, General Motors Research and Development Center, 30500 Mound Road, Warren, MI 48090-9055 (United States); Kundrat, Matthew [Aerotek Corp., 26211 Central Park Blvd., Southfield, MI 48076 (United States)

    2006-08-31

    We have investigated the crystal structure of deuterated lithium imide, Li{sub 2}ND, by means of neutron and X-ray diffraction. An order-disorder transition occurs near 360 K. Below that temperature Li{sub 2}ND can be described to the same level of accuracy as a disordered cubic (Fd-3m) structure with partially occupied Li 32e sites or as a fully occupied orthorhombic (Ima2 or Imm2) structure. The high-temperature phase is best characterized as disordered cubic (Fm-3m) with D atoms randomized over the 192l sites. Density functional theory calculations complement and support the diffraction analyses. We compare our findings in detail with previous studies.

  4. Crystal structures and phase transformation of deuterated lithium imide, Li2ND

    We have investigated the crystal structure of deuterated lithium imide, Li2ND, by means of neutron and X-ray diffraction. An order-disorder transition occurs near 360 K. Below that temperature Li2ND can be described to the same level of accuracy as a disordered cubic (Fd-3m) structure with partially occupied Li 32e sites or as a fully occupied orthorhombic (Ima2 or Imm2) structure. The high-temperature phase is best characterized as disordered cubic (Fm-3m) with D atoms randomized over the 192l sites. Density functional theory calculations complement and support the diffraction analyses. We compare our findings in detail with previous studies

  5. Revised data for 2nd version of nuclear criticality safety handbook/data collection

    This paper outlines the data prepared for the 2nd version of the Data Collection of the Nuclear Criticality Safety Handbook. These data are discussed in the order of its preliminary table of contents. The nuclear characteristic parameters (k∞, M², D) were derived, and subcriticality judgment graphs were drawn for eleven kinds of fuels which are often encountered in criticality safety evaluation of fuel cycle facilities. For calculation of criticality data, benchmark calculations were made using the combination of the continuous-energy Monte Carlo criticality code MVP and the Japanese Evaluated Nuclear Data Library JENDL-3.2. The calculation errors were evaluated for this combination. The implementation of the experimental results obtained using the NUCEF facilities into the 2nd version of the Data Collection is under discussion; therefore, related data were only mentioned. A database is being prepared to retrieve revised data easily. (author)

  6. Analysis of plume oxidation during the air pollution episode of September 2nd 1998

    Harrison, R.M.; Baggot, S.

    2001-08-01

    This study was commissioned by the Environment Agency in order to provide further investigation into the air pollution episode of September 2nd, 1998, which afflicted parts of the Midlands and South Yorkshire. A report by the Environment Agency based upon numerical modelling by the Meteorological Office indicated a number of major industrial facilities as the source of the emissions responsible for the episode. In this report an investigation is made of likely chemical changes during air-mass transport and their impact on air composition at ground level and downwind receptor locations. It is concluded that the measurements of air quality at Nottingham Centre, Stoke on Trent and Birmingham Centre on 2nd September 1998 are consistent with emissions from the sources identified in the Environment Agency report on this episode when allowance is made for oxidation of sulphur and nitrogen oxides within the plume. 3 refs., 9 figs., 9 tabs.

  7. OASIS4 – a coupling software for next generation earth system modelling

    R. Redler

    2010-01-01

    Full Text Available In this article we present a new version of the Ocean Atmosphere Sea Ice Soil coupling software (OASIS4). With this new fully parallel OASIS4 coupler we target the needs of Earth system modelling in its full complexity. The primary focus of this article is to describe the design of the OASIS4 software and how the coupling software drives the whole coupled model system, ensuring the synchronization of the different component models. The application programmer interface (API) manages the coupling exchanges between arbitrary climate component models, as well as the input and output from and to files of each individual component. The OASIS4 Transformer instance performs the parallel interpolation and transfer of the coupling data between source and target model components. As a new core technology of the software, the fully parallel search algorithm of OASIS4 is described in detail. First benchmark results are discussed with simple test configurations to demonstrate the efficiency and scalability of the software when applied to Earth system model components. Typically the compute time needed to perform the search is of the order of a few seconds and is only weakly dependent on the grid size.

  8. OASIS4 – a coupling software for next generation earth system modelling

    R. Redler

    2009-07-01

    Full Text Available In this article we present a new version of the Ocean Atmosphere Sea Ice Soil coupling software (OASIS4). With this new fully parallel OASIS4 coupler we target the needs of Earth system modelling in its full complexity. The primary focus of this article is to describe the design of the OASIS4 software and how the coupling software drives the whole coupled model system, ensuring the synchronization of the different component models. The application programmer interface (API) manages the coupling exchanges between arbitrary climate component models, as well as the input and output from and to files of each individual component. The OASIS4 Transformer instance performs the parallel interpolation and transfer of the coupling data between source and target model components. As a new core technology of the software, the fully parallel search algorithm of OASIS4 is described in detail. First benchmark results are discussed with simple test configurations to demonstrate the efficiency and scalability of the software when applied to Earth system model components. Typically the compute time needed to perform the search is of the order of a few seconds and is only weakly dependent on the grid size.
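
    The core operation the abstract describes — search for the source grid points relevant to each target point, then transfer the coupling field across — can be sketched in much-simplified serial form. The function names and the nearest-neighbour scheme below are illustrative assumptions, not the OASIS4 API:

```python
import math

def nearest_source_index(src_points, tgt_point):
    """Toy stand-in for the coupler's search step: find the source grid
    point closest to one target grid point."""
    return min(range(len(src_points)),
               key=lambda i: math.dist(src_points[i], tgt_point))

def couple_field(src_points, src_values, tgt_points):
    """Map a field from a source grid onto a target grid by
    nearest-neighbour interpolation (the transfer step)."""
    return [src_values[nearest_source_index(src_points, t)]
            for t in tgt_points]
```

    A real coupler distributes the search over the partitions of both model grids and supports higher-order interpolation; this serial sketch only shows the search-then-transfer structure.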

  9. Caries correction factors applied to a Punic (6th - 2nd BC) population from Ibiza (Spain)

    Márquez-Grant, N

    2009-01-01

    Caries correction factors were applied to a Punic (6th-2nd century BC) rural sample from the island of Ibiza (Spain). Data obtained on dental caries and ante-mortem tooth loss provided a corrected rate of 12.8% of teeth with caries. This result, in conjunction with other sources of information such as stable isotope analysis and documentary evidence, indicated a diet based on terrestrial protein (mainly carbohydrates) and a low component of marine protein. The paper suggests further research ...

  10. 2006: 2nd Jameson - D.I.T Faculty of Tourism and Food Cocktail Competition

    Murphy, James Peter

    2006-01-01

    The 2nd Jameson - D.I.T Faculty of Tourism and Food Cocktail Competition took place on Thursday November 30th 2006. This initiative between the Faculty of Tourism and Food and Jameson offered over 60 hospitality and bartending students, currently studying and working in the hospitality and licensed trade industries, the opportunity to improve their skills in creative drinks mixing, in direct response to the growing demand for new cocktails and exciting new drinks to suit every season. Overall p...

  11. Protection of Architectural Heritage in Latvia, the 2nd Half of the 19th Century - 1940

    Mintaurs, Mārtiņš

    2008-01-01

    ANNOTATION: The dissertation “Protection of Architectural Heritage in Latvia, the 2nd Half of the 19th Century – 1940” was completed at the University of Latvia, Department of Archaeology and Ancillary Historical Disciplines of the Faculty of History and Philosophy, in 2007 by Mārtiņš Mintaurs under the guidance of associate professor Aleksandrs Gavrilins, Dr. hist. The dissertation includes an introduction, an examination of sources and bibliography, three chapters, a conclusion, an index of...

  12. Production of artificial ionospheric layers by frequency sweeping near the 2nd gyroharmonic

    Pedersen, T.; M. McCarrick; Reinisch, B.; Watkins, B.; Hamel, R.; Paznukhov, V.

    2011-01-01

    Artificial ionospheric plasmas descending from the background F-region have been observed on multiple occasions at the High Frequency Active Auroral Research Program (HAARP) facility since it reached full 3.6 MW power. Proximity of the transmitter frequency to the 2nd harmonic of the electron gyrofrequency (2fce) has been noted as a requirement for their occurrence, and their disappearance after only a few minutes has been attributed to the increasing...

  13. Group field theory as the 2nd quantization of Loop Quantum Gravity

    Oriti, Daniele

    2013-01-01

    We construct a 2nd quantized reformulation of canonical Loop Quantum Gravity at both kinematical and dynamical level, in terms of a Fock space of spin networks, and show in full generality that it leads directly to the Group Field Theory formalism. In particular, we show the correspondence between canonical LQG dynamics and GFT dynamics leading to a specific GFT model from any definition of quantum canonical dynamics of spin networks. We exemplify the correspondence of dynamics in the specifi...

  14. Study on self-medication among 2nd year medical students

    K. Jagadeesh; K. N. Chidananda; Sreenivas P. Revankar; Nagaraja S. Prasad

    2015-01-01

    Background: Self-medication is the use of medicines by individuals to treat self-recognized symptoms and illness. Self-medication is a common type of self-care behavior in the general public, but medical students differ in such practice, as they have knowledge about drugs and diseases. Methods: The present study involved 100 2nd year final term medical students at “Shivamogga Institute of Medical Sciences,” Shivamogga, Karnataka. The study was questionnaire based, and the resul...

  15. Standardization and Innovation The 2nd Shenzhen Hi-tech Standardization Forum Held

    Huang Manxue; Huang Li

    2006-01-01

    "The 2nd Shenzhen Hi-tech Standardization Forum" was held in the Shenzhen Civil Center on November 25th, 2005. The forum lasted for two days. There were around 400 delegates from enterprises and standardization organizations participating in this forum. Nine experts from standardization organizations, intellectual property organizations and enterprises gave speeches focusing on the topic "Standardization and Innovation". All participants agreed that standardization effectively promotes innovation.

  16. Archaeometric study of glass beads from the 2nd century BC cemetery of Numantia

    García Heras, Manuel; Rincón López, Jesús M.; Alfredo JIMENO MARTÍNEZ; Villegas Broncano, María Angeles

    2003-01-01

    Recent archaeological fieldwork undertaken in the Celtiberian cremation necropolis of Numantia (Soria, Spain) has provided a group of glass beads from the 2nd century BC. Such glass beads were part, together with other metallic and ceramic items, of the offerings deposited with the dead. They are ring-shaped in typology and deep-blue, amber, or semitransparent white in colour. This paper reports results derived from the chemical and microstructural characterization carried out on a representa...

  17. City look package: the 2nd Summer Youth Olympic Games : Nanjing 2014

    2016-01-01

    The City Look Package of the 2nd Summer Youth Olympic Games (hereinafter referred to as "Nanjing 2014") is the package of designs developed to decorate the host city during Games time, comprised of usage guidelines for combinations of fundamental elements inside the city, including core graphics, emblem, slogan and so on. As a result, it is the most important guiding document in design and implementation of the city Look. In order to protect the authenticity, integrity and consistency of the ...

  18. A Communications Guide for Sustainable Development: How Interested Parties Become Partners, 2nd Edition

    Fowler, Kimberly M.; Hund, Gretchen; Engel-Cox, Jill A.

    2016-03-06

    The 2nd edition is an updated version plus an e-book. This book was developed to assist organizations in designing and managing their communication and stakeholder involvement programs. The guidebook describes a step-by-step approach, provides case studies, and presents tools to consider. The book uses a scenario approach to outline changes an organization may confront, and provides a menu of communication and engagement activities that support organizational decision making.

  19. Idaho National Laboratory Quarterly Performance Analysis for the 2nd Quarter FY 2015

    Mitchell, Lisbeth A. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-04-01

    This report is published quarterly by the Idaho National Laboratory (INL) Quality and Performance Management Organization. The Department of Energy (DOE) Occurrence Reporting and Processing System (ORPS), as prescribed in DOE Order 232.2, “Occurrence Reporting and Processing of Operations Information,” requires a quarterly analysis of events, both reportable and not reportable, for the previous 12 months. This report is the analysis of events for the 2nd Qtr FY-15.

  20. Application research on enhancing near-infrared micro-imaging quality by 2nd derivative

    Wang, Dong; Ma, Zhi-hong; Zhao, Liu; Wang, Bei-hong; Han, Ping; Pan, Li-gang; Wang, Ji-hua

    2013-08-01

    Near-infrared micro-imaging provides not only the sample's spatial distribution information, but also the spectroscopic information of each pixel. In this paper, an artificial sample with a given distribution of wheat flour and formaldehyde sodium sulfoxylate was taken as an example to investigate data processing methods for enhancing the quality of near-infrared micro-imaging. After the near-infrared spectroscopic features of wheat flour and formaldehyde sodium sulfoxylate had been studied, compare correlation imaging and 2nd derivative imaging were applied to the near-infrared micro-image of the artificial sample. Furthermore, the two methods were combined, i.e. a 2nd derivative compare correlation image was acquired. The results indicated that the difference between the correlation coefficients of the two substances, i.e. wheat flour and formaldehyde sodium sulfoxylate, against the reference spectrum increased from 0.001 in the compare correlation image to 0.796 in the 2nd derivative compare correlation image, which enhances imaging quality efficiently. This study is, to some extent, of reference significance for near-infrared micro-imaging research on agricultural products and foods.
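
    The processing chain described above — a point-wise 2nd derivative of each pixel spectrum followed by correlation against a reference spectrum — can be sketched as follows. This is a minimal illustration; the finite-difference derivative and the function names are my assumptions, not the authors' implementation:

```python
def second_derivative(spectrum):
    """Central finite-difference 2nd derivative; the two endpoints are dropped."""
    return [spectrum[i - 1] - 2 * spectrum[i] + spectrum[i + 1]
            for i in range(1, len(spectrum) - 1)]

def correlation(a, b):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a) ** 0.5
    vb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (va * vb)

def compare_correlation_image(cube, reference, derivative=True):
    """Per-pixel correlation map of a hyperspectral cube against a
    reference spectrum, optionally on 2nd-derivative spectra."""
    ref = second_derivative(reference) if derivative else reference
    return [[correlation(second_derivative(px) if derivative else px, ref)
             for px in row]
            for row in cube]
```

    Because the 2nd derivative suppresses baseline offsets and sharpens overlapping bands, the per-pixel correlation map separates the two substances more cleanly than correlation on the raw spectra, which is the effect the reported 0.001 → 0.796 contrast increase reflects.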

  1. Automatic Generation of Machine Emulators: Efficient Synthesis of Robust Virtual Machines for Legacy Software Migration

    Franz, Michael; Gal, Andreas; Probst, Christian

    As older mainframe architectures become obsolete, the corresponding legacy software is increasingly executed via platform emulators running on top of more modern commodity hardware. These emulators are virtual machines that often include a combination of interpreters and just-in-time compilers. Implementing interpreters and compilers for each combination of emulated and target platform independently of each other is a redundant and error-prone task. We describe an alternative approach that automatically synthesizes specialized virtual-machine interpreters and just-in-time compilers, which then execute on top of an existing software portability platform such as Java. The result is a considerably reduced implementation effort.

  2. Software concept of in-service diagnostic systems for nuclear steam generating facilities

    The concept of software systems for in-service diagnostics is presented for the primary circuits of WWER-440 and WWER-1000 reactors. The basic and supplementary systems and user software are described for the collection, processing and evaluation of diagnostic signals from the primary circuits of the Dukovany and Bohunice nuclear power plants, and the design is presented of the hierarchical structure of computers in the diagnostic systems of the Mochovce and Temelin nuclear power plants. The systems are operated using Czechoslovak-made computers of the ADT production series with the RTE-II or DOS IV operating systems. (J.B.)

  3. Development of the software tool for generation and visualization of the finite element head model with bone conduction sounds

    Nikolić, Dalibor; Milošević, Žarko; Saveljić, Igor; Filipović, Nenad

    2015-12-01

    Vibration of the skull causes a hearing sensation, which we call bone conduction (BC) sound. There have been several investigations of the transmission properties of bone-conducted sound. The aim of this study was to develop a software tool for easy generation of a finite element (FE) model of the human head with different materials based on human head anatomy, and to calculate sound conduction through the head. The developed software tool generates a model in a few steps. The first step is segmentation of CT medical images (DICOM) and generation of surface mesh files (STL). Each STL file represents a different layer of the human head with different material properties (brain, CSF, different layers of the skull bone, skin, etc.). The next steps are to build a tetrahedral mesh from the obtained STL files, to define the FE model boundary conditions and to solve the FE equations. This tool uses the PAK solver, open-source software implemented in the SIFEM FP7 project, for calculations of the head vibration. The purpose of this tool is to show the impact of bone-conducted sound on the hearing system and to estimate how well the obtained results match experimental measurements.

  4. User's manual for the UNDERDOG [Underground Nuclear Depository Evaluation, Reduction, and Detailed Output Generator] data reduction software

    UNDERDOG is a computer program that aids experimentalists in the process of data reduction. This software allows a user to reduce, extract, and generate displays of data collected at the WIPP site. UNDERDOG contains three major functional components: a Data Reduction package, a Data Analysis interface, and a Publication-Quality Output generator. It also maintains audit trails of all actions performed for quality assurance purposes and provides mechanisms which control an individual's access to the data. UNDERDOG was designed to run on a Digital Equipment Corporation VAX computer using the VMS operating system. 8 refs., 24 figs., 2 tabs

  5. Modeling of wind turbines with doubly fed generator system

    Fortmann, Jens

    2014-01-01

    Jens Fortmann describes the deduction of models for the grid integration of variable-speed wind turbines and the reactive power control design of wind plants. The modeling part is intended as background for understanding the theory, capabilities and limitations of the generic doubly fed generator and full converter wind turbine models described in IEC 61400-27-1 and of the 2nd-generation WECC models that are used as standard library models of wind turbines in grid simulation software. The focus of the reactive power control part is a deduction of the origin and theory behind the reactive current requ

  6. Building the Next Generation of Aerospace Data Processing Systems by Reusing Existing Software Components

    Marshall, James J.; Downs, Robert R.; Samadi, Shahin

    2010-01-01

    The authors appreciate the contributions of the current and previous members of the National Aeronautics and Space Administration (NASA) Earth Science Data Systems Software Reuse Working Group to some of the work presented here, and very much appreciate the support received from the NASA for the work reported in this paper, including the support for Robert Downs under Contract NAS5-03117.

  7. Comparative Evaluation of Three Continuous Speech Recognition Software Packages in the Generation of Medical Reports

    Devine, Eric G.; Gaehde, Stephan A.; Curtis, Arthur C.

    2000-01-01

    Objective: To compare out-of-box performance of three commercially available continuous speech recognition software packages: IBM ViaVoice 98 with General Medicine Vocabulary; Dragon Systems NaturallySpeaking Medical Suite, version 3.0; and L&H Voice Xpress for Medicine, General Medicine Edition, version 1.2.

  8. A proposed approach for developing next-generation computational electromagnetics software

    Miller, E.K.; Kruger, R.P. [Los Alamos National Lab., NM (United States); Moraites, S. [Simulated Life Systems, Inc., Chambersburg, PA (United States)

    1993-02-01

    Computations have become a tool coequal with mathematics and measurements as a means of performing electromagnetic analysis and design. This is demonstrated by the volume of articles and meeting presentations in which computational electromagnetics (CEM) is routinely employed to address an increasing variety of problems. Yet, in spite of the substantial resources invested in CEM software over the past three decades, little real progress seems to have been made towards providing the EM engineer software tools having a functionality equivalent to that expected of hardware instrumentation. Furthermore, the bulk of CEM software now available is generally of limited applicability to large, complex problems because most modeling codes employ a single field propagator, or analytical form, of Maxwell's Equations. The acknowledged advantages of hybrid models, i.e., those which employ different propagators in differing regions of a problem, are relatively unexploited. The thrust of this discussion is to propose a new approach designed to address both problems outlined above, integrating advances being made in both software and hardware development. After briefly reviewing the evolution of modeling CEM software to date and pointing out the deficiencies thereof, we describe an approach for making CEM tools more truly "user friendly" called EMSES (Electromagnetic Modeling and Simulation Environment for Systems). This will be achieved through two main avenues. One is developing a common problem-description language implemented in a visual programming environment working together with a translator that produces the specific model description needed by various numerical treatments, in order to optimize user efficiency. The other is to employ a new modeling paradigm based on the idea of field propagators to expedite the development of the hybrid models that are needed to optimize computation efficiency.

  9. A proposed approach for developing next-generation computational electromagnetics software

    Miller, E.K.; Kruger, R.P. (Los Alamos National Lab., NM (United States)); Moraites, S. (Simulated Life Systems, Inc., Chambersburg, PA (United States))

    1993-01-01

    Computations have become a tool coequal with mathematics and measurements as a means of performing electromagnetic analysis and design. This is demonstrated by the volume of articles and meeting presentations in which computational electromagnetics (CEM) is routinely employed to address an increasing variety of problems. Yet, in spite of the substantial resources invested in CEM software over the past three decades, little real progress seems to have been made towards providing the EM engineer software tools having a functionality equivalent to that expected of hardware instrumentation. Furthermore, the bulk of CEM software now available is generally of limited applicability to large, complex problems because most modeling codes employ a single field propagator, or analytical form, of Maxwell's Equations. The acknowledged advantages of hybrid models, i.e., those which employ different propagators in differing regions of a problem, are relatively unexploited. The thrust of this discussion is to propose a new approach designed to address both problems outlined above, integrating advances being made in both software and hardware development. After briefly reviewing the evolution of modeling CEM software to date and pointing out the deficiencies thereof, we describe an approach for making CEM tools more truly "user friendly" called EMSES (Electromagnetic Modeling and Simulation Environment for Systems). This will be achieved through two main avenues. One is developing a common problem-description language implemented in a visual programming environment working together with a translator that produces the specific model description needed by various numerical treatments, in order to optimize user efficiency. The other is to employ a new modeling paradigm based on the idea of field propagators to expedite the development of the hybrid models that are needed to optimize computation efficiency.

  10. Wavelength and oscillator strength of the dipole transition 1s²2p–1s²nd for the Mn²²⁺ ion

    WANG ZhiWen; WANG YaNan; HU MuHong; LI XinRu; LIU Ying

    2008-01-01

    The transition energies, wavelengths and dipole oscillator strengths of 1s²2p–1s²nd (3≤n≤9) for the Mn²²⁺ ion are calculated. The fine-structure splittings of the 1s²nd (n≤9) states of this ion are also evaluated. In calculating the energies, the higher-order relativistic contribution is estimated under a hydrogenic approximation. The quantum defect of the Rydberg series 1s²nd is determined according to quantum defect theory. The results obtained in this paper agree well with the experimental data available in the literature.
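
    The quantum-defect relation behind such Rydberg-series calculations can be made concrete with a small numerical sketch. The hydrogenic formula E = −Z²R/(n−δ)² and all names below are generic illustrations, not the paper's actual method or code:

```python
RYDBERG_EV = 13.605693   # Rydberg energy in eV
EV_NM = 1239.8420        # hc in eV*nm, for energy-to-wavelength conversion

def rydberg_level(z_eff, n, delta):
    """Energy (eV, negative = bound) of a Rydberg level with principal
    quantum number n, effective charge z_eff and quantum defect delta."""
    return -RYDBERG_EV * z_eff ** 2 / (n - delta) ** 2

def transition_wavelength(e_upper, e_lower):
    """Wavelength (nm) of a transition between two level energies in eV."""
    return EV_NM / (e_upper - e_lower)
```

    With δ = 0 and Z = 1 this reduces to the hydrogen spectrum; for example the n = 3 → 2 transition comes out near 656 nm.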

  11. Simplification of coding of NRU loop experiment software with dimensional generator

    The following are specific topics of this paper: 1. There is much creativity in the manner in which Dimensional Generator can be applied to a specific programming task [2]. This paper tells how Dimensional Generator was applied to a reactor-physics task. 2. In this first practical use, Dimensional Generator itself proved not to need change, but a better user interface was found necessary, essentially because the relevance of Dimensional Generator to reactor physics was initially underestimated. It is briefly described. 3. The use of Dimensional Generator helps make reactor-physics source code somewhat simpler. That is explained here with brief examples from BURFEL-PC and WIMSBURF. 4. Most importantly, with the help of Dimensional Generator, all erroneous physical expressions were automatically detected. The errors are detailed here (in spite of the author's embarrassment) because they show clearly, both in theory and in practice, how Dimensional Generator offers quality enhancement of reactor-physics programming. (authors)

  12. Rankine: A computer software package for the analysis and design of steam power generating units

    Somerton, C.W.; Brouillette, T.; Pourciau, C.; Strawn, D.; Whitehouse, L.

    1987-04-01

    A software package has been developed for the analysis of steam power systems. Twenty-eight configurations are considered, all based upon the simple Rankine cycle with various additional components such as feedwater heaters and reheat legs. The package is demonstrated by two examples. In the first, the optimum operating conditions for a simple reheat cycle are determined by using the program. The second example involves calculating the exergetic efficiency of an actual steam power system.
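The cycle analysis such a package performs reduces to first-law bookkeeping over the four states of the simple Rankine cycle. A minimal sketch follows; the enthalpy values are illustrative assumptions of ours, not numbers from the Rankine package:

```python
# Hypothetical sketch: thermal efficiency of a simple ideal Rankine cycle.
# Enthalpy values (kJ/kg) are illustrative, not taken from the Rankine package.
h1 = 191.8   # saturated liquid leaving the condenser
h2 = 195.0   # compressed liquid leaving the pump
h3 = 3399.5  # superheated steam leaving the boiler
h4 = 2100.0  # wet steam leaving the turbine

w_turbine = h3 - h4   # specific turbine work
w_pump = h2 - h1      # specific pump work
q_in = h3 - h2        # heat added in the boiler

eta = (w_turbine - w_pump) / q_in
print(f"Thermal efficiency: {eta:.1%}")
```

Reheat legs and feedwater heaters, as in the package's other 27 configurations, extend this same bookkeeping with additional states and mass splits.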

  13. Research on Object-oriented Software Testing Cases of Automatic Generation

    Junli Zhang

    2013-11-01

    Full Text Available In research on the automatic generation of test cases, different test cases drive different execution paths, and the probability of each path being executed also differs. For paths that are easy to execute, many redundant test cases tend to be generated, while only a few test cases are generated for control paths that are hard to execute. A genetic algorithm can be used to guide the automatic generation of test cases: for the former paths it restricts the generation of such test cases, while for the latter it encourages their generation as much as possible. Building on the techniques of path-oriented automatic test case generation, the genetic algorithm is therefore adopted to construct the generation process. According to the path triggered during dynamic execution of the program, the generated test cases are separated into equivalence classes, and the number of test cases is adjusted dynamically by the fitness associated with each path. The method creates a certain number of test cases for each execution path to ensure sufficiency, while reducing redundant test cases, so it is an effective method for the automatic generation of test cases.
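The fitness-guided generation described above can be sketched with a tiny genetic algorithm. The target branch (`x == 4242`), the distance-based fitness and all GA parameters below are illustrative assumptions, not the paper's implementation:

```python
import random

# Illustrative sketch (not the paper's implementation): a genetic algorithm
# that evolves integer test inputs toward a hard-to-execute branch x == 4242.
TARGET = 4242

def fitness(x):
    # Branch-distance style fitness: inputs closer to satisfying the rare
    # branch condition score higher (maximum fitness 0 at the target).
    return -abs(x - TARGET)

def evolve(pop_size=50, generations=200, seed=1):
    rng = random.Random(seed)
    population = [rng.randint(0, 10000) for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[: pop_size // 2]     # keep the fitter half
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = rng.sample(parents, 2)
            child = (a + b) // 2                  # crossover: midpoint
            if rng.random() < 0.3:                # mutation: small perturbation
                child += rng.randint(-5, 5)
            children.append(child)
        population = parents + children
        if any(x == TARGET for x in population):  # hard branch covered
            break
    return max(population, key=fitness)

best = evolve()
print(best)
```

Because fitness rewards closeness to the rare branch condition, the population drifts toward covering the hard path, while easy paths need no additional cases, which is the redundancy-reduction idea the abstract describes.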

  14. A technical note about Phidel: A new software for evaluating magnetic induction field generated by power lines

    The Regional Environment Protection Agency of Friuli Venezia Giulia (ARPA FVG, Italy) has performed an analysis on existing software designed to calculate magnetic induction field generated by power lines. As far as the agency's requirements are concerned the tested programs display some difficulties in the immediate processing of electrical and geometrical data supplied by plant owners, and in certain cases turn out to be inadequate in representing complex configurations of power lines. Phidel, an innovative software, tackles and works out all the above-mentioned problems. Therefore, the obtained results, when compared with those of other programs, are the closest to experimental measurements. The output data can be employed both in the GIS and Excel environments, allowing the immediate overlaying of digital cartography and the determining of the 3 and 10 μT bands, in compliance with the Italian Decree of the President of the Council of Ministers of 8 July 2003. (authors)

  15. PHASTpep: Analysis Software for Discovery of Cell-Selective Peptides via Phage Display and Next-Generation Sequencing

    Dasa, Siva Sai Krishna; Kelly, Kimberly A.

    2016-01-01

    Next-generation sequencing has enhanced the phage display process, allowing for the quantification of millions of sequences resulting from the biopanning process. In response, many valuable analysis programs focused on specificity and finding targeted motifs or consensus sequences were developed. For targeted drug delivery and molecular imaging, it is also necessary to find peptides that are selective—targeting only the cell type or tissue of interest. We present a new analysis strategy and accompanying software, PHage Analysis for Selective Targeted PEPtides (PHASTpep), which identifies highly specific and selective peptides. Using this process, we discovered and validated, both in vitro and in vivo in mice, two sequences (HTTIPKV and APPIMSV) targeted to pancreatic cancer-associated fibroblasts that escaped identification using previously existing software. Our selectivity analysis makes it possible to discover peptides that target a specific cell type and avoid other cell types, enhancing clinical translatability by circumventing complications with systemic use. PMID:27186887

  16. 2nd International Doctoral Symposium on Applied Computation and Security Systems

    Cortesi, Agostino; Saeed, Khalid; Chaki, Nabendu

    2016-01-01

    The book contains the extended version of the works that have been presented and discussed in the Second International Doctoral Symposium on Applied Computation and Security Systems (ACSS 2015) held during May 23-25, 2015 in Kolkata, India. The symposium has been jointly organized by the AGH University of Science & Technology, Cracow, Poland; Ca’ Foscari University, Venice, Italy and University of Calcutta, India. The book is divided into volumes and presents dissertation works in the areas of Image Processing, Biometrics-based Authentication, Soft Computing, Data Mining, Next Generation Networking and Network Security, Remote Healthcare, Communications, Embedded Systems, Software Engineering and Service Engineering.

  17. Generator program for computer-assisted instruction: MACGEN. A software tool for generating computer-assisted instructional texts.

    Utsch, M J; Ingram, D

    1983-01-01

    This publication describes MACGEN, an interactive development tool to assist teachers to create, modify and extend case simulations, tutorial exercises and multiple-choice question tests designed for computer-aided instruction. The menu-driven software provides full authoring facilities for text files in MACAID format by means of interactive editing. Authors are prompted for items which they might want to change whereas all user-independent items are provided automatically. Optional default values and explanatory messages are available with every prompt. Errors are corrected automatically or commented upon. Thus the program eliminates the need to familiarize with a new language or details of the text file structure. The options for modification of existing text files include display, renumbering of frames and a line-oriented editor. The resulting text files can be interpreted by the MACAID driver without further changes. The text file is held as ASCII records and as such is also accessible with many standard word-processing systems if desired. PMID:6362978

  18. Estructura cristalina del nuevo óxido tipo perovskita compleja Ba2NdZrO5,5

    D.A. Landínez Téllez

    2007-01-01

    Full Text Available A new complex perovskite material Ba2NdZrO5.5 has been synthesized for the first time by a conventional solid state reaction process. X-ray diffraction (XRD) measurements and Rietveld analysis revealed an ordered complex cubic structure characteristic of the A2BB'O6 crystalline structure with a lattice constant a = 8.40 ± 0.01 Å. Energy Dispersive X-ray (EDX) analysis shows that Ba2NdZrO5.5 is free of impurity traces. Preliminary studies reveal that at 820 °C Ba2NdZrO5.5 does not react with YBa2Cu3O7−δ. These favorable characteristics show that Ba2NdZrO5.5 can be used as a potential substrate material for the fabrication of superconducting films.

  19. Transition energy and dipole oscillator strength for 1s22p-1s2nd of Cr21+ ion

    Wang Zhi-Wen; Liu Ying; Hu Mu-Hong; Li Xin-Ru; Wang Ya-Nan

    2008-01-01

    The transition energies, wavelengths and dipole oscillator strengths of 1s22p-1s2nd (3 ≤ n ≤ 9) for Cr21+ ion are calculated. The fine structure splittings of 1s2nd (n ≤ 9) states for this ion are also calculated. In calculating energy, we have estimated the higher-order relativistic contribution under a hydrogenic approximation. The quantum defect of Rydberg series 1s2nd is determined according to the quantum defect theory. The results obtained in this paper excellently agree with the experimental data available in the literature. Combining the quantum defect theory with the discrete oscillator strengths, the discrete oscillator strengths for the transitions from initial state 1s22p to highly excited 1s2nd states (n ≥ 10) and the oscillator strength density corresponding to the bound-free transitions are obtained.

  20. Carbon dioxide emissions and the overshoot ratio change resulting from the implementation of 2nd Energy Master Plan in South Korea

    Yeo, M. J.; Kim, Y. P.

    2015-12-01

    The direction of a country's energy policies is important in projecting its environmental impacts. The greenhouse gas (GHG) emissions of the energy sector in South Korea are very large, about 600 MtCO2e in 2011, and the carbon footprint due to energy consumption also accounts for a large share of the ecological footprint, more than 60%. Based on the official plans (the national greenhouse gas emission reduction target for 2030 (GHG target for 2030) and the 2nd Energy Master Plan (2nd EMP)), several scenarios were proposed and the sensitivity of the GHG emission amount and the 'overshoot ratio', the ratio of ecological footprint to biocapacity, were estimated. It was found that to meet the GHG target for 2030, the share of non-emission energy in power generation should be over 71%, which would be very difficult to achieve. We also found that the overshoot ratio would increase from 5.9 in 2009 to 7.6 in 2035. Thus, additional efforts are required to reduce environmental burdens beyond optimizing the power mix configuration. One example is the conversion efficiency of power generation: if it rises to 50% from the current level of 40%, the energy demand and resultant carbon dioxide emissions would decrease by about 10%. The influence on the environment of changes in consumption behavior, for example diet choice, is also expected to be meaningful.
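The closing efficiency figure can be checked with back-of-envelope arithmetic. The assumption that power generation accounts for roughly half of primary energy demand is ours, introduced only to reproduce the order of magnitude, not a number stated in the abstract:

```python
# Back-of-envelope check of the ~10% figure (assumptions ours, not the paper's):
# raising conversion efficiency from 40% to 50% cuts the fuel needed for the
# same electricity output by 1 - 40/50 = 20%.
eta_old, eta_new = 0.40, 0.50
fuel_saving_power_sector = 1 - eta_old / eta_new   # about 0.20

# If power generation consumes roughly half of primary energy (assumed share),
# the economy-wide saving is about half of that, i.e. roughly 10%.
power_share = 0.5
total_saving = fuel_saving_power_sector * power_share
print(f"{total_saving:.0%}")
```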

  1. Software tool for analysing the family shopping basket without candidate generation

    Roberto Carlos Naranjo Cuervo

    2010-05-01

    Full Text Available Tools that yield useful knowledge for supporting marketing decisions are currently needed in the e-commerce environment. This requires a process that uses a series of data-processing techniques; data mining is one such technique, enabling automatic information discovery. This work presents association rules as a suitable technique for discovering how customers buy from a company offering business-to-consumer (B2C) e-business, aimed at supporting decision-making in supplying its customers or capturing new ones. Many algorithms such as Apriori, DHP, Partition, FP-Growth and Eclat are available for implementing association rules; the following criteria were defined for selecting the appropriate algorithm: database insertions, computational cost, performance and execution time. The development of a software tool is also presented, which followed the CRISP-DM approach; this software tool comprises the following four sub-modules: data pre-processing, data mining, results analysis and results application. The application design used three-layer architecture: presentation logic, business logic and service logic. Data warehouse design and algorithm design were included in developing this data-mining software tool. It was tested using a FoodMart company database; the tests covered performance, functionality and validity of the results, thereby allowing association rules to be found. The results led to the conclusion that using association rules as a data-mining technique facilitates analysing volumes of information for B2C e-business services, which represents a competitive advantage for those companies using the Internet as their sales medium.
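The support and confidence measures underlying association rules can be illustrated with a brute-force sketch on toy transactions (the tool itself, per its title, mines without candidate generation, which this sketch does not attempt to reproduce):

```python
# Toy illustration of the support and confidence measures behind association
# rules. Brute-force counting over a hypothetical shopping-basket dataset;
# the transactions and the rule below are invented for illustration.
transactions = [
    {"milk", "bread"},
    {"milk", "bread", "butter"},
    {"bread", "butter"},
    {"milk", "butter"},
]

def support(itemset):
    # Fraction of transactions containing every item of `itemset`
    return sum(1 for t in transactions if itemset <= t) / len(transactions)

# Confidence of the hypothetical rule {milk} -> {bread}
antecedent, consequent = {"milk"}, {"bread"}
confidence = support(antecedent | consequent) / support(antecedent)
print(f"support={support(antecedent | consequent):.2f}, "
      f"confidence={confidence:.2f}")
```

A mining algorithm's job is to find all rules whose support and confidence exceed user-set thresholds without enumerating every itemset, which is where approaches like FP-Growth avoid explicit candidate generation.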

  2. Application to non-destructive assay of 2nd layer tail cylinder at Rokkasho Enrichment plant

    The International Atomic Energy Agency (IAEA) safeguards criteria under the 93+2 programme stipulate that depleted uranium under single C/S (metal seal or camera) is re-verified with 10% coverage (β=0.9) once a year, normally during physical inventory verification. In order to maintain continuity of knowledge, all depleted uranium cylinders transferred from the processing area are sealed by the inspectorates (the State and the IAEA) after sufficient verification with 50% coverage (β=0.5). At the Rokkasho Enrichment plant, the depleted uranium cylinders (tail cylinders with 0.2∼0.3% enrichment from the centrifuge cascade) separated at the processing area are stored in the cylinder storage. A 2nd-layer tail cylinder pile-up has been adopted for efficient use of the storage; one block comprises four cylinders on the lower layer and three on the upper. The capacity of the dedicated cylinder crane is limited to the height of the upper cylinder. The current lifter with a germanium detector is applied to the lower cylinders. However, it is quite possible that the inspectorate selects an upper tail cylinder at random for physical inventory verification; in that case, the operators cooperate to lift the 2nd-layer tail cylinder down to the ground floor so that the inspectorate can verify it. The Nuclear Material Control Center (NMCC) has designed and manufactured a dedicated detector lifter capable of 2nd-layer measurement, together with a collimator for the portable detector. Afterwards, we conducted a site test in the plant, which improved inspection efficiency and reduced the operators' burden. (author)

  3. PREFACE: 2nd International Conference on Innovative Materials, Structures and Technologies

    Ručevskis, Sandris

    2015-11-01

    The 2nd International Conference on Innovative Materials, Structures and Technologies (IMST 2015) took place in Riga, Latvia from 30th September - 2nd October, 2015. The first event of the conference series, dedicated to the 150th anniversary of the Faculty of Civil Engineering of Riga Technical University, was held in 2013. Following the established tradition, the aim of the conference was to promote and discuss the latest results of industrial and academic research carried out in the following engineering fields: analysis and design of advanced structures and buildings; innovative, ecological and energy efficient building materials; maintenance, inspection and monitoring methods; construction technologies; structural management; sustainable and safe transport infrastructure; and geomatics and geotechnics. The conference provided an excellent opportunity for leading researchers, representatives of the industrial community, engineers, managers and students to share the latest achievements, discuss recent advances and highlight the current challenges. IMST 2015 attracted over 120 scientists from 24 countries. After rigorous reviewing, over 80 technical papers were accepted for publication in the conference proceedings. On behalf of the organizing committee I would like to thank all the speakers, authors, session chairs and reviewers for their efficient and timely effort. The 2nd International Conference on Innovative Materials, Structures and Technologies was organized by the Faculty of Civil Engineering of Riga Technical University with the support of the Latvia State Research Programme under the grant agreement "INNOVATIVE MATERIALS AND SMART TECHNOLOGIES FOR ENVIRONMENTAL SAFETY, IMATEH". I would like to express sincere gratitude to Juris Smirnovs, Dean of the Faculty of Civil Engineering, and Andris Chate, manager of the Latvia State Research Programme. Finally, I would like to thank all those who helped to make this event happen. Special thanks go to Diana

  4. A software tool for simulation of surfaces generated by ball nose end milling

    Bissacco, Giuliano

    2004-01-01

    The number of models available for prediction of surface topography is very limited. The main reason is that these models cannot be based on engineering principles like those for elastic deformations. Most knowledge about surface roughness and integrity is empirical and up to now very few mathema...... readable by a surface processor software (SPIP [2]), for calculation of a number of surface roughness parameters. In the next paragraph a description of the basic features of ball nose end milled surfaces is given, while in paragraph 3 the model is described....

  5. Interface software package for generating the source for neutron transport discrete ordinates code

    The safe operation of reactors imposes heightened requirements on the quality of the calculated irradiation values used for assessing the reactor vessel's lifetime limit. The organisation of the calculations has to assure maximum authenticity of the input data and the possibility of control and revision of the initial conditions; that is why the whole calculation process has to be computerised. This work presents the software package by means of which the distribution of the primary neutron source in the reactor core, calculated with the help of diffusion codes, is transformed into a source suitable for the codes that calculate the neutron transport out of the core by the discrete ordinates method

  6. The 2nd to 4th Digit Length Difference and Ratio as Predictors of Hyperandrogenism and Metabolic Syndrome in Females

    Pınar Yıldız; Mustafa Yıldız; Ali Cihat Yıldırım; et al.

    2015-01-01

    Objective: In this study we evaluated the usefulness of 2nd to 4th (2nd:4th) digit length difference and ratio in determining hyperandrogenism in females and the relationship with metabolic syndrome. Methods: We designed a cross-sectional clinical study and examined 150 females who visited our clinic; 137 completed the study. We measured blood pressure and anthropometric values. Biochemical parameters associated with metabolic syndrome were also measured. Results: The mean age of our p...

  7. Future Control and Automation : Proceedings of the 2nd International Conference on Future Control and Automation

    2012-01-01

    This volume Future Control and Automation- Volume 2 includes best papers from 2012 2nd International Conference on Future Control and Automation (ICFCA 2012) held on July 1-2, 2012, Changsha, China. Future control and automation is the use of control systems and information technologies to reduce the need for human work in the production of goods and services. This volume can be divided into six sessions on the basis of the classification of manuscripts considered, which is listed as follows: Mathematical Modeling, Analysis and Computation, Control Engineering, Reliable Networks Design, Vehicular Communications and Networking, Automation and Mechatronics.

  8. [Infected chorionic hematoma as a cause of infection in the 2nd trimester].

    Weigel, M; Friese, K; Schmitt, W; Strittmatter, H J; Melchert, F

    1992-12-01

    Superinfected subchorionic haematomas are a rare septic focus in the 2nd trimester. As the symptoms are unspecific, the diagnosis has to be made by exclusion in most cases. Since the chances of successfully treating a manifest infection are poor, antibiotic prophylaxis as well as close laboratory monitoring and early antibiotic therapy should be discussed after sonographic diagnosis of an intrauterine haematoma. Two of our three patients suffered a miscarriage; only one pregnancy could be maintained, after spontaneous depletion of the infected haemorrhage. PMID:1490559

  9. 2nd EUROPEAN CONFERENCE ON ELECTROCHEMICAL METHODS APPLIED TO THE CONSERVATION OF ARTWORKS

    Domenech Carbo, Mª Teresa; DOMENECH CARBO, ANTONIO

    2014-01-01

    This book is issued on the occasion of the 2nd European Conference on electrochemical methods applied to the conservation of artworks, held in Valencia on 23rd September, 2014. This Conference has been hosted by the Instituto Universitario de Restauración del Patrimonio of the Universitat Politècnica de València and has been organized under the auspices of the Ministerio de Ciencia e Innovación, the Universitat Politècnica de València, the Universitat de València and the Universidad de Grana...

  10. 1st and 2nd Trimester Headsize in Fetuses with Congenital Heart Disease: A Cohort Study

    Lauridsen, Mette Høj; Petersen, Olav Bjørn; Vestergaard, Else Marie;

    2014-01-01

    Background: Congenital heart disease (CHD) is associated with neuro-developmental disorders. The influence of CHD on the brain may be present in the fetus. We hypothesize that fetal cerebral growth is impaired as early as 2nd trimester. Aim: To investigate if fetal cerebral growth is associated...... and screening for fetal malformations is carried out. Our cohort includes all fetuses in Western Denmark (2.9 million inhabitants) screened in between January 1st 2012 and December 31st 2013, diagnosed with any structural, non-syndromic congenital heart disease either during pregnancy or up to 6...

  11. Construction of the 2nd 500kV DC gun at KEK

    The 2nd 500 kV DC photocathode electron gun for an ERL injector was constructed at KEK. The gun has several features, such as an insulated anode electrode used as a dark-current monitor, a repeller electrode for suppressing backward ions, extreme-high-vacuum pumps and so on. High-voltage conditioning began this summer. In addition, a new cathode preparation system has been developed: it can prepare three cathodes simultaneously and store many cathodes under good vacuum conditions. The detailed design was finished and the construction of all in-vacuum components is progressing. (author)

  12. TF insert experiment log book. 2nd Experiment of CS model coil

    The cool-down of the CS model coil and TF insert was started on August 20, 2001. It took almost one month, and coil charging began on September 17, 2001. The charge test of the TF insert and CS model coil was completed on October 19, 2001. In this campaign, the total number of shots was 88 and the size of the data files in the DAS (Data Acquisition System) was about 4 GB. This report is a database that consists of the log list and the log sheets of every shot. It is the experiment logbook for the 2nd experiment of the CS model coil and TF insert charge test. (author)

  13. Mapping and industrial IT project to a 2nd semester design-build project

    Nyborg, Mads; Høgh, Stig

    2010-01-01

    CDIO means bringing the engineer's daily life and working practice into the educational system. In our opinion this is best done by selecting an appropriate project from industry. In this paper we describe how we have mapped an industrial IT project to a 2nd semester design-build project in the Diploma IT program at the Technical University of Denmark. The system in question is a weighing system operating in a LAN environment. The system is used in the medical industry for producing tablets. ...

  14. The 2nd International Association of Neurorestoratology Annual Conference(IANAC)Summary

    Lin CHEN; Da-Qian HUANG; Di CHEN; Hong-Yun HUANG

    2009-01-01

    The 2nd International Association of Neurorestoratology Annual Conference (IANAC) was successfully held in Beijing, China, from April 24 to 26, 2009. More than 200 representatives from 30 countries and regions attended the meeting, carried out extensive academic communications and reached important consensus on many issues in neuroregeneration, neural structural repair, neural replacement, neuroprotection, neuromodulation, neurorehabilitation, neuroplasticity and other areas in the field of neurorestoratology. The general assembly adopted the "Beijing Declaration of International Association of Neurorestoratology" (Beijing Declaration), which was proposed by 32 scientists from 18 countries.

  15. The ratio of 2nd to 4th digit length: a new predictor of disease predisposition?

    Manning, J T; Bundred, P E

    2000-05-01

    The ratio between the length of the 2nd and 4th digits is: (a) fixed in utero; (b) lower in men than in women; (c) negatively related to testosterone and sperm counts; and (d) positively related to oestrogen concentrations. Prenatal levels of testosterone and oestrogen have been implicated in infertility, autism, dyslexia, migraine, stammering, immune dysfunction, myocardial infarction and breast cancer. We suggest that 2D:4D ratio is predictive of these diseases and may be used in diagnosis, prognosis and in early life-style interventions which may delay the onset of disease or facilitate its early detection. PMID:10859702

  16. DRS // CUMULUS Oslo 2013. The 2nd International Conference for Design Education Researchers

    Liv Merete Nielsen

    2013-01-01

    14-17 May 2013, Oslo, Norway. We have received more than 200 full papers for the 2nd International Conference for Design Education Researchers in Oslo. This international conference is a springboard for sharing ideas and concepts about contemporary design education research. Contributors are invited to submit research that deals with different facets of contemporary approaches to design education research. All papers will be double-blind peer-reviewed. This conference is open to research in any ...

  17. Collection of documents in the 2nd information exchange meeting on radioactive waste disposal research network

    The 2nd meeting on 'Radioactive Waste Disposal Research Network' was held at the Nagoya University Museum on March 30, 2007. The 'Radioactive Waste Disposal Research Network' was established in Interorganization Atomic Energy Research Program under academic collaborative agreement between Japan Atomic Energy Agency and the University of Tokyo. The objective is to develop both research infrastructures and human expertise in Japan to an adequate performance level, thereby contributing to the development of the fundamental research in the field of radioactive waste disposal. This material is a collection of presentations and discussions during the information exchange meeting. (author)

  18. PREFACE: 2nd International Meeting for Researchers in Materials and Plasma Technology

    Niño, Ely Dannier V.

    2013-11-01

    These proceedings present the written contributions of the participants of the 2nd International Meeting for Researchers in Materials and Plasma Technology, 2nd IMRMPT, which was held from February 27 to March 2, 2013 at the Pontificia Bolivariana Bucaramanga-UPB and Santander and Industrial - UIS Universities, Bucaramanga, Colombia, organized by research groups from GINTEP-UPB, FITEK-UIS. The IMRMPT was the second of the biennial meetings that began in 2011. The three-day scientific program of the 2nd IMRMPT consisted of 14 Magisterial Conferences, 42 Oral Presentations and 48 Poster Presentations, with the participation of undergraduate and graduate students, professors, researchers and entrepreneurs from Colombia, Russia, France, Venezuela, Brazil, Uruguay, Argentina, Peru, Mexico, the United States, among others. The objective of IMRMPT was to bring together national and international researchers in order to establish scientific cooperation in the field of materials science and plasma technology; introduce new techniques of surface treatment of materials to improve the properties of metals with respect to deterioration due to corrosion, hydrogen embrittlement, abrasion, hardness, among others; and establish cooperation agreements between universities and industry. The topics covered in the 2nd IMRMPT include New Materials, Surface Physics, Laser and Hybrid Processes, Characterization of Materials, Thin Films and Nanomaterials, Surface Hardening Processes, Wear and Corrosion / Oxidation, Modeling, Simulation and Diagnostics, Plasma Applications and Technologies, Biomedical Coatings and Surface Treatments, Non Destructive Evaluation and Online Process Control, Surface Modification (Ion Implantation, Ion Nitriding, PVD, CVD). The editors hope that those interested in the area of materials science and plasma technology enjoy the reading, which reflects a wide range of topics.
It is a pleasure to thank the sponsors and all the participants and contributors for

  19. Mesocosm soil ecological risk assessment tool for GMO 2nd tier studies

    D'Annibale, Alessandra; Maraldo, Kristine; Larsen, Thomas;

    effects in 2nd tier caged experimental systems, cf. the new GMO ERA guidance: EFSA Journal 2010; 8(11):1879. We propose to perform a trophic structure analysis, TSA, and include the trophic structure as an ecological endpoint to gain more direct insight into the change in interactions between species, i...... control. After 5 and 11 weeks, data on populations, plants and soil organic matter decomposition were evaluated. Natural abundances of stable isotopes, 13C and 15N, of animals, soil, plants and added organic matter (crushed maize leaves) were used to describe the soil food web structure....

  20. [Model and enlightenment from rescue of August 2nd Kunshan explosion casualty].

    Tan, Q; Qiu, H B; Sun, B W; Shen, Y M; Nie, L J; Zhang, H W

    2016-01-01

    On August 2nd, 2014, a massive dust explosion occurred in a factory of Kunshan, resulting in a mass casualty involving 185 burn patients. They were transported to 20 medical institutions in Jiangsu province and Shanghai. More than one thousand of medical personnel of our country participated in this emergency rescue, and satisfactory results were achieved. In this paper, the characteristics of this accident were analyzed, the positive effects of interdisciplinary cooperation were affirmed, and the contingency plan, rescue process and pattern, and reserve, organization and management of talents during this rescue process were reviewed retrospectively. PMID:27426066

  1. Future Control and Automation : Proceedings of the 2nd International Conference on Future Control and Automation

    2012-01-01

    This volume Future Control and Automation- Volume 1 includes best papers selected from 2012 2nd International Conference on Future Control and Automation (ICFCA 2012) held on July 1-2, 2012, Changsha, China. Future control and automation is the use of control systems and information technologies to reduce the need for human work in the production of goods and services. This volume can be divided into five sessions on the basis of the classification of manuscripts considered, which is listed as follows: Identification and Control, Navigation, Guidance and Sensor, Simulation Technology, Future Telecommunications and Control.

  2. Software-Controlled Next Generation Optical Circuit Switching for HPC and Cloud Computing Datacenters

    Muhammad Imran

    2015-11-01

    Full Text Available In this paper, we consider the performance of optical circuit switching (OCS systems designed for data center networks by using network-level simulation. Recent proposals have used OCS in data center networks but the relatively slow switching times of OCS-MEMS switches (10–100 ms and the latencies of control planes in these approaches have limited their use to the largest data center networks with workloads that last several seconds. Herein, we extend the applicability and generality of these studies by considering dynamically changing short-lived circuits in software-controlled OCS switches, using the faster switching technologies that are now available. The modelled switch architecture features fast optical switches in a single hop topology with a centralized, software-defined optical control plane. We model different workloads with various traffic aggregation parameters to investigate the performance of such designs across usage patterns. Our results show that, with suitable choices for the OCS system parameters, delay performance comparable to that of electrical data center networks can be obtained.

  3. Methodology and Toolset for Model Verification, Hardware/Software co-simulation, Performance Optimisation and Customisable Source-code generation

    Berger, Michael Stübert; Soler, José; Yu, Hao;

    2013-01-01

    The MODUS project aims to provide a pragmatic and viable solution that will allow SMEs to substantially improve their positioning in the embedded-systems development market. The MODUS tool will provide a model verification and Hardware/Software co-simulation tool (TRIAL) and a performance...... optimisation and customisable source-code generation tool (TUNE). The concept is depicted in automated modelling and optimisation of embedded-systems development. The tool will enable model verification by guiding the selection of existing open-source model verification engines, based on the automated analysis...

  4. NDesign: software for study design for the detection of rare variants from next-generation sequencing data.

    Sugaya, Yuki; Akazawa, Yasuaki; Saito, Akira; Kamitsuji, Shigeo

    2012-10-01

    We developed a software program, NDesign, for the design of a study intended for detecting rare variants from next-generation sequencing (NGS) data. In this study design, the optimal depth of coverage and the average depth of coverage are first evaluated, and then the ability of the designed experiment to obtain a desired power is determined. NDesign has been developed to calculate both these depths, as well as to evaluate the power of the designed experiment. It has a simple implementation in the JavaScript language, and is expected to enable researchers to design optimal NGS studies. PMID:22786579
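
    NDesign itself is implemented in JavaScript; the calculation at the heart of such a design tool can be sketched as a binomial power computation. The Python sketch below is illustrative only: the threshold of three supporting reads is an assumed detection criterion, not taken from the paper, and sequencing error is ignored.

```python
from math import comb

def detection_power(depth, allele_fraction, min_alt_reads=3):
    """Probability that a variant present at `allele_fraction` yields at
    least `min_alt_reads` supporting reads at a site sequenced to `depth`,
    modelling read sampling as Binomial(depth, allele_fraction)."""
    p = allele_fraction
    # P(X < min_alt_reads) for X ~ Binomial(depth, p), then complement.
    miss = sum(comb(depth, k) * p**k * (1 - p)**(depth - k)
               for k in range(min_alt_reads))
    return 1.0 - miss
```

    Scanning `depth` until `detection_power` exceeds a target (say 0.8) gives the kind of "optimal depth of coverage" evaluation the abstract describes.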

  5. DNA Data Visualization (DDV): Software for Generating Web-Based Interfaces Supporting Navigation and Analysis of DNA Sequence Data of Entire Genomes.

    Tomasz Neugebauer

    Data visualization methods are necessary during the exploration and analysis activities of an increasingly data-intensive scientific process. There are few existing visualization methods for raw nucleotide sequences of a whole genome or chromosome. Software for data visualization should allow the researchers to create accessible data visualization interfaces that can be exported and shared with others on the web. Herein, novel software developed for generating DNA data visualization interfaces is described. The software converts DNA data sets into images that are further processed as multi-scale images to be accessed through a web-based interface that supports zooming, panning and sequence fragment selection. Nucleotide composition frequencies and GC skew of a selected sequence segment can be obtained through the interface. The software was used to generate DNA data visualization of human and bacterial chromosomes. Examples of visually detectable features such as short and long direct repeats, long terminal repeats, mobile genetic elements, heterochromatic segments in microbial and human chromosomes, are presented. The software and its source code are available for download and further development. The visualization interfaces generated with the software allow for the immediate identification and observation of several types of sequence patterns in genomes of various sizes and origins. The visualization interfaces generated with the software are readily accessible through a web browser. This software is a useful research and teaching tool for genetics and structural genomics.
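
    The GC skew statistic that the DDV interface reports for a selected segment is computed directly from the raw sequence as (G - C)/(G + C). A minimal windowed sketch follows; the window size is an illustrative choice, not DDV's:

```python
def gc_skew(sequence, window=1000):
    """GC skew, (G - C) / (G + C), over non-overlapping windows.
    Windows with no G or C bases report 0.0."""
    seq = sequence.upper()
    skews = []
    for start in range(0, len(seq) - window + 1, window):
        chunk = seq[start:start + window]
        g, c = chunk.count("G"), chunk.count("C")
        skews.append((g - c) / (g + c) if g + c else 0.0)
    return skews
```

    Plotting the sign changes of this series along a bacterial chromosome is the classic way such visualizations reveal the origin and terminus of replication.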

  6. An overview of the Software Development Process for the NASA Langley Atmospheric Data Center Archive Next Generation system

    Piatko, P.; Perez, J.; Kinney, J. B.

    2013-12-01

    The Atmospheric Science Data Center (ASDC) at NASA Langley Research Center is responsible for the archive and distribution of Earth science data in the areas of radiation budget, clouds, aerosols, and tropospheric chemistry. The ASDC has developed and implemented the Archive Next Generation (ANGe) system, a state-of-the-art data ingest, archival, and distribution system to serve the atmospheric sciences data provider and user communities. The ANGe project follows a software development process that covers the full life-cycle of the system, from initial requirements to deployment to production to long-term maintenance of the software. The project uses several tools to support the different stages of the process, such as Subversion for source code control, JIRA for change management, Confluence for documentation and collaboration, and Bamboo for continuous integration. Based on our experience with developing ANGe and other projects at the ASDC, we also provide support for local science projects by setting up Subversion repositories and tools such as Trac, and providing training and support on their use. An overview of the software development process and the tools used to support it will be presented.

  7. System and Component Software Specification, Run-time Verification and Automatic Test Generation Project

    National Aeronautics and Space Administration — The following background technology is described in Part 5: Run-time Verification (RV), White Box Automatic Test Generation (WBATG). Part 5 also describes how WBATG...

  8. Roles of doping ions in afterglow properties of blue CaAl2O4:Eu2+,Nd3+ phosphors

    Eu2+-doped and Nd3+-co-doped calcium aluminate (CaAl2O4:Eu2+,Nd3+) phosphor was prepared by a urea-nitrate solution combustion method at furnace temperatures as low as 500 °C. The produced CaAl2O4:Eu2+,Nd3+ powder was investigated in terms of phase composition, morphology and luminescence by X-ray diffraction (XRD), scanning electron microscopy (SEM), Fourier transform infrared (FTIR) spectroscopy and photoluminescence (PL) techniques, respectively. XRD analysis depicts a dominant monoclinic phase, indicating no change in the crystalline structure of the phosphor with varying concentration of Eu2+ and Nd3+. SEM results show agglomerates with non-uniform shapes and sizes and a number of irregular network structures having many voids and pores. The energy dispersive X-ray spectroscopy (EDS) and FTIR spectra confirm the expected chemical components of the phosphor. PL measurements indicated one broadband excitation spectrum from 200 to 300 nm centered around 240 nm, corresponding to the crystal field splitting of the Eu2+ d-orbital, and an emission spectrum in the blue region with a maximum at 440 nm. This is a strong indication that there was dominantly one luminescence center, Eu2+, which represents emission from transitions between the 4f7 ground state and the 4f6–5d1 excited state configuration. High concentrations of Eu2+ and Nd3+ generally reduce both the intensity and the lifetime of the phosphor powders. The optimized content is 1 mol% for Eu2+ and 1 mol% for Nd3+ for phosphors with excellent optical properties. The phosphor also emits visible light at around 587 and 616 nm. Such emissions can be ascribed to the 5D0–7F1 and 5D0–7F2 intrinsic transitions of Eu3+, respectively. The decay characteristics exhibit a significant rise in initial intensity with increasing Eu2+ doping concentration, while the decay time increased with Nd3+ co-doping. The observed afterglow can be ascribed to the generation of suitable traps due to the presence of the Nd
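
    Decay times like those compared above are extracted by fitting the measured afterglow curve. The sketch below is a minimal single-exponential log-linear fit of I(t) = I0·exp(-t/τ); it is illustrative only, since real afterglow curves usually require a sum of exponentials.

```python
import math

def fit_decay(times, intensities):
    """Log-linear least-squares fit of I(t) = I0 * exp(-t / tau).
    Returns (I0, tau). Assumes all intensities are positive."""
    n = len(times)
    ys = [math.log(i) for i in intensities]        # linearize: ln I = ln I0 - t/tau
    mt, my = sum(times) / n, sum(ys) / n
    slope = sum((t - mt) * (y - my) for t, y in zip(times, ys)) / \
            sum((t - mt) ** 2 for t in times)
    return math.exp(my - slope * mt), -1.0 / slope
```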

  9. Software tools for automatic generation of finite element mesh and application of biomechanical calculation in medicine

    Milašinović Danko Z.

    2008-01-01

    Cardiovascular diseases are common and a special difficulty in their curing is diagnostics. Modern medical instruments can provide data that is much more adequate for computer modeling. Computer simulations of blood flow through the cardiovascular organs give powerful advantages to scientists today. The motivation for this work is raw data that our Center recently received from the University Clinical Center in Heidelberg from a multislice CT scanner. In this work raw data from the CT scanner was used for creating a 3D model of the aorta. In this process we used Gmsh and TetGen (Hang Si) as well as our own software tools, and the result was the 8-node (brick) mesh on which the calculation was run. The results obtained were very satisfactory so...

  10. New generation of software? Modeling of energy demands for residential ventilation with HTML interface

    Forowicz, T.

    1997-06-01

    The paper presents an interactive on-line package for calculation of energy and cost demands for residential infiltration and ventilation, with input and output data entry through a web browser. This is a unique tool. It represents a new kind of approach to developing software employing user (client) and server (package provider) computers. The main program, servicing "intelligent" CGI (Common Gateway Interface) calls, resides on the server and dynamically handles the whole package performance and the procedure of calculations. The "computing engine" consists of two parts: RESVENT, the previously existing program for ventilation calculations, and ECONOMICS, for heating and cooling system energy and cost calculations. The user interface is designed in such a way that it allows simultaneous access by many users from all over the world.

  11. Software Defined Networking for Next Generation Converged Metro-Access Networks

    Ruffini, M.; Slyne, F.; Bluemm, C.; Kitsuwan, N.; McGettrick, S.

    2015-12-01

    While the concept of Software Defined Networking (SDN) has seen rapid deployment within the data center community, its adoption in telecommunications networks has progressed slowly, although the concept has been swiftly adopted by all major telecoms vendors. This paper presents a control plane architecture for SDN-driven converged metro-access networks, developed through the DISCUS European FP7 project. The SDN-based controller architecture was developed in a testbed implementation targeting two main scenarios: fast feeder fiber protection over dual-homed Passive Optical Networks (PONs) and dynamic service provisioning over a multi-wavelength PON. Implementation details and results of the experiment carried out over the second scenario are reported in the paper, showing the potential of SDN in providing assured on-demand services to end-users.

  12. Characterization of the 1st and 2nd EF-hands of NADPH oxidase 5 by fluorescence, isothermal titration calorimetry, and circular dichroism

    Wei Chin-Chuan

    2012-04-01

    Abstract Background Superoxide generated by non-phagocytic NADPH oxidases (NOXs) is of growing importance for physiology and pathobiology. The calcium binding domain (CaBD) of NOX5 contains four EF-hands, each binding one calcium ion. To better understand the metal binding properties of the 1st and 2nd EF-hands, we characterized the N-terminal half of CaBD (NCaBD) and its calcium-binding knockout mutants. Results The isothermal titration calorimetry measurements for NCaBD reveal that calcium binding at the two EF-hands is only loosely coupled, so the two events can be treated as independent. However, the Ca2+ binding studies on NCaBD(E31Q) and NCaBD(E63Q) showed their binding constants to be 6.5 × 105 and 5.0 × 102 M-1, with ΔHs of -14 and -4 kJ/mol, respectively, suggesting that intrinsic calcium binding at the 1st non-canonical EF-hand is largely enhanced by the binding of Ca2+ to the 2nd canonical EF-hand. The fluorescence quenching and CD spectra support a conformational change upon Ca2+ binding, which moves Trp residues toward a more non-polar and exposed environment and also increases the α-helix secondary structure content. All measurements exclude Mg2+ binding in NCaBD. Conclusions We demonstrated that the 1st non-canonical EF-hand of NOX5 has very weak Ca2+ binding affinity compared with the 2nd canonical EF-hand. Both EF-hands interact with each other in a cooperative manner to enhance their Ca2+ binding affinity. Our characterization reveals that the two EF-hands in the N-terminal NOX5 are Ca2+ specific.
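
    For independent sites, as the ITC analysis above treats the two EF-hands, the fractional occupancy at a given free Ca2+ concentration follows directly from the binding constant. The sketch below uses the constants reported in the abstract; the 10 µM free Ca2+ level in the test is an illustrative choice, not a value from the paper.

```python
def fraction_bound(ka, ca_free):
    """Fractional occupancy of an independent binding site:
    theta = Ka * [Ca] / (1 + Ka * [Ca])."""
    return ka * ca_free / (1.0 + ka * ca_free)

KA_EF2 = 6.5e5   # M^-1, 2nd (canonical) EF-hand, from the abstract
KA_EF1 = 5.0e2   # M^-1, intrinsic affinity of the 1st (non-canonical) EF-hand
```

    At micromolar free Ca2+ the canonical hand is mostly occupied while the non-canonical hand is essentially empty, which is the asymmetry the abstract describes.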

  13. Editorial: 2nd Special Issue on behavior change, health, and health disparities.

    Higgins, Stephen T

    2015-11-01

    This Special Issue of Preventive Medicine (PM) is the 2nd that we have organized on behavior change, health, and health disparities. This is a topic of fundamental importance to improving population health in the U.S. and other industrialized countries that are trying to more effectively manage chronic health conditions. There is broad scientific consensus that personal behavior patterns such as cigarette smoking, other substance abuse, and physical inactivity/obesity are among the most important modifiable causes of chronic disease and its adverse impacts on population health. As such behavior change needs to be a key component of improving population health. There is also broad agreement that while these problems extend across socioeconomic strata, they are overrepresented among more economically disadvantaged populations and contribute directly to the growing problem of health disparities. Hence, behavior change represents an essential step in curtailing that unsettling problem as well. In this 2nd Special Issue, we devote considerable space to the current U.S. prescription opioid addiction epidemic, a crisis that was not addressed in the prior Special Issue. We also continue to devote attention to the two largest contributors to preventable disease and premature death, cigarette smoking and physical inactivity/obesity as well as risks of co-occurrence of these unhealthy behavior patterns. Across each of these topics we included contributions from highly accomplished policy makers and scientists to acquaint readers with recent accomplishments as well as remaining knowledge gaps and challenges to effectively managing these important chronic health problems. PMID:26257372

  14. A novel 2nd-order bandpass MFSS filter with miniaturized structure

    C. Y. Fang

    2015-08-01

    In order to obtain a miniaturized structure and good filtering properties, we propose a novel 2nd-order bandpass metamaterial frequency selective surface (MFSS) filter which contains two capacitive layers and one inductive layer, with multi-loop metallic patches acting as the shunt capacitance C and planar wire grids acting as the series inductance L. Unlike the traditional design, in which the tuned elements of the resonant surface are approximately one wavelength in circumference and the layers are spaced a quarter wavelength apart, the proposed MFSS filter achieves a miniaturized structure with ideal bandpass properties by adjusting the values of L and C to set the LC coupling resonance and by matching the multilayer dielectric to set the resonance impedance. Measurement results for the fabricated prototype of the bandpass filter (BPF) indicate that the dimension of the tuned element on the resonant surface is approximately 0.025 wavelength, i.e., 0.025λ. At the same time, the filter has a stable center frequency of f0 = 1.53 GHz, a transmittance of T ⩾ 96.3%, and a high Q-value for TE/TM wave polarization at various incidence angles. The novel 2nd-order bandpass MFSS filter with miniaturized structure not only decreases the structure dimension, but also has a wide range of applications in the microwave and infrared bands.
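
    The LC coupling resonance described above follows the standard tank-circuit relation. The sketch below states that relation and backs out the capacitance that would tune an assumed 10 nH inductance to the reported 1.53 GHz center frequency; the inductance value is purely illustrative, as the abstract does not give L.

```python
import math

def resonant_frequency(inductance, capacitance):
    """LC tank resonance: f0 = 1 / (2*pi*sqrt(L*C))."""
    return 1.0 / (2.0 * math.pi * math.sqrt(inductance * capacitance))

f0 = 1.53e9  # Hz, center frequency reported in the abstract
L = 10e-9    # H, illustrative assumption only
C = 1.0 / ((2.0 * math.pi * f0) ** 2 * L)  # capacitance that tunes the tank to f0
```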

  15. Proceedings of the 2nd technical meeting on high temperature gas-cooled reactors

    From the point of view of establishing and upgrading the technology basis of HTGRs, the 2nd Technical Meeting on High Temperature Gas-cooled Reactors (HTGRs) was held on March 11 and 12, 1992, at the Tokai Research Establishment in order to review the present status and the results of Research and Development (R and D) on HTGRs, to discuss the items of R and D which should be promoted more actively in the future, and thereby to help in determining the strategy of development of high temperature engineering and examination in JAERI. At the 2nd Technical Meeting, which followed the 1st Technical Meeting held in February 1990 at the Tokai Research Establishment, expectations for the High Temperature Engineering Test Reactor (HTTR), possible contributions of HTGRs to the preservation of the global environment, and the prospects of HTGRs were especially discussed, focusing on the R and D of safety, high temperature components and process heat utilization, by experts from JAERI as well as universities, national institutes, industries and so on. This proceedings summarizes the papers presented in the oral sessions and the materials exhibited in the poster session at the meeting, and will be valuable as key material for promoting the R and D on HTGRs from now on. (author)

  16. Efficacy and Safety of rAAV2-ND4 Treatment for Leber's Hereditary Optic Neuropathy.

    Wan, Xing; Pei, Han; Zhao, Min-Jian; Yang, Shuo; Hu, Wei-Kun; He, Heng; Ma, Si-Qi; Zhang, Ge; Dong, Xiao-Yan; Chen, Chen; Wang, Dao-Wen; Li, Bin

    2016-01-01

    Leber's hereditary optic neuropathy (LHON) is a mitochondrially inherited disease leading to blindness. A mitochondrial DNA point mutation at the 11778 nucleotide site of the NADH dehydrogenase subunit 4 (ND4) gene is the most common cause. The aim of this study was to evaluate the efficacy and safety of a recombinant adeno-associated virus 2 (AAV2) carrying ND4 (rAAV2-ND4) in LHON patients carrying the G11778A mutation. Nine patients were administered rAAV2-ND4 by intravitreal injection to one eye and then followed for 9 months. Ophthalmologic examinations of visual acuity, visual field, and optical coherence tomography were performed. Physical examinations included routine blood and urine. The visual acuity of the injected eyes of six patients improved by at least 0.3 log MAR after 9 months of follow-up. In these six patients, the visual field was enlarged but the retinal nerve fibre layer remained relatively stable. No other outcome measure was significantly changed. None of the nine patients had local or systemic adverse events related to the vector during the 9-month follow-up period. These findings support the feasible use of gene therapy for LHON. PMID:26892229

  17. Predicting Municipal Solid Waste Generation through Time Series Method (ARMA Technique) and System Dynamics Modeling (Vensim Software)

    A Ebrahimi

    2016-06-01

    Background and Objective: Predicting municipal solid waste generation has an important role in solid waste management. The aim of this study was to predict municipal solid waste generation in Isfahan through a time series method and system dynamics modeling. Materials and Methods: Verified data on solid waste generation were collected from the Waste Management Organization, and population information was collected from the National Statistics Center, Iran, for the period 1996-2011. Next, the effect on solid waste generation of factors such as population, urbanization, and gross domestic product was investigated. Moreover, the relationship between each of these factors was identified using a generalized estimating equation model. Finally, the quantity of solid waste generated in Isfahan city was predicted using system dynamics modeling with Vensim software and a time series method with the ARMA technique. Results: It was found that population and gross domestic product have a significant relationship with the amount of solid waste, with P values of 0.026 and 0, respectively. The annual average of municipal solid waste generation in 2021 was estimated at 1501.4 ton/day by the time series method and 1436 ton/day by system dynamics modeling. In addition, the average annual growth rate was 3.44%. Conclusion: According to the results obtained, population and gross domestic product have a significant effect on MSW generation. Municipal solid waste generation will increase in the future. The increase in solid waste is not the same across different areas and methods. The prediction of the time series method by the ARMA technique gives precise results compared with other methods.
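
    The study relies on dedicated ARMA tooling; to illustrate the simplest member of that model family, the sketch below fits an AR(1) model, x[t] = c + phi*x[t-1], by least squares in pure Python and rolls it forward. It is a toy stand-in for a full ARMA fit, not the method used in the paper.

```python
def fit_ar1(series):
    """Least-squares fit of x[t] = c + phi * x[t-1]; returns (c, phi)."""
    x, y = series[:-1], series[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    phi = sum((a - mx) * (b - my) for a, b in zip(x, y)) / \
          sum((a - mx) ** 2 for a in x)
    return my - phi * mx, phi

def forecast(series, steps, c, phi):
    """Iterate the fitted recurrence `steps` periods past the last value."""
    out, last = [], series[-1]
    for _ in range(steps):
        last = c + phi * last
        out.append(last)
    return out
```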

  18. E-Filing Case Management Services in the US Federal Courts: The Next Generation: A Case Study

    J.Michael Greenwood

    2015-07-01

    The Administrative Office of the U.S. Courts (AOUSC) was responsible for developing the Case Management/Electronic Case File system (legacy CM/ECF), originally implemented in 1996 to serve the federal courts. The AOUSC is presently developing its 2nd-generation service (NextGen). The IJCA carried an earlier narrative of CM/ECF's evolution. This second IJCA article describes the approach taken to define and develop that 2nd-generation CM/ECF system. This article reviews the methodology used for determining requirements; the new software tools and hardware technologies used; and the expanded functions and enhanced services being incorporated into the new product. Also included is an exploration of the various obstacles, problems, and organizational issues that occur when transitioning from a legacy system to one that is more modern and complex.

  19. A Customer Value Creation Framework for Businesses That Generate Revenue with Open Source Software

    Aparna Shanker

    2012-03-01

    Technology entrepreneurs must create value for customers in order to generate revenue. This article examines the dimensions of customer value creation and provides a framework to help entrepreneurs, managers, and leaders of open source projects create value, with an emphasis on businesses that generate revenue from open source assets. The proposed framework focuses on a firm's pre-emptive value offering (also known as a customer value proposition). This is a firm's offering of the value it seeks to create for a customer, in order to meet his or her requirements.

  20. Net Generation at Social Software: Challenging Assumptions, Clarifying Relationships and Raising Implications for Learning

    Valtonen, Teemu; Dillon, Patrick; Hacklin, Stina; Vaisanen, Pertti

    2010-01-01

    This paper takes as its starting point assumptions about use of information and communication technology (ICT) by people born after 1983, the so called net generation. The focus of the paper is on social networking. A questionnaire survey was carried out with 1070 students from schools in Eastern Finland. Data are presented on students' ICT-skills…

  1. Massive coordination of dispersed generation using PowerMatcher based software agents

    One of the outcomes of the EU Fifth Framework CRISP project (http://crisp.ecn.nl/) has been the development of a real-time control strategy based on the application of distributed intelligence (ICT) to coordinate demand and supply in electricity grids. This PowerMatcher approach has been validated in two real-life and real-time field tests. The experiments aimed at controlled coordination of dispersed electricity suppliers (DG-RES) and demanders in distributed grids enabled by ICT networks. Optimization objectives for the technology in the tests were minimization of imbalance in a commercial portfolio and mitigation of strong load variations in a distribution network with residential micro-CHPs. With respect to the number of ICT nodes, the field tests were on a relatively small scale. However, application of the technology has yielded some very encouraging results on both occasions. In the present paper, lessons learned from the field experiments are discussed. Furthermore, it contains an account of the roadmap for scaling up these field tests with a larger number of nodes and with more diverse appliance/installation types. Due to its autonomous decision-making agent paradigm, the PowerMatcher software technology is expected to be far more scalable than central coordination approaches. Indeed, it is based on microeconomic theory and is expected to work best if applied on a massive scale in transparent market settings. A set of various types of supply and demand appliances was defined and implemented in a PowerMatcher software simulation environment. A massive number of these PowerMatcher node agents, each representing such a device type, was utilized in a number of scenario calculations. As the production of DG-RES resources and the demand profiles are strongly dependent on the time of year, climate scenarios leading to operational snapshots of the cluster were taken for a number of representative periods. The results of these larger scale simulations as
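
    At the heart of a PowerMatcher-style market is the search for an equilibrium price at which aggregate demand and supply balance. The sketch below finds that price by bisection over agent bid curves; the bid functions in the example are invented illustrations, not actual PowerMatcher agents, and the monotonicity assumption is stated in the docstring.

```python
def equilibrium_price(bids, p_min=0.0, p_max=1.0, iters=60):
    """Find the price at which aggregate demand crosses zero.
    Each bid is a function price -> power (positive = consumption,
    negative = production), assumed non-increasing in price."""
    def net_demand(p):
        return sum(bid(p) for bid in bids)
    lo, hi = p_min, p_max
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if net_demand(mid) > 0:
            lo = mid          # still excess demand: price must rise
        else:
            hi = mid          # excess supply: price must fall
    return 0.5 * (lo + hi)

# Example: one price-elastic consumer and one producer (invented bids).
# Net demand 2 - 5p crosses zero at p = 0.4.
price = equilibrium_price([lambda p: 2.0 - 2.0 * p, lambda p: -3.0 * p])
```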

  2. SMOG 2: A Versatile Software Package for Generating Structure-Based Models.

    Jeffrey K Noel

    2016-03-01

    Molecular dynamics simulations with coarse-grained or simplified Hamiltonians have proven to be an effective means of capturing the functionally important long-time and large-length scale motions of proteins and RNAs. Originally developed in the context of protein folding, structure-based models (SBMs) have since been extended to probe a diverse range of biomolecular processes, spanning from protein and RNA folding to functional transitions in molecular machines. The hallmark feature of a structure-based model is that part, or all, of the potential energy function is defined by a known structure. Within this general class of models, there exist many possible variations in resolution and energetic composition. SMOG 2 is a downloadable software package that reads user-designated structural information and user-defined energy definitions, in order to produce the files necessary to use SBMs with high performance molecular dynamics packages: GROMACS and NAMD. SMOG 2 is bundled with XML-formatted template files that define commonly used SBMs, and it can process template files that are altered according to the needs of each user. This computational infrastructure also allows for experimental or bioinformatics-derived restraints or novel structural features to be included, e.g. novel ligands, prosthetic groups and post-translational/transcriptional modifications. The code and user guide can be downloaded at http://smog-server.org/smog2.
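
    A common energetic ingredient of the coarse-grained structure-based models that SMOG 2 generates is a 12-10 native-contact potential whose minimum sits at the native pair distance. The sketch below shows that functional form only; the actual terms SMOG 2 emits are defined by its XML templates, so this is an illustration of the model class, not SMOG's output.

```python
def native_contact_energy(r, sigma, epsilon=1.0):
    """12-10 contact potential common in C-alpha structure-based models:
    V(r) = eps * (5*(sigma/r)**12 - 6*(sigma/r)**10).
    The minimum is -eps, reached exactly at the native distance r = sigma."""
    s = sigma / r
    return epsilon * (5.0 * s**12 - 6.0 * s**10)
```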

  3. SMOG 2: A Versatile Software Package for Generating Structure-Based Models

    Noel, Jeffrey K.; Levi, Mariana; Raghunathan, Mohit; Lammert, Heiko; Hayes, Ryan L.; Onuchic, José N.; Whitford, Paul C.

    2016-01-01

    Molecular dynamics simulations with coarse-grained or simplified Hamiltonians have proven to be an effective means of capturing the functionally important long-time and large-length scale motions of proteins and RNAs. Originally developed in the context of protein folding, structure-based models (SBMs) have since been extended to probe a diverse range of biomolecular processes, spanning from protein and RNA folding to functional transitions in molecular machines. The hallmark feature of a structure-based model is that part, or all, of the potential energy function is defined by a known structure. Within this general class of models, there exist many possible variations in resolution and energetic composition. SMOG 2 is a downloadable software package that reads user-designated structural information and user-defined energy definitions, in order to produce the files necessary to use SBMs with high performance molecular dynamics packages: GROMACS and NAMD. SMOG 2 is bundled with XML-formatted template files that define commonly used SBMs, and it can process template files that are altered according to the needs of each user. This computational infrastructure also allows for experimental or bioinformatics-derived restraints or novel structural features to be included, e.g. novel ligands, prosthetic groups and post-translational/transcriptional modifications. The code and user guide can be downloaded at http://smog-server.org/smog2. PMID:26963394

  4. SMOG 2: A Versatile Software Package for Generating Structure-Based Models.

    Noel, Jeffrey K; Levi, Mariana; Raghunathan, Mohit; Lammert, Heiko; Hayes, Ryan L; Onuchic, José N; Whitford, Paul C

    2016-03-01

    Molecular dynamics simulations with coarse-grained or simplified Hamiltonians have proven to be an effective means of capturing the functionally important long-time and large-length scale motions of proteins and RNAs. Originally developed in the context of protein folding, structure-based models (SBMs) have since been extended to probe a diverse range of biomolecular processes, spanning from protein and RNA folding to functional transitions in molecular machines. The hallmark feature of a structure-based model is that part, or all, of the potential energy function is defined by a known structure. Within this general class of models, there exist many possible variations in resolution and energetic composition. SMOG 2 is a downloadable software package that reads user-designated structural information and user-defined energy definitions, in order to produce the files necessary to use SBMs with high performance molecular dynamics packages: GROMACS and NAMD. SMOG 2 is bundled with XML-formatted template files that define commonly used SBMs, and it can process template files that are altered according to the needs of each user. This computational infrastructure also allows for experimental or bioinformatics-derived restraints or novel structural features to be included, e.g. novel ligands, prosthetic groups and post-translational/transcriptional modifications. The code and user guide can be downloaded at http://smog-server.org/smog2. PMID:26963394

  5. Multigrid preconditioning of steam generator two-phase mixture balance equations in the Genepi software

    Belliard, Michel

    2006-01-01

    Within the framework of averaged two-phase mixture flow simulations of PWR Steam Generators (SG), this paper provides a geometric version of a pseudo-FMG FAS preconditioning of the balance equations used in the CEA Genepi code. The 3D steady-state flow is reached by a transient computation using a fractional step algorithm and a projection method. Our application is based on the PVM package. The difficulties of applying geometric FAS multigrid methods to the balance ...

  6. Software for pre-processing Illumina next-generation sequencing short read sequences

    Chen, Chuming; Khaleel, Sari S; Huang, Hongzhan; Cathy H Wu

    2014-01-01

    Background When compared to Sanger sequencing technology, next-generation sequencing (NGS) technologies are hindered by shorter sequence read length, higher base-call error rate, non-uniform coverage, and platform-specific sequencing artifacts. These characteristics lower the quality of their downstream analyses, e.g. de novo and reference-based assembly, by introducing sequencing artifacts and errors that may contribute to incorrect interpretation of data. Although many tools have been devel...
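
    A typical pre-processing step for Illumina reads of the kind these tools perform is 3'-end quality trimming, since base-call error rates rise toward the end of a read. A minimal sketch follows; the Phred threshold of 20 is an illustrative default, not a value from the paper.

```python
def quality_trim(seq, quals, threshold=20):
    """Trim a read from the 3' end while the base quality is below
    `threshold`. `quals` are Phred scores; returns (trimmed_seq, trimmed_quals)."""
    end = len(seq)
    while end > 0 and quals[end - 1] < threshold:
        end -= 1
    return seq[:end], quals[:end]
```

    Applied before assembly or alignment, this removes the low-confidence tail bases that would otherwise introduce the artifacts the abstract describes.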

  7. Splicing Express: a software suite for alternative splicing analysis using next-generation sequencing data

    Kroll, Jose E.; Kim, JiHoon; Ohno-Machado, Lucila; de Souza, Sandro J.

    2015-01-01

    Motivation. Alternative splicing events (ASEs) are prevalent in the transcriptome of eukaryotic species and are known to influence many biological phenomena. The identification and quantification of these events are crucial for a better understanding of biological processes. Next-generation DNA sequencing technologies have allowed deep characterization of transcriptomes and made it possible to address these issues. ASEs analysis, however, represents a challenging task especially when many dif...

  8. LISP software generative compilation within the frame of a SLIP system

    After having outlined the limitations associated with the use of some programming languages (Fortran, Algol, assembler, and so on) and the interest of the LISP structure and its associated language, the author notes that some problems remain regarding the memorisation of the computing process obtained by interpretation. He therefore introduces a generative compiler which produces an executable programme, written in a language very close to the machine language used, i.e. the FAP assembler language

  9. Sustainable development - a role for nuclear power? 2nd scientific forum

    The 2nd Scientific Forum of the International Atomic Energy Agency (IAEA) was held during the 43rd General Conference. This paper summarizes the deliberations of the two-day Forum. The definition of 'sustainable development' of the 1987 Brundtland Commission - 'development that meets the needs of the present without compromising the ability of future generations to meet their own needs' - provided the background for the Forum's debate whether and how nuclear power could contribute to sustainable energy development. The framework for this debate comprises different perspectives on economic, energy, environmental, and political considerations. Nuclear power, along with all energy generating systems, should be judged on these considerations using a common set of criteria (e.g., emission levels, economics, public safety, wastes, and risks). First and foremost, there is a growing political concern over the possible adverse impact of increasing emissions of greenhouse gases from fossil fuel combustion. However, there is debate as to whether this would have any material impact on the predominantly economic criteria currently used to make investment decisions on energy production. According to the views expressed, the level of safety of existing nuclear power plants is no longer a major concern - a view not yet fully shared by the general public. The need to maintain the highest standards of safety in operation remains, especially under the mounting pressure of competitiveness in deregulated and liberalized energy markets. The industry must continuously reinforce a strong safety culture among reactor designers, builders, and operators. Furthermore, a convincing case for safety will have to be made for any new reactor designs. Of greater concern to the public and politicians are the issues of radioactive waste and proliferation of nuclear weapons. There is a consensus among technical experts that radioactive wastes from nuclear power can be disposed of safely and

  10. TestDose: A nuclear medicine software based on Monte Carlo modeling for generating gamma camera acquisitions and dosimetry

    Garcia, Marie-Paule, E-mail: marie-paule.garcia@univ-brest.fr; Villoing, Daphnée [UMR 1037 INSERM/UPS, CRCT, 133 Route de Narbonne, 31062 Toulouse (France); McKay, Erin [St George Hospital, Gray Street, Kogarah, New South Wales 2217 (Australia); Ferrer, Ludovic [ICO René Gauducheau, Boulevard Jacques Monod, St Herblain 44805 (France); Cremonesi, Marta; Botta, Francesca; Ferrari, Mahila [European Institute of Oncology, Via Ripamonti 435, Milano 20141 (Italy); Bardiès, Manuel [UMR 1037 INSERM/UPS, CRCT, 133 Route de Narbonne, Toulouse 31062 (France)

    2015-12-15

    Purpose: The TestDose platform was developed to generate scintigraphic imaging protocols and associated dosimetry by Monte Carlo modeling. TestDose is part of a broader project (www.dositest.com) whose aim is to identify the biases induced by different clinical dosimetry protocols. Methods: The TestDose software handles the whole pipeline from virtual patient generation to the resulting planar and SPECT images and dosimetry calculations. The originality of the approach relies on the implementation of functional segmentation for the anthropomorphic model representing a virtual patient. Two anthropomorphic models are currently available: 4D XCAT and ICRP 110. A pharmacokinetic model describes the biodistribution of a given radiopharmaceutical in each defined compartment at various time-points. The Monte Carlo simulation toolkit GATE offers the possibility to accurately simulate scintigraphic images and absorbed doses in volumes of interest. The TestDose platform relies on GATE to reproduce precisely any imaging protocol and to provide reference dosimetry. For image generation, TestDose stores the user's imaging requirements and automatically generates the command files used as input for GATE. Each compartment is simulated only once and the resulting output is weighted using pharmacokinetic data. The resulting compartment projections are aggregated to obtain the final image. For dosimetry computation, emission data are stored in the platform database and the relevant GATE input files are generated for the virtual patient model and associated pharmacokinetics. Results: Two samples of software runs are given to demonstrate the potential of TestDose. A clinical imaging protocol for the Octreoscan™ therapeutic treatment was implemented using the 4D XCAT model. Whole-body “step and shoot” acquisitions at different times postinjection and one SPECT acquisition were generated within reasonable computation times. Based on the same Octreoscan™ kinetics, a dosimetry
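
    The compartment-weighting step described above (simulate each compartment once, then weight by its pharmacokinetic activity and sum) can be sketched as follows; the compartment names, weights and image sizes are hypothetical illustrations, not TestDose code:

```python
def combine_projections(projections, activities):
    """Weight each compartment's single simulated projection by its
    pharmacokinetic activity at the acquisition time, then sum the
    weighted projections into one image (2D lists of floats)."""
    names = list(projections)
    rows = len(projections[names[0]])
    cols = len(projections[names[0]][0])
    image = [[0.0] * cols for _ in range(rows)]
    for name in names:
        w = activities[name]
        proj = projections[name]
        for r in range(rows):
            for c in range(cols):
                image[r][c] += w * proj[r][c]
    return image

# Hypothetical per-compartment projections and activity weights
projs = {"liver": [[1.0, 1.0], [1.0, 1.0]],
         "kidneys": [[2.0, 2.0], [2.0, 2.0]]}
acts = {"liver": 0.6, "kidneys": 0.2}
print(combine_projections(projs, acts)[0][0])  # 1.0
```

The point of the design is that the expensive Monte Carlo step runs once per compartment, while any time-activity curve can be applied afterwards as a cheap weighted sum.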

  12. Software emulator of nuclear pulse generation with different pulse shapes and pile-up

    Pechousek, Jiri; Konecny, Daniel; Novak, Petr; Kouril, Lukas; Kohout, Pavel; Celiktas, Cuneyt; Vujtek, Milan

    2016-08-01

    The optimal detection of output signals from nuclear counting devices represents one of the key physical factors that govern accuracy and experimental reproducibility. In this context, the fine calibration of the detector under diverse experimental scenarios, although time-costly, is necessary. However, this process can be made easier with the use of emulator systems. In this report we describe an innovative programmable pulse generator device capable of emulating scintillation detector signals, mimicking detector performance under a variety of experimental conditions. The emulator generates a defined number of pulses, with a given shape and amplitude, in the form of a sampled detector signal. The emulator output is then used off-line by a spectrometric system in order to set up its optimal performance. Three types of pulse shapes are produced by our device, with the possibility to add noise and pulse pile-up effects to the signal. The efficiency of pulse detection, pile-up rejection and/or correction, together with the dead-time of the system, are then analyzed using specific pulse-processing algorithms, and the results obtained validate the beneficial use of emulators for the accurate calibration of spectrometric systems.
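
    The emulation idea (a defined number of pulses with a given shape and amplitude in a sampled signal, with optional noise and pile-up) can be sketched as below; the exponential pulse shape, lengths and parameters are hypothetical illustrations, not the device's actual implementation:

```python
import math
import random

def exp_pulse(n_samples, amplitude, tau):
    """One sampled pulse with an exponentially decaying shape."""
    return [amplitude * math.exp(-i / tau) for i in range(n_samples)]

def emulate_signal(n_pulses, length, pulse_len=64, amplitude=1.0,
                   tau=10.0, noise=0.0, seed=0):
    """Place n_pulses at random positions in a sampled signal.
    Overlapping pulses simply add, which produces pile-up; optional
    Gaussian noise can be superimposed."""
    rng = random.Random(seed)
    signal = [0.0] * length
    shape = exp_pulse(pulse_len, amplitude, tau)
    for _ in range(n_pulses):
        start = rng.randrange(length - pulse_len)
        for i, s in enumerate(shape):
            signal[start + i] += s
    if noise:
        signal = [v + rng.gauss(0.0, noise) for v in signal]
    return signal

sig = emulate_signal(n_pulses=5, length=1000)
print(len(sig), max(sig) >= 1.0)  # pile-up can exceed one pulse amplitude
```

Feeding such a known, repeatable signal to the pulse-processing chain is what lets the detection efficiency, pile-up handling and dead-time be measured against ground truth.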

  13. Tetragonal ZrO2:Nd3+ nanosphere: Combustion synthesis, luminescence and photoacoustic spectroscopy

    Gupta, Santosh K.; Chandrasekhar, D.; Kadam, R. M.

    2015-12-01

    Nanocrystalline ZrO2:Nd3+ was synthesised using the gel-combustion method and characterized systematically using X-ray diffraction (XRD) and transmission electron microscopy (TEM). Through this route the technologically more important metastable tetragonal phase can be stabilized at 500 °C through the addition of 1 mol % Nd3+. Optical characterization of the sample was done using photoluminescence (PL) and photoacoustic spectroscopy (PAS). PL studies show an intense 1065 nm emission peak, corresponding to the 4F3/2 → 4I11/2 transition, with a high stimulated emission cross section, and thus the material is a probable laser material. PAS was used to investigate the electronic absorption of Nd3+ in zirconia. Various covalency parameters, such as the nephelauxetic ratio (β), covalency factor (b1/2) and Sinha parameter (δ), were evaluated for the pure oxide powder as well as for Nd3+-doped zirconia.
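
    The covalency parameters mentioned above have standard textbook definitions; a sketch of their computation from a pair of transition energies (the wavenumbers below are hypothetical, not the paper's data) is:

```python
def covalency_parameters(nu_complex, nu_free):
    """Standard covalency measures from an observed transition energy
    in the complex vs. the free ion (wavenumbers in cm^-1):

    beta  : nephelauxetic ratio, nu_complex / nu_free
    b12   : covalency factor b^(1/2) = sqrt((1 - beta) / 2)
    delta : Sinha parameter, (1 - beta) / beta * 100
    """
    beta = nu_complex / nu_free
    b12 = ((1 - beta) / 2) ** 0.5
    delta = (1 - beta) / beta * 100
    return beta, b12, delta

# Hypothetical wavenumbers for illustration only
beta, b12, delta = covalency_parameters(11400.0, 11500.0)
print(round(beta, 4))  # 0.9913
```

A ratio β slightly below 1 (positive b1/2 and δ) indicates a mildly covalent metal-ligand bond, which is the kind of conclusion such parameters support.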

  14. Book Review: The Communicating Leader: The key to strategic alignment (2nd Ed.)

    X. C. Birkenbach

    2003-10-01

    Full Text Available Title: The Communicating Leader: The key to strategic alignment (2nd Ed.) Author: Gustav Puth Publisher: Van Schaik Publishers Reviewer: XC Birkenbach The aim of the book, according to the author, is "meant to be a usable tool, an instrument in the toolbox of the real leader and leadership student". The book is written in conversational style (as intended by the author), and its 219 pages in 10 chapters are logically packaged into three parts. While the main emphasis is naturally on leadership and communication, the coverage includes topics typically encountered in Organisational Behaviour or Management texts, e.g., organizational culture, managing change, motivation, conflict management and strategic management.

  15. 2nd International Conference on Education and Educational Technology (EET 2011)

    Education and Educational Technology

    2012-01-01

    This volume includes extended and revised versions of a set of selected papers from the 2011 2nd International Conference on Education and Educational Technology (EET 2011) held in Chengdu, China, October 1-2, 2011. The mission of EET 2011 Volume 1 is to provide a forum for researchers, educators, engineers, and government officials involved in the general areas of education and educational technology to disseminate their latest research results and exchange views on the future research directions of these fields. 130 related papers were selected for this volume. All the papers were reviewed by 2 program committee members and selected by the volume editor Prof. Yuanzhi Wang, from Intelligent Information Technology Application Research Association, Hong Kong. The conference will bring together leading researchers, engineers and scientists in the domain of interest. We hope every participant can have a good opportunity to exchange their research ideas and results and to discuss the state of the art in th...

  16. Proceedings of the 2nd seminar of R and D on advanced ORIENT

    The 2nd Seminar of R and D on advanced ORIENT was held at Ricotte, Japan Atomic Energy Agency, on November 7th, 2008. The first meeting of this seminar was held in Oarai, Ibaraki in May 2008, and more than fifty participants, including related researchers and members of the general public, attended. The second seminar was hosted by the Nuclear Science and Engineering Directorate, JAEA, in Tokai, Ibaraki, with 63 participants. Spent nuclear fuel should be recognized not only as a mass of radioactive elements but also as potentially useful material, including platinum metals and rare earth elements. Taking into consideration cooperation with universities, related companies and research institutes, we aimed at expanding and progressing the basic research. This report records abstracts and figures submitted by the oral speakers at this seminar. (author)

  17. Preliminary GPS orbit combination results of the IGS 2nd reprocessing campaign

    Choi, Kevin

    2015-04-01

    The International GNSS Service (IGS) has contributed to the International Terrestrial Reference Frame by reprocessing historic GPS network data and submitting Terrestrial Reference Frame solutions and Earth Rotation Parameters. For the 2nd reprocessing campaign, Analysis Centers (ACs) used up to 21 years of GPS observation data with daily integrations. IERS2010 conventions are applied to model the physical effects of the Earth. In total, eight ACs participated (7 global solutions and 2 tide gauge solutions), reprocessing the entire time series in a consistent way using the latest models and methodology. The IGS combined daily SINEX TRF and EOP solutions have already been submitted to the IERS for ITRF2013. This presentation mainly focuses on the preliminary quality assessment of the reprocessed AC orbits. The quality of the orbit products is assessed by examining the repeatability between daily AC satellite ephemerides. Power spectral analysis shows the background noise characteristics of each AC's products and their periodic behavior.
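
    The orbit repeatability check described above can be sketched as an RMS of per-satellite position differences where two daily solutions overlap (e.g. at the day boundary); the satellite IDs and coordinates below are hypothetical illustrations:

```python
import math

def rms(values):
    """Root-mean-square of a non-empty sequence of numbers."""
    return math.sqrt(sum(v * v for v in values) / len(values))

def day_boundary_repeatability(day1_positions, day2_positions):
    """RMS of per-satellite 3D position differences between two daily
    orbit solutions; inputs map satellite id -> (x, y, z) in metres."""
    diffs = [math.dist(p1, day2_positions[sat])
             for sat, p1 in day1_positions.items()]
    return rms(diffs)

# Two (tiny, hypothetical) daily solutions for the same epoch
d1 = {"G01": (0.0, 0.0, 0.0), "G02": (1.0, 0.0, 0.0)}
d2 = {"G01": (0.03, 0.0, 0.0), "G02": (1.0, 0.04, 0.0)}
print(round(day_boundary_repeatability(d1, d2), 4))  # 0.0354
```

Computing this statistic per day and per AC yields the time series whose power spectrum reveals each product's background noise and periodic signals.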

  18. Summary of the 2nd workshop on ion beam-applied biology

    Induction of novel plant resources by ion beam irradiation has been investigated in JAERI. To share knowledge of the present status of the field and to find out future plans, the 1st Workshop on ion beam-applied biology was held last year, titled 'Development of breeding technique for ion beams'. To further improve research cooperation and to exchange useful information in the field, researchers inside JAERI met with researchers outside, such as those from agricultural experiment stations, companies, and universities, at the 2nd workshop on ion beam-applied biology, titled 'Future development of breeding technique for ion beams'. People from RIKEN, the Institute of Radiation Breeding, the Wakasa Wan Energy Research Center, and the National Institute of Radiological Science also participated in this workshop. Twelve of the presented papers are indexed individually. (J.P.N.)

  19. 2nd Canada-China joint workshop on supercritical-water-cooled reactors (CCSC-2010)

    The 2nd Canada-China Joint Workshop on Supercritical-Water-Cooled Reactors (CCSC-2010) was held in Toronto, Ontario, Canada on April 25-25, 2010. This joint workshop aimed at providing a forum for discussion of advancements and issues, sharing information and technology transfer, and establishing future collaborations on research and developments for supercritical water-cooled reactors (SCWR) between Canadian and Chinese research organizations. Participants were those involved in research and development of SCWR core design, materials, chemistry, corrosion, thermalhydraulics, and safety analysis at organizations in Canada and China. Papers related to the following topics were of interest to the workshop: reactor core and fuel designs; materials, chemistry and corrosion; thermalhydraulics and safety analysis; balance of plant; and other applications.

  20. 2nd International Conference on Electrical Systems, Technology and Information 2015

    Tanoto, Yusak; Lim, Resmana; Santoso, Murtiyanto; Pah, Nemuel

    2016-01-01

    This book includes the original, peer-reviewed research papers from the 2nd International Conference on Electrical Systems, Technology and Information (ICESTI 2015), held in September 2015 at Patra Jasa Resort & Villas Bali, Indonesia. Topics covered include: Mechatronics and Robotics, Circuits and Systems, Power and Energy Systems, Control and Industrial Automation, and Information Theory. It explores emerging technologies and their application in a broad range of engineering disciplines, including communication technologies and smart grids. It examines hybrid intelligent and knowledge-based control, embedded systems, and machine learning. It also presents emerging research and recent applications in green energy systems and storage. It discusses the role of electrical engineering in biomedical, industrial and mechanical systems, as well as multimedia systems and applications, computer vision and image and signal processing. The primary objective of this series is to provide references for disseminat...

  1. 2nd International Colloquium on Sports Science, Exercise, Engineering and Technology 2015

    Sulaiman, Norasrudin; Adnan, Rahmat

    2016-01-01

    The proceeding is a collection of research papers presented at the 2nd International Colloquium on Sports Science, Exercise, Engineering and Technology (ICoSSEET2015), a conference dedicated to addressing the challenges in the areas of sports science, exercise, sports engineering and technology, including other areas of sports, thereby presenting a consolidated view to the interested researchers in the aforesaid fields. The goal of this conference was to bring together researchers and practitioners from academia and industry to focus on the scope of the conference and establish new collaborations in these areas. The topics of interest are mainly (1) Sports and Exercise Science, (2) Sports Engineering and Technology Application, and (3) Sports Industry and Management.

  2. 2nd Symposium on Fluid-Structure-Sound Interactions and Control

    Liu, Yang; Huang, Lixi; Hodges, Dewey

    2014-01-01

    With rapid economic and industrial development in China, India and elsewhere, fluid-related structural vibration and noise problems are widely encountered in many fields, just as they are in the more developed parts of the world, causing increasingly grievous concerns. Turbulence clearly has a significant impact on many such problems. On the other hand, new opportunities are emerging with the advent of various new technologies, such as signal processing, flow visualization and diagnostics, new functional materials, sensors and actuators, etc. These have revitalized interdisciplinary research activities, and it is in this context that the 2nd symposium on fluid-structure-sound interactions and control (FSSIC) was organized. Held in Hong Kong (May 20-21, 2013) and Macau (May 22-23, 2013), the meeting brought together scientists and engineers working in all related branches from both East and West and provided them with a forum to exchange and share the latest progress, ideas and advances and to chart the fronti...

  3. 2nd International Conference on Education and Educational Technology (EET 2011)

    Education Management, Education Theory and Education Application

    2012-01-01

    This volume includes extended and revised versions of a set of selected papers from the 2011 2nd International Conference on Education and Educational Technology (EET 2011) held in Chengdu, China, October 1-2, 2011. The mission of EET 2011 Volume 2 is to provide a forum for researchers, educators, engineers, and government officials involved in the general areas of education management, education theory and education application to disseminate their latest research results and exchange views on the future research directions of these fields. 133 related papers were selected for this volume. All the papers were reviewed by 2 program committee members and selected by the volume editor Prof. Yuanzhi Wang, from Intelligent Information Technology Application Research Association, Hong Kong. The conference will bring together leading researchers, engineers and scientists in the domain of interest. We hope every participant can have a good opportunity to exchange their research ideas and results and to discus...

  4. Analysis and implementation of reactor protection system circuits - a case study of Egypt's 2nd research reactor

    This work presents a way to design and implement the trip unit of a reactor protection system (RPS) using a field programmable gate array (FPGA). Instead of the traditional embedded microprocessor-based interface design method, a proposed tailor-made FPGA-based circuit is built to substitute the trip unit (TU) used in Egypt's 2nd research reactor, ETRR-2. The existing embedded system is built around the STD32 field computer bus, which is used in industrial and process control applications. It is modular, rugged, reliable and easy to use, and is able to support a large mix of I/O cards and to easily change its configuration in the future. Therefore, the same bus is still used in the proposed design. The state machine of this bus is designed around its timing diagrams and implemented in VHDL to interface the designed TU circuit

  5. Proceedings of the 2nd joint seminar on atomic collisions and heavy ion induced nuclear reactions

    The meeting of the 2nd joint seminar on atomic collisions and heavy ion induced nuclear reactions was held at the University of Tokyo, May 13 and 14, 1982. The aim of this seminar has been not only to recognize the common problems lying between above two research fields, but also to obtain an overview of the theoretical and experimental approaches to clear the current problems. In the seminar, more than 50 participants gathered and presented 16 papers. These are two general reviews and fourteen comprehensive surveys on topical subjects which have been developed very intensively in recent years. The editors would like to thank all participants for their assistance and cooperation in making possible a publication of these proceedings. (author)

  6. A Perpendicular Biased 2nd Harmonic Cavity for the Fermilab Booster

    Tan, C. Y.; Dey, J. [Fermilab; Madrak, R. L. [Fermilab; Pellico, W. [Fermilab; Romanov, G. [Fermilab; Sun, D. [Fermilab; Terechkine, I. [Fermilab

    2015-07-13

    A perpendicular biased 2nd harmonic cavity is currently being designed for the Fermilab Booster. Its purpose is to flatten the bucket at injection and thus change the longitudinal beam distribution so that space charge effects are decreased. It can also help with transition crossing. The reason for the choice of perpendicular biasing over parallel biasing is that the Q of the cavity is much higher, which allows the accelerating voltage to be a factor of two higher than in a similar parallel biased cavity. This cavity will also provide a higher accelerating voltage per meter than the present folded transmission line cavity. However, this type of cavity presents technical challenges that need to be addressed. The two major issues are cooling of the garnet material, heated by the RF, and of the cavity itself, subject to eddy current heating from the 15 Hz bias field ramp. This paper will address the technical challenge of preventing the garnet from overheating.

  7. Proceedings of the 2nd annual meeting of Japanese Society of Radiation Safety Management 2003 Tsukuba

    This is the program and the proceedings of the 2nd annual meeting of Japanese Society of Radiation Safety Management held from December 3rd through the 5th of 2003. The sessions held were: (1) Research on Low-level Waste, (2) Topics related to Detector, Measurement, and Instrument, (3) Dose Level and Imaging Plate, (4) Radiation, (5) Safety Education and Safety Evaluation. The poster sessions held were: (1) Safety Education, Safety Evaluation, Shielding, and so on, (2) Control System and Control Technology, (3) Detector and Radiation Measurement, (4) Topics Related to Imaging Plate, (5) Environment and Radiation Measurement, and (6) Radiation Control. Symposia held were: (1) 'Regarding Basic Concept to Incorporate International Exemption Level in Regulation' as the keynote lecture and (2) 'Regarding Correspondence Associated with Legal Revision and Radiation Safety Regulation'. Regarding these topics, after the explanation from each area, panel discussions were held. (S.K.)

  8. Belief Functions: Theory and Applications - Proceedings of the 2nd International Conference on Belief Functions

    Masson, Marie-Hélène

    2012-01-01

    The theory of belief functions, also known as evidence theory or Dempster-Shafer theory, was first introduced by Arthur P. Dempster in the context of statistical inference, and was later developed by Glenn Shafer as a general framework for modeling epistemic uncertainty. These early contributions have been the starting points of many important developments, including the Transferable Belief Model and the Theory of Hints. The theory of belief functions is now well established as a general framework for reasoning with uncertainty, and has well understood connections to other frameworks such as probability, possibility and imprecise probability theories.   This volume contains the proceedings of the 2nd International Conference on Belief Functions that was held in Compiègne, France on 9-11 May 2012. It gathers 51 contributions describing recent developments both on theoretical issues (including approximation methods, combination rules, continuous belief functions, graphical models and independence concepts) an...

  9. Nonlinear Dynamics of Memristor Based 2nd and 3rd Order Oscillators

    Talukdar, Abdul Hafiz

    2011-05-01

    Exceptional behaviours of the Memristor are illustrated in Memristor-based second order (Wien oscillator) and third order (phase shift oscillator) oscillator systems in this thesis. Conventional concepts about sustained oscillation are challenged by demonstrating the possibility of sustained oscillation with oscillating resistance and dynamic poles. Mathematical models are also proposed for analysis, and simulations are presented to support the surprising characteristics of the Memristor-based oscillator systems. This thesis also describes a comparative study among the Wien family oscillators with one Memristor. In the case of the phase shift oscillator, one-Memristor and three-Memristor systems are illustrated and compared to generalize the nonlinear dynamics observed for both 2nd order and 3rd order systems. Detailed explanations are provided with analytical models to clarify the unconventional properties of Memristor-based oscillatory systems.

  10. 2nd FP7 Conference and International Summer School Nanotechnology : From Fundamental Research to Innovations

    Yatsenko, Leonid

    2015-01-01

    This book presents some of the latest achievements in nanotechnology and nanomaterials from leading researchers in Ukraine, Europe, and beyond. It features contributions from participants in the 2nd International Summer School “Nanotechnology: From Fundamental Research to Innovations” and International Research and Practice Conference “Nanotechnology and Nanomaterials”, NANO-2013, which were held in Bukovel, Ukraine on August 25-September 1, 2013. These events took place within the framework of the European Commission FP7 project Nanotwinning, and were organized jointly by the Institute of Physics of the National Academy of Sciences of Ukraine, University of Tartu (Estonia), University of Turin (Italy), and Pierre and Marie Curie University (France). Internationally recognized experts from a wide range of universities and research institutions share their knowledge and key results on topics ranging from nanooptics, nanoplasmonics, and interface studies to energy storage and biomedical applications. Pr...

  11. 2nd international KES conference on Smart Education and Smart e-Learning

    Howlett, Robert; Jain, Lakhmi

    2015-01-01

    This book contains the contributions presented at the 2nd international KES conference on Smart Education and Smart e-Learning, which took place in Sorrento, Italy, June 17-19, 2015. It contains a total of 45 peer-reviewed book chapters that are grouped into several parts: Part 1 – Smart Education, Part 2 – Smart Educational Technology, Part 3 – Smart e-Learning, Part 4 – Smart Professional Training and Teachers’ Education, and Part 5 – Smart Teaching and Training related Topics. This book can be a useful source of research data and valuable information for faculty, scholars, Ph.D. students, administrators, and practitioners - those who are interested in innovative areas of smart education and smart e-learning.

  12. THR Simulator – the software for generating radiographs of THR prosthesis

    Hou Sheng-Mou

    2009-01-01

    Full Text Available Abstract Background: Measuring the orientation of the acetabular cup after total hip arthroplasty is important for prognosis. The verification of these measurement methods will be easier and more feasible if we can synthesize prosthesis radiographs in each simulated condition. One reported method used an expensive mechanical device with an indeterminable precision. We thus developed a program, THR Simulator, to directly synthesize digital radiographs of prostheses for further analysis. Under the Windows platform, using the Borland C++ Builder programming tool, we developed the THR Simulator. We first built a mathematical model of the acetabulum and femoral head. Data on the real dimensions of the prosthesis were adopted to generate the radiograph of the hip prosthesis. Then, with a ray tracing algorithm, we calculated the thickness each X-ray beam passed through, and transformed it to grey scale by a mapping function derived by fitting an exponential function to the phantom image. Finally we could generate a simulated radiograph for further analysis. Results: Using THR Simulator, users can incorporate many parameters together for radiograph synthesis. These parameters include thickness, film size, tube distance, film distance, anteversion, abduction, upper wear, medial wear, and posterior wear, and are adequate for any radiographic measurement research. THR Simulator has been used in two studies, and the errors are within 2° for anteversion and 0.2 mm for wear measurement. Conclusion: We designed a program, THR Simulator, that can synthesize prosthesis radiographs. Such a program can be applied in future studies for further analysis and validation of measurements of various parameters of the pelvis after total hip arthroplasty.
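
    The thickness-to-grey mapping described above can be sketched with a simple Beer-Lambert exponential attenuation model; the attenuation coefficient and grey range below are hypothetical, not the fitted mapping function derived from the phantom image:

```python
import math

def thickness_to_grey(thickness_mm, mu=0.05, max_grey=255):
    """Map the material thickness traversed by one X-ray beam to a grey
    value via exponential attenuation, I = I0 * exp(-mu * t).
    mu is a hypothetical effective attenuation coefficient per mm."""
    transmitted = math.exp(-mu * thickness_mm)
    return round(max_grey * transmitted)

print(thickness_to_grey(0.0))   # 255: no material, full exposure
print(thickness_to_grey(20.0))  # thicker material transmits less
```

In the actual pipeline, the ray tracer supplies `thickness_mm` per pixel and the mapping constants come from fitting the phantom image rather than being fixed by hand.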

  13. A Software Method for Generating Concurrent Pwm Signal from Pic18f4520 for Biomimetic Robotic Fish Control

    M.O. Afolayan

    2013-06-01

    Full Text Available A method of generating multiple pulse width modulated (PWM) signals with a phase difference is presented in this work. The microcontroller used is a PIC18F4520, and its output is used to drive Futaba RC servo motors directly. The concurrency of the PWM signals in this work is relative, because there is a finite time (the microcontroller period) between the instructions toggling the output pins. This finite time equals the minimum period of the microcontroller, which is 125 ns in this work. The phase difference for the servo motors is set to 60° and is achieved by changing the duty cycle of each channel while the period remains 20 ms for all channels. The robotic fish using this software PWM code was able to attain a linear speed of 0.985 m/s.
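
    The timing scheme described above (equal 20 ms periods with per-channel phase offsets) can be sketched as follows; this is an illustrative host-side calculation of the switching times, not the PIC18F4520 firmware:

```python
def pwm_schedule(n_channels, period_ms=20.0, duty_ms=1.5, phase_deg=60.0):
    """Start/stop times (ms) of the high pulse for each phase-shifted PWM
    channel within one period. Servo-style pulses: duty_ms high, rest low.
    The duty width here is a hypothetical example value."""
    offset_ms = period_ms * phase_deg / 360.0  # 60 deg of 20 ms = 3.333 ms
    schedule = []
    for ch in range(n_channels):
        start = (ch * offset_ms) % period_ms
        stop = (start + duty_ms) % period_ms
        schedule.append((start, stop))
    return schedule

for ch, (start, stop) in enumerate(pwm_schedule(3)):
    print(f"channel {ch}: high from {start:.3f} ms to {stop:.3f} ms")
```

On the microcontroller itself the same schedule is realised by toggling output pins at these instants, so consecutive toggles are still separated by at least the 125 ns instruction period.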

  14. The MeqTrees software system and its use for third-generation calibration of radio interferometers

    Noordam, J. E.; Smirnov, O. M.

    2010-12-01

    Context. The formulation of the radio interferometer measurement equation (RIME) for a generic radio telescope by Hamaker et al. has provided us with an elegant mathematical apparatus for better understanding, simulation and calibration of existing and future instruments. The calibration of the new radio telescopes (LOFAR, SKA) would be unthinkable without the RIME formalism, and new software to exploit it. Aims: The MeqTrees software system is designed to implement numerical models, and to solve for arbitrary subsets of their parameters. It may be applied to many problems, but was originally geared towards implementing Measurement Equations in radio astronomy for the purposes of simulation and calibration. The technical goal of MeqTrees is to provide a tool for rapid implementation of such models, while offering performance comparable to hand-written code. We are also pursuing the wider goal of increasing the rate of evolution of radio astronomical software, by offering a tool that facilitates rapid experimentation, and exchange of ideas (and scripts). Methods: MeqTrees is implemented as a Python-based front-end called the meqbrowser, and an efficient (C++-based) computational back-end called the meqserver. Numerical models are defined on the front-end via a Python-based Tree Definition Language (TDL), then rapidly executed on the back-end. The use of TDL facilitates an extremely short turn-around time (hours rather than weeks or months) for experimentation with new ideas. This is also helped by unprecedented visualization capabilities for all final and intermediate results. A flexible data model and a number of important optimizations in the back-end ensures that the numerical performance is comparable to that of hand-written code. Results: MeqTrees is already widely used as the simulation tool for new instruments (LOFAR, SKA) and technologies (focal plane arrays). It has demonstrated that it can achieve a noise-limited dynamic range in excess of a million, on
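
    The tree-based modelling idea can be illustrated generically: a numerical model is a tree of nodes whose values are computed from their children, so new models are assembled from small reusable operations. This is a sketch of the concept only, not the actual MeqTrees/TDL API:

```python
class Node:
    """Minimal expression-tree node: an operation applied to the results
    of its child nodes (a generic illustration of tree-defined models)."""

    def __init__(self, op, children=(), value=None):
        self.op, self.children, self.value = op, tuple(children), value

    def evaluate(self):
        if self.op == "const":
            return self.value
        results = [c.evaluate() for c in self.children]
        if self.op == "add":
            return sum(results)
        if self.op == "mul":
            out = 1.0
            for r in results:
                out *= r
            return out
        raise ValueError(f"unknown op {self.op!r}")

# The model (2 + 3) * 4 expressed as a tree
tree = Node("mul", [Node("add", [Node("const", value=2.0),
                                 Node("const", value=3.0)]),
                    Node("const", value=4.0)])
print(tree.evaluate())  # 20.0
```

In a measurement-equation setting the leaves would be parameters or data rather than constants, and a solver would adjust selected leaf values to fit observations.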

  15. Cataloging of the Northern Sky from the POSS-II using a Next-Generation Software Technology

    Djorgovski, S. G.; Weir, N.; Fayyad, U.

    Digitization of the Second Palomar Observatory Sky Survey (POSS-II) is now in progress at STScI. The resulting data set, the Palomar-STScI Digital Sky Survey (DPOSS), will consist of about 3 TB of pixel data. In order to extract useful information from this data set quickly, uniformly, and efficiently, we have developed a software system to catalog, calibrate, classify, maintain, and analyse the scans, called the Sky Image Cataloging and Analysis Tool (SKICAT). It is a suite of programs designed to facilitate the maintenance and analysis of astronomical surveys comprising multiple, overlapping images and/or catalogs. The system serves three principal functions: catalog construction (including object classification), catalog management, and catalog analysis. It provides a powerful, integrated environment for the manipulation and scientific investigation of catalogs from virtually any source. The system is a testbed for practical astronomical applications of AI technology, including machine learning, expert systems, etc., used for astronomical catalog generation and analysis. The system also provides tools to merge these catalogs into a large, complex database which may be easily queried, modified, and upgraded (e.g., as more or better calibration data are added). For example, we make considerable use of the GID3* decision tree induction software. The resulting Palomar Northern Sky Catalog (PNSC) is expected to contain galaxies and stars in three colors, down to the survey limiting magnitude, with the star-galaxy classification accurate to 90--95 percent. The catalog will be continuously upgraded as more calibration data become available. It will be made available to the community via computer networks and/or suitable media, probably in installments, as soon as scientific validation and quality checks are completed. Analysis software (parts of SKICAT) will also be freely available. A vast variety of scientific projects will be possible with this database
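
    Decision-tree induction for star-galaxy classification, as SKICAT does with GID3*, can be sketched with scikit-learn as a stand-in (GID3* itself is not publicly packaged). The features (concentration index, FWHM) and all data below are synthetic, chosen only to illustrate the approach.

```python
# Sketch of decision-tree star/galaxy classification in the spirit of
# SKICAT's GID3* induction, using scikit-learn's DecisionTreeClassifier
# as a stand-in. Features and data are synthetic (assumed, illustrative).
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
n = 500
# Hypothetical image features: [concentration index, FWHM in arcsec].
# Stars: compact (high concentration, small FWHM); galaxies: extended.
stars = np.column_stack([rng.normal(0.9, 0.05, n), rng.normal(2.0, 0.3, n)])
gals  = np.column_stack([rng.normal(0.5, 0.10, n), rng.normal(4.0, 0.8, n)])
X = np.vstack([stars, gals])
y = np.array([0] * n + [1] * n)  # 0 = star, 1 = galaxy

clf = DecisionTreeClassifier(max_depth=4).fit(X, y)
acc = clf.score(X, y)
print(f"training accuracy: {acc:.3f}")  # well-separated classes -> near 1.0
```

    A shallow tree suffices here because the synthetic classes are well separated; real survey classification uses many more attributes and validated training sets.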

  16. GONe: Software for estimating effective population size in species with generational overlap

    Coombs, J.A.; Letcher, B.H.; Nislow, K.H.

    2012-01-01

    GONe is a user-friendly, Windows-based program for estimating effective size (Ne) in populations with overlapping generations. It uses the Jorde-Ryman modification to the temporal method to account for age structure in populations. This method requires estimates of age-specific survival and birth rate, and allele frequencies measured in two or more consecutive cohorts. Allele frequencies are acquired by reading in genotypic data from files formatted for either GENEPOP or TEMPOFS. For each interval between consecutive cohorts, Ne is estimated at each locus and over all loci. Furthermore, Ne estimates are output for three different genetic drift estimators (Fs, Fc and Fk). Confidence intervals are derived from a chi-square distribution with degrees of freedom equal to the number of independent alleles. GONe has been validated over a wide range of Ne values, and for scenarios where survival and birth rates differ between sexes, sex ratios are unequal and reproductive variances differ. GONe is freely available for download. © 2011 Blackwell Publishing Ltd.
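
    The chi-square confidence interval described above can be sketched as follows. This uses the standard temporal-method interval (df·F divided by chi-square quantiles, with df = number of independent alleles); GONe's exact internals may differ, and the F value and allele count below are hypothetical.

```python
# Sketch of a chi-square confidence interval for a temporal drift
# estimator F, with df = number of independent alleles (standard
# temporal-method form; GONe's exact implementation may differ).
from scipy.stats import chi2

def drift_ci(F, df, alpha=0.05):
    # Bounds: df*F / chi2 upper quantile (lower bound),
    #         df*F / chi2 lower quantile (upper bound).
    lo = df * F / chi2.ppf(1 - alpha / 2, df)
    hi = df * F / chi2.ppf(alpha / 2, df)
    return lo, hi

F_c = 0.012        # hypothetical standardized variance in allele frequency
n_alleles = 40     # hypothetical number of independent alleles across loci
lo, hi = drift_ci(F_c, n_alleles)
print(f"95% CI for F: [{lo:.4f}, {hi:.4f}]")
```

    The interval is then propagated through the chosen estimator to give a confidence interval on Ne itself.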

  17. Geochemical evidence of paleogeography and paleoclimate during deposition of the 2nd Member of Kongdian Formation in Kongnan area

    2007-01-01

    The 2nd Member of the Kongdian Formation comprises large amounts of oil shale and mudstone in the Kongnan area of the Huanghua depression, around Bohai Bay. During deposition of the 2nd Member, the lake basins in the Kongnan area were large and deep; the lakes were closed, with no connection to sea water, and the paleoclimate was warm and wet. By analyzing the contents of trace elements and rare earth elements, and the carbon and oxygen isotopes, the authors confirm these two conclusions.

  18. Efficacy and Safety of rAAV2-ND4 Treatment for Leber’s Hereditary Optic Neuropathy

    Xing Wan; Han Pei; Min-jian Zhao; Shuo Yang; Wei-kun Hu; Heng He; Si-qi Ma; Ge Zhang; Xiao-yan Dong; Chen Chen; Dao-wen Wang; Bin Li,

    2016-01-01

    Leber’s hereditary optic neuropathy (LHON) is a mitochondrially inherited disease leading to blindness. A mitochondrial DNA point mutation at the 11778 nucleotide site of the NADH dehydrogenase subunit 4 (ND4) gene is the most common cause. The aim of this study was to evaluate the efficacy and safety of a recombinant adeno-associated virus 2 (AAV2) carrying ND4 (rAAV2-ND4) in LHON patients carrying the G11778A mutation. Nine patients were administered rAAV2-ND4 by intravitreal injection to o...

  19. Numerical Simulation of the Francis Turbine and CAD Used to Optimize the Runner Design (2nd).

    Sutikno, Priyono

    2010-06-01

    Hydro power is the most important renewable energy source on earth. The water is free of charge, and the generation of electric energy in a hydroelectric power station produces negligible greenhouse gases (mainly CO₂). Hydro power stations are long-term installations that can be used for 50 years and more, so care must be taken to guarantee smooth and safe operation over the years. Maintenance is necessary, and critical parts of the machines have to be replaced when required. In modern engineering, numerical flow simulation plays an important role in optimizing the hydraulic turbine in conjunction with the connected components of the plant. Especially for rehabilitation and upgrading of existing power plants, the important concerns are to predict the power output of the turbine, to achieve maximum hydraulic efficiency, to avoid or minimize cavitation, and to avoid or minimize vibration over the whole operating range. Flow simulation can help to solve operational problems and to optimize the turbomachinery of hydroelectric generating stations or their components through intuitive optimization, mathematical optimization, parametric design, reduction of cavitation through design, prediction of the draft tube vortex, and trouble-shooting. The classic graphic-analytical design method is cumbersome and cannot reveal the positive or negative aspects of the design options, so it became necessary to move from the classical design methods to an adequate design method using CAD software. The many options chosen during the design calculation at a specific design step can then be verified, both as an ensemble and in detail. The final graphic post-processing is performed only for the optimal solution, through a 3D representation of the runner as a whole, for final approval of the geometric shape. In this article the redesign of the hydraulic turbine's runner was investigated.

  20. The new generation of the software system used for the schematic-parametric optimization of multiple-circuit heat supply systems

    Sokolov, D. V.; Stennikov, V. A.; Oshchepkova, T. B.; Barakhtenko, Ye. A.

    2012-04-01

    The authors describe the new generation of the software system intended for the schematic-parametric optimization of multi-circuit heat supply systems (MC HSS), which makes it possible to perform calculations for systems of arbitrary structure, with any set of nodes, sections, and circuits. The expanded architecture of the software system, used to organize a flexible, adaptive model of computational process management, is presented.

  1. RPC Stereo Processor (rsp) - a Software Package for Digital Surface Model and Orthophoto Generation from Satellite Stereo Imagery

    Qin, R.

    2016-06-01

    Large-scale Digital Surface Models (DSM) are very useful for many geoscience and urban applications. Recently developed dense image matching methods have popularized the use of image-based, very high resolution DSM. Many commercial/public tools implementing such matching methods are available for perspective images, but handy tools for satellite stereo images are rare. In this paper, a software package, the RPC (rational polynomial coefficient) stereo processor (RSP), is introduced for this purpose. RSP implements a full pipeline of DSM and orthophoto generation from RPC-modelled satellite imagery (level 1+), including level 2 rectification, geo-referencing, point cloud generation, pan-sharpening, DSM resampling and ortho-rectification. A modified hierarchical semi-global matching method is used as the current matching strategy. Owing to its high memory efficiency and optimized implementation, RSP can be used on a normal PC to produce large-format DSM and orthophotos. The tool was developed for internal use, and may be acquired by researchers for academic and non-commercial purposes to promote 3D remote sensing applications.
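
    The RPC camera model at the core of such a pipeline maps normalized ground coordinates to image coordinates through ratios of polynomials. Real RPCs use 20-term cubic polynomials in latitude, longitude, and height; the 4-term linear basis and all coefficients below are made up purely for illustration.

```python
# Toy sketch of RPC (rational polynomial coefficient) ground-to-image
# projection: normalize (lat, lon, h), evaluate a ratio of polynomials,
# de-normalize to an image row. Real RPCs use 20-term cubic polynomials;
# this 4-term linear basis and all coefficients are made up.
def rpc_project(lat, lon, h, num, den, offsets, scales):
    # Normalize ground coordinates to roughly [-1, 1].
    P = (lat - offsets["lat"]) / scales["lat"]
    L = (lon - offsets["lon"]) / scales["lon"]
    H = (h - offsets["h"]) / scales["h"]
    terms = [1.0, L, P, H]  # toy basis; real RPCs use 20 cubic terms
    poly = lambda c: sum(ci * t for ci, t in zip(c, terms))
    r = poly(num) / poly(den)          # normalized image row
    return r * scales["row"] + offsets["row"]  # de-normalize

offsets = {"lat": 40.0, "lon": -83.0, "h": 200.0, "row": 5000.0}
scales = {"lat": 0.1, "lon": 0.1, "h": 500.0, "row": 5000.0}
num = [0.0, 0.02, 0.98, 0.01]   # made-up numerator coefficients
den = [1.0, 0.001, 0.002, 0.0]  # made-up denominator coefficients
row = rpc_project(40.05, -83.02, 300.0, num, den, offsets, scales)
print(f"row = {row:.1f}")
```

    Stereo DSM generation inverts two such models jointly: matched image points in both views are triangulated back to the (lat, lon, h) that best reproduces both projections.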

  2. Curriculum on the Edge of Survival: How Schools Fail to Prepare Students for Membership in a Democracy. 2nd Edition

    Heller, Daniel

    2012-01-01

    Typically, school curriculum has been viewed through the lens of preparation for the workplace or higher education, both worthy objectives. However, this is not the only lens, and perhaps not even the most powerful one to use, if the goal is to optimize the educational system. "Curriculum on the Edge of Survival, 2nd Edition," attempts to define…

  3. VALOIR 2012 2nd Workshop on Managing the Client Value Creation Process in Agile Projects: Message from the Chairs

    Pérez, Jennifer; Buglione, Luigi; Daneva, Maya; Dieste, Oscar; Jedlitschka, Andreas; Juristo, Natalia

    2012-01-01

    Welcome to the 2nd Workshop on Managing the Client Value Creation Process in Agile Projects (VALOIR) at the PROFES 2012 conference! The overall goal of VALOIR is to make the knowledge on value creation and management explicit, encouraging the discussion on the use of measurement and estimation appro

  4. Report from the 2nd Summer School in Computational Biology organized by the Queen's University of Belfast

    Frank Emmert-Streib

    2014-12-01

    In this paper, we present a meeting report for the 2nd Summer School in Computational Biology organized by the Queen's University of Belfast. We describe the organization of the summer school, its underlying concept and student feedback we received after the completion of the summer school.

  5. Official report of the 2nd Summer Youth Olympic Games: Nanjing 2014 : Share the Games, share our dreams

    2015-01-01

    The official report of the 2nd Youth Olympic Games is composed of one volume, published in English and available only in electronic form. However, the Nanjing Youth Olympic Games Organising Committee (NYOCOG) has also published a Chinese version, available in print only.

  6. Transition Energy and Oscillator Strength of 1s²2p – 1s²nd for Fe²³⁺ Ion

    WANG Zhi-Wen; LI Xin-Ru; HU Mu-Hong; LIU Ying; WANG Ya-Nan

    2008-01-01

    Transition energies, wavelengths and dipole oscillator strengths of 1s²2p – 1s²nd (3 ≤ n ≤ 9) for the Fe²³⁺ ion are calculated. The fine-structure splittings of the 1s²nd (n ≤ 9) states of this ion are also evaluated. The higher-order relativistic contribution to the energy is estimated under a hydrogenic approximation. The quantum defect of the Rydberg series 1s²nd is determined according to quantum defect theory; the energies of the highly excited states of this series (n ≥ 10) can be reliably predicted using these quantum defects as input. The results excellently agree with the experimental data available in the literature. Combining quantum defect theory with the discrete oscillator strengths, the oscillator strengths for transitions from the same initial state 1s²2p to highly excited 1s²nd states (n ≥ 10), and the oscillator strength density corresponding to the bound-free transitions, are obtained.
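
    The quantum-defect extrapolation used above rests on the standard Rydberg-series relation (symbols as conventionally defined in quantum defect theory, not taken verbatim from the paper):

```latex
% Quantum-defect form of the Rydberg-series energies: E_n is the term
% energy of the 1s^2 nd state relative to the series limit, Z_net the net
% charge seen by the Rydberg electron, R_M the mass-corrected Rydberg
% constant, and \delta_n the (nearly n-independent) quantum defect.
E_n \;=\; -\,\frac{Z_{\mathrm{net}}^{2}\, R_{M}}{\left(n - \delta_n\right)^{2}}
```

    Because δ_n converges rapidly along the series, fitting it at low n (3 ≤ n ≤ 9) allows the n ≥ 10 term energies to be predicted reliably.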

  7. The Influence of Neighborhood Density and Word Frequency on Phoneme Awareness in 2nd and 4th Grades

    Hogan, Tiffany P.; Bowles, Ryan P.; Catts, Hugh W.; Storkel, Holly L.

    2011-01-01

    Purpose: The purpose of this study was to test the hypothesis that two lexical characteristics--neighborhood density and word frequency--interact to influence performance on phoneme awareness tasks. Methods: Phoneme awareness was examined in a large, longitudinal dataset of 2nd and 4th grade children. Using the linear logistic test model, the relation…

  8. The Hyphen as a Syllabification Cue in Reading Bisyllabic and Multisyllabic Words among Finnish 1st and 2nd Graders

    Häikiö, Tuomo; Bertram, Raymond; Hyönä, Jukka

    2016-01-01

    Finnish ABC books present words with hyphens inserted at syllable boundaries. Syllabification by hyphens is abandoned in the 2nd grade for bisyllabic words, but continues for words with three or more syllables. The current eye movement study investigated how and to what extent syllable hyphens in bisyllabic ("kah-vi" "cof-fee")…

  9. Phase Relations of the CaO–SiO₂–Nd₂O₃ System and the Implication for Rare Earths Recycling

    Le, Thu Hoai; Malfliet, Annelies; Blanpain, Bart; Guo, Muxing

    2016-03-01

    CaO–SiO₂–Nd₂O₃ slags were equilibrated at 1773 K and 1873 K (1500 °C and 1600 °C) for 24 hours in Ar, and quenched in water to determine the operative phase relations. The composition and crystallinity of the phases in equilibrium were determined by EPMA-WDS and EBSD, respectively. Based on these analyses, the liquid stability region was accurately determined, and a large part of the isothermal section of the phase diagram was constructed. Data resulting from this work can be used to generate a thermodynamic database for rare-earth oxide-containing systems and to support further investigation on separation of rare earths from metallurgical slags or other residues through high-temperature processing.

  11. Production of artificial ionospheric layers by frequency sweeping near the 2nd gyroharmonic

    T. Pedersen

    2011-01-01

    Artificial ionospheric plasmas descending from the background F-region have been observed on multiple occasions at the High Frequency Active Auroral Research Program (HAARP) facility since it reached full 3.6 MW power. Proximity of the transmitter frequency to the 2nd harmonic of the electron gyrofrequency (2fce) has been noted as a requirement for their occurrence, and their disappearance after only a few minutes has been attributed to the increasing frequency mismatch at lower altitudes. We report new experiments employing frequency sweeps to match 2fce in the artificial plasmas as they descend. In addition to revealing the dependence on the 2fce resonance, this technique reliably produces descending plasmas in multiple transmitter beam positions and appears to increase their stability and lifetime. High-speed ionosonde measurements are used to monitor the altitude and density of the artificial plasmas during both the formation and decay stages.
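
    The 2fce target of the frequency sweep follows from the standard electron cyclotron relation f_ce = eB/(2πm_e). The sketch below evaluates it for an F-region field strength that is assumed for illustration, not taken from the paper.

```python
# Electron gyrofrequency f_ce = e*B / (2*pi*m_e), and its 2nd harmonic.
# The field value B is an assumed, typical high-latitude F-region figure,
# not a value quoted in the paper.
import math

e = 1.602176634e-19      # elementary charge, C
m_e = 9.1093837015e-31   # electron mass, kg

def gyrofrequency(B):
    """Electron cyclotron frequency in Hz for magnetic field B (tesla)."""
    return e * B / (2 * math.pi * m_e)

B = 5.0e-5  # ~0.5 gauss (assumed F-region field over HAARP)
print(f"2nd gyroharmonic: {2 * gyrofrequency(B) / 1e6:.2f} MHz")  # ~2.80 MHz
```

    Since B decreases slowly with altitude, matching 2fce in a descending plasma requires sweeping the transmitter frequency upward as the layer moves down through a stronger field region.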

  12. CELEBRATED APRIL 2nd – INTERNATIONAL DAY OF PERSONS WITH AUTISM

    Manuela KRCHANOSKA

    2014-09-01

    On April 2nd, the Macedonian Scientific Society for Autism organized, for the fourth time, an event on the occasion of the International Day of Persons with Autism. The cultural and artistic event was held at the Museum of the Macedonian Struggle under the motto "They are not alone, we are with them". The huge number of citizens attending only confirmed the motto: the hall of the Museum of the Macedonian Struggle seemed too small for the warm hearts of the audience. More than 300 guests were present, among them children with autism and their families, prominent professors, doctors, special educators and rehabilitators, psychologists, students and other citizens who gladly decided to enrich the event with their presence. The event was opened by the violinist Plamenka Trajkovska, who performed one piece. After her, the President of the Macedonian Scientific Society for Autism, PhD Vladimir Trajkovski, delivered his speech. The professor told the parents of autistic children, who were present in large numbers, not to lose hope and to fight for their children, and said that the Macedonian Scientific Society for Autism will provide tremendous support and assistance in this struggle.

  13. Deaf-mute teaching during the Spanish 2nd Republic period. A historical view

    Alfredo ALCINA MADUEÑO

    2011-07-01

    Deaf-mute teaching during the Spanish 2nd Republic period is an issue that has never been studied in depth, and some of its aspects have not been touched by any research. Deaf-mute education in this period shares the characteristics of general education, at least regarding this specific stage, mainly: a sustained economic and budgetary effort by the governments, methodological renovation, modernization of the educational system, expansion of the school network, teacher training, etcetera. It is nevertheless true that we can find a very marked, sometimes even controversial, idiosyncrasy, due not only to the opposition of the different political parties in power during the different two-year periods (31/37 and 34/35), but also to the decisions of the republican-socialist governments regarding educational policy. The Republican legacy in fact consists much more of documentary plans than of actual realisations, and would find more application within the political regime that followed and annihilated the Republic than within the period of the Republic itself. The primary sources considered (both legal and documentary) are the basis supporting the final conclusions provided in this exposition.

  14. Proceedings of the 2nd CSNI Specialist Meeting on Simulators and Plant Analysers

    The safe utilisation of nuclear power plants requires the availability of different computerised tools for analysing the plant behaviour and training the plant personnel. These can be grouped into three categories: accident analysis codes, plant analysers and training simulators. The safety analysis of nuclear power plants has traditionally been limited to the worst accident cases expected for the specific plant design. Many accident analysis codes have been developed for different plant types. The scope of the analyses has continuously expanded. The plant analysers are now emerging tools intended for extensive analysis of the plant behaviour using a best estimate model for the whole plant including the reactor and full thermodynamic process, both combined with automation and electrical systems. The comprehensive model is also supported by good visualisation tools. Training simulators with real time plant model are tools for training the plant operators to run the plant. Modern training simulators have also features supporting visualisation of the important phenomena occurring in the plant during transients. The 2nd CSNI Specialist Meeting on Simulators and Plant Analysers in Espoo attracted some 90 participants from 17 countries. A total of 49 invited papers were presented in the meeting in addition to 7 simulator system demonstrations. Ample time was reserved for the presentations and informal discussions during the four meeting days. (orig.)

  16. Academic Training - 2nd Term: 08.01.2007 - 31.03.2007

    2006-01-01

    2006 - 2007 ACADEMIC TRAINING PROGRAMME 2nd Term : 08.01.2007 - 31.03.2007 LECTURE SERIES Applied Superconductivity by V. Palmieri, INFN, Padova, It. 17, 18, 19 January 11:00-12:00 - Auditorium, bldg. 500 String Theory for Pedestrians by B. Zwiebach, M.I.T. Cambridge, USA 29, 30, 31 January 11:00-12:00 - Auditorium, bldg. 500 on 29, 30 January TH Auditorium on 31 January Introduction to Supersymmetry by D. Kaplan, Johns Hopkins University, Baltimore, USA 12, 13, 14, 15 February 11:00-12:00 - Auditorium, bldg. 500 The Hunt for the Higgs Particle by F. Zwirner, University of Padova, It 27, 28 February, 1st March 11:00-12:00 - Auditorium, bldg. 500 From Evolution Theory to Parallel and Distributed Genetic Programming by F. Fernandez de Vega 15, 16 March 11:00-12:00 - Auditorium, bldg. 500 The lectures are open to all those interested, without application. The abstract of the lectures, as well as any change to the above information (title, dates, time, place etc.) will be published in the WWW, and ...

  17. DRS // CUMULUS Oslo 2013. The 2nd International Conference for Design Education Researchers

    Liv Merete Nielsen

    2013-01-01

    14-17 May 2013, Oslo, Norway. We have received more than 200 full papers for the 2nd International Conference for Design Education Researchers in Oslo. This international conference is a springboard for sharing ideas and concepts about contemporary design education research. Contributors are invited to submit research that deals with different facets of contemporary approaches to design education research. All papers will be double-blind peer-reviewed. This conference is open to research in any aspect and discipline of design education. Conference theme: Design Learning for Tomorrow - Design Education from Kindergarten to PhD. Designed artefacts and solutions influence our lives and values, both from a personal and societal perspective. Designers, decision makers, investors and consumers hold different positions in the design process, but they all make choices that will influence our future visual and material culture. To promote sustainability and meet global challenges for the future, professional designers are dependent on critical consumers and a design literate general public. For this purpose design education is important for all. We propose that design education in general education represents both a foundation for professional design education and a vital requirement for developing the general public's competence for informed decision making. REGISTRATION AT http://www.hioa.no/DRScumulus

  18. Boundary value problems for the 2nd-order Seiberg-Witten equations

    Celso Melchiades Doria

    2005-02-01

    It is shown that the nonhomogeneous Dirichlet and Neumann problems for the 2nd-order Seiberg-Witten equation on a compact 4-manifold X admit a regular solution once the nonhomogeneous Palais-Smale condition ℋ is satisfied. The approach consists in applying elliptic techniques to the variational setting of the Seiberg-Witten equation. The gauge invariance of the functional allows the problem to be restricted to the Coulomb subspace 𝒞α of the configuration space ℭ. The coercivity of the 𝒮𝒲α-functional, when restricted to the Coulomb subspace, implies the existence of a weak solution. The regularity then follows from the boundedness of the L∞-norms of the spinor solutions and the gauge-fixing lemma.

  19. Study of Application for Excursion Observation Method in Primary School 2nd Grade Social Studies

    Ahmet Ali GAZEL

    2014-04-01

    This study aims to investigate how field trips are conducted in the 2nd grade of primary schools as a part of the social studies course. Data for this research were compiled from 143 permanent social studies teachers working throughout the 2011–2012 education year in the primary schools of central Kütahya and its districts. Data were compiled using the descriptive survey model. In the research, a measuring tool developed by the researcher after taking expert opinions was used. Data obtained from the research were transferred to computer and analyzed. In the analysis of the data, frequency and percentage values were used to determine the distribution, and a single-factor variance analysis and a t-test for independent samples were used to determine the significance of differences between the variables. As a result of the research, it was found that insufficient importance is given to the field trip method in social studies lessons. Most of the teachers using this method apply it in the spring months. Teachers usually make use of field trips, independent of the unit/topic, to increase students' motivation, and they generally use verbal expression in class after the tours. The biggest difficulty teachers encounter while using the tour-observation method is students' undisciplined behavior.

  20. The 2nd and 3rd lower molars development of in utero gamma irradiated mouse fetus and neonates

    Pregnant mice were irradiated with a single dose of gamma rays (0, 2, 4 or 6 Gy, cobalt-60) on day 10, 12, 14, 16 or 18 of pregnancy. The heads of the embryos and of the neonates were taken at consecutive intervals after irradiation, starting from day 16 of pregnancy until the 3rd day after delivery. The effect of irradiation on the development of the 2nd and 3rd lower molars was investigated on serial tissue sections over the consecutive periods of their organogenesis. Irradiation led to growth deficiency in the 2nd and 3rd molars and caused a delay in their development, observed in varying degree depending on the dose, the time of irradiation, and the time after irradiation. This belated development was manifested in morphogenesis, histogenesis, and the cyto- and functional differentiation of odontoblasts and ameloblasts. The study showed that the two-day delay in the development of the 2nd lower molar relative to the 1st lower molar does not diminish the later irradiation effect on the 2nd molar when compared with the immediate irradiation effect on the 1st molar (demonstrated in a previous study by Osman and Al-Achkar, 2001). On the contrary, the present study showed that the 2nd lower molar is more radiosensitive to the various doses than the 1st lower molar. It also showed that irradiation with the doses of 4 and 6 Gy leads to a delay in the formation of the 3rd lower molar's bud, and it does not go deeper beyond the lower molar. (Author)

  1. SrAl₂O₄:Eu²⁺, Nd³⁺ and Dy³⁺ Long Afterglow Phosphor

    何大伟; 吕菁华; 崔兴龙

    2003-01-01

    The SrAl₂O₄:Eu²⁺, Nd³⁺ and SrAl₂O₄:Eu²⁺, Dy³⁺ long afterglow phosphors were synthesized. Their excitation and emission spectra at different excitations, and their afterglow characteristics after the excitation power was removed, were analyzed. The effects of the Eu²⁺, Dy³⁺ and Nd³⁺ mole concentrations on the phosphorescence characteristics were also discussed. It is crucial to have trapping levels located at a depth suited to the thermal release rate at room temperature. The incorporation of Nd³⁺ ions as an auxiliary activator into the SrAl₂O₄:Eu²⁺ system causes very intense and long phosphorescence. The response time of the SrAl₂O₄:Eu²⁺, Dy³⁺ phosphor is quicker than that of SrAl₂O₄:Eu²⁺, Nd³⁺, but the phosphorescence characteristics of SrAl₂O₄:Eu²⁺, Nd³⁺ are much better than those of SrAl₂O₄:Eu²⁺, Dy³⁺. The integrated area of the excitation spectrum of the SrAl₂O₄:Eu²⁺, Nd³⁺ phosphor is larger than that of the SrAl₂O₄:Eu²⁺, Dy³⁺ phosphor within the range of 250~360 nm. For the phosphorescence characteristics of the SrAl₂O₄:Eu²⁺, Nd³⁺ system, the optimum concentration of the Nd³⁺ trivalent rare earth ions is 0.05 mol.

  2. The Influence of Instructional Climates on Time Spent in Management Tasks and Physical Activity of 2nd-Grade Students during Physical Education

    Logan, Samuel W.; Robinson, Leah E.; Webster, E. Kipling; Rudisill, Mary E.

    2015-01-01

    The purpose of this study is to determine the effect of two physical education (PE) instructional climates (mastery, performance) on the percentage of time students spent in a) moderate-to-vigorous physical activity (MVPA) and b) management tasks during PE in 2nd-grade students. Forty-eight 2nd graders (mastery, n = 23; performance, n = 25)…

  3. A Study of Performance and Effort Expectancy Factors among Generational and Gender Groups to Predict Enterprise Social Software Technology Adoption

    Patel, Sunil S.

    2013-01-01

    Social software technology has gained considerable popularity over the last decade and has had a great impact on hundreds of millions of people across the globe. Businesses have also expressed their interest in leveraging its use in business contexts. As a result, software vendors and business consumers have invested billions of dollars to use…

  4. Mechanosensitivity of the 2nd Kind: TGF-β Mechanism of Cell Sensing the Substrate Stiffness

    Cockerill, Max; Rigozzi, Michelle K.; Terentjev, Eugene M.

    2015-01-01

    Cells can sense forces applied to them, but also the stiffness of their environment. These are two different phenomena, and here we investigate the mechanosensitivity of the 2nd kind: how the cell can measure an elastic modulus at a single point of adhesion—and how the cell can receive and interpret the chemical signal released from the sensor. Our model uses the example of large latent complex of TGF-β as a sensor. Stochastic theory gives the rate of breaking of latent complex, which initiates the signaling feedback loop after the active TGF-β release and leads to a change of cell phenotype driven by the α-smooth muscle actin. We investigate the dynamic and steady-state behaviors of the model, comparing them with experiments. In particular, we analyse the timescale of approach to the steady state, the stability of the non-linear dynamical system, and how the steady-state concentrations of the key markers vary depending on the elasticity of the substrate. We discover a crossover region for values of substrate elasticity closely corresponding to that of the fibroblast to myofibroblast transition. We suggest that the cell could actively vary the parameters of its dynamic feedback loop to ‘choose’ the position of the transition region and the range of substrate elasticity that it can detect. In this way, the theory offers the unifying mechanism for a variety of phenomena, such as the myofibroblast conversion in fibrosis of wounds and lungs and smooth muscle cell dysfunction in cardiac disease. PMID:26448620

  5. Re-fighting the 2nd Anglo-Boer War: historians in the trenches

    Ian Van der Waag

    2012-02-01

Some one hundred years ago, South Africa was torn apart by the 2nd Anglo-Boer War (1899-1902). The war was a colossal psychological experience fought at great expense: it cost Britain twenty-two thousand men and £223 million. The social, economic and political cost to South Africa was greater than the statistics immediately indicate: at least ten thousand fighting men, in addition to the camp deaths, where a combination of indifference and incompetence resulted in the deaths of 27 927 Boers and at least 14 154 Black South Africans. Yet these numbers belie the consequences. It was easy for the British to 'forget' the pain of the war, which seemed so insignificant after the losses sustained in 1914-18. With a long history of far-off battles and foreign wars, the British casualties of the Anglo-Boer War became increasingly insignificant, as opposed to the lesser numbers held in the collective Afrikaner mind. This impact may be stated somewhat more candidly in terms of the war participation ratio for the belligerent populations. After all, not all South Africans fought in uniform. For the Australian colonies these varied between 4½ per thousand (New South Wales) and 42.3 per thousand (Tasmania); New Zealand's was 8 per thousand, Britain's 8½ per thousand, and Canada's 12.3 per thousand; while in parts of South Africa it was perhaps as high as 900 per thousand. The deaths and high South African participation ratio, together with the unjustness of the war in the eyes of most Afrikaners, introduced a bitterness, if not a hatred, which has cast long shadows upon twentieth-century South Africa.

  6. The Ratio of 2nd to 4th Digit Length in Korean Alcohol-dependent Patients

    Han, Changwoo; Bae, Hwallip; Lee, Yu-Sang; Won, Sung-Doo; Kim, Dai Jin

    2016-01-01

Objective: The ratio of 2nd to 4th digit length (2D:4D) is a sexually dimorphic trait. Men have a relatively shorter second digit than fourth digit. This ratio is thought to be influenced by a higher prenatal testosterone level or greater sensitivity to androgen. The purpose of this study is to investigate the relationship between alcohol dependence and 2D:4D in a Korean sample and whether 2D:4D can be a biologic marker in alcohol dependence. Methods: In this study, we recruited 87 male patients with alcohol dependence from the alcohol center of one psychiatric hospital and 52 healthy male volunteers who were all employees in the same hospital as controls. We captured images of the right and left hands of patients and controls using a scanner and extracted data with a graphics program. We measured the 2D:4D of each hand and compared the alcohol dependence group with the control group. We analyzed these ratios using an independent-samples t-test. Results: The mean 2D:4D of patients was 0.934 (right hand) and 0.942 (left hand), while the mean 2D:4D of controls was 0.956 (right hand) and 0.958 (left hand). Values for both hands were significantly lower for patients than controls (p<0.001, right hand; p=0.004, left hand). Conclusion: Patients who are alcohol dependent have a significantly lower 2D:4D than controls, similar to the results of previous studies, which suggest that a higher prenatal testosterone level in the gonadal period is related to alcoholism. Furthermore, 2D:4D is a possible predictive marker of alcohol dependence. PMID:27121425
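The group comparison used in the study (an independent-samples t-test on the 2D:4D values of patients versus controls) can be sketched with the Python standard library. The sample values below are hypothetical illustrations, not the study's raw data:

```python
import math
from statistics import mean, stdev

def pooled_t(sample_a, sample_b):
    """Independent-samples t statistic with pooled variance:
    t = (mean_a - mean_b) / sqrt(sp^2 * (1/n_a + 1/n_b))."""
    na, nb = len(sample_a), len(sample_b)
    va, vb = stdev(sample_a) ** 2, stdev(sample_b) ** 2
    # Pooled variance weights each group's variance by its degrees of freedom.
    sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)
    return (mean(sample_a) - mean(sample_b)) / math.sqrt(sp2 * (1 / na + 1 / nb))

# Hypothetical 2D:4D ratios: patients lower than controls, as the study reports.
patients = [0.930, 0.935, 0.938, 0.933]
controls = [0.955, 0.957, 0.954, 0.958]
t = pooled_t(patients, controls)  # negative: patient mean is below control mean
```

A negative t here simply reflects that the first group's mean is lower; significance would additionally require comparing |t| against the t distribution with n_a + n_b - 2 degrees of freedom.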

  7. The 2nd CAAP Convention & International Symposium on Modern and Contemporary English Literatures (June 8-9, 2013)

    2012-01-01

    In order to further promote literary scholarship and international academic exchange, the University of Pennsylvania-based Chinese/American Association for Poetry and Poetics (CAAP) will collaborate with the School of Foreign Languages and School of Humanities of Central China Normal University, Foreign Literature Studies, and Forum for World Literature Studies in hosting "The 2nd CAAP Convention and International Symposium on Literatures in English" (June 8 -9, 2013) in Wuhan, China. Scholars and writers all over the world are welcome.

  8. Simulation Research on an Automatic Software Test Data Generation Algorithm

    黄丽芬

    2012-01-01

Test data generation is the most crucial part of software testing, and improving automatic test data generation methods is important for raising the degree of software test automation. Traditional genetic algorithms tend to become trapped in local optima and converge slowly, which makes automatic test data generation inefficient. To address these defects of the genetic and ant colony algorithms, this paper proposes a new software test data generation algorithm based on a combination of the two. First, a genetic algorithm, with its global search ability, is used to find a near-optimal solution; this solution is then converted into the initial pheromone of an ant colony algorithm. Finally, the positive feedback mechanism of the ant colony algorithm quickly locates the best test data. Experimental results show that the proposed genetic-ant colony method improves the efficiency of software test data generation and has considerable practical value.
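The two-phase strategy in this abstract (a genetic algorithm for global search whose survivors seed the pheromone of an ant colony refinement) can be sketched roughly as follows. The program under test, the branch-distance fitness, and all parameters are illustrative assumptions, not the paper's actual setup:

```python
import random

def branch_distance(x):
    # Hypothetical program under test: we want an input covering
    # the branch `if x == 57`. Fitness 0 means the branch is covered.
    return abs(x - 57)

def genetic_search(pop_size=30, gens=40, lo=0, hi=1000):
    # Global phase: selection, arithmetic crossover, mutation.
    pop = [random.randint(lo, hi) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=branch_distance)
        parents = pop[:pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = (a + b) // 2                    # crossover
            if random.random() < 0.2:               # mutation
                child += random.randint(-10, 10)
            children.append(min(max(child, lo), hi))
        pop = parents + children
    return sorted(pop, key=branch_distance)

def aco_refine(seeds, iters=40):
    # Local phase: pheromone initialised from GA survivors; ants sample
    # near high-pheromone values, and deposits reinforce good candidates.
    pheromone = {s: 1.0 / (1.0 + branch_distance(s)) for s in seeds}
    best = min(pheromone, key=branch_distance)
    for _ in range(iters):
        total = sum(pheromone.values())
        for _ in range(10):                          # 10 ants per iteration
            r, acc = random.uniform(0, total), 0.0
            for s, p in pheromone.items():           # roulette-wheel choice
                acc += p
                if acc >= r:
                    cand = s + random.randint(-3, 3)
                    break
            if branch_distance(cand) < branch_distance(best):
                best = cand
            pheromone[cand] = pheromone.get(cand, 0.0) + 1.0 / (1.0 + branch_distance(cand))
        for s in pheromone:                          # evaporation
            pheromone[s] *= 0.9
    return best

random.seed(1)
best = aco_refine(genetic_search())
```

The division of labour mirrors the abstract: the GA narrows a large input domain quickly, and the ACO's positive feedback concentrates the remaining search around the most promising test data.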

  9. Proceedings of 2nd Korea-China Congress of Nuclear Medicine and the Korean Society Nuclear Medicine Spring Meeting 2000

These proceedings contain articles from the 2nd Korea-China Congress of Nuclear Medicine and the 2000 spring meeting of the Korean Society of Nuclear Medicine, held on May 17-19, 2000 in Seoul, Korea. The proceedings comprise six sessions, with the following subject titles: general nuclear medicine, neurology, oncology, radiopharmacy and biology, nuclear cardiology, physics and instrumentation, and so on. (Yi, J. H.)

  10. Teachers' Spatial Anxiety Relates to 1st- and 2nd-Graders' Spatial Learning

    Gunderson, Elizabeth A.; Ramirez, Gerardo; Beilock, Sian L.; Levine, Susan C.

    2013-01-01

Teachers' anxiety about an academic domain, such as math, can impact students' learning in that domain. We asked whether this relation held in the domain of spatial skill, given the importance of spatial skill for success in math and science and its malleability at a young age. We measured 1st- and 2nd-grade teachers' spatial anxiety…

  11. Evolution of Corruption in Sub-Saharan Africa - from Nkruma to Mutharika The 2nd: Case Study Of South Africa

    Mavhungu Abel Mafukata

    2016-01-01

Since Ghana became the first Sub-Saharan African country to gain independence, the region has experienced massive and costly political and bureaucratic corruption within public service and administration. The causes of this corruption, and its nature and forms, are wide-ranging and intertwined. In Sub-Saharan Africa, efforts to curb corruption have failed to eradicate it. The paper focuses on the period from Nkruma in Ghana to Mutharika the 2nd in Malawi. This paper reviewed existing literature on political and bureaucratic corrupt...

  12. 2nd Joint GOSUD/SAMOS Workshop, U.S.Coast Guard Base, Seattle, Washington, 10-12 June 2008.

    2008-01-01

On 10-12 June 2008, the NOAA Climate Observation Division sponsored the 2nd Joint Global Ocean Surface Underway Data (GOSUD)/Shipboard Automated Meteorological and Oceanographic System (SAMOS) Workshop in Seattle, WA, USA. The workshop focused on the ongoing collaboration between GOSUD and SAMOS and on addressing the needs of the research and operational community for high-quality underway oceanographic and meteorological observations from ships. The SAMOS initiative is working to improve access ...

  13. An Inquiry into Perceived Autonomy Support of Iranian EFL Learners: 2nd, 3rd and 4th Grade University Students

    Husain Abdulhay

    2015-01-01

Insight into the Iranian EFL learning environment is increasingly needed, consonant with the move away from the traditional, spoon-feeding rituals of indigenous Iranian teaching. To that end, the study examined grade-level differences in perceived autonomy support among 202 students in the context of Iranian universities. Exposure to an autonomy-supportive environment was examined in the 2nd, 3rd and 4th grade-levels through the administration of the Learning Climate Questionnaire ...

  14. CPAPD Held the 9th Joint Conference of Member Organizations and The 2nd Conference of the Board of Directors

    2014-01-01

On April 4, 2014, the Chinese People's Association for Peace and Disarmament (CPAPD) held the 9th Joint Conference of Member Organizations and the 2nd Conference of the Board of Directors in Beijing. Yan Junqi, Vice-Chairperson of the Standing Committee of the National People's Congress (NPC); Han Qide, Vice-Chairman of the National Committee of the

  15. 2nd Nordic NJF Seminar on Reindeer Husbandry Research "Reindeer herding and land use management - Nordic perspectives"

    Päivi Soppela

    2015-06-01

The 2nd NJF Seminar on Reindeer Husbandry Research was held at the Arctic Centre, University of Lapland, Rovaniemi, Finland, from 19 to 21 October 2014. The seminar was organised under the framework of the Reindeer Husbandry Research Section of NJF (Nordic Association of Agricultural Scientists), established in 2012. Over 100 Nordic and international delegates, including researchers, managers, educators, students and reindeer herders, participated in the seminar.

  16. White Paper Summary of 2nd ASTM International Workshop on Hydrides in Zirconium Alloy Cladding

    Sindelar, R. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Louthan, M. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); PNNL, B. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2015-05-29

    This white paper recommends that ASTM International develop standards to address the potential impact of hydrides on the long term performance of irradiated zirconium alloys. The need for such standards was apparent during the 2nd ASTM International Workshop on Hydrides in Zirconium Alloy Cladding and Assembly Components, sponsored by ASTM International Committee C26.13 and held on June 10-12, 2014, in Jackson, Wyoming. The potentially adverse impacts of hydrogen and hydrides on the long term performance of irradiated zirconium-alloy cladding on used fuel were shown to depend on multiple factors such as alloy chemistry and processing, irradiation and post irradiation history, residual and applied stresses and stress states, and the service environment. These factors determine the hydrogen content and hydride morphology in the alloy, which, in turn, influence the response of the alloy to the thermo-mechanical conditions imposed (and anticipated) during storage, transport and disposal of used nuclear fuel. Workshop presentations and discussions showed that although hydrogen/hydride induced degradation of zirconium alloys may be of concern, the potential for occurrence and the extent of anticipated degradation vary throughout the nuclear industry because of the variations in hydrogen content, hydride morphology, alloy chemistry and irradiation conditions. The tools and techniques used to characterize hydrides and hydride morphologies and their impacts on material performance also vary. Such variations make site-to-site comparisons of test results and observations difficult. There is no consensus that a single material or system characteristic (e.g., reactor type, burnup, hydrogen content, end-of life stress, alloy type, drying temperature, etc.) is an effective predictor of material response during long term storage or of performance after long term storage. Multi-variable correlations made for one alloy may not represent the behavior of another alloy exposed to

  17. Public Health Genomics European Network: Report from the 2nd Network Meeting in Rome

    Nicole Rosenkötter

    2007-03-01


    Dear Sirs,

The Public Health Genomics European Network (PHGEN) is a mapping exercise for the responsible and effective integration of genome-based knowledge and technologies into public policy and health services for the benefit of population health. In 2005, the European Commission called for a “networking exercise…to lead to an inventory report on genetic determinants relevant for public health” [1]; this led to the funding of a three-year PHGEN project (EC project 2005313). The project started in early 2006 with a kick-off meeting in Bielefeld, Germany. The project work comprises, according to the public health triad, three one-year periods of assessment, policy development and assurance. At the end of the assessment phase, a network meeting was held in Rome from January 31st to February 2nd, 2007, with over 90 network members and network observers in attendance. The participants represented different organisations throughout the European Union with expertise in areas such as human genetics and other medical disciplines, epidemiology, public health, law, ethics, and political and social sciences. The aim of the meeting was to wrap up the last year's assessment period and to herald the policy development phase. The assessment period of PHGEN was characterised by several activities: - Contact and cooperation with other European and internationally funded networks and projects on public health genomics or related issues (e.g. EuroGenetest, EUnetHTA, Orphanet, IPTS, PHOEBE, GRaPHInt, P3G) - Identification of key experts in public health genomics in the European member states, applicant countries and EFTA/EEA countries from different disciplines (e.g. human genetics and other medical disciplines, public health, law, philosophy, epidemiology, and political and social sciences) - Building up national task forces on public health genomics in the above-mentioned countries - Establishing and working in three working groups: public health genomics

  18. PREFACE: 1st-2nd Young Researchers Meetings in Rome - Proceedings

    YRMR Organizing Committee; Cannuccia, E.; Mazzaferro, L.; Migliaccio, M.; Pietrobon, D.; Stellato, F.; Veneziani, M.

    2011-03-01

Students in science, particularly in physics, face a fascinating and challenging future. Scientists have proposed very interesting theories, which describe the microscopic and macroscopic world fairly well, trying to match the quantum regime with cosmological scales. Between the extremes of this scenario, biological phenomena in all their complexity take place, challenging the laws we observe in the atomic and sub-atomic world. More and more accurate and complex experiments have been devised, and these are now going to test the paradigms of physics. Notable experiments include: the Large Hadron Collider (LHC), which is going to shed light on the physics of the Standard Model of particles and its extensions; the Planck and Herschel satellites, which target a very precise measurement of the properties of our Universe; and the Free Electron Laser facilities, which produce high-brilliance, ultrafast X-ray pulses, allowing the investigation of the fundamental processes of solid state physics, chemistry, and biology. These projects are the result of huge collaborations spread across the world, involving scientists belonging to different and complementary research fields: physicists, chemists, biologists and others, keen to make the best of these extraordinary laboratories. Even though each branch of science is experiencing a process of growing specialization, it is very important to keep an eye on the global picture, remaining aware of the deep interconnections between inherent fields. This is even more crucial for students who are beginning their research careers. These considerations motivated PhD students and young post-docs connected to the Roman scientific research area to organize a conference, to establish the background and the network for interactions and collaborations. This resulted in the 1st and 2nd Young Researchers Meetings in Rome (http://ryrm.roma2.infn.it), one-day conferences aimed primarily at graduate students and post-docs working in physics in Italy

  19. Archaeometric study of glass beads from the 2nd century BC cemetery of Numantia

    García Heras, Manuel

    2003-06-01

Recent archaeological fieldwork undertaken in the Celtiberian cremation necropolis of Numantia (Soria, Spain) has provided a group of glass beads from the 2nd century BC. These glass beads were part, together with other metallic and ceramic items, of the offerings deposited with the dead. They are ring-shaped in typology and deep-blue, amber, or semi-transparent white in colour. This paper reports results derived from the chemical and microstructural characterization carried out on a representative sample set of this group of beads. The main goal of the research was to find out about their production technology and to explore their probable provenance. In addition, corrosion mechanisms were also assessed to determine the influence of cremation on the beads' structure. The resulting data suggest that the blue and amber beads were made using soda-lime silicate glass, whereas the semi-transparent white ones were manufactured from alumino-silicate glass. It was also determined that some transition metal oxides were used as chromophores, as well as lead oxide for decoration.


  20. A Cloud Platform-Oriented Software Configuration and Generation Technology

    蔡韵; 吴毅坚; 赵文耘

    2014-01-01

This paper covers the implementation of a cloud platform-oriented software configuration and generation technology, aimed at supporting software product line engineering in cloud computing environments. We use the approach of converting a PIM (platform-independent model) to a PSM (platform-specific model) and develop a corresponding configuration and generation tool to accurately define and describe the characteristics of a financial query information system. The widely used GAE (Google App Engine) and the domestic SAE (Sina App Engine) are taken as the main objects of study, and the software product line method is employed to generate and release deployable software projects on the cloud platform. Actual deployment results demonstrate that applying a software product line on a cloud platform can effectively simplify deployment and ease the burden of maintenance. The development approach studied in this paper, which uses a software product line to derive software products for a specific cloud platform, helps avoid duplicated development and improves the efficiency of software development.
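The PIM-to-PSM idea can be sketched minimally: one platform-independent description of an application is rendered into platform-specific deployment descriptors, one renderer per target cloud. The field names and templates below are illustrative assumptions, not the paper's tool and not the real GAE/SAE descriptor schemas:

```python
# Hypothetical platform-independent model (PIM) of a deployable app.
pim = {
    "name": "finance-query",
    "runtime": "python",
    "handlers": [("/", "main.app")],
}

def to_gae(pim):
    # One PSM renderer per platform; this one emits a YAML-like
    # descriptor loosely in the style of an App Engine config.
    lines = [f"application: {pim['name']}", f"runtime: {pim['runtime']}", "handlers:"]
    for url, script in pim["handlers"]:
        lines += [f"- url: {url}", f"  script: {script}"]
    return "\n".join(lines)

def to_sae(pim):
    # A second renderer with different descriptor conventions,
    # showing that only the template varies, not the PIM.
    lines = [f"name: {pim['name']}", f"runtime: {pim['runtime']}"]
    for url, script in pim["handlers"]:
        lines.append(f"handler: {url} -> {script}")
    return "\n".join(lines)

gae_descriptor = to_gae(pim)
sae_descriptor = to_sae(pim)
```

The product-line benefit is that a change to the application model propagates to every platform's descriptor through the renderers, instead of being edited by hand per platform.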

  1. An open-source software tool for the generation of relaxation time maps in magnetic resonance imaging

    In magnetic resonance (MR) imaging, T1, T2 and T2* relaxation times represent characteristic tissue properties that can be quantified with the help of specific imaging strategies. While there are basic software tools for specific pulse sequences, until now there is no universal software program available to automate pixel-wise mapping of relaxation times from various types of images or MR systems. Such a software program would allow researchers to test and compare new imaging strategies and thus would significantly facilitate research in the area of quantitative tissue characterization. After defining requirements for a universal MR mapping tool, a software program named MRmap was created using a high-level graphics language. Additional features include a manual registration tool for source images with motion artifacts and a tabular DICOM viewer to examine pulse sequence parameters. MRmap was successfully tested on three different computer platforms with image data from three different MR system manufacturers and five different sorts of pulse sequences: multi-image inversion recovery T1; Look-Locker/TOMROP T1; modified Look-Locker (MOLLI) T1; single-echo T2/T2*; and multi-echo T2/T2*. Computing times varied between 2 and 113 seconds. Estimates of relaxation times compared favorably to those obtained from non-automated curve fitting. Completed maps were exported in DICOM format and could be read in standard software packages used for analysis of clinical and research MR data. MRmap is a flexible cross-platform research tool that enables accurate mapping of relaxation times from various pulse sequences. The software allows researchers to optimize quantitative MR strategies in a manufacturer-independent fashion. The program and its source code were made available as open-source software on the internet
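As a rough illustration of what pixel-wise relaxation-time mapping involves (not MRmap's actual code), a mono-exponential T2 fit over a multi-echo image stack can be vectorised as a log-linear least-squares fit per pixel:

```python
import numpy as np

def fit_t2(stack, echo_times):
    """Pixel-wise mono-exponential fit S(TE) = S0 * exp(-TE / T2).

    stack: (n_echoes, H, W) array of multi-echo magnitude images.
    Taking the log linearises the model, so each pixel reduces to a
    simple line fit: log S = log S0 - TE / T2, with slope -1/T2.
    """
    te = np.asarray(echo_times, dtype=float)
    n, h, w = stack.shape
    logs = np.log(np.maximum(stack.reshape(n, -1), 1e-12))
    te_mean = te.mean()
    # Closed-form simple linear regression slope, vectorised over pixels.
    slope = ((te - te_mean)[:, None] * (logs - logs.mean(axis=0))).sum(axis=0) \
            / ((te - te_mean) ** 2).sum()
    t2 = -1.0 / slope
    return t2.reshape(h, w)

# Synthetic noiseless phantom with a known T2 of 50 ms everywhere.
tes = [10.0, 20.0, 40.0, 80.0]
phantom = np.stack([1000.0 * np.exp(-te / 50.0) * np.ones((4, 4)) for te in tes])
t2_map = fit_t2(phantom, tes)
```

A production tool like the one described would add per-sequence signal models (inversion recovery for T1, Look-Locker corrections, etc.), noise handling, and DICOM import/export on top of this core fitting step.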

  2. An open-source software tool for the generation of relaxation time maps in magnetic resonance imaging

    Kühne Titus

    2010-07-01

Background: In magnetic resonance (MR) imaging, T1, T2 and T2* relaxation times represent characteristic tissue properties that can be quantified with the help of specific imaging strategies. While there are basic software tools for specific pulse sequences, until now there is no universal software program available to automate pixel-wise mapping of relaxation times from various types of images or MR systems. Such a software program would allow researchers to test and compare new imaging strategies and thus would significantly facilitate research in the area of quantitative tissue characterization. Results: After defining requirements for a universal MR mapping tool, a software program named MRmap was created using a high-level graphics language. Additional features include a manual registration tool for source images with motion artifacts and a tabular DICOM viewer to examine pulse sequence parameters. MRmap was successfully tested on three different computer platforms with image data from three different MR system manufacturers and five different sorts of pulse sequences: multi-image inversion recovery T1; Look-Locker/TOMROP T1; modified Look-Locker (MOLLI) T1; single-echo T2/T2*; and multi-echo T2/T2*. Computing times varied between 2 and 113 seconds. Estimates of relaxation times compared favorably to those obtained from non-automated curve fitting. Completed maps were exported in DICOM format and could be read in standard software packages used for analysis of clinical and research MR data. Conclusions: MRmap is a flexible cross-platform research tool that enables accurate mapping of relaxation times from various pulse sequences. The software allows researchers to optimize quantitative MR strategies in a manufacturer-independent fashion. The program and its source code were made available as open-source software on the internet.

  3. MIAWARE Software

    Wilkowski, Bartlomiej; Pereira, Oscar N. M.; Dias, Paulo;

    2008-01-01

This article presents MIAWARE, a software for Medical Image Analysis With Automated Reporting Engine, which was designed and developed for doctor/radiologist assistance. It allows one to analyze an image stack from a computed axial tomography scan of the lungs (thorax) and, at the same time, to mark all pathologies on images and report their characteristics. The reporting process is normalized - radiologists cannot describe pathological changes with their own words, but can only use terms from a specific vocabulary set provided by the software. Consequently, a normalized radiological report is automatically generated. Furthermore, the MIAWARE software is accompanied by an intelligent search engine for medical reports, based on the relations between parts of the lungs. A logical structure of the lungs is introduced to the search algorithm through a specially developed ontology. As a result, a...

  4. 2nd Radio and Antenna Days of the Indian Ocean (RADIO 2014)

    2014-10-01

It was an honor and a great pleasure for all those involved in its organization to welcome the participants to the ''Radio and Antenna Days of the Indian Ocean'' (RADIO 2014) international conference that was held from 7th to 10th April 2014 at the Sugar Beach Resort, Wolmar, Flic-en-Flac, Mauritius. RADIO 2014 is the second of a series of conferences organized in the Indian Ocean region. The aim of the conference is to discuss recent developments, theories and practical applications covering the whole scope of radio-frequency engineering, including radio waves, antennas, propagation, and electromagnetic compatibility. The RADIO international conference emerged following discussions with engineers and scientists from the countries of the Indian Ocean as well as from other parts of the world, and a need was felt for the organization of such an event in this region. Following numerous requests, the Island of Mauritius, known worldwide for its white sandy beaches and pleasant tropical atmosphere, was again chosen for the organization of the 2nd RADIO international conference. The conference was organized by the Radio Society, Mauritius, and the Local Organizing Committee consisted of scientists from SUPELEC, France, the University of Mauritius, and the University of Technology, Mauritius. We would like to take this opportunity to thank all the people, institutions and companies that made the event such a success. We are grateful to our gold sponsors CST and FEKO as well as URSI for their generous support, which enabled us to partially support one PhD student and two scientists to attend the conference. We would also like to thank IEEE-APS and URSI for providing technical co-sponsorship. More than one hundred and thirty abstracts were submitted to the conference. They were peer-reviewed by an international scientific committee and, based on the reviews, either accepted, possibly after revision, or rejected.
RADIO 2014 brought together participants from twenty countries spanning

  5. Development of Hydrologic Characterization Technology of Fault Zones -- Phase I, 2nd Report

    Karasaki, Kenzi; Onishi, Tiemi; Black, Bill; Biraud, Sebastien

    2009-03-31

    This is the year-end report of the 2nd year of the NUMO-LBNL collaborative project: Development of Hydrologic Characterization Technology of Fault Zones under NUMO-DOE/LBNL collaboration agreement, the task description of which can be found in the Appendix 3. Literature survey of published information on the relationship between geologic and hydrologic characteristics of faults was conducted. The survey concluded that it may be possible to classify faults by indicators based on various geometric and geologic attributes that may indirectly relate to the hydrologic property of faults. Analysis of existing information on the Wildcat Fault and its surrounding geology was performed. The Wildcat Fault is thought to be a strike-slip fault with a thrust component that runs along the eastern boundary of the Lawrence Berkeley National Laboratory. It is believed to be part of the Hayward Fault system but is considered inactive. Three trenches were excavated at carefully selected locations mainly based on the information from the past investigative work inside the LBNL property. At least one fault was encountered in all three trenches. Detailed trench mapping was conducted by CRIEPI (Central Research Institute for Electric Power Industries) and LBNL scientists. Some intriguing and puzzling discoveries were made that may contradict with the published work in the past. Predictions are made regarding the hydrologic property of the Wildcat Fault based on the analysis of fault structure. Preliminary conceptual models of the Wildcat Fault were proposed. The Wildcat Fault appears to have multiple splays and some low angled faults may be part of the flower structure. In parallel, surface geophysical investigations were conducted using electrical resistivity survey and seismic reflection profiling along three lines on the north and south of the LBNL site. Because of the steep terrain, it was difficult to find optimum locations for survey lines as it is desirable for them to be as

  6. Development of Hydrologic Characterization Technology of Fault Zones: Phase I, 2nd Report

    This is the year-end report of the 2nd year of the NUMO-LBNL collaborative project: Development of Hydrologic Characterization Technology of Fault Zones under NUMO-DOE/LBNL collaboration agreement, the task description of which can be found in the Appendix 3. Literature survey of published information on the relationship between geologic and hydrologic characteristics of faults was conducted. The survey concluded that it may be possible to classify faults by indicators based on various geometric and geologic attributes that may indirectly relate to the hydrologic property of faults. Analysis of existing information on the Wildcat Fault and its surrounding geology was performed. The Wildcat Fault is thought to be a strike-slip fault with a thrust component that runs along the eastern boundary of the Lawrence Berkeley National Laboratory. It is believed to be part of the Hayward Fault system but is considered inactive. Three trenches were excavated at carefully selected locations mainly based on the information from the past investigative work inside the LBNL property. At least one fault was encountered in all three trenches. Detailed trench mapping was conducted by CRIEPI (Central Research Institute for Electric Power Industries) and LBNL scientists. Some intriguing and puzzling discoveries were made that may contradict with the published work in the past. Predictions are made regarding the hydrologic property of the Wildcat Fault based on the analysis of fault structure. Preliminary conceptual models of the Wildcat Fault were proposed. The Wildcat Fault appears to have multiple splays and some low angled faults may be part of the flower structure. In parallel, surface geophysical investigations were conducted using electrical resistivity survey and seismic reflection profiling along three lines on the north and south of the LBNL site. Because of the steep terrain, it was difficult to find optimum locations for survey lines as it is desirable for them to be as

  7. Mapping and industrial IT project to a 2nd semester design-build project

    Nyborg, Mads; Høgh, Stig

    2010-01-01

system. A simple teaching model for software engineering is presented which combines technical disciplines with disciplines from sections 2-4 in the CDIO syllabus. The implementation of a joint project involving several courses supports the CDIO perspective. Already the traditional IT-diploma education… study. A successful implementation at this level requires careful planning of activities through the semester. Principles of the CDIO have been of great help in this regard. Finally, we draw conclusions and give our recommendations based on those.

  8. IMPACTS OF EUROPEAN BIOFUEL POLICIES ON AGRICULTURAL MARKETS AND ENVIRONMENT UNDER CONSIDERATION OF 2ND GENERATION TECHNOLOGIES AND INTERNATIONAL TRADE

    Becker, A.; ADENÄUER Marcel; Blanco Fonseca, Maria

    2010-01-01

    Even though recent discussions on food prices and indirect land use change point to potential conflicts associated with the production of biofuels, the appraisal of biofuels as an effective instrument to slow down climate change and reduce energy dependency still prevails. The EU Renewable Energy Directive (EUROPEAN COMMISSION, 2009) underlines this trend by setting a target of a 10% share of energy from renewable sources in the transport sector by 2020. As economic competitiveness of biofuel pr...

  9. 1st or 2nd generation bioethanol-impacts of technology integration & on feed production and land use

    Bentsen, Niclas Scott; Felby, Claus

    2009-01-01

    production comparable to gasoline production in terms of energy loss. Utilisation of biomass in the energy sector is inevitably linked to the utilisation of land. This is a key difference between fossil and bio based energy systems. Thus evaluations of bioethanol production based on energy balances alone are...

  10. A 2nd generation static model for predicting greenhouse energy inputs, as an aid for production planning

    Jolliet, O; Munday, G L

    1985-01-01

    A model which allows accurate prediction of the energy consumption of a greenhouse is a useful tool for production planning and the optimisation of greenhouse components. To date, two types of model have been developed: very simple models of low precision, and precise dynamic models that are unsuitable for use over long periods and too complex for practical employment. A theoretical study and measurements at the CERN trial greenhouse have allowed the development of a new static model named "HORTICERN", easy to use and as precise as more complex dynamic models. This paper demonstrates the potential of this model for long-term production planning. The model gives precise predictions of energy consumption for given greenhouse conditions of use (inside temperatures, dehumidification by ventilation, …) and takes into account local climatic conditions (wind, radiative losses to the sky and solar gains) and the type of greenhouse (cladding, thermal screen …). The HORTICERN method has been developed for PC use and requires less...
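    The static heat-balance idea behind such a model can be sketched in a few lines. This is an illustrative sketch only, not the published HORTICERN model; the function name, the screen factor, and the example U-value are assumptions chosen for the demonstration.

```python
# Hedged sketch of a static greenhouse energy-balance estimate in the spirit of
# a "2nd generation static model" (illustrative; not the HORTICERN equations).
def heating_demand_kwh(u_value, area_m2, t_inside, t_outside, hours,
                       solar_gain_kwh=0.0, screen_factor=1.0):
    """Static estimate: conductive/convective losses minus passive solar gains.

    u_value       overall heat-loss coefficient of the cladding [W/m2K] (assumed)
    screen_factor multiplier < 1 when a thermal screen is deployed (assumed)
    """
    loss_kwh = u_value * area_m2 * (t_inside - t_outside) * hours * screen_factor / 1000.0
    return max(loss_kwh - solar_gain_kwh, 0.0)

# Example: 1000 m2 single-glass house (U ~ 6 W/m2K), 18 C inside, 3 C outside,
# over 24 h, with a night screen approximated by an average factor of 0.8.
demand = heating_demand_kwh(6.0, 1000.0, 18.0, 3.0, 24.0,
                            solar_gain_kwh=300.0, screen_factor=0.8)
```

    A dynamic model would instead integrate these balances over sub-hourly climate data; the appeal of the static form is that a single evaluation per planning period suffices.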

  11. FACSGen 2.0 animation software: generating three-dimensional FACS-valid facial expressions for emotion research.

    Krumhuber, Eva G; Tamarit, Lucas; Roesch, Etienne B; Scherer, Klaus R

    2012-04-01

    In this article, we present FACSGen 2.0, new animation software for creating static and dynamic three-dimensional facial expressions on the basis of the Facial Action Coding System (FACS). FACSGen permits total control over the action units (AUs), which can be animated at all levels of intensity and applied alone or in combination to an infinite number of faces. In two studies, we tested the validity of the software for the AU appearance defined in the FACS manual and the conveyed emotionality of FACSGen expressions. In Experiment 1, four FACS-certified coders evaluated the complete set of 35 single AUs and 54 AU combinations for AU presence or absence, appearance quality, intensity, and asymmetry. In Experiment 2, lay participants performed a recognition task on emotional expressions created with FACSGen software and rated the similarity of expressions displayed by human and FACSGen faces. Results showed good to excellent classification levels for all AUs by the four FACS coders, suggesting that the AUs are valid exemplars of FACS specifications. Lay participants' recognition rates for nine emotions were high, and comparisons of human and FACSGen expressions were very similar. The findings demonstrate the effectiveness of the software in producing reliable and emotionally valid expressions, and suggest its application in numerous scientific areas, including perception, emotion, and clinical and neuroscience research. PMID:22251045

  12. Using semi-automated photogrammetry software to generate 3D surfaces from oblique and vertical photographs at Mount St. Helens, WA

    Schilling, S.; Diefenbach, A. K.

    2012-12-01

    Photogrammetry has been used to generate contours and Digital Elevation Models (DEMs) to monitor change at Mount St. Helens, WA since the 1980 eruption. We continue to improve techniques to monitor topographic changes within the crater. During the 2004-2008 eruption, 26 DEMs were used to track volume and rates of growth of a lava dome and changes of Crater Glacier. These measurements constrained seismogenic extrusion models and were compared with geodetic deflation volume to constrain magma chamber behavior. We used photogrammetric software to collect irregularly spaced 3D points primarily by hand and, in reasonably flat areas, by automated algorithms, from commercial vertical aerial photographs. These models took days to months to complete and the areal extent of each surface was determined by visual inspection. Later in the eruption, we pioneered the use of different software to generate irregularly spaced 3D points manually from oblique images captured by a hand-held digital camera. In each case, the irregularly spaced points and intervening interpolated points formed regular arrays of cells or DEMs. Calculations using DEMs produced from the hand-held images duplicated volumetric and rate results gleaned from the vertical aerial photographs. This manual point capture technique from oblique hand-held photographs required only a few hours to generate a model over a focused area such as the lava dome, but would have taken perhaps days to capture data over the entire crater. Here, we present results from new photogrammetric software that uses robust image-matching algorithms to produce 3D surfaces automatically after inner, relative, and absolute orientations between overlapping photographs are completed. Measurements using scans of vertical aerial photographs taken August 10, 2005 produced dome volume estimates within two percent of those from a surface generated using the vertical aerial photograph manual method. 
The new August 10th orientations took less than 8
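    The volume calculation behind these DEM comparisons can be illustrated simply: difference two co-registered elevation grids and multiply by the cell area. This is a generic sketch, not the USGS toolchain; the function and grid values are assumptions.

```python
import numpy as np

# Illustrative DEM differencing (not the software used at Mount St. Helens):
# volume change is the sum of elevation differences times the cell area.
def volume_change(dem_before, dem_after, cell_size_m):
    """Sum elevation differences over the grid; NaN (no-data) cells are ignored."""
    diff = np.asarray(dem_after, float) - np.asarray(dem_before, float)
    return np.nansum(diff) * cell_size_m ** 2  # m^3

before = np.zeros((3, 3))
after = np.full((3, 3), 2.0)                      # uniform 2 m of dome growth
dv = volume_change(before, after, cell_size_m=10.0)  # 9 cells x 2 m x 100 m^2
```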

  13. Noise Characteristics of 64-channel 2nd-order DROS Gradiometer System inside a Poorly Magnetically-shielded Room

    Kim, J. M.; Lee, Y. H.; Yu, K. K.; Kim, K.; Kwon, H.; Park, Y. K. [Biosignal Research Center, Korea Research Institute of Standards and Science, Daejeon (Korea, Republic of); Sasada, Ichiro [Dept. of Applied Science for Electronics and Materials, Kyushu University, Fukuoka (Japan)

    2006-10-15

    We have developed a second-order double relaxation oscillation SQUID (DROS) gradiometer with a baseline of 35 mm, and constructed a poorly magnetically-shielded room (MSR) with an aluminum layer and permalloy layers for magnetocardiography (MCG). The 2nd-order DROS gradiometer has a noise level of 20 fT/Hz at 1 Hz and 8 fT/Hz at 200 Hz inside a heavily-shielded MSR with a shielding factor of 10^3 at 1 Hz and 10^4 - 10^5 at 100 Hz. The poorly-shielded MSR, built of a 12-mm-thick aluminum layer and 4-6 permalloy layers of 0.35 mm thickness, is 2.4 m x 2.4 m x 2.4 m in size, and has a shielding factor of 40 at 1 Hz and 10^4 at 100 Hz. Our 64-channel second-order gradiometer MCG system consists of 64 2nd-order DROS gradiometers, flux-locked loop electronics, and analog signal processors. With the 2nd-order DROS gradiometers and flux-locked loop electronics installed inside the poorly-shielded MSR, and with the analog signal processors installed outside it, the noise level was measured to be 20 fT/Hz at 1 Hz and 8 fT/Hz at 200 Hz on average, even with the MSR door open. This noise level is low enough to obtain a human MCG at the same quality as that measured in the heavily-shielded MSR. However, filters or active shielding is needed for a clear MCG when there is large low-frequency noise from heavy air conditioning or large AC power consumption near the poorly-shielded MSR.

  14. Software-only IR image generation and reticle simulation for the HWIL testing of a single detector frequency modulated reticle seeker

    Delport, Jan Peet; le Roux, Francois P. J.; du Plooy, Matthys J. U.; Theron, Hendrik J.; Annamalai, Leeandran

    2004-08-01

    Hardware-in-the-Loop (HWIL) testing of seeker systems usually requires a 5-axis flight motion simulator (FMS) coupled to expensive hardware for infrared (IR) scene generation and projection. Similar tests can be conducted by using a 3-axis flight motion simulator, bypassing the seeker optics and injecting a synthetically calculated detector signal directly into the seeker. The constantly increasing speed and memory bandwidth of high-end personal computers make them attractive software rendering platforms. A software OpenGL pipeline provides flexibility in terms of access to the rendered output, colour channel dynamic range and lighting equations. This paper describes how a system was constructed using personal computer hardware to perform closed tracking loop HWIL testing of a single detector frequency modulated reticle seeker. The main parts of the system that are described include: * The software-only implementation of OpenGL used to render the IR image with floating point accuracy directly to system memory. * The software used to inject the detector signal and extract the seeker look position. * The architecture used to control the flight motion simulator.
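    The "synthetically calculated detector signal" of a spin-scan FM reticle seeker can be sketched with the textbook model: a carrier at the reticle chop frequency, frequency-modulated once per reticle spin, with deviation proportional to the target's radial offset and phase encoding its bearing. This is a generic illustration under those assumptions, not the signal model of the seeker in the paper; all parameter values are invented for the example.

```python
import numpy as np

# Textbook spin-scan FM reticle model (illustrative; not the tested seeker):
# s(t) = cos(2*pi*f_chop*t + beta*sin(2*pi*f_spin*t + bearing)),
# where beta grows with the target's radial offset from boresight.
def fm_reticle_signal(t, f_chop, f_spin, radial_offset, bearing_rad, beta_max=5.0):
    beta = beta_max * radial_offset            # FM index: larger off-axis error
    phase = 2 * np.pi * f_chop * t + beta * np.sin(2 * np.pi * f_spin * t + bearing_rad)
    return np.cos(phase)

fs = 100_000.0                                 # assumed injection sample rate [Hz]
t = np.arange(0.0, 0.1, 1.0 / fs)
sig = fm_reticle_signal(t, f_chop=2000.0, f_spin=100.0,
                        radial_offset=0.5, bearing_rad=0.0)
```

    In a HWIL setup of the kind described, such a buffer would be computed from the rendered IR image each frame and written to the injection hardware in place of the real detector output.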

  15. Virtual Visit to the ATLAS Control Room by 2nd High School of Eleftherio–Kordelio in Thessaloniki

    2013-01-01

    Our school is the 2nd High School of Eleftherio-Kordelio. It is located in the western suburbs of Thessaloniki in Greece, and our students are between 15 and 17 years old. Thessaloniki is the second largest city in Greece, with a port that plays a major role in trade in the southern Balkans. Recently our students have heard a great deal about CERN and the great discoveries which have taken place there, and they are really keen on visiting and learning many things about it.

  16. Order and disorder in Ca2ND0.90H0.10-A structural and thermal study

    The structure of calcium nitride hydride and its deuterated form has been re-examined at room temperature and studied at high temperature using neutron powder diffraction and thermal analysis. When synthesised at 600 deg. C, a mixture of both ordered and disordered Ca2ND0.90H0.10 phases results. The disordered phase is the minor component and has a primitive rocksalt structure (space group Fm-3m) with no ordering of D/N on the anion sites, while the ordered phase is best described using the rhombohedral space group R-3m, with D and N arranged in alternate layers in (111) planes. This mixture of ordered and disordered phases exists up to 580 deg. C, at which point the loss of deuterium yields Ca2ND0.85 with the disappearance of the disordered phase. In the new ordered phase there exists a similar content of vacancies on both anion sites; to achieve this balance, a little N transfers onto the D site, whereas there is no indication of D transferring onto the N sites. These observations are thought to indicate that D/N ordering is difficult to achieve with fully occupied anion sites. It has previously been reported that Ca2ND has an ordered cubic cell with alternating D and N sites in the [100] directions; however, for the samples studied herein, there were clearly two coexisting phases with apparent broadening/splitting of the primitive peaks but not of the ordered peaks. The rhombohedral phase was in fact metrically cubic; however, all the observed peaks were consistent with the rhombohedral unit cell, with no peaks requiring the larger ordered cubic unit cell. Furthermore, this rhombohedral cell displays the same form of N-D ordering as the Sr and Ba analogues, which are metrically rhombohedral. - Graphical abstract: Ca2ND0.90H0.10 forms a mixture of ordered and disordered phases when synthesised at 600 deg. C. The ordered phase disappears at high temperature upon release of structural deuterium/hydrogen, leaving a single, partially disordered phase.

  17. Numerical Study of Entropy Generation in the Flameless Oxidation Using Large Eddy Simulation Model and OpenFOAM Software

    Mousavi, Seyed Mahmood

    2014-01-01

    In this paper, a large eddy simulation model implemented in OpenFOAM software is applied to a 3D investigation of non-premixed flameless oxidation. In this context, the finite volume discrete ordinate model and a partially stirred reactor model are applied to model radiation and combustion, respectively, and the full GRI-2.11 mechanism is used to represent the chemical reactions precisely. The flow field is discretized using the finite volume method, and the PISO algorithm couples the pressure and velocity field...

  18. The MeqTrees software system and its use for third-generation calibration of radio interferometers

    Noordam, Jan E.; Smirnov, Oleg M.

    2011-01-01

    The formulation of the radio interferometer measurement equation (RIME) by Hamaker et al. has provided us with an elegant mathematical apparatus for better understanding, simulation and calibration of existing and future instruments. The calibration of the new radio telescopes (LOFAR, SKA) would be unthinkable without the RIME formalism, and new software to exploit it. MeqTrees is designed to implement numerical models such as the RIME, and to solve for arbitrary subsets of their parameters. ...

  19. FOREWORD: 2nd International Workshop on New Computational Methods for Inverse Problems (NCMIP 2012)

    Blanc-Féraud, Laure; Joubert, Pierre-Yves

    2012-09-01

    Conference logo This volume of Journal of Physics: Conference Series is dedicated to the scientific contributions presented during the 2nd International Workshop on New Computational Methods for Inverse Problems, (NCMIP 2012). This workshop took place at Ecole Normale Supérieure de Cachan, in Cachan, France, on 15 May 2012, at the initiative of Institut Farman. The first edition of NCMIP also took place in Cachan, France, within the scope of the ValueTools Conference, in May 2011 (http://www.ncmip.org/2011/). The NCMIP Workshop focused on recent advances in the resolution of inverse problems. Indeed inverse problems appear in numerous scientific areas such as geophysics, biological and medical imaging, material and structure characterization, electrical, mechanical and civil engineering, and finance. The resolution of inverse problems consists of estimating the parameters of the observed system or structure from data collected by an instrumental sensing or imaging device. Its success firstly requires the collection of relevant observation data. It also requires accurate models describing the physical interactions between the instrumental device and the observed system, as well as the intrinsic properties of the solution itself. Finally, it requires the design of robust, accurate and efficient inversion algorithms. Advanced sensor arrays and imaging devices provide high rate and high volume data; in this context, the efficient resolution of the inverse problem requires the joint development of new models and inversion methods, taking computational and implementation aspects into account. During this one-day workshop, researchers had the opportunity to bring to light and share new techniques and results in the field of inverse problems. 
The topics of the workshop were: algorithms and computational aspects of inversion, Bayesian estimation, kernel methods, learning methods, convex optimization, free discontinuity problems, metamodels, proper orthogonal decomposition
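    The inversion task the workshop revolves around, estimating system parameters from instrument data, can be made concrete with the simplest regularized formulation. The following is a minimal illustrative sketch of Tikhonov-regularized least squares, not drawn from any NCMIP contribution; the problem sizes and regularization weight are arbitrary assumptions.

```python
import numpy as np

# Linear inverse problem y = A x + noise, solved by Tikhonov regularization:
# x_hat = argmin ||A x - y||^2 + lam ||x||^2  =>  (A^T A + lam I) x_hat = A^T y.
def tikhonov(A, y, lam):
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ y)

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 10))          # assumed forward model
x_true = rng.standard_normal(10)           # parameters to recover
y = A @ x_true + 0.01 * rng.standard_normal(50)  # noisy observations
x_hat = tikhonov(A, y, lam=1e-3)
err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
```

    The regularization term is what buys robustness: for ill-conditioned forward models, lam trades a small bias for a large reduction in noise amplification.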

  20. Brain order disorder 2nd group report of f-EEG

    Lalonde, Francois; Gogtay, Nitin; Giedd, Jay; Vydelingum, Nadarajen; Brown, David; Tran, Binh Q.; Hsu, Charles; Hsu, Ming-Kai; Cha, Jae; Jenkins, Jeffrey; Ma, Lien; Willey, Jefferson; Wu, Jerry; Oh, Kenneth; Landa, Joseph; Lin, C. T.; Jung, T. P.; Makeig, Scott; Morabito, Carlo Francesco; Moon, Qyu; Yamakawa, Takeshi; Lee, Soo-Young; Lee, Jong-Hwan; Szu, Harold H.; Kaur, Balvinder; Byrd, Kenneth; Dang, Karen; Krzywicki, Alan; Familoni, Babajide O.; Larson, Louis; Harkrider, Susan; Krapels, Keith A.; Dai, Liyi

    2014-05-01

    Since the Brain Order Disorder (BOD) group reported on a high-density electroencephalogram (EEG) to capture neuronal information using EEG wirelessly interfaced with a smartphone [1,2], a larger BOD group has been assembled, including the Obama BRAIN program, the CUA Brain Computer Interface Lab and the UCSD Swartz Center for Computational Neuroscience. We can implement the pair-electrode correlation functions in order to operate in a real-time daily environment, which is of computational complexity O(N^3) for N = 10^2-10^3, known as functional f-EEG. The daily monitoring requires two areas of focus. Area #1: quantify the neuronal information flow under arbitrary daily stimuli-response sources. Approach to #1: (i) We have asserted that the sources contained in the EEG signals may be discovered by an unsupervised learning neural network called blind source separation (BSS) of independent entropy components, based on the irreversible Boltzmann cellular thermodynamics (ΔS ≥ 0) of the brain; at constant temperature, we can solve for the minimum of the Helmholtz free energy (H = E - TS) by computing BSS, and then their pairwise-entropy source correlation function. (ii) Although the entropy itself is not the information per se, the concurrence of the entropy sources is the information flow as a functional-EEG, sketched in this 2nd BOD report. Area #2: applying EEG bio-feedback will improve collective decision making (TBD). Approach to #2: We introduce a novel performance quality metric, in terms of the throughput rate of faster (Δt) and more accurate (ΔA) decision making, which applies to individual as well as team brain dynamics. Following Nobel Laureate Daniel Kahneman's book "Thinking, Fast and Slow", through brainwave biofeedback we can first identify an individual's "anchored cognitive bias sources". This is done in order to remove the biases by means of individually tailored pre-processing. Then the training effectiveness can be maximized by the collective product Δt *
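    The pair-electrode correlation function at the core of the f-EEG idea can be sketched generically: for N channels, compute all pairwise correlations of the recorded time series. This is an illustrative stand-in (plain Pearson correlation on synthetic data), not the entropy-based measure of the report.

```python
import numpy as np

# Hedged sketch: pairwise correlation matrix of N EEG channels, the kind of
# pair-electrode correlation function the report describes (illustrative only;
# the report uses entropy-source correlations, not plain Pearson correlation).
def pairwise_corr(eeg):
    """eeg: array of shape (N_channels, T_samples) -> (N, N) correlation matrix."""
    x = eeg - eeg.mean(axis=1, keepdims=True)          # remove per-channel mean
    x /= np.linalg.norm(x, axis=1, keepdims=True)      # unit-normalize channels
    return x @ x.T                                     # all N*(N-1)/2 pairs at once

rng = np.random.default_rng(1)
sig = rng.standard_normal((4, 1000))                   # 4 synthetic channels
C = pairwise_corr(sig)
```

    For real-time use with N in the hundreds, the matrix product above is the cheap part; the cited O(N^3) cost comes from the source-separation step performed before the pairwise stage.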

  1. RECONSTRUCTING THE IDEA OF PRAYER SPACE: A CRITICAL ANALYSIS OF THE TEMPORARY PRAYING PLATFORM PROJECT OF 2ND YEAR ARCHITECTURE STUDENTS IN THE NATIONAL UNIVERSITY OF MALAYSIA (UKM

    Nangkula Utaberta

    2014-02-01

    Abstract: God created human as caliph on this earth. Caliph means leader, caretaker and guardian. Therefore humans have an obligation to maintain, preserve and conserve nature for future generations. Today we see a lot of damage to the earth caused by human behavior. Islam sees the whole of nature as a place of prayer whose cleanliness and purity must be maintained. Therefore, as Muslims, we need to preserve nature as we keep our places of prayer. The main objective of this paper is to re-question and re-interpret the idea of sustainability in Islamic Architecture through a critical analysis of the first project of 2nd year architecture students of UKM, the "Temporary Praying Platform". The discussion is divided into three (3) main parts. The first part discusses contemporary issues in Islamic Architecture, especially in the design of mosques, while the second part expands the framework of sustainability in Islamic Architecture. The last part analyses a sample of design submissions by 2nd year students of UKM for the temporary praying platform project. It is expected that this paper can start a further discussion on the inner meaning in Islam and how it is implemented in the design of praying spaces in the future. Keywords: Sustainability, Islamic Architecture, Temporary Praying Platform. Abstrak: God created humans as caliphs on this earth. Caliph means leader, guardian and custodian. Humans therefore have an obligation to maintain, protect and preserve this natural world for future generations. Today we have seen much damage occurring on the earth, caused by the behaviour of the very humans appointed as its caliphs. Islam sees the whole of nature as a place of worship whose cleanliness and purity must be maintained; therefore, as Muslims, we must preserve nature as we care for our places of worship. The aim

  2. Tools for software visualization

    Stojanova, Aleksandra; Stojkovic, Natasa; Bikov, Dusan

    2015-01-01

    Software visualization is a kind of computer art and, at the same time, a science of generating visual representations of different software aspects and of the software development process. There are many tools for software visualization, but we focus on some of them. In this paper, four tools are examined in detail: Jeliot 3, SRec, jGrasp and DDD. The visualizations they produce are reviewed and analyzed, and possible areas for their application are mentioned. A...

  3. Comparison of elution efficiency of 99Mo/99mTc generator using theoretical and a free web based software method

    Full text: A generator is constructed on the principle of the decay-growth relationship between a long-lived parent radionuclide and a short-lived daughter radionuclide. The difference in chemical properties of the daughter and parent radionuclides allows efficient separation of the two. Aim and Objectives: The present study was designed to calculate the elution efficiency of the generator using the traditional formula-based method and a free web-based software method. Materials and Methods: A 99Mo/99mTc MON.TEK (Monrol, Gebze) generator, a sterile 0.9% NaCl vial and a vacuum vial in a lead shield were used for the elution. A new 99Mo/99mTc generator (calibrated activity 30 GBq), calibrated for Thursday, was received on Monday morning in our department. The generator was placed behind lead bricks in a fume hood. The rubber plugs of both the vacuum and 0.9% NaCl vials were wiped with 70% isopropyl alcohol swabs. The vacuum vial, placed inside the lead shield, was inserted in the vacuum position; simultaneously the 10 ml NaCl vial was inserted in the second slot. After 1-2 min the vacuum vial was removed without moving the emptied 0.9% NaCl vial. The vacuum slot was covered with another sterile vial to maintain sterility. The RAC (radioactive concentration) was measured in a calibrated dose calibrator (Capintec CRC-15). The elution efficiency was calculated theoretically and using free web-based software (built on the Apache web server, www.apache.org, and PHP, www.php.net) hosted on the website of the Italian Association of Nuclear Medicine and Molecular Imaging (www.aimn.it). Results: The mean elution efficiency calculated by the theoretical method was 93.95% ± 0.61. The mean elution efficiency calculated by the software was 92.85% ± 0.89. There was no statistically significant difference between the two methods. Conclusion: The free web-based software provides precise and reproducible results and thus saves time and mathematical calculation steps.
This enables a rational use of available activity and also enabling a selection of the type and number of procedures to
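    The "theoretical" method referred to above rests on the decay-growth (Bateman) relation for the 99Mo/99mTc pair. The sketch below is a generic illustration of that calculation, not the formula or software used in the study; the half-lives and the 87.5% branching ratio are the commonly tabulated values, and the 94% recovery in the example is invented.

```python
import math

# Decay-growth (Bateman) estimate of the 99mTc activity available in the
# generator column, and the elution efficiency relative to it (illustrative).
T_HALF_MO99_H = 65.94    # h, tabulated value (assumption, not from the abstract)
T_HALF_TC99M_H = 6.01    # h, tabulated value
BRANCH = 0.875           # fraction of 99Mo decays feeding 99mTc

def tc99m_available(a_mo_at_t0, hours_since_last_elution):
    lam_mo = math.log(2) / T_HALF_MO99_H
    lam_tc = math.log(2) / T_HALF_TC99M_H
    growth = lam_tc / (lam_tc - lam_mo) * (
        math.exp(-lam_mo * hours_since_last_elution)
        - math.exp(-lam_tc * hours_since_last_elution))
    return BRANCH * a_mo_at_t0 * growth

def elution_efficiency(eluted_activity, a_mo_at_t0, hours):
    return 100.0 * eluted_activity / tc99m_available(a_mo_at_t0, hours)

# Example: 30 GBq generator eluted 24 h after the previous elution,
# with an assumed 94% of the grown-in activity actually recovered.
avail = tc99m_available(30.0, 24.0)                  # GBq of 99mTc in the column
eff = elution_efficiency(0.94 * avail, 30.0, 24.0)   # -> 94.0 %
```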

  4. Proceedings of the 2nd JAERI symposium on HTGR technologies October 21 ∼ 23, 1992, Oarai, Japan

    The Japan Atomic Energy Research Institute (JAERI) held the 2nd JAERI Symposium on HTGR Technologies on October 21-23, 1992, at the Oarai Park Hotel in Oarai-machi, Ibaraki-ken, Japan, with the support of the International Atomic Energy Agency (IAEA), the Science and Technology Agency of Japan and the Atomic Energy Society of Japan, on the occasion that construction of the High Temperature Engineering Test Reactor (HTTR), the first high temperature gas-cooled reactor (HTGR) in Japan, was proceeding smoothly. In this symposium, the worldwide status of research and development (R and D) of HTGRs and the future perspectives of HTGR development were discussed in 47 papers, including 3 invited lectures, focusing on the present status of HTGR projects and perspectives of HTGR development, safety, operation experience, fuel, and heat utilization. A panel discussion was also organized on how HTGRs can contribute to the preservation of the global environment. About 280 participants attended the symposium from Japan, Bangladesh, Germany, France, Indonesia, the People's Republic of China, Poland, Russia, Switzerland, the United Kingdom, the United States of America, Venezuela and the IAEA. This volume was edited as the proceedings of the 2nd JAERI Symposium on HTGR Technologies, collecting the 47 papers presented in the oral and poster sessions along with 11 panel exhibitions on the results of research and development associated with the HTTR. (author)

  5. Surface-emitting quantum cascade laser with 2nd-order metal-semiconductor gratings for single-lobe emission

    Boyle, C.; Sigler, C.; Kirch, J. D.; Lindberg, D.; Earles, T.; Botez, D.; Mawst, L. J.

    2016-03-01

    Grating-coupled, surface-emitting (GCSE) quantum-cascade lasers (QCLs) are demonstrated with high-power, single-lobe surface emission. A 2nd-order Au-semiconductor distributed-feedback (DFB)/ distributed-Bragg-reflector (DBR) grating is used for feedback and out-coupling. The DFB and DBR grating regions are 2.55 mm- and 1.28 mm-long, respectively, for a total grating length of 5.1 mm. The lasers are designed to operate in a symmetric longitudinal mode by causing resonant coupling of the guided optical mode to the antisymmetric surface-plasmon modes of the 2nd-order metal/semiconductor grating. In turn, the antisymmetric longitudinal modes are strongly absorbed by the metal in the grating, causing the symmetric longitudinal mode to be favored to lase, which produces a single lobe beam over a grating duty-cycle range of 36-41 %. Simulations indicate that the symmetric mode is always favored to lase, independent of the random phase of residual reflections from the device's cleaved ends. Peak pulsed output powers of ~ 0.4 W were measured with single-lobe, single-mode operation near 4.75 μm.

  6. Synthetic CO, H2 and HI surveys of the Galactic 2nd Quadrant, and the properties of molecular gas

    Duarte-Cabral, A; Dobbs, C L; Mottram, J C; Gibson, S J; Brunt, C M; Douglas, K A

    2014-01-01

    We present CO, H2, HI and HISA distributions from a set of simulations of grand design spirals including stellar feedback, self-gravity, heating and cooling. We replicate the emission of the 2nd Galactic Quadrant by placing the observer inside the modelled galaxies and post process the simulations using a radiative transfer code, so as to create synthetic observations. We compare the synthetic datacubes to observations of the 2nd Quadrant of the Milky Way to test the ability of the current models to reproduce the basic chemistry of the Galactic ISM, as well as to test how sensitive such galaxy models are to different recipes of chemistry and/or feedback. We find that models which include feedback and self-gravity can reproduce the production of CO with respect to H2 as observed in our Galaxy, as well as the distribution of the material perpendicular to the Galactic plane. While changes in the chemistry/feedback recipes do not have a huge impact on the statistical properties of the chemistry in the simulated g...

  7. Numerical stability of 2nd order Runge-Kutta integration algorithms for use in particle-in-cell codes

    An essential ingredient of particle-in-cell (PIC) codes is a numerically accurate and stable integration scheme for the particle equations of motion. Such a scheme is the well known time-centered leapfrog (LF) method accurate to 2nd order with respect to the timestep Δt. However, this scheme can only be used for forces independent of velocity unless a simple enough implicit implementation is possible. The LF scheme is therefore inapplicable in Monte-Carlo treatments of particle collisions and/or interactions with radio-frequency fields. We examine here the suitability of the 2nd order Runge-Kutta (RK) method. We find that the basic RK scheme is numerically unstable, but that conditional stability can be attained by an implementation which preserves phase space area. Examples are presented to illustrate the performance of the RK schemes. We compare analytic and computed electron orbits in a traveling nonlinear wave and also show self-consistent PIC simulations describing plasma flow in the vicinity of a lower hybrid antenna. (author)
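    The stability contrast described above can be demonstrated on the standard test problem for particle pushers, the harmonic oscillator x'' = -x. The sketch below is an illustrative comparison, not code from the paper: a time-centered (kick-drift-kick) leapfrog stays bounded for Δt < 2, while the basic midpoint RK2 scheme, whose amplification factor satisfies |G|² = 1 + Δt⁴/4, exhibits secular energy growth.

```python
import math

# Illustrative comparison (not from the paper): time-centered leapfrog vs the
# basic 2nd-order Runge-Kutta (midpoint) push for x'' = -x.
def leapfrog(x, v, dt, steps):
    v -= 0.5 * dt * x                 # half-step kick to stagger v
    for _ in range(steps):
        x += dt * v                   # drift with half-step velocity
        v -= dt * x                   # full kick with updated position
    v += 0.5 * dt * x                 # re-synchronize v with x
    return x, v

def rk2(x, v, dt, steps):
    for _ in range(steps):
        xm = x + 0.5 * dt * v         # midpoint estimates
        vm = v - 0.5 * dt * x
        x += dt * vm
        v -= dt * xm
    return x, v

steps, dt = 2000, 0.5                 # dt well inside leapfrog's stability limit
xl, vl = leapfrog(1.0, 0.0, dt, steps)
xr, vr = rk2(1.0, 0.0, dt, steps)
el = abs(xl * xl + vl * vl - 1.0)     # leapfrog: bounded energy error
er = abs(xr * xr + vr * vr - 1.0)     # basic RK2: secular energy growth
```

    The paper's remedy, an area-preserving implementation of RK2, removes exactly this secular growth while retaining the scheme's applicability to velocity-dependent forces.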

  8. Cosmology and Particle Astrophysics (2nd edn) and Extragalactic Astronomy and Cosmology: An Introduction

    Trimble, Virginia [Department of Physics and Astronomy, University of California-Irvine, CA 92697-4575 (United States); Las Cumbres Global Telescope Network, Goleta, CA 93117 (United States)

    2007-05-07

    a while to find the bits you want. The index lists neither lambda nor the cosmological constant, and inflation is said to appear on pp 307-412. The chapters are of equal length, in traditional textbook fashion. Neither volume has much to say about issues that are currently 'hot'-the importance of extra dimensions, fine tuning of cosmological parameters, possible evidence for cosmic geometry different from the simplest. Discussions of such things will, of course, date a textbook quickly. On the other hand, they are often the items that physics (etc) students will have heard about in colloquia and would like to have clarified. Names appear only as eponyms, from Altarelli Parisi evolution (which is not on the page to which B and G's index refers you) to the Zeeman effect, which is where PS's index says it is. Can I imagine using either of these as texts? Definitely yes for PS, since it is a possible fit to an astrophysics course that UCI offers as a 'vocabulary builder' for students coming out of mainstream physics (and for which we have yet to find an entirely suitable text). We are contemplating a faculty hire or two in astro-particle physics, in which case B and G might well be a good fit to a seminar for students beginning work in that area. If I were asked to teach the course, however, I would probably want an instructor's solution manual for the text problems. One may well exist, though the book does not mention it. Using PS, you will have to make up your own problems (which you can then reasonably be expected to be able to work without help). (Book review of Cosmology and Particle Astrophysics (2nd edn), Lars Bergstroem and Ariel Goobar, 2006 Berlin: Springer and Worthing: Praxis, ISBN 978-3-540-33174-2 and Extragalactic Astronomy and Cosmology: An Introduction, Peter Schneider, 2006 Berlin: Springer, ISBN 978-3-540-33174-2)

  9. XUV spectra of 2nd transition row elements: identification of 3d-4p and 3d-4f transition arrays

    Lokasani, Ragava; Long, Elaine; Maguire, Oisin; Sheridan, Paul; Hayden, Patrick; O'Reilly, Fergal; Dunne, Padraig; Sokell, Emma; Endo, Akira; Limpouch, Jiri; O'Sullivan, Gerry

    2015-12-01

    The use of laser produced plasmas (LPPs) in extreme ultraviolet/soft x-ray lithography and metrology at 13.5 nm has been widely reported and recent research efforts have focused on developing next generation sources for lithography, surface morphology, patterning and microscopy at shorter wavelengths. In this paper, the spectra emitted from LPPs of the 2nd transition row elements from yttrium (Z = 39) to palladium (Z = 46), with the exception of zirconium (Z = 40) and technetium (Z = 43), produced by two Nd:YAG lasers which delivered up to 600 mJ in 7 ns and 230 mJ in 170 ps, respectively, are reported. Intense emission was observed in the 2-8 nm spectral region resulting from unresolved transition arrays (UTAs) due to 3d-4p, 3d-4f and 3p-3d transitions. These transitions in a number of ion stages of yttrium, niobium, ruthenium and rhodium were identified by comparison with results from Cowan code calculations and previous studies. The theoretical data were parameterized using the UTA formalism and the mean wavelength and widths were calculated and compared with experimental results.

  10. 2nd International Conference on INformation Systems Design and Intelligent Applications

    Satapathy, Suresh; Sanyal, Manas; Sarkar, Partha; Mukhopadhyay, Anirban

    2015-01-01

    The second International Conference on INformation Systems Design and Intelligent Applications (INDIA – 2015) was held in Kalyani, India during January 8-9, 2015. The book covers all aspects of information system design, computer science and technology, general sciences, and educational research. Following a double-blind review process, a number of high-quality papers were selected and collected in the book, which is composed of two volumes and covers a variety of topics, including natural language processing, artificial intelligence, security and privacy, communications, wireless and sensor networks, microelectronics, circuit and systems, machine learning, soft computing, mobile computing and applications, cloud computing, software engineering, graphics and image processing, rural engineering, e-commerce, e-governance, business computing, molecular computing, nano computing, chemical computing, intelligent computing for GIS and remote sensing, bio-informatics and bio-computing. These fields are not only ...

  11. Proceedings of the 2nd workshop on linking aspect technology and evolution, Bonn, Germany, 20.03.2006

    Tourwé, T.; Shepherd, D.; Kellens, A.; Ceccato, M.

    2006-01-01

    Software evolution lies at the heart of the software development process and suffers from problems of maintainability, evolvability, understandability, etc. Aspect-oriented software development (AOSD) is an emerging software development paradigm that tries to achieve a better separation of concerns.

  12. Generating statements at whole-body imaging with a workflow-optimized software tool - first experiences with multireader analysis

    Introduction: Due to technical innovations in cross-sectional imaging, whole-body imaging has increased in importance for clinical radiology, particularly for the diagnosis of systemic tumor disease. Large numbers of images have to be evaluated in increasingly shorter time periods. The aim was to create and evaluate a new software tool to assist and automate the process of diagnosing whole-body datasets. Material and Methods: Thirteen whole-body datasets were evaluated by 3 readers using the conventional system and the new software tool. The times for loading the datasets, examining 5 different regions (head, neck, thorax, abdomen and pelvis/skeletal system) and retrieving a relevant finding for demonstration were acquired, and a Student t-test was performed. For the qualitative analysis the 3 readers used a scale from 0 - 4 (0 = bad, 4 = very good) to assess dataset loading convenience, lesion location assistance, and ease of use; additionally a kappa value was calculated. Results: The average loading time was 39.7 s (± 5.5) with the conventional system and 6.5 s (± 1.4) with the new software tool (p 0.9). The qualitative analysis showed a significant advantage with respect to convenience (p 0.9). (orig.)
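The timing comparison above reads each dataset on both systems, which suggests a paired design. A sketch of the paired Student t statistic such an analysis rests on; the function name and the numbers in the example are invented, not the study's data:

```python
import math

def paired_t(a, b):
    """Paired Student t statistic for matched measurements a and b.

    Works on the per-subject differences: t = mean(d) / (sd(d)/sqrt(n)),
    with the sample standard deviation using n-1 degrees of freedom.
    """
    d = [x - y for x, y in zip(a, b)]
    n = len(d)
    mean = sum(d) / n
    var = sum((x - mean) ** 2 for x in d) / (n - 1)
    return mean / math.sqrt(var / n)
```

The resulting t value is then compared against the t distribution with n-1 degrees of freedom to obtain a p-value.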

  13. The MeqTrees software system and its use for third-generation calibration of radio interferometers

    Noordam, Jan E; 10.1051/0004-6361/201015013

    2011-01-01

    The formulation of the radio interferometer measurement equation (RIME) by Hamaker et al. has provided us with an elegant mathematical apparatus for better understanding, simulation and calibration of existing and future instruments. The calibration of the new radio telescopes (LOFAR, SKA) would be unthinkable without the RIME formalism, and new software to exploit it. MeqTrees is designed to implement numerical models such as the RIME, and to solve for arbitrary subsets of their parameters. The technical goal of MeqTrees is to provide a tool for rapid implementation of such models, while offering performance comparable to hand-written code. We are also pursuing the wider goal of increasing the rate of evolution of radio astronomical software, by offering a tool for rapid experimentation and exchange of ideas. MeqTrees is implemented as a Python-based front-end called the meqbrowser, and an efficient (C++-based) computational back-end called the meqserver. Numerical models are defined on the front-end via a P...

  14. Free Open Source Software: FOSS Based GIS for Spatial Retrievals of Appropriate Locations for Ocean Energy Utilizing Electric Power Generation Plants

    Kohei Arai

    2012-09-01

    A Free Open Source Software (FOSS) based Geographic Information System (GIS) for spatial retrieval of appropriate locations for electric power generation plants utilizing ocean wind and tidal motion is proposed. Using scatterometers onboard earth observation satellites, coastal areas with strong wind are retrieved with the FOSS/GIS combination PostgreSQL/PostGIS. PostGIS has to be modified together with the altimeter and scatterometer database. These modifications and the database creation would be a good reference for users who would like to create a GIS together with a database using FOSS.

  15. A computer software system for the generation of global ocean tides including self-gravitation and crustal loading effects

    Estes, R. H.

    1977-01-01

    A computer software system is described which computes global numerical solutions of the integro-differential Laplace tidal equations, including dissipation terms and ocean loading and self-gravitation effects, for arbitrary diurnal and semidiurnal tidal constituents. The integration algorithm features a successive approximation scheme for the integro-differential system, with forward differences in the time variable and central differences in the spatial variables. Solutions for the M2, S2, N2, K2, K1, O1, P1 tidal constituents neglecting the effects of ocean loading and self-gravitation, and a converged M2 solution including ocean loading and self-gravitation effects, are presented in the form of cotidal and corange maps.
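The differencing scheme named in the abstract (forward in time, central in space) can be illustrated on a toy 1D diffusion equation. This is only a sketch of the stencil, not of the Laplace tidal equations themselves, which are integro-differential and far richer; the function name and grid are ours:

```python
def ftcs_step(u, r):
    """One forward-time, centred-space (FTCS) update for u_t = D*u_xx
    on a periodic 1D grid, where r = D*dt/dx**2 (stable for r <= 0.5).

    u[i]^{n+1} = u[i]^n + r * (u[i+1]^n - 2*u[i]^n + u[i-1]^n)
    """
    n = len(u)
    return [u[i] + r * (u[(i + 1) % n] - 2 * u[i] + u[(i - 1) % n])
            for i in range(n)]
```

Repeated application spreads an initial spike symmetrically while conserving its total, the discrete analogue of marching the solution forward in time from an initial state.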

  16. Early prediction for necessity of 2nd I-131 ablation therapy with serum thyroglobulin levels in patients with differentiated thyroid cancer

    The aim of our study was to evaluate the predictive value of serum thyroglobulin (Tg) levels, measured preoperatively and just before the 1st I-131 ablation therapy under high serum TSH, for the necessity of a 2nd I-131 ablation therapy in differentiated thyroid cancer (DTC) patients. 111 patients with DTC who underwent total or near-total thyroidectomy followed by immediate I-131 ablation therapy were enrolled in this study. TSH, Tg and anti-Tg autoantibody were measured before thyroidectomy (TSHpreop, Tgpreop and Anti-Tgpreop) and just before the 1st I-131 ablation therapy (TSHabl, Tgabl and Anti-Tgabl). All TSHabl levels were above 30 mU/liter. ΔTg [(Tgpreop - Tgabl) × 100 / Tgpreop] was calculated. 29 patients (26.1%, 29/111) had to receive a 2nd I-131 ablation therapy. Of the 70 patients whose Tgabl was under 10 ng/ml, only 11 (15.7%) received a 2nd I-131 ablation therapy. Patients with Tgabl ≥ 10 ng/ml received a 2nd I-131 ablation therapy more often (18/41, 43.9%) than patients with lower Tgabl levels; the disparity in necessity of a 2nd I-131 ablation therapy between the two groups (Tgabl < 10 ng/ml and Tgabl ≥ 10 ng/ml) was significant (two-by-two χ² test, p = 0.0016). Of the 41 patients with Tgabl ≥ 10 ng/ml, 19 showed increased Tg levels (ΔTg < 0). Patients with negative ΔTg and Tgabl ≥ 10 ng/ml showed a strikingly high necessity of a 2nd I-131 ablation therapy (11/19, 57.9%); the disparity in necessity of a 2nd I-131 ablation therapy between this group (ΔTg < 0 and Tgabl ≥ 10 ng/ml) and the others was also significant (two-by-two χ² test, p = 0.0012). These results suggest that a high Tgabl level just before the 1st I-131 ablation therapy can forecast the necessity of a 2nd I-131 ablation therapy. Moreover, the difference in Tg level between the preoperative status and just before the 1st I-131 ablation therapy could also indicate the necessity of a 2nd I-131 ablation therapy early in the surveillance of DTC patients.
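The Tg-change index and the high-risk grouping described above are simple arithmetic. A minimal sketch; the function names are ours, not the paper's, and the threshold follows the 10 ng/ml cut-off quoted in the abstract:

```python
def delta_tg(tg_preop, tg_abl):
    """Percentage fall in serum thyroglobulin from the preoperative value
    to the value just before the 1st I-131 ablation:
    dTg = (Tgpreop - Tgabl) * 100 / Tgpreop (negative when Tg rose)."""
    return (tg_preop - tg_abl) * 100.0 / tg_preop

def high_risk(tg_preop, tg_abl):
    """Flag the group the abstract found most likely to need a 2nd
    ablation: rising Tg (dTg < 0) together with Tgabl >= 10 ng/ml."""
    return delta_tg(tg_preop, tg_abl) < 0 and tg_abl >= 10.0
```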

  17. DETERMINATION OF ELECTROMAGNETIC PARAMETERS AND PHASE RELATIONS IN TURBO-GENERATORS BY THE AUTOMATED CALCULATION OF THE MAGNETIC FIELD IN THE SOFTWARE ENVIRONMENT FEMM

    V.I. Milykh

    2016-03-01

    The theoretical bases for calculating electromagnetic quantities and time-phase relationships in turbo-generators are presented. This is done by numerical calculations of the magnetic field in the software environment FEMM (Finite Element Method Magnetics). A program that controls the calculations and writes the results to a text file is created in the scripting language Lua. The program is universal with respect to turbo-generator models as well as the steady-state modes of their operation, and requires a minimum of input data. The exciting current of the rotor and the phase currents of the three-phase stator winding, with their initial phases, are given for the calculation of the magnetic field. The key function for the analysis of electromagnetic parameters is the calculated angular function of the magnetic flux linkage of the stator phase winding. This function is expanded in a harmonic series, and its amplitude and initial phase are obtained. Next, the phase EMF and voltage, the phase shifts between all quantities, the active power, the electromagnetic torque, the magnetic flux in the gap and other parameters are determined. The presented Lua script is a prototype for similar calculation software for electric machines of other types.
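The key step described above, expanding the angular flux function in a harmonic series and reading off the amplitude and initial phase of the fundamental, can be sketched as a discrete Fourier projection. This is our illustration, not the paper's Lua code; the function name and sampling convention are assumptions:

```python
import math

def fundamental(samples):
    """Amplitude and initial phase of the fundamental harmonic of one
    full period of a uniformly sampled periodic signal.

    Writing the signal as a*cos(theta) + b*sin(theta) + higher
    harmonics, (a, b) is recovered by projection onto cos and sin;
    then amplitude = hypot(a, b) and phase = atan2(b, a), so the
    fundamental is amplitude * cos(theta - phase).
    """
    n = len(samples)
    a = 2.0 / n * sum(s * math.cos(2 * math.pi * k / n)
                      for k, s in enumerate(samples))
    b = 2.0 / n * sum(s * math.sin(2 * math.pi * k / n)
                      for k, s in enumerate(samples))
    return math.hypot(a, b), math.atan2(b, a)
```

Applying the same projection to flux, EMF and current waveforms gives the phase shifts between them, from which quantities such as active power follow.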

  18. A summary of the 2nd workshop on Human Resources Development (HRD) in the nuclear field in Asia. FY2000

    NONE

    2001-06-01

    The Human Resources Development (HRD) Project was added in 1999 as a Cooperation Activity of 'the Forum for Nuclear Cooperation in Asia (FNCA)' which is organized by Nuclear Committee. The HRD Project supports to solidify the foundation of nuclear development utilization in Asia by promoting human resources development in Asian countries. The principal activity of the HRD Project is to hold the Workshop on Human Resources Development in the Nuclear Field in Asia once a year. The objective of the Workshop is to clarify problems and needs of the human resources development of each country and to support it mutually by exchanging information etc. The report consists of a summary of the 2nd Workshop on Human Resources Development in the Nuclear Field in Asia held on November 27 and 28, 2000 at Tokai Research Establishment of JAERI. (author)

  19. International collaborative study for establishment of the 2nd WHO International Standard for Haemophilus influenzae type b polysaccharide.

    Mawas, Fatme; Burkin, Karena; Dougall, Thomas; Saydam, Manolya; Rigsby, Peter; Bolgiano, Barbara

    2015-11-01

    In this report we present the results of a collaborative study for the preparation and calibration of a replacement International Standard (IS) for Haemophilus influenzae type b polysaccharide (polyribosyl ribitol phosphate; 5-d-ribitol-(1 → 1)-β-d-ribose-3-phosphate; PRP). Two candidate preparations were evaluated. Thirteen laboratories from 9 different countries participated in the collaborative study to assess the suitability and determine the PRP content of two candidate standards. On the basis of the results from this study, Candidate 2 (NIBSC code 12/306) has been established as the 2nd WHO IS for PRP by the Expert Committee of Biological Standards of the World Health Organisation with a content of 4.904 ± 0.185mg/ampoule, as determined by the ribose assays carried out by 11 of the participating laboratories. PMID:26298195

  1. Results of the independent verification of radiological remedial action at 217 South 2nd East Street, Monticello, Utah (MS00097)

    In 1980 the site of a vanadium and uranium mill at Monticello, Utah, was accepted into the US Department of Energy's (DOE's) Surplus Facilities Management Program, with the objectives of restoring the government-owned mill site to safe levels of radioactivity, disposing of or containing the tailings in an environmentally safe manner, and performing remedial actions on off-site (vicinity) properties that had been contaminated by radioactive material resulting from mill operations. During 1985 and 1986, UNC Geotech, the remedial action contractor designated by DOE, performed remedial action on the vicinity property at 217 South 2nd East Street, Monticello, Utah. The Pollutant Assessments Group (PAG) of Oak Ridge National Laboratory was assigned the responsibility of verifying the data supporting the adequacy of remedial action and confirming the site's compliance with DOE guidelines. The PAG found that the site successfully meets the DOE remedial action objectives. Procedures used by PAG are described. 3 refs., 2 tabs

  2. Anatomy of a 2nd-order unconformity: stratigraphy and facies of the Bakken formation during basin realignment

    Skinner, Orion; Canter, Lyn; Sonnenfeld, Mark; Williams, Mark [Whiting Oil and Gas Corp., Denver, CO (United States)

    2011-07-01

    Because classic Laramide compressional structures are relatively rare, the Williston Basin is often considered structurally simple, but because of the presence of numerous sub-basins, simplistic lithofacies generalization is impossible, and detailed facies mapping is necessary to unravel Middle Bakken paleogeography. The unconformity above the Devonian Three Forks records the infilling and destruction of the Devonian Elk Point basin, prepares the Bakken system, and introduces a Mississippian Williston Basin with a very different configuration. Black shales are too often considered deposits that can only be found in deep water, but a very different conclusion must be drawn after a review of stratigraphic geometry and facies successions. The whole Bakken is a 2nd-order lowstand to transgressive systems tract lying below the basal Lodgepole, which represents an interval of maximal flooding. This lowstand to transgressive stratigraphic context explains why the sedimentary processes and provenance show high areal variability.

  3. 2nd International Symposium "Atomic Cluster Collisions : Structure and Dynamics from the Nuclear to the Biological Scale"

    Solov'yov, Andrey; ISACC 2007; Latest advances in atomic cluster collisions

    2008-01-01

    This book presents a 'snapshot' of the most recent and significant advances in the field of cluster physics. It is a comprehensive review based on contributions by the participants of the 2nd International Symposium on Atomic Cluster Collisions (ISACC 2007) held on July 19-23, 2007 at GSI, Darmstadt, Germany. The purpose of the Symposium is to promote the growth and exchange of scientific information on the structure and properties of nuclear, atomic, molecular, biological and complex cluster systems studied by means of photonic, electronic, heavy particle and atomic collisions. Particular attention is devoted to dynamic phenomena and many-body effects taking place in cluster systems of a different nature - these include problems of fusion and fission, fragmentation, collective electron excitations, phase transitions, etc. Both the experimental and theoretical aspects of cluster physics, uniquely placed between nuclear physics on the one hand and atomic, molecular and solid state physics on the other, are discuss...

  4. The 2nd DBCLS BioHackathon: interoperable bioinformatics Web services for integrated applications

    Katayama Toshiaki

    2011-08-01

    Background: The interaction between biological researchers and the bioinformatics tools they use is still hampered by incomplete interoperability between such tools. To ensure interoperability initiatives are effectively deployed, end-user applications need to be aware of, and support, best practices and standards. Here, we report on an initiative in which software developers and genome biologists came together to explore and raise awareness of these issues: BioHackathon 2009. Results: Developers in attendance came from diverse backgrounds, with experts in Web services, workflow tools, text mining and visualization. Genome biologists provided expertise and exemplar data from the domains of sequence and pathway analysis and glyco-informatics. One goal of the meeting was to evaluate the ability to address real-world use cases in these domains using the tools that the developers represented. This resulted in (i) a workflow to annotate 100,000 sequences from an invertebrate species; (ii) an integrated system for analysis of the transcription factor binding sites (TFBSs) enriched based on differential gene expression data obtained from a microarray experiment; (iii) a workflow to enumerate putative physical protein interactions among enzymes in a metabolic pathway using protein structure data; (iv) a workflow to analyze glyco-gene-related diseases by searching for human homologs of glyco-genes in other species, such as fruit flies, and retrieving their phenotype-annotated SNPs. Conclusions: Beyond deriving prototype solutions for each use case, a second major purpose of the BioHackathon was to highlight areas of insufficiency. We discuss the issues raised by our exploration of the problem/solution space, concluding that there are still problems with the way Web services are modeled and annotated, including: (i) the absence of several useful data or analysis functions in the Web service "space"; (ii) the lack of documentation of methods; (iii) lack of

  5. Optimal Planning of an Off-grid Electricity Generation with Renewable Energy Resources using the HOMER Software

    Hossein Shahinzadeh

    2015-03-01

    In recent years, several factors, such as the environmental pollution caused by fossil fuels and the diseases it causes on the one hand, and concerns about dwindling fossil fuel reserves, the price fluctuations of their products and the resulting effects on the economy on the other, have led most countries to seek alternatives to fossil fuel supplies; by 2006, about 18% of the energy consumed worldwide was obtained from renewable sources. Iran is among the countries geographically located in hot and dry areas and receives considerable sun exposure throughout the year: except on the coasts of the Caspian Sea, the percentage of sunny days throughout the year is between 63 and 98 percent. On the other hand, there are dispersed, remote areas and loads far from the national grid for which it is impossible to provide electrical energy via transmission from the national grid; in such cases, renewable energy technologies can be used to supply the energy. In this paper, a technical and economic feasibility study of using renewable energies in off-grid systems for a dispersed load on the outskirts of Isfahan (Sepahan), with a maximum energy consumption of 3 kWh per day, is presented. The HOMER simulation software is used as the optimization tool.
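HOMER-style planning tools rank candidate system configurations by lifetime cost. A toy sketch of that ranking by net present cost is given below; the function names, discount rate, lifetime and costs are invented, and this is not HOMER's actual model, which also simulates hourly dispatch of each component:

```python
def net_present_cost(capital, annual_cost, rate, years):
    """Net present cost of a configuration: capital outlay plus the
    present value of a constant annual O&M/fuel cost over the project
    life, discounted at the given real rate."""
    pv_factor = (1.0 - (1.0 + rate) ** -years) / rate
    return capital + annual_cost * pv_factor

def cheapest(configs, rate, years):
    """Pick the (name, capital, annual_cost) configuration with the
    lowest net present cost."""
    return min(configs, key=lambda c: net_present_cost(c[1], c[2], rate, years))
```

A configuration with high capital but low recurring cost (e.g. PV plus battery) can beat a cheap-to-build but fuel-hungry one over a long project life, which is the trade-off such feasibility studies quantify.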

  6. Development of a radioactive waste treatment equipment utilizing microwave heating, 2nd report

    The objective of the present study is to establish an incineration technique utilizing microwave heating which enables a high volume reduction of spent ion-exchange resins and filtering media generated at nuclear facilities. The three years from 1982 to 1985, with financial aid from the Agency of Science and Technology, brought great and rapid progress to this project, when the heating technique was switched from direct microwave heating to indirect heating employing a bed of silicon carbide beads. This material was also used to build a secondary furnace, walls and roster bars, to treat the obnoxious gases and soot arising in the primary incineration process by the radiating heat of this material, heated above 1000 deg C again by microwave energy rather than by the originally applied direct plasma torch combustion. The incinerator and the secondary furnace were integrated into one unit as the principal treating equipment. This novel approach made a well-stabilized continuous incineration operation possible. Further developmental efforts toward industrial applications were made by setting up a pilot plant with microwave generators, 2 sets of 5 kW at 2450 MHz and 1 set of 25 kW at 915 MHz, and tests were carried out that proved a remarkably high volume reduction capability, well above roughly 200 on a weight basis. For hot test runs, a one-tenth scale pilot test setup was installed at the Tokai Laboratory of the Japan Atomic Energy Research Institute and tested with materials spiked with radioisotopes and also with spent ion-exchange resins stored there. Very satisfactory results were obtained in these proving tests, showing the efficient capability of high volume reduction treatment of otherwise stable radioactive waste materials such as spent ion-exchange resins. (author)

  7. The stages of development a healthy way of life of senior pupils in native pedagogy (2nd part of XX century)

    Iermakova T.S.

    2010-01-01

    The stages of formation and development of the problem of a healthy way of life of senior pupils in native pedagogy of the 2nd part of the XX century are defined. The peculiarities of forming a healthy way of life of senior pupils at every stage are analysed and disclosed. The contribution of native scientists and pedagogues of the 2nd part of the XX century to the solution of the problem of forming a healthy way of life of senior pupils is determined. For the investigated stages the imperfections tha...

  8. The Effects of Star Strategy of Computer-Assisted Mathematics Lessons on the Achievement and Problem Solving Skills in 2nd Grade Courses

    İPEK, Jale; Hatice Malaş

    2013-01-01

    The aim of the research is to determine the effect of the STAR strategy on 2nd grade students’ academic achievement and problem-solving skills in computer-assisted mathematics instruction. 30 students attending a 2nd grade course in a primary school in Aydın in the 2010-2011 school year formed the study group. The research took place over 7 weeks. 3 tests were used as data collection tools: an “Academic Achievement Test”, a “Problem Solving Achievement Test” and “The Evalua...

  9. Report of 2nd workshop on particle process. A report of the Yayoi study meeting

    At the Nuclear Engineering Research Laboratory, Faculty of Engineering, University of Tokyo, short-term research meetings named the Yayoi Research Group, a joint application research work of the nuclear reactor (Yayoi) and the electron linac in Japan, have been held more than 10 times a year. This report compiles the summaries of one of them, 'Research on Particle Method', held on August 7, 1996. The 'particle method' referred to here, which describes and computes fluids and powders as groups of particles, is better suited than conventional grid-based methods to problems with boundary surfaces and large deformations of the fluids, and further development is expected. This report contains the following studies: 1) Stress analysis without the need for element breakdown, 2) Local interpolation differential operator method and unstructured grids, 3) Self-organized simulation of dynamical construction, 4) A lattice BGK solution of laminar flow over a backward-facing step, 5) Numerical analysis of solid-gas two-phase flow using the discrete element method, 6) Application of flow analysis techniques to power generation plant equipment, 7) Corrision wave captured flow calculation using the particle method, and 8) Analysis of complex thermal flow problems using the particle (MPS) method. (G.K.)

  10. GENII (Generation II): The Hanford Environmental Radiation Dosimetry Software System: Volume 3, Code maintenance manual: Hanford Environmental Dosimetry Upgrade Project

    Napier, B.A.; Peloquin, R.A.; Strenge, D.L.; Ramsdell, J.V.

    1988-09-01

    The Hanford Environmental Dosimetry Upgrade Project was undertaken to incorporate the internal dosimetry models recommended by the International Commission on Radiological Protection (ICRP) in updated versions of the environmental pathway analysis models used at Hanford. The resulting second generation of Hanford environmental dosimetry computer codes is compiled in the Hanford Environmental Dosimetry System (Generation II, or GENII). This coupled system of computer codes is intended for analysis of environmental contamination resulting from acute or chronic releases to, or initial contamination of, air, water, or soil, on through the calculation of radiation doses to individuals or populations. GENII is described in three volumes of documentation. This volume is a Code Maintenance Manual for the serious user, including code logic diagrams, global dictionary, worksheets to assist with hand calculations, and listings of the code and its associated data libraries. The first volume describes the theoretical considerations of the system. The second volume is a Users' Manual, providing code structure, users' instructions, required system configurations, and QA-related topics. 7 figs., 5 tabs.

  12. CIFLog: the 3rd generation logging software based on Java-NetBeans

    李宁; 王才志; 刘英明; 李伟忠; 夏守姬; 原野

    2013-01-01

    To follow the direction of well-logging technology development, get rid of dependence on foreign well-logging software, improve the capability of independent innovation, and enhance core competitiveness, CIFLog, the 3rd generation well-logging software with independent intellectual property rights, has been successfully developed by PetroChina with support from a major national oil and gas project. Based on advanced Java-NetBeans programming technology, CIFLog adopts a three-layer structure (data, support and application layers) and can run under the three major operating systems, Windows, Linux and Unix. Integrating open-hole and cased-hole well-logging evaluation, CIFLog offers interpretation methods for complex reservoirs including volcanic, carbonate and low-resistivity clastic rocks and flooded zones, and it is the first well-logging software successfully applied in China to the processing and interpretation of data from all domestically produced high-end imaging logging equipment. The successful development of the software not only breaks the technical barriers of foreign software and fills gaps in relevant fields, but also greatly advances well-logging technology and the research and development of large-scale software in China.

  13. The 2008—2013 crisis as metastasis : a preview of the 2nd edition of The cancer stage of capitalism by Pluto Press

    John McMurtry

    2013-01-01

    By means of selection of relevant excerpts, a preview is offered hereby of the 2nd edition of John McMurtry's prophetic 1999 book "The Cancer Stage of Capitalism", published by Pluto Press, and entitled "The Cancer Stage of Capitalism and Its Cure"

  14. The 2008—2013 Crisis as Metastasis. A Preview of the 2nd edition of The Cancer Stage of Capitalism by Pluto Press

    John McMurtry

    2013-03-01

    By means of selection of relevant excerpts, a preview is offered hereby of the 2nd edition of John McMurtry's prophetic 1999 book "The Cancer Stage of Capitalism", published by Pluto Press, and entitled "The Cancer Stage of Capitalism and Its Cure"

  15. Design of control system for the 2nd and 3rd charge exchange system in J-PARC 3GeV RCS

    The J-PARC 3 GeV synchrotron accelerator uses charge exchange injection with three carbon foils. To achieve this injection, three charge exchange devices are installed in the facility, controlled by one control system. The 2nd and 3rd charge exchange devices are being upgraded to increase the maintainability and pumping capability of the vacuum unit, and the control system has been reconsidered. The basic policy in redesigning the control system is to separate it from the centralized control system for the three devices and to reconstruct it as a system independent of the centralized control; on this basis we are upgrading the 2nd and 3rd charge exchange devices. Because the control is stand-alone, the interlock unit must be redesigned with respect to safety. At present, the error signals of the three devices are consolidated into one error signal of the charge exchange unit, which triggers the Machine Protection System (MPS); therefore, a long time was needed to find where an error originated. If the MPS is instead operated by the error signal of each unit, we expect it will be easier to locate the source of an error. The 2nd and 3rd charge exchange units adopt a simple control system using a Yokogawa Electric PLC FA-M3. We are designing a control system with safety that integrates the drive unit and the vacuum unit. This report describes the design of the control system for the 2nd and 3rd charge exchange units with their reconstructed hardware. (author)

  16. Dielectric properties of [Ca1-x(Li1/2Nd1/2)x]1-yZnyTiO3 ceramics at microwave frequencies

    Dielectric properties of Ca1-x(Li1/2Nd1/2)xTiO3 (0.0≤x≤0.6, CLNT) and [Ca0.6(Li1/2Nd1/2)0.4]1-yZnyTiO3 (0.5≤y≤1.0, CLNZT) ceramics were investigated at 4-10 GHz. For the CLNT system, a single perovskite phase was detected in the entire composition range. The dielectric constants (K), and the temperature coefficients of the resonant frequency (TCF) were dependent on the B-site bond valence. Qf value decreases with the increase in (Li1/2Nd1/2)TiO3 content (x) due to a decrease in grain size. For the CLNZT system, Qf value and TCF were improved with Zn content (y) due to the formation of Zn2TiO4. Typically, K of 47, Qf of 13000 GHz, TCF of 14 ppm per deg. C were obtained for [Ca0.6(Li1/2Nd1/2)0.4]0.4Zn0.6TiO3 sintered at 1150 deg. C for 4 h
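The TCF values quoted above follow the conventional definition: the fractional shift of the resonant frequency per degree of temperature change, expressed in ppm/°C. A sketch of that definition; the function name and the frequencies in the example are invented, not the paper's measurements:

```python
def tcf_ppm(f_ref, f_hot, t_ref, t_hot):
    """Temperature coefficient of resonant frequency in ppm/degC:
    TCF = (f_hot - f_ref) / (f_ref * (t_hot - t_ref)) * 1e6,
    from resonant frequencies measured at two temperatures."""
    return (f_hot - f_ref) / (f_ref * (t_hot - t_ref)) * 1e6
```

A TCF near zero is the design goal for microwave resonators, which is why compositional tuning (here, the Zn content y) is used to pull it toward zero.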

  17. Growth, structure, and optical properties of a self-activated crystal: Na2Nd2O(BO3)2

    Shan, Faxian; Zhang, Guochun; Yao, Jiyong; Xu, Tianxiang; Zhang, Xinyuan; Fu, Ying; Wu, Yicheng

    2015-08-01

    A self-activated crystal, Na2Nd2O(BO3)2, has been grown from the Na2O-Nd2O3-B2O3-NaF system. Its structure was determined by single-crystal X-ray diffraction and verified by infrared spectroscopy and inductively coupled plasma optical emission spectrometry. Na2Nd2O(BO3)2 crystallizes in the monoclinic crystal system, space group P21/c, with unit-cell parameters a = 10.804 Å, b = 6.421 Å, c = 10.450 Å, β = 117.95°, Z = 4, and V = 640.4 Å3. Its absorption and emission spectra were measured at room temperature. From the absorption spectrum, the spontaneous transition probabilities, the fluorescence branching ratios, and the radiative lifetime of the 4F3/2 state were calculated. The emission properties under 355 nm excitation were also evaluated. The electronic structure of Na2Nd2O(BO3)2 was calculated by first-principles methods. The results show that Na2Nd2O(BO3)2 may be a promising microchip laser material.
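
    The quoted cell volume is consistent with the lattice parameters: for a monoclinic cell, V = a·b·c·sin β, which can be verified directly from the values in the abstract:

    ```python
    import math

    # Monoclinic unit-cell volume: V = a * b * c * sin(beta)
    a, b, c = 10.804, 6.421, 10.450   # lattice parameters in angstroms
    beta = math.radians(117.95)       # monoclinic angle, degrees -> radians

    v = a * b * c * math.sin(beta)
    print(round(v, 1))  # 640.4 (angstrom^3), matching the reported volume
    ```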

  18. Short rare hTERT-VNTR2-2nd alleles are associated with prostate cancer susceptibility and influence gene expression

    The hTERT (human telomerase reverse transcriptase) gene contains five variable number tandem repeats (VNTRs), and previous studies have described polymorphisms of hTERT-VNTR2-2nd. We investigated how allelic variation in hTERT-VNTR2-2nd may affect susceptibility to prostate cancer. A case-control study was performed using DNA from 421 cancer-free male controls and 329 patients with prostate cancer. In addition, to determine whether the VNTR polymorphisms have a functional consequence, we examined the transcriptional levels of a reporter gene linked to these VNTRs and driven by the hTERT promoter in cell lines. Three new rare alleles were detected in this study, two of which were identified only in cancer subjects. A statistically significant association between rare hTERT-VNTR2-2nd alleles and risk of prostate cancer was observed [OR, 5.17; 95% confidence interval (CI), 1.09-24.43; P = 0.021]. Furthermore, the results indicated that these VNTRs, inserted in the enhancer region, could influence the expression of hTERT in prostate cancer cell lines. This is the first study to report that rare hTERT VNTRs are associated with prostate cancer predisposition and that the VNTRs can induce enhanced levels of hTERT promoter activity in prostate cancer cell lines. Thus, the hTERT-VNTR2-2nd locus may function as a modifier of prostate cancer risk by affecting gene expression.
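
    In a case-control design like this, the odds ratio and its Wald confidence interval come from a 2x2 allele-by-status table. A sketch with purely hypothetical counts (the abstract does not give the underlying cell counts, so these numbers are illustrative only):

    ```python
    import math

    def odds_ratio_ci(a, b, c, d, z=1.96):
        """OR and 95% Wald CI for a 2x2 table:
             a = cases with rare allele,    b = cases without
             c = controls with rare allele, d = controls without
           The CI is computed on the log-odds scale."""
        or_ = (a * d) / (b * c)
        se = math.sqrt(1/a + 1/b + 1/c + 1/d)
        lo = math.exp(math.log(or_) - z * se)
        hi = math.exp(math.log(or_) + z * se)
        return or_, lo, hi

    # Hypothetical counts, chosen only to illustrate the calculation:
    or_, lo, hi = odds_ratio_ci(6, 323, 2, 419)
    print(f"OR = {or_:.2f}, 95% CI = ({lo:.2f}, {hi:.2f})")
    ```

    With so few rare-allele carriers the interval is wide, which mirrors the broad reported CI of 1.09-24.43 around the OR of 5.17.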

  19. Software engineering

    Sommerville, Ian

    2010-01-01

    The ninth edition of Software Engineering presents a broad perspective of software engineering, focusing on the processes and techniques fundamental to the creation of reliable software systems. Increased coverage of agile methods and software reuse, along with coverage of 'traditional' plan-driven software engineering, gives readers the most up-to-date view of the field currently available. Practical case studies, a full set of easy-to-access supplements, and extensive web resources make teaching the course easier than ever.

  20. Software testing concepts and operations

    Mili, Ali

    2015-01-01

    Explores and identifies the main issues, concepts, principles and evolution of software testing, including software quality engineering and testing concepts, test data generation, test deployment analysis, and software test management. This book examines the principles, concepts, and processes that are fundamental to the software testing function. This book is divided into five broad parts. Part I introduces software testing in the broader context of software engineering and explores the qualities that testing aims to achieve or ascertain, as well as the lifecycle of software testing. Part II c