WorldWideScience

Sample records for integrated verification experiment

  1. Material integrity verification radar

    International Nuclear Information System (INIS)

    Koppenjan, S.K.

    1999-01-01

    The International Atomic Energy Agency (IAEA) has the need for verification of 'as-built' spent-fuel dry storage containers and other concrete structures. The IAEA has tasked the Special Technologies Laboratory (STL) to fabricate, test, and deploy a stepped-frequency Material Integrity Verification Radar (MIVR) system to nondestructively verify the internal construction of these containers. The MIVR system is based on previously deployed high-frequency, ground penetrating radar (GPR) systems that have been developed by STL for the U.S. Department of Energy (DOE). Whereas GPR technology utilizes microwave radio frequency energy to create subsurface images, MIVR is a variation for which the medium is concrete instead of soil. The purpose is to nondestructively verify the placement of concrete-reinforcing materials, pipes, inner liners, and other attributes of the internal construction. The MIVR system underwent an initial field test on CANDU reactor spent fuel storage canisters at Atomic Energy of Canada Limited (AECL), Chalk River Laboratories, Ontario, Canada, in October 1995. A second field test at the Embalse Nuclear Power Plant in Embalse, Argentina, was completed in May 1996. The DOE GPR also was demonstrated at the site. Data collection and analysis were performed for the Argentine National Board of Nuclear Regulation (ENREN). IAEA and the Brazilian-Argentine Agency for the Control and Accounting of Nuclear Material (ABACC) personnel were present as observers during the test. Reinforcing materials were evident in the color, two-dimensional images produced by the MIVR system. A continuous pattern of reinforcing bars was evident, and accurate estimates of the spacing, depth, and size were made. The potential uses for safeguards applications were jointly discussed. The MIVR system, as successfully demonstrated in the two field tests, can be used as a design verification tool for IAEA safeguards. A deployment of MIVR for Design Information Questionnaire (DIQ

  2. Integrated Verification Experiment data collected as part of the Los Alamos National Laboratory's Source Region Program

    Energy Technology Data Exchange (ETDEWEB)

    Whitaker, R.W.; Noel, S.D.

    1992-12-01

    The summary report by Tom Weaver gives the overall background for the series of IVE (Integrated Verification Experiment) experiments, including information on the full set of measurements made. This appendix presents details of the infrasound data and discusses certain aspects of a few special experiments. Prior to FY90, the emphasis of the Infrasound Program was on underground nuclear test (UGT) detection and yield estimation. During this time the Infrasound Program was a separate program at Los Alamos, and it was suggested to DOE/OAC that a regional infrasound network be established around NTS. The IVE experiments took place in a time frame that allowed simultaneous testing of possible network sites and examination of propagation in different directions. Whenever possible, infrasound stations were combined with seismic stations so that a large number could be efficiently fielded. The regional infrasound network was not pursued by DOE, as world events began to change the direction of verification toward non-proliferation. Starting in FY90 the infrasound activity became part of the Source Region Program, which has the goal of understanding how energy is transported from the UGT to a variety of measurement locations.

  3. Integrated Java Bytecode Verification

    DEFF Research Database (Denmark)

    Gal, Andreas; Probst, Christian; Franz, Michael

    2005-01-01

    Existing Java verifiers perform an iterative data-flow analysis to discover the unambiguous type of values stored on the stack or in registers. Our novel verification algorithm uses abstract interpretation to obtain definition/use information for each register and stack location in the program...
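
    The record is truncated, but the underlying idea, tracking the types held in stack slots and registers by abstract interpretation, can be shown on a toy example. The sketch below is an illustrative assumption, not the authors' algorithm or the real JVM verifier: it walks a straight-line sequence of simplified bytecode-like instructions and rejects programs whose operand types are inconsistent.

```python
# Toy abstract interpreter over a simplified, straight-line "bytecode" sequence.
# Only abstract types ("int", "ref") of stack slots and registers are tracked;
# programs with inconsistent operand types are rejected.
# Illustrative sketch only; not the JVM verifier or the paper's algorithm.

def verify(code, num_registers):
    stack = []                      # abstract types of operand-stack slots
    regs = [None] * num_registers   # abstract types of registers (None = undefined)

    for op, *args in code:
        if op == "iconst":          # push an int constant
            stack.append("int")
        elif op == "iload":         # push register content, which must be int
            if regs[args[0]] != "int":
                return False
            stack.append("int")
        elif op == "istore":        # pop an int into a register
            if not stack or stack.pop() != "int":
                return False
            regs[args[0]] = "int"
        elif op == "iadd":          # pop two ints, push the int result
            if len(stack) < 2 or stack.pop() != "int" or stack.pop() != "int":
                return False
            stack.append("int")
        else:
            return False            # unknown opcode: reject
    return True

# (2 + 3) stored into register 0 verifies; reading an undefined register does not.
ok = verify([("iconst",), ("iconst",), ("iadd",), ("istore", 0)], num_registers=1)
bad = verify([("iload", 0), ("iconst",), ("iadd",)], num_registers=1)
print(ok, bad)   # True False
```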

  4. Disarmament Verification - the OPCW Experience

    International Nuclear Information System (INIS)

    Lodding, J.

    2010-01-01

    The Chemical Weapons Convention is the only multilateral treaty that bans completely an entire category of weapons of mass destruction under international verification arrangements. Possessor States, i.e. those that have chemical weapons stockpiles at the time of becoming party to the CWC, commit to destroying these. All States undertake never to acquire chemical weapons and not to help other States acquire such weapons. The CWC foresees time-bound chemical disarmament. The deadlines for destruction for early entrants to the CWC are provided in the treaty. For late entrants, the Conference of States Parties intervenes to set destruction deadlines. One of the unique features of the CWC is thus the regime for verifying destruction of chemical weapons. But how can you design a system for verification at military sites, while protecting military restricted information? What degree of assurance is considered sufficient in such circumstances? How do you divide the verification costs? How do you deal with production capability and initial declarations of existing stockpiles? The founders of the CWC had to address these and other challenges in designing the treaty. Further refinement of the verification system has followed since the treaty opened for signature in 1993 and since inspection work was initiated following entry-into-force of the treaty in 1997. Most of this work concerns destruction at the two large possessor States, Russia and the United States. Perhaps some of the lessons learned from the OPCW experience may be instructive in a future verification regime for nuclear weapons. (author)

  5. Statistics and integral experiments in the verification of LOCA calculations models

    International Nuclear Information System (INIS)

    Margolis, S.G.

    1978-01-01

    The LOCA (loss of coolant accident) is a hypothesized, low-probability accident used as a licensing basis for nuclear power plants. Computer codes which have been under development for at least a decade have been the principal tools used to assess the consequences of the hypothesized LOCA. Models exist in two versions. In EM's (Evaluation Models) the basic engineering calculations are constrained by a detailed set of assumptions spelled out in the Code of Federal Regulations (10 CFR 50, Appendix K). In BE Models (Best Estimate Models) the calculations are based on fundamental physical laws and available empirical correlations. Evaluation models are intended to have a pessimistic bias; Best Estimate Models are intended to be unbiased. Because evaluation models play a key role in reactor licensing, they must be conservative. A long-sought objective has been to assess this conservatism by combining Best Estimate Models with statistically established error bounds, based on experiment. Within the last few years, an extensive international program of LOCA experiments has been established to provide the needed data. This program has already produced millions of measurements of temperature, density, and flow, and millions more measurements are yet to come
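
    The record names no particular statistical technique, but the idea of "statistically established error bounds" around best-estimate calculations is often illustrated with nonparametric tolerance limits. The sketch below is an illustrative assumption, not a method taken from this abstract: it computes, via Wilks' formula, how many independent code runs are needed before the sample maximum can serve as a one-sided tolerance bound.

```python
# Illustrative sketch: Wilks' nonparametric one-sided tolerance limit.
# With n independent best-estimate code runs, the sample maximum covers the
# `coverage` quantile of the output with probability `confidence` once
# 1 - coverage**n >= confidence. Shown only to make "statistically established
# error bounds" concrete; the record itself does not name this method.

def wilks_runs(coverage=0.95, confidence=0.95):
    """Smallest n such that the maximum of n runs is a one-sided tolerance bound."""
    n = 1
    while 1.0 - coverage ** n < confidence:
        n += 1
    return n

print(wilks_runs())            # 59 runs for the classic one-sided 95/95 bound
print(wilks_runs(0.95, 0.99))  # more runs are needed for higher confidence
```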

  6. Verification of the code ATHLET by post-test analysis of two experiments performed at the CCTF integral test facility

    International Nuclear Information System (INIS)

    Krepper, E.; Schaefer, F.

    2001-03-01

    In the framework of the external validation of the thermohydraulic code ATHLET Mod 1.2 Cycle C, developed by GRS, post-test analyses were performed for two experiments conducted at the Japanese test facility CCTF. The CCTF test facility is a 1:25 volume-scaled model of a 1000 MW pressurized water reactor. The tests simulate a double-ended break in the cold leg of the PWR, with ECC injection into the cold leg and with combined ECC injection into the hot and cold legs. The evaluation of the calculated results shows that the main phenomena are reproduced in good agreement with the experiment. In particular, the behaviour of the quench front and the core cooling are calculated very well. Applying a two-channel representation of the reactor model, the radial behaviour of the quench front could be reproduced. Deviations between calculations and experiment are observed when simulating the emergency injection at the beginning of the transient: very high condensation rates were calculated, and the pressure decrease in this phase of the transient is overestimated. In addition, the pressurization due to evaporation in the refill phase is underestimated by ATHLET. (orig.)

  7. CTBT integrated verification system evaluation model supplement

    Energy Technology Data Exchange (ETDEWEB)

    EDENBURN,MICHAEL W.; BUNTING,MARCUS; PAYNE JR.,ARTHUR C.; TROST,LAWRENCE C.

    2000-03-02

    Sandia National Laboratories has developed a computer-based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994 by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, 'top-level' modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0.

  8. CTBT integrated verification system evaluation model supplement

    International Nuclear Information System (INIS)

    EDENBURN, MICHAEL W.; BUNTING, MARCUS; PAYNE, ARTHUR C. JR.; TROST, LAWRENCE C.

    2000-01-01

    Sandia National Laboratories has developed a computer-based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994 by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, 'top-level' modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0

  9. Accelerating functional verification of an integrated circuit

    Science.gov (United States)

    Deindl, Michael; Ruedinger, Jeffrey Joseph; Zoellin, Christian G.

    2015-10-27

    Illustrative embodiments include a method, system, and computer program product for accelerating functional verification in simulation testing of an integrated circuit (IC). Using a processor and a memory, a serial operation is replaced with a direct register access operation, wherein the serial operation is configured to perform bit shifting operation using a register in a simulation of the IC. The serial operation is blocked from manipulating the register in the simulation of the IC. Using the register in the simulation of the IC, the direct register access operation is performed in place of the serial operation.
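
    A toy sketch of the substitution described here, loading a register directly instead of simulating a bit-serial shift, is given below. The class and method names are hypothetical and the example is only illustrative; it is not the embodiment described in the record.

```python
# Toy sketch of the speed-up idea: instead of simulating a serial, bit-by-bit
# shift into a register (one simulation step per bit), the testbench writes the
# register directly in a single operation. Names and structure are hypothetical.

class SimRegister:
    def __init__(self, width):
        self.width = width
        self.value = 0

    def serial_shift_in(self, bits):
        """Bit-serial load: one simulated cycle per bit (slow in simulation)."""
        cycles = 0
        for b in bits:                      # e.g. scan-chain style shifting
            self.value = ((self.value << 1) | (b & 1)) & ((1 << self.width) - 1)
            cycles += 1                     # each bit costs a simulation cycle
        return cycles

    def direct_write(self, value):
        """Direct register access: the same end state in one operation."""
        self.value = value & ((1 << self.width) - 1)
        return 1                            # a single simulation step

reg_a, reg_b = SimRegister(32), SimRegister(32)
bits = [int(b) for b in format(0xDEADBEEF, "032b")]
slow = reg_a.serial_shift_in(bits)          # 32 simulated cycles
fast = reg_b.direct_write(0xDEADBEEF)       # 1 step, identical register value
assert reg_a.value == reg_b.value
print(f"serial: {slow} cycles, direct: {fast} step")
```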

  10. CTBT Integrated Verification System Evaluation Model

    Energy Technology Data Exchange (ETDEWEB)

    Edenburn, M.W.; Bunting, M.L.; Payne, A.C. Jr.

    1997-10-01

    Sandia National Laboratories has developed a computer-based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994 by Sandia's Monitoring Systems and Technology Center and has been funded by the US Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, top-level modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection) and location accuracy of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. This report describes version 1.2 of IVSEM.
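
    The IVSEM abstracts do not spell out how the per-technology results are merged into a system-level estimate. The sketch below shows one simple way detection probabilities from independent subsystems could be combined; it is an illustrative assumption only, not IVSEM's actual algorithm, and the per-technology probabilities are hypothetical.

```python
# Illustrative sketch: combining per-technology detection probabilities into a
# system-level estimate, assuming the subsystems detect independently.
# IVSEM's actual models are far more detailed (location accuracy, identification,
# medium interfaces, evasion scenarios); this only shows the basic synergy idea.

def combined_detection(p_subsystems):
    """P(at least one subsystem detects), under the independence assumption."""
    p_miss = 1.0
    for p in p_subsystems:
        p_miss *= (1.0 - p)
    return 1.0 - p_miss

# Hypothetical per-technology detection probabilities for one event scenario.
p = {"seismic": 0.80, "infrasound": 0.40, "radionuclide": 0.55, "hydroacoustic": 0.10}
print(f"integrated detection probability: {combined_detection(p.values()):.3f}")
```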

  11. Automated radiotherapy treatment plan integrity verification

    Energy Technology Data Exchange (ETDEWEB)

    Yang Deshan; Moore, Kevin L. [Department of Radiation Oncology, School of Medicine, Washington University in Saint Louis, St. Louis, Missouri 63110 (United States)

    2012-03-15

    Purpose: In our clinic, physicists spend from 15 to 60 min to verify the physical and dosimetric integrity of radiotherapy plans before presentation to radiation oncology physicians for approval. The purpose of this study was to design and implement a framework to automate as many elements of this quality control (QC) step as possible. Methods: A comprehensive computer application was developed to carry out a majority of these verification tasks in the Philips PINNACLE treatment planning system (TPS). This QC tool functions based on both PINNACLE scripting elements and PERL sub-routines. The core of this technique is the method of dynamic scripting, which involves a PERL programming module that is flexible and powerful for treatment plan data handling. Run-time plan data are collected, saved into temporary files, and analyzed against standard values and predefined logical rules. The results were summarized in a hypertext markup language (HTML) report that is displayed to the user. Results: This tool has been in clinical use for over a year. The occurrence frequency of technical problems, which would cause delays and suboptimal plans, has been reduced since clinical implementation. Conclusions: In addition to drastically reducing the set of human-driven logical comparisons, this QC tool also accomplished some tasks that are otherwise either quite laborious or impractical for humans to verify, e.g., identifying conflicts amongst IMRT optimization objectives.
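
    The published tool is built on Pinnacle scripting plus Perl, but the rule-checking idea can be sketched generically. The example below uses hypothetical plan fields, rules, and thresholds (none are taken from the paper) to check run-time plan data against predefined logical rules and write a small HTML report.

```python
# Minimal sketch of rule-based plan integrity checking with an HTML summary.
# Field names, rules, and thresholds are hypothetical; the published tool uses
# Pinnacle scripting plus Perl and a much larger rule set.

plan = {                       # run-time plan data, normally read from the TPS
    "prescription_dose_Gy": 60.0,
    "fractions": 30,
    "dose_grid_mm": 4.0,
    "couch_removed": False,
}

rules = [
    ("dose per fraction 1.8-2.2 Gy",
     lambda p: 1.8 <= p["prescription_dose_Gy"] / p["fractions"] <= 2.2),
    ("dose grid resolution <= 3 mm",
     lambda p: p["dose_grid_mm"] <= 3.0),
    ("couch removed from external contour",
     lambda p: p["couch_removed"]),
]

rows = []
for name, check in rules:
    status = "PASS" if check(plan) else "FAIL"
    rows.append(f"<tr><td>{name}</td><td>{status}</td></tr>")

html = ("<html><body><h2>Plan QC report</h2><table border='1'>"
        "<tr><th>Rule</th><th>Result</th></tr>" + "".join(rows) +
        "</table></body></html>")

with open("plan_qc_report.html", "w") as f:   # report shown to the reviewer
    f.write(html)
```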

  12. Automated radiotherapy treatment plan integrity verification

    International Nuclear Information System (INIS)

    Yang Deshan; Moore, Kevin L.

    2012-01-01

    Purpose: In our clinic, physicists spend from 15 to 60 min to verify the physical and dosimetric integrity of radiotherapy plans before presentation to radiation oncology physicians for approval. The purpose of this study was to design and implement a framework to automate as many elements of this quality control (QC) step as possible. Methods: A comprehensive computer application was developed to carry out a majority of these verification tasks in the Philips PINNACLE treatment planning system (TPS). This QC tool functions based on both PINNACLE scripting elements and PERL sub-routines. The core of this technique is the method of dynamic scripting, which involves a PERL programming module that is flexible and powerful for treatment plan data handling. Run-time plan data are collected, saved into temporary files, and analyzed against standard values and predefined logical rules. The results were summarized in a hypertext markup language (HTML) report that is displayed to the user. Results: This tool has been in clinical use for over a year. The occurrence frequency of technical problems, which would cause delays and suboptimal plans, has been reduced since clinical implementation. Conclusions: In addition to drastically reducing the set of human-driven logical comparisons, this QC tool also accomplished some tasks that are otherwise either quite laborious or impractical for humans to verify, e.g., identifying conflicts amongst IMRT optimization objectives.

  13. Data storage accounting and verification at LHC experiments

    Energy Technology Data Exchange (ETDEWEB)

    Huang, C. H. [Fermilab; Lanciotti, E. [CERN; Magini, N. [CERN; Ratnikova, N. [Moscow, ITEP; Sanchez-Hernandez, A. [CINVESTAV, IPN; Serfon, C. [Munich U.; Wildish, T. [Princeton U.; Zhang, X. [Beijing, Inst. High Energy Phys.

    2012-01-01

    All major experiments at the Large Hadron Collider (LHC) need to measure real storage usage at the Grid sites. This information is equally important for resource management, planning, and operations. To verify the consistency of central catalogs, experiments are asking sites to provide a full list of the files they have on storage, including size, checksum, and other file attributes. Such storage dumps, provided at regular intervals, give a realistic view of the storage resource usage by the experiments. Regular monitoring of the space usage and data verification serve as additional internal checks of the system integrity and performance. Both the importance and the complexity of these tasks increase with the constant growth of the total data volumes during the active data taking period at the LHC. The use of common solutions helps to reduce the maintenance costs, both at the large Tier1 facilities supporting multiple virtual organizations and at the small sites that often lack manpower. We discuss requirements and solutions to the common tasks of data storage accounting and verification, and present experiment-specific strategies and implementations used within the LHC experiments according to their computing models.
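
    A minimal sketch of the consistency check described here, comparing a site storage dump against the central catalog by file size and checksum, could look like the following. The file names, fields, and thresholds are hypothetical; the experiments' production tools differ.

```python
# Minimal sketch of catalog-vs-storage-dump consistency checking.
# Each record is (logical file name -> (size in bytes, checksum)). Fields and
# the dump format are hypothetical; production LHC tools are experiment-specific.

catalog = {
    "/store/data/run1/f1.root": (1_234_567, "ad0c1f2e"),
    "/store/data/run1/f2.root": (7_654_321, "9b8a7c6d"),
    "/store/data/run1/f3.root": (1_111_111, "00ff00ff"),
}
storage_dump = {
    "/store/data/run1/f1.root": (1_234_567, "ad0c1f2e"),   # consistent
    "/store/data/run1/f2.root": (7_654_000, "9b8a7c6d"),   # size mismatch
    "/store/data/run1/f4.root": (2_222_222, "12345678"),   # dark data (not catalogued)
}

missing = sorted(set(catalog) - set(storage_dump))          # catalogued but not on storage
dark = sorted(set(storage_dump) - set(catalog))             # on storage but not catalogued
mismatched = sorted(lfn for lfn in set(catalog) & set(storage_dump)
                    if catalog[lfn] != storage_dump[lfn])   # size/checksum disagreement

used_bytes = sum(size for size, _ in storage_dump.values())
print(f"missing: {missing}")
print(f"dark: {dark}")
print(f"mismatched: {mismatched}")
print(f"site usage: {used_bytes / 1e6:.1f} MB")
```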

  14. Data storage accounting and verification in LHC experiments

    CERN Document Server

    Ratnikova, Natalia

    2012-01-01

    All major experiments at the Large Hadron Collider (LHC) need to measure real storage usage at the Grid sites. This information is equally important for resource management, planning, and operations. To verify the consistency of the central catalogs, experiments are asking sites to provide a full list of the files they have on storage, including size, checksum, and other file attributes. Such storage dumps, provided at regular intervals, give a realistic view of the storage resource usage by the experiments. Regular monitoring of the space usage and data verification serve as additional internal checks of system integrity and performance. Both the importance and the complexity of these tasks increase with the constant growth of the total data volumes during the active data-taking period at the LHC. The common solutions developed help to reduce the maintenance costs both at the large Tier-1 facilities supporting multiple virtual organizations and at the small sites that often lack manpower. We discuss requirements...

  15. A study of compositional verification based IMA integration method

    Science.gov (United States)

    Huang, Hui; Zhang, Guoquan; Xu, Wanmeng

    2018-03-01

    The rapid development of avionics systems is driving the application of integrated modular avionics (IMA) systems. While IMA improves avionics system integration, it also increases the complexity of system test, so the IMA test method needs to be simplified. An IMA system provides a modular platform that runs multiple applications and shares processing resources. Compared with a federated avionics system, failures in an IMA system are harder to isolate, so the critical problem facing IMA system verification is how to test resources shared by multiple applications. For a simple avionics system, traditional test methods can readily cover the whole system, but for a large, highly integrated avionics system, exhaustive testing is impractical. This paper therefore applies compositional-verification theory to IMA system test, reducing the number of test processes and improving efficiency, and consequently lowering the cost of IMA system integration.

  16. Design Development and Verification of a System Integrated Modular PWR

    International Nuclear Information System (INIS)

    Kim, S.-H.; Kim, K. K.; Chang, M. H.; Kang, C. S.; Park, G.-C.

    2002-01-01

    Safety analyses for the SMART design have been performed, and the results demonstrate that the key safety parameters of the limiting design basis events do not violate the safety limits. Various fundamental thermal-hydraulic experiments were carried out during the design concept development to confirm the fundamental behavior of the major concepts of the SMART systems. Most technologies implemented in the SMART concept have been proven through the design and operation of existing PWRs. Advanced design features require tests to confirm the performance of the design and to produce data for design code verification. Tests including a core flow distribution test, a test of flow instability in the steam generator, a test of self-pressurizer performance, a two-phase critical flow test with non-condensable gases, and a high-temperature/high-pressure integral thermal-hydraulic test are currently under preparation, with the necessary equipment and facilities being installed. Performance tests for key parts of the MCP (Main Coolant Pump) and CEDM (Control Element Drive Mechanism) were performed, and mechanical performance tests for the reactor assembly and major primary components will also be carried out. A technical and economic evaluation of the commercialization of SMART was conducted by the Korean Nuclear Society (KNS) from August 2000 to July 2001; based on its results, the SMART technology is technically sound and has sufficient economic incentives to pursue further development. Upon completion of the basic design phase in March 2002, the SMART design/engineering verification phase will follow, conducting various separate-effect tests and comprehensive integral tests as well as construction of a one-fifth-scale pilot plant for demonstration of overall SMART performance. (author)

  17. Technical workshop on safeguards, verification technologies, and other related experience

    International Nuclear Information System (INIS)

    1998-01-01

    The aim of the Technical Workshop on safeguards was to encourage a clearer understanding of the IAEA Safeguards System, its origins and evolution, and the present state of the art. Presentations by IAEA officials and outside experts also examined other components of the non-proliferation regime, current practices and procedures, and future prospects. A series of presentations described the characteristics of the interaction between global and regional verification systems and described relevant past and present experience. The prominence given to such state-of-the-art verification technologies as environmental sampling, satellite imaging, and monitoring through remote and unattended techniques demonstrated, beyond any doubt, the essentially dynamic nature of verification. It is generally acknowledged that there have been major achievements in preventing the spread of nuclear weapons, but no verification system can in itself prevent proliferation.

  18. Technical workshop on safeguards, verification technologies, and other related experience

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-12-31

    The aim of the Technical Workshop on safeguards was to encourage a clearer understanding of the IAEA Safeguards System, its origins and evolution, and the present state of the art. Presentations by IAEA officials and outside experts also examined other components of the non-proliferation regime, current practices and procedures, and future prospects. A series of presentations described the characteristics of the interaction between global and regional verification systems and described relevant past and present experience. The prominence given to such state-of-the-art verification technologies as environmental sampling, satellite imaging, and monitoring through remote and unattended techniques demonstrated, beyond any doubt, the essentially dynamic nature of verification. It is generally acknowledged that there have been major achievements in preventing the spread of nuclear weapons, but no verification system can in itself prevent proliferation.

  19. Experiences in the formalisation and verification of medical protocols

    OpenAIRE

    Balser, Michael

    2003-01-01

    Experiences in the formalisation and verification of medical protocols / M. Balser ... - In: Artificial intelligence in medicine : 9th Conference on Artificial Intelligence in Medicine in Europe, AIME 2003, Protaras, Cyprus, October 18 - 22, 2003 ; proceedings / Michel Dojat ... (eds.). - Berlin u.a. : Springer, 2003. - S. 132-141. - (Lecture notes in computer science ; 2780 : Lecture notes in artificial intelligence)

  20. Integrated knowledge base tool for acquisition and verification of NPP alarm systems

    International Nuclear Information System (INIS)

    Park, Joo Hyun; Seong, Poong Hyun

    1998-01-01

    Knowledge acquisition and knowledge base verification are important activities in developing knowledge-based systems such as alarm processing systems. In this work, we developed an integrated tool for knowledge acquisition and verification of NPP alarm processing systems using the G2 tool. The tool integrates the document analysis method and the ECPN matrix analysis method for knowledge acquisition and knowledge verification, respectively. It enables knowledge engineers to perform their tasks consistently, from knowledge acquisition through knowledge verification.

  1. Verification and validation guidelines for high integrity systems. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Hecht, H.; Hecht, M.; Dinsmore, G.; Hecht, S.; Tang, D. [SoHaR, Inc., Beverly Hills, CA (United States)

    1995-03-01

    High integrity systems include all protective (safety and mitigation) systems for nuclear power plants, and also systems for which comparable reliability requirements exist in other fields, such as in the process industries, in air traffic control, and in patient monitoring and other medical systems. Verification aims at determining that each stage in the software development completely and correctly implements requirements that were established in a preceding phase, while validation determines that the overall performance of a computer system completely and correctly meets system requirements. Volume I of the report reviews existing classifications for high integrity systems and for the types of errors that may be encountered, and makes recommendations for verification and validation procedures, based on assumptions about the environment in which these procedures will be conducted. The final chapter of Volume I deals with a framework for standards in this field. Volume II contains appendices dealing with specific methodologies for system classification, for dependability evaluation, and for two software tools that can automate otherwise very labor intensive verification and validation activities.

  2. Verification and validation guidelines for high integrity systems. Volume 1

    International Nuclear Information System (INIS)

    Hecht, H.; Hecht, M.; Dinsmore, G.; Hecht, S.; Tang, D.

    1995-03-01

    High integrity systems include all protective (safety and mitigation) systems for nuclear power plants, and also systems for which comparable reliability requirements exist in other fields, such as in the process industries, in air traffic control, and in patient monitoring and other medical systems. Verification aims at determining that each stage in the software development completely and correctly implements requirements that were established in a preceding phase, while validation determines that the overall performance of a computer system completely and correctly meets system requirements. Volume I of the report reviews existing classifications for high integrity systems and for the types of errors that may be encountered, and makes recommendations for verification and validation procedures, based on assumptions about the environment in which these procedures will be conducted. The final chapter of Volume I deals with a framework for standards in this field. Volume II contains appendices dealing with specific methodologies for system classification, for dependability evaluation, and for two software tools that can automate otherwise very labor intensive verification and validation activities

  3. Verification and validation as an integral part of the development of digital systems for nuclear applications

    International Nuclear Information System (INIS)

    Straker, E.A.; Thomas, N.C.

    1983-01-01

    The nuclear industry's current attitude toward verification and validation (V and V) has been shaped by the experience gained to date. On the basis of this experience, V and V can effectively be applied as an integral part of digital system development for nuclear electric power applications. An overview of a typical approach for integrating V and V with system development is presented. This approach represents a balance between V and V as applied in the aerospace industry and the standard practice commonly applied within the nuclear industry today

  4. Enrichment Assay Methods Development for the Integrated Cylinder Verification System

    International Nuclear Information System (INIS)

    Smith, Leon E.; Misner, Alex C.; Hatchell, Brian K.; Curtis, Michael M.

    2009-01-01

    International Atomic Energy Agency (IAEA) inspectors currently perform periodic inspections at uranium enrichment plants to verify UF6 cylinder enrichment declarations. Measurements are typically performed with handheld high-resolution sensors on a sampling of cylinders taken to be representative of the facility's entire product-cylinder inventory. Pacific Northwest National Laboratory (PNNL) is developing a concept to automate the verification of enrichment plant cylinders to enable 100 percent product-cylinder verification and potentially, mass-balance calculations on the facility as a whole (by also measuring feed and tails cylinders). The Integrated Cylinder Verification System (ICVS) could be located at key measurement points to positively identify each cylinder, measure its mass and enrichment, store the collected data in a secure database, and maintain continuity of knowledge on measured cylinders until IAEA inspector arrival. The three main objectives of this FY09 project are summarized here and described in more detail in the report: (1) Develop a preliminary design for a prototype NDA system, (2) Refine PNNL's MCNP models of the NDA system, and (3) Procure and test key pulse-processing components. Progress against these tasks to date, and next steps, are discussed.

  5. Enrichment Assay Methods Development for the Integrated Cylinder Verification System

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Leon E.; Misner, Alex C.; Hatchell, Brian K.; Curtis, Michael M.

    2009-10-22

    International Atomic Energy Agency (IAEA) inspectors currently perform periodic inspections at uranium enrichment plants to verify UF6 cylinder enrichment declarations. Measurements are typically performed with handheld high-resolution sensors on a sampling of cylinders taken to be representative of the facility's entire product-cylinder inventory. Pacific Northwest National Laboratory (PNNL) is developing a concept to automate the verification of enrichment plant cylinders to enable 100 percent product-cylinder verification and potentially, mass-balance calculations on the facility as a whole (by also measuring feed and tails cylinders). The Integrated Cylinder Verification System (ICVS) could be located at key measurement points to positively identify each cylinder, measure its mass and enrichment, store the collected data in a secure database, and maintain continuity of knowledge on measured cylinders until IAEA inspector arrival. The three main objectives of this FY09 project are summarized here and described in more detail in the report: (1) Develop a preliminary design for a prototype NDA system, (2) Refine PNNL's MCNP models of the NDA system, and (3) Procure and test key pulse-processing components. Progress against these tasks to date, and next steps, are discussed.

  6. IAEA verification experiment at the Portsmouth Gaseous Diffusion Plant

    International Nuclear Information System (INIS)

    Gordon, D.M.; Subudhi, M.; Calvert, O.L.; Bonner, T.N.; Cherry, R.C.; Whiting, N.E.

    1998-01-01

    In April 1996, the United States (US) added the Portsmouth Gaseous Diffusion Plant to the list of facilities eligible for the application of International Atomic Energy Agency (IAEA) safeguards. At that time, the US proposed that the IAEA carry out a Verification Experiment at the plant with respect to the downblending of about 13 metric tons of highly enriched uranium (HEU) in the form of UF6. This material is part of the 226 metric tons of fissile material that President Clinton has declared to be excess to US national-security needs and which will be permanently withdrawn from the US nuclear stockpile. In September 1997, the IAEA agreed to carry out this experiment, and during the first three weeks of December 1997, the IAEA verified the design information concerning the downblending process. The plant has been subject to short-notice random inspections since December 17, 1997. This paper provides an overview of the Verification Experiment, the monitoring technologies used in the verification approach, and some of the experience gained to date

  7. Technical experiences of implementing a wireless tracking and facial biometric verification system for a clinical environment

    Science.gov (United States)

    Liu, Brent; Lee, Jasper; Documet, Jorge; Guo, Bing; King, Nelson; Huang, H. K.

    2006-03-01

    By implementing a tracking and verification system, clinical facilities can effectively monitor workflow and heighten information security amid today's growing demand for digital imaging informatics. This paper presents the technical design and implementation experiences encountered during the development of a Location Tracking and Verification System (LTVS) for a clinical environment. LTVS integrates facial biometrics with wireless tracking so that administrators can manage and monitor patients and staff through a web-based application. Implementation challenges fall into three main areas: 1) Development and Integration, 2) Calibration and Optimization of Wi-Fi Tracking System, and 3) Clinical Implementation. An initial prototype LTVS has been implemented within USC's Healthcare Consultation Center II Outpatient Facility, which currently has a fully digital imaging department environment with integrated HIS/RIS/PACS/VR (Voice Recognition).

  8. Verification of Monte Carlo transport codes by activation experiments

    OpenAIRE

    Chetvertkova, Vera

    2013-01-01

    With the increasing energies and intensities of heavy-ion accelerator facilities, the problem of an excessive activation of the accelerator components caused by beam losses becomes more and more important. Numerical experiments using Monte Carlo transport codes are performed in order to assess the levels of activation. The heavy-ion versions of the codes were released approximately a decade ago; verification is therefore needed to ensure that they give reasonable results. The present work is...

  9. A simple reliability block diagram method for safety integrity verification

    International Nuclear Information System (INIS)

    Guo Haitao; Yang Xianhui

    2007-01-01

    IEC 61508 requires safety integrity verification of safety-related systems as a necessary procedure in the safety life cycle. PFD_avg (the average probability of failure on demand) must be calculated to verify the safety integrity level (SIL). Since IEC 61508-6 does not give detailed explanations of the definitions and PFD_avg calculations for its examples, it is difficult for reliability or safety engineers to follow when they use the standard as guidance in practice. A method using reliability block diagrams is investigated in this study in order to provide a clear and feasible way of calculating PFD_avg and to help those who take IEC 61508-6 as their guidance. The method first finds the mean down times (MDTs) of both the channel and the voted group, and then PFD_avg. The calculated results for various voted groups are compared with those in IEC 61508-6 and in Ref. [Zhang T, Long W, Sato Y. Availability of systems with self-diagnostic components: applying Markov model to IEC 61508-6. Reliab Eng Syst Saf 2003;80(2):133-41]. An interesting outcome emerges from the comparison: although differences in the MDT of voted groups exist between IEC 61508-6 and this paper, the PFD_avg values of the voted groups are comparatively close. With its detailed description, the RBD method presented can be applied to quantitative SIL verification, showing similarity to the method in IEC 61508-6.
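
    For orientation, the simplified PFD_avg expressions to which IEC 61508-6-style calculations reduce, when common-cause failures, diagnostics, and repair terms are neglected, fit in a few lines. This is a hedged sketch of the familiar textbook approximations, not the paper's RBD/MDT derivation, and the failure rate and test interval below are assumed values.

```python
# Simplified average probability of failure on demand (PFD_avg) for common
# voted groups, neglecting common-cause failures, diagnostic coverage and
# repair terms. Textbook approximations for orientation only; the paper's RBD
# method derives channel and voted-group mean down times in detail.

def pfd_avg(lambda_du, proof_test_interval_h, architecture="1oo1"):
    x = lambda_du * proof_test_interval_h   # expected dangerous undetected failures per interval
    if architecture == "1oo1":
        return x / 2.0
    if architecture == "1oo2":
        return x ** 2 / 3.0
    if architecture == "2oo3":
        return x ** 2
    raise ValueError("unsupported architecture")

lam = 2.0e-6          # dangerous undetected failure rate, per hour (assumed)
ti = 8760.0           # one-year proof test interval, hours (assumed)
for arch in ("1oo1", "1oo2", "2oo3"):
    print(arch, f"PFD_avg = {pfd_avg(lam, ti, arch):.2e}")
# A PFD_avg in [1e-2, 1e-1) corresponds to SIL 1, [1e-3, 1e-2) to SIL 2, and so on.
```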

  10. Verification of industrial x-ray machine: MINTs experience

    International Nuclear Information System (INIS)

    Aziz Amat; Saidi Rajab; Eesan Pasupathi; Saipo Bahari Abdul Ratan; Shaharudin Sayuti; Abd Nassir Ibrahim; Abd Razak Hamzah

    2005-01-01

    Industrial x-ray equipment must meet the radiation and electrical safety requirements of the Atomic Energy Licensing Board (AELB) guidelines (LEM/TEK/42) at the time of installation, and periodic verification must be ensured thereafter. The purpose of the guide is to explain the requirements for testing industrial x-ray apparatus and certifying that it meets local legislation and regulations. Verification aims to provide safety assurance information on the electrical requirements and on minimizing radiation exposure to the operator. The regulation applies to new models imported into the Malaysian market. Since June 1997, the Malaysian Institute for Nuclear Technology Research (MINT) has been approved by AELB to provide verification services to private companies, government, and corporate bodies throughout Malaysia. In early January 1997, AELB made it mandatory that all x-ray equipment for industrial purposes (especially industrial radiography) fulfill certain performance tests based on the LEM/TEK/42 guidelines. MINT, as the third-party verifier, encourages users to improve the maintenance of the equipment. MINT's experience in measuring the performance of intermittent- and continuous-duty-rating single-phase industrial x-ray machines in 2004 indicated that all of the irradiating apparatus tested passed and met the requirements of the guideline. From MINT records for 1997 to 2005, three x-ray models did not meet the requirements and were thus not allowed to be used unless the manufacturers were willing to modify them to meet the AELB requirements. These verification procedures on the electrical and radiation safety of industrial x-ray equipment have significantly improved maintenance culture and safety awareness in the use of x-ray apparatus in the industrial environment. (Author)

  11. Integrated Design Validation: Combining Simulation and Formal Verification for Digital Integrated Circuits

    Directory of Open Access Journals (Sweden)

    Lun Li

    2006-04-01

    The correct design of complex hardware continues to challenge engineers. Bugs in a design that are not uncovered in early design stages can be extremely expensive. Simulation is the predominant tool used to validate a design in industry. Formal verification overcomes the weakness of exhaustive simulation by applying mathematical methodologies to validate a design. The work described here focuses upon a technique that integrates the best characteristics of both simulation and formal verification methods to provide an effective design validation tool, referred to as Integrated Design Validation (IDV). The novelty of this approach consists of three components: circuit complexity analysis, partitioning based on design hierarchy, and coverage analysis. The circuit complexity analyzer and partitioning decompose a large design into sub-components and feed the sub-components to different verification and/or simulation tools based upon the known strengths of modern verification and simulation tools. The coverage analysis unit computes the coverage of design validation and improves the coverage by further partitioning. Various simulation and verification tools comprising IDV are evaluated, and an example is used to illustrate the overall validation process. The overall process successfully validates the example to a high coverage rate within a short time. The experimental result shows that our approach is a very promising design validation method.

  12. Top-down design and verification methodology for analog mixed-signal integrated circuits

    NARCIS (Netherlands)

    Beviz, P.

    2016-01-01

    This report introduces a novel top-down design and verification methodology for AMS integrated circuits. With the introduction of the new design and verification flow, more reliable and efficient development of AMS ICs is possible. The assignment incorporated research on the

  13. Verification of Monte Carlo transport codes by activation experiments

    Energy Technology Data Exchange (ETDEWEB)

    Chetvertkova, Vera

    2012-12-18

    With the increasing energies and intensities of heavy-ion accelerator facilities, the problem of an excessive activation of the accelerator components caused by beam losses becomes more and more important. Numerical experiments using Monte Carlo transport codes are performed in order to assess the levels of activation. The heavy-ion versions of the codes were released approximately a decade ago; verification is therefore needed to ensure that they give reasonable results. The present work is focused on obtaining experimental data on the activation of targets by heavy-ion beams. Several experiments were performed at GSI Helmholtzzentrum fuer Schwerionenforschung. The interaction of nitrogen, argon and uranium beams with aluminum targets, as well as the interaction of nitrogen and argon beams with copper targets, was studied. After irradiation of the targets by different ion beams from the SIS18 synchrotron at GSI, γ-spectroscopy analysis was performed: the γ-spectra of the residual activity were measured, the radioactive nuclides were identified, and their amounts and depth distributions were determined. The experimental results were compared with the results of Monte Carlo simulations using FLUKA, MARS and SHIELD. The discrepancies and agreements between experiment and simulations are pointed out, and the origin of the discrepancies is discussed. The obtained results allow for a better verification of the Monte Carlo transport codes and also provide information for their further development. The necessity of activation studies for accelerator applications is discussed. The limits of applicability of the heavy-ion beam-loss criteria were studied using the FLUKA code. FLUKA simulations were done to determine the materials most preferable, from the radiation protection point of view, for use in accelerator components.

  14. Expose : procedure and results of the joint experiment verification tests

    Science.gov (United States)

    Panitz, C.; Rettberg, P.; Horneck, G.; Rabbow, E.; Baglioni, P.

    The International Space Station will carry the EXPOSE facility accommodated at the universal workplace URM-D located outside the Russian Service Module. The launch will be effected in 2005, and the facility is planned to stay in space for 1.5 years. The tray-like structure will accommodate 2 chemical and 6 biological PI-experiments or experiment systems of the ROSE (Response of Organisms to Space Environment) consortium. EXPOSE will support long-term in situ studies of microbes in artificial meteorites, as well as of microbial communities from special ecological niches, such as endolithic and evaporitic ecosystems. The experiment pockets, either vented or sealed, will be covered by an optical filter system to control the intensity and spectral range of solar UV irradiation. Control of sun exposure will be achieved by the use of individual shutters. To test the compatibility of the different biological systems and their adaptation to the opportunities and constraints of space conditions, a thorough ground support program has been developed. The procedure and first results of these joint Experiment Verification Tests (EVT) will be presented. The tests are essential for the success of the EXPOSE mission and have been carried out in parallel with the development and construction of the final hardware design of the facility. The results of the mission will contribute to the understanding of organic chemistry processes in space, biological adaptation strategies to extreme conditions, e.g. on early Earth and Mars, and the distribution of life beyond its planet of origin.

  15. Experience in non-proliferation verification: The Treaty of Rarotonga

    International Nuclear Information System (INIS)

    Walker, R.A.

    1998-01-01

    The verification provisions of the Treaty of Rarotonga fall into two categories: those performed by the IAEA and those performed by other entities. A final provision of the Treaty of Rarotonga is relevant to IAEA safeguards in that it supports the continued effectiveness of the international non-proliferation system based on the Non-Proliferation Treaty and the IAEA safeguards system. The non-IAEA verification process is described as well

  16. The joint verification experiments as a global non-proliferation exercise

    International Nuclear Information System (INIS)

    Shaner, J.W.

    1998-01-01

    This conference commemorates the 10th anniversary of the second of two Joint Verification Experiments conducted by the Soviet Union and the US. These two experiments, one at the Nevada Test Site in the US and the second here at the Semipalatinsk test site, were designed to test the verification of a nuclear testing treaty limiting the size of underground explosions to 150 kilotons. By building trust and technical respect between the weapons scientists of the two most powerful adversaries, the Joint Verification Experiment (JVE) had the unanticipated result of initiating a suite of cooperative projects and programs aimed at reducing Cold War threats and preventing the proliferation of weapons of mass destruction

  17. 78 FR 32010 - Pipeline Safety: Public Workshop on Integrity Verification Process

    Science.gov (United States)

    2013-05-28

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No... Hazardous Materials Safety Administration, DOT. ACTION: Notice of public meeting. SUMMARY: This notice is announcing a public workshop to be held on the concept of "Integrity Verification Process." The Integrity...

  18. A service-oriented architecture for integrating the modeling and formal verification of genetic regulatory networks

    Directory of Open Access Journals (Sweden)

    Page Michel

    2009-12-01

    Background: The study of biological networks has led to the development of increasingly large and detailed models. Computer tools are essential for the simulation of the dynamical behavior of the networks from the model. However, as the size of the models grows, it becomes infeasible to manually verify the predictions against experimental data or identify interesting features in a large number of simulation traces. Formal verification based on temporal logic and model checking provides promising methods to automate and scale the analysis of the models. However, a framework that tightly integrates modeling and simulation tools with model checkers is currently missing, on both the conceptual and the implementational level. Results: We have developed a generic and modular web service, based on a service-oriented architecture, for integrating the modeling and formal verification of genetic regulatory networks. The architecture has been implemented in the context of the qualitative modeling and simulation tool GNA and the model checkers NUSMV and CADP. GNA has been extended with a verification module for the specification and checking of biological properties. The verification module also allows the display and visual inspection of the verification results. Conclusions: The practical use of the proposed web service is illustrated by means of a scenario involving the analysis of a qualitative model of the carbon starvation response in E. coli. The service-oriented architecture allows modelers to define the model and proceed with the specification and formal verification of the biological properties by means of a unified graphical user interface. This guarantees a transparent access to formal verification technology for modelers of genetic regulatory networks.
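
    As a toy illustration of the kind of property checking such a service exposes, the sketch below verifies a reachability-style temporal property over a tiny qualitative state-transition graph. The model and property are hypothetical and vastly simpler than what GNA with NUSMV/CADP handles.

```python
# Toy sketch of checking a temporal property on a qualitative model: from the
# initial state, is a state where gene A is low and gene B is high eventually
# reachable (an EF-style property)? Model and property are hypothetical.

from collections import deque

# Qualitative states: (gene_A_level, gene_B_level) with levels "low"/"high".
transitions = {
    ("high", "low"): [("high", "high"), ("low", "low")],
    ("high", "high"): [("low", "high")],
    ("low", "low"): [("low", "low")],
    ("low", "high"): [("low", "high")],
}

def eventually(initial, goal_predicate):
    """EF(goal): some execution path reaches a state satisfying the predicate."""
    seen, queue = {initial}, deque([initial])
    while queue:
        state = queue.popleft()
        if goal_predicate(state):
            return True
        for nxt in transitions.get(state, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return False

print(eventually(("high", "low"), lambda s: s == ("low", "high")))   # True
```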

  19. Technology Integration Experiences of Teachers

    Science.gov (United States)

    Çoklar, Ahmet Naci; Yurdakul, Isil Kabakçi

    2017-01-01

    Teachers are important providers of educational sustainability. Teachers' ability to adapt themselves to rapidly developing technologies applicable to learning environments is connected with technology integration. The purpose of this study is to investigate teachers' technology integration experiences in the course of learning and teaching…

  20. Verification of the Korsar code on results of experiments executed on the PSB-VVER facility

    International Nuclear Information System (INIS)

    Roginskaya, V.L.; Pylev, S.S.; Elkin, I.V.

    2005-01-01

    Full text of publication follows: This paper presents some results of computational research executed within the framework of the verification of the KORSAR thermal-hydraulic code. The code was designed at the NITI named after A.P. Aleksandrov (Russia). The general purpose of the work was the development of a nodalization scheme of the PSB-VVER integral facility, scheme testing, and computational modelling of the experiment 'The PSB-VVER Natural Circulation Test With Stepwise Reduction of the Primary Inventory'. The NC test was performed within the framework of the OECD PSB-VVER Project (task no. 3). This Project is focused upon the provision of experimental data for code assessment with regard to VVER analysis. The paper presents the nodalization scheme of the PSB-VVER facility and results of pre- and post-test calculations of the specified experiment, obtained with the KORSAR code. The experiment data and the KORSAR pre-test calculation results are in good agreement. A post-test calculation of the experiment with the KORSAR code was performed in order to assess the code's capability to simulate the phenomena relevant to the test. The code showed a reasonable prediction of the phenomena measured in the experiment. (authors)

  1. Verification of the Korsar code on results of experiments executed on the PSB-VVER facility

    Energy Technology Data Exchange (ETDEWEB)

    Roginskaya, V.L.; Pylev, S.S.; Elkin, I.V. [NSI RRC ' Kurchatov Institute' , Kurchatov Sq., 1, Moscow, 123182 (Russian Federation)

    2005-07-01

    Full text of publication follows: This paper presents some results of computational research executed within the framework of the verification of the KORSAR thermal-hydraulic code. The code was designed at the NITI named after A.P. Aleksandrov (Russia). The general purpose of the work was the development of a nodalization scheme of the PSB-VVER integral facility, scheme testing, and computational modelling of the experiment 'The PSB-VVER Natural Circulation Test With Stepwise Reduction of the Primary Inventory'. The NC test was performed within the framework of the OECD PSB-VVER Project (task no. 3). This Project is focused upon the provision of experimental data for code assessment with regard to VVER analysis. The paper presents the nodalization scheme of the PSB-VVER facility and results of pre- and post-test calculations of the specified experiment, obtained with the KORSAR code. The experiment data and the KORSAR pre-test calculation results are in good agreement. A post-test calculation of the experiment with the KORSAR code was performed in order to assess the code's capability to simulate the phenomena relevant to the test. The code showed a reasonable prediction of the phenomena measured in the experiment. (authors)

  2. Engineering within the assembly, verification, and integration (AIV) process in ALMA

    Science.gov (United States)

    Lopez, Bernhard; McMullin, Joseph P.; Whyborn, Nicholas D.; Duvall, Eugene

    2010-07-01

    The Atacama Large Millimeter/submillimeter Array (ALMA) is a joint project between astronomical organizations in Europe, North America, and East Asia, in collaboration with the Republic of Chile. ALMA will consist of at least 54 twelve-meter antennas and 12 seven-meter antennas operating as an interferometer in the millimeter and sub-millimeter wavelength range. It will be located at an altitude above 5000 m in the Chilean Atacama desert. As part of the ALMA construction phase, the Assembly, Verification and Integration (AIV) team receives antennas and instrumentation from Integrated Product Teams (IPTs), verifies that the sub-systems perform as expected, performs the assembly and integration of the scientific instrumentation, and verifies that functional and performance requirements are met. This paper aims to describe those aspects related to the AIV Engineering team, its role within the 4-station AIV process, the different phases the group underwent, lessons learned, and potential space for improvement. AIV Engineering initially focused on the preparation of the necessary site infrastructure for AIV activities, on the purchase of tools and equipment, and on the first ALMA system installations. With the first antennas arriving on site, the team started to gather experience with AIV Station 1 beacon holography measurements for the assessment of the overall antenna surface quality, and with optical pointing to confirm the antenna pointing and tracking capabilities. With the arrival of the first receiver, AIV Station 2 was developed, which focuses on the installation of electrical and cryogenic systems and incrementally establishes the full connectivity of the antenna as an observing platform. Further antenna deliveries then allowed the team to refine the related procedures, develop staff expertise, and transition towards a more routine production process. Stations 3 and 4 deal with verification of the antenna with integrated electronics by the AIV Science Team and are not covered

  3. Verification and clarification of patterns of sensory integrative dysfunction.

    Science.gov (United States)

    Mailloux, Zoe; Mulligan, Shelley; Roley, Susanne Smith; Blanche, Erna; Cermak, Sharon; Coleman, Gina Geppert; Bodison, Stefanie; Lane, Christianne Joy

    2011-01-01

    Building on established relationships between the constructs of sensory integration in typical and special needs populations, in this retrospective study we examined patterns of sensory integrative dysfunction in 273 children ages 4-9 who had received occupational therapy evaluations in two private practice settings. Test results on the Sensory Integration and Praxis Tests, portions of the Sensory Processing Measure representing tactile overresponsiveness, and parent reports of attention and activity level were included in the analyses. Exploratory factor analysis identified patterns similar to those found in early studies by Ayres (1965, 1966a, 1966b, 1969, 1972b, 1977, & 1989), namely Visuodyspraxia and Somatodyspraxia, Vestibular and Proprioceptive Bilateral Integration and Sequencing, Tactile and Visual Discrimination, and Tactile Defensiveness and Attention. Findings reinforce associations between constructs of sensory integration and assist with understanding sensory integration disorders that may affect childhood occupation. Limitations include the potential for subjective interpretation in factor analysis and the inability to adjust the measures available in charts in retrospective research.

  4. Experimental Verification of a Vehicle Localization based on Moving Horizon Estimation Integrating LRS and Odometry

    International Nuclear Information System (INIS)

    Sakaeta, Kuniyuki; Nonaka, Kenichiro; Sekiguchi, Kazuma

    2016-01-01

    Localization is an important function for robots to complete various tasks. For localization, both internal and external sensors are generally used. Odometry, based on internal sensors, is widely used, but it suffers from cumulative errors. In methods using a laser range sensor (LRS), a kind of external sensor, the estimation accuracy depends on the number of available measurement data. In our previous study, we applied moving horizon estimation (MHE) to vehicle localization, integrating the LRS measurement data and the odometry information with relative weightings that adapt to the number of available LRS measurements. In this paper, the effectiveness of the proposed localization method is verified through both numerical simulations and experiments using a 1/10 scale vehicle. The verification is conducted in situations where the vehicle position cannot be uniquely localized along a certain direction using the LRS measurement data alone. We achieve accurate localization even in such a situation by integrating the odometry and LRS based on MHE. We also show the superiority of the method through comparisons with a method using an extended Kalman filter (EKF). (paper)
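
    The abstract above describes balancing odometry against a variable number of LRS measurements inside a moving horizon estimator. The following Python sketch illustrates only that weighting idea; the planar motion model, the landmark map, the measured values and the weights are illustrative assumptions, not the authors' implementation.

        import numpy as np
        from scipy.optimize import least_squares

        # Illustrative planar vehicle; poses over a 5-step horizon are estimated.
        landmarks = np.array([[5.0, 0.0], [5.0, 4.0]])        # assumed map (hypothetical)
        odometry = np.array([[1.0, 0.0]] * 4)                 # measured step displacements
        lrs_ranges = [np.array([5.0, 6.4]),                   # ranges to the two landmarks
                      np.array([4.0, 5.7]),
                      None,                                   # scan dropped: no LRS data
                      np.array([2.0, 4.5]),
                      np.array([1.0, 4.1])]

        def residuals(flat, w_odo, w_lrs_base):
            x = flat.reshape(-1, 2)                           # poses over the horizon
            res = []
            for k in range(1, len(x)):                        # odometry residuals
                res.extend(w_odo * (x[k] - x[k - 1] - odometry[k - 1]))
            for k, ranges in enumerate(lrs_ranges):           # LRS range residuals
                if ranges is None:
                    continue
                w_lrs = w_lrs_base * len(ranges)              # weight grows with data count
                predicted = np.linalg.norm(landmarks - x[k], axis=1)
                res.extend(w_lrs * (predicted - ranges))
            return np.array(res)

        x0 = np.cumsum(np.vstack([[0.0, 0.0], odometry]), axis=0)   # dead-reckoning guess
        sol = least_squares(residuals, x0.ravel(), args=(1.0, 0.5))
        print(sol.x.reshape(-1, 2))                           # smoothed pose estimates

    Scaling the LRS weight with the number of returned ranges mimics the adaptive balance described in the abstract: when a scan is dropped, the estimate for that step falls back on odometry alone.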

  5. A Scheme for Verification on Data Integrity in Mobile Multicloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Laicheng Cao

    2016-01-01

    Full Text Available In order to verify data integrity in the mobile multicloud computing environment, an MMCDIV (mobile multicloud data integrity verification) scheme is proposed. First, the computability and nondegeneracy of verification are obtained by adopting the BLS (Boneh-Lynn-Shacham) short signature scheme. Second, communication overhead is reduced based on HVR (Homomorphic Verifiable Response) with random masking and an sMHT (sequence-enforced Merkle hash tree) construction. Finally, considering the resource constraints of mobile devices, data integrity is verified by lightweight computing and low data transmission. The scheme addresses the limited communication and computing power of mobile devices, supports dynamic data operations in the mobile multicloud environment, and allows data integrity to be verified without direct access to the source file blocks. Experimental results also demonstrate that this scheme can achieve a lower cost of computing and communications.
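
    The sMHT construction mentioned in the abstract above builds on a Merkle hash tree over the outsourced data blocks. The sketch below is a generic Merkle-root computation in Python using only hashlib; it is a plain tree, not the paper's sequence-enforced variant, and it omits the BLS signatures and homomorphic responses.

        import hashlib

        def h(data: bytes) -> bytes:
            return hashlib.sha256(data).digest()

        def merkle_root(blocks):
            """Build the root hash of a (plain) Merkle tree over data blocks."""
            level = [h(b) for b in blocks]
            while len(level) > 1:
                if len(level) % 2:            # duplicate last node on odd-sized levels
                    level.append(level[-1])
                level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
            return level[0]

        # The verifier stores only the root; later, a challenged block set is re-hashed
        # and compared against it to detect corruption of the outsourced data.
        blocks = [b"block-0", b"block-1", b"block-2", b"block-3"]
        root = merkle_root(blocks)
        assert merkle_root(blocks) == root                       # unchanged data verifies
        assert merkle_root([b"tampered"] + blocks[1:]) != root   # corruption is detected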

  6. Application of Integrated Verification Approach to FPGA-based Safety-Critical I and C System of Nuclear Power Plant

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed, Ibrahim; Heo, Gyunyoung [Kyunghee Univ., Yongin (Korea, Republic of); Jung, Jaecheon [KEPCO, Ulsan (Korea, Republic of)

    2016-10-15

    The safety-critical instrumentation and control (I and C) system in a nuclear power plant (NPP), implemented on programmable logic controllers (PLCs), plays a vital role in the safe operation of the plant. Challenges such as rapid obsolescence, vulnerability to cyber-attack, and other issues related to software systems have led to the consideration of field programmable gate arrays (FPGAs) as an alternative to PLCs because of their advantages and hardware-related benefits. In FPGA design verification, designers generally write test benches covering the various verification stages: register-transfer level (RTL), gate level, and place and route. Writing these test benches is considerably time consuming and requires substantial effort to achieve satisfactory results, and performing verification at each stage is a major bottleneck that demands many activities and much time. Verification is arguably the most difficult and complicated aspect of any design, and an FPGA design is no exception. In view of this, this work applies an integrated verification approach to an FPGA-based I and C system in an NPP, in which the entire set of design modules is verified simultaneously using MATLAB/Simulink HDL co-simulation models, and discusses how this approach facilitates the verification and testing process. In conclusion, the results showed that the integrated verification approach through MATLAB/Simulink models, if applied to any design to be verified, can speed up design verification and reduce the V and V tasks.

  7. Application of Integrated Verification Approach to FPGA-based Safety-Critical I and C System of Nuclear Power Plant

    International Nuclear Information System (INIS)

    Ahmed, Ibrahim; Heo, Gyunyoung; Jung, Jaecheon

    2016-01-01

    The safety-critical instrumentation and control (I and C) system in a nuclear power plant (NPP), implemented on programmable logic controllers (PLCs), plays a vital role in the safe operation of the plant. Challenges such as rapid obsolescence, vulnerability to cyber-attack, and other issues related to software systems have led to the consideration of field programmable gate arrays (FPGAs) as an alternative to PLCs because of their advantages and hardware-related benefits. In FPGA design verification, designers generally write test benches covering the various verification stages: register-transfer level (RTL), gate level, and place and route. Writing these test benches is considerably time consuming and requires substantial effort to achieve satisfactory results, and performing verification at each stage is a major bottleneck that demands many activities and much time. Verification is arguably the most difficult and complicated aspect of any design, and an FPGA design is no exception. In view of this, this work applies an integrated verification approach to an FPGA-based I and C system in an NPP, in which the entire set of design modules is verified simultaneously using MATLAB/Simulink HDL co-simulation models, and discusses how this approach facilitates the verification and testing process. In conclusion, the results showed that the integrated verification approach through MATLAB/Simulink models, if applied to any design to be verified, can speed up design verification and reduce the V and V tasks

  8. Integrated Aero–Vibroacoustics: The Design Verification Process of Vega-C Launcher

    Directory of Open Access Journals (Sweden)

    Davide Bianco

    2018-01-01

    Full Text Available The verification of a space launcher at the design level is a complex issue because of (i) the lack of a detailed modeling capability of the acoustic pressure produced by the rocket; and (ii) the difficulties in applying deterministic methods to the large-scale metallic structures. In this paper, an innovative integrated design verification process is described, based on the bridging between a new semiempirical jet noise model and a hybrid finite-element method/statistical energy analysis (FEM/SEA) approach for calculating the acceleration produced at the payload and equipment level within the structure, vibrating under the external acoustic forcing field. The result is a verification method allowing for accurate prediction of the vibroacoustics in the launcher interior, using limited computational resources and without resorting to computational fluid dynamics (CFD) data. Some examples concerning the Vega-C launcher design are shown.

  9. Role of experiments in soil-structure interaction methodology verification

    International Nuclear Information System (INIS)

    Srinivasan, M.G.; Kot, C.A.; Hsieh, B.J.

    1986-01-01

    Different kinds of experimental data may be useful for partial or full verification of SSI analysis methods. The great bulk of existing data comes from earthquake records and dynamic testing of as-built structures. However, much of this data may not be suitable for the present purpose as the measurement locations were not selected with the verification of SSI analysis in mind and hence are too few in number or inappropriate in character. Data from scale model testing that include the soil in the model - both in-situ and laboratory - are relatively scarce. If the difficulty in satisfying the requirements of similitude laws on the one hand and simulating realistic soil behavior on the other can be resolved, scale model testing may generate very useful data for relatively low cost. The current NRC sponsored programs are expected to generate data very useful for verifying analysis methods for SSI. A systematic effort to inventory, evaluate and classify existing data is first necessary. This effort would probably show that more data is needed for the better understanding of SSI aspects such as spatial variation of ground motion and the related issue of foundation input motion, and soil stiffness. Collection of response data from in-structure and free field (surface and downhole) through instrumentation of selected as-built structures in seismically active regions may be the most efficient way to obtain the needed data. Augmentation of this data from properly designed scale model tests should also be considered

  10. Two-Level Verification of Data Integrity for Data Storage in Cloud Computing

    Science.gov (United States)

    Xu, Guangwei; Chen, Chunlin; Wang, Hongya; Zang, Zhuping; Pang, Mugen; Jiang, Ping

    Data storage in cloud computing can save capital expenditure and relieve the burden of storage management for users. As loss or corruption of stored files may occur, many researchers focus on the verification of data integrity. However, massive numbers of users bring large numbers of verification tasks to the auditor, and users must also pay an extra fee for these verification tasks beyond the storage fee. Therefore, we propose a two-level verification of data integrity to alleviate these problems. The key idea is that users routinely verify the data integrity themselves, while the auditor arbitrates challenges between the user and the cloud provider according to the MACs and ϕ values. Extensive performance simulations show that the proposed scheme markedly decreases the auditor's verification tasks and the ratio of wrong arbitrations.
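
    The user-side level of the scheme above amounts to routine MAC checks over stored blocks, with disagreements escalated to the auditor for arbitration. The following Python sketch shows such a user-side check with HMAC-SHA256; the block layout, key handling, and escalation rule are illustrative assumptions rather than the published protocol.

        import hmac, hashlib, secrets

        key = secrets.token_bytes(32)                    # user's secret MAC key

        def tag(block: bytes) -> bytes:
            return hmac.new(key, block, hashlib.sha256).digest()

        # At upload time the user keeps only the small tags locally.
        blocks = [b"chunk-%d" % i for i in range(4)]
        tags = [tag(b) for b in blocks]

        def user_level_check(stored_blocks):
            """Return indices of blocks whose MAC no longer matches; these are the
            candidates for escalation to the auditor, who arbitrates between the
            user and the cloud provider."""
            return [i for i, b in enumerate(stored_blocks)
                    if not hmac.compare_digest(tag(b), tags[i])]

        corrupted = blocks[:2] + [b"flipped bits"] + blocks[3:]
        print(user_level_check(blocks))      # [] -> nothing to escalate
        print(user_level_check(corrupted))   # [2] -> escalate block 2 to the auditor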

  11. Provenance based data integrity checking and verification in cloud environments.

    Science.gov (United States)

    Imran, Muhammad; Hlavacs, Helmut; Haq, Inam Ul; Jan, Bilal; Khan, Fakhri Alam; Ahmad, Awais

    2017-01-01

    Cloud computing is a recent trend in IT that moves computing and data away from desktop and hand-held devices into large scale processing hubs and data centers, respectively. It has been proposed as an effective solution for data outsourcing and on-demand computing to control the rising cost of IT setups and management in enterprises. However, with Cloud platforms the user's data is moved into remotely located storage such that users lose control over their data. This unique feature of the Cloud is facing many security and privacy challenges which need to be clearly understood and resolved. One of the important concerns that needs to be addressed is to provide the proof of data integrity, i.e., correctness of the user's data stored in the Cloud storage. The data in Clouds is physically not accessible to the users. Therefore, a mechanism is required where users can check if the integrity of their valuable data is maintained or compromised. For this purpose some methods are proposed like mirroring, checksumming and using third party auditors, amongst others. However, these methods use extra storage space by maintaining multiple copies of data or require the presence of a third party verifier. In this paper, we address the problem of proving data integrity in Cloud computing by proposing a scheme through which users are able to check the integrity of their data stored in Clouds. In addition, users can track the violation of data integrity if it occurs. For this purpose, we utilize a relatively new concept in Cloud computing called "Data Provenance". Our scheme is capable of reducing the need for any third party services, additional hardware support and the replication of data items on the client side for integrity checking.
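
    One way to picture the data-provenance idea in the abstract above is a hash-chained log of operations on the outsourced file, so that an integrity check can both detect a violation and locate the operation after which it occurred. The Python sketch below follows that picture under stated assumptions and is not the authors' actual protocol.

        import hashlib, json, time

        def digest(entry: dict) -> str:
            return hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()

        provenance = []                                  # chained provenance log

        def record(operation: str, data: bytes):
            entry = {"op": operation,
                     "data_hash": hashlib.sha256(data).hexdigest(),
                     "prev": provenance[-1]["self"] if provenance else None,
                     "time": time.time()}
            entry["self"] = digest(entry)                # hash of the record itself
            provenance.append(entry)

        def verify_chain(current_data: bytes) -> bool:
            """Check the chain links and that the latest record matches the data."""
            for i, e in enumerate(provenance):
                body = {k: v for k, v in e.items() if k != "self"}
                if digest(body) != e["self"]:
                    return False                         # the log itself was altered
                if i and e["prev"] != provenance[i - 1]["self"]:
                    return False                         # chain broken
            return provenance[-1]["data_hash"] == hashlib.sha256(current_data).hexdigest()

        record("create", b"v1 of the file")
        record("update", b"v2 of the file")
        print(verify_chain(b"v2 of the file"))           # True
        print(verify_chain(b"silently modified"))        # False -> integrity violation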

  12. Provenance based data integrity checking and verification in cloud environments.

    Directory of Open Access Journals (Sweden)

    Muhammad Imran

    Full Text Available Cloud computing is a recent trend in IT that moves computing and data away from desktop and hand-held devices into large scale processing hubs and data centers, respectively. It has been proposed as an effective solution for data outsourcing and on-demand computing to control the rising cost of IT setups and management in enterprises. However, with Cloud platforms the user's data is moved into remotely located storage such that users lose control over their data. This unique feature of the Cloud is facing many security and privacy challenges which need to be clearly understood and resolved. One of the important concerns that needs to be addressed is to provide the proof of data integrity, i.e., correctness of the user's data stored in the Cloud storage. The data in Clouds is physically not accessible to the users. Therefore, a mechanism is required where users can check if the integrity of their valuable data is maintained or compromised. For this purpose some methods are proposed like mirroring, checksumming and using third party auditors, amongst others. However, these methods use extra storage space by maintaining multiple copies of data or require the presence of a third party verifier. In this paper, we address the problem of proving data integrity in Cloud computing by proposing a scheme through which users are able to check the integrity of their data stored in Clouds. In addition, users can track the violation of data integrity if it occurs. For this purpose, we utilize a relatively new concept in Cloud computing called "Data Provenance". Our scheme is capable of reducing the need for any third party services, additional hardware support and the replication of data items on the client side for integrity checking.

  13. Provenance based data integrity checking and verification in cloud environments

    Science.gov (United States)

    Haq, Inam Ul; Jan, Bilal; Khan, Fakhri Alam; Ahmad, Awais

    2017-01-01

    Cloud computing is a recent trend in IT that moves computing and data away from desktop and hand-held devices into large scale processing hubs and data centers, respectively. It has been proposed as an effective solution for data outsourcing and on-demand computing to control the rising cost of IT setups and management in enterprises. However, with Cloud platforms the user’s data is moved into remotely located storage such that users lose control over their data. This unique feature of the Cloud is facing many security and privacy challenges which need to be clearly understood and resolved. One of the important concerns that needs to be addressed is to provide the proof of data integrity, i.e., correctness of the user’s data stored in the Cloud storage. The data in Clouds is physically not accessible to the users. Therefore, a mechanism is required where users can check if the integrity of their valuable data is maintained or compromised. For this purpose some methods are proposed like mirroring, checksumming and using third party auditors, amongst others. However, these methods use extra storage space by maintaining multiple copies of data or require the presence of a third party verifier. In this paper, we address the problem of proving data integrity in Cloud computing by proposing a scheme through which users are able to check the integrity of their data stored in Clouds. In addition, users can track the violation of data integrity if it occurs. For this purpose, we utilize a relatively new concept in Cloud computing called “Data Provenance”. Our scheme is capable of reducing the need for any third party services, additional hardware support and the replication of data items on the client side for integrity checking. PMID:28545151

  14. Integrated Disposal Facility FY 2016: ILAW Verification and Validation of the eSTOMP Simulator

    Energy Technology Data Exchange (ETDEWEB)

    Freedman, Vicky L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Bacon, Diana H. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Fang, Yilin [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-05-13

    This document describes two sets of simulations carried out to further verify and validate the eSTOMP simulator. In this report, a distinction is made between verification and validation, and the focus is on verifying eSTOMP through a series of published benchmarks on cementitious wastes, and validating eSTOMP based on a lysimeter experiment for the glassified waste. These activities are carried out within the context of a scientific view of validation that asserts that models can only be invalidated, and that model validation (and verification) is a subjective assessment.

  15. A New Integrated Threshold Selection Methodology for Spatial Forecast Verification of Extreme Events

    Science.gov (United States)

    Kholodovsky, V.

    2017-12-01

    Extreme weather and climate events such as heavy precipitation, heat waves and strong winds can cause extensive damage to society in terms of human lives and financial losses. As the climate changes, it is important to understand how extreme weather events may change as a result. Climate and statistical models are often used independently to model those phenomena. To better assess the performance of the climate models, a variety of spatial forecast verification methods have been developed. However, spatial verification metrics that are widely used for comparing mean states in most cases do not have an adequate theoretical justification for benchmarking extreme weather events. We propose a new integrated threshold selection methodology for spatial forecast verification of extreme events that couples existing pattern recognition indices with high threshold choices. This integrated approach has three main steps: 1) dimension reduction; 2) geometric domain mapping; and 3) threshold clustering. We apply this approach to an observed precipitation dataset over CONUS. The results are evaluated by displaying the threshold distribution seasonally, monthly and annually. The method offers the user the flexibility of selecting a high threshold that is linked to desired geometrical properties. The proposed high-threshold methodology could either complement existing spatial verification methods, where threshold selection is arbitrary, or be directly applicable in extreme value theory.
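
    The last step of the pipeline above, threshold clustering, can be pictured with a small sketch: high quantiles of precipitation are computed per grid cell and then grouped, and the user picks the cluster whose threshold level matches the desired geometric properties. The data, quantile levels and cluster count below are purely illustrative and do not reproduce the proposed methodology.

        import numpy as np
        from scipy.cluster.vq import kmeans2

        rng = np.random.default_rng(0)
        # Illustrative daily precipitation for 500 grid cells over 1000 days (mm/day).
        precip = rng.gamma(shape=0.5, scale=6.0, size=(1000, 500))

        # Candidate high thresholds: a set of upper quantiles per grid cell.
        quantile_levels = np.array([0.95, 0.99, 0.995])
        thresholds = np.quantile(precip, quantile_levels, axis=0).T   # shape (cells, 3)

        # Group cells by the shape of their extreme tail; each cluster suggests one
        # threshold level to apply over that part of the domain.
        centroids, labels = kmeans2(thresholds, 3, minit='points')
        for c in range(3):
            print(f"cluster {c}: {np.sum(labels == c)} cells, "
                  f"mean q99 threshold = {centroids[c, 1]:.1f} mm/day")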

  16. Research on Linux Trusted Boot Method Based on Reverse Integrity Verification

    Directory of Open Access Journals (Sweden)

    Chenlin Huang

    2016-01-01

    Full Text Available Trusted computing aims to build a trusted computing environment for information systems with the help of the secure hardware TPM, which has been proven to be an effective defense against network security threats. However, TPM chips are not yet widely deployed in computing devices, which limits the scope of application of trusted computing technology. To address the lack of trusted hardware on existing computing platforms, an alternative security hardware, USBKey, is introduced in this paper to simulate the basic functions of a TPM, and a new reverse USBKey-based integrity verification model is proposed to implement reverse integrity verification of the operating system boot process, which achieves a trusted boot of the operating system on end systems without TPMs. A Linux operating system booting method based on reverse integrity verification is designed and implemented in this paper, with which the integrity of data and executable files in the operating system is verified and protected phase by phase during the trusted boot process. It implements the trusted boot of the operating system without a TPM and supports remote attestation of the platform. Enhanced by our method, the flexibility of trusted computing technology is greatly improved and it becomes possible for trusted computing to be applied in large-scale computing environments.
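
    At the core of any trusted-boot scheme, forward or reverse, is a chain of hash measurements compared against expected values before control passes to the next stage. The Python sketch below illustrates that measure-and-compare step with in-memory stand-ins; in the scheme above the reference digests would be anchored in the USBKey (or a TPM) rather than on the possibly compromised disk, and the stage names and contents here are hypothetical.

        import hashlib

        # Stand-ins for the boot-stage images; in a real scheme these are read from
        # disk and the reference digests are held by the trusted hardware.
        stages = {
            "bootloader": b"bootloader image bytes",
            "kernel":     b"kernel image bytes",
            "initrd":     b"initrd image bytes",
        }
        reference = {name: hashlib.sha256(data).hexdigest() for name, data in stages.items()}

        def verify_stage(name: str, data: bytes) -> bool:
            ok = hashlib.sha256(data).hexdigest() == reference[name]
            print(f"{name}: {'OK' if ok else 'INTEGRITY FAILURE'}")
            return ok

        # Boot proceeds stage by stage only while every measurement matches.
        stages["kernel"] = b"tampered kernel"            # simulate an attack
        for name, data in stages.items():
            if not verify_stage(name, data):
                print("halt boot: integrity verification failed")
                break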

  17. Integration of KESS III models in ATHLET-CD and contributions to program verification. Final report

    International Nuclear Information System (INIS)

    Bruder, M.; Schatz, A.

    1994-07-01

    The development of the computer code ATHLET-CD is a contribution to reactor safety research. ATHLET-CD is an extension of the system code ATHLET with core degradation models, especially those of the modular software package KESS. The aim of the ATHLET-CD development is the simulation of severe accident sequences from their initiation to severe core degradation in a continuous manner. In the framework of this project the ATHLET-CD development has focused on the integration of KESS models such as the control rod model, as well as the models describing chemical interactions, material relocation along a rod, and fission product release. The present ATHLET-CD version is able to describe severe accidents in a PWR up to early core degradation (relocation of material along a rod surface in the axial direction). Contributions to the verification of ATHLET-CD comprised calculations of the experiments PHEBUS AIC and PBF SFD 1-4. The PHEBUS AIC calculation focused on the examination of the control rod model, whereas the PBF SFD 1-4 calculation served to check the models describing melting, material relocation and fission product release. (orig.)

  18. X447 EBR-II Experiment Benchmark for Verification of Audit Code of SFR Metal Fuel

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Yong Won; Bae, Moo-Hoon; Shin, Andong; Suh, Namduk [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2016-10-15

    At KINS (Korea Institute of Nuclear Safety), a project has been started to develop regulatory technology for SFR systems, including the fuel area, in preparation for audit calculations for the PGSFR licensing review. To evaluate fuel integrity and safety during irradiation, a fuel performance code must be used for the audit calculations. In this study, a benchmark analysis is performed to verify the new code system, using X447 EBR-II experiment data; additionally, a sensitivity analysis with respect to changes in coolant mass flux is performed. For LWR fuel performance modeling, various advanced models have been proposed and validated against ample in-reactor test results, but owing to the limited experience with SFR operation, the current understanding of SFR fuel behavior is limited. The X447 EBR-II experiment was selected for the benchmark because the fuel composition of the X447 assembly is U-10Zr, which PGSFR also uses in its initial phase. To prepare for the licensing of PGSFR, regulatory audit technologies for SFRs must nevertheless be secured. Therefore, in this study, the benchmark analysis using the X447 EBR-II experiment data and the sensitivity analysis with coolant mass flux change are performed to verify the new audit fuel performance analysis code. In terms of verification, the results of the benchmark and sensitivity analyses are considered reasonable.

  19. X447 EBR-II Experiment Benchmark for Verification of Audit Code of SFR Metal Fuel

    International Nuclear Information System (INIS)

    Choi, Yong Won; Bae, Moo-Hoon; Shin, Andong; Suh, Namduk

    2016-01-01

    At KINS (Korea Institute of Nuclear Safety), a project has been started to develop regulatory technology for SFR systems, including the fuel area, in preparation for audit calculations for the PGSFR licensing review. To evaluate fuel integrity and safety during irradiation, a fuel performance code must be used for the audit calculations. In this study, a benchmark analysis is performed to verify the new code system, using X447 EBR-II experiment data; additionally, a sensitivity analysis with respect to changes in coolant mass flux is performed. For LWR fuel performance modeling, various advanced models have been proposed and validated against ample in-reactor test results, but owing to the limited experience with SFR operation, the current understanding of SFR fuel behavior is limited. The X447 EBR-II experiment was selected for the benchmark because the fuel composition of the X447 assembly is U-10Zr, which PGSFR also uses in its initial phase. To prepare for the licensing of PGSFR, regulatory audit technologies for SFRs must nevertheless be secured. Therefore, in this study, the benchmark analysis using the X447 EBR-II experiment data and the sensitivity analysis with coolant mass flux change are performed to verify the new audit fuel performance analysis code. In terms of verification, the results of the benchmark and sensitivity analyses are considered reasonable

  20. Verification of the integrity of barriers using gas diffusion

    International Nuclear Information System (INIS)

    Ward, D.B.; Williams, C.V.

    1997-06-01

    In-situ barrier materials and designs are being developed for containment of high-risk contamination as an alternative to immediate removal or remediation. The intent of these designs is to prevent the movement of contaminants in either the liquid or vapor phase by long-term containment, essentially buying time until the contaminant depletes naturally or a remediation can be implemented. The integrity of the resultant soil-binder mixture is typically assessed by a number of destructive laboratory tests (leaching, compressive strength, mechanical stability with respect to wetting and freeze-thaw cycles) which as a group are used to infer the likelihood of favorable long-term performance of the barrier. The need exists for minimally intrusive yet quantifiable methods for assessment of a barrier's integrity after emplacement, and for monitoring of the barrier's performance over its lifetime. Here, the authors evaluate non-destructive measurements of inert-gas diffusion (specifically, SF6) as an indicator of waste-form integrity. The goals of this project are to show that diffusivity can be measured in core samples of soil jet-grouted with Portland cement, to validate the experimental method through measurements on samples, and to calculate aqueous diffusivities from a series of diffusion measurements. This study shows that it is practical to measure SF6 diffusion rates in the laboratory on samples of grout (Portland cement and soil) typical of what might be used in a barrier. Diffusion of SF6 through grout (Portland cement and soil) is at least an order of magnitude slower than through air. The use of this tracer should be sensitive to the presence of fractures, voids, or other discontinuities in the grout/soil structure. Field-scale measurements should be practical on time-scales of a few days
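
    For orientation on the time scales quoted above, the characteristic time for a tracer to diffuse a distance L scales as t ~ L**2 / (2*D). The short Python estimate below uses an assumed effective SF6 diffusivity in grout; the numbers are placeholders chosen to be consistent with the qualitative statement that diffusion in grout is at least an order of magnitude slower than in air, not measured results.

        # Characteristic 1-D diffusion time, t ~ L**2 / (2 * D_eff).
        L = 0.30            # m, assumed barrier/core thickness (hypothetical)
        D_air = 1.0e-5      # m^2/s, order of magnitude for SF6 in air
        D_eff = D_air / 20  # placeholder effective diffusivity in grout

        t_seconds = L**2 / (2 * D_eff)
        print(f"characteristic diffusion time ~ {t_seconds / 86400:.1f} days")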

  1. Fluor Hanford Integrated Safety Management System Phase II Verification Vol 1 & Vol 2

    Energy Technology Data Exchange (ETDEWEB)

    PARSONS, J.E.

    2000-07-15

    The U.S. Department of Energy (DOE) is committed to conducting work efficiently and in a manner that ensures protection of the workers, public, and environment. DOE policy mandates that safety management systems be used to systematically integrate safety into management and work practices at all levels while accomplishing mission goals in an effective and efficient manner. The purpose of the Fluor Hanford (FH) Integrated Safety Management System (ISMS) verification was to determine whether FH's ISM system and processes are sufficiently implemented to accomplish the goal of "Do work safely." The purpose of the DOE, Richland Operations Office (RL) verification was to determine whether RL has established processes that adequately describe RL's role in safety management and if those processes are sufficiently implemented.

  2. Fluor Hanford Integrated Safety Management System Phase II Verification Vol 1 and Vol 2

    CERN Document Server

    Parsons, J E

    2000-01-01

    The U.S. Department of Energy (DOE) is committed to conducting work efficiently and in a manner that ensures protection of the workers, public, and environment. DOE policy mandates that safety management systems be used to systematically integrate safety into management and work practices at all levels while accomplishing mission goals in an effective and efficient manner. The purpose of the Fluor Hanford (FH) Integrated Safety Management System (ISMS) verification was to determine whether FH's ISM system and processes are sufficiently implemented to accomplish the goal of "Do work safely." The purpose of the DOE, Richland Operations Office (RL) verification was to determine whether RL has established processes that adequately describe RL's role in safety management and if those processes are sufficiently implemented.

  3. Fluor Hanford Integrated Safety Management System Phase II Verification Vol 1 and Vol 2

    International Nuclear Information System (INIS)

    PARSONS, J.E.

    2000-01-01

    The U.S. Department of Energy (DOE) is committed to conducting work efficiently and in a manner that ensures protection of the workers, public, and environment. DOE policy mandates that safety management systems be used to systematically integrate safety into management and work practices at all levels while accomplishing mission goals in an effective and efficient manner. The purpose of the Fluor Hanford (FH) Integrated Safety Management System (ISMS) verification was to determine whether FH's ISM system and processes are sufficiently implemented to accomplish the goal of "Do work safely." The purpose of the DOE, Richland Operations Office (RL) verification was to determine whether RL has established processes that adequately describe RL's role in safety management and if those processes are sufficiently implemented

  4. Efficient Dynamic Integrity Verification for Big Data Supporting Users Revocability

    Directory of Open Access Journals (Sweden)

    Xinpeng Zhang

    2016-05-01

    Full Text Available With the advent of the big data era, cloud data storage and retrieval have become popular for efficient data management in large companies and organizations, which can thus enjoy on-demand, high-quality cloud storage services. Meanwhile, for security reasons, those companies and organizations would like to verify the integrity of their data once it is stored in the cloud. To address this issue, they need a proper cloud storage auditing scheme which matches their actual demands. Current research often focuses on the situation where the data manager owns the data; however, in the real situation the data belongs to the company rather than to the data manager, a point that has been overlooked. For example, after a period the current data manager may no longer be suitable to manage the data stored in the cloud and will be replaced by a successor, who then needs to verify the integrity of the formerly managed data; this problem is clearly unavoidable in reality. In this paper, we fill this gap by giving a practical, efficient, revocable, privacy-preserving public auditing scheme for cloud storage that meets the auditing requirements of large companies' and organizations' data transfer. The scheme is conceptually simple and is proven to be secure even when the cloud service provider conspires with revoked users.

  5. Development and verification test of integral reactor major components

    Energy Technology Data Exchange (ETDEWEB)

    Kim, J. I.; Kim, Y. W.; Kim, J. H. and others

    1999-03-01

    The conceptual designs for the SG, MCP, and CEDM to be installed in the integral reactor SMART were developed. Three-dimensional CAD models for the major components were developed to visualize the design concepts. A once-through helical steam generator was conceptually designed for SMART, and a canned motor pump was adopted in the conceptual design of the MCP. Linear pulse motor type and ball-screw type CEDMs, which have fine control capabilities, were studied for adoption in SMART. In parallel with the structural design, the electro-magnetic design was performed for sizing the motors and electromagnets. Prototypes for the CEDM and MCP sub-assemblies were developed and tested to verify their performance. The impeller design procedure and a computer program to analyze the dynamic characteristics of the MCP rotor shaft were developed. The design concepts of the SG, MCP, and CEDM were also investigated for fabricability.

  6. Development and verification test of integral reactor major components

    International Nuclear Information System (INIS)

    Kim, J. I.; Kim, Y. W.; Kim, J. H. and others

    1999-03-01

    The conceptual designs for the SG, MCP, and CEDM to be installed in the integral reactor SMART were developed. Three-dimensional CAD models for the major components were developed to visualize the design concepts. A once-through helical steam generator was conceptually designed for SMART, and a canned motor pump was adopted in the conceptual design of the MCP. Linear pulse motor type and ball-screw type CEDMs, which have fine control capabilities, were studied for adoption in SMART. In parallel with the structural design, the electro-magnetic design was performed for sizing the motors and electromagnets. Prototypes for the CEDM and MCP sub-assemblies were developed and tested to verify their performance. The impeller design procedure and a computer program to analyze the dynamic characteristics of the MCP rotor shaft were developed. The design concepts of the SG, MCP, and CEDM were also investigated for fabricability

  7. Optical integration and verification of LINC-NIRVANA

    Science.gov (United States)

    Moreno-Ventas, J.; Baumeister, H.; Bertram, Thomas; Bizenberger, P.; Briegel, F.; Greggio, D.; Kittmann, F.; Marafatto, L.; Mohr, L.; Radhakrishnan, K.; Schray, H.

    2014-07-01

    The LBT (Large Binocular Telescope), located on Mount Graham near Tucson, Arizona, at an altitude of about 3200 m, is an innovative project being undertaken by institutions from Europe and the USA. The structure of the telescope incorporates two 8.4-meter telescopes on a common mount with a 14.4-meter center-to-center separation. This configuration provides the equivalent collecting area of a 12 m single-dish telescope. LINC-NIRVANA is an instrument to combine the light from both LBT primary mirrors in an imaging Fizeau interferometer. Many requirements must be fulfilled in order to obtain a good interferometric combination of the beams, the most important being plane wavefronts, parallel input beams, homotheticity, and the zero optical path difference (OPD) required for interferometry. The philosophy is to have an internally aligned instrument first, and then align the telescope to match the instrument. The sum of the different subsystems leads to a quite ambitious system, which requires a well-defined strategy for alignment and testing. In this paper I introduce and describe the strategy followed, as well as the different solutions, procedures and tools used during integration. Results are presented at every step.

  8. Data-driven property verification of grey-box systems by Bayesian experiment design

    NARCIS (Netherlands)

    Haesaert, S.; Van den Hof, P.M.J.; Abate, A.

    2015-01-01

    A measurement-based statistical verification approach is developed for systems with partly unknown dynamics. These grey-box systems are subject to identification experiments which, new in this contribution, enable accepting or rejecting system properties expressed in a linear-time logic. We employ a

  9. A New Integrated Weighted Model in SNOW-V10: Verification of Categorical Variables

    Science.gov (United States)

    Huang, Laura X.; Isaac, George A.; Sheng, Grant

    2014-01-01

    This paper presents the verification results for nowcasts of seven categorical variables from an integrated weighted model (INTW) and the underlying numerical weather prediction (NWP) models. Nowcasting, or short range forecasting (0-6 h), over complex terrain with sufficient accuracy is highly desirable but a very challenging task. A weighting, evaluation, bias correction and integration system (WEBIS) for generating nowcasts by integrating NWP forecasts and high frequency observations was used during the Vancouver 2010 Olympic and Paralympic Winter Games as part of the Science of Nowcasting Olympic Weather for Vancouver 2010 (SNOW-V10) project. Forecast data from the Canadian high-resolution deterministic NWP system with three nested grids (at 15-, 2.5- and 1-km horizontal grid-spacing) were selected as background gridded data for generating the integrated nowcasts. Seven forecast variables (temperature, relative humidity, wind speed, wind gust, visibility, ceiling and precipitation rate) are treated as categorical variables for verifying the integrated weighted forecasts. By analyzing the verification of forecasts from INTW and the NWP models across 15 sites, the integrated weighted model was found to produce more accurate forecasts for the 7 selected forecast variables, regardless of location. This is based on the multi-categorical Heidke skill scores for the test period 12 February to 21 March 2010.
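
    The WEBIS idea of weighting several NWP grids together with recent observations can be sketched as a bias-corrected, skill-weighted average at a site, with the weights taken from each model's recent verification against high-frequency observations. The forecasts, biases and errors in the Python sketch below are made up for illustration and do not reproduce the actual SNOW-V10 weighting.

        # Latest forecasts of, e.g., 2-m temperature at one site from three nested grids.
        forecasts = {"15km": -3.2, "2.5km": -2.1, "1km": -1.6}

        # Recent mean error (bias) and RMSE of each model at this site, estimated from
        # verification against high-frequency observations over a trailing window.
        bias = {"15km": -0.8, "2.5km": -0.3, "1km": 0.1}
        rmse = {"15km": 2.0,  "2.5km": 1.4,  "1km": 1.1}

        # Bias-correct each forecast, then weight inversely to its recent squared error.
        corrected = {m: forecasts[m] - bias[m] for m in forecasts}
        weights = {m: 1.0 / rmse[m] ** 2 for m in forecasts}
        total = sum(weights.values())
        nowcast = sum(weights[m] * corrected[m] for m in forecasts) / total
        print(f"integrated nowcast: {nowcast:.2f} degC")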

  10. IVA2 verification: Expansion phase experiment in SNR geometry

    International Nuclear Information System (INIS)

    Kolev, N.I.

    1987-09-01

    Using the IVA2/005 computer code, the SNR model explosion experiment SGI-09-1 was numerically simulated. The experiment consists of high pressure gas injection into a low pressure liquid pool with a free surface in a cylindrical geometry with internals. Bubble formation and the pressure history as a function of time were predicted and compared with the experimental observations. A good agreement between theory and experiment was obtained. Numerical diffusion and its influence on the results are discussed. (orig.)

  11. VHTRC experiment for verification test of H∞ reactivity estimation method

    International Nuclear Information System (INIS)

    Fujii, Yoshio; Suzuki, Katsuo; Akino, Fujiyoshi; Yamane, Tsuyoshi; Fujisaki, Shingo; Takeuchi, Motoyoshi; Ono, Toshihiko

    1996-02-01

    This experiment was performed at the VHTRC to acquire data for verifying the H∞ reactivity estimation method. In this report, the experimental method, the measuring circuits and the data processing software are described in detail. (author)

  12. Flight Experiment Verification of Shuttle Boundary Layer Transition Prediction Tool

    Science.gov (United States)

    Berry, Scott A.; Berger, Karen T.; Horvath, Thomas J.; Wood, William A.

    2016-01-01

    Boundary layer transition at hypersonic conditions is critical to the design of future high-speed aircraft and spacecraft. Accurate methods to predict transition would directly impact the aerothermodynamic environments used to size a hypersonic vehicle's thermal protection system. A transition prediction tool, based on wind tunnel derived discrete roughness correlations, was developed and implemented for the Space Shuttle return-to-flight program. This tool was also used to design a boundary layer transition flight experiment in order to assess correlation uncertainties, particularly with regard to high Mach-number transition and tunnel-to-flight scaling. A review is provided of the results obtained from the flight experiment in order to evaluate the transition prediction tool implemented for the Shuttle program.

  13. Verification experiment of EPR paradox by (d, 2He) reaction

    International Nuclear Information System (INIS)

    Sakai, Hideyuki

    2003-01-01

    The EPR paradox, which was put forward by Einstein, Podolsky and Rosen, is expressed theoretically by Bell's inequality for spin correlations. In principle it is possible to verify the inequality experimentally by measuring the spin correlation between two spin-1/2 particles from the decay of a 1S0 state. Most of the past experiments to verify the inequality, however, have been performed using photons. Only one experiment using a hadron system was carried out, by Lamehi and Mitting, in which the 1S0 state was first produced by proton-proton scattering and the spin orientations after the scattering were then measured. Unfortunately, there exist some sources of ambiguity that prevent a definite conclusion from their result, because the experiment was done at the rather high energy of 13.5 MeV. The experiment planned by the present author is designed to overcome the experimental difficulties which Lamehi and Mitting encountered by (1) generating a high-purity singlet 1S0 state of two protons via a (d, 2He) type nuclear reaction in the intermediate energy range, and (2) developing a high-performance spin-correlation polarimeter which can analyze the spins of the two protons simultaneously so as to minimize systematic errors. The excitation energy of 2He, corresponding to the proton-proton relative energy, can be controlled experimentally; an ideal singlet is realized by choosing a state with sufficiently small relative energy. It is planned to measure the spin correlation function using SMART (Swinger and Magnetic Analyzer with Rotator and Twister) at the RIKEN Accelerator Research Facility. The Einstein POLarimeter (EPOL), to be installed on the second focal plane of SMART, is under development; with it, high-precision measurements of the spin orientations of the two high-energy protons coming simultaneously into a limited space from the 2He decay are made while selecting the subject events from a very large number of background events. Monte Carlo simulation predicts the possibility to verify the

  14. Operational experiences with on line BWR condenser tube leak verification

    International Nuclear Information System (INIS)

    Bryant, R.A.; Duvall, W.E.; Kirkley, W.B.; Zavadoski, R.W.

    1988-01-01

    Verifying condenser tube leaks at a boiling water reactor is, at best, a difficult task carried out in hot, steamy water boxes with concurrent radiation exposure. For small apparent leaks with slight chemical changes there is always uncertainty as to whether the problem is a condenser tube leak or feedback from radwaste. Most conventional methods (e.g. soap tests, Saran wrap suction, and helium tests) usually involve a load reduction to isolate the water boxes one at a time and hours of drain down on each box. The sensitivity of the most sensitive test (helium) is of the order of 7500 l per day per box. Sulfur hexafluoride has been successfully used at a BWR to identify one leaking water box out of four while the unit was at 100 % power. The actual leaking tubes in the water box were identified by injecting helium during drain down of the box and subsequent manifold testing. Additional tests with sulfur hexafluoride on the second BWR unit indicated tight water boxes to within the sensitivity of the measurement, i.e. less than 19 l per day for all four boxes. Problems encountered in both tests included sulfur hexafluoride carry-over from the plume of the cooling towers and off-gas considerations. In brief, sulfur hexafluoride can be used to quickly identify which particular water box has a condenser tube leak or, just as quickly, establish the integrity of all the water boxes to a level not previously attainable. (author)

  15. Advanced Communication Technology Satellite (ACTS) multibeam antenna technology verification experiments

    Science.gov (United States)

    Acosta, Roberto J.; Larko, Jeffrey M.; Lagin, Alan R.

    1992-01-01

    The Advanced Communication Technology Satellite (ACTS) is key to reaching NASA's goal of developing high-risk, advanced communications technology using multiple frequency bands to support the nation's future communication needs. Using multiple, dynamically hopping spot beams and advanced on-board switching and processing systems, ACTS will open a new era in communications satellite technology. One of the key technologies to be validated as part of the ACTS program is the multibeam antenna with rapidly reconfigurable hopping and fixed spot beams to serve users equipped with small-aperture terminals within the coverage areas. The proposed antenna technology experiments are designed to evaluate in-orbit ACTS multibeam antenna performance (radiation pattern, gain, cross-polarization levels, etc.).

  16. Verification test for radiation reduction effect and material integrity on PWR primary system by zinc injection

    Energy Technology Data Exchange (ETDEWEB)

    Kawakami, H.; Nagata, T.; Yamada, M. [Nuclear Power Engineering Corp. (Japan); Kasahara, K.; Tsuruta, T.; Nishimura, T. [Mitsubishi Heavy Industries, Ltd. (Japan); Ishigure, K. [Saitama Inst. of Tech. (Japan)

    2002-07-01

    Zinc injection is known to be an effective method for reducing the radiation source in the primary water system of a PWR. There is a need to verify the effect of Zn injection operation on radiation source reduction and on the materials integrity of the PWR primary circuit. In order to confirm the effectiveness of Zn injection, a verification test, run as a national program sponsored by the Ministry of Economy, Trade and Industry (METI), was started in 1995 as a 7-year program and will be finished by the end of March 2002. This program consists of an irradiation test and a material integrity test. The irradiation test, an in-pile test managed by AEAT Plc (UK), was performed using the LVR-15 reactor of NRI Rez in the Czech Republic. Furthermore, an out-of-pile test using a film adding unit was also performed at the Takasago Engineering Laboratory of NUPEC to obtain supplemental data for the in-pile test. The material integrity test was planned to perform constant load tests, constant strain tests and corrosion tests at the same time using a large-scale loop and slow strain rate testing (SSRT) at the Takasago Engineering Laboratory of NUPEC. In this paper, the present results of the zinc injection verification program are discussed. (authors)

  17. Experimental study on design verification of new concept for integral reactor safety system

    International Nuclear Information System (INIS)

    Chung, Moon Ki; Choi, Ki Yong; Park, Hyun Sik; Cho, Seok; Park, Choon Kyung; Lee, Sung Jae; Song, Chul Hwa

    2004-01-01

    The pressurized, light-water-cooled, medium power (330 MWt) SMART (System-integrated Modular Advanced ReacTor) has been under development at KAERI for a dual purpose: seawater desalination and electricity generation. In the subsequent SMART design verification phase, various separate effects tests and comprehensive integral effect tests are being conducted. The high-temperature/high-pressure thermal-hydraulic test facility VISTA (Experimental Verification by Integral Simulation of Transients and Accidents) has been constructed by KAERI to simulate SMART-P (the one-fifth scale pilot plant). Experimental tests have been performed to investigate the thermal-hydraulic dynamic characteristics of the primary and secondary systems. Heat transfer characteristics and the natural circulation performance of the PRHRS (Passive Residual Heat Removal System) of SMART-P were also investigated using the VISTA facility. The coolant flows steadily in the natural circulation loop, which is composed of the Steam Generator (SG) primary side, the secondary system, and the PRHRS. The heat transfer through the PRHRS heat exchanger and the ECT is sufficient to enable natural circulation of the coolant

  18. Application verification research of cloud computing technology in the field of real time aerospace experiment

    Science.gov (United States)

    Wan, Junwei; Chen, Hongyan; Zhao, Jing

    2017-08-01

    To meet the real-time, reliability and safety requirements of aerospace experiments, a single-center cloud computing technology application verification platform was constructed. At the IaaS level, the feasibility of applying cloud computing technology to the field of aerospace experiments is tested and verified. Based on the analysis of the test results, a preliminary conclusion is obtained: a cloud computing platform can be applied to compute-intensive aerospace experiment workloads, while for I/O-intensive workloads the traditional physical machine is recommended.

  19. CTBT verification-related technologies for peaceful purposes: the French experiences of international cooperation

    International Nuclear Information System (INIS)

    Massinon, B.

    1999-01-01

    The French experience concerning CTBT verification-related technologies for peaceful purposes, as well as the international cooperation in this field, is presented. Possible objectives and cooperation program needs are cited. French experience in international cooperation is related to seismology and seismic hazards in Bolivia, Madagascar, Nepal and Indonesia and is considered very constructive: technical experience has been developed, consistent scientific results have been obtained, and a considerable contribution has been made to the CTBT task. Large scientific benefits are expected from the CTBTO

  20. Formal Verification Method for Configuration of Integrated Modular Avionics System Using MARTE

    Directory of Open Access Journals (Sweden)

    Lisong Wang

    2018-01-01

    Full Text Available The configuration information of an Integrated Modular Avionics (IMA) system includes almost all details of the whole system architecture; it is used to configure the hardware interfaces, the operating system, and the interactions among applications so that an IMA system works correctly and reliably. It is very important to ensure the correctness and integrity of the configuration in the IMA system design phase. In this paper, we focus on modelling and verification of the configuration information of an IMA/ARINC653 system based on MARTE (Modelling and Analysis for Real-time and Embedded Systems). Firstly, we define a semantic mapping from key concepts of the configuration (such as modules, partitions, memory, processes, and communications) to components of MARTE elements and propose a method for model transformation between XML-formatted configuration information and MARTE models. Then we present a formal verification framework for ARINC653 system configuration based on theorem proving techniques, including the construction of corresponding REAL theorems according to the semantics of those key components of the configuration information and formal verification of the theorems for the properties of IMA, such as time constraints, spatial isolation, and health monitoring. After that, the special issue of schedulability analysis of the ARINC653 system is studied. We design a hierarchical scheduling strategy that takes the characteristics of the ARINC653 system into consideration, and the scheduling analyzer MAST-2 is used to implement the hierarchical schedulability analysis. Lastly, we design a prototype tool called Configuration Checker for ARINC653 (CC653), and two case studies show that the methods proposed in this paper are feasible and efficient.
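
    One of the configuration properties mentioned above, consistency of the partition schedule, can be illustrated with a small check that partition windows neither overlap nor exceed the major frame. The XML in the Python sketch below is a simplified, hypothetical stand-in rather than the ARINC653 schema, and the check is far weaker than the REAL theorem proving described in the paper.

        import xml.etree.ElementTree as ET

        # Simplified, hypothetical module schedule; real IMA configurations follow the
        # ARINC653 XML schema and carry far more detail.
        config = """
        <Module MajorFrameSeconds="0.100">
          <Partition Name="FlightCtrl" OffsetSeconds="0.000" DurationSeconds="0.040"/>
          <Partition Name="Display"    OffsetSeconds="0.040" DurationSeconds="0.030"/>
          <Partition Name="Health"     OffsetSeconds="0.070" DurationSeconds="0.030"/>
        </Module>
        """

        def check_schedule(xml_text: str):
            module = ET.fromstring(xml_text)
            frame = float(module.get("MajorFrameSeconds"))
            windows = sorted(
                (float(p.get("OffsetSeconds")), float(p.get("DurationSeconds")), p.get("Name"))
                for p in module.findall("Partition"))
            errors, prev_end = [], 0.0
            for offset, duration, name in windows:
                if offset < prev_end:
                    errors.append(f"{name}: window overlaps the previous partition")
                if offset + duration > frame:
                    errors.append(f"{name}: window exceeds the major frame")
                prev_end = offset + duration
            return errors or ["schedule consistent"]

        print(check_schedule(config))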

  1. Integrated safeguards: Australian views and experience

    International Nuclear Information System (INIS)

    Carlson, J.; Bragin, V.; Leslie, R.

    2001-01-01

    Full text: Australia has had a pioneering role in assisting the IAEA to develop the procedures and methods for strengthened safeguards, both before and after the conclusion of Australia's additional protocol. Australia played a key role in the negotiation of the model additional protocol, and made ratification a high priority in order to encourage early ratification by other States. Australia was the first State to ratify an additional protocol, on 10 December 1997, and was the first State in which the IAEA exercised complementary access and managed access under an additional protocol. Australia has undergone three full cycles of evaluation under strengthened safeguards measures, enabling the Agency to conclude it was appropriate to commence implementation of integrated safeguards. In January 2001 Australia became the first State in which integrated safeguards are being applied. As such, Australia's experience will be of interest to other States as they consult with the IAEA on the modalities for the introduction of integrated safeguards in their jurisdictions. The purpose of the paper is to outline Australia's experience with strengthened safeguards and Australia's views on the implementation of integrated safeguards. Australia has five Material Balance Areas (MBAs), the principal one covering the 10 MWt research reactor at Lucas Heights and the associated inventory of fresh and irradiated HEU fuel. Under classical safeguards, generally Australia was subject to annual Physical Inventory Verifications (PIVs) for the four MBAs at Lucas Heights, plus quarterly interim inspections, making a total of four inspections a year (PIVs for the different MBAs were conducted concurrently with each other or with interim inspections in other MBAs), although there was a period when the fresh fuel inventory exceeded one SQ, requiring monthly inspections. Under strengthened safeguards, this pattern of four inspections a year was maintained, with the addition of complementary

  2. Raman laser spectrometer optical head: qualification model assembly and integration verification

    Science.gov (United States)

    Ramos, G.; Sanz-Palomino, M.; Moral, A. G.; Canora, C. P.; Belenguer, T.; Canchal, R.; Prieto, J. A. R.; Santiago, A.; Gordillo, C.; Escribano, D.; Lopez-Reyes, G.; Rull, F.

    2017-08-01

    Raman Laser Spectrometer (RLS) is the Pasteur Payload instrument of the ExoMars mission, within ESA's Aurora Exploration Programme, that will perform Raman spectroscopy for the first time on a planetary mission beyond Earth. RLS is composed of the SPU (Spectrometer Unit), the iOH (Internal Optical Head), and the ICEU (Instrument Control and Excitation Unit). The iOH focuses the excitation laser on the samples (excitation path) and collects the Raman emission from the sample (collection path, composed of a collimation system and a filtering system). Its original design presented a high laser trace reaching the detector, and although a certain level of laser trace is required for calibration purposes, the high level degraded the signal-to-noise ratio, confounding some Raman peaks. Therefore, after the breadboard campaign, some light design modifications were implemented in order to fix the desired amount of laser trace, and after the fabrication and procurement of the commercial elements, the assembly and integration verification process was carried out. The iOH design update for the engineering and qualification model (iOH EQM), as well as the assembly process, is briefly described in this paper. In addition, the results of the integration verification and of the first functional tests, carried out with the RLS calibration target (CT), are reported.

  3. Experience in verification regimes. United States On-Site Inspection Agency

    International Nuclear Information System (INIS)

    Reppert, J.

    1998-01-01

    The experiences of the United States On-Site Inspection Agency in verification regimes around the world over the last 30 years are described. The challenge for the future is to extend the benefits of the applied tools to all states in all regions, to enhance stability and to create conditions for peace at lower levels of armaments than currently exist. The USA needs to engage states currently caught in cycles of violence and arms escalation. It must examine technologies which, together with on-site aspects of verification or transparency regimes, can provide a comprehensive picture at affordable cost. A growth is foreseen in combined training, with new states entering for the first time into regimes that include arms control and transparency measures.

  4. U.S. integral and benchmark experiments

    International Nuclear Information System (INIS)

    Maienschein, F.C.

    1978-01-01

    Verification of methods for analysis of radiation-transport (shielding) problems in Liquid-Metal Fast Breeder Reactors has required a series of experiments that can be classified as benchmark, parametric, or design-confirmation experiments. These experiments, performed at the Oak Ridge Tower Shielding Facility, have included measurements of neutron transport in bulk shields of sodium, steel, and inconel and in configurations that simulate lower axial shields, pipe chases, and top-head shields. They have also included measurements of the effects of fuel stored within the reactor vessel and of gamma-ray energy deposition (heating). The paper consists of brief comments on these experiments, and also on a recent experiment in which neutron streaming problems in a Gas-Cooled Fast Breeder Reactor were studied. The need for additional experiments for a few areas of LMFBR shielding is also cited

  5. Validation and Verification of Future Integrated Safety-Critical Systems Operating under Off-Nominal Conditions

    Science.gov (United States)

    Belcastro, Christine M.

    2010-01-01

    Loss of control remains one of the largest contributors to aircraft fatal accidents worldwide. Aircraft loss-of-control accidents are highly complex in that they can result from numerous causal and contributing factors acting alone or (more often) in combination. Hence, there is no single intervention strategy to prevent these accidents and reducing them will require a holistic integrated intervention capability. Future onboard integrated system technologies developed for preventing loss of vehicle control accidents must be able to assure safe operation under the associated off-nominal conditions. The transition of these technologies into the commercial fleet will require their extensive validation and verification (V and V) and ultimate certification. The V and V of complex integrated systems poses major nontrivial technical challenges particularly for safety-critical operation under highly off-nominal conditions associated with aircraft loss-of-control events. This paper summarizes the V and V problem and presents a proposed process that could be applied to complex integrated safety-critical systems developed for preventing aircraft loss-of-control accidents. A summary of recent research accomplishments in this effort is also provided.

  6. Issues of verification and validation of application-specific integrated circuits in reactor trip systems

    International Nuclear Information System (INIS)

    Battle, R.E.; Alley, G.T.

    1993-01-01

    Concepts of using application-specific integrated circuits (ASICs) in nuclear reactor safety systems are evaluated. The motivation for this evaluation stems from the difficulty of proving that software-based protection systems are adequately reliable. Important issues concerning the reliability of computers and software are identified and used to evaluate features of ASICs. These concepts indicate that ASICs have several advantages over software for simple systems. The primary advantage of ASICs over software is that verification and validation (V&V) of ASICs can be done with much higher confidence than is possible with software. A method of performing this V&V on ASICs is being developed at Oak Ridge National Laboratory. The purpose of the method being developed is to help eliminate design and fabrication errors. It will not solve problems with incorrect requirements or specifications.

  7. Verification and validation guidelines for high integrity systems: Appendices A--D, Volume 2

    International Nuclear Information System (INIS)

    Hecht, H.; Hecht, M.; Dinsmore, G.; Hecht, S.; Tang, D.

    1995-03-01

    The following material is furnished as an experimental guide for the use of risk based classification for nuclear plant protection systems. As shown in Sections 2 and 3 of this report, safety classifications for the nuclear field are application based (using the function served as the primary criterion), whereas those in use by the process industry and the military are risk based. There are obvious obstacles to the use of risk based classifications (and the associated integrity levels) for nuclear power plants, yet there are also many potential benefits, including: it considers all capabilities provided for dealing with a specific hazard, thus assigning a lower risk where multiple protection is provided (either at the same or at lower layers); this permits the plant management to perform trade-offs between systems that meet the highest qualification levels or multiple diverse systems at lower qualification levels; it motivates the use (and therefore also the development) of protection systems with demonstrated low failure probability; and it may permit lower cost process industry equipment of an established integrity level to be used in nuclear applications (subject to verification of the integrity level and regulatory approval). The totality of these benefits may reduce the cost of digital protection systems significantly and motivate utilities to much more rapid upgrading of their capabilities than is currently the case. Therefore the outline of a risk based classification is presented here, to serve as a starting point for further investigation and possible trial application.

  8. Sealing of process valves for the HEU downblending verification experiment at Portsmouth

    International Nuclear Information System (INIS)

    Baldwin, G.T.; Bartberger, J.C.; Jenkins, C.D.; Perlinski, A.W.; Schoeneman, J.L.; Gordon, D.M.; Whiting, N.E.; Bonner, T.N.; Castle, J.M.

    1998-01-01

    At the Portsmouth Gaseous Diffusion Plant in Piketon, Ohio, USA, excess inventory of highly-enriched uranium (HEU) from US defense programs is being diluted to low-enriched uranium (LEU) for commercial use. The conversion is subject to a Verification Experiment overseen by the International Atomic Energy Agency (IAEA). The Verification Experiment is making use of monitoring technologies developed and installed by several DOE laboratories. One of the measures is a system for sealing valves in the process piping, which secures the path followed by uranium hexafluoride gas (UF6) from cylinders at the feed stations to the blend point, where the HEU is diluted with LEU. The Authenticated Item Monitoring System (AIMS) was the alternative proposed by Sandia National Laboratories that was selected by the IAEA. Approximately 30 valves were sealed by the IAEA using AIMS fiber-optic seals (AFOS). The seals employ single-core plastic fiber rated to 125 °C to withstand the high-temperature conditions of the heated piping enclosures at Portsmouth. Each AFOS broadcasts authenticated seal status and state-of-health messages via a tamper-protected radio-frequency transmitter mounted outside of the heated enclosure. The messages are received by two collection stations, operated redundantly.

  9. US monitoring and verification technology: on-site inspection experience and future challenges

    International Nuclear Information System (INIS)

    Gullickson, R.L.; Carlson, D.; Ingraham, J.; Laird, B.

    2013-01-01

    The United States has a long and successful history of cooperation with treaty partners in monitoring and verification. For strategic arms reduction treaties, our collaboration has resulted in the development and application of systems with limited complexity and intrusiveness. As we progress beyond New START (NST) along the 'road to zero', the reduced number of nuclear weapons is likely to require increased confidence in monitoring and verification techniques. This may place increased demands on the technology to verify the presence of a nuclear weapon and even confirm the presence of a certain type. Simultaneously, this technology must include the ability to protect each treaty partner's sensitive nuclear weapons information. Mutual development of this technology by treaty partners offers the best approach for acceptance in treaty negotiations. This same approach of mutual cooperation and development is essential for developing nuclear test monitoring technology in support of the Comprehensive Nuclear Test Ban Treaty (CTBT). Our ability to detect low yield and evasive testing will be enhanced through mutually developed techniques and experiments using laboratory laser experiments and high explosives tests in a variety of locations and geologies. (authors)

  10. Experiences in Building Python Automation Framework for Verification and Data Collections

    Directory of Open Access Journals (Sweden)

    2010-09-01

    Full Text Available

    This paper describes our experiences in building a Python automation framework. Specifically, the automation framework is used to support verification and data collection scripts. The scripts control various test equipment in addition to the device under test (DUT) to characterize a specific performance with a specific configuration or to evaluate the correctness of the behaviour of the DUT. The specific focus of this paper is on documenting our experiences in building an automation framework using Python: on the purposes, goals and the benefits, rather than on a tutorial of how to build such a framework.
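
    The abstract above describes the framework only at a high level; the sketch below (all class and method names are hypothetical, not taken from the paper) illustrates how such a Python framework can wrap test equipment and the DUT behind a common interface so that individual verification scripts stay short and declarative.

```python
# Hypothetical framework sketch: names and structure are illustrative only.

class Instrument:
    """Base class for anything the framework controls (the DUT or test gear)."""
    def __init__(self, name, address):
        self.name = name
        self.address = address

    def connect(self):
        print(f"[{self.name}] connecting to {self.address}")

    def configure(self, **settings):
        print(f"[{self.name}] applying {settings}")

    def measure(self):
        raise NotImplementedError


class PowerMeter(Instrument):
    def measure(self):
        return 9.7  # placeholder reading in dBm


def run_verification(dut, equipment, configurations, check):
    """Apply each configuration to the DUT, read the test gear, record pass/fail."""
    for instr in [dut] + equipment:
        instr.connect()
    results = []
    for cfg in configurations:
        dut.configure(**cfg)
        readings = {instr.name: instr.measure() for instr in equipment}
        results.append({"config": cfg,
                        "readings": readings,
                        "passed": check(cfg, readings)})
    return results


if __name__ == "__main__":
    dut = Instrument("DUT", "192.168.0.10")
    meter = PowerMeter("power_meter", "GPIB0::12")
    configs = [{"tx_power_dbm": p} for p in (0, 5, 10)]
    within_1db = lambda cfg, r: abs(r["power_meter"] - cfg["tx_power_dbm"]) <= 1.0
    for row in run_verification(dut, [meter], configs, within_1db):
        print(row)
```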

  11. Integration of SPICE with TEK LV500 ASIC Design Verification System

    Directory of Open Access Journals (Sweden)

    A. Srivastava

    1996-01-01

    Full Text Available The present work involves integration of the simulation stage of design of a VLSI circuit and its testing stage. The SPICE simulator, the TEK LV500 ASIC Design Verification System, and TekWaves, a test program generator for the LV500, were integrated. A software interface in the ‘C’ language, in a UNIX ‘Solaris 1.x’ environment, has been developed between SPICE and the testing tools (TekWAVES and LV500). The function of the software interface developed is multifold. It takes input from either SPICE2G.6 or SPICE 3e.1. The output generated by the interface software can be given as input to either TekWAVES or LV500. A graphical user interface has also been developed with OpenWindows using the XView toolkit on a Sun workstation. As an example, a two-phase clock generator circuit has been considered and the usefulness of the software demonstrated. The interface software could be easily linked with VLSI design tools such as the MAGIC layout editor.

  12. Security Architecture and Protocol for Trust Verifications Regarding the Integrity of Files Stored in Cloud Services

    Directory of Open Access Journals (Sweden)

    Alexandre Pinheiro

    2018-03-01

    Full Text Available Cloud computing is considered an interesting paradigm due to its scalability, availability and virtually unlimited storage capacity. However, it is challenging to organize a cloud storage service (CSS) that is safe from the client point-of-view and to implement this CSS in public clouds since it is not advisable to blindly consider this configuration as fully trustworthy. Ideally, owners of large amounts of data should trust their data to be in the cloud for a long period of time, without the burden of keeping copies of the original data, nor of accessing the whole content for verifications regarding data preservation. Due to these requirements, integrity, availability, privacy and trust are still challenging issues for the adoption of cloud storage services, especially when losing or leaking information can bring significant damage, be it legal or business-related. With such concerns in mind, this paper proposes an architecture for periodically monitoring both the information stored in the cloud and the service provider behavior. The architecture operates with a proposed protocol based on trust and encryption concepts to ensure cloud data integrity without compromising confidentiality and without overloading storage services. Extensive tests and simulations of the proposed architecture and protocol validate their functional behavior and performance.

  13. Security Architecture and Protocol for Trust Verifications Regarding the Integrity of Files Stored in Cloud Services.

    Science.gov (United States)

    Pinheiro, Alexandre; Dias Canedo, Edna; de Sousa Junior, Rafael Timoteo; de Oliveira Albuquerque, Robson; García Villalba, Luis Javier; Kim, Tai-Hoon

    2018-03-02

    Cloud computing is considered an interesting paradigm due to its scalability, availability and virtually unlimited storage capacity. However, it is challenging to organize a cloud storage service (CSS) that is safe from the client point-of-view and to implement this CSS in public clouds since it is not advisable to blindly consider this configuration as fully trustworthy. Ideally, owners of large amounts of data should trust their data to be in the cloud for a long period of time, without the burden of keeping copies of the original data, nor of accessing the whole content for verifications regarding data preservation. Due to these requirements, integrity, availability, privacy and trust are still challenging issues for the adoption of cloud storage services, especially when losing or leaking information can bring significant damage, be it legal or business-related. With such concerns in mind, this paper proposes an architecture for periodically monitoring both the information stored in the cloud and the service provider behavior. The architecture operates with a proposed protocol based on trust and encryption concepts to ensure cloud data integrity without compromising confidentiality and without overloading storage services. Extensive tests and simulations of the proposed architecture and protocol validate their functional behavior and performance.
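
    The records above describe the monitoring protocol only in general terms. As a loose illustration of the underlying idea, and not of the authors' protocol, the sketch below lets a client keep per-block HMAC tags and a secret key, then periodically challenge the storage service for randomly chosen blocks; the block size and tag scheme are assumptions made for the example.

```python
# Illustrative integrity-audit sketch (not the paper's protocol).
import hmac, hashlib, os, random

BLOCK_SIZE = 4096  # assumed block size

def split_blocks(data):
    return [data[i:i + BLOCK_SIZE] for i in range(0, len(data), BLOCK_SIZE)]

def make_tags(key, blocks):
    """Client side: compute one keyed tag per block before upload."""
    return [hmac.new(key, i.to_bytes(8, "big") + b, hashlib.sha256).digest()
            for i, b in enumerate(blocks)]

def challenge(key, tags, fetch_block, sample_size=10):
    """Periodic audit: fetch a random sample of blocks and verify their tags."""
    indices = random.sample(range(len(tags)), min(sample_size, len(tags)))
    for i in indices:
        block = fetch_block(i)  # returned by the storage service
        expected = hmac.new(key, i.to_bytes(8, "big") + block,
                            hashlib.sha256).digest()
        if not hmac.compare_digest(expected, tags[i]):
            return False, i
    return True, None

if __name__ == "__main__":
    key = os.urandom(32)
    data = os.urandom(5 * BLOCK_SIZE + 123)
    blocks = split_blocks(data)
    tags = make_tags(key, blocks)          # only the tags and key stay with the client
    ok, bad = challenge(key, tags, lambda i: blocks[i])
    print("integrity ok" if ok else f"block {bad} corrupted")
```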

  14. The Mailbox Computer System for the IAEA verification experiment on HEU downblending at the Portsmouth Gaseous Diffusion Plant

    International Nuclear Information System (INIS)

    Aronson, A.L.; Gordon, D.M.

    2000-01-01

    In April 1996, the United States (US) added the Portsmouth Gaseous Diffusion Plant to the list of facilities eligible for the application of International Atomic Energy Agency (IAEA) safeguards. At that time, the US proposed that the IAEA carry out a ''verification experiment'' at the plant with respect to downblending of about 13 metric tons of highly enriched uranium (HEU) in the form of uranium hexafluoride (UF6). During the period December 1997 through July 1998, the IAEA carried out the requested verification experiment. The verification approach used for this experiment included, among other measures, the entry of process-operational data by the facility operator on a near-real-time basis into a ''mailbox'' computer located within a tamper-indicating enclosure sealed by the IAEA.

  15. Development of a data bank system for LWR integral experiment

    International Nuclear Information System (INIS)

    Naito, Yoshitaka; Aoyagi, Hideo

    1983-01-01

    A data bank system for LWR integral experiments has been developed for the purpose of alleviating the various efforts associated with the verification of computer codes. The final aim of this system is that the input data for the code to be verified can be easily obtained, and that the results of calculation can be presented in the form of a comparison with measurement. Geometry and material composition as well as measured data are stored in the data bank. This data bank system is composed of four sub-programs: (1) a registration program, (2) an information retrieval program, (3) a maintenance program, and (4) a figure representation program. In this report, the structure of this data bank system and how to use the system are explained. An example of the use of this system is also included. (Aoki, K.)
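
    As a rough illustration of the four sub-programs listed above (the JAERI system itself is not described at this level of detail, and the record fields here are invented), a minimal in-memory data bank might look like this:

```python
# Illustrative sketch only; field names and the C/E comparison are assumptions.

class ExperimentDataBank:
    def __init__(self):
        self._records = {}

    # (1) registration: store geometry, composition and measured data
    def register(self, exp_id, geometry, composition, measurements):
        self._records[exp_id] = {"geometry": geometry,
                                 "composition": composition,
                                 "measurements": measurements}

    # (2) information retrieval: look up experiments by a simple predicate
    def retrieve(self, predicate):
        return {k: v for k, v in self._records.items() if predicate(v)}

    # (3) maintenance: update stored records
    def update(self, exp_id, **fields):
        self._records[exp_id].update(fields)

    # (4) figure representation: compare a calculation against the measurement
    def compare(self, exp_id, calculated):
        measured = self._records[exp_id]["measurements"]
        return {pt: calculated[pt] / measured[pt] for pt in measured}


bank = ExperimentDataBank()
bank.register("LWR-001", geometry="3x3 pin lattice", composition="UO2 3.5%",
              measurements={"k_eff": 1.002})
print(bank.compare("LWR-001", {"k_eff": 0.998}))  # C/E ratio ~ 0.996
```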

  16. The Integrated Safety Management System Verification Enhancement Review of the Plutonium Finishing Plant (PFP)

    International Nuclear Information System (INIS)

    BRIGGS, C.R.

    2000-01-01

    The primary purpose of the verification enhancement review was for the DOE Richland Operations Office (RL) to verify contractor readiness for the independent DOE Integrated Safety Management System Verification (ISMSV) on the Plutonium Finishing Plant (PFP). Secondary objectives included: (1) to reinforce the engagement of management and to gauge management commitment and accountability; (2) to evaluate the ''value added'' benefit of direct public involvement; (3) to evaluate the ''value added'' benefit of direct worker involvement; (4) to evaluate the ''value added'' benefit of the panel-to-panel review approach; and, (5) to evaluate the utility of the review's methodology/adaptability to periodic assessments of ISM status. The review was conducted on December 6-8, 1999, and involved the conduct of two-hour interviews with five separate panels of individuals with various management and operations responsibilities related to PFP. A semi-structured interview process was employed by a team of five ''reviewers'' who directed open-ended questions to the panels which focused on: (1) evidence of management commitment, accountability, and involvement; and, (2) consideration and demonstration of stakeholder (including worker) information and involvement opportunities. The purpose of a panel-to-panel dialogue approach was to better spotlight: (1) areas of mutual reinforcement and alignment that could serve as good examples of the management commitment and accountability aspects of ISMS implementation, and, (2) areas of potential discrepancy that could provide opportunities for improvement. In summary, the Review Team found major strengths to include: (1) the use of multi-disciplinary project work teams to plan and do work; (2) the availability and broad usage of multiple tools to help with planning and integrating work; (3) senior management presence and accessibility; (4) the institutionalization of worker involvement; (5) encouragement of self-reporting and self

  17. Design exploration and verification platform, based on high-level modeling and FPGA prototyping, for fast and flexible digital communication in physics experiments

    International Nuclear Information System (INIS)

    Magazzù, G; Borgese, G; Costantino, N; Fanucci, L; Saponara, S; Incandela, J

    2013-01-01

    In many research fields, such as high energy physics (HEP), astrophysics, nuclear medicine or space engineering with harsh operating conditions, the use of fast and flexible digital communication protocols is becoming more and more important. The possibility of having a smart and tested top-down design flow for the design of a new protocol for control/readout of front-end electronics is very useful. To this aim, and to reduce development time, costs and risks, this paper describes an innovative design/verification flow applied, as an example case study, to a new communication protocol called FF-LYNX. After the description of the main FF-LYNX features, the paper presents: the definition of a parametric SystemC-based Integrated Simulation Environment (ISE) for high-level protocol definition and validation; the set-up of figures of merit to drive the design space exploration; the use of the ISE for early analysis of the achievable performance when adopting the new communication protocol and its interfaces for a new (or upgraded) physics experiment; the design of VHDL IP cores for the TX and RX protocol interfaces; their implementation on an FPGA-based emulator for functional verification; and finally the modification of the FPGA-based emulator for testing the ASIC chipset which implements the rad-tolerant protocol interfaces. For every step, significant results will be shown to underline the usefulness of this design and verification approach, which can be applied to any new digital protocol development for smart detectors in physics experiments.

  18. Design exploration and verification platform, based on high-level modeling and FPGA prototyping, for fast and flexible digital communication in physics experiments

    Science.gov (United States)

    Magazzù, G.; Borgese, G.; Costantino, N.; Fanucci, L.; Incandela, J.; Saponara, S.

    2013-02-01

    In many research fields, such as high energy physics (HEP), astrophysics, nuclear medicine or space engineering with harsh operating conditions, the use of fast and flexible digital communication protocols is becoming more and more important. The possibility of having a smart and tested top-down design flow for the design of a new protocol for control/readout of front-end electronics is very useful. To this aim, and to reduce development time, costs and risks, this paper describes an innovative design/verification flow applied, as an example case study, to a new communication protocol called FF-LYNX. After the description of the main FF-LYNX features, the paper presents: the definition of a parametric SystemC-based Integrated Simulation Environment (ISE) for high-level protocol definition and validation; the set-up of figures of merit to drive the design space exploration; the use of the ISE for early analysis of the achievable performance when adopting the new communication protocol and its interfaces for a new (or upgraded) physics experiment; the design of VHDL IP cores for the TX and RX protocol interfaces; their implementation on an FPGA-based emulator for functional verification; and finally the modification of the FPGA-based emulator for testing the ASIC chipset which implements the rad-tolerant protocol interfaces. For every step, significant results will be shown to underline the usefulness of this design and verification approach, which can be applied to any new digital protocol development for smart detectors in physics experiments.
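
    The "figures of merit" driving the design space exploration are not spelled out in the abstract; the sketch below illustrates the general idea with invented parameters and weights, and is not the FF-LYNX toolchain.

```python
# Illustrative design-space-exploration sketch; all numbers are assumptions.
from itertools import product

def figures_of_merit(header_bits, payload_bits, redundancy):
    frame = header_bits + payload_bits * (1 + redundancy)
    return {
        "efficiency": payload_bits / frame,   # useful fraction of the frame
        "latency":    frame,                  # clock cycles per frame (1 bit/cycle assumed)
        "robustness": redundancy,             # crude proxy for error tolerance
    }

def score(fom, weights):
    # Lower latency is better, so it enters with a negative contribution.
    return (weights["efficiency"] * fom["efficiency"]
            - weights["latency"] * fom["latency"] / 1000.0
            + weights["robustness"] * fom["robustness"])

weights = {"efficiency": 1.0, "latency": 0.5, "robustness": 0.3}
candidates = product((8, 16), (128, 256, 512), (0.0, 0.25, 0.5))
best = max(candidates, key=lambda c: score(figures_of_merit(*c), weights))
print("best (header_bits, payload_bits, redundancy):", best)
```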

  19. Design, Development, and Automated Verification of an Integrity-Protected Hypervisor

    Science.gov (United States)

    2012-07-16

    … also require considerable manual effort; for example, the formal verification of the seL4 operating system kernel [45] required several person-years of effort.

  20. Integrated Medical Model (IMM) Project Verification, Validation, and Credibility (VVandC)

    Science.gov (United States)

    Walton, M.; Boley, L.; Keenan, L.; Kerstman, E.; Shah, R.; Young, M.; Saile, L.; Garcia, Y.; Meyers, J.; Reyes, D.

    2015-01-01

    The Integrated Medical Model (IMM) Project supports end user requests by employing the Integrated Medical Evidence Database (iMED) and IMM tools as well as subject matter expertise within the Project. The iMED houses data used by the IMM. The IMM is designed to forecast relative changes for a specified set of crew health and mission success risk metrics by using a probabilistic model based on historical data, cohort data, and subject matter expert opinion. A stochastic approach is taken because deterministic results would not appropriately reflect the uncertainty in the IMM inputs. Once the IMM was conceptualized, a plan was needed to rigorously assess input information, framework and code, and output results of the IMM, and ensure that end user requests and requirements were considered during all stages of model development and implementation, as well as lay the foundation for external review and application. METHODS: In 2008, the Project team developed a comprehensive verification and validation (VV) plan, which specified internal and external review criteria encompassing 1) verification of data and IMM structure to ensure proper implementation of the IMM, 2) several validation techniques to confirm that the simulation capability of the IMM appropriately represents occurrences and consequences of medical conditions during space missions, and 3) credibility processes to develop user confidence in the information derived from the IMM. When the NASA-STD-7009 (7009) [1] was published, the Project team updated their verification, validation, and credibility (VVC) project plan to meet 7009 requirements and include 7009 tools in reporting VVC status of the IMM. Construction of these tools included meeting documentation and evidence requirements sufficient to meet external review success criteria. RESULTS: IMM Project VVC updates are compiled recurrently and include updates to the 7009 Compliance and Credibility matrices. Reporting tools have evolved over the lifetime of
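
    As a conceptual illustration of the stochastic approach described above (not the IMM itself; all incidence rates, probabilities and the risk metric are invented), a Monte Carlo forecast of a single mission-level metric might look like this:

```python
# Conceptual Monte Carlo sketch with invented data; not the IMM.
import numpy as np

rng = np.random.default_rng(0)
MISSION_DAYS = 180
CONDITIONS = {               # mean events per person-year (hypothetical)
    "back_pain":    0.8,
    "skin_rash":    0.5,
    "kidney_stone": 0.02,
}
EVACUATION_PROB = {"back_pain": 0.001, "skin_rash": 0.0, "kidney_stone": 0.15}

def simulate_once():
    evacuations = 0
    for cond, rate in CONDITIONS.items():
        # lognormal multiplier models uncertainty in the incidence rate
        lam = rate * rng.lognormal(0.0, 0.3) * MISSION_DAYS / 365.0
        events = rng.poisson(lam)
        evacuations += rng.binomial(events, EVACUATION_PROB[cond])
    return evacuations > 0

trials = 50_000
p_evac = np.mean([simulate_once() for _ in range(trials)])
print(f"estimated probability of at least one evacuation: {p_evac:.4f}")
```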

  1. GPFS HPSS Integration: Implementation Experience

    Energy Technology Data Exchange (ETDEWEB)

    Hazen, Damian; Hick, Jason

    2008-08-12

    In 2005 NERSC and IBM Global Services Federal began work to develop an integrated HSM solution using the GPFS file system and the HPSS hierarchical storage system. It was foreseen that this solution would play a key role in data management at NERSC, and fill a market niche for IBM. As with many large and complex software projects, there were a number of unforeseen difficulties encountered during implementation. As the effort progressed, it became apparent that DMAPI alone could not be used to tie two distributed, high performance systems together without serious impact on performance. This document discusses the evolution of the development effort, from one which attempted to synchronize the GPFS and HPSS name spaces relying solely on GPFS's implementation of the DMAPI specification, to one with a more traditional HSM functionality that had no synchronized namespace in HPSS, and finally to an effort, still underway, which will provide traditional HSM functionality, but requires features from the GPFS Information Lifecycle Management (ILM) to fully achieve this goal in a way which is scalable and meets the needs of sites with aggressive performance requirements. The last approach makes concessions to portability by using file system features such as ILM and snapshotting in order to achieve a scalable design.

  2. Further optimisations of constant Q cepstral processing for integrated utterance and text-dependent speaker verification

    DEFF Research Database (Denmark)

    Delgado, Hector; Todisco, Massimiliano; Sahidullah, Md

    2016-01-01

    Many authentication applications involving automatic speaker verification (ASV) demand robust performance using short-duration, fixed or prompted text utterances. Text constraints not only reduce the phone-mismatch between enrollment and test utterances, which generally leads to improved performa...
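
    Although the abstract is truncated, constant Q cepstral coefficients are broadly computed as a constant-Q transform followed by a log and a cepstral (DCT) step. The sketch below is a simplified illustration of that pipeline (it omits the uniform resampling step of full CQCC and is not the authors' implementation); it assumes librosa and scipy are available and uses a bundled librosa example clip as a stand-in utterance.

```python
# Simplified constant-Q cepstral feature sketch; not the full CQCC pipeline.
import numpy as np
import librosa
from scipy.fftpack import dct

def simple_cqcc(y, sr, n_coeff=20):
    # Constant-Q transform -> log power -> DCT along the frequency axis
    C = np.abs(librosa.cqt(y, sr=sr, hop_length=256, n_bins=84,
                           bins_per_octave=12))
    log_power = np.log(C ** 2 + 1e-10)
    return dct(log_power, type=2, axis=0, norm="ortho")[:n_coeff, :]

y, sr = librosa.load(librosa.example("trumpet"), sr=16000)  # stand-in utterance
features = simple_cqcc(y, sr)
print(features.shape)   # (n_coeff, n_frames)
```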

  3. Integrating software testing and run-time checking in an assertion verification framework

    OpenAIRE

    Mera, E.; López García, Pedro; Hermenegildo, Manuel V.

    2009-01-01

    We have designed and implemented a framework that unifies unit testing and run-time verification (as well as static verification and static debugging). A key contribution of our approach is that a unified assertion language is used for all of these tasks. We first propose methods for compiling runtime checks for (parts of) assertions which cannot be verified at compile-time via program transformation. This transformation allows checking preconditions and postconditions, including conditional...
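
    A loose Python analogue of the idea (the cited framework targets its own unified assertion language and compiles run-time checks via program transformation, which is not reproduced here) is a decorator that turns preconditions and postconditions into run-time checks around a function:

```python
# Illustrative run-time assertion checking; not the cited framework.
import functools

def check(pre=None, post=None):
    """Attach run-time precondition/postcondition checks to a function."""
    def decorate(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            if pre is not None:
                assert pre(*args, **kwargs), f"precondition of {fn.__name__} failed"
            result = fn(*args, **kwargs)
            if post is not None:
                assert post(result, *args, **kwargs), f"postcondition of {fn.__name__} failed"
            return result
        return wrapper
    return decorate

@check(pre=lambda xs: len(xs) > 0,
       post=lambda r, xs: min(xs) <= r <= max(xs))
def mean(xs):
    return sum(xs) / len(xs)

print(mean([1.0, 2.0, 3.0]))   # passes both checks
# mean([]) would raise AssertionError: precondition of mean failed
```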

  4. Integrating Fingerprint Verification into the Smart Card-Based Healthcare Information System

    OpenAIRE

    Jin-Won Park; Sung Bum Pan; Yongwha Chung; Daesung Moon

    2009-01-01

    As VLSI technology has been improved, a smart card employing 32-bit processors has been released, and more personal information such as medical, financial data can be stored in the card. Thus, it becomes important to protect personal information stored in the card. Verification of the card holder's identity using a fingerprint has advantages over the present practices of Personal Identification Numbers (PINs) and passwords. However, the computational workload of fingerprint verification i...

  5. Integrating Fingerprint Verification into the Smart Card-Based Healthcare Information System

    Directory of Open Access Journals (Sweden)

    Jin-Won Park

    2009-01-01

    Full Text Available As VLSI technology has been improved, a smart card employing 32-bit processors has been released, and more personal information such as medical, financial data can be stored in the card. Thus, it becomes important to protect personal information stored in the card. Verification of the card holder's identity using a fingerprint has advantages over the present practices of Personal Identification Numbers (PINs) and passwords. However, the computational workload of fingerprint verification is much heavier than that of the typical PIN-based solution. In this paper, we consider three strategies to implement fingerprint verification in a smart card environment and how to distribute the modules of fingerprint verification between the smart card and the card reader. We first evaluate the number of instructions of each step of a typical fingerprint verification algorithm, and estimate the execution time of several cryptographic algorithms to guarantee the security/privacy of the fingerprint data transmitted in the smart card with the client-server environment. Based on the evaluation results, we analyze each scenario with respect to the security level and the real-time execution requirements in order to implement fingerprint verification in the smart card with the client-server environment.

  6. Integrating Fingerprint Verification into the Smart Card-Based Healthcare Information System

    Science.gov (United States)

    Moon, Daesung; Chung, Yongwha; Pan, Sung Bum; Park, Jin-Won

    2009-12-01

    As VLSI technology has been improved, a smart card employing 32-bit processors has been released, and more personal information such as medical, financial data can be stored in the card. Thus, it becomes important to protect personal information stored in the card. Verification of the card holder's identity using a fingerprint has advantages over the present practices of Personal Identification Numbers (PINs) and passwords. However, the computational workload of fingerprint verification is much heavier than that of the typical PIN-based solution. In this paper, we consider three strategies to implement fingerprint verification in a smart card environment and how to distribute the modules of fingerprint verification between the smart card and the card reader. We first evaluate the number of instructions of each step of a typical fingerprint verification algorithm, and estimate the execution time of several cryptographic algorithms to guarantee the security/privacy of the fingerprint data transmitted in the smart card with the client-server environment. Based on the evaluation results, we analyze each scenario with respect to the security level and the real-time execution requirements in order to implement fingerprint verification in the smart card with the client-server environment.
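
    The partitioning analysis described above can be illustrated with a back-of-the-envelope model; the instruction counts, processor speeds and crypto costs below are invented for the example and are not the paper's figures.

```python
# Back-of-the-envelope partitioning sketch with invented numbers.

STEPS = {                       # millions of instructions per step (assumed)
    "image_preprocessing": 30.0,
    "feature_extraction":  60.0,
    "matching":             5.0,
}
CARD_MIPS, READER_MIPS = 5.0, 500.0          # effective MIPS (assumed)
AES_MS_ON_CARD_PER_KB = 2.0                  # crypto cost when data leaves the card

def scenario_time(steps_on_card, transferred_kb):
    t = 0.0
    for step, mi in STEPS.items():
        t += mi / (CARD_MIPS if step in steps_on_card else READER_MIPS) * 1000.0
    t += AES_MS_ON_CARD_PER_KB * transferred_kb   # protect data sent off-card
    return t  # milliseconds

print("everything on the card: %.0f ms" % scenario_time(set(STEPS), 0))
print("match-on-card only:     %.0f ms" % scenario_time({"matching"}, 1))
print("everything on reader:   %.0f ms" % scenario_time(set(), 50))
```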

  7. Verification of atmospheric diffusion models using data of long term atmospheric diffusion experiments

    International Nuclear Information System (INIS)

    Tamura, Junji; Kido, Hiroko; Hato, Shinji; Homma, Toshimitsu

    2009-03-01

    Straight-line or segmented plume models are commonly used as atmospheric diffusion models in probabilistic accident consequence assessment (PCA) codes due to cost and time savings. The PCA code OSCAAR, developed by the Japan Atomic Energy Research Institute (present: Japan Atomic Energy Agency), uses the variable puff trajectory model to calculate atmospheric transport and dispersion of released radionuclides. In order to investigate uncertainties involved with the structure of the atmospheric dispersion/deposition model in OSCAAR, we have introduced more sophisticated computer codes, the regional meteorological model RAMS and the atmospheric transport model HYPACT, which were developed at Colorado State University, and comparative analyses between OSCAAR and RAMS/HYPACT have been performed. In this study, model verification of OSCAAR and RAMS/HYPACT was conducted using data from long term atmospheric diffusion experiments, which were carried out in Tokai-mura, Ibaraki-ken. The model predictions and the results of the atmospheric diffusion experiments showed relatively good agreement, and the performance of OSCAAR was shown to be comparable to that of RAMS/HYPACT. (author)
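
    The abstract does not list the comparison statistics used; as a generic illustration of this kind of model-to-measurement verification, the sketch below computes two common metrics, fractional bias and the FAC2 fraction (predictions within a factor of two of observations), on invented sample values.

```python
# Generic model-verification metrics on invented sample data.
import numpy as np

observed  = np.array([1.2, 0.8, 3.5, 0.05, 2.1])   # e.g. normalized concentrations
predicted = np.array([1.0, 1.1, 2.6, 0.12, 1.7])

def fractional_bias(obs, pred):
    return 2.0 * (obs.mean() - pred.mean()) / (obs.mean() + pred.mean())

def fac2(obs, pred):
    ratio = pred / obs
    return np.mean((ratio >= 0.5) & (ratio <= 2.0))

print(f"FB   = {fractional_bias(observed, predicted):+.2f}")
print(f"FAC2 = {fac2(observed, predicted):.2f}")
```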

  8. Practical experience with a local verification system for containment and surveillance sensors

    International Nuclear Information System (INIS)

    Lauppe, W.D.; Richter, B.; Stein, G.

    1984-01-01

    With the growing number of nuclear facilities and a number of large commercial bulk handling facilities steadily coming into operation the International Atomic Energy Agency is faced with increasing requirements as to reducing its inspection efforts. One means of meeting these requirements will be to deploy facility based remote interrogation methods for its containment and surveillance instrumentation. Such a technical concept of remote interrogation was realized through the so-called LOVER system development, a local verification system for electronic safeguards seal systems. In the present investigations the application was extended to radiation monitoring by introducing an electronic interface between the electronic safeguards seal and the neutron detector electronics of a waste monitoring system. The paper discusses the safeguards motivation and background, the experimental setup of the safeguards system and the performance characteristics of this LOVER system. First conclusions can be drawn from the performance results with respect to the applicability in international safeguards. This comprises in particular the definition of design specifications for an integrated remote interrogation system for various types of containment and surveillance instruments and the specifications of safeguards applications employing such a system

  9. DOE handbook: Integrated safety management systems (ISMS) verification. Team leader's handbook

    International Nuclear Information System (INIS)

    1999-06-01

    The primary purpose of this handbook is to provide guidance to the ISMS verification Team Leader and the verification team in conducting ISMS verifications. The handbook describes methods and approaches for the review of the ISMS documentation (Phase I) and ISMS implementation (Phase II) and provides information useful to the Team Leader in preparing the review plan, selecting and training the team, coordinating the conduct of the verification, and documenting the results. The process and techniques described are based on the results of several pilot ISMS verifications that have been conducted across the DOE complex. A secondary purpose of this handbook is to provide information useful in developing DOE personnel to conduct these reviews. Specifically, this handbook describes methods and approaches to: (1) Develop the scope of the Phase 1 and Phase 2 review processes to be consistent with the history, hazards, and complexity of the site, facility, or activity; (2) Develop procedures for the conduct of the Phase 1 review, validating that the ISMS documentation satisfies the DEAR clause as amplified in DOE Policies 450.4, 450.5, 450.6 and associated guidance and that DOE can effectively execute responsibilities as described in the Functions, Responsibilities, and Authorities Manual (FRAM); (3) Develop procedures for the conduct of the Phase 2 review, validating that the description approved by the Approval Authority, following or concurrent with the Phase 1 review, has been implemented; and (4) Describe a methodology by which the DOE ISMS verification teams will be advised, trained, and/or mentored to conduct subsequent ISMS verifications. The handbook provides proven methods and approaches for verifying that commitments related to the DEAR, the FRAM, and associated amplifying guidance are in place and implemented in nuclear and high risk facilities. This handbook also contains useful guidance to line managers when preparing for a review of ISMS for radiological

  10. Analytical and Experimental Verification of a Flight Article for a Mach-8 Boundary-Layer Experiment

    Science.gov (United States)

    Richards, W. Lance; Monaghan, Richard C.

    1996-01-01

    Preparations for a boundary-layer transition experiment to be conducted on a future flight mission of the air-launched Pegasus(TM) rocket are underway. The experiment requires a flight-test article called a glove to be attached to the wing of the Mach-8 first-stage booster. A three-dimensional, nonlinear finite-element analysis has been performed and significant small-scale laboratory testing has been accomplished to ensure the glove design integrity and quality of the experiment. Reliance on both the analysis and experiment activities has been instrumental in the success of the flight-article design. Results obtained from the structural analysis and laboratory testing show that all glove components are well within the allowable thermal stress and deformation requirements to satisfy the experiment objectives.

  11. Office of River Protection Integrated Safety Management System Phase 1 Verification Corrective Action Plan; FINAL

    International Nuclear Information System (INIS)

    CLARK, D.L.

    1999-01-01

    The purpose of this Corrective Action Plan is to demonstrate the ORP planned and/or completed actions to implement ISMS as well as prepare for the RPP ISMS Phase II Verification scheduled for August, 1999. This Plan collates implied or explicit ORP actions identified in several key ISMS documents and aligns those actions and responsibilities perceived necessary to appropriately disposition all ISMS Phase II preparation activities specific to the ORP. The objective will be to complete or disposition the corrective actions prior to the commencement of the ISMS Phase II Verification. Improvement products/tasks not slated for completion prior to the RPP Phase II verification will be incorporated as corrective actions into the Strategic System Execution Plan (SSEP) Gap Analysis. Many of the business and management systems that were reviewed in the ISMS Phase I verification are being modified to support the ORP transition and are being assessed through the SSEP. The actions and processes identified in the SSEP will support the development of the ORP and continued ISMS implementation as committed to be complete by the end of FY-2000.

  12. Office of River Protection Integrated Safety Management System Phase 1 Verification Corrective Action Plan

    International Nuclear Information System (INIS)

    CLARK, D.L.

    1999-01-01

    The purpose of this Corrective Action Plan is to demonstrate the ORP planned and/or completed actions to implement ISMS as well as prepare for the RPP ISMS Phase II Verification scheduled for August, 1999. This Plan collates implied or explicit ORP actions identified in several key ISMS documents and aligns those actions and responsibilities perceived necessary to appropriately disposition all ISMS Phase II preparation activities specific to the ORP. The objective will be to complete or disposition the corrective actions prior to the commencement of the ISMS Phase II Verification. Improvement products/tasks not slated for completion prior to the RPP Phase II verification will be incorporated as corrective actions into the Strategic System Execution Plan (SSEP) Gap Analysis. Many of the business and management systems that were reviewed in the ISMS Phase I verification are being modified to support the ORP transition and are being assessed through the SSEP. The actions and processes identified in the SSEP will support the development of the ORP and continued ISMS implementation as committed to be complete by the end of FY-2000.

  13. Neutron radiography experiments for verification of soluble boron mixing and transport modeling under natural circulation conditions

    International Nuclear Information System (INIS)

    Morlang, M.M.; Feltus, M.A.

    1996-01-01

    The use of neutron radiography for visualization of fluid flow through flow visualization modules has been very successful. Current experiments at the Penn State Breazeale Reactor serve to verify the mixing and transport of soluble boron under natural flow conditions as would be experienced in a pressurized water reactor. Different flow geometries have been modeled, including holes, slots, and baffles. Flow modules are constructed of aluminum box material 1 1/2 inches by 4 inches in varying lengths. An experimental flow system was built which pumps fluid to a head tank, and natural circulation flow occurs from the head tank through the flow visualization module to be radiographed. The entire flow system is mounted on a portable assembly to allow placement of the flow visualization module in front of the neutron beam port. A neutron-transparent Fluorinert fluid is used to simulate water at different densities. Boron is modeled by gadolinium oxide powder as a tracer element, which is placed in a mixing assembly and injected into the system by a remotely operated electric valve once the reactor is at power. The entire sequence is recorded on real-time video. Still photographs are made frame-by-frame from the video tape. Computers are used to digitally enhance the video and still photographs. The data obtained from the enhancement will be used for verification of simple geometry predictions using the TRAC and RELAP thermal-hydraulic codes. A detailed model of a reactor vessel inlet plenum, downcomer region, flow distribution area and core inlet is being constructed to model the AP600 plenum. Successive radiography experiments of each section of the model under identical conditions will provide a complete vessel/core model for comparison with the thermal-hydraulic codes.

  14. Neutron radiography experiments for verification of soluble boron mixing and transport modeling under natural circulation conditions

    International Nuclear Information System (INIS)

    Feltus, M.A.; Morlang, G.M.

    1996-01-01

    The use of neutron radiography for visualization of fluid flow through flow visualization modules has been very successful. Current experiments at the Penn State Breazeale Reactor serve to verify the mixing and transport of soluble boron under natural flow conditions as would be experienced in a pressurized water reactor. Different flow geometries have been modeled, including holes, slots, and baffles. Flow modules are constructed of aluminum box material 1 1/2 inches by 4 inches in varying lengths. An experimental flow system was built which pumps fluid to a head tank, and natural circulation flow occurs from the head tank through the flow visualization module to be radiographed. The entire flow system is mounted on a portable assembly to allow placement of the flow visualization module in front of the neutron beam port. A neutron-transparent Fluorinert fluid is used to simulate water at different densities. Boron is modeled by gadolinium oxide powder as a tracer element, which is placed in a mixing assembly and injected into the system by a remotely operated electric valve once the reactor is at power. The entire sequence is recorded on real-time video. Still photographs are made frame-by-frame from the video tape. Computers are used to digitally enhance the video and still photographs. The data obtained from the enhancement will be used for verification of simple geometry predictions using the TRAC and RELAP thermal-hydraulic codes. A detailed model of a reactor vessel inlet plenum, downcomer region, flow distribution area and core inlet is being constructed to model the AP600 plenum. Successive radiography experiments of each section of the model under identical conditions will provide a complete vessel/core model for comparison with the thermal-hydraulic codes.

  15. First Experience With Real-Time EPID-Based Delivery Verification During IMRT and VMAT Sessions

    International Nuclear Information System (INIS)

    Woodruff, Henry C.; Fuangrod, Todsaporn; Van Uytven, Eric; McCurdy, Boyd M.C.; Beek, Timothy van; Bhatia, Shashank; Greer, Peter B.

    2015-01-01

    Purpose: Gantry-mounted megavoltage electronic portal imaging devices (EPIDs) have become ubiquitous on linear accelerators. WatchDog is a novel application of EPIDs, in which the image frames acquired during treatment are used to monitor treatment delivery in real time. We report on the preliminary use of WatchDog in a prospective study of cancer patients undergoing intensity modulated radiation therapy (IMRT) and volumetric modulated arc therapy (VMAT) and identify the challenges of clinical adoption. Methods and Materials: At the time of submission, 28 cancer patients (head and neck, pelvis, and prostate) undergoing fractionated external beam radiation therapy (24 IMRT, 4 VMAT) had ≥1 treatment fraction verified in real time (131 fractions or 881 fields). EPID images acquired continuously during treatment were synchronized and compared with model-generated transit EPID images within a frame time (∼0.1 s). A χ comparison was performed on cumulative frames to gauge the overall delivery quality, and the resulting pass rates were reported graphically during treatment delivery. Every frame acquired (500-1500 per fraction) was saved for postprocessing and analysis. Results: The system reported a mean ± standard deviation real-time χ pass rate of 91.1% ± 11.5% (83.6% ± 13.2%) for cumulative-frame χ analysis with 4%, 4 mm (3%, 3 mm) criteria, evaluated globally over the integrated image. Conclusions: A real-time EPID-based radiation delivery verification system for IMRT and VMAT has been demonstrated that aims to prevent major mistreatments in radiation therapy.

  16. First Experience With Real-Time EPID-Based Delivery Verification During IMRT and VMAT Sessions

    Energy Technology Data Exchange (ETDEWEB)

    Woodruff, Henry C., E-mail: henry.woodruff@newcastle.edu.au [Faculty of Science and Information Technology, School of Mathematical and Physical Sciences, University of Newcastle, New South Wales (Australia); Fuangrod, Todsaporn [Faculty of Engineering and Built Environment, School of Electrical Engineering and Computer Science, University of Newcastle, New South Wales (Australia); Van Uytven, Eric; McCurdy, Boyd M.C.; Beek, Timothy van [Division of Medical Physics, CancerCare Manitoba, Winnipeg, Manitoba (Canada); Department of Physics and Astronomy, University of Manitoba, Winnipeg, Manitoba (Canada); Department of Radiology, University of Manitoba, Winnipeg, Manitoba (Canada); Bhatia, Shashank [Department of Radiation Oncology, Calvary Mater Newcastle Hospital, Newcastle, New South Wales (Australia); Greer, Peter B. [Faculty of Science and Information Technology, School of Mathematical and Physical Sciences, University of Newcastle, New South Wales (Australia); Department of Radiation Oncology, Calvary Mater Newcastle Hospital, Newcastle, New South Wales (Australia)

    2015-11-01

    Purpose: Gantry-mounted megavoltage electronic portal imaging devices (EPIDs) have become ubiquitous on linear accelerators. WatchDog is a novel application of EPIDs, in which the image frames acquired during treatment are used to monitor treatment delivery in real time. We report on the preliminary use of WatchDog in a prospective study of cancer patients undergoing intensity modulated radiation therapy (IMRT) and volumetric modulated arc therapy (VMAT) and identify the challenges of clinical adoption. Methods and Materials: At the time of submission, 28 cancer patients (head and neck, pelvis, and prostate) undergoing fractionated external beam radiation therapy (24 IMRT, 4 VMAT) had ≥1 treatment fraction verified in real time (131 fractions or 881 fields). EPID images acquired continuously during treatment were synchronized and compared with model-generated transit EPID images within a frame time (∼0.1 s). A χ comparison was performed on cumulative frames to gauge the overall delivery quality, and the resulting pass rates were reported graphically during treatment delivery. Every frame acquired (500-1500 per fraction) was saved for postprocessing and analysis. Results: The system reported a mean ± standard deviation real-time χ pass rate of 91.1% ± 11.5% (83.6% ± 13.2%) for cumulative-frame χ analysis with 4%, 4 mm (3%, 3 mm) criteria, evaluated globally over the integrated image. Conclusions: A real-time EPID-based radiation delivery verification system for IMRT and VMAT has been demonstrated that aims to prevent major mistreatments in radiation therapy.
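
    As a simplified illustration of the χ comparison described above (this is not the WatchDog implementation; pixel size, criteria and the toy images are assumptions), the χ index divides the local dose difference by a tolerance combining the dose criterion with the local gradient times the distance criterion, and the pass rate is the fraction of pixels with |χ| ≤ 1:

```python
# Simplified chi-type frame comparison on toy images; not the clinical system.
import numpy as np

def chi_pass_rate(measured, predicted, pixel_mm=1.0,
                  dose_crit=0.04, dist_crit_mm=4.0):
    ref_max = predicted.max()
    gy, gx = np.gradient(predicted, pixel_mm)       # dose gradient per mm
    grad_mag = np.hypot(gx, gy)
    tolerance = np.sqrt((dose_crit * ref_max) ** 2 +
                        (dist_crit_mm * grad_mag) ** 2)
    chi = (measured - predicted) / tolerance
    return np.mean(np.abs(chi) <= 1.0)

rng = np.random.default_rng(1)
predicted = np.outer(np.hanning(64), np.hanning(64))        # toy dose image
measured = predicted * (1.0 + 0.02 * rng.standard_normal(predicted.shape))
print(f"cumulative-frame pass rate: {100 * chi_pass_rate(measured, predicted):.1f}%")
```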

  17. Integrated Safety Management System Phase 1 and 2 Verification for the Environmental Restoration Contractor Volumes 1 and 2

    Energy Technology Data Exchange (ETDEWEB)

    CARTER, R.P.

    2000-04-04

    DOE Policy 450.4 mandates that safety be integrated into all aspects of the management and operations of its facilities. The goal of an institutionalized Integrated Safety Management System (ISMS) is to have a single integrated system that includes Environment, Safety, and Health requirements in the work planning and execution processes to ensure the protection of the worker, public, environment, and federal property over the life cycle of the Environmental Restoration (ER) Project. The purpose of this Environmental Restoration Contractor (ERC) ISMS Phase I/II Verification was to determine whether ISMS programs and processes were institutionalized within the ER Project, whether these programs and processes were implemented, and whether the system had promoted the development of a safety conscious work culture.

  18. Design and Verification of Application Specific Integrated Circuits in a Network of Online Labs

    Directory of Open Access Journals (Sweden)

    A.Y. Al-Zoubi

    2009-08-01

    Full Text Available A solution to implement a remote laboratory for testing and designing analog Application-Specific Integrated Circuits of the type (ispPAC10 is presented. The application allows electrical engineering students to access and perform measurements and conduct analog electronics experiments over the internet. PAC-Designer software, running on a Citrix server, is used in the circuit design in which the signals are generated and the responses are acquired by a data acquisition board controlled by LabVIEW. Three interconnected remote labs located in three different continents will be implementing the proposed system.

  19. INTEGRATION POLICY TOWARDS IMMIGRANTS: CURRENT EXPERIENCE

    Directory of Open Access Journals (Sweden)

    Nadiia Bureiko

    2012-03-01

    Full Text Available In the contemporary world the intensity of the immigration movements is constantly increasing. Countries which experience great immigrant flows are facing numerous problems which should be solved. The article studies the current immigration flows in EU countries, the United States of America and Canada and presents three main models of integration policy towards immigrants – political assimilation, functional integration and multicultural model. Separate models are distinguished for the Muslims’ integration. The author examines the peculiarities of every model and examines the conclusions provided by the Migrant Integration Policy Index (MIPEX concerning the situation of the immigrants’ integration in 31 countries in 2011. Among all the policy indicators the first that are defined are as follows: political participation, education, labour market mobility and anti-discrimination. The situation with immigrants’ integration in Ukraine is also studied as it is gaining a great attention of the authorities and the public. The measures and practical steps done regarding this situation in Ukraine in recent years are analyzed using the information offered by the State Migration Service of Ukraine.

  20. Validating the BISON fuel performance code to integral LWR experiments

    Energy Technology Data Exchange (ETDEWEB)

    Williamson, R.L., E-mail: Richard.Williamson@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Gamble, K.A., E-mail: Kyle.Gamble@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Perez, D.M., E-mail: Danielle.Perez@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Novascone, S.R., E-mail: Stephen.Novascone@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Pastore, G., E-mail: Giovanni.Pastore@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Gardner, R.J., E-mail: Russell.Gardner@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Hales, J.D., E-mail: Jason.Hales@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Liu, W., E-mail: Wenfeng.Liu@anatech.com [ANATECH Corporation, 5435 Oberlin Dr., San Diego, CA 92121 (United States); Mai, A., E-mail: Anh.Mai@anatech.com [ANATECH Corporation, 5435 Oberlin Dr., San Diego, CA 92121 (United States)

    2016-05-15

    Highlights: • The BISON multidimensional fuel performance code is being validated to integral LWR experiments. • Code and solution verification are necessary prerequisites to validation. • Fuel centerline temperature comparisons through all phases of fuel life are very reasonable. • Accuracy in predicting fission gas release is consistent with state-of-the-art modeling and the involved uncertainties. • Rod diameter comparisons are not satisfactory and further investigation is underway. - Abstract: BISON is a modern finite element-based nuclear fuel performance code that has been under development at Idaho National Laboratory (INL) since 2009. The code is applicable to both steady and transient fuel behavior and has been used to analyze a variety of fuel forms in 1D spherical, 2D axisymmetric, or 3D geometries. Code validation is underway and is the subject of this study. A brief overview of BISON's computational framework, governing equations, and general material and behavioral models is provided. BISON code and solution verification procedures are described, followed by a summary of the experimental data used to date for validation of Light Water Reactor (LWR) fuel. Validation comparisons focus on fuel centerline temperature, fission gas release, and rod diameter both before and following fuel-clad mechanical contact. Comparisons for 35 LWR rods are consolidated to provide an overall view of how the code is predicting physical behavior, with a few select validation cases discussed in greater detail. Results demonstrate that (1) fuel centerline temperature comparisons through all phases of fuel life are very reasonable with deviations between predictions and experimental data within ±10% for early life through high burnup fuel and only slightly out of these bounds for power ramp experiments, (2) accuracy in predicting fission gas release appears to be consistent with state-of-the-art modeling and with the involved uncertainties and (3) comparison

  1. ORNL fusion reactor shielding integral experiments

    International Nuclear Information System (INIS)

    Santoro, R.T.; Alsmiller, R.G. Jr.; Barnes, J.M.; Chapman, G.T.

    1980-01-01

    Integral experiments that measure the neutron and gamma-ray energy spectra resulting from the attenuation of approx. 14 MeV T(D,n)⁴He reaction neutrons in laminated slabs of stainless steel type 304, borated polyethylene, and a tungsten alloy (Hevimet) and from neutrons streaming through a 30-cm-diameter iron duct (L/D = 3) imbedded in a concrete shield have been performed. The facility, the NE-213 liquid scintillator detector system, and the experimental techniques used to obtain the measured data are described. The two-dimensional discrete ordinates radiation transport codes, calculational models, and nuclear data used in the analysis of the experiments are reviewed.

  2. Hydraulic experiment on formation mechanism of tsunami deposit and verification of sediment transport model for tsunamis

    Science.gov (United States)

    Yamamoto, A.; Takahashi, T.; Harada, K.; Sakuraba, M.; Nojima, K.

    2017-12-01

    observation of velocity in Kesennuma Bay had a low accuracy. On the other hand, this hydraulic experiment measured accurate velocity and sand deposition distributions under various conditions. Based on these data, we attempted a more accurate verification of the model of Takahashi et al. (1999).

  3. The developments and verifications of trace model for IIST LOCA experiments

    Energy Technology Data Exchange (ETDEWEB)

    Zhuang, W. X. [Inst. of Nuclear Engineering and Science, National Tsing-Hua Univ., Taiwan, No. 101, Kuang-Fu Road, Hsinchu 30013, Taiwan (China); Wang, J. R.; Lin, H. T. [Inst. of Nuclear Energy Research, Taiwan, No. 1000, Wenhua Rd., Longtan Township, Taoyuan County 32546, Taiwan (China); Shih, C.; Huang, K. C. [Inst. of Nuclear Engineering and Science, National Tsing-Hua Univ., Taiwan, No. 101, Kuang-Fu Road, Hsinchu 30013, Taiwan (China); Dept. of Engineering and System Science, National Tsing-Hua Univ., Taiwan, No. 101, Kuang-Fu Road, Hsinchu 30013, Taiwan (China)

    2012-07-01

    The test facility IIST (INER Integral System Test) is a Reduced-Height and Reduced-Pressure (RHRP) integral test loop, which was constructed for the purpose of conducting thermal hydraulic and safety analyses of Westinghouse three-loop PWR nuclear power plants. The main purpose of this study is to develop and verify TRACE models of IIST through the IIST small break loss of coolant accident (SBLOCA) experiments. First, two different IIST TRACE models, a pipe-vessel model and a 3-D vessel component model, have been built. The steady state and transient calculation results show that both TRACE models have the ability to simulate the related IIST experiments. Compared with the IIST SBLOCA experiment data, the 3-D vessel component model has shown better simulation capabilities, so it has been chosen for all further thermal hydraulic studies. The second step is a sensitivity study of the two-phase multiplier and subcooled liquid multiplier in the choked flow model, and of two correlation constants in the CCFL model. As a result, an appropriate set of multipliers and constants can be determined; a purely illustrative sketch of such a parameter sweep is given below. In summary, a verified IIST TRACE model with a 3-D vessel component and fine-tuned choked flow and CCFL models is established for further studies of IIST experiments in the future. (authors)
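
    The parameter sweep below is a minimal, hypothetical illustration of how such a sensitivity study can be organized in code; it does not use the actual TRACE input format, and run_model merely stands in for a real simulation run:

      # Illustrative sensitivity sweep over choked-flow multipliers and CCFL constants
      # (hypothetical model response and target value; not the TRACE workflow itself).
      import itertools

      def run_model(two_phase_mult, subcooled_mult, ccfl_c1, ccfl_c2):
          # Placeholder for a code run; in practice this would drive a modified input deck.
          return 1.0 * two_phase_mult + 0.5 * subcooled_mult + 0.2 * ccfl_c1 + 0.1 * ccfl_c2

      MEASURED = 1.45  # hypothetical experimental value to match

      candidates = itertools.product([0.8, 1.0, 1.2],   # two-phase multiplier
                                     [0.8, 1.0, 1.2],   # subcooled liquid multiplier
                                     [0.7, 1.0],        # CCFL correlation constant 1
                                     [0.7, 1.0])        # CCFL correlation constant 2

      best = min(candidates, key=lambda p: abs(run_model(*p) - MEASURED))
      print("best-fitting parameter set:", best)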

  4. Six years of experience in the planning and verification of the IMRT dynamics with portal dosimetry

    International Nuclear Information System (INIS)

    Molina Lopez, M. Y.; Pardo Perez, E.; Ruiz Maqueda, S.; Castro Novais, J.; Diaz Gavela, A. A.

    2013-01-01

    The objective of this study is to review the IMRT verification method used over the 6 years of operation of the radiophysics and radiological protection service, analyzing the evaluation parameters of each field for the 718 IMRT treatments performed during this period. (Author)

  5. Examining the examiners: an online eyebrow verification experiment inspired by FISWG

    NARCIS (Netherlands)

    Zeinstra, Christopher Gerard; Veldhuis, Raymond N.J.; Spreeuwers, Lieuwe Jan

    2015-01-01

    In forensic face comparison, one of the features taken into account are the eyebrows. In this paper, we investigate human performance on an eyebrow verification task. This task is executed twice by participants: a "best-effort" approach and an approach using features based on forensic knowledge. The

  6. Planning for an Integrated Research Experiment

    International Nuclear Information System (INIS)

    Barnard, J.J.; Ahle, L.E.; Bangerter, R.O.; Bieniosek, F.M.; Celata, C.M.; Faltens, A.; Friedman, A.; Grote, D.P.; Haber, I.; Henestroza, E.; Kishek, R.A.; Hoon, M.J.L. de; Karpenko, V.P.; Kwan, J.W.; Lee, E.P.; Logan, B.G.; Lund, S.M.; Meier, W.R.; Molvik, A.W.; Sangster, T.C.; Seidl, P.A.; Sharp, W.M.

    2000-01-01

    The authors describe the goals and research program leading to the Heavy Ion Integrated Research Experiment (IRE). They review the basic constraints which lead to a design and give examples of parameters and capabilities of an IRE. They also show design tradeoffs generated by the systems code IBEAM. A multi-pronged Phase I research effort is laying the groundwork for the Integrated Research Experiment. Experiment, technology development, theory, simulation, and systems studies are all playing major roles in this Phase I research. The key research areas are: (1) Source and injector (for investigation of a high brightness, multiple beam, low cost injector); (2) High current transport (to examine effects at full driver-scale line charge density, including the maximization of the beam filling-factor and control of electrons); (3) Enabling technology development (low cost and high performance magnetic core material, superconducting magnetic quadrupole arrays, insulators, and pulsers); (4) Beam simulations and theory (for investigations of beam matching, specification of accelerator errors, studies of emittance growth, halo, and bunch compression in the accelerator, and neutralization methods, stripping effects, and spot size minimization in the chamber); and (5) Systems optimization (minimization of cost and maximization of pulse energy and beam intensity). They have begun the process of designing, simulating, and optimizing the next major heavy-ion induction accelerator, the IRE. This accelerator facility will, in turn, help provide the basis to proceed to the next step in the development of IFE as an attractive source of fusion energy.

  7. Simulation of integrated beam experiment designs

    International Nuclear Information System (INIS)

    Grote, D.P.; Sharp, W.M.

    2004-01-01

    Simulations of designs for an Integrated Beam Experiment (IBX) class accelerator have been carried out. These simulations are an important tool for validating such designs. Issues such as envelope mismatch and emittance growth can be examined in a self-consistent manner, including the details of injection, accelerator transitions, long-term transport, and longitudinal compression. The simulations are three-dimensional and time-dependent, and begin at the source. They continue up through the end of the acceleration region, at which point the data is passed on to a separate simulation of the drift compression. Results are presented.

  8. An Integrated Approach to Conversion, Verification, Validation and Integrity of AFRL Generic Engine Model and Simulation (Postprint)

    Science.gov (United States)

    2007-02-01

    [Diagram and port-label residue from the original report omitted; recoverable headings: Motivation for Modeling and Simulation Work; The Augmented Generic Engine Model (AGEM); Model Verification and Validation (V&V); Assessment of AGEM V&V]

  9. Fusion Ignition Research Experiment System Integration

    International Nuclear Information System (INIS)

    Brown, T.

    1999-01-01

    The FIRE (Fusion Ignition Research Experiment) configuration has been designed to meet the physics objectives and subsystem requirements in an arrangement that allows remote maintenance of in-vessel components and hands-on maintenance of components outside the TF (toroidal-field) boundary. The general arrangement consists of sixteen wedge-shaped TF coils that surround a free-standing central solenoid (CS), a double-wall vacuum vessel and internal plasma-facing components. A center tie rod is used to help support the vertical magnetic loads and a compression ring is used to maintain wedge pressure in the inboard corners of the TF coils. The magnets are liquid nitrogen cooled and the entire device is surrounded by a thermal enclosure. The double-wall vacuum vessel integrates cooling and shielding in a shape that maximizes shielding of ex-vessel components. The FIRE configuration development and integration process has evolved from an early stage of concept selection to a higher level of machine definition and component details. This paper describes the status of the configuration development and the integration of the major subsystem components.

  10. Integrated verification and testing system (IVTS) for HAL/S programs

    Science.gov (United States)

    Senn, E. H.; Ames, K. R.; Smith, K. A.

    1983-01-01

    The IVTS is a large software system designed to support user-controlled verification analysis and testing activities for programs written in the HAL/S language. The system is composed of a user interface and user command language, analysis tools and an organized data base of host system files. The analysis tools are of four major types: (1) static analysis, (2) symbolic execution, (3) dynamic analysis (testing), and (4) documentation enhancement. The IVTS requires a split HAL/S compiler, divided at the natural separation point between the parser/lexical analyzer phase and the target machine code generator phase. The IVTS uses the internal program form (HALMAT) between these two phases as primary input for the analysis tools. The dynamic analysis component requires some way to 'execute' the object HAL/S program. The execution medium may be an interpretive simulation or an actual host or target machine.

  11. The calculation and experiment verification of geometry factors of disk sources and detectors

    International Nuclear Information System (INIS)

    Shi Zhixia; Minowa, Y.

    1993-01-01

    In alpha counting, the efficiency of a counting system is most frequently determined from the counter response to a calibrated source. Whenever this procedure is used, however, questions invariably arise as to the integrity of the standard source, or indeed the validity of the primary calibration. As a check, therefore, it is often helpful to be able to calculate the disintegration rate from counting rate data. The conclusions are: 1. If the source is thin enough, the error E is generally less than 5%, which is acceptable in routine measurement. When a standard source is not available for the experiment, the calculated geometry factor can be used instead of a measured efficiency. 2. The calculated geometry factor can be used to correct the counter system, study the effect of each parameter and identify those parameters needing careful control. 3. The method of overlapping areas of the source and the projection of the detector is reliable, simple and convenient for calculating geometry. A purely illustrative numerical sketch is given below. (5 tabs.)
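
    As a hedged illustration only (not the authors' overlapping-area method), the geometry factor for a coaxial disk source and disk detector can also be estimated by a simple Monte Carlo calculation; all dimensions below are hypothetical:

      # Monte Carlo estimate of the geometry factor: the fraction of isotropically emitted
      # particles from a disk source that strike a coaxial disk detector a distance h away.
      import math
      import random

      def geometry_factor(r_source, r_detector, h, n_samples=200_000):
          hits = 0
          for _ in range(n_samples):
              # Uniform emission point on the source disk.
              r = r_source * math.sqrt(random.random())
              phi = 2.0 * math.pi * random.random()
              x0, y0 = r * math.cos(phi), r * math.sin(phi)
              # Isotropic emission direction over the full sphere.
              cos_t = 2.0 * random.random() - 1.0
              sin_t = math.sqrt(1.0 - cos_t * cos_t)
              psi = 2.0 * math.pi * random.random()
              dx, dy, dz = sin_t * math.cos(psi), sin_t * math.sin(psi), cos_t
              if dz <= 0.0:
                  continue  # emitted away from the detector plane
              t = h / dz    # intersection with the detector plane z = h
              x, y = x0 + t * dx, y0 + t * dy
              if x * x + y * y <= r_detector * r_detector:
                  hits += 1
          return hits / n_samples

      # Hypothetical geometry: 1 cm source radius, 2.5 cm detector radius, 1 cm separation.
      print(f"geometry factor ~ {geometry_factor(1.0, 2.5, 1.0):.3f}")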

  12. DABIE: a data banking system of integral experiments for reactor core characteristics computer codes

    International Nuclear Information System (INIS)

    Matsumoto, Kiyoshi; Naito, Yoshitaka; Ohkubo, Shuji; Aoyanagi, Hideo.

    1987-05-01

    A data banking system of integral experiments for reactor core characteristics computer codes, DABIE, has been developed to lighten the burden of searching through many documents to obtain the experimental data required for verification of reactor core characteristics computer codes. This data banking system, DABIE, has capabilities for systematic classification, registration and easy retrieval of experiment data. DABIE consists of a data bank and supporting programs. The supporting programs are a data registration program, a data reference program and a maintenance program. The system is designed so that users can easily register information on experiment systems, including figures as well as geometry data and measured data, or obtain those data interactively through a TSS terminal. This manual describes the system structure, usage and sample uses of this code system. (author)
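
    The register-and-retrieve workflow described above can be pictured, in modern terms, with a minimal relational sketch; the table and field names below are hypothetical and do not reproduce DABIE's actual data structures:

      # Minimal sketch of registering and retrieving integral-experiment records
      # (hypothetical schema; purely illustrative, not DABIE's implementation).
      import sqlite3

      conn = sqlite3.connect(":memory:")
      conn.execute("""CREATE TABLE experiments
                      (name TEXT, facility TEXT, quantity TEXT, value REAL, unit TEXT)""")

      # Registration of a (hypothetical) measured datum.
      conn.execute("INSERT INTO experiments VALUES (?, ?, ?, ?, ?)",
                   ("critical-assembly-01", "example facility", "k-effective", 1.0003, "-"))

      # Retrieval by quantity, as a code-verification user might do.
      rows = conn.execute("SELECT name, value, unit FROM experiments WHERE quantity = ?",
                          ("k-effective",)).fetchall()
      print(rows)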

  13. Instrumentation: Nondestructive Examination for Verification of Canister and Cladding Integrity. FY2014 Status Update

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, Ryan M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Suter, Jonathan D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Jones, Anthony M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-09-12

    This report documents FY14 efforts for two instrumentation subtasks under storage and transportation. These instrumentation tasks relate to developing effective nondestructive evaluation (NDE) methods and techniques to (1) verify the integrity of metal canisters for the storage of used nuclear fuel (UNF) and to (2) verify the integrity of dry storage cask internals.

  14. Spent Nuclear Fuel (SNF) project Integrated Safety Management System phase I and II Verification Review Plan

    International Nuclear Information System (INIS)

    CARTER, R.P.

    1999-01-01

    The U.S. Department of Energy (DOE) commits to accomplishing its mission safely. To ensure this objective is met, DOE issued DOE P 450.4, Safety Management System Policy, and incorporated safety management into the DOE Acquisition Regulations ([DEAR] 48 CFR 970.5204-2 and 90.5204-78). Integrated Safety Management (ISM) requires contractors to integrate safety into management and work practices at all levels so that missions are achieved while protecting the public, the worker, and the environment. The contractor is required to describe the Integrated Safety Management System (ISMS) to be used to implement the safety performance objective

  15. Spent Nuclear Fuel (SNF) project Integrated Safety Management System phase I and II Verification Review Plan

    Energy Technology Data Exchange (ETDEWEB)

    CARTER, R.P.

    1999-11-19

    The U.S. Department of Energy (DOE) commits to accomplishing its mission safely. To ensure this objective is met, DOE issued DOE P 450.4, Safety Management System Policy, and incorporated safety management into the DOE Acquisition Regulations ([DEAR] 48 CFR 970.5204-2 and 90.5204-78). Integrated Safety Management (ISM) requires contractors to integrate safety into management and work practices at all levels so that missions are achieved while protecting the public, the worker, and the environment. The contractor is required to describe the Integrated Safety Management System (ISMS) to be used to implement the safety performance objective.

  16. Security Architecture and Protocol for Trust Verifications Regarding the Integrity of Files Stored in Cloud Services †

    Science.gov (United States)

    2018-01-01

    Cloud computing is considered an interesting paradigm due to its scalability, availability and virtually unlimited storage capacity. However, it is challenging to organize a cloud storage service (CSS) that is safe from the client point-of-view and to implement this CSS in public clouds since it is not advisable to blindly consider this configuration as fully trustworthy. Ideally, owners of large amounts of data should trust their data to be in the cloud for a long period of time, without the burden of keeping copies of the original data, nor of accessing the whole content for verifications regarding data preservation. Due to these requirements, integrity, availability, privacy and trust are still challenging issues for the adoption of cloud storage services, especially when losing or leaking information can bring significant damage, be it legal or business-related. With such concerns in mind, this paper proposes an architecture for periodically monitoring both the information stored in the cloud and the service provider behavior. The architecture operates with a proposed protocol based on trust and encryption concepts to ensure cloud data integrity without compromising confidentiality and without overloading storage services. Extensive tests and simulations of the proposed architecture and protocol validate their functional behavior and performance. PMID:29498641
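
    A simple keyed-digest check conveys the flavor of the periodic integrity monitoring described above, although the paper's actual protocol also involves trust and encryption mechanisms not reproduced here; the sketch below is a generic illustration with hypothetical data:

      # Hypothetical sketch of verifying the integrity of a stored object with a keyed digest
      # (HMAC-SHA256); a generic illustration, not the paper's proposed protocol.
      import hashlib
      import hmac
      import os

      key = os.urandom(32)                              # secret retained by the data owner
      original = b"contents of the file placed in cloud storage"
      stored_tag = hmac.new(key, original, hashlib.sha256).hexdigest()

      # Later, the owner (or an auditor holding the key) re-reads the object and re-checks it.
      retrieved = b"contents of the file placed in cloud storage"
      current_tag = hmac.new(key, retrieved, hashlib.sha256).hexdigest()

      ok = hmac.compare_digest(stored_tag, current_tag)
      print("integrity verified" if ok else "integrity check FAILED")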

  17. Clinical Experience and Evaluation of Patient Treatment Verification With a Transit Dosimeter

    Energy Technology Data Exchange (ETDEWEB)

    Ricketts, Kate, E-mail: k.ricketts@ucl.ac.uk [Division of Surgery and Interventional Sciences, University College London, London (United Kingdom); Department of Radiotherapy Physics, Royal Berkshire NHS Foundation Trust, Reading (United Kingdom); Navarro, Clara; Lane, Katherine; Blowfield, Claire; Cotten, Gary; Tomala, Dee; Lord, Christine; Jones, Joanne; Adeyemi, Abiodun [Department of Radiotherapy Physics, Royal Berkshire NHS Foundation Trust, Reading (United Kingdom)

    2016-08-01

    Purpose: To prospectively evaluate a protocol for transit dosimetry on a patient population undergoing intensity modulated radiation therapy (IMRT) and to assess the issues in clinical implementation of electronic portal imaging devices (EPIDs) for treatment verification. Methods and Materials: Fifty-eight patients were enrolled in the study. Amorphous silicon EPIDs were calibrated for dose and used to acquire images of delivered fields. Measured EPID dose maps were back-projected using the planning computed tomographic (CT) images to calculate dose at prespecified points within the patient and compared with treatment planning system dose offline using point dose difference and point γ analysis. The deviation of the results was used to inform future action levels. Results: Two hundred twenty-five transit images were analyzed, composed of breast, prostate, and head and neck IMRT fields. Patient measurements demonstrated the potential of the dose verification protocol to model dose well under complex conditions: 83.8% of all delivered beams achieved the initial set tolerance level of Δ_D of 0 ± 5 cGy or %Δ_D of 0% ± 5%. Importantly, the protocol was also sensitive to anatomic changes and spotted that 3 patients from 20 measured prostate patients had undergone anatomic change in comparison with the planning CT. Patient data suggested an EPID-reconstructed versus treatment planning system dose difference action level of 0% ± 7% for breast fields. Asymmetric action levels were more appropriate for inversed IMRT fields, using absolute dose difference (−2 ± 5 cGy) or summed field percentage dose difference (−6% ± 7%). Conclusions: The in vivo dose verification method was easy to use and simple to implement, and it could detect patient anatomic changes that impacted dose delivery. The system required no extra dose to the patient or treatment time delay and so could be used throughout the course of treatment to identify and limit

  18. Dosimetric pre-treatment verification of IMRT using an EPID; clinical experience

    International Nuclear Information System (INIS)

    Zijtveld, Mathilda van; Dirkx, Maarten L.P.; Boer, Hans C.J. de; Heijmen, Ben J.M.

    2006-01-01

    Background and purpose: In our clinic a QA program for IMRT verification, fully based on dosimetric measurements with electronic portal imaging devices (EPID), has been running for over 3 years. The program includes a pre-treatment dosimetric check of all IMRT fields. During a complete treatment simulation at the linac, a portal dose image (PDI) is acquired with the EPID for each patient field and compared with a predicted PDI. In this paper, the results of this pre-treatment procedure are analysed, and intercepted errors are reported. An automated image analysis procedure is proposed to limit the number of fields that need human intervention in PDI comparison. Materials and methods: Most of our analyses are performed using the γ index with 3% local dose difference and 3 mm distance to agreement as reference values. Scalar parameters are derived from the γ values to summarize the agreement between measured and predicted 2D PDIs. Areas with all pixels having γ values larger than one are evaluated, making decisions based on clinically relevant criteria more straightforward. Results: In 270 patients, the pre-treatment checks revealed four clinically relevant errors. Calculation of statistics for a group of 75 patients showed that the patient-averaged mean γ value inside the field was 0.43 ± 0.13 (1 SD) and only 6.1 ± 6.8% of pixels had a γ value larger than one. With the proposed automated image analysis scheme, visual inspection of images can be avoided in 2/3 of the cases. Conclusion: EPIDs may be used for high accuracy and high resolution routine verification of IMRT fields to intercept clinically relevant dosimetric errors prior to the start of treatment. For the majority of fields, PDI comparison can fully rely on an automated procedure, avoiding excessive workload
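
    The γ evaluation mentioned above combines a dose-difference criterion with a distance-to-agreement criterion. The sketch below is a simplified 1-D illustration using a global 3% normalization and hypothetical profiles; it is not the clinic's implementation:

      # Simplified 1-D gamma-index comparison (3% dose difference, 3 mm distance-to-agreement).
      # Hypothetical dose profiles; global-dose normalization for brevity.
      import numpy as np

      def gamma_1d(x_mm, measured, predicted, dose_crit=0.03, dta_mm=3.0):
          ref = dose_crit * predicted.max()
          gammas = np.empty_like(measured)
          for i, (xm, dm) in enumerate(zip(x_mm, measured)):
              dist2 = ((x_mm - xm) / dta_mm) ** 2
              dose2 = ((predicted - dm) / ref) ** 2
              gammas[i] = np.sqrt(dist2 + dose2).min()
          return gammas

      x = np.linspace(0.0, 100.0, 201)                # positions in mm
      predicted = np.exp(-((x - 50.0) / 20.0) ** 2)   # hypothetical predicted profile
      measured = predicted * 1.02                     # measurement 2% hotter everywhere

      g = gamma_1d(x, measured, predicted)
      print(f"fraction of points with gamma <= 1: {np.mean(g <= 1.0):.1%}; mean gamma: {g.mean():.2f}")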

  19. Integrating conceptualizations of experience into the interaction design process

    DEFF Research Database (Denmark)

    Dalsgaard, Peter

    2010-01-01

    From a design perspective, the increasing awareness of experiential aspects of interactive systems prompts the question of how conceptualizations of experience can inform and potentially be integrated into the interaction design process. This paper presents one approach to integrating theoretical...

  20. Integrating deductive verification and symbolic execution for abstract object creation in dynamic logic

    NARCIS (Netherlands)

    C.P.T. de Gouw (Stijn); F.S. de Boer (Frank); W. Ahrendt (Wolfgang); R. Bubel (Richard)

    2016-01-01

    We present a fully abstract weakest precondition calculus and its integration with symbolic execution. Our assertion language allows both specifying and verifying properties of objects at the abstraction level of the programming language, abstracting from a specific implementation of

  1. Instrumentation. Nondestructive Examination for Verification of Canister and Cladding Integrity - FY2013 Status Update

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, Ryan M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Jones, Anthony M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Pardini, Allan F. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Denslow, Kayte M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Crawford, Susan L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Larche, Michael R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2013-09-30

    This report documents FY13 efforts for two instrumentation subtasks under storage and transportation. These instrumentation tasks relate to developing effective nondestructive evaluation (NDE) methods and techniques to (1) verify the integrity of metal canisters for the storage of used nuclear fuel (UNF) and to (2) characterize hydrogen effects in UNF cladding to facilitate safe storage and retrieval.

  2. Integrated Safety Management System Phase I Verification for the Plutonium Finishing Plant (PFP) [VOL 1 & 2]

    Energy Technology Data Exchange (ETDEWEB)

    SETH, S.S.

    2000-01-10

    U.S. Department of Energy (DOE) Policy 450.4, Safety Management System Policy commits to institutionalizing an Integrated Safety Management System (ISMS) throughout the DOE complex as a means of accomplishing its missions safely. DOE Acquisition Regulation 970.5204-2 requires that contractors manage and perform work in accordance with a documented safety management system.

  3. The Vehicle Integrated Performance Analysis Experience: Reconnecting With Technical Integration

    Science.gov (United States)

    McGhee, D. S.

    2006-01-01

    Very early in the Space Launch Initiative program, a small team of engineers at MSFC proposed a process for performing system-level assessments of a launch vehicle. Aimed primarily at providing insight and making NASA a smart buyer, the Vehicle Integrated Performance Analysis (VIPA) team was created. The difference between the VIPA effort and previous integration attempts is that VIPA is a process using experienced people from various disciplines, which focuses them on a technically integrated assessment. The foundations of VIPA's process are described. The VIPA team also recognized the need to target early detailed analysis toward identifying significant systems issues. This process is driven by the T-model for technical integration. VIPA's approach to performing system-level technical integration is discussed in detail. The VIPA process significantly enhances the development and monitoring of realizable project requirements. VIPA's assessment validates the concept's stated performance, identifies significant issues either with the concept or the requirements, and then reintegrates these issues to determine impacts. This process is discussed along with a description of how it may be integrated into a program's insight and review process. The VIPA process has gained favor with both engineering and project organizations for being responsive and insightful.

  4. Tracer experiment data sets for the verification of local and meso-scale atmospheric dispersion models including topographic effects

    International Nuclear Information System (INIS)

    Sartori, E.; Schuler, W.

    1992-01-01

    Software and data for nuclear energy applications are acquired, tested and distributed by several information centres; in particular, relevant computer codes are distributed internationally by the OECD/NEA Data Bank (France) and by ESTSC and EPIC/RSIC (United States). This activity is coordinated among the centres and is extended outside the OECD area through an arrangement with the IAEA. This article proposes, more specifically, a scheme for acquiring, storing and distributing atmospheric tracer experiment (ATE) data required for the verification of atmospheric dispersion models, especially the most advanced ones, which include topographic effects and are specific to the local and meso-scale. These well documented data sets will form a valuable complement to the set of atmospheric dispersion computer codes distributed internationally. Modellers will be able to gain confidence in the predictive power of their models or to verify their modelling skills. (au)

  5. The Verification of ESF-CCS Integration Test procedure by utilizing LabVIEW

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jayoung; Lee, Sangseok; Sohn, Kwangyoung [Korea Reliability Technology and System, Daejeon (Korea, Republic of); Lee, Junku; Park, Geunok [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2013-05-15

    Since the Fukushima event, it has been considered especially important to guarantee plant safety by mitigating severe accidents. The ESF-CCS (Engineered Safety Feature-Component Control System) monitors the plant variables and generates the ESF-CCS actuation signals when the plant variables violate the setpoints. In the classic design, the ESF-CCS is composed of sub-components such as Minimum Inventory (MI), ESCM (ESF-CCS Soft Control Module), CPM, ITP, Group Controller (GC), Loop Controller (LC), CCG (Control Channel Gate), MTP, and CIM (Component Interface Module). With the help of LabVIEW simulation in integration test procedure preparation, the following benefits are attained: control logic and design evaluation by LabVIEW; elimination of time-consuming test case design, with the 'expected result' determined through design validation; and an upgrade in the reliability of the integration test quality.

  6. Verification experiment on the downblending of high enriched uranium (HEU) at the Portsmouth Gaseous Diffusion Plant. Digital video surveillance of the HEU feed stations

    International Nuclear Information System (INIS)

    Martinez, R.L.; Tolk, K.; Whiting, N.; Castleberry, K.; Lenarduzzi, R.

    1998-01-01

    As part of a Safeguards Agreement between the US and the International Atomic Energy Agency (IAEA), the Portsmouth Gaseous Diffusion Plant, Piketon, Ohio, was added to the list of facilities eligible for the application of IAEA safeguards. Currently, the facility is in the process of downblending excess inventory of HEU to low enriched uranium (LEU) from US defense related programs for commercial use. An agreement was reached between the US and the IAEA that would allow the IAEA to conduct an independent verification experiment at the Portsmouth facility, resulting in the confirmation that the HEU was in fact downblended. The experiment provided an opportunity for the DOE laboratories to recommend solutions/measures for new IAEA safeguards applications. One of the measures recommended by Sandia National Laboratories (SNL), and selected by the IAEA, was a digital video surveillance system for monitoring activity at the HEU feed stations. This paper describes the SNL implementation of the digital video system and its integration with the Load Cell Based Weighing System (LCBWS) from Oak Ridge National Laboratory (ORNL). The implementation was based on commercially available technology that also satisfied IAEA criteria for tamper protection and data authentication. The core of the Portsmouth digital video surveillance system was based on two Digital Camera Modules (DMC-14) from Neumann Consultants, Germany

  7. Mathematical Verification for Transmission Performance of Centralized Lightwave WDM-RoF-PON with Quintuple Services Integrated in Each Wavelength Channel

    Directory of Open Access Journals (Sweden)

    Shuai Chen

    2015-01-01

    Full Text Available Wavelength-division-multiplexing passive-optical-network (WDM-PON) has been recognized as a promising solution for “last mile” access as well as multi-broadband data services access for end users, and WDM-RoF-PON, which employs the radio-over-fiber (RoF) technique in WDM-PON, is an even more attractive approach for future broadband fiber and wireless access because of its strong support for centralized multiservice transmission and its transparency to bandwidth and signal modulation formats. As for multiservice development in WDM-RoF-PON, various system designs have been reported and verified via simulation or experiment to date, and the scheme with multiple services transmitted in each single wavelength channel is believed to be the one with the highest bandwidth efficiency; however, the corresponding mathematical verification is still hard to find in the state-of-the-art literature. In this paper, the system design and data transmission performance of a quintuple-service integrated WDM-RoF-PON, which jointly employs carrier multiplexing and orthogonal modulation techniques, have been theoretically analyzed and verified in detail; moreover, the system design has been duplicated and verified experimentally, and a theoretical treatment of such a WDM-RoF-PON scheme has thus been established.

  8. Mergers and integrated care: the Quebec experience.

    Science.gov (United States)

    Demers, Louis

    2013-01-01

    As a researcher, I have studied the efforts to increase the integration of health and social services in Quebec, as well as the mergers in the Quebec healthcare system. These mergers have often been presented as a necessary transition to break down the silos that compartmentalize the services dispensed by various organisations. A review of the studies about mergers and integrated care projects in the Quebec healthcare system, since its inception, show that mergers cannot facilitate integrated care unless they are desired and represent for all of the actors involved an appropriate way to deal with service organisation problems. Otherwise, mergers impede integrated care by creating increased bureaucratisation and standardisation and by triggering conflicts and mistrust among the staff of the merged organisations. It is then preferable to let local actors select the most appropriate organisational integration model for their specific context and offer them resources and incentives to cooperate.

  9. Mergers and integrated care: the Quebec experience

    Directory of Open Access Journals (Sweden)

    Louis Demers

    2013-02-01

    Full Text Available As a researcher, I have studied the efforts to increase the integration of health and social services in Quebec, as well as the mergers in the Quebec healthcare system. These mergers have often been presented as a necessary transition to break down the silos that compartmentalize the services dispensed by various organisations. A review of the studies about mergers and integrated care projects in the Quebec healthcare system, since its inception, show that mergers cannot facilitate integrated care unless they are desired and represent for all of the actors involved an appropriate way to deal with service organisation problems. Otherwise, mergers impede integrated care by creating increased bureaucratisation and standardisation and by triggering conflicts and mistrust among the staff of the merged organisations. It is then preferable to let local actors select the most appropriate organisational integration model for their specific context and offer them resources and incentives to cooperate.

  10. Fluor Daniel Hanford Inc. integrated safety management system phase 1 verification final report

    International Nuclear Information System (INIS)

    PARSONS, J.E.

    1999-01-01

    The purpose of this review is to verify the adequacy of documentation as submitted to the Approval Authority by Fluor Daniel Hanford, Inc. (FDH). This review is not only a review of the Integrated Safety Management System (ISMS) System Description documentation, but is also a review of the procedures, policies, and manuals of practice used to implement safety management in an environment of organizational restructuring. The FDH ISMS should support the Hanford Strategic Plan (DOE-RL 1996) to safely clean up and manage the site's legacy waste; deploy science and technology while incorporating the ISMS theme to ''Do work safely''; and protect human health and the environment

  11. Night vision imaging systems design, integration, and verification in military fighter aircraft

    Science.gov (United States)

    Sabatini, Roberto; Richardson, Mark A.; Cantiello, Maurizio; Toscano, Mario; Fiorini, Pietro; Jia, Huamin; Zammit-Mangion, David

    2012-04-01

    This paper describes the developmental and testing activities conducted by the Italian Air Force Official Test Centre (RSV) in collaboration with Alenia Aerospace, Litton Precision Products and Cranfield University, in order to confer the Night Vision Imaging Systems (NVIS) capability to the Italian TORNADO IDS (Interdiction and Strike) and ECR (Electronic Combat and Reconnaissance) aircraft. The activities consisted of various Design, Development, Test and Evaluation (DDT&E) activities, including Night Vision Goggles (NVG) integration, cockpit instruments and external lighting modifications, as well as various ground test sessions and a total of eighteen flight test sorties. RSV and Litton Precision Products were responsible for coordinating and conducting the installation activities of the internal and external lights. In particular, an iterative process was established, allowing rapid on-site correction of the major deficiencies encountered during the ground and flight test sessions. Both single-ship (day/night) and formation (night) flights were performed, shared between the Test Crews involved in the activities, allowing for a redundant examination of the various test items by all participants. An innovative test matrix was developed and implemented by RSV for assessing the operational suitability and effectiveness of the various modifications implemented. Also important was the definition of test criteria for Pilot and Weapon Systems Officer (WSO) workload assessment during the accomplishment of various operational tasks during NVG missions. Furthermore, the specific technical and operational elements required for evaluating the modified helmets were identified, allowing an exhaustive comparative evaluation of the two proposed solutions (i.e., HGU-55P and HGU-55G modified helmets). The results of the activities were very satisfactory. The initial compatibility problems encountered were progressively mitigated by incorporating modifications both in the front and

  12. Test and verification of a reactor protection system application-specific integrated circuit

    International Nuclear Information System (INIS)

    Battle, R.E.; Turner, G.W.; Vandermolen, R.I.; Vitalbo, C.

    1997-01-01

    Application-specific integrated circuits (ASICs) were utilized in the design of nuclear plant safety systems because they have certain advantages over software-based systems and analog-based systems. An advantage they have over software-based systems is that an ASIC design can be simple enough to not include branch statements and also can be thoroughly tested. A circuit card on which an ASIC is mounted can be configured to replace various versions of older analog equipment with fewer design types required. The approach to design and testing of ASICs for safety system applications is discussed in this paper. Included are discussions of the ASIC architecture, how it is structured to assist testing, and of the functional and enhanced circuit testing

  13. Fluor Daniel Hanford Inc. integrated safety management system phase 1 verification final report

    Energy Technology Data Exchange (ETDEWEB)

    PARSONS, J.E.

    1999-10-28

    The purpose of this review is to verify the adequacy of documentation as submitted to the Approval Authority by Fluor Daniel Hanford, Inc. (FDH). This review is not only a review of the Integrated Safety Management System (ISMS) System Description documentation, but is also a review of the procedures, policies, and manuals of practice used to implement safety management in an environment of organizational restructuring. The FDH ISMS should support the Hanford Strategic Plan (DOE-RL 1996) to safely clean up and manage the site's legacy waste; deploy science and technology while incorporating the ISMS theme to ''Do work safely''; and protect human health and the environment.

  14. River Protection Project Integrated safety management system phase II verification review plan - 7/29/99

    International Nuclear Information System (INIS)

    SHOOP, D.S.

    1999-01-01

    The purpose of this review is to verify the implementation status of the Integrated Safety Management System (ISMS) for the River Protection Project (RPP) facilities managed by Fluor Daniel Hanford, Inc. (FDH) and operated by Lockheed Martin Hanford Company (LMHC). This review will also ascertain whether within RPP facilities and operations the work planning and execution processes are in place and functioning to effectively protect the health and safety of the workers, public, environment, and federal property over the RPP life cycle. The RPP ISMS should support the Hanford Strategic Plan (DOERL-96-92) to safely clean up and manage the site's legacy waste and deploy science and technology while incorporating the ISMS central theme to ''Do work safely'' and protect human health and the environment

  15. Night vision imaging system design, integration and verification in spacecraft vacuum thermal test

    Science.gov (United States)

    Shang, Yonghong; Wang, Jing; Gong, Zhe; Li, Xiyuan; Pei, Yifei; Bai, Tingzhu; Zhen, Haijing

    2015-08-01

    The purposes of spacecraft vacuum thermal testing are to characterize the thermal control systems of the spacecraft and its components in the cruise configuration and to allow for early retirement of risks associated with mission-specific and novel thermal designs. The orbital heat flux is simulated by infrared lamps, an infrared cage or electric heaters. Because the infrared cage and electric heaters emit no visible light, and infrared lamps emit only limited visible light, an ordinary camera cannot operate at the low luminous density present during the test. Moreover, some special instruments such as satellite-borne infrared sensors are sensitive to visible light, so the lighting cannot be supplemented during the test. To improve the ability to closely monitor the spacecraft and to document test progress under conditions of ultra-low luminous density, a night vision imaging system was designed and integrated by BISEE. The system consists of a high-gain image-intensifier ICCD camera, an assistant luminance system, a glare protection system, a thermal control system and a computer control system. Multi-frame accumulation target detection technology is adopted for high-quality image recognition in the captive test; a purely illustrative numerical sketch of this idea is given below. The optical, mechanical and electrical systems were designed and integrated to be highly adaptable to the vacuum environment. A molybdenum/polyimide thin-film electrical heater controls the temperature of the ICCD camera. The results of the performance validation test showed that the system could operate in a vacuum thermal environment of 1.33×10⁻³ Pa vacuum degree and 100 K shroud temperature in the space environment simulator, and its working temperature was maintained at 5 °C during the two-day test. The night vision imaging system could obtain video quality with 60 lp/mm resolving power.
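
    Multi-frame accumulation improves the signal-to-noise ratio of low-light imagery roughly as the square root of the number of averaged frames. The sketch below uses synthetic frames and hypothetical numbers purely for illustration; it is not BISEE's implementation:

      # Sketch of multi-frame accumulation: averaging N noisy low-light frames raises the
      # signal-to-noise ratio roughly by sqrt(N). Synthetic data only.
      import numpy as np

      rng = np.random.default_rng(0)
      scene = np.zeros((64, 64))
      scene[30:34, 30:34] = 5.0                       # faint target on a dark background

      def noisy_frame():
          return scene + rng.normal(scale=10.0, size=scene.shape)

      single = noisy_frame()
      accumulated = np.mean([noisy_frame() for _ in range(64)], axis=0)

      def snr(img):
          return img[30:34, 30:34].mean() / img[:16, :16].std()

      print(f"single-frame SNR ~ {snr(single):.2f}; 64-frame SNR ~ {snr(accumulated):.2f}")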

  16. Initial Clinical Experience Performing Patient Treatment Verification With an Electronic Portal Imaging Device Transit Dosimeter

    Energy Technology Data Exchange (ETDEWEB)

    Berry, Sean L., E-mail: BerryS@MSKCC.org [Department of Applied Physics and Applied Mathematics, Columbia University, New York, New York (United States); Department of Medical Physics, Memorial Sloan-Kettering Cancer Center, New York, New York (United States); Polvorosa, Cynthia; Cheng, Simon; Deutsch, Israel; Chao, K. S. Clifford; Wuu, Cheng-Shie [Department of Radiation Oncology, Columbia University, New York, New York (United States)

    2014-01-01

    Purpose: To prospectively evaluate a 2-dimensional transit dosimetry algorithm's performance on a patient population and to analyze the issues that would arise in a widespread clinical adoption of transit electronic portal imaging device (EPID) dosimetry. Methods and Materials: Eleven patients were enrolled on the protocol; 9 completed and were analyzed. Pretreatment intensity modulated radiation therapy (IMRT) patient-specific quality assurance was performed using a stringent local 3%, 3-mm γ criterion to verify that the planned fluence had been appropriately transferred to and delivered by the linear accelerator. Transit dosimetric EPID images were then acquired during treatment and compared offline with predicted transit images using a global 5%, 3-mm γ criterion. Results: There were 288 transit images analyzed. The overall γ pass rate was 89.1% ± 9.8% (average ± 1 SD). For the subset of images for which the linear accelerator couch did not interfere with the measurement, the γ pass rate was 95.7% ± 2.4%. A case study is presented in which the transit dosimetry algorithm was able to identify that a lung patient's bilateral pleural effusion had resolved in the time between the planning CT scan and the treatment. Conclusions: The EPID transit dosimetry algorithm under consideration, previously described and verified in a phantom study, is feasible for use in treatment delivery verification for real patients. Two-dimensional EPID transit dosimetry can play an important role in indicating when a treatment delivery is inconsistent with the original plan.

  17. Remote communications technology redefines integrity verification and monitoring of low pressure isolation

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    2010-01-15

    In 2007, a ship collided with the southeast face of a satellite platform jacket in the North Sea, damaging the 12-inch export riser. Emergency shutdown valves immediately shut-in production from the platform, leaving the pressure in the pipeline at approximately 4 barg. The riser had to be repaired before production could resume. TDW Offshore Services (TDW) was hired to develop a low pressure solution to isolate the damaged section of the pipeline riser from the export pipeline gas inventory. TDW used its range of specialist pipeline pigging, pig tracking and remote communications technology to solve the problem. The solution consisted of a custom-designed TDW pig trap and pigging spread; a high friction pig train furnished with the SmartTrack remote tracking and pressure-monitoring system; a SmartTrack subsea remote tracking and pressure-monitoring system; a SmartTrack topside tracking and monitoring system with radio link to the dive support vessel; and a pipeline isolation ball valve. TDW was able to monitor the downstream pressure of each isolation pig continuously throughout the operation using its innovative technology that sends isolation integrity data by radio link to a dive support vessel through pipe wall communications. The use of remote tracking and pressure monitoring technology enabled TDW to make repairs to the damaged riser while maintaining a continuous flow throughout the duration of the operation. 4 figs.

  18. Integration of multidisciplinary technologies for real time target visualization and verification for radiotherapy.

    Science.gov (United States)

    Chang, Wen-Chung; Chen, Chin-Sheng; Tai, Hung-Chi; Liu, Chia-Yuan; Chen, Yu-Jen

    2014-01-01

    The current practice of radiotherapy examines target coverage solely from digitally reconstructed beam's eye view (BEV) in a way that is indirectly accessible and that is not in real time. We aimed to visualize treatment targets in real time from each BEV. The image data of phantom or patients from ultrasound (US) and computed tomography (CT) scans were captured to perform image registration. We integrated US, CT, US/CT image registration, robotic manipulation of US, a radiation treatment planning system, and a linear accelerator to constitute an innovative target visualization system. The performance of this algorithm segmented the target organ in CT images, transformed and reconstructed US images to match each orientation, and generated image registration in real time mode with acceptable accuracy. This image transformation allowed physicians to visualize the CT image-reconstructed target via a US probe outside the BEV that was non-coplanar to the beam's plane. It allowed the physicians to remotely control the US probe that was equipped on a robotic arm to dynamically trace and real time monitor the coverage of the target within the BEV during a simulated beam-on situation. This target visualization system may provide a direct remotely accessible and real time way to visualize, verify, and ensure tumor targeting during radiotherapy.

  19. Preliminary Study for Development of Welds Integrity Verification Equipment for the Small Bore Piping

    International Nuclear Information System (INIS)

    Choi, Geun Suk; Lee, Jong Eun; Ryu, Jung Hoon; Cho, Kyoung Youn; Sohn, Myoung Sung; Lee, Sanghoon; Sung, Gi Ho; Cho, Hong Seok

    2016-01-01

    Leakage accidents involving small-bore piping have been reported in Korea, and such accidents are expected to increase as nuclear power plants age. If a pipe leak can be repaired using a clamping device when an accident occurs, there is an economic benefit. A clamping device is a fastening device used to hold or secure objects tightly together to prevent movement or separation through the application of inward pressure. However, such accidents cannot currently be responded to immediately, because the maintenance and repair technology is not institutionalized in KEPIC, and this results in economic loss. Technology for responding to these events is necessary for the safe operation of nuclear power plants. The purpose of this research is to develop an online repair technology for socket-welded pipe and a vibration monitoring system for small-bore pipe in nuclear power plants. Specifically, the detailed studies are as follows: • Development of a weld overlay method for safety-class socket welded connections • Development of mechanical clamping devices for Safety Class 2 and 3 small-bore pipe. The aim is to develop an online repair technology for socket-welded pipe and a vibration monitoring system for small-bore pipe in degraded plant systems, and to institutionalize the technology. Fatigue crack testing of the socket welded overlay will be performed and a fatigue life evaluation method will be developed in the second year. Prototype fabrication of the mechanical clamping device will also be completed. Based on the final goal, the intent is to propose practical evaluation tools and design and fabrication methods for socket welded connection integrity, and the result of this study is the development of KEPIC code case approved technology for an on-line repair system for socket welded connections and the fabrication of a mechanical clamping device.

  20. Integration of multidisciplinary technologies for real time target visualization and verification for radiotherapy

    Directory of Open Access Journals (Sweden)

    Chang WC

    2014-06-01

    Full Text Available Wen-Chung Chang,1,* Chin-Sheng Chen,2,* Hung-Chi Tai,3 Chia-Yuan Liu,4,5 Yu-Jen Chen3 1Department of Electrical Engineering, National Taipei University of Technology, Taipei, Taiwan; 2Graduate Institute of Automation Technology, National Taipei University of Technology, Taipei, Taiwan; 3Department of Radiation Oncology, Mackay Memorial Hospital, Taipei, Taiwan; 4Department of Internal Medicine, Mackay Memorial Hospital, Taipei, Taiwan; 5Department of Medicine, Mackay Medical College, New Taipei City, Taiwan  *These authors contributed equally to this work Abstract: The current practice of radiotherapy examines target coverage solely from digitally reconstructed beam's eye view (BEV) in a way that is indirectly accessible and that is not in real time. We aimed to visualize treatment targets in real time from each BEV. The image data of phantom or patients from ultrasound (US) and computed tomography (CT) scans were captured to perform image registration. We integrated US, CT, US/CT image registration, robotic manipulation of US, a radiation treatment planning system, and a linear accelerator to constitute an innovative target visualization system. The performance of this algorithm segmented the target organ in CT images, transformed and reconstructed US images to match each orientation, and generated image registration in real time mode with acceptable accuracy. This image transformation allowed physicians to visualize the CT image-reconstructed target via a US probe outside the BEV that was non-coplanar to the beam's plane. It allowed the physicians to remotely control the US probe that was equipped on a robotic arm to dynamically trace and real time monitor the coverage of the target within the BEV during a simulated beam-on situation. This target visualization system may provide a direct remotely accessible and real time way to visualize, verify, and ensure tumor targeting during radiotherapy. Keywords: ultrasound, computerized tomography

  1. A tomographic method for verification of the integrity of spent nuclear fuel

    International Nuclear Information System (INIS)

    Jacobsson, Staffan; Haakansson, Ane; Andersson, Camilla; Jansson, Peter; Baecklin, Anders

    1998-03-01

    A tomographic method for experimental investigation of the integrity of used LWR fuel has been developed. It is based on measurements of the gamma radiation from the fission products in the fuel rods. A reconstruction code of the algebraic type has been written. The potential of the technique has been examined in extensive simulations assuming a gamma-ray energy of either 0.66 MeV (¹³⁷Cs) or 1.27 MeV (¹⁵⁴Eu). The results of the simulations for BWR fuel indicate that single fuel rods or groups of rods replaced with water or fresh fuel can be reliably detected independent of their position in the fuel assembly using ¹³⁷Cs radiation. For PWR fuel the same result is obtained with the exception of the most central positions. Here the more penetrable radiation from ¹⁵⁴Eu must be used in order to allow a water channel to be distinguished from a fuel rod. The results of the simulations have been verified experimentally for an 8×8 BWR fuel assembly. Special equipment has been constructed and installed at the interim storage CLAB. The equipment allows the mapping of the radiation field around a fuel assembly with the aid of a germanium detector fitted with a collimator with a vertical slit. The intensities measured in 2520 detector positions were used as input for the reconstruction code used in the simulations. The results agreed very well with the simulations and revealed significantly a position containing a water channel in the central part of the assembly
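
    The "reconstruction code of the algebraic type" mentioned above refers to iterative schemes such as ART (Kaczmarz's method), which update the emission map ray by ray. The toy sketch below, with an invented 2×2 "assembly" and hypothetical ray sums, illustrates the idea only and is not the authors' code:

      # Toy algebraic reconstruction (ART / Kaczmarz sweeps): solve A x = b for an emission
      # map x from line-integral measurements b. Hypothetical 2x2 grid with row, column and
      # diagonal ray sums; a zero pixel stands in for a missing rod / water channel.
      import numpy as np

      def art(A, b, n_sweeps=200, relax=1.0):
          x = np.zeros(A.shape[1])
          for _ in range(n_sweeps):
              for a_i, b_i in zip(A, b):
                  x += relax * (b_i - a_i @ x) / (a_i @ a_i) * a_i
          return x

      A = np.array([[1., 1., 0., 0.],   # row sums
                    [0., 0., 1., 1.],
                    [1., 0., 1., 0.],   # column sums
                    [0., 1., 0., 1.],
                    [1., 0., 0., 1.],   # diagonal sums
                    [0., 1., 1., 0.]])
      true_map = np.array([1.0, 1.0, 0.0, 1.0])   # one empty position (zero activity)
      b = A @ true_map

      print(np.round(art(A, b), 3))               # recovers the map, including the empty position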

  2. Experimental Verification of Integrity of Low-Pressure Injection Piles Structure - Pile Internal Capacity

    Science.gov (United States)

    Pachla, Henryk

    2017-12-01

    The idea of strengthening a foundation using injection piles lies in transferring loads from the foundation to piles anchored in the existing structure and formed in the soil. Such a system has to be able to transfer loads from the foundation to the pile and from the pile onto the soil. The pile structure, often reinforced with a steel element, also has to be able to transfer such loading. According to the rules of continuum mechanics, the bearing capacity of such a system and the deformation of its individual elements can be determined by analysing the contact problem at three interfaces. Each of these surfaces is determined by a different couple of materials. These surfaces are: the pile-foundation anchorage, the bond between the reinforcement and the material from which the pile is formed, and the pile-soil interface. What is essential is that on the contact surfaces the deformation of the adhering materials can vary and depends on the mechanical properties and geometry of these surfaces. Engineering practice and experimental research indicate that failure in such structures occurs at the interfaces. The paper concentrates on presenting experiments on the interaction between cement grout and various types of steel reinforcement. The tests were conducted on the special low-pressure injection piles widely used, owing to their formation technology and injection pressure, to strengthen the foundations of existing historical buildings.

  3. Experimental Verification of Integrity of Low-Pressure Injection Piles Structure – Pile Internal Capacity

    Directory of Open Access Journals (Sweden)

    Pachla Henryk

    2017-12-01

    Full Text Available The idea of strengthening a foundation using injection piles lies in transferring loads from the foundation to piles anchored in the existing structure and formed in the soil. Such a system has to be able to transfer loads from the foundation to the pile and from the pile onto the soil. The pile structure, often reinforced with a steel element, also has to be able to transfer such loading. According to the rules of continuum mechanics, the bearing capacity of such a system and the deformation of its individual elements can be determined by analysing the contact problem at three interfaces. Each of these surfaces is determined by a different couple of materials. These surfaces are: the pile-foundation anchorage, the bond between the reinforcement and the material from which the pile is formed, and the pile-soil interface. What is essential is that on the contact surfaces the deformation of the adhering materials can vary and depends on the mechanical properties and geometry of these surfaces. Engineering practice and experimental research indicate that failure in such structures occurs at the interfaces. The paper concentrates on presenting experiments on the interaction between cement grout and various types of steel reinforcement. The tests were conducted on the special low-pressure injection piles widely used, owing to their formation technology and injection pressure, to strengthen the foundations of existing historical buildings.

  4. Laboratory Testing and Performance Verification of the CHARIS Integral Field Spectrograph

    Science.gov (United States)

    Groff, Tyler D.; Chilcote, Jeffrey; Kasdin, N. Jeremy; Galvin, Michael; Loomis, Craig; Carr, Michael A.; Brandt, Timothy; Knapp, Gillian; Limbach, Mary Anne; Guyon, Olivier

    2016-01-01

    The Coronagraphic High Angular Resolution Imaging Spectrograph (CHARIS) is an integral field spectrograph (IFS) that has been built for the Subaru telescope. CHARIS has two imaging modes; the high-resolution mode is R82, R69, and R82 in J, H, and K bands respectively, while the low-resolution discovery mode uses a second low-resolution prism with R19 spanning 1.15-2.37 microns (J+H+K bands). The discovery mode is meant to augment the low inner working angle of the Subaru Coronagraphic Extreme Adaptive Optics (SCExAO) adaptive optics system, which feeds CHARIS a coronagraphic image. The goal is to detect and characterize brown dwarfs and hot Jovian planets down to contrasts five orders of magnitude dimmer than their parent star at an inner working angle as low as 80 milliarcseconds. CHARIS constrains spectral crosstalk through several key aspects of the optical design. Additionally, the repeatability of alignment of certain optical components is critical to the calibrations required for the data pipeline. Specifically, the relative alignment of the lenslet array, prism, and detector must be highly stable and repeatable between imaging modes. We report on the measured repeatability and stability of these mechanisms, measurements of spectral crosstalk in the instrument, and the propagation of these errors through the data pipeline. Another key design feature of CHARIS is the prism, which pairs Barium Fluoride with Ohara L-BBH2 high index glass. The dispersion of the prism is significantly more uniform than other glass choices, and the CHARIS prisms represent the first NIR astronomical instrument that uses L-BBH2 as the high index material. This material choice was key to the utility of the discovery mode, so significant efforts were put into cryogenic characterization of the material. The final performance of the prism assemblies in their operating environment is described in detail. The spectrograph is going through final alignment, cryogenic cycling, and is being

  5. VHTRC experiment for verification test of H{infinity} reactivity estimation method

    Energy Technology Data Exchange (ETDEWEB)

    Fujii, Yoshio; Suzuki, Katsuo; Akino, Fujiyoshi; Yamane, Tsuyoshi; Fujisaki, Shingo; Takeuchi, Motoyoshi; Ono, Toshihiko [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1996-02-01

    This experiment was performed at VHTRC to acquire data for verifying the H{infinity} reactivity estimation method. In this report, the experimental method, the measuring circuits, and the data processing software are described in detail. (author).

  6. Strategy for assessment of WWER steam generator tube integrity. Report prepared within the framework of the coordinated research project on verification of WWER steam generator tube integrity

    International Nuclear Information System (INIS)

    2007-12-01

    Steam generator heat exchanger tube degradation occurs in WWER nuclear power plants (NPPs). The situation varies from country to country and from NPP to NPP. More severe degradation is observed in WWER-1000 NPPs than in WWER-440s. The reasons for these differences could be, among others, differences in heat exchanger tube material (chemical composition, microstructure, residual stresses), in thermal and mechanical loadings, as well as differences in water chemistry. However, WWER steam generators were not designed for eddy current testing, which is the usual testing method in steam generators of western PWRs. Moreover, their supplier provided neither adequate methodology and criteria nor equipment for planning and implementing In-Service Inspection (ISI). Consequently, the WWER steam generator ISI infrastructure was established with delay. Even today, there are still big differences in eddy current inspection strategy and practice as well as in the approach to steam generator heat exchanger tube structural integrity assessment (plugging criteria for defective tubes vary from 40 to 90% wall thickness degradation). Recognizing this situation, the WWER operating countries expressed their need for a joint effort to develop a methodology to establish reasonable, commonly accepted integrity assessment criteria for the heat exchanger tubes. The IAEA's programme related to steam generator life management is embedded into the systematic activity of its Technical Working Group on Life Management of Nuclear Power Plants (TWG-LMNPP). Under the advice of the TWG-LMNPP, an IAEA coordinated research project (CRP) on Verification of WWER Steam Generator Tube Integrity was launched in 2001. It was completed in 2005. Thirteen organizations involved in in-service inspection of steam generators in WWER operating countries participated: Croatia, Czech Republic, Finland, France, Hungary, Russian Federation, Slovakia, Spain, Ukraine, and the USA. The overall objective was to

  7. Advanced verification topics

    CERN Document Server

    Bhattacharya, Bishnupriya; Hall, Gary; Heaton, Nick; Kashai, Yaron; Khan Neyaz; Kirshenbaum, Zeev; Shneydor, Efrat

    2011-01-01

    The Accellera Universal Verification Methodology (UVM) standard is architected to scale, but verification is growing in more than just the digital design dimension. It is growing in the SoC dimension to include low-power and mixed-signal, and in the system integration dimension to include multi-language support and acceleration. These items and others all contribute to the quality of the SoC, so the Metric-Driven Verification (MDV) methodology is needed to unify it all into a coherent verification plan. This book is for verification engineers and managers familiar with the UVM and the benefits it brings to digital verification, but who also need to tackle specialized tasks. It is also written for the SoC project manager who is tasked with building an efficient worldwide team. While the task continues to become more complex, Advanced Verification Topics describes methodologies outside of the Accellera UVM standard, but that build on it, to provide a way for SoC teams to stay productive and profitable.

  8. Experiences in integrated leak rate measurements

    International Nuclear Information System (INIS)

    Shirk, R.E.

    1982-01-01

    During a hypothetical design basis accident at a nuclear power plant, the reactor containment system is relied upon to keep radioactive exposure below acceptable limits. Integrated leak rate testing is a means of verifying that the leakage of radioactive material from the reactor containment will be below allowable limits. Leakage rate computations are based on the ideal gas law. The absolute method of leakage rate testing, using the mass point method of data analysis, is recommended. Integrated leak rate testing data are obtained from pressure, dry-bulb temperature, dewpoint temperature, and flow measuring systems. Test data do not support the usual leakage (flow)-pressure square root relationship. The major source of potential leakage from the reactor containment is the reactor containment isolation valves
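
    In the mass point method mentioned above, the contained air mass (proportional to P/T at fixed free volume, by the ideal gas law) is regressed against time, and the leak rate follows from the fitted slope. The sketch below illustrates the idea with hypothetical pressure and temperature readings; it ignores vapour-pressure and instrument corrections that a real ILRT would include.

```python
import numpy as np

# Hypothetical hourly readings from an integrated leak rate test (ILRT).
hours = np.arange(0.0, 24.0, 1.0)                 # elapsed time [h]
p_abs = 4.0e5 - 12.0 * hours                      # absolute pressure [Pa] (illustrative)
t_abs = 300.0 + 0.05 * np.sin(hours / 3.0)        # dry-bulb temperature [K] (illustrative)

# Ideal gas law at constant free volume V: m = p*V/(R*T), so m is proportional to p/T.
# The constant factor cancels when the leak rate is expressed in percent per day.
mass_index = p_abs / t_abs

# Mass point method: least-squares fit of the mass index versus time.
slope, intercept = np.polyfit(hours, mass_index, 1)

# Leak rate as percent of the initially contained mass lost per 24 h.
leak_rate_pct_per_day = -slope * 24.0 / intercept * 100.0
print(f"Leak rate: {leak_rate_pct_per_day:.3f} %/day")
```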

  9. Field Test and Performance Verification: Integrated Active Desiccant Rooftop Hybrid System Installed in a School - Final Report: Phase 4A

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, J

    2005-12-21

    This report summarizes the results of a field verification pilot site investigation that involved the installation of a hybrid integrated active desiccant/vapor-compression rooftop heating, ventilation, and air-conditioning (HVAC) unit at an elementary school in the Atlanta, Georgia, area. For years, the school had experienced serious humidity and indoor air quality (IAQ) problems that had resulted in occupant complaints and microbial (mold) remediation. The outdoor air louvers of the original HVAC units had been closed in an attempt to improve humidity control within the space. The existing vapor-compression variable air volume system was replaced by the integrated active desiccant rooftop (IADR) system that was described in detail in an Oak Ridge National Laboratory (ORNL) report published in 2004 (Fischer and Sand 2004). The IADR system and all space conditions have been monitored remotely for more than a year. The hybrid system was able to maintain both the space temperature and humidity as desired while delivering the outdoor air ventilation rate required by American Society of Heating, Refrigerating and Air-Conditioning Engineers Standard 62. The performance level of the IADR unit and the overall system energy efficiency were measured and found to be very high. A comprehensive IAQ investigation was completed by the Georgia Tech Research Institute before and after the system retrofit. The before-and-after data resulting from this investigation confirmed a significant improvement in IAQ, humidity control, and occupant comfort. These observations were reported by building occupants and are echoed in a letter to ORNL from the school district energy manager. The IADR system was easily retrofitted in place of the original rooftop system using a custom curb adapter. All work was completed in-house by the school's maintenance staff over one weekend. A subsequent cost analysis completed for the school district by the design engineer of record concluded that the IADR

  10. Fusion Ignition Research Experiment System Integration

    International Nuclear Information System (INIS)

    Brown, T.

    2000-01-01

    This paper describes the current status of the FIRE configuration and the integration of the major subsystem components. FIRE has a major radius of 2 m, a field on axis of 10 T, and a plasma current of 6.4 MA. It is capable of 18 s pulses when operated with DT and 26 s when operated with DD. The general arrangement consists of sixteen wedged TF coils that surround a free-standing central solenoid, a double-wall vacuum vessel, and internal plasma facing components that are segmented for maintenance through horizontal ports. Large rings located outside the TF coils are used to obtain a load balance between wedging of the intercoil case structure and wedging at the upper/lower inboard corners of the TF coil winding. The magnets are liquid nitrogen cooled and the entire device is surrounded by a thermal enclosure. The double-wall vacuum vessel integrates cooling and shielding in a shape that maximizes shielding of ex-vessel components. Within the vacuum vessel, plasma-facing components frame the plasma. First wall tiles are attached directly to the inboard and outboard vacuum vessel walls. The divertor is designed for a high-triangularity, double-null plasma with a short inner null point-to-wall distance and a near vertical outer divertor flux line. The FIRE configuration has been developed to meet the physics objectives and subsystem requirements in an arrangement that allows remote maintenance of in-vessel components and hands-on maintenance of components outside the TF boundary

  11. Recover Act. Verification of Geothermal Tracer Methods in Highly Constrained Field Experiments

    Energy Technology Data Exchange (ETDEWEB)

    Becker, Matthew W. [California State University, Long Beach, CA (United States)

    2014-05-16

    The prediction of geothermal system efficiency is strongly linked to the character of the flow system that connects injector and producer wells. If water flow develops channels or "short circuiting" between injection and extraction wells, thermal sweep is poor and much of the reservoir is left untapped. The purpose of this project was to understand how channelized flow develops in fractured geothermal reservoirs and how it can be measured in the field. We explored two methods of assessing channelization: hydraulic connectivity tests and tracer tests. These methods were tested at a field site using two verification methods: ground penetrating radar (GPR) imaging of saline tracer and heat transfer measurements using distributed temperature sensing (DTS). The field site for these studies was the Altona Flat Fractured Rock Research Site located in northeastern New York State. Altona Flat Rock is an experimental site considered a geologic analog for some geothermal reservoirs given its low matrix porosity. Because the soil overburden is thin, it provided unique access to saturated bedrock fractures and the ability to image using GPR, which does not effectively penetrate most soils. Five boreholes were drilled in a "five spot" pattern covering 100 m2 and hydraulically isolated in a single bedding plane fracture. This simple system allowed a complete characterization of the fracture. Nine small-diameter boreholes were drilled from the surface to just above the fracture to allow the measurement of heat transfer between the fracture and the rock matrix. The focus of the hydraulic investigation was periodic hydraulic testing. In such tests, rather than pumping or injecting in a well at a constant rate, the flow is varied to produce an oscillating pressure signal. This pressure signal is sensed in other wells, and the attenuation and phase lag between the source and receptor are an indication of hydraulic connection. We found that these tests were much more effective than constant
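
    The attenuation and phase lag described above can be extracted by comparing the source and receptor pressure records at the driving frequency. The sketch below shows one way to do this with a discrete Fourier component at that frequency; the signals, driving period, and noise level are synthetic and purely illustrative, not taken from the Altona field data.

```python
import numpy as np

# Synthetic periodic hydraulic test with a made-up 10-minute driving period.
dt = 1.0                                   # sample interval [s]
period = 600.0                             # driving period [s] (illustrative)
f0 = 1.0 / period
t = np.arange(0.0, 20 * period, dt)

# Source-well and receptor-well pressure signals (illustrative amplitudes and lag).
p_source = 5.0 * np.sin(2 * np.pi * f0 * t)
p_obs = 1.2 * np.sin(2 * np.pi * f0 * t - 0.8) + 0.1 * np.random.randn(t.size)

def component_at(f, signal, t):
    """Complex Fourier component of `signal` at frequency f (rectangular window)."""
    return np.sum(signal * np.exp(-2j * np.pi * f * t)) * 2.0 / t.size

c_src = component_at(f0, p_source, t)
c_obs = component_at(f0, p_obs, t)

attenuation = np.abs(c_obs) / np.abs(c_src)            # amplitude ratio source -> receptor
phase_lag = np.angle(c_src) - np.angle(c_obs)          # [rad], positive = receptor lags
print(f"amplitude ratio = {attenuation:.2f}, phase lag = {phase_lag:.2f} rad")
```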

  12. Integrated circuits for particle physics experiments

    CERN Document Server

    Snoeys, W; Campbell, M; Cantatore, E; Faccio, F; Heijne, Erik H M; Jarron, Pierre; Kloukinas, Kostas C; Marchioro, A; Moreira, P; Toifl, Thomas H; Wyllie, Ken H

    2000-01-01

    High energy particle physics experiments investigate the nature of matter through the identification of subatomic particles produced in collisions of protons, electrons, or heavy ions which have been accelerated to very high energies. Future experiments will have hundreds of millions of detector channels to observe the interaction region where collisions take place at a 40 MHz rate. This paper gives an overview of the electronics requirements for such experiments and explains how data reduction, timing distribution, and radiation tolerance in commercial CMOS circuits are achieved for these big systems. As a detailed example, the electronics for the innermost layers of the future tracking detector, the pixel vertex detector, is discussed with special attention to system aspects. A small-scale prototype (130 channels) implemented in standard 0.25 µm CMOS remains fully functional after a 30 Mrad(SiO2) irradiation. A full-scale pixel readout chip containing 8000 readout channels in a 14 by 16 mm2 ar...

  13. Verification experiment of EPR paradox by (d, {sup 2}He) reaction

    Energy Technology Data Exchange (ETDEWEB)

    Sakai, Hideyuki [Tokyo Univ., Graudate School of Science, Tokyo (Japan)

    2003-01-01

    The EPR paradox, which was brought forward by Einstein, Podolsky and Rosen, is expressed theoretically by Bell's inequality for spin correlation. In principle it is possible to verify the inequality experimentally by measuring the spin correlation between two spin-1/2 particles emerging from the decay of a {sup 1}S{sub 0} state. Most of the past experiments to verify the inequality, however, have been performed using photons. On the other hand, only one experiment using a hadron system was carried out, by Lamehi and Mitting, in which the {sup 1}S{sub 0} state was first produced by proton-proton scattering and the spin orientations after the scattering were then measured. Unfortunately, there exist some sources of ambiguity that prevent a definite conclusion from their result, because the experiment was done at a rather high energy of 13.5 MeV. The experiment planned by the present author is designed to overcome the experimental difficulties which Lamehi and Mitting encountered by (1) generating a high-purity singlet {sup 1}S{sub 0} state of two protons by a (d, {sup 2}He) type nuclear reaction in the intermediate energy range, and (2) developing a high-performance spin-correlation polarimeter which can analyze the spins of the two protons simultaneously to minimize systematic errors. The excitation energy of {sup 2}He, corresponding to the proton-proton relative energy, can be controlled experimentally. An ideal singlet is realized by choosing the state with sufficiently small relative energy. It is planned to measure the spin correlation function using SMART (Swinger and Magnetic Analyzer with Rotator and Twister) at the RIKEN Accelerator Research Facility. The Einstein POLarimeter (EPOL), to be installed on the second focal plane of SMART, is under development; it will allow high-precision simultaneous measurements of the spin orientations of two high-energy protons entering a limited solid angle from the {sup 2}He decay, selecting the relevant events from very many background events. Monte Carlo
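
    For reference, the singlet-state spin correlation that such an experiment sets out to measure is E(a, b) = -cos(theta_ab) for analyzer directions separated by the angle theta_ab, and the CHSH form of Bell's inequality bounds a particular combination of four such correlations by 2 for any local hidden-variable model. The short sketch below simply evaluates the quantum-mechanical prediction at the standard optimal angles; it illustrates the inequality being tested, not the experiment's analysis chain.

```python
import numpy as np

def E(theta_ab):
    """Quantum-mechanical spin correlation of the singlet state for
    analyzers separated by the angle theta_ab (radians)."""
    return -np.cos(theta_ab)

# Standard CHSH analyzer settings (angles in a common plane, radians).
a, a_prime = 0.0, np.pi / 2
b, b_prime = np.pi / 4, 3 * np.pi / 4

# CHSH combination: |S| <= 2 for any local hidden-variable theory.
S = E(a - b) - E(a - b_prime) + E(a_prime - b) + E(a_prime - b_prime)
print(f"|S| = {abs(S):.3f}  (local realism requires |S| <= 2; QM predicts 2*sqrt(2) ~ 2.828)")
```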

  14. Integrated vector management: the Zambian experience.

    Science.gov (United States)

    Chanda, Emmanuel; Masaninga, Fred; Coleman, Michael; Sikaala, Chadwick; Katebe, Cecilia; Macdonald, Michael; Baboo, Kumar S; Govere, John; Manga, Lucien

    2008-08-27

    The Zambian Malaria Control Programme, with the Roll Back Malaria (RBM) partners, has developed the current National Malaria Strategic Plan (NMSP 2006-2011), which focuses on prevention based on the Integrated Vector Management (IVM) strategy. The introduction and implementation of an IVM strategy was planned in accordance with the World Health Organization (WHO) steps towards IVM implementation, namely the Introduction Phase, Consolidation Phase and Expansion Phase. IVM has created commitment for legal and regulatory policy review, monitoring, research and a strong stewardship by the chemical suppliers. It has also leveraged additional resources, improved inter-sectoral collaboration and capacity building, and enhanced community participation, which facilitated a steady scaling up in coverage and utilisation of key preventive interventions, thus markedly reducing malaria incidence and case fatalities in the country. Zambia has successfully introduced, consolidated and expanded IVM activities, resulting in increased coverage and utilization of interventions and markedly reduced malaria-related morbidity and mortality while ensuring better protection of the environment.

  15. Integrating mental health services: the Finnish experience

    Directory of Open Access Journals (Sweden)

    Ville Lehtinen

    2001-06-01

    Full Text Available The aim of this paper is to give a short description of the most important developments in mental health services in Finland during the 1990s, examine their influence on the organisation and provision of services, and describe briefly some national efforts to handle the new situation. The Finnish mental health service system experienced profound changes at the beginning of the 1990s. These included the integration of mental health services, which had earlier been under their own separate administration, with other specialised health services, decentralisation of the financing of health services, and de-institutionalisation of the services. At the same time Finland underwent the deepest economic recession in Western Europe, which resulted in cutbacks especially in the mental health budgets. Conducting extensive national research and development programmes in the field of mental health has been one typically Finnish way of supporting mental health service development. The first of these national programmes was the Schizophrenia Project 1981–97, whose main aims were to decrease the incidence of new long-term patients and the prevalence of old long-stay patients by developing an integrated treatment model. The Suicide Prevention Project 1986–96 aimed at raising awareness of this special problem and decreasing by 20% the proportionally high suicide rate in Finland. The National Depression Programme 1994–98 addressed this clearly increasing public health concern through several research and development projects targeted both at the general population and specifically at children, primary care and specialised services. The latest, still on-going Meaningful Life Programme 1998–2003 has as its main aim, through multi-sectoral co-operation, to improve the quality of life for people suffering from or living with the threat of mental disorders. Furthermore, the government launched in 1999 a new Goal and Action Programme for Social Welfare and Health Care 2000–2003, in

  16. Verification of atmospheric diffusion models with data of atmospheric diffusion experiments

    International Nuclear Information System (INIS)

    Hato, Shinji; Homma, Toshimitsu

    2009-02-01

    Atmospheric diffusion experiments were carried out by the Japan Atomic Energy Research Institute (JAERI) around Mount Tsukuba in 1989 and 1990, and the tracer gas concentrations were monitored. In this study, the Gaussian plume model and RAMS/HYPACT, a meteorological forecast code coupled with an atmospheric diffusion code based on detailed physical laws, are compared against the monitored concentrations. In conclusion, the Gaussian plume model performs better than RAMS/HYPACT, even over complex topography, if the estimate is made within tens of kilometers of the release point and the weather remains constant over a short time. The reason is the difference between the wind fields in RAMS and the observations. (author)
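
    As a reminder of what the Gaussian plume model computes, the sketch below evaluates the standard ground-level concentration formula for a continuous elevated release, including ground reflection. The source strength, wind speed, release height, and the power-law dispersion coefficients are illustrative assumptions, not values from the JAERI experiments.

```python
import numpy as np

def gaussian_plume(x, y, q=1.0, u=3.0, h=50.0):
    """Ground-level (z = 0) concentration [g/m^3] of a continuous point release.

    x, y : downwind and crosswind distances [m]
    q    : emission rate [g/s]          (illustrative)
    u    : mean wind speed [m/s]        (illustrative)
    h    : effective release height [m] (illustrative)
    """
    # Simple power-law dispersion coefficients (roughly neutral, rural conditions).
    sigma_y = 0.08 * x / np.sqrt(1.0 + 0.0001 * x)
    sigma_z = 0.06 * x / np.sqrt(1.0 + 0.0015 * x)
    # Ground-level form of C = Q/(2*pi*u*sy*sz) * exp(-y^2/2sy^2) * [refl. terms],
    # where the two reflection terms coincide at z = 0.
    return (q / (np.pi * u * sigma_y * sigma_z)
            * np.exp(-y**2 / (2.0 * sigma_y**2))
            * np.exp(-h**2 / (2.0 * sigma_z**2)))

# Centerline concentration at a few downwind distances.
for x in (500.0, 2000.0, 10000.0):
    print(f"x = {x:7.0f} m : C = {gaussian_plume(x, 0.0):.3e} g/m^3")
```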

  17. Integrated Networks: National and International Online Experiences

    Directory of Open Access Journals (Sweden)

    Osvaldo Muniz-Solari

    2009-02-01

    Full Text Available There is an increasing impression among online geography educators that interaction can be developed based on specific teaching and learning methods. The authors developed a practical research study to investigate this issue. The study was based on advanced graduate courses in geography at Beijing Normal University and Texas State University. International interaction was complemented by online collaboration among the US local group. Both synchronous and asynchronous communication systems were used, which spanned two platforms. Results of this experience indicate that teaching and learning methods must be enhanced by a flexible online learning model and extensive organizational support in order to increase interaction and reach a certain level of cooperation.

  18. Development of independent MU/treatment time verification algorithm for non-IMRT treatment planning: A clinical experience

    Science.gov (United States)

    Tatli, Hamza; Yucel, Derya; Yilmaz, Sercan; Fayda, Merdan

    2018-02-01

    The aim of this study is to develop an algorithm for independent MU/treatment time (TT) verification for non-IMRT treatment plans, as part of a QA program to ensure treatment delivery accuracy. Two radiotherapy delivery units and their treatment planning systems (TPSs) were commissioned in the Liv Hospital Radiation Medicine Center, Tbilisi, Georgia. Beam data were collected according to the vendors' collection guidelines and AAPM report recommendations, and processed in Microsoft Excel during in-house algorithm development. The algorithm is designed and optimized for calculating SSD and SAD treatment plans, based on the AAPM TG-114 dose calculation recommendations, and is coded and embedded in an MS Excel spreadsheet as a preliminary verification algorithm (VA). Treatment verification plans were created by the TPSs based on IAEA TRS-430 recommendations, also calculated by the VA, and point measurements were collected with a solid water phantom and compared. The study showed that the in-house VA can be used for MU/TT verification of non-IMRT plans.
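
    A TG-114-style independent check essentially divides the prescribed point dose by the product of the machine calibration and the relevant dosimetric factors. The sketch below shows the idea for a simple SSD photon field; all factor values are invented for illustration, and a clinical implementation would interpolate them from commissioned beam data tables rather than hard-code them.

```python
# Simplified independent MU check for a single SSD photon field (TG-114 style).
# All numerical factors below are illustrative placeholders, not commissioned data.

prescribed_dose = 2.0      # dose per fraction at the prescription point [Gy]
dose_rate_ref = 0.01       # calibration: Gy per MU at reference depth and field size
s_c = 0.992                # collimator (in-air output) scatter factor
s_p = 0.995                # phantom scatter factor
pdd = 0.862                # percent depth dose at the prescription depth (as a fraction)
wedge_factor = 1.0         # no wedge in this example
tray_factor = 1.0          # no blocking tray

mu = prescribed_dose / (dose_rate_ref * s_c * s_p * pdd * wedge_factor * tray_factor)
print(f"Independent MU estimate: {mu:.1f} MU")

# Typical practice: compare against the TPS value and flag deviations above a tolerance.
mu_tps = 236.0             # hypothetical TPS value
deviation = (mu - mu_tps) / mu_tps * 100.0
print(f"Deviation from TPS: {deviation:+.1f} %")
```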

  19. Sustainable Development Impacts of NAMAs: An integrated approach to assessment of co-benefits based on experience with the CDM

    DEFF Research Database (Denmark)

    Olsen, Karen Holm

    to assess the SD impacts of NAMAs. This paper argues for a new integrated approach to assess NAMAs' SD impacts that consists of SD indicators, procedures for stakeholder involvement and safeguards against negative impacts. The argument is based on a review of experience with the CDM’s contribution to SD...... and a comparison of similarities and differences between NAMAs and CDM. Five elements of a new approach towards assessment of NAMAs' SD impacts are suggested based on emerging approaches and methodologies for monitoring, reporting and verification (MRV) of greenhouse gas reductions and SD impacts of NAMAs....

  20. The design of verification regimes

    International Nuclear Information System (INIS)

    Gallagher, N.W.

    1991-01-01

    Verification of a nuclear agreement requires more than knowledge of relevant technologies and institutional arrangements. It also demands thorough understanding of the nature of verification and the politics of verification design. Arms control efforts have been stymied in the past because key players agreed to verification in principle, only to disagree radically over verification in practice. In this chapter, it is shown that the success and stability of arms control endeavors can be undermined by verification designs which promote unilateral rather than cooperative approaches to security, and which may reduce, rather than enhance, the security of both sides. Drawing on logical analysis and practical lessons from previous superpower verification experience, this chapter summarizes the logic and politics of verification and suggests implications for South Asia. The discussion begins by determining what properties all forms of verification have in common, regardless of the participants or the substance and form of their agreement. Viewing verification as the political process of making decisions regarding the occurrence of cooperation points to four critical components: (1) determination of principles, (2) information gathering, (3) analysis and (4) projection. It is shown that verification arrangements differ primarily in regards to how effectively and by whom these four stages are carried out

  1. Complementarity of integral and differential experiments for reactor physics purposes

    International Nuclear Information System (INIS)

    Tellier, Henry.

    1981-04-01

    In this paper, the following topics are studied: the uranium-238 effective integral, the thermal-range uranium-238 capture cross section, and the americium-242m capture cross section. The examples mentioned show that differential and integral experiments are both useful to reactor physicists

  2. Large scale statistics for computational verification of grain growth simulations with experiments

    International Nuclear Information System (INIS)

    Demirel, Melik C.; Kuprat, Andrew P.; George, Denise C.; Straub, G.K.; Misra, Amit; Alexander, Kathleen B.; Rollett, Anthony D.

    2002-01-01

    It is known that, by controlling microstructural development, desirable material properties can be achieved. The main objective of our research is to understand and control interface-dominated material properties and, finally, to verify experimental results with computer simulations. We have previously shown a strong similarity between small-scale grain growth experiments and anisotropic three-dimensional simulations obtained from Electron Backscattered Diffraction (EBSD) measurements. Using the same technique, we obtained 5170-grain data from an aluminum film (120 micrometers thick) with a columnar grain structure. The experimentally obtained starting microstructure and grain boundary properties are input for the three-dimensional grain growth simulation. In the computational model, minimization of the interface energy is the driving force for grain boundary motion. The computed evolved microstructure is compared with the final experimental microstructure after annealing at 550 C. Characterization of the structures and properties of grain boundary networks (GBNs) to produce desirable microstructures is one of the fundamental problems in interface science. There is ongoing research on the development of new experimental and analytical techniques in order to obtain and synthesize information related to GBNs. The grain boundary energy and mobility data were characterized by the Electron Backscattered Diffraction (EBSD) technique and Atomic Force Microscopy (AFM) observations (i.e., for ceramic MgO and for the metal Al). Grain boundary energies are extracted from triple junction (TJ) geometry considering the local equilibrium condition at TJs. Relative boundary mobilities were also extracted from TJs through a statistical/multiscale analysis. Additionally, there have been recent theoretical developments on grain boundary evolution in microstructures. In this paper, a new technique for three-dimensional grain growth simulations was used to simulate interface migration
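
    The local equilibrium condition at a triple junction, with torque terms neglected, reduces to a Young-type relation in which each boundary energy is proportional to the sine of the dihedral angle opposite it. The sketch below recovers relative boundary energies from measured dihedral angles; the angles are invented for illustration, and this is a simplification of the statistical/multiscale analysis described above.

```python
import numpy as np

def relative_gb_energies(dihedral_angles_deg):
    """Relative grain boundary energies at a triple junction from the
    Young/Herring force balance (torque terms neglected):
        gamma_1 / sin(chi_1) = gamma_2 / sin(chi_2) = gamma_3 / sin(chi_3),
    where chi_i is the dihedral angle opposite boundary i."""
    chi = np.radians(np.asarray(dihedral_angles_deg, dtype=float))
    assert np.isclose(chi.sum(), 2 * np.pi, atol=1e-6), "angles must sum to 360 deg"
    gamma = np.sin(chi)
    return gamma / gamma.max()        # normalize to the highest-energy boundary

# Hypothetical dihedral angles measured from an EBSD map (must sum to 360 deg).
angles = [130.0, 115.0, 115.0]
print(relative_gb_energies(angles))
# -> approx. [0.846, 1.0, 1.0]; the boundary opposite the 130 deg angle has the lowest energy.
```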

  3. Integrated vector management: The Zambian experience

    Directory of Open Access Journals (Sweden)

    Katebe Cecilia

    2008-08-01

    Full Text Available Abstract. Background: The Zambian Malaria Control Programme with the Roll Back Malaria (RBM) partners have developed the current National Malaria Strategic Plan (NMSP 2006–2011), which focuses on prevention based on the Integrated Vector Management (IVM) strategy. The introduction and implementation of an IVM strategy was planned in accordance with the World Health Organization (WHO) steps towards IVM implementation, namely the Introduction Phase, Consolidation Phase and Expansion Phase. Achievements: IVM has created commitment for legal and regulatory policy review, monitoring, research and a strong stewardship by the chemical suppliers. It has also leveraged additional resources, improved inter-sectoral collaboration and capacity building, and enhanced community participation, which facilitated a steady scaling up in coverage and utilisation of key preventive interventions, thus markedly reducing malaria incidence and case fatalities in the country. Conclusion: Zambia has successfully introduced, consolidated and expanded IVM activities, resulting in increased coverage and utilization of interventions and markedly reduced malaria-related morbidity and mortality while ensuring better protection of the environment.

  4. The integrated project as a learning experience

    Directory of Open Access Journals (Sweden)

    Maria Angeles Antequera

    2012-03-01

    Full Text Available Florida is a higher education centre specialising in technical and business training. Postgraduate programmes, university qualifications, vocational training, secondary education, further education, occupational training and languages are taught at Florida. An educational model in accordance with the demands of the European Higher Education Area has been designed, focussing on teaching for professional competencies. We have chosen a methodology which promotes the development of skills and abilities, promotes participation, and is student-centred, since students must seek out knowledge themselves, thus connecting the educational world with the real world. In the different university degrees taught in our centre, each year the student carries out a project set in a real context which integrates specific competencies from the course subjects and develops transversal competencies associated with the project, which are the object of planning and progressive learning: team work, effective communication, conflict resolution, leadership skills, innovation and creativity. The integrated project (IP) accounts for 25% of each course in terms of objectives, scheduling and final evaluation. The project grade is an individual grade for each student and is the same for all subjects which form part of the project.

  5. Verification of HELIOS-MASTER system through benchmark of critical experiments

    International Nuclear Information System (INIS)

    Kim, H. Y.; Kim, K. Y.; Cho, B. O.; Lee, C. C.; Zee, S. O.

    1999-01-01

    The HELIOS-MASTER code system is verified through benchmarks of critical experiments that were performed by the RRC 'Kurchatov Institute' with water-moderated, hexagonally pitched lattices of highly enriched uranium fuel rods (80 w/o). We also performed calculations with the MCNP code using the same input described in the evaluation report, and compared our results with those of the report. HELIOS, developed by Scandpower A/S, is a two-dimensional transport program for the generation of group cross-sections, and MASTER, developed by KAERI, is a three-dimensional nuclear design and analysis code based on two-group diffusion theory. It solves the neutronics model with the AFEN (Analytic Function Expansion Nodal) method for hexagonal geometry. The results show that the HELIOS-MASTER code system is fast and accurate enough to be used as a nuclear core analysis tool for hexagonal geometry
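
    To illustrate the kind of quantity such a lattice-plus-diffusion chain produces, the sketch below evaluates the textbook two-group infinite-medium multiplication factor from a set of homogenized group constants of the sort a lattice code generates. The cross-section values are invented placeholders, and the relation is the standard two-group diffusion-theory formula, not the AFEN nodal solver used in MASTER.

```python
# Two-group infinite-medium multiplication factor from homogenized group constants.
# All cross sections below are illustrative placeholders [1/cm], not HELIOS output.

nu_sig_f1 = 0.0085   # nu * Sigma_f, fast group
nu_sig_f2 = 0.1500   # nu * Sigma_f, thermal group
sig_a1 = 0.0100      # absorption, fast group
sig_a2 = 0.1000      # absorption, thermal group
sig_s12 = 0.0180     # downscattering, fast -> thermal

# Thermal-to-fast flux ratio from the thermal-group balance: Sigma_a2 * phi2 = Sigma_s12 * phi1
flux_ratio = sig_s12 / sig_a2

# k_inf = production / absorption (per unit fast flux)
k_inf = (nu_sig_f1 + nu_sig_f2 * flux_ratio) / (sig_a1 + sig_s12)
print(f"k_inf = {k_inf:.4f}")
```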

  6. A new solver for granular avalanche simulation: Indoor experiment verification and field scale case study

    Science.gov (United States)

    Wang, XiaoLiang; Li, JiaChun

    2017-12-01

    A new solver based on the high-resolution scheme with novel treatments of source terms and interface capture for the Savage-Hutter model is developed to simulate granular avalanche flows. The capability to simulate flow spread and deposit processes is verified through indoor experiments of a two-dimensional granular avalanche. Parameter studies show that reduction in bed friction enhances runout efficiency, and that lower earth pressure restraints enlarge the deposit spread. The April 9, 2000, Yigong avalanche in Tibet, China, is simulated as a case study by this new solver. The predicted results, including evolution process, deposit spread, and hazard impacts, generally agree with site observations. It is concluded that the new solver for the Savage-Hutter equation provides a comprehensive software platform for granular avalanche simulation at both experimental and field scales. In particular, the solver can be a valuable tool for providing necessary information for hazard forecasts, disaster mitigation, and countermeasure decisions in mountainous areas.

  7. Verification of MC{sup 2}-3 Doppler Sample Models in ZPPR-15 D Experiments

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Min Jae; Hartanto, Donny; Kim, Sang Ji [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    In this paper, the changes in reaction rate and broadened cross section were estimated with as-built MCNP models for the metallic uranium sample in ZPPR-15D using the ENDF/B-VII.0 library, and the results were compared with the deterministic calculations provided in previous work. Doppler broadening is a prompt feedback mechanism that improves safety and stability for both thermal and fast reactors. Therefore, the accuracy of the Doppler coefficient is an important parameter in reactor design as well as in safety analysis. The capability of the Doppler worth calculation by a modern code suite such as MC2-3 and DIF3DVARIANT has been validated against the Zero Power Physics Reactor-15 (ZPPR-15) Doppler worth measurement experiments. For the same experiments, our previous work suggested four different MC2-3 Doppler sample models for enhanced accuracy, which are combinations of heterogeneous models and the super-cell approach. The MOC and MOC-SPC models showed the smallest error in estimating the U-238 total cross section of Doppler sample N-11, and the Doppler broadening effects are well reflected in the cross section compared with the other two models, HOM and SPC. The effect of the super-cell approach can hardly be seen, since the broadened cross section is almost the same with and without it. Comparing the transition of the reaction density, the MOC and MOC-SPC models also show behavior similar to MCNP's, with minor errors. In conclusion, more consistent broadened cross sections as well as reaction density transitions could be obtained by providing heterogeneous models from MC2-3's MOC module.
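
    As a reminder of how a sample Doppler worth is usually reduced to a single figure of merit, the sketch below converts two multiplication factors obtained with the sample at two temperatures into a reactivity difference and a Doppler constant. The k-eff values and temperatures are invented for illustration, and the logarithmic temperature dependence assumed for the Doppler constant is the conventional fast-reactor model, not necessarily the one used in the paper.

```python
import math

def reactivity(k_eff):
    """Static reactivity in units of dk/k."""
    return (k_eff - 1.0) / k_eff

# Hypothetical eigenvalues with the Doppler sample at two temperatures.
T_cold, k_cold = 300.0, 1.00125    # [K], k-eff (illustrative)
T_hot,  k_hot  = 1100.0, 1.00050   # [K], k-eff (illustrative)

doppler_worth = reactivity(k_hot) - reactivity(k_cold)          # dk/k
doppler_constant = doppler_worth / math.log(T_hot / T_cold)     # K_D, assuming T*drho/dT = const

print(f"Doppler worth    = {doppler_worth * 1e5:+.1f} pcm")
print(f"Doppler constant = {doppler_constant * 1e5:+.1f} pcm")
```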

  8. BIOMEX Experiment: Ultrastructural Alterations, Molecular Damage and Survival of the Fungus Cryomyces antarcticus after the Experiment Verification Tests

    Science.gov (United States)

    Pacelli, Claudia; Selbmann, Laura; Zucconi, Laura; De Vera, Jean-Pierre; Rabbow, Elke; Horneck, Gerda; de la Torre, Rosa; Onofri, Silvano

    2017-06-01

    The search for traces of extinct or extant life in extraterrestrial environments is one of the main goals for astrobiologists; due to their ability to withstand stress producing conditions, extremophiles are perfect candidates for astrobiological studies. The BIOMEX project aims to test the ability of biomolecules and cell components to preserve their stability under space and Mars-like conditions, while at the same time investigating the survival capability of microorganisms. The experiment has been launched into space and is being exposed on the EXPOSE-R2 payload, outside of the International Space Station (ISS) over a time-span of 1.5 years. Along with a number of other extremophilic microorganisms, the Antarctic cryptoendolithic black fungus Cryomyces antarcticus CCFEE 515 has been included in the experiment. Before launch, dried colonies grown on Lunar and Martian regolith analogues were exposed to vacuum, irradiation and temperature cycles in ground based experiments (EVT1 and EVT2). Cultural and molecular tests revealed that the fungus survived on rock analogues under space and simulated Martian conditions, showing only slight ultra-structural and molecular damage.

  9. Scalable Techniques for Formal Verification

    CERN Document Server

    Ray, Sandip

    2010-01-01

    This book presents state-of-the-art approaches to formal verification techniques to seamlessly integrate different formal verification methods within a single logical foundation. It should benefit researchers and practitioners looking to get a broad overview of the spectrum of formal verification techniques, as well as approaches to combining such techniques within a single framework. Coverage includes a range of case studies showing how such combination is fruitful in developing a scalable verification methodology for industrial designs. This book outlines both theoretical and practical issue

  10. The regional energy integration: the latin-american experiences

    International Nuclear Information System (INIS)

    2003-01-01

    The paths of regional economic integration are not identical and have different repercussions on markets and on the evolution of the energy industries. Latin America offers a variety of experiences for evaluating the stakes and the limits of each form of regional integration. These limits lead to searches for solutions that include undeniable convergences. The first part of this document presents the genesis of these regional economic integration experiences in Latin America; the second part studies the energy consequences of the liberal ALENA and of the more political MERCOSUR. (A.L.B.)

  11. Experiment and analysis of CASTOR type model cask for verification of radiation shielding

    Energy Technology Data Exchange (ETDEWEB)

    Hattori, Seiichi; Ueki, Kohtaro.

    1988-08-01

    The radiation shielding system of the CASTOR type cask is composed of graphite cast iron and polyethylene rods. The former forms the cylindrical body of the cask to shield gamma rays, and the latter is embedded in the body to shield neutrons. A characteristic of the radiation shielding of the CASTOR type cask is that a zigzag arrangement of the polyethylene rods is adopted to make the penetrating dose rate uniform. It is necessary to use a three-dimensional analysis code to analyse the shielding performance of a cask with such a complicated shielding system precisely. However, this takes too much time as well as too much cost. Therefore, a two-dimensional analysis is usually applied, in which the three-dimensional model is equivalently transformed into a two-dimensional calculation. This research study was conducted to verify the applicability of the two-dimensional analysis, in which an experiment and an analysis using a CASTOR type model cask were performed. The model cask was manufactured by the GNS company in West Germany, and the shielding ability test facilities at CRIEPI were used. It was judged from the study that the two-dimensional analysis is a useful means for practical use.

  12. Fabrication Improvement of Cold Forging Hexagonal Nuts by Computational Analysis and Experiment Verification

    Directory of Open Access Journals (Sweden)

    Shao-Yi Hsia

    2015-01-01

    Full Text Available Cold forging has played a critical role in fasteners and has been applied to the automobile, construction and aerospace industries as well as to consumer products, so that cold forging presents opportunities for manufacturing many more products. Using computer simulation, this study analyzes the process of creating machine parts such as hexagonal nuts. The DEFORM-3D forming software is applied to analyze the process at various stages of the computer simulation, and a compression test is also used to obtain the flow stress equation, in order to compare the differences between the experimental results and the equation built into the simulation software. At the same time, metallography and hardness experiments are used to understand the cold forging characteristics of hexagonal nuts. The research results should help machinery businesses understand the forging load and forming conditions at the various stages before fastener formation. In addition to supporting proper die design and production planning, the quality of the produced hexagonal nuts would be more stable, promoting industrial competitiveness.
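
    Flow stress equations obtained from such compression tests are often written as a power law, sigma = K * epsilon^n. The sketch below fits K and n to a few made-up true stress-strain points by linear regression in log space; the data values are purely illustrative, and the power-law form is an assumption about how the flow curve was parameterized.

```python
import numpy as np

# Hypothetical true strain / true stress pairs from a compression test.
strain = np.array([0.02, 0.05, 0.10, 0.20, 0.40])        # true plastic strain [-]
stress = np.array([310.0, 365.0, 420.0, 485.0, 560.0])   # true stress [MPa]

# Power-law flow stress: sigma = K * eps^n  ->  log(sigma) = log(K) + n * log(eps)
n, logK = np.polyfit(np.log(strain), np.log(stress), 1)
K = np.exp(logK)
print(f"K = {K:.1f} MPa, n = {n:.3f}")

# The fitted curve can then be supplied to the forming simulation as the material model.
print("sigma(eps = 0.3) =", round(K * 0.3**n, 1), "MPa")
```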

  13. Verification of the Global Precipitation Measurement (GPM) Satellite by the Olympic Mountains Experiment (OLYMPEX)

    Science.gov (United States)

    McMurdie, L. A.; Houze, R.

    2017-12-01

    Measurements of global precipitation are critical for monitoring Earth's water resources and hydrological processes, including flooding and snowpack accumulation. As such, the Global Precipitation Measurement (GPM) Mission 'Core' satellite detects precipitation ranging from light snow to heavy downpours in a wide range of locations, including remote mountainous regions. The Olympic Mountains Experiment (OLYMPEX), conducted during the 2015-2016 fall-winter season in the mountainous Olympic Peninsula of Washington State, provided physical and hydrological validation for GPM precipitation algorithms and insight into the modification of midlatitude storms by passage over mountains. The instrumentation included ground-based dual-polarization Doppler radars on the windward and leeward sides of the Olympic Mountains; surface stations that measured precipitation rates, particle size distributions and fall velocities at various altitudes; research aircraft equipped with cloud microphysics probes, radars, lidar, and passive radiometers; supplemental rawinsondes and dropsondes; and autonomous recording cameras that monitored snowpack accumulation. Results based on drop size distributions (DSDs) and cross-sections of radar reflectivity over the ocean and windward slopes have revealed important considerations for GPM algorithm development. During periods of heavy precipitation accumulation and enhancement by the mountains on windward slopes, both warm-rain and ice-phase processes are present, implying that it is important for GPM retrievals to be sensitive to both types of precipitation mechanisms and to represent accurately the concentration of precipitation at the lowest possible altitudes. OLYMPEX data revealed that a given rain rate can be associated with a variety of DSDs, which presents a challenge for GPM precipitation retrievals in extratropical cyclones passing over mountains. Some of the DSD regimes measured during OLYMPEX stratiform periods have the same characteristics found in prior
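
    The non-uniqueness noted above can be illustrated with a simple exponential DSD: two quite different combinations of intercept and slope parameters can yield nearly the same rain rate while producing very different radar reflectivities. The sketch below integrates both quantities numerically; the DSD parameters and the fall-speed relation are textbook-style assumptions, not OLYMPEX retrievals.

```python
import numpy as np

def rain_rate_and_Z(n0, lam, d_max=8.0):
    """Rain rate [mm/h] and reflectivity factor [dBZ] for an exponential DSD
    N(D) = n0 * exp(-lam * D), with D in mm and N in mm^-1 m^-3."""
    d = np.linspace(0.05, d_max, 4000)                  # drop diameter [mm]
    dd = d[1] - d[0]
    nd = n0 * np.exp(-lam * d)
    v = 3.78 * d**0.67                                  # fall speed [m/s] (power-law assumption)
    rate = 6.0e-4 * np.pi * np.sum(nd * d**3 * v) * dd  # volume flux converted to mm/h
    z = np.sum(nd * d**6) * dd                          # reflectivity factor [mm^6 m^-3]
    return rate, 10.0 * np.log10(z)

# Two illustrative DSDs: many small drops vs. fewer large drops.
for n0, lam in [(16000.0, 3.0), (1500.0, 1.8)]:
    r, dbz = rain_rate_and_Z(n0, lam)
    print(f"N0 = {n0:8.0f} mm^-1 m^-3, Lambda = {lam:.1f} mm^-1:  R = {r:5.2f} mm/h,  Z = {dbz:5.1f} dBZ")
# Both cases give roughly 10 mm/h, but the reflectivities differ by several dB.
```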

  14. Design and verification experiments for the windowless spallation target of the ADS prototype Myrrha

    International Nuclear Information System (INIS)

    Van Tichelen, Katrien; Kupschus, P.; Arien, B.; Ait Abderrahim, H.

    2003-01-01

    future programme for the optimisation of the windowless design. These design activities include both experiments and CFD calculations and their interaction. (authors)

  15. Engineering Physics Division integral experiments and their analyses

    International Nuclear Information System (INIS)

    Anon.

    1980-01-01

    Integral experiments are performed as part of the Engineering Physics Division's on-going research in the development and application of radiation shielding methods. Integral experiments performed at the Oak Ridge Electron Linear Accelerator (ORELA) under the Division's Magnetic Fusion program are designed to provide data against which ORNL and all other organizations involved in shielding calculations for fusion devices can test their calculational methods and interaction data. The Tower Shielding Facility (TSF) continues to be the primary source of integral data for fission reactor shielding design. The experiments performed at the TSF during the last few years have been sponsored by the Gas Cooled Fast Reactor (GCFR) program. During this report period final documentation was also prepared for the remaining LMFBR shielding experiments, including an examination of streaming through annular slits and measurement of secondary gamma-ray production in reinforced concrete

  16. The regional energy integration: the latin-american experiences; L'integration energetique regionale: les experiences latino-americaines

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2003-07-01

    The paths of regional economic integration are not identical and have different repercussions on markets and on the evolution of the energy industries. Latin America offers a variety of experiences for evaluating the stakes and the limits of each form of regional integration. These limits lead to searches for solutions that include undeniable convergences. The first part of this document presents the genesis of these regional economic integration experiences in Latin America; the second part studies the energy consequences of the liberal ALENA and of the more political MERCOSUR. (A.L.B.)

  18. SU-F-J-116: Clinical Experience-Based Verification and Improvement of a 4DCT Program

    Energy Technology Data Exchange (ETDEWEB)

    Fogg, P; West, M; Aland, T [Genesis Cancer Care, Auchenflower, Qld (Australia)

    2016-06-15

    Purpose: To demonstrate the role of continuous improvement fulfilled by the Medical Physicist in clinical 4DCT and CBCT scanning. Methods: Lung (SABR and standard) patients' 4D respiratory motion and image data were reviewed over 3, 6 and 12 month periods following commissioning testing. By identifying trends in clinically relevant parameters and respiratory motions, variables were tested with a programmable motion phantom and assessed. Patient traces were imported into the motion phantom and 4DCT and CBCT imaging were performed. Cos6 surrogate and sup-inf motion was also programmed into the phantom to simulate the long exhale of patients for image contrast tests. Results: Patient surrogate motion amplitudes were 9.9 ± 5.2 mm (3–35) at 18 ± 6 bpm (6–30). Expiration/inspiration time ratios of 1.4 ± 0.5 second (0.6–2.9) showed image contrast effects evident in the AveCT and 3D CBCT images. Small differences were found for patients with multiple 4DCT data sets. Patient motion assessments were simulated and verified with the phantom to within 2 mm. Initial image reviews to check for reconstruction artefacts and data loss identified a small number of patients with irregularities in the automatic placement of inspiration and expiration points. Conclusion: The Physicist's involvement in the continuous improvement of a clinically commissioned technique, its processes and workflows continues beyond the commissioning stage of a project. Our experience with our clinical 4DCT program shows that Physics presence is required at the clinical 4DCT scan, both to assist with technical aspects of the scan and for clinical image quality assessment prior to voluming. The results of this work enabled the sharing of information from the Medical Physics group with the Radiation Oncologists and Radiation Therapists, resulting in an improved awareness of clinical patient respiration variables and how they may affect 4D simulation images and treatment verification images.

  19. Verification study of the FORE-2M nuclear/thermal-hydraulilc analysis computer code

    International Nuclear Information System (INIS)

    Coffield, R.D.; Tang, Y.S.; Markley, R.A.

    1982-01-01

    The verification of the LMFBR core transient performance code, FORE-2M, was performed in two steps. Different components of the computation (individual models) were verified by comparing with analytical solutions and with results obtained from other conventionally accepted computer codes (e.g., TRUMP, LIFE, etc.). For verification of the integral computation method of the code, experimental data in TREAT, SEFOR and natural circulation experiments in EBR-II were compared with the code calculations. Good agreement was obtained for both of these steps. Confirmation of the code verification for undercooling transients is provided by comparisons with the recent FFTF natural circulation experiments. (orig.)

  20. Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) verification and validation plan. version 1.

    Energy Technology Data Exchange (ETDEWEB)

    Bartlett, Roscoe Ainsworth; Arguello, Jose Guadalupe, Jr.; Urbina, Angel; Bouchard, Julie F.; Edwards, Harold Carter; Freeze, Geoffrey A.; Knupp, Patrick Michael; Wang, Yifeng; Schultz, Peter Andrew; Howard, Robert (Oak Ridge National Laboratory, Oak Ridge, TN); McCornack, Marjorie Turner

    2011-01-01

    The objective of the U.S. Department of Energy Office of Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) is to provide an integrated suite of computational modeling and simulation (M&S) capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive-waste storage facility or disposal repository. To meet this objective, NEAMS Waste IPSC M&S capabilities will be applied to challenging spatial domains, temporal domains, multiphysics couplings, and multiscale couplings. A strategic verification and validation (V&V) goal is to establish evidence-based metrics for the level of confidence in M&S codes and capabilities. Because it is economically impractical to apply the maximum V&V rigor to each and every M&S capability, M&S capabilities will be ranked for their impact on the performance assessments of various components of the repository systems. Those M&S capabilities with greater impact will require a greater level of confidence and a correspondingly greater investment in V&V. This report includes five major components: (1) a background summary of the NEAMS Waste IPSC to emphasize M&S challenges; (2) the conceptual foundation for verification, validation, and confidence assessment of NEAMS Waste IPSC M&S capabilities; (3) specifications for the planned verification, validation, and confidence-assessment practices; (4) specifications for the planned evidence information management system; and (5) a path forward for the incremental implementation of this V&V plan.

  1. Surgical clips for position verification and correction of non-rigid breast tissue in simultaneously integrated boost (SIB) treatments

    International Nuclear Information System (INIS)

    Penninkhof, Joan; Quint, Sandra; Boer, Hans de; Mens, Jan Willem; Heijmen, Ben; Dirkx, Maarten

    2009-01-01

    Background and purpose: The aim of this study is to investigate whether surgical clips in the lumpectomy cavity are representative for position verification of both the tumour bed and the whole breast in simultaneously integrated boost (SIB) treatments. Materials and methods: For a group of 30 patients treated with a SIB technique, kV and MV planar images were acquired throughout the course of the fractionated treatment. The 3D set-up error for the tumour bed was derived by matching the surgical clips (3-8 per patient) in two almost orthogonal planar kV images. By projecting the 3D set-up error derived from the planar kV images onto the (u, v)-plane of the tangential beams, the correlation with the 2D set-up error for the whole breast, derived from the MV EPID images, was determined. The stability of relative clip positions during the fractionated treatment was investigated. In addition, for a subgroup of 15 patients, the impact of breathing was determined from fluoroscopic movies acquired at the linac. Results: The clip configurations were stable over the course of radiotherapy, showing an inter-fraction variation (1 SD) of 0.5 mm on average. Between the start and the end of the treatment, the mean distance between the clips and their center of mass was reduced by 0.9 mm. A decrease larger than 2 mm was observed in eight patients (17 clips). The peak-to-peak excursion of the clips due to breathing was generally less than 2.5 mm in all directions. The population averages of the difference (±1 SD) between kV and MV matches in the (u, v)-plane were 0.2 ± 1.8 mm and 0.9 ± 1.5 mm, respectively. In 30% of the patients, time trends larger than 3 mm were present over the course of the treatment in either or both of the kV and MV match results. Application of the NAL protocol based on the clips reduced the population mean systematic error to less than 2 mm in all directions, both for the tumour bed and the whole breast. Due to the observed time trends, these systematic errors can
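
    The NAL (no action level) protocol referred to above estimates each patient's systematic set-up error as the mean of the errors measured in the first few fractions and then applies that mean as a fixed correction for the remaining fractions. The sketch below shows the bookkeeping; the number of measured fractions and the daily set-up errors are invented for illustration.

```python
import numpy as np

def nal_correction(measured_errors, n_measure=3):
    """No Action Level protocol: average the set-up errors of the first
    `n_measure` fractions and apply the result as a fixed correction
    (couch shift) for all subsequent fractions."""
    measured = np.asarray(measured_errors[:n_measure], dtype=float)
    return measured.mean(axis=0)

# Hypothetical daily clip-based set-up errors (LR, SI, AP) in mm for one patient.
errors = np.array([
    [2.1, -1.0, 0.5],
    [1.7, -0.6, 1.1],
    [2.4, -1.4, 0.2],
    [1.9, -0.9, 0.8],   # later fractions, not used to derive the correction
])

correction = nal_correction(errors, n_measure=3)
print("Couch correction applied from fraction 4 onward [mm]:", -correction)

# Residual systematic error in the remaining fractions (ideally close to zero):
residual = errors[3:] - correction
print("Residual error in the remaining fractions [mm]:", residual)
```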

  2. Integrated Virtual Environment Test Concepts and Objectives

    National Research Council Canada - National Science Library

    Tackett, Gregory

    2001-01-01

    ...), a series of integration and verification tests were conducted to provide development milestones for the simulation architecture and tools that would be needed for the full-up live/virtual field experiment...

  3. How to Find a Bug in Ten Thousand Lines Transport Solver? Outline of Experiences from AN Advection-Diffusion Code Verification

    Science.gov (United States)

    Zamani, K.; Bombardelli, F.

    2011-12-01

    Almost all natural phenomena on Earth are highly nonlinear. Even simplifications of the equations describing nature usually end up being nonlinear partial differential equations. The transport (advection-diffusion-reaction, ADR) equation is a pivotal equation in the atmospheric sciences and in water quality. This nonlinear equation needs to be solved numerically for practical purposes, so academics and engineers rely heavily on numerical codes. Thus, numerical codes require verification before they are used for applications in science and engineering. Model verification is a mathematical procedure whereby a numerical code is checked to assure that the governing equation is properly solved, as described in the design document. CFD verification is not a straightforward and well-defined process; only a complete test suite can uncover all the limitations and bugs, and results need to be assessed to distinguish between bug-induced defects and the innate limitations of a numerical scheme. As Roache (2009) said, numerical verification is a state-of-the-art procedure; sometimes novel tricks work out. This study conveys a synopsis of the experience we gained during a comprehensive verification process carried out for a transport solver. A test suite was designed, including unit tests and algorithmic tests, with tests layered in complexity along several dimensions from simple to complex. Acceptance criteria were defined for the desirable capabilities of the transport code, such as order of accuracy, mass conservation, handling of stiff source terms, spurious oscillation, and initial shape preservation. At the beginning, the mesh convergence study, which is the main craft of verification, was performed. To that end, analytical solutions of the ADR equation were gathered, and a new solution was also derived. In more general cases, the lack of an analytical solution can be overcome through Richardson extrapolation and manufactured solutions. Then, two bugs which were concealed during the mesh convergence
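
    The mesh convergence study mentioned above usually ends in an observed order of accuracy, obtained from the errors on two (or three) successively refined grids. The sketch below shows the standard two-grid formula; the error values stand in for real solver output and are invented for illustration.

```python
import math

def observed_order(error_coarse, error_fine, refinement_ratio=2.0):
    """Observed order of accuracy from errors on two grids refined by a fixed ratio:
        p = ln(e_coarse / e_fine) / ln(r)."""
    return math.log(error_coarse / error_fine) / math.log(refinement_ratio)

# Toy stand-in for solver output: L2 errors against an analytical (or manufactured)
# solution on grids with spacing h and h/2 (illustrative numbers).
e_h = 4.1e-3
e_h2 = 1.05e-3

p = observed_order(e_h, e_h2)
print(f"Observed order of accuracy: p = {p:.2f} (compare with the formal order of the scheme)")
```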

  4. Experimental verification of displacement control on integrated ionic polymer-metal composite actuators with stochastic on/off controller

    Science.gov (United States)

    Kimura, Keishiro; Kamamichi, Norihiro

    2017-04-01

    An ionic polymer-metal composite (IPMC) actuator is a polymer-based soft actuator. It is produced by chemically plating gold or platinum on both surfaces of a perfluorosulfonic acid membrane, which is known as an ion-exchange membrane. It can be driven by a simple circuit and generates a large deformation under a low applied voltage (0.5-3 V). However, individual differences and changes in characteristics due to environmental conditions have to be considered to realize stable or precise control. To address these problems, we applied a stochastic ON/OFF controller to an integrated IPMC actuator with parallel connections. The controller consists of a central controller and distributed controllers. The central controller broadcasts a control signal, such as an error signal, uniformly to the distributed controllers. The distributed controllers switch their ON/OFF states stochastically based on the broadcast signal. The central controller does not measure the state of each IPMC actuator; the control signal is calculated from the output signal of the integrated actuator and the reference signal. The validity of the applied method was investigated through numerical simulations and experiments.
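
    A minimal sketch of this kind of broadcast stochastic ON/OFF scheme is given below: the central controller computes one error-based signal from the aggregate output, and each distributed unit independently draws its ON/OFF state with a probability derived from that signal. The probability law, unit gains, and plant model are invented for illustration and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

n_units = 20                 # number of IPMC elements connected in parallel (illustrative)
unit_gain = 1.0 / n_units    # contribution of one ON unit to the total displacement (illustrative)
reference = 0.7              # desired normalized displacement
output = 0.0                 # aggregate (measured) displacement
k_p = 2.0                    # proportional gain of the central controller (illustrative)

for step in range(200):
    # Central controller: broadcast a single signal based on the aggregate error only.
    error = reference - output
    p_on = np.clip(0.5 + k_p * error, 0.0, 1.0)   # ON probability sent to every unit

    # Distributed controllers: each unit switches ON/OFF independently and stochastically.
    states = rng.random(n_units) < p_on

    # Very crude first-order plant: displacement relaxes toward the fraction of ON units.
    target = unit_gain * states.sum()
    output += 0.2 * (target - output)

print(f"reference = {reference:.2f}, achieved output = {output:.2f}")
```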

  5. BDI: the Cadarache data bank for LMFBR integral experiment data

    International Nuclear Information System (INIS)

    Rimpault, G.; Reynaud, G.

    1986-09-01

    The Integral Data Bank (BDI) is part of the procedure used to create the so-called neutronic formulaire, with which every design calculation is performed with an associated uncertainty. A modern way to store the integral data has been set up in order to handle each experimental programme and recalculate it easily with a standard procedure. Direct access to the data provides an automatic way to obtain the calculation/experiment discrepancies associated with a particular data base. The BDI has proved to be fully operational and has been used with the new nuclear data file JEF1. In the present version of the BDI, more than 140 experiments (critical mass, spectrum indices, buckling, etc.), from both MASURCA and SNEAK critical experiments, are documented and stored in an easy-to-retrieve form. Also included are irradiation experiments in PHENIX and the STEK fission-product related experiments. Future development plans concern reactivity measurements in critical assemblies, irradiation experiments and start-up experiments of SUPER PHENIX, and ZEBRA critical experiments

  6. Tablets in K-12 Education: Integrated Experiences and Implications

    Science.gov (United States)

    An, Heejung, Ed.; Alon, Sandra, Ed.; Fuentes, David, Ed.

    2015-01-01

    The inclusion of new and emerging technologies in the education sector has been a topic of interest to researchers, educators, and software developers alike in recent years. Utilizing the proper tools in a classroom setting is a critical factor in student success. "Tablets in K-12 Education: Integrated Experiences and Implications"…

  7. Comment for nuclear data from the FNS integral experiments

    International Nuclear Information System (INIS)

    Maekawa, Hiroshi

    1983-01-01

    Among the integral experiments carried out during the past year at FNS, the following three experimental results and their analyses are described: 1) tritium production-rate distribution in a Li2O-C assembly, 2) angle-dependent neutron leakage spectra from Li2O slab assemblies, 3) induced activity of Type 316 stainless steel. (author)

  8. Teaching with Videogames: How Experience Impacts Classroom Integration

    Science.gov (United States)

    Bell, Amanda; Gresalfi, Melissa

    2017-01-01

    Digital games have demonstrated great potential for supporting students' learning across disciplines. But integrating games into instruction is challenging and requires teachers to shift instructional practices. One factor that contributes to the successful use of games in a classroom is teachers' experience implementing the technologies. But how…

  9. International Students' Experiences of Integrating into the Workforce

    Science.gov (United States)

    Nunes, Sarah; Arthur, Nancy

    2013-01-01

    This study explored the integration experiences of 16 international students entering the Canadian workforce using a semistructured interview and constant comparison method. The international students were pursuing immigration to Canada, despite unmet job prospects. Students recommended that employers refrain from discriminating against students…

  10. Colleges' Experiences: Integrating Support Services for Military Veterans

    Science.gov (United States)

    Karp, Melinda Mechur; Klempin, Serena

    2017-01-01

    To improve the educational experiences and outcomes of student veterans, the Kisco Foundation developed the Kohlberg Prize in 2015. Two cohorts of colleges were awarded competitive grants to enhance their veterans services. This piece examines the process of creating integrated services for student veterans through the institutionalization of…

  11. Experiences of technology integration in home care nursing.

    Science.gov (United States)

    Johnson, K A; Valdez, R S; Casper, G R; Kossman, S P; Carayon, P; Or, C K L; Burke, L J; Brennan, P F

    2008-11-06

    The infusion of health care technologies into the home leads to substantial changes in the nature of work for home care nurses and their patients. Nurses and nursing practice must change to capitalize on these innovations. As part of a randomized field experiment evaluating web-based support for home care of patients with chronic heart disease, we engaged nine nurses in a dialogue about their experience integrating this modification of care delivery into their practice. They shared their perceptions of the work they needed to do and their perceptions and expectations for patients and themselves in using technologies to promote and manage self-care. We document three overarching themes that identify preexisting factors that influenced integration or represent the consequences of technology integration into home care: doing tasks differently, making accommodations in the home for devices and computers, and being mindful of existing expectations and skills of both nurses and patients.

  12. Patient-specific IMRT verification using independent fluence-based dose calculation software: experimental benchmarking and initial clinical experience

    International Nuclear Information System (INIS)

    Georg, Dietmar; Stock, Markus; Kroupa, Bernhard; Olofsson, Joergen; Nyholm, Tufve; Ahnesjoe, Anders; Karlsson, Mikael

    2007-01-01

    Experimental methods are commonly used for patient-specific intensity-modulated radiotherapy (IMRT) verification. The purpose of this study was to investigate the accuracy and performance of independent dose calculation software (denoted as 'MUV' (monitor unit verification)) for patient-specific quality assurance (QA). 52 patients receiving step-and-shoot IMRT were considered. IMRT plans were recalculated by the treatment planning systems (TPS) in a dedicated QA phantom, in which an experimental 1D and 2D verification (0.3 cm³ ionization chamber; films) was performed. Additionally, an independent dose calculation was performed. The fluence-based algorithm of MUV accounts for collimator transmission, rounded leaf ends, tongue-and-groove effect, backscatter to the monitor chamber and scatter from the flattening filter. The dose calculation utilizes a pencil beam model based on a beam quality index. DICOM RT files from patient plans, exported from the TPS, were directly used as patient-specific input data in MUV. For composite IMRT plans, average deviations in the high dose region between ionization chamber measurements and point dose calculations performed with the TPS and MUV were 1.6 ± 1.2% and 0.5 ± 1.1% (1 S.D.). The dose deviations between MUV and TPS slightly depended on the distance from the isocentre position. For individual intensity-modulated beams (total 367), an average deviation of 1.1 ± 2.9% was determined between calculations performed with the TPS and with MUV, with maximum deviations up to 14%. However, absolute dose deviations were mostly less than 3 cGy. Based on the current results, we aim to apply a confidence limit of 3% (with respect to the prescribed dose) or 6 cGy for routine IMRT verification. For off-axis points at distances larger than 5 cm and for low dose regions, we consider 5% dose deviation or 10 cGy acceptable. The time needed for an independent calculation compares very favourably with the net time for an experimental approach
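    The dual acceptance criterion quoted in this record (3% of the prescribed dose or 6 cGy for routine verification) lends itself to a one-line check. The sketch below is only an illustration with hypothetical dose values; it is not part of the MUV software.

```python
def within_tolerance(measured_cGy, calculated_cGy, prescribed_cGy,
                     pct_limit=3.0, abs_limit_cGy=6.0):
    """Accept a verification point if the deviation is within either the
    percentage limit (relative to the prescribed dose) or the absolute limit,
    mirroring the 3% / 6 cGy confidence limit quoted in the record."""
    diff = abs(measured_cGy - calculated_cGy)
    return diff <= abs_limit_cGy or diff <= pct_limit / 100.0 * prescribed_cGy

print(within_tolerance(198.5, 203.0, 200.0))  # hypothetical point-dose values
```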

  13. Empirical Tests and Preliminary Results with the Krakatoa Tool for Full Static Program Verification

    Directory of Open Access Journals (Sweden)

    Ramírez-de León Edgar Darío

    2014-10-01

    Full Text Available XJML (Ramírez et al., 2012) is a modular external platform for Verification and Validation of Java classes using the Java Modeling Language (JML) through contracts written in XML. One problem faced in the XJML development was how to integrate Full Static Program Verification (FSPV). This paper presents the experiments and results that allowed us to define what tool to embed in XJML to execute FSPV.

  14. Verification of operation of the actuator control system using the integration the B&R Automation Studio software with a virtual model of the actuator system

    Science.gov (United States)

    Herbuś, K.; Ociepka, P.

    2017-08-01

    This work analyses the sequential control system of a machine for separating and grouping workpieces for processing. The problem considered concerns verification of the operation of the actuator system of an electro-pneumatic control system equipped with a PLC controller; what is verified is the behaviour of the actuators with respect to the logic relationships assumed in the control system. The actuators of the considered control system are three linear-motion drives (pneumatic cylinders), and the logical structure of the control system's operation is based on a signal flow graph. This logical structure of the electro-pneumatic control system was implemented in the Automation Studio software from B&R, which is used to create programs for PLC controllers. Next, a model of the machine's actuator system was created in the FluidSIM software. To verify the PLC program by simulating the operation of this model, the two programs were integrated using an OPC server as the data-exchange tool.
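    The verification idea in this record, exercising the PLC sequence against a simulated actuator model through a data-exchange layer, can be mimicked in a few lines. The sketch below is a hypothetical pure-Python co-simulation loop standing in for the Automation Studio/FluidSIM/OPC tool chain: a controller step function and an idealized three-cylinder model exchange signals once per scan.

```python
def controller(sensors, step):
    """Sequential logic under test: extend cylinders A, B and C in order."""
    if step == 0 and not sensors["A_extended"]:
        return {"A": True, "B": False, "C": False}, 0
    if sensors["A_extended"] and not sensors["B_extended"]:
        return {"A": True, "B": True, "C": False}, 1
    if sensors["B_extended"] and not sensors["C_extended"]:
        return {"A": True, "B": True, "C": True}, 2
    return {"A": True, "B": True, "C": True}, 3

def plant(outputs):
    """Idealized pneumatic cylinders: an energized valve means an extended rod."""
    return {f"{c}_extended": outputs[c] for c in "ABC"}

sensors = {f"{c}_extended": False for c in "ABC"}
step = 0
for _ in range(5):                  # one loop iteration = one PLC scan + model update
    outputs, step = controller(sensors, step)
    sensors = plant(outputs)
assert step == 3 and all(sensors.values())   # sequence A -> B -> C completed
print("sequence verified")
```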

  15. Symposium on international safeguards: Verification and nuclear material security. Book of extended synopses

    International Nuclear Information System (INIS)

    2001-01-01

    The symposium covered the topics related to international safeguards, verification and nuclear materials security, namely: verification and nuclear material security; the NPT regime: progress and promises; the Additional Protocol as an important tool for the strengthening of the safeguards system; the nuclear threat and the nuclear threat initiative. Eighteen sessions dealt with the following subjects: the evolution of IAEA safeguards (including strengthened safeguards, present and future challenges; verification of correctness and completeness of initial declarations; implementation of the Additional Protocol, progress and experience; security of material; nuclear disarmament and ongoing monitoring and verification in Iraq; evolution of IAEA verification in relation to nuclear disarmament); integrated safeguards; physical protection and illicit trafficking; destructive analysis for safeguards; the additional protocol; innovative safeguards approaches; IAEA verification and nuclear disarmament; environmental sampling; safeguards experience; safeguards equipment; panel discussion on development of state systems of accountancy and control; information analysis in the strengthened safeguard system; satellite imagery and remote monitoring; emerging IAEA safeguards issues; verification technology for nuclear disarmament; the IAEA and the future of nuclear verification and security

  16. Symposium on international safeguards: Verification and nuclear material security. Book of extended synopses

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-07-01

    The symposium covered the topics related to international safeguards, verification and nuclear materials security, namely: verification and nuclear material security; the NPT regime: progress and promises; the Additional Protocol as an important tool for the strengthening of the safeguards system; the nuclear threat and the nuclear threat initiative. Eighteen sessions dealt with the following subjects: the evolution of IAEA safeguards (including strengthened safeguards, present and future challenges; verification of correctness and completeness of initial declarations; implementation of the Additional Protocol, progress and experience; security of material; nuclear disarmament and ongoing monitoring and verification in Iraq; evolution of IAEA verification in relation to nuclear disarmament); integrated safeguards; physical protection and illicit trafficking; destructive analysis for safeguards; the additional protocol; innovative safeguards approaches; IAEA verification and nuclear disarmament; environmental sampling; safeguards experience; safeguards equipment; panel discussion on development of state systems of accountancy and control; information analysis in the strengthened safeguard system; satellite imagery and remote monitoring; emerging IAEA safeguards issues; verification technology for nuclear disarmament; the IAEA and the future of nuclear verification and security.

  17. Symposium on international safeguards: Verification and nuclear material security. Book of extended synopses. Addendum

    International Nuclear Information System (INIS)

    2001-01-01

    The symposium covered the topics related to international safeguards, verification and nuclear materials security, namely: verification and nuclear material security; the NPT regime: progress and promises; the Additional Protocol as an important tool for the strengthening of the safeguards system; the nuclear threat and the nuclear threat initiative. Eighteen sessions dealt with the following subjects: the evolution of IAEA safeguards (including strengthened safeguards, present and future challenges; verification of correctness and completeness of initial declarations; implementation of the Additional Protocol, progress and experience; security of material; nuclear disarmament and ongoing monitoring and verification in Iraq; evolution of IAEA verification in relation to nuclear disarmament); integrated safeguards; physical protection and illicit trafficking; destructive analysis for safeguards; the additional protocol; innovative safeguards approaches; IAEA verification and nuclear disarmament; environmental sampling; safeguards experience; safeguards equipment; panel discussion on development of state systems of accountancy and control; information analysis in the strengthened safeguard system; satellite imagery and remote monitoring; emerging IAEA safeguards issues; verification technology for nuclear disarmament; the IAEA and the future of nuclear verification and security

  18. Seismic verification methods for structures and equipment of VVER-type and RBMK-type NPPs (summary of experiences)

    International Nuclear Information System (INIS)

    Masopust, R.

    2003-01-01

    The main verification methods for structures and equipment of already existing VVER-type and RBMK-type NPPs are briefly described. The following aspects are discussed: fundamental seismic safety assessment principles for VVER/RBMK-type NPPs (seismic safety assessment procedure, typical work plan for seismic safety assessment of existing NPPs, SMA (HCLPF) calculations, modified GIP (GIP-VVER) procedure, similarity of VVER/RBMK equipment to that included in the SQUG databases, and seismic interactions)
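    For the SMA (HCLPF) calculations mentioned in this record, a commonly used seismic-margin expression relates the HCLPF capacity to the median capacity and the randomness and uncertainty log-standard deviations, HCLPF = A_m·exp[-1.65(β_R + β_U)]. The sketch below only illustrates that arithmetic; the fragility parameters are hypothetical.

```python
import math

def hclpf_capacity(median_capacity_g, beta_r, beta_u):
    """HCLPF (High Confidence of Low Probability of Failure) capacity from
    the median capacity A_m and the log-standard deviations for randomness
    and uncertainty: A_HCLPF = A_m * exp(-1.65 * (beta_r + beta_u))."""
    return median_capacity_g * math.exp(-1.65 * (beta_r + beta_u))

# Hypothetical fragility parameters for one equipment item (capacities in g):
# median capacity 1.2 g, beta_R = 0.25, beta_U = 0.35.
print(round(hclpf_capacity(1.2, 0.25, 0.35), 3))   # ~0.45 g
```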

  19. Simulation environment based on the Universal Verification Methodology

    International Nuclear Information System (INIS)

    Fiergolski, A.

    2017-01-01

    The Universal Verification Methodology (UVM) is a standardized approach to verifying integrated circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in design verification. The CDV flow differs from the traditional directed-testing approach. With CDV, a testbench developer starts with a structured plan by setting the verification goals. Those goals are then targeted by the developed testbench, which generates legal stimuli and sends them to the device under test (DUT). Progress is measured by coverage monitors added to the simulation environment; in this way, non-exercised functionality can be identified. In addition, scoreboards flag undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new workflow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC design; (2) the C3PD 180 nm HV-CMOS active sensor ASIC design; (3) the FPGA-based DAQ system of the CLICpix chip. Based on the experience from these projects, this paper briefly introduces UVM and presents a set of tips and advice applicable at different stages of the verification process cycle.

  20. Trends in integrated circuit design for particle physics experiments

    International Nuclear Information System (INIS)

    Atkin, E V

    2017-01-01

    Integrated circuits are among the key complex units available to designers of multichannel detector setups. A number of factors make Application Specific Integrated Circuits (ASICs) valuable for particle physics and astrophysics experiments; the most important are integration scale, low power dissipation, and radiation tolerance. To make future experiments at the intensity, cosmic, and energy frontiers possible, today's ASICs should provide a new level of functionality under a new set of constraints and trade-offs, such as low-noise, high-dynamic-range amplification and pulse shaping, high-speed waveform sampling, low-power digitization, fast digital data processing, serialization, and data transmission. All integrated circuits needed for physics instrumentation should be radiation tolerant at a previously unreached level (hundreds of Mrad) of total ionizing dose and should allow compact, almost 3D assemblies. The paper is based on an analysis of the literature and presents an overview of the state of the art and trends in present-day chip design, drawing partly on our own ASIC laboratory experience. This points to the next stage of using micro- and nanoelectronics in physics instrumentation. (paper)

  1. Experience Supporting the Integration of LHC Experiments Software Framework with the LCG Middleware

    CERN Document Server

    Santinelli, Roberto

    2006-01-01

    The LHC experiments are currently preparing for data acquisition in 2007 and, because of the large amount of computing and storage resources required, they have decided to embrace the grid paradigm. The LHC Computing Grid project (LCG) provides and operates a computing infrastructure suitable for data handling, Monte Carlo production and analysis. While LCG offers a set of high-level services intended to be generic enough to accommodate the needs of different Virtual Organizations, the LHC experiments' software frameworks and applications are very specific and focused on their computing and data models. The LCG Experiment Integration Support (EIS) team works in close contact with the experiments, the middleware developers and the LCG certification and operations teams to integrate the underlying grid middleware with the experiment-specific components. Its strategic position between the experiments and the middleware suppliers allows the EIS team to play a key role at the communications level between the customers and the service provi...

  2. HDL to verification logic translator

    Science.gov (United States)

    Gambles, J. W.; Windley, P. J.

    1992-01-01

    The ever higher number of transistors possible in VLSI circuits compounds the difficulty of ensuring correct designs. As the number of test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work, while those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designer's confidence in the correctness of higher-level behavioral models.
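    As a rough flavour of what such a translation produces (this is not the actual translator described in the record), the sketch below renders a toy gate-level netlist as a relational definition in the style commonly used for hardware verification in HOL, where each gate is a relation over its signals and internal wires are existentially quantified; the relation names are hypothetical.

```python
# Toy netlist-to-HOL-style translation; gate relation names are hypothetical.
GATE_RELATIONS = {"AND": "AND_spec", "OR": "OR_spec", "NOT": "NOT_spec"}

def netlist_to_hol(name, ports, internal_wires, gates):
    """Render a structural netlist as a conjunction of gate relations with
    internal wires existentially quantified, in HOL-like concrete syntax."""
    body = " /\\ ".join(
        f"{GATE_RELATIONS[kind]}({', '.join(signals)})" for kind, signals in gates)
    quantifier = f"?{' '.join(internal_wires)}. " if internal_wires else ""
    return f"{name}_imp({', '.join(ports)}) = {quantifier}{body}"

# Half adder built from AND/OR/NOT gates (sum s, carry c).
print(netlist_to_hol(
    "half_adder", ["a", "b", "s", "c"], ["w", "nc"],
    [("AND", ["a", "b", "c"]),
     ("OR",  ["a", "b", "w"]),
     ("NOT", ["c", "nc"]),
     ("AND", ["w", "nc", "s"])]))
```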

  3. Technology choices for the Integrated Beam Experiment (IBX)

    Energy Technology Data Exchange (ETDEWEB)

    Leitner, M.A.; Celata, C.M.; Lee, E.P.; Sabbi, G.; Waldron, W.L.; Barnard, J.J.

    2002-10-31

    Over the next three years the research program of the Heavy Ion Fusion Virtual National Laboratory (HIF-VNL), a collaboration among LBNL, LLNL, and PPPL, is focused on separate scientific experiments in the injection, transport and focusing of intense heavy ion beams at currents from 100 mA to 1 A. As a next major step in the HIF-VNL program, we aim for a complete 'source-to-target' experiment, the Integrated Beam Experiment (IBX). By combining the experience gained in the current separate beam experiments, IBX would allow the integrated scientific study of the evolution of a single heavy ion beam at high current (~1 A) through all sections of a possible heavy ion fusion accelerator: the injection, acceleration, compression, and beam focusing. This paper describes the main parameters and technology choices of the planned IBX experiment. IBX will accelerate singly charged potassium or argon ion beams up to 10 MeV final energy and a longitudinal beam compression ratio of 10, resulting in a beam current at target of more than 10 Amperes. Different accelerator cell design options are described in detail: induction cores incorporating either room-temperature pulsed focusing magnets or superconducting magnets.

  4. Induction Accelerator Technology Choices for the Integrated Beam Experiment (IBX)

    International Nuclear Information System (INIS)

    Leitner, M.A.; Celata, C.M.; Lee, E.P.; Logan, B.G.; Sabbi, G.; Waldron, W.L.; Barnard, J.J.

    2003-01-01

    Over the next three years the research program of the Heavy Ion Fusion Virtual National Laboratory (HIF-VNL), a collaboration among LBNL, LLNL, and PPPL, is focused on separate scientific experiments in the injection, transport and focusing of intense heavy ion beams at currents from 100 mA to 1 A. As a next major step in the HIF-VNL program, we aim for a complete 'source-to-target' experiment, the Integrated Beam Experiment (IBX). By combining the experience gained in the current separate beam experiments, IBX would allow the integrated scientific study of the evolution of a single heavy ion beam at high current (∼1 A) through all sections of a possible heavy ion fusion accelerator: the injection, acceleration, compression, and beam focusing. This paper describes the main parameters and technology choices of the planned IBX experiment. IBX will accelerate singly charged potassium or argon ion beams up to 10 MeV final energy and a longitudinal beam compression ratio of 10, resulting in a beam current at target of more than 10 Amperes. Different accelerator cell design options are described in detail: induction cores incorporating either room-temperature pulsed focusing magnets or superconducting magnets

  5. OECD/NEA data bank scientific and integral experiments databases in support of knowledge preservation and transfer

    International Nuclear Information System (INIS)

    Sartori, E.; Kodeli, I.; Mompean, F.J.; Briggs, J.B.; Gado, J.; Hasegawa, A.; D'hondt, P.; Wiesenack, W.; Zaetta, A.

    2004-01-01

    The OECD/Nuclear Energy Data Bank was established by its member countries as an institution to allow effective sharing of knowledge and its basic underlying information and data in key areas of nuclear science and technology. The activities as regards preserving and transferring knowledge consist of the: 1) Acquisition of basic nuclear data, computer codes and experimental system data needed over a wide range of nuclear and radiation applications; 2) Independent verification and validation of these data using quality assurance methods, adding value through international benchmark exercises, workshops and meetings and by issuing relevant reports with conclusions and recommendations, as well as by organising training courses to ensure their qualified and competent use; 3) Dissemination of the different products to authorised establishments in member countries and collecting and integrating user feedback. Of particular importance has been the establishment of basic and integral experiments databases and the methodology developed with the aim of knowledge preservation and transfer. Databases established thus far include: 1) IRPhE - International Reactor Physics Experimental Benchmarks Evaluations, 2) SINBAD - a radiation shielding experiments database (nuclear reactors, fusion neutronics and accelerators), 3) IFPE - International Fuel Performance Benchmark Experiments Database, 4) TDB - The Thermochemical Database Project, 5) ICSBE - International Nuclear Criticality Safety Benchmark Evaluations, 6) CCVM - CSNI Code Validation Matrix of Thermal-hydraulic Codes for LWR LOCA and Transients. This paper will concentrate on knowledge preservation and transfer concepts and methods related to some of the integral experiments and TDB. (author)

  6. A research on the verification of models used in the computational codes and the uncertainty reduction method for the containment integrity evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Moo Hwan; Seo, Kyoung Woo [POSTECH, Pohang (Korea, Republic of)

    2001-03-15

    In the probabilistic approach, the calculated CCFPs of all the scenarios were zero, meaning that for all the accident scenarios the maximum pressure load induced by DCH was expected to be lower than the containment failure pressure obtained from the fragility curve. Thus, it can be stated that the KSNP containment is robust to the DCH threat. The uncertainty of the computer codes used in the two (deterministic and probabilistic) approaches was reduced through sensitivity tests and through the verification and comparison of the DCH models in each code. Overall, this research evaluated the DCH issue comprehensively and set out an accurate methodology for assessing the containment integrity of operating PWRs in Korea.

  7. Experience and Strategy of Biodiversity Data Integration in Taiwan

    Directory of Open Access Journals (Sweden)

    K T Shao

    2013-02-01

    Full Text Available The integration of Taiwan's biodiversity databases started in 2001, the same year that Taiwan joined GBIF as an associate participant. Taiwan, hence, embarked on a decade of integrating biodiversity data. Under the support of NSC and COA, the database and websites of TaiBIF, TaiBNET (TaiCOL, TaiBOL, and TaiEOL have been established separately and collaborate with the GBIF, COL, BOL, and EOL respectively. A cross-agency committee was thus established in Academia Sinica in 2008 to formulate policies on data collection and integration as well as the mechanism to make data available to the public. Any commissioned project will hereafter be asked to include these policy requirements in its contract. So far, TaiBIF has gained recognition in Taiwan and abroad for its efforts over the past several years. It can provide its experience and insights for others to reference or replicate.

  8. Integrated multiscale biomaterials experiment and modelling: a perspective

    Science.gov (United States)

    Buehler, Markus J.; Genin, Guy M.

    2016-01-01

    Advances in multiscale models and computational power have enabled a broad toolset to predict how molecules, cells, tissues and organs behave and develop. A key theme in biological systems is the emergence of macroscale behaviour from collective behaviours across a range of length and timescales, and a key element of these models is therefore hierarchical simulation. However, this predictive capacity has far outstripped our ability to validate predictions experimentally, particularly when multiple hierarchical levels are involved. The state of the art represents careful integration of multiscale experiment and modelling, and yields not only validation, but also insights into deformation and relaxation mechanisms across scales. We present here a sampling of key results that highlight both challenges and opportunities for integrated multiscale experiment and modelling in biological systems. PMID:28981126

  9. Software for the Integration of Multiomics Experiments in Bioconductor.

    Science.gov (United States)

    Ramos, Marcel; Schiffer, Lucas; Re, Angela; Azhar, Rimsha; Basunia, Azfar; Rodriguez, Carmen; Chan, Tiffany; Chapman, Phil; Davis, Sean R; Gomez-Cabrero, David; Culhane, Aedin C; Haibe-Kains, Benjamin; Hansen, Kasper D; Kodali, Hanish; Louis, Marie S; Mer, Arvind S; Riester, Markus; Morgan, Martin; Carey, Vince; Waldron, Levi

    2017-11-01

    Multiomics experiments are increasingly commonplace in biomedical research and add layers of complexity to experimental design, data integration, and analysis. R and Bioconductor provide a generic framework for statistical analysis and visualization, as well as specialized data classes for a variety of high-throughput data types, but methods are lacking for integrative analysis of multiomics experiments. The MultiAssayExperiment software package, implemented in R and leveraging Bioconductor software and design principles, provides for the coordinated representation of, storage of, and operation on multiple diverse genomics data. We provide the unrestricted multiple 'omics data for each cancer tissue in The Cancer Genome Atlas as ready-to-analyze MultiAssayExperiment objects and demonstrate in these and other datasets how the software simplifies data representation, statistical analysis, and visualization. The MultiAssayExperiment Bioconductor package reduces major obstacles to efficient, scalable, and reproducible statistical analysis of multiomics data and enhances data science applications of multiple omics datasets. Cancer Res; 77(21); e39-42. ©2017 AACR . ©2017 American Association for Cancer Research.

  10. Experiences of giving and receiving care in traumatic brain injury: An integrative review.

    Science.gov (United States)

    Kivunja, Stephen; River, Jo; Gullick, Janice

    2018-04-01

    To synthesise the literature on the experiences of giving or receiving care for traumatic brain injury for people with traumatic brain injury, their family members and nurses in hospital and rehabilitation settings. Traumatic brain injury represents a major source of physical, social and economic burden. In the hospital setting, people with traumatic brain injury feel excluded from decision-making processes and perceive impatient care. Families describe inadequate information and support for psychological distress. Nurses find the care of people with traumatic brain injury challenging particularly when experiencing heavy workloads. To date, a contemporary synthesis of the literature on people with traumatic brain injury, family and nurse experiences of traumatic brain injury care has not been conducted. Integrative literature review. A systematic search strategy guided by the PRISMA statement was conducted in CINAHL, PubMed, Proquest, EMBASE and Google Scholar. Whittemore and Knafl's (Journal of Advanced Nursing, 52, 2005, 546) integrative review framework guided data reduction, data display, data comparison and conclusion verification. Across the three participant categories (people with traumatic brain injury/family members/nurses) and sixteen subcategories, six cross-cutting themes emerged: seeking personhood, navigating challenging behaviour, valuing skills and competence, struggling with changed family responsibilities, maintaining productive partnerships and reflecting on workplace culture. Traumatic brain injury creates changes in physical, cognitive and emotional function that challenge known ways of being in the world for people. This alters relationship dynamics within families and requires a specific skill set among nurses. Recommendations include the following: (i) formal inclusion of people with traumatic brain injury and families in care planning, (ii) routine risk screening for falls and challenging behaviour to ensure that controls are based on

  11. Cuban experience in verification of the execution of the safety requirements during the transport of radioactive materials

    International Nuclear Information System (INIS)

    Quevedo Garcia, J.R.; Lopez Forteza, Y.

    2001-01-01

    The Cuban Regulatory Authority has paid special attention to verifying the fulfilment of safety requirements during the transport of radioactive material in the country. To this end, the Authority has followed a consistent policy based on requirements supplementary to those of the legal framework established in 1987 for the transport of radioactive substances. The paper presents the technical criteria taken into account when establishing this policy, characterizes the current situation, evaluates the results obtained against the pursued objectives, and sets out the essential aspects to be considered for the further development of the adopted policy. (author)

  12. Verification and validation benchmarks.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of
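    One of the recommendations in this record, code verification benchmarks based on manufactured solutions, can be illustrated briefly. The sketch below is an illustration rather than material from the paper: a manufactured solution for a 1-D advection-diffusion equation is chosen and the matching source term is derived symbolically; the code under test is then run with that source added and its error against the manufactured solution is measured on refined meshes. The coefficients and solution form are hypothetical.

```python
import sympy as sp

x, t = sp.symbols("x t")
a, D = sp.Rational(1, 2), sp.Rational(1, 100)    # hypothetical coefficients

# Chosen (manufactured) solution and the source term q that makes it exact
# for the model equation u_t + a*u_x - D*u_xx = q.
u_m = sp.exp(-t) * sp.sin(sp.pi * x)
q = sp.diff(u_m, t) + a * sp.diff(u_m, x) - D * sp.diff(u_m, x, 2)
print(sp.simplify(q))
```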

  13. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  14. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William L.; Trucano, Timothy G.

    2008-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  15. A research on verification of the models in the CONTAIN Code and the uncertainty reduction method for containment integrity evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Moo Hwan; Seo, Kyoung Woo [Pohang University of Science and Technology, Pohang (Korea, Republic of)

    2000-03-15

    The final goal of this research is to evaluate the overall results of the DCH issue and to set out an accurate methodology for assessing the containment integrity of operating PWRs in Korea. The research aims to establish a methodology for a comprehensive resolution of the DCH issue for KSNPP and to produce guidance on the DCH issue for containment integrity to be used in the design of nuclear power plants.

  16. Verification of consumers' experiences and perceptions of genetic discrimination and its impact on utilization of genetic testing.

    Science.gov (United States)

    Barlow-Stewart, Kristine; Taylor, Sandra D; Treloar, Susan A; Stranger, Mark; Otlowski, Margaret

    2009-03-01

    To undertake a systematic process of verification of consumer accounts of alleged genetic discrimination. Verification of incidents reported in life insurance and other contexts that met the criteria of genetic discrimination, and the impact of fear of such treatment, was determined, with consent, through interview, document analysis and where appropriate, direct contact with the third party involved. The process comprised obtaining evidence that the alleged incident was accurately reported and determining whether the decision or action seemed to be justifiable and/or ethical. Reported incidents of genetic discrimination were verified in life insurance access, underwriting and coercion (9), applications for worker's compensation (1) and early release from prison (1) and in two cases of fear of discrimination impacting on access to genetic testing. Relevant conditions were inherited cancer susceptibility (8), Huntington disease (3), hereditary hemochromatosis (1), and polycystic kidney disease (1). In two cases, the reversal of an adverse underwriting decision to standard rate after intervention with insurers by genetics health professionals was verified. The mismatch between consumer and third party accounts in three life insurance incidents involved miscommunication or lack of information provision by financial advisers. These first cases of verified genetic discrimination make it essential for policies and guidelines to be developed and implemented to ensure appropriate use of genetic test results in insurance underwriting, to promote education and training in the financial industry, and to provide support for consumers and health professionals undertaking challenges of adverse decisions.

  17. Systems analysis programs for Hands-on integrated reliability evaluations (SAPHIRE) Version 5.0: Verification and validation (V&V) manual. Volume 9

    International Nuclear Information System (INIS)

    Jones, J.L.; Calley, M.B.; Capps, E.L.; Zeigler, S.L.; Galyean, W.J.; Novack, S.D.; Smith, C.L.; Wolfram, L.M.

    1995-03-01

    A verification and validation (V&V) process has been performed for the System Analysis Programs for Hands-on Integrated Reliability Evaluation (SAPHIRE) Version 5.0. SAPHIRE is a set of four computer programs that NRC developed for performing probabilistic risk assessments. They allow an analyst to perform many of the functions necessary to create, quantify, and evaluate the risk associated with a facility or process being analyzed. The programs are the Integrated Reliability and Risk Analysis System (IRRAS), System Analysis and Risk Assessment (SARA), Models And Results Database (MAR-D), and the Fault tree, Event tree, and Piping and instrumentation diagram (FEP) graphical editor. The intent of this program is to perform V&V of successive versions of SAPHIRE; the previous effort was the V&V of SAPHIRE Version 4.0. The SAPHIRE 5.0 V&V plan is based on the SAPHIRE 4.0 V&V plan, with revisions to incorporate lessons learned from the previous effort. Likewise, the SAPHIRE 5.0 vital and nonvital test procedures are based on the SAPHIRE 4.0 test procedures, with revisions to include the new SAPHIRE 5.0 features and to incorporate lessons learned. Most results from the testing were acceptable; however, some discrepancies between expected and actual code operation were identified. Modifications made to SAPHIRE are identified

  18. Verification of a three-dimensional FEM model for FBGs in PANDA fibers by transversal load experiments

    Science.gov (United States)

    Fischer, Bennet; Hopf, Barbara; Lindner, Markus; Koch, Alexander W.; Roths, Johannes

    2017-04-01

    A 3D FEM model of an FBG in a PANDA fiber with an extended fiber length of 25.4 mm is presented. Simulating long fiber lengths with limited computer power is achieved by using an iterative solver and by optimizing the FEM mesh. For verification purposes, the model is adapted to a configuration with transversal loads on the fiber. The 3D FEM model results correspond with experimental data and with the results of an additional 2D FEM plain strain model. In further studies, this 3D model shall be applied to more sophisticated situations, for example to study the temperature dependence of surface-glued or embedded FBGs in PANDA fibers that are used for strain-temperature decoupling.

  19. Experiences with integral microelectronics on smart structures for space

    Science.gov (United States)

    Nye, Ted; Casteel, Scott; Navarro, Sergio A.; Kraml, Bob

    1995-05-01

    One feature of a smart structure is that some computational and signal processing capability can be performed at a local level, perhaps integral to the controlled structure. This requires electronics with a minimal mechanical influence regarding structural stiffening, heat dissipation, weight, and electrical interface connectivity. The Advanced Controls Technology Experiment II (ACTEX II) space-flight experiments implemented such a local control electronics scheme by utilizing composite smart members with integral processing electronics. These microelectronics, tested to MIL-STD-883B levels, were fabricated with conventional thick film on ceramic multichip module techniques. Kovar housings and aluminum-kapton multilayer insulation were used to protect against harsh space radiation and thermal environments. Development and acceptance testing showed the electronics design was extremely robust, operating in vacuum and across the temperature range, with minimal gain variations occurring just above room temperature. Four electronics modules, used for the flight hardware configuration, were connected by an RS-485 2 Mbit per second serial data bus. The data bus was controlled by Actel field programmable gate arrays arranged in a single-master, four-slave configuration. An Intel 80C196KD microprocessor was chosen as the digital compensator in each controller. It was used to apply a series of selectable biquad filters, implemented via Delta Transforms. Instability in any compensator was expected to appear as large amplitude oscillations in the deployed structure. Thus, over-vibration detection circuitry with automatic output isolation was incorporated into the design. This was not used, however, since during experiment integration and test, intentionally induced compensator instabilities resulted in benign mechanical oscillation symptoms. Not too surprisingly, it was determined that instabilities were most detectable by large temperature increases in the electronics, typically
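    The digital compensator described in this record applied a series of selectable biquad filters. The sketch below is a generic cascaded-biquad filter in Python with hypothetical coefficients; the flight code realized its sections via delta transforms on the 80C196KD, which this ordinary second-order-section example does not reproduce.

```python
import numpy as np
from scipy.signal import sosfilt

# Two hypothetical biquad sections in second-order-section form
# (b0, b1, b2, a0, a1, a2); a real compensator would load the sections
# selected for the structural mode being damped.
sos = np.array([
    [0.20, 0.10, 0.20, 1.0, -1.1, 0.40],
    [0.50, 0.00, -0.50, 1.0, -0.9, 0.30],
])

rng = np.random.default_rng(1)
sensed_error = rng.standard_normal(256)        # stand-in sensor signal
actuator_command = sosfilt(sos, sensed_error)  # biquad cascade applied to the samples
print(actuator_command[:5])
```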

  20. Vehicle Integrated Performance Analysis, the VIPA Experience: Reconnecting with Technical Integration

    Science.gov (United States)

    McGhee, David S.

    2005-01-01

    Today's NASA is facing significant challenges and changes. The Exploration initiative indicates a large increase in projects with limited increase in budget. The Columbia report has criticized NASA for its lack of insight and technical integration impacting its ability to provide safety. The Aldridge report is advocating NASA find new ways of doing business. Very early in the Space Launch Initiative (SLI) program a small team of engineers at MSFC were asked to propose a process for performing a system level assessment of a launch vehicle. The request was aimed primarily at providing insight and making NASA a "smart buyer." Out of this effort the VIPA team was created. The difference between the VIPA effort and many integration attempts is that VIPA focuses on using experienced people from various disciplines and a process which focuses them on a technically integrated assessment. Most previous attempts have focused on developing an all encompassing software tool. In addition, VIPA anchored its process formulation in the experience of its members and in early developmental Space Shuttle experience. The primary reference for this is NASA-TP-2001-210092, "Launch Vehicle Design Process: Characterization, Technical Integration, and Lessons Learned," and discussions with its authors. The foundations of VIPA's process are described. The VIPA team also recognized the need to drive detailed analysis earlier in the design process. Analyses and techniques typically done in later design phases, are brought forward using improved computing technology. The intent is to allow the identification of significant sensitivities, trades, and design issues much earlier in the program. This process is driven by the T-model for Technical Integration described in the aforementioned reference. VIPA's approach to performing system level technical integration is discussed in detail. Proposed definitions are offered to clarify this discussion and the general systems integration dialog. VIPA

  1. A research on verification of the CONTAIN CODE model and the uncertainty reduction method for containment integrity

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jae Hong [Korea Institute of Nuclear Safety, Taejon (Korea, Republic of); Kim, Moo Hwan; Kang, Seok Hun; Seo, Kyoung Woo [Pohang University of Science and Technology, Pohang (Korea, Republic of)

    1999-03-15

    The final goal of this research is to verify a methodology for evaluating more accurately the integrity of the containment and to develop a methodology to reduce the uncertainty, using data for the operating PWR, KSNPP, and KNGR during a severe accident. Therefore, this year the research selected an indispensable factor of the DCH issue and performed sensitivity analyses.

  2. Experimental verification of integrated pressure suppression systems in fusion reactors at in-vessel loss-of-coolant events

    International Nuclear Information System (INIS)

    Takase, K.; Akimoto, H.

    2001-01-01

    An integrated ICE (Ingress-of-Coolant Event) test facility was constructed to demonstrate that the ITER safety design approach and design parameters for the ICE events are adequate. Major objectives of the integrated ICE test facility are: to estimate the performance of an integrated pressure suppression system; to obtain the validation data for safety analysis codes; and to clarify the effects of two-phase pressure drop at a divertor and the direct-contact condensation in a suppression tank. A scaling factor between the test facility and ITER-FEAT is around 1/1600. The integrated ICE test facility simulates the ITER pressure suppression system and mainly consists of a plasma chamber, vacuum vessel, simulated divertor, relief pipe and suppression tank. From the experimental results it was found quantitatively that the ITER pressure suppression system is very effective to reduce the pressurization due to the ICE event. Furthermore, it was confirmed that the analytical results of the TRAC-PF1 code can simulate the experimental results with high accuracy. (author)

  3. Integrated Safety Management System Phase I Verification for the Plutonium Finishing Plant (PFP) [VOL 1 and 2

    International Nuclear Information System (INIS)

    SETH, S.S.

    2000-01-01

    U.S. Department of Energy (DOE) Policy 450.4, Safety Management System Policy commits to institutionalizing an Integrated Safety Management System (ISMS) throughout the DOE complex as a means of accomplishing its missions safely. DOE Acquisition Regulation 970.5204-2 requires that contractors manage and perform work in accordance with a documented safety management system

  4. Development of nuclear thermal hydraulic verification test and evaluation technology - Development of fundamental technique for experiment of natural circulation phenomena in PWR systems

    Energy Technology Data Exchange (ETDEWEB)

    Park, Goon Cherl; Lee, Tae Ho; Kim, Moon Oh; Kim, Hak Joon [Seoul National University, Seoul (Korea)

    2000-04-01

    A dimensional analysis applying the two-fluid model of CFX-4.2 was performed. To verify the analysis results, experimental measurements of two-phase flow parameters in subcooled boiling flow were produced for vertical (0 deg) and inclined (60 deg) orientations. Through comparison of analyses and experiments, the applicability of various two-phase flow models and the analysis capability of the code were evaluated. A technique for measuring bubble velocity in two-phase flow using a backscattering standard LDV was investigated from the slug to the bubbly flow regime. The range of velocities measured is from 0.2 to 1.5 m/s and that of bubble sizes from 2 to 20 mm. For local temperature measurement in boiling flow, microthermocouples were manufactured and local liquid and vapor temperatures were measured in pool boiling and boiling flow. 66 refs., 74 figs., 4 tabs. (Author)

  5. The verification basis of the PM-ALPHA code

    Energy Technology Data Exchange (ETDEWEB)

    Theofanous, T.G.; Yuen, W.W.; Angelini, S. [California Univ., Santa Barbara, CA (United States). Center for Risk Studies and Safety

    1998-01-01

    An overall verification approach for the PM-ALPHA code is presented and implemented. The approach consists of a stepwise testing procedure focused principally on the multifield aspects of the premixing phenomenon. Breakup is treated empirically, but it is shown that, through reasonable choices of the breakup parameters, consistent interpretations of existing integral premixing experiments can be obtained. The present capability is deemed adequate for bounding energetics evaluations. (author)

  6. Progress in heavy ion driven inertial fusion energy: From scaled experiments to the integrated research experiment

    International Nuclear Information System (INIS)

    Barnard, J.J.; Ahle, L.E.; Baca, D.; Bangerter, R.O.; Bieniosek, F.M.; Celata, C.M.; Chacon-Golcher, E.; Davidson, R.C.; Faltens, A.; Friedman, A.; Franks, R.M.; Grote, D.P.; Haber, I.; Henestroza, E.; Hoon, M.J.L. de; Kaganovich, I.; Karpenko, V.P.; Kishek, R.A.; Kwan, J.W.; Lee, E.P.; Logan, B.G.; Lund, S.M.; Meier, W.R.; Molvik, A.W.; Olson, C.; Prost, L.R.; Qin, H.; Rose, D.; Sabbi, G.-L.; Sangster, T.C.; Seidl, P.A.; Sharp, W.M.; Shuman, D.; Vay, J.-L.; Waldron, W.L.; Welch, D.; Yu, S.S.

    2001-01-01

    The promise of inertial fusion energy driven by heavy ion beams requires the development of accelerators that produce ion currents (∼100's Amperes/beam) and ion energies (∼1-10 GeV) that have not been achieved simultaneously in any existing accelerator. The high currents imply high generalized perveances, large tune depressions, and high space charge potentials of the beam center relative to the beam pipe. Many of the scientific issues associated with ion beams of high perveance and large tune depression have been addressed over the last two decades on scaled experiments at Lawrence Berkeley and Lawrence Livermore National Laboratories, the University of Maryland, and elsewhere. The additional requirement of high space charge potential (or equivalently high line charge density) gives rise to effects (particularly the role of electrons in beam transport) which must be understood before proceeding to a large scale accelerator. The first phase of a new series of experiments in Heavy Ion Fusion Virtual National Laboratory (HIF VNL), the High Current Experiments (HCX), is now being constructed at LBNL. The mission of the HCX will be to transport beams with driver line charge density so as to investigate the physics of this regime, including constraints on the maximum radial filling factor of the beam through the pipe. This factor is important for determining both cost and reliability of a driver scale accelerator. The HCX will provide data for design of the next steps in the sequence of experiments leading to an inertial fusion energy power plant. The focus of the program after the HCX will be on integration of all of the manipulations required for a driver. In the near term following HCX, an Integrated Beam Experiment (IBX) of the same general scale as the HCX is envisioned. The step which bridges the gap between the IBX and an engineering test facility for fusion has been designated the Integrated Research Experiment (IRE). The IRE (like the IBX) will provide an

  7. GumTree-An integrated scientific experiment environment

    International Nuclear Information System (INIS)

    Lam, Tony; Hauser, Nick; Goetz, Andy; Hathaway, Paul; Franceschini, Fredi; Rayner, Hugh; Zhang, Lidia

    2006-01-01

    GumTree is an open source and multi-platform graphical user interface for performing neutron scattering and X-ray experiments. It handles the complete experiment life cycle from instrument calibration, data acquisition, and real time data analysis to results publication. The aim of the GumTree Project is to create a highly Integrated Scientific Experiment Environment (ISEE), allowing interconnectivity and data sharing between different distributed components such as motors, detectors, user proposal database and data analysis server. GumTree is being adapted to several instrument control server systems such as TANGO, EPICS and SICS, providing an easy-to-use front-end for users and simple-to-extend model for software developers. The design of GumTree is aimed to be reusable and configurable for any scientific instrument. GumTree will be adapted to six neutron beam instruments for the OPAL reactor at ANSTO. Other European institutes including ESRF, ILL and PSI have shown interest in using GumTree as their workbench for instrument control and data analysis

  8. GumTree - An Integrated Scientific Experiment Environment

    International Nuclear Information System (INIS)

    Lam, Tony; Hauser, Nick; Hathaway, Paul; Franceschini, Fredi; Rayner, Hugh; Zhang, Lidia; Goetz, Andy

    2005-01-01

    Full text: GumTree is an open source and multi-platform graphical user interface for performing neutron scattering and X-ray experiments. It handles the complete experiment life cycle from instrument calibration, data acquisition, and real time data analysis to results publication. The aim of the GumTree Project is to create a highly Integrated Scientific Experiment Environment (ISEE), allowing interconnectivity and data sharing between different distributed components such as motors, detectors, user proposal database and data analysis server. GumTree is being adapted to several instrument control server systems such as TANGO, EPICS and SICS, providing an easy-to-use front-end for users and simple-to-extend model for software developers. The design of GumTree is aimed to be reusable and configurable for any scientific instrument. GumTree will be adapted to six neutron beam instruments for the OPAL reactor at ANSTO. Other European institutes including ESRF, ILL and PSI have shown interest in using GumTree as their workbench for instrument control and data analysis. (authors)

  9. Joint experiment on verification of the treaty on the limitation of underground nuclear tests and its value in nuclear disarmament problem

    International Nuclear Information System (INIS)

    Mikhailov, V.N.

    1998-01-01

    This conference commemorates the 10th anniversary of the Joint Verification Experiment. The experiment was performed in order to specify methods for controlling the yield of underground explosions in the USA and the USSR. The basic provisions of the experiment were coordinated and formulated in the Agreement signed by the heads of the foreign-policy departments in Moscow on 31 May 1988. The tasks can be briefly summarized as follows: - each party can measure (on a mutual basis) the explosion yield in the course of the experiment performed at the test site of the other party, using teleseismic and hydrodynamic methods; - each party also makes teleseismic measurements of both explosions of the experiment with its national network of seismic stations; - each party makes hydrodynamic measurements of explosion yield in the course of the experiment in a special additional borehole; - each party performs teleseismic measurements of the yield of both explosions at its five seismic stations, for which the parties had exchanged data on explosions made earlier. In the course of the experiment the parties exchanged the data obtained in equal volume. The analysis showed: 1. The experiment met all the requirements of the Agreement, despite the complexity of the procedures and the differences in the conditions under which it was performed. 2. The experiment set an example of an unprecedented level of cooperation between the two countries in one of the fields of defense activity most significant for national security. 3. The experiment provided the basis for concrete coordination of underground-test yield control measures and considerably advanced the elaboration of the protocols to the treaties of 1974 and 1976. 4. The experiment made it possible to compare the scientific and technical level of hydrodynamic and seismic measurements and of the safety provisions for nuclear tests in both countries. Cooperative development of anti-intrusive devices for the hydrodynamic method

  10. Verification of geomechanical integrity and prediction of long-term mineral trapping for the Ketzin CO2 storage pilot site

    Science.gov (United States)

    Kempka, Thomas; De Lucia, Marco; Kühn, Michael

    2014-05-01

    Static and dynamic numerical modelling generally accompany the entire CO2 storage site life cycle. The employed models must therefore be matched with field observations on a regular basis in order to predict future site behaviour. We investigated the coupled processes at the Ketzin CO2 storage pilot site [1] using a model coupling concept focusing on the temporal relevance of the processes involved (hydraulic, chemical and mechanical) at the given time-scales (site operation, abandonment and long-term stabilization). For that purpose, long-term dynamic multi-phase flow simulations [2], [3] established the basis for all simulations discussed in the following. Pressure changes, which drive geomechanical effects, are largest during site operation, whereas geochemical reactions are governed by slow kinetics and determine long-term stabilization. To assess mechanical integrity, which may be mainly affected during site operation, we incorporated a regional-scale coupled hydro-mechanical model. Our simulation results show maximum ground surface displacements of about 4 mm, and neither shear nor tensile failure is observed. Consequently, the CO2 storage operation at the Ketzin pilot site does not compromise reservoir, caprock or fault integrity. Chemical processes responsible for mineral trapping are expected to occur mainly during long-term stabilization at the Ketzin pilot site [4]. Hence, our previous assessment [3] was extended by integrating two long-term mineral trapping scenarios. Mineral trapping contributes 11.7 % of the trapping after 16,000 years of simulation in our conservative scenario and 30.9 % in our maximum-reactivity scenario. Dynamic flow simulations indicate that only 0.2 % of the injected CO2 (about 67,270 t CO2 in total) remains in the gaseous, structurally trapped state after 16,000 years. Depending on the studied long-term scenario, CO2 dissolution is the dominating trapping mechanism with 68.9 % and 88

  11. Property-based Code Slicing for Efficient Verification of OSEK/VDX Operating Systems

    Directory of Open Access Journals (Sweden)

    Mingyu Park

    2012-12-01

    Full Text Available Testing is a de facto verification technique in industry, but it is insufficient for identifying subtle issues due to its optimistic incompleteness. On the other hand, model checking is a powerful technique that supports comprehensiveness, and is thus suitable for the verification of safety-critical systems. However, it generally requires more knowledge and costs more than testing. This work attempts to take advantage of both techniques to achieve integrated and efficient verification of OSEK/VDX-based automotive operating systems. We propose property-based environment generation and model extraction techniques using static code analysis, which can be applied to both model checking and testing. The technique is automated and applied to an OSEK/VDX-based automotive operating system, Trampoline. Comparative experiments using random testing and model checking for the verification of assertions in the Trampoline kernel code show how our environment generation and abstraction approach can be utilized for efficient fault detection.
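
    To make the contrast between random testing and exhaustive model checking concrete, the toy Python sketch below (unrelated to Trampoline or the authors' tool chain) lets two tasks race on a shared counter: a breadth-first search over the full interleaving space always finds the lost-update assertion violation, whereas a single random interleaving may miss it.

        import random
        from collections import deque

        INIT = (0, 0, 0, 0, 0)  # (pc1, pc2, shared, local1, local2)

        def step(state, task):
            """Execute one step of the given task, or return None if it has finished."""
            pc1, pc2, shared, l1, l2 = state
            if task == 0:
                if pc1 == 0:
                    return (1, pc2, shared, shared, l2)  # task 0 reads shared
                if pc1 == 1:
                    return (2, pc2, l1 + 1, l1, l2)      # task 0 writes back l1 + 1
            else:
                if pc2 == 0:
                    return (pc1, 1, shared, l1, shared)  # task 1 reads shared
                if pc2 == 1:
                    return (pc1, 2, l2 + 1, l1, l2)      # task 1 writes back l2 + 1
            return None

        def violates(state):
            pc1, pc2, shared, _, _ = state
            return pc1 == 2 and pc2 == 2 and shared != 2  # lost update

        def model_check():
            """Exhaustive breadth-first search over all interleavings."""
            seen, queue = {INIT}, deque([INIT])
            while queue:
                state = queue.popleft()
                if violates(state):
                    return state
                for task in (0, 1):
                    nxt = step(state, task)
                    if nxt is not None and nxt not in seen:
                        seen.add(nxt)
                        queue.append(nxt)
            return None

        def random_test():
            """Explore a single random interleaving, as plain testing would."""
            state = INIT
            while True:
                enabled = [task for task in (0, 1) if step(state, task) is not None]
                if not enabled:
                    return violates(state)
                state = step(state, random.choice(enabled))

        print("exhaustive search found the violation:", model_check() is not None)
        print("one random interleaving found it:", random_test())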

  12. A research on verification of the CONTAIN CODE model and the uncertainty reduction method for containment integrity

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jae-Hong; Kim, Moo-Hwan; Bae, Seong-Won; Byun, Sang-Chul [Pohang University of Science and Technology, Pohang (Korea, Republic of)

    1998-03-15

    The final objectives of this study are to establish methods for assessing the integrity of containment building structures and for safety analysis during postulated severe accidents, and to reduce the uncertainty of these methods. To that end, the CONTAIN 1.2 code models for analyzing severe accident phenomena and the heat transfer between the air inside the containment building and the inner walls have been reviewed and analyzed. For the double containment wall adopted for the next-generation nuclear reactor, which differs from previous containment types, the temperature and pressure rise histories were calculated and compared with the results for previous designs.

  13. Dealing with scientific integrity issues: the Spanish experience.

    Science.gov (United States)

    Puigdomènech, Pere

    2014-02-01

    Integrity has been an important matter of concern for the scientific community, as it affects the basis of its activities. Most countries with significant scientific activity have dealt with this problem by different means, including drafting specific legal or soft-law regulations and appointing standing or ad hoc committees that take care of these questions. This has also been the case in Spain. After the period of transition from dictatorship to a democratic regime, and particularly after entry into the European Union, scientific activity in the country has increased. As could be expected, problems of misconduct have appeared and different institutions have been dealing with these matters. One of the best examples is that of the Consejo Superior de Investigaciones Cientificas (CSIC), the largest institution devoted to scientific research belonging to the Spanish Government. The experience of the CSIC's Ethics Committee in dealing with conflicts related to scientific practices is discussed here.

  14. A Preliminary Shielding Study on the Integrated Operation Verification System in the Head-End Hot-Cell of the Pyro-processing

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jinhwam; Kim, Yewon; Park, Se-Hwan; Ahn, Seong-Kyu; Cho, Gyuseong [KAIST, Daejeon (Korea, Republic of)

    2016-10-15

    Nuclear power accounts for more than 30 percent of power production in Korea, and its significance has increased annually. Spent fuel containing uranium, transuranic elements, and fission products is an unavoidable byproduct of nuclear power production, and finding appropriate sites for its interim storage is recognized as difficult because isolated sites are required. Pyro-processing must be operated in a high-radiation environment inside hot-cell structures. For this reason, workers are not permitted to access the hot-cell areas under any circumstances other than after verification of acceptable dose, and normal operation has to be manipulated remotely. For reliable normal operation of pyro-processing, an evaluation of the spatial dose distribution in the hot-cell environment is needed in advance in order to determine which technologies or instruments can be used on or near the process as the Integrated Operation Verification System (IOVS). Unlike the electro-reduction and electro-refining hot-cells, the head-end hot-cell is equipped with a Camera Radiation Detector (CRD) with which plutonium is securely measured and monitored for safeguards of the pyro-processing. Results were obtained using an F2 surface tally in order to observe the magnitude of the gamma-ray and neutron flux passing through the surfaces of the process cell. Furthermore, a T-mesh tally was used to obtain the spatial dose distribution in the head-end hot-cell; for the T-mesh tally, the hot-cell was divided into 7,668 cells, each 1 x 1 x 1 m. Different approaches were required to determine the positions of the CRD and the surveillance camera. Because the purpose of the CRD, which contains a gamma-ray detector and a neutron detector, is to identify the material composition as the process proceeds, a position exposed to a detectable flux is required, whereas

  15. Experience with building integrated solar collectors; Erfaring med bygningsintegrerte solfangere

    Energy Technology Data Exchange (ETDEWEB)

    Simonsen, Ingeborg; Time, Berit; Andresen, Inger

    2011-07-01

    The main objective of the 'Zero Emission Buildings' (ZEB) research is to develop products and solutions that give buildings zero greenhouse gas emissions associated with their production, operation and disposal. For this to happen, the building must produce more energy than it needs in order to compensate for the greenhouse gas emissions from the production of materials and from the construction itself. To build up knowledge of experience with building-integrated solar collectors in Norway, this study interviewed suppliers and manufacturers of solar collectors and some building owners. Since the focus is on the climate shell, the study is limited to solar collectors that replace part of the cladding or roofing; collectors mounted on top of the roofing, outside the facade or on freestanding racks are not considered building-integrated in this context. The providers contacted address slightly different parts of the market, which is reflected in their products' development, assembly and approach to calculating energy delivery. Overall, the providers offer a range of products suitable for the professional and skilled carpenter as well as the interested 'man in the street'. The feedback received shows generally good experiences with the products and their installation. Because of the short operating periods of the investigated plants so far, there are few data on energy supply from them. In summary, the knowledge and the products are available, and it is up to us to use them. (Author)

  16. Students' integration of multiple representations in a titration experiment

    Science.gov (United States)

    Kunze, Nicole M.

    A complete understanding of a chemical concept is dependent upon a student's ability to understand the microscopic or particulate nature of the phenomenon and integrate the microscopic, symbolic, and macroscopic representations of the phenomenon. Acid-base chemistry is a general chemistry topic requiring students to understand the topics of chemical reactions, solutions, and equilibrium presented earlier in the course. In this study, twenty-five student volunteers from a second semester general chemistry course completed two interviews. The first interview was completed prior to any classroom instruction on acids and bases. The second interview took place after classroom instruction, a prelab activity consisting of a titration calculation worksheet, a titration computer simulation, or a microscopic level animation of a titration, and two microcomputer-based laboratory (MBL) titration experiments. During the interviews, participants were asked to define and describe acid-base concepts and in the second interview they also drew the microscopic representations of four stages in an acid-base titration. An analysis of the data showed that participants had integrated the three representations of an acid-base titration to varying degrees. While some participants showed complete understanding of acids, bases, titrations, and solution chemistry, other participants showed several alternative conceptions concerning strong acid and base dissociation, the formation of titration products, and the dissociation of soluble salts. Before instruction, participants' definitions of acid, base, and pH were brief and consisted of descriptive terms. After instruction, the definitions were more scientific and reflected the definitions presented during classroom instruction.
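
    The symbolic layer of such a titration calculation is usually the equivalence-point relation; for a monoprotic strong acid titrated with a strong base it reduces to (a standard stoichiometric result, stated here only for orientation)

        c_a V_a = c_b V_b   (more generally  n_a c_a V_a = n_b c_b V_b),

    so the unknown acid concentration follows directly from the measured volume of base, with n_a and n_b the numbers of acidic protons and hydroxide ions per formula unit.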

  17. Saudi Aramco experience towards establishing Pipelines Integrity Management Systems (PIMS)

    Energy Technology Data Exchange (ETDEWEB)

    AlAhmari, Saad A. [Saudi Aramco, Dhahran (Saudi Arabia)

    2009-12-19

    The Saudi Aramco pipelines network transports hydrocarbons to export terminals, processing plants and domestic users. This network has faced several safety- and operations-related challenges that required a more effective Pipelines Integrity Management System (PIMS). Therefore, Saudi Aramco decided to develop its PIMS on the basis of geographical information system (GIS) support through different phases, i.e., establishing the integrity management framework and risk calculation approach, conducting a gap analysis toward the envisioned PIMS, establishing the required scope of work, screening the PIMS applications market, selecting suitable tools that satisfy the expected deliverables, and implementing the PIMS applications. Saudi Aramco expects great benefits from implementing PIMS, e.g., enhancing safety, enhancing pipeline network robustness, optimizing inspection and maintenance expenditures, and facilitating pipeline management and the decision-making process. Saudi Aramco's new experience in adopting PIMS includes many challenges and lessons learned associated with all of the PIMS development phases. These challenges include performing the gap analysis, conducting QA/QC sensitivity analysis for the acquired data, establishing the scope of work, selecting the appropriate applications and implementing PIMS. (author)

  18. Saudi Aramco experience towards establishing Pipelines Integrity Management System (PIMS)

    Energy Technology Data Exchange (ETDEWEB)

    Al-Ahmari, Saad A. [Saudi Aramco, Dhahran (Saudi Arabia)

    2009-07-01

    The Saudi Aramco pipelines network transports hydrocarbons to export terminals, processing plants and domestic users. This network has faced several safety- and operations-related challenges that required a more effective Pipelines Integrity Management System (PIMS). Therefore, Saudi Aramco decided to develop its PIMS on the basis of geographical information system (GIS) support through different phases, i.e., establishing the integrity management framework and risk calculation approach, conducting a gap analysis toward the envisioned PIMS, establishing the required scope of work, screening the PIMS applications market, selecting suitable tools that satisfy the expected deliverables, and implementing the PIMS applications. Saudi Aramco expects great benefits from implementing PIMS, e.g., enhancing safety, enhancing pipeline network robustness, optimizing inspection and maintenance expenditures, and facilitating pipeline management and the decision-making process. Saudi Aramco's new experience in adopting PIMS includes many challenges and lessons learned associated with all of the PIMS development phases. These challenges include performing the gap analysis, conducting QA/QC sensitivity analysis for the acquired data, establishing the scope of work, selecting the appropriate applications and implementing PIMS. (author)

  19. Nurse practitioner integration: Qualitative experiences of the change management process.

    Science.gov (United States)

    Lowe, Grainne; Plummer, Virginia; Boyd, Leanne

    2018-04-30

    The aim of this qualitative research was to explore perceptions of organisational change related to the integration of nurse practitioners from key nursing stakeholders. The ongoing delivery of effective and efficient patient services is reliant upon the development and sustainability of nurse practitioner roles. Examination of the factors contributing to the underutilization of nurse practitioner roles is crucial to inform future management policies. A change management theory is used to reveal the complexity involved. Qualitative interviews were undertaken using a purposive sampling strategy of key stakeholders. Thematic analysis was undertaken and key themes were correlated to the theoretical framework. The results confirm the benefits of nurse practitioner roles, but suggest organisational structures and embedded professional cultures present barriers to full role optimization. Complicated policy processes are creating barriers to the integration of nurse practitioner roles. The findings increase understanding of the links between strategic planning, human resource management, professional and organisational cultures, governance and politics in change management. Effective leadership drives the change process through the ability to align key components necessary for success. Sustainability of nurse practitioners relies on recognition of their full potential in the health care team. The results of this study highlight the importance of management and leadership in the promotion of advanced nursing skills and experience to better meet patient outcomes. The findings reinforce the potential of nurse practitioners to deliver patient centred, timely and efficient health care. © 2018 John Wiley & Sons Ltd.

  20. From Differentiation to Concretisation: Integrative Experiments in Sustainable Architecture

    Directory of Open Access Journals (Sweden)

    Graham Farmer

    2017-12-01

    Full Text Available It is widely recognised that the achievement of a sustainable built environment requires holistic design practices and approaches that are capable of balancing the varied, and often conflicting, demands of environmental, social and economic concerns. However, academics and practitioners have recently highlighted, and expressed concerns about the knowledge gap that currently exists within environmental policy, research and practice between understandings of the technical performance of buildings and their social meaning and relevance. This paper acknowledges these concerns and is developed from the author’s own direct experiences of practice-led research and active participation in design-build projects. It argues for a theoretically-informed and socially-engaged approach to built environment research, pedagogy and practice that seeks to encourage an integrative understanding of the design, realisation and use of sustainable architecture. The paper draws on the Philosophy of Technology and in particular the work of Andrew Feenberg to analyse the buildings and to propose an integrated and inclusive framework for understanding sustainable design that acknowledges not just what the built environment does, but also what it means. It also suggests that what a building means also informs what it can do, and for whom. Although the technical and social dimensions of design can be interpreted as distinct practices and are often institutionally separated, this paper argues that the realisation of sustainable design must seek a conscious interaction and interchange between these two differentiated dimensions.

  1. In vivo dosimetry with semiconducting diodes for dose verification in total-body irradiation. A 10-year experience

    International Nuclear Information System (INIS)

    Ramm, U.; Licher, J.; Moog, J.; Scherf, C.; Kara, E.; Boettcher, H.D.; Roedel, C.; Mose, S.

    2008-01-01

    Background and purpose: for total-body irradiation (TBI) using the translation method, dose distribution cannot be computed with computer-assisted three-dimensional planning systems. Therefore, dose distribution has to be primarily estimated based on CT scans (beam-zone method) which is followed by in vivo measurements to ascertain a homogeneous dose delivery. The aim of this study was to clinically establish semiconductor probes as a simple and fast method to obtain an online verification of the dose at relevant points. Patients and methods: in 110 consecutively irradiated TBI patients (12.6 Gy, 2 x 1.8 Gy/day), six semiconductor probes were attached to the body surface at dose-relevant points (eye/head, neck, lung, navel). The mid-body point of the abdomen was defined as dose reference point. The speed of translation was optimized to definitively reach the prescribed dose in this point. Based on the entrance and exit doses, the mid-body doses at the other points were computed. The dose homogeneity in the entire target volume was determined comparing all measured data with the dose at the reference point. Results: after calibration of the semiconductor probes under treatment conditions the dose in selected points and the dose homogeneity in the target volume could be quantitatively specified. In the TBI patients, conformity of calculated and measured doses in the given points was achieved with small deviations of adequate accuracy. The data of 80% of the patients are within an uncertainty of ± 5%. Conclusion: during TBI using the translation method, dose distribution and dose homogeneity can be easily controlled in selected points by means of semiconductor probes. Semiconductor probes are recommended for further use in the physical evaluation of TBI. (orig.)
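
    The abstract does not state the exact relation used to infer the mid-body dose from the surface readings; one simple estimate, assuming an approximately exponential depth-dose fall-off between the measured entrance and exit doses, is the geometric mean

        D_mid ≈ \sqrt{ D_entrance · D_exit },

    which is given here purely as an illustrative assumption, not as the authors' method.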

  2. Integrated Advanced Microwave Sounding Unit-A (AMSU-A). Performance Verification Report: Final Comprehensive Performance Test Report, P/N 1331720-2TST, S/N 105/A1

    Science.gov (United States)

    Platt, R.

    1999-01-01

    This is the Performance Verification Report, Final Comprehensive Performance Test (CPT) Report, for the Integrated Advanced Microwave Sounding Unit-A (AMSU-A). This specification establishes the requirements for the CPT and Limited Performance Test (LPT) of the AMSU-A1, referred to herein as the unit. The sequence in which the several phases of this test procedure shall take place is shown.

  3. Verification Games: Crowd-Sourced Formal Verification

    Science.gov (United States)

    2016-03-01

    Four verification games were developed by the Center for Game Science: Pipe Jam, Traffic Jam, Flow Jam and Paradox. Verification tools and games were integrated to verify ... Additionally, in Paradox, human players are never given small optimization problems (for example, toggling the values of 50 ...

  4. Swarm Verification

    Science.gov (United States)

    Holzmann, Gerard J.; Joshi, Rajeev; Groce, Alex

    2008-01-01

    Reportedly, supercomputer designer Seymour Cray once said that he would sooner use two strong oxen to plow a field than a thousand chickens. Although this is undoubtedly wise when it comes to plowing a field, it is not so clear for other types of tasks. Model checking problems are of the proverbial "search the needle in a haystack" type. Such problems can often be parallelized easily. Alas, none of the usual divide and conquer methods can be used to parallelize the working of a model checker. Given that it has become easier than ever to gain access to large numbers of computers to perform even routine tasks it is becoming more and more attractive to find alternate ways to use these resources to speed up model checking tasks. This paper describes one such method, called swarm verification.
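
    A minimal sketch of the swarm idea, under stated assumptions: launch many independent verifier runs, each with a different (diversified) search configuration, and stop as soon as any run reports a counterexample. The "checker" command line and its options below are hypothetical placeholders, not a real tool's interface.

        import subprocess
        from concurrent.futures import ProcessPoolExecutor, as_completed

        MODEL = "protocol.model"  # hypothetical model-checking input file
        CONFIGS = [["--seed", str(seed), "--max-depth", str(depth)]
                   for seed in range(8) for depth in (10_000, 100_000)]

        def run_one(options):
            """Run one diversified search and report whether it found a counterexample."""
            result = subprocess.run(["checker", MODEL, *options],
                                    capture_output=True, text=True)
            return options, "assertion violated" in result.stdout

        def swarm():
            with ProcessPoolExecutor() as pool:
                jobs = [pool.submit(run_one, config) for config in CONFIGS]
                for job in as_completed(jobs):
                    options, found = job.result()
                    if found:
                        print("counterexample found with options:", options)
                        return True
            print("no member of the swarm found a violation")
            return False

        if __name__ == "__main__":
            swarm()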

  5. 1. Introduction. 2. Laboratory experiments. 3. Field experiments. 4. Integrated field-laboratory experiments. 5. Panel recommendations

    International Nuclear Information System (INIS)

    Anon.

    1975-01-01

    Some recommendations for the design of laboratory and field studies in marine radioecology are formulated. The difficulties concerning the comparability of various experimental methods used to measure the fluxes of radionuclides through marine organisms and ecosystems, and also the use of laboratory results to make predictions for the natural environment are discussed. Three working groups were established during the panel meeting, to consider laboratory experiments, field studies, and the design and execution of integrated laboratory and field studies respectively. A number of supporting papers dealing with marine radioecological experiments were presented

  6. Verification and validation process for the safety software in KNICS

    International Nuclear Information System (INIS)

    Kwon, Kee-Choon; Lee, Jang-Soo; Kim, Jang-Yeol

    2004-01-01

    This paper describes the Verification and Validation (V and V ) process for safety software of Programmable Logic Controller (PLC), Digital Reactor Protection System (DRPS), and Engineered Safety Feature-Component Control System (ESF-CCS) that are being developed in Korea Nuclear Instrumentation and Control System (KNICS) projects. Specifically, it presents DRPS V and V experience according to the software development life cycle. The main activities of DRPS V and V process are preparation of software planning documentation, verification of Software Requirement Specification (SRS), Software Design Specification (SDS) and codes, and testing of the integrated software and the integrated system. In addition, they include software safety analysis and software configuration management. SRS V and V of DRPS are technical evaluation, licensing suitability evaluation, inspection and traceability analysis, formal verification, preparing integrated system test plan, software safety analysis, and software configuration management. Also, SDS V and V of RPS are technical evaluation, licensing suitability evaluation, inspection and traceability analysis, formal verification, preparing integrated software test plan, software safety analysis, and software configuration management. The code V and V of DRPS are traceability analysis, source code inspection, test case and test procedure generation, software safety analysis, and software configuration management. Testing is the major V and V activity of software integration and system integration phase. Software safety analysis at SRS phase uses Hazard Operability (HAZOP) method, at SDS phase it uses HAZOP and Fault Tree Analysis (FTA), and at implementation phase it uses FTA. Finally, software configuration management is performed using Nu-SCM (Nuclear Software Configuration Management) tool developed by KNICS project. Through these activities, we believe we can achieve the functionality, performance, reliability and safety that are V

  7. Multi-Disciplinary Research Experiences Integrated with Industry –Field Experiences

    Directory of Open Access Journals (Sweden)

    Suzanne Lunsford

    2015-10-01

    Full Text Available The purpose of this environmental inquiry-based lab was to allow students to engage with real-world concepts that integrate an industry setting (Ohio Aggregate Industrial Mineral Association) with the academic setting. Students take a field trip to an active mining site to begin problem-based learning on how heavy metals leak during the mining process. Heavy metals such as lead and indium in the groundwater are a serious environmental concern (Environmental Protection Agency) arising from mining. The field experience at the mining site helps build students' interest in developing sensors that detect heavy metals of concern, such as lead and indium, simultaneously by a unique electrochemical technique called Square Wave Anodic Stripping Voltammetry (SWASV), and in considering what qualities the electrochemical sensor must possess to be successful in real-world use. During the field trip the students also learn novel instrumentation such as SEM (Scanning Electron Microscopy) to study the surface morphology of the developed working-electrode sensor. The integration of the industry setting with academia has been a positive experience for our students and has strengthened their understanding of the research needed to succeed in an industrial setting.

  8. Ride comfort optimization of a multi-axle heavy motorized wheel dump truck based on virtual and real prototype experiment integrated Kriging model

    Directory of Open Access Journals (Sweden)

    Bian Gong

    2015-06-01

    Full Text Available The optimization of the hydro-pneumatic suspension parameters of a multi-axle heavy motorized wheel dump truck is carried out in this article based on a virtual and real prototype experiment integrated Kriging model. The root mean square of the vertical vibration acceleration at the center of the sprung mass is assigned as the optimization objective. The constraints are the natural frequency, the working stroke, and the dynamic load of the wheels. The suspension structure for the truck is the adjustable hydro-pneumatic suspension with ideal vehicle nonlinear characteristics, integrating elastic and damping elements. Also, the hydraulic systems of two adjacent hydro-pneumatic suspensions are interconnected. Considering the high complexity of the engineering model, a novel kind of meta-model called virtual and real prototype experiment integrated Kriging is proposed in this article. The interpolation principle and the construction of the virtual and real prototype experiment integrated Kriging model are elucidated. Unlike traditional Kriging, virtual and real prototype experiment integrated Kriging combines the respective advantages of actual tests and Computer-Aided Engineering simulation. Based on the virtual and real prototype experiment integrated Kriging model, the optimization results, confirmed by experimental verification, showed a significant improvement in ride comfort of 12.48% for the front suspension and 11.79% for the rear suspension. Compared with traditional Kriging, the optimization effect was improved by 3.05% and 3.38%, respectively. Virtual and real prototype experiment integrated Kriging provides an effective way to approach the optimal solution for the optimization of high-complexity engineering problems.
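
    As a generic illustration of surrogate-based optimization in the spirit described above (not the authors' integrated virtual/real-prototype model), a Gaussian-process (Kriging) surrogate can be fitted to a handful of sampled designs and then minimized cheaply in place of the expensive model or experiment; the objective function and parameter ranges below are invented for the example.

        import numpy as np
        from scipy.optimize import minimize
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, ConstantKernel

        # Toy "experiment": RMS vertical acceleration as an unknown function of two
        # suspension parameters (stiffness, damping), sampled at a few design points.
        rng = np.random.default_rng(0)
        X = rng.uniform([0.5, 0.1], [2.0, 1.0], size=(20, 2))
        y = (X[:, 0] - 1.2) ** 2 + 2.0 * (X[:, 1] - 0.6) ** 2 + 0.01 * rng.normal(size=20)

        # Kriging (Gaussian-process) surrogate fitted to the sampled responses.
        surrogate = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(), normalize_y=True)
        surrogate.fit(X, y)

        # Optimize the cheap surrogate instead of the expensive model or experiment.
        result = minimize(lambda x: surrogate.predict(x.reshape(1, -1))[0],
                          x0=[1.0, 0.5], bounds=[(0.5, 2.0), (0.1, 1.0)])
        print("surrogate optimum:", result.x, "predicted objective:", result.fun)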

  9. Technical safety requirements control level verification

    International Nuclear Information System (INIS)

    STEWART, J.L.

    1999-01-01

    A Technical Safety Requirement (TSR) control level verification process was developed for the Tank Waste Remediation System (TWRS) TSRs at the Hanford Site in Richland, WA, at the direction of the US. Department of Energy, Richland Operations Office (RL). The objective of the effort was to develop a process to ensure that the TWRS TSR controls are designated and managed at the appropriate levels as Safety Limits (SLs), Limiting Control Settings (LCSs), Limiting Conditions for Operation (LCOs), Administrative Controls (ACs), or Design Features. The TSR control level verification process was developed and implemented by a team of contractor personnel with the participation of Fluor Daniel Hanford, Inc. (FDH), the Project Hanford Management Contract (PHMC) integrating contractor, and RL representatives. The team was composed of individuals with the following experience base: nuclear safety analysis; licensing; nuclear industry and DOE-complex TSR preparation/review experience; tank farm operations; FDH policy and compliance; and RL-TWRS oversight. Each TSR control level designation was completed utilizing TSR control logic diagrams and TSR criteria checklists based on DOE Orders, Standards, Contractor TSR policy, and other guidance. The control logic diagrams and criteria checklists were reviewed and modified by team members during team meetings. The TSR control level verification process was used to systematically evaluate 12 LCOs, 22 AC programs, and approximately 100 program key elements identified in the TWRS TSR document. The verification of each TSR control required a team consensus. Based on the results of the process, refinements were identified and the TWRS TSRs were modified as appropriate. A final report documenting key assumptions and the control level designation for each TSR control was prepared and is maintained on file for future reference. The results of the process were used as a reference in the RL review of the final TWRS TSRs and control suite. RL

  10. Technical safety requirements control level verification; TOPICAL

    International Nuclear Information System (INIS)

    STEWART, J.L.

    1999-01-01

    A Technical Safety Requirement (TSR) control level verification process was developed for the Tank Waste Remediation System (TWRS) TSRs at the Hanford Site in Richland, WA, at the direction of the US. Department of Energy, Richland Operations Office (RL). The objective of the effort was to develop a process to ensure that the TWRS TSR controls are designated and managed at the appropriate levels as Safety Limits (SLs), Limiting Control Settings (LCSs), Limiting Conditions for Operation (LCOs), Administrative Controls (ACs), or Design Features. The TSR control level verification process was developed and implemented by a team of contractor personnel with the participation of Fluor Daniel Hanford, Inc. (FDH), the Project Hanford Management Contract (PHMC) integrating contractor, and RL representatives. The team was composed of individuals with the following experience base: nuclear safety analysis; licensing; nuclear industry and DOE-complex TSR preparation/review experience; tank farm operations; FDH policy and compliance; and RL-TWRS oversight. Each TSR control level designation was completed utilizing TSR control logic diagrams and TSR criteria checklists based on DOE Orders, Standards, Contractor TSR policy, and other guidance. The control logic diagrams and criteria checklists were reviewed and modified by team members during team meetings. The TSR control level verification process was used to systematically evaluate 12 LCOs, 22 AC programs, and approximately 100 program key elements identified in the TWRS TSR document. The verification of each TSR control required a team consensus. Based on the results of the process, refinements were identified and the TWRS TSRs were modified as appropriate. A final report documenting key assumptions and the control level designation for each TSR control was prepared and is maintained on file for future reference. The results of the process were used as a reference in the RL review of the final TWRS TSRs and control suite. RL

  11. Researchers' experiences, positive and negative, in integrative landscape projects

    NARCIS (Netherlands)

    Tress, B.; Tress, G.; Fry, G.

    2005-01-01

    Integrative (interdisciplinary and transdisciplinary) landscape research projects are becoming increasingly common. As a result, researchers are spending a larger proportion of their professional careers doing integrative work, participating in shifting interdisciplinary teams, and cooperating

  12. Nuclear data and integral experiments in reactor physics

    International Nuclear Information System (INIS)

    Farinelli, U.

    1980-01-01

    The material given here broadly covers the content of the 10 lectures delivered at the Winter Course on Reactor Theory and Power Reactors, ICTP, Trieste (13 February - 10 March 1978). However, the parts that could easily be found in the current literature have been omitted and replaced with the appropriate references. The needs for reactor physics calculations, particularly as applicable to commercial reactors, are reviewed in the introduction. The relative merits and shortcomings of fundamental and semi-empirical methods are discussed. The relative importance of different nuclear data, the ways in which they can be measured or calculated, and the sources of information on measured and evaluated data are briefly reviewed. The various approaches to the condensation of nuclear data to multigroup cross sections are described. After some consideration to the sensitivity calculations and the evaluation of errors, some of the most important type of integral experiments in reactor physics are introduced, with a view to showing the main difficulties in the interpretation and utilization of their results and the most recent trends in experimentation. The conclusions try to assign some priorities in the implementation of experimental and calculational capabilities, especially for a developing country. (author)
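
    For concreteness, the condensation of pointwise nuclear data into multigroup constants mentioned above is conventionally the flux-weighted average over each energy group g (a standard definition, added here for orientation):

        \sigma_g = \frac{\int_{E_g}^{E_{g-1}} \sigma(E)\,\phi(E)\,dE}{\int_{E_g}^{E_{g-1}} \phi(E)\,dE},

    so the quality of a group set depends on how well the weighting spectrum \phi(E) represents the actual spectrum of the reactor being analyzed.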

  13. CASL Verification and Validation Plan

    Energy Technology Data Exchange (ETDEWEB)

    Mousseau, Vincent Andrew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dinh, Nam [North Carolina State Univ., Raleigh, NC (United States)

    2016-06-30

    This report documents the Consortium for Advanced Simulation of LWRs (CASL) verification and validation plan. The document builds upon input from CASL subject matter experts, most notably the CASL Challenge Problem Product Integrators, CASL Focus Area leaders, and CASL code development and assessment teams. This document will be a living document that will track progress on CASL to do verification and validation for both the CASL codes (including MPACT, CTF, BISON, MAMBA) and for the CASL challenge problems (CIPS, PCI, DNB). The CASL codes and the CASL challenge problems are at differing levels of maturity with respect to validation and verification. The gap analysis will summarize additional work that needs to be done. Additional VVUQ work will be done as resources permit. This report is prepared for the Department of Energy’s (DOE’s) CASL program in support of milestone CASL.P13.02.

  14. Investigation of experimental methods for EMP transient-state disturbance effects on integrated circuits (ICs)

    International Nuclear Information System (INIS)

    Li Xiaowei

    2004-01-01

    The study of transient-state disturbance characteristics of integrated circuits (ICs) must start from the coupling path. Through cable (antenna) coupling, the EMP is converted into a pulsed current and voltage that impacts the I/O ports of the integrated circuit via the cable. Considering the structural features of weapon systems, the EMP effect on the integrated circuits inside such systems is analyzed. The current-injection method for IC EMP effect experiments is investigated and several experimental methods are given. (authors)
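
    The injected disturbance in such current-injection tests is commonly idealized as a double-exponential pulse; the waveform below is a textbook representation assumed here for illustration, not a parameterization taken from the paper:

        i(t) = I_0 \left( e^{-\alpha t} - e^{-\beta t} \right),  \qquad  \beta \gg \alpha,

    where \beta sets the fast rise of the coupled pulse, \alpha its slower decay, and the amplitude I_0 is chosen to reproduce the current expected at the IC I/O port.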

  15. Integrating Renewable Generation into Grid Operations: Four International Experiences

    Energy Technology Data Exchange (ETDEWEB)

    Weimar, Mark R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Mylrea, Michael E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Levin, Todd [Argonne National Lab. (ANL), Argonne, IL (United States); Botterud, Audun [Argonne National Lab. (ANL), Argonne, IL (United States); O' Shaughnessy, Eric [National Renewable Energy Lab. (NREL), Golden, CO (United States); Bird, Lori [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2016-04-22

    International experiences with power sector restructuring and the resultant impacts on bulk power grid operations and planning may provide insight into policy questions for the evolving United States power grid as resource mixes are changing in response to fuel prices, an aging generation fleet and to meet climate goals. Australia, Germany, Japan and the UK were selected to represent a range in the level and attributes of electricity industry liberalization in order to draw comparisons across a variety of regions in the United States such as California, ERCOT, the Southwest Power Pool and the Southeast Reliability Region. The study draws conclusions through a literature review of the four case study countries with regards to the changing resource mix and the electricity industry sector structure and their impact on grid operations and planning. This paper derives lessons learned and synthesizes implications for the United States based on answers to the above questions and the challenges faced by the four selected countries. Each country was examined to determine the challenges to their bulk power sector based on their changing resource mix, market structure, policies driving the changing resource mix, and policies driving restructuring. Each countries’ approach to solving those changes was examined, as well as how each country’s market structure either exacerbated or mitigated the approaches to solving the challenges to their bulk power grid operations and planning. All countries’ policies encourage renewable energy generation. One significant finding included the low- to zero-marginal cost of intermittent renewables and its potential negative impact on long-term resource adequacy. No dominant solution has emerged although a capacity market was introduced in the UK and is being contemplated in Japan. Germany has proposed the Energy Market 2.0 to encourage flexible generation investment. The grid operator in Australia proposed several approaches to maintaining

  16. Using formal specification in the Guidance and Control Software (GCS) experiment. Formal design and verification technology for life critical systems

    Science.gov (United States)

    Weber, Doug; Jamsek, Damir

    1994-01-01

    The goal of this task was to investigate how formal methods could be incorporated into a software engineering process for flight-control systems under DO-178B and to demonstrate that process by developing a formal specification for NASA's Guidance and Controls Software (GCS) Experiment. GCS is software to control the descent of a spacecraft onto a planet's surface. The GCS example is simplified from a real example spacecraft, but exhibits the characteristics of realistic spacecraft control software. The formal specification is written in Larch.

  17. Verification of vectorized Monte Carlo code MVP using JRR-4 experiment of fast neutrons penetrating through graphite and water

    International Nuclear Information System (INIS)

    Odano, N.; Miura, T.; Yamaji, A.

    1996-01-01

    Measurement of activation reaction rates was carried out for fast neutrons penetrating through graphite and water from the core of the JRR-4 research reactor of JAERI, with particular attention to energies above 10 MeV. The experiment was analyzed using the vectorized continuous-energy Monte Carlo code MVP in order to verify the code. The analysis shows good agreement between measurement and calculation, and the validity of the MVP code has been confirmed for fast-neutron transport calculations above 10 MeV in a fission neutron field. (author)
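
    The measured quantity compared with the MVP calculation is the activation reaction rate per target atom, conventionally written as (standard definition, added for orientation)

        R = \int_0^{\infty} \sigma_{act}(E)\,\phi(E)\,dE,

    so threshold activation reactions weight the comparison toward the high-energy part of the spectrum, here the region above 10 MeV.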

  18. Proposal as to Efficient Collection and Exploitation of Earthquake Damage Information and Verification by Field Experiment at Toyohashi City

    Science.gov (United States)

    Zama, Shinsaku; Endo, Makoto; Takanashi, Ken'ichi; Araiba, Kiminori; Sekizawa, Ai; Hosokawa, Masafumi; Jeong, Byeong-Pyo; Hisada, Yoshiaki; Murakami, Masahiro

    Based on the earlier finding that damage information can be gathered quickly in a municipality with a smaller population, it is proposed that damage information be gathered and analyzed using an area roughly equivalent to a primary school district as the basic unit. The introduction of this type of decentralized system is expected to quickly gather important information on each area. The information gathered by these communal disaster-prevention bases is sent to the disaster-prevention headquarters, which in turn feeds back more extensive information over a wider area to the communal disaster-prevention bases. Concrete systems were developed according to the above framework, and we performed large-scale experiments simulating disaster information collection, transmission and utilization for smooth responses to an earthquake disaster, in collaboration with Toyohashi City, Aichi Prefecture, which is considered highly likely to suffer extensive damage from the Tokai and Tonankai Earthquakes. Using disaster information collection/transmission equipment composed of a long-distance wireless LAN, a notebook computer, a Web camera and an IP telephone, city staff could easily input and transmit information such as fires, collapsed houses and impassable roads collected by the inhabitants who participated in the experiment. Headquarters could confirm such information on an automatically plotted map, as well as the state of each disaster-prevention facility, by means of Web cameras and IP telephones. Based on the damage information, fire-spreading, evacuation and traffic simulations were automatically executed at the disaster countermeasure office and their results were displayed on a large screen to support decisions such as residents' evacuation. These simulated results were simultaneously displayed at each disaster-prevention facility and served to make people understand the

  19. Experience of Integrated Safeguards Approach for Large-scale Hot Cell Laboratory

    International Nuclear Information System (INIS)

    Miyaji, N.; Kawakami, Y.; Koizumi, A.; Otsuji, A.; Sasaki, K.

    2010-01-01

    The Japan Atomic Energy Agency (JAEA) has been operating a large-scale hot cell laboratory, the Fuels Monitoring Facility (FMF), located near the experimental fast reactor Joyo at the Oarai Research and Development Center (JNC-2 site). The FMF conducts post irradiation examinations (PIE) of fuel assemblies irradiated in Joyo. The assemblies are disassembled and non-destructive examinations, such as X-ray computed tomography tests, are carried out. Some of the fuel pins are cut into specimens and destructive examinations, such as ceramography and X-ray micro analyses, are performed. Following PIE, the tested material, in the form of a pin or segments, is shipped back to a Joyo spent fuel pond. In some cases, after reassembly of the examined irradiated fuel pins is completed, the fuel assemblies are shipped back to Joyo for further irradiation. For the IAEA to apply the integrated safeguards approach (ISA) to the FMF, a new verification system on material shipping and receiving process between Joyo and the FMF has been established by the IAEA under technical collaboration among the Japan Safeguard Office (JSGO) of MEXT, the Nuclear Material Control Center (NMCC) and the JAEA. The main concept of receipt/shipment verification under the ISA for JNC-2 site is as follows: under the IS, the FMF is treated as a Joyo-associated facility in terms of its safeguards system because it deals with the same spent fuels. Verification of the material shipping and receiving process between Joyo and the FMF can only be applied to the declared transport routes and transport casks. The verification of the nuclear material contained in the cask is performed with the method of gross defect at the time of short notice random interim inspections (RIIs) by measuring the surface neutron dose rate of the cask, filled with water to reduce radiation. The JAEA performed a series of preliminary tests with the IAEA, the JSGO and the NMCC, and confirmed from the standpoint of the operator that this

  20. Circumnutation and its dependence on the gravity response in rice, morning glory and pea plants: verification by spaceflight experiments

    Science.gov (United States)

    Takahashi, Hideyuki; Kobayashi, Akie; Fujii, Nobuharu; Yano, Sachiko; Shimazu, Toru; Kim, Hyejeong; Tomita, Yuuta; Miyazawa, Yutaka

    Plant organs display helical growth movement known as circumnutation. This movement helps plant organs find suitable environmental cues. The amplitude, period and shape of the circumnutation differ depending on plant species or organs. Although the mechanism for circumnutation is unclear, it has long been argued whether circumnutation is involved with gravitropic response. Previously, we showed that shoots of weeping morning glory (we1 and we2) are impaired in not only the differentiation of endodermis (gravisensing cells) and gravitropic response, but also winding and circumnutation (Kitazawa et al., PNAS 102: 18742-18747, 2005). Here, we report a reduced circumnutation in the shoots of rice and the roots of pea mutants defective in gravitropic response. Coleoptiles of clinorotated rice seedlings and decapped roots of pea seedlings also showed a reduction of their circumnutational movement. These results suggest that circumnutation is tightly related with gravitropic response. In the proposed spaceflight experiments, “Plant Rotation”, we will verify the hypothesis that circumnutation requires gravity response, by using microgravity environment in KIBO module of the International Space Station. We will grow rice and morning glory plants under both muG and 1G conditions on orbit and monitor their growth by a camera. The downlinked images will be analyzed for the measurements of plant growth and nutational movements. This experiment will enable us to answer the question whether circumnutation depends on gravity response or not.

  1. CEC thermal-hydraulic benchmark exercise on Fiploc verification experiment F2 in Battelle model containment. Experimental phases 2, 3 and 4. Results of comparisons

    International Nuclear Information System (INIS)

    Fischer, K.; Schall, M.; Wolf, L.

    1993-01-01

    The present final report comprises the major results of Phase II of the CEC thermal-hydraulic benchmark exercise on Fiploc verification experiment F2 in the Battelle model containment, experimental phases 2, 3 and 4, which was organized and sponsored by the Commission of the European Communities for the purpose of furthering the understanding and analysis of long-term thermal-hydraulic phenomena inside containments during and after severe core accidents. This benchmark exercise received high European attention with eight organizations from six countries participating with eight computer codes during phase 2. Altogether 18 results from computer code runs were supplied by the participants and constitute the basis for comparisons with the experimental data contained in this publication. This reflects both the high technical interest in, as well as the complexity of, this CEC exercise. Major comparison results between computations and data are reported on all important quantities relevant for containment analyses during long-term transients. These comparisons comprise pressure, steam and air content, velocities and their directions, heat transfer coefficients and saturation ratios. Agreements and disagreements are discussed for each participating code/institution, conclusions drawn and recommendations provided. The phase 2 CEC benchmark exercise provided an up-to-date state-of-the-art status review of the thermal-hydraulic capabilities of present computer codes for containment analyses. This exercise has shown that all of the participating codes can simulate the important global features of the experiment correctly, like: temperature stratification, pressure and leakage, heat transfer to structures, relative humidity, collection of sump water. Several weaknesses of individual codes were identified, and this may help to promote their development. As a general conclusion it may be said that while there is still a wide area of necessary extensions and improvements, the

  2. Ten Years in the Academic Integrity Trenches: Experiences and Issues

    Science.gov (United States)

    Atkinson, Doug; Nau, S. Zaung; Symons, Christine

    2016-01-01

    In 2016, our university launched its Academic Integrity Program (AIP) in order to promote and protect academic integrity. All commencing students must complete this online AIP within 14 days of starting their course. Satisfactory completion of this module with a test score of 80% is required before students can access their course materials.…

  3. Verification of Chemical Weapons Destruction

    International Nuclear Information System (INIS)

    Lodding, J.

    2010-01-01

    The Chemical Weapons Convention is the only multilateral treaty that bans completely an entire category of weapons of mass destruction under international verification arrangements. Possessor States, i.e. those that have chemical weapons stockpiles at the time of becoming party to the CWC, commit to destroying these. All States undertake never to acquire chemical weapons and not to help other States acquire such weapons. The CWC foresees time-bound chemical disarmament. The deadlines for destruction for early entrants to the CWC are provided in the treaty. For late entrants, the Conference of States Parties intervenes to set destruction deadlines. One of the unique features of the CWC is thus the regime for verifying destruction of chemical weapons. But how can you design a system for verification at military sites, while protecting military restricted information? What degree of assurance is considered sufficient in such circumstances? How do you divide the verification costs? How do you deal with production capability and initial declarations of existing stockpiles? The founders of the CWC had to address these and other challenges in designing the treaty. Further refinement of the verification system has followed since the treaty opened for signature in 1993 and since inspection work was initiated following entry-into-force of the treaty in 1997. Most of this work concerns destruction at the two large possessor States, Russia and the United States. Perhaps some of the lessons learned from the OPCW experience may be instructive in a future verification regime for nuclear weapons. (author)

  4. Software verification for nuclear industry

    International Nuclear Information System (INIS)

    Wilburn, N.P.

    1985-08-01

    The reasons why verification of software products is necessary throughout the software life cycle are considered. Concepts of verification, software verification planning, and some verification methodologies for products generated throughout the software life cycle are then discussed

  5. Overview of Battelle-Europe experiments and model development on H2-deflagrations, igniter and catalytic device verifications

    International Nuclear Information System (INIS)

    Wolf, L.; Kanzleiter, T.; Behrens, U.; Langer, G.; Fischer, K.; Holzbauer, H.; Seidler, M.; Wolff, U.

    1991-01-01

    The design and implementation of an optimal mitigation strategy against the potential threat by H2-deflagrations/detonations in the aftermath of least probable severe accidents necessitates a knowledge base about these phenomena for representative conditions and realistic geometries. In order to extend existing know-how into the area of multi-compartment geometries connected by vents, the German Ministry of Research and Technology (BMFT) initiated in 1987 and fully sponsored since 1988 a multi-faceted experimental research program on H2-deflagrations and related issues in multi-compartment geometries in the Battelle Model Containment (BMC). The following test groups constitute the Battelle program: Test Group H: basic tests in H2-air and H2-steam-air mixtures; Test Group I (sponsored by industry): supplemental tests in H2-air and H2-steam-air atmospheres; Test Group G: efficiency of H2-mitigative measures such as igniters, catalyst modules and combinations thereof. Thus far, a total of 70 experiments have been performed and evaluated. The paper gives an overview of the Battelle program by reviewing the examined geometries, bandwidths of tested H2 concentrations, steam concentrations, ignition locations, specific atmospheric conditions (homogeneous, stratified) and other important parameters varied during and among the different test groups H, G, and I. Some experimental findings are reported.

  6. Verification and disarmament

    Energy Technology Data Exchange (ETDEWEB)

    Blix, H. [IAEA, Vienna (Austria)

    1998-07-01

    The main features of the IAEA safeguards verification system that non-nuclear weapon states parties to the NPT are obliged to accept are described. Verification activities/problems in Iraq and North Korea are discussed.

  7. Verification and disarmament

    International Nuclear Information System (INIS)

    Blix, H.

    1998-01-01

    The main features of the IAEA safeguards verification system that non-nuclear weapon states parties to the NPT are obliged to accept are described. Verification activities/problems in Iraq and North Korea are discussed.

  8. An investigation of strategies for integrated learning experiences ...

    African Journals Online (AJOL)

    (NCS, 2002) integrated music, dance, drama and visual arts where possible, while ... for the development of literacy skills, the latter term used by Wagner, ..... for movement ..... Qualitative investigation of young children's music preferences.

  9. Integrated Advanced Microwave Sounding Unit-A (AMSU-A). Performance Verification Report: Initial Comprehensive Performance Test Report, P/N 1331200-2-IT, S/N 105/A2

    Science.gov (United States)

    Platt, R.

    1999-01-01

    This is the Performance Verification Report, Initial Comprehensive Performance Test Report, P/N 1331200-2-IT, S/N 105/A2, for the Integrated Advanced Microwave Sounding Unit-A (AMSU-A). The specification establishes the requirements for the Comprehensive Performance Test (CPT) and Limited Performance Test (LPT) of the Advanced Microwave Sounding Unit-A2 (AMSU-A2), referred to herein as the unit. The unit is defined on Drawing 1331200. Section 1.2, test procedure sequence: the sequence in which the several phases of this test procedure take place is shown in Figure 1, but the phases may be performed in any order.

  10. Challenges for effective WMD verification

    International Nuclear Information System (INIS)

    Andemicael, B.

    2006-01-01

    already awash in fissile material and is increasingly threatened by the possible consequences of illicit trafficking in such material. The chemical field poses fewer problems. The ban on chemical weapons is a virtually complete post-Cold War regime, with state-of-the-art concepts and procedures of verification resulting from decades of negotiation. The detection of prohibited materials and activities is the common goal of the nuclear and chemical regimes for which the most intrusive and intensive procedures are activated by the three organizations. Accounting for the strictly peaceful application of dual-use items constitutes the bulk of the work of the inspectorates at the IAEA and the OPCW. A common challenge in both fields is the advance of science and technology in the vast nuclear and chemical industries and the ingenuity of some determined proliferators to deceive by concealing illicit activities under legitimate ones. Inspection procedures and technologies need to keep up with the requirement for flexibility and adaptation to change. The common objective of the three organizations is to assemble and analyze all relevant information in order to conclude reliably whether a State is or is not complying with its treaty obligations. The positive lessons learned from the IAEA's verification experience today are valuable in advancing concepts and technologies that might also benefit the other areas of WMD verification. Together with the emerging, more comprehensive verification practice of the OPCW, they may provide a useful basis for developing common standards, which may in turn help in evaluating the cost-effectiveness of verification methods for the Biological and Toxin Weapons Convention and other components of a WMD control regime

  11. Summarisation of construction and commissioning experience for nuclear power integrated test facility

    International Nuclear Information System (INIS)

    Xiao Zejun; Jia Dounan; Jiang Xulun; Chen Bingde

    2003-01-01

    Since its foundation, the Nuclear Power Institute of China has designed various engineering experimental facilities, constructed a nuclear power experimental research base, and accumulated rich experience in the construction of nuclear power integrated test facilities. The author presents experience on the design, construction and commissioning of the nuclear power integrated test facility.

  12. Review of recent benchmark experiments on integral test for high energy nuclear data evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Nakashima, Hiroshi; Tanaka, Susumu; Konno, Chikara; Fukahori, Tokio; Hayashi, Katsumi [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1998-11-01

    A survey of recent benchmark experiments on integral tests for high energy nuclear data evaluation was carried out as part of the work of the Task Force on JENDL High Energy File Integral Evaluation (JHEFIE). In this paper the results are compiled and the status of recent benchmark experiments is described. (author)

  13. The use of gold markers and electronic portal imaging for radiotherapy verification in prostate cancer patients: Sweden Ghana Medical Centre experience

    Directory of Open Access Journals (Sweden)

    George Felix Acquah

    2014-02-01

    The success of radiotherapy cancer treatment delivery depends on the accuracy of patient setup for each fraction. A significant problem arises from reproducing the same patient position and prostate location established at treatment planning for every fraction of the treatment process. To analyze the daily movements of the prostate, gold markers are implanted in the prostate and portal images are taken and manually matched with reference images to locate the prostate. Geometrical and fiducial markers are annotated onto high-quality digitally reconstructed radiographs, which are compared with portal images acquired right before treatment dose delivery. Treatment fields at 0 and 270 degrees are used to calculate prostate shifts for all prostate cancer patients undergoing treatment at the Sweden Ghana Medical Centre, using an iViewGT portal imaging device. After aligning the marker positions onto the reference images, the set-up deviation corrections are displayed and an on-line correction procedure is applied. The measured migrations of the prostate markers are below the threshold of 3 mm for the main plans and 2 mm for the boost plans. Daily electronic portal imaging combined with gold markers provides an objective method for verifying and correcting the position of the prostate immediately prior to radiation delivery. Cite this article as: Acquah GF. The use of gold markers and electronic portal imaging for radiotherapy verification in prostate cancer patients: Sweden Ghana Medical Centre experience. Int J Cancer Ther Oncol 2014; 2(1):020112. DOI: http://dx.doi.org/10.14319/ijcto.0201.12
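    As a rough illustration of how two orthogonal portal images can yield a set-up correction, the sketch below averages marker displacements from an anterior (0 degree) and a lateral (270 degree) view and flags a correction when any component exceeds an action threshold. The coordinate convention, function name, marker coordinates and the 3 mm threshold are illustrative assumptions (the threshold only echoes the abstract); this is not the centre's actual software.

```python
# Minimal sketch of deriving set-up corrections from two orthogonal portal
# images, assuming marker positions have already been matched to the DRR
# reference.  Conventions, names and numbers are illustrative only.
import numpy as np

def couch_shift(ref_ant, img_ant, ref_lat, img_lat, threshold_mm=3.0):
    """ref_*/img_* are (N, 2) arrays of marker positions in mm.
    The anterior view resolves lateral (x) and cranio-caudal (z) offsets;
    the lateral view resolves anterior-posterior (y) and cranio-caudal (z)."""
    d_ant = (np.asarray(img_ant) - np.asarray(ref_ant)).mean(axis=0)
    d_lat = (np.asarray(img_lat) - np.asarray(ref_lat)).mean(axis=0)
    shift = {
        "lateral": d_ant[0],
        "cranio-caudal": 0.5 * (d_ant[1] + d_lat[1]),  # seen in both views
        "ant-post": d_lat[0],
    }
    # apply an on-line correction only when any component exceeds the threshold
    correct = any(abs(v) > threshold_mm for v in shift.values())
    return shift, correct

shift, correct = couch_shift(
    ref_ant=[(12.0, -3.0), (8.0, 5.0), (15.0, 1.0)],
    img_ant=[(13.1, -2.6), (9.2, 5.5), (16.0, 1.4)],
    ref_lat=[(-20.0, -3.0), (-24.0, 5.0), (-18.0, 1.0)],
    img_lat=[(-19.2, -2.5), (-23.1, 5.6), (-17.3, 1.5)],
)
print(shift, "apply correction:", correct)
```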

  14. Experience of developing an integrated nondestructive assay system

    International Nuclear Information System (INIS)

    Hsue, S.T.; Baker, M.P.

    1987-01-01

    A consortium of laboratories is collaborating with the Savannah River Plant to develop an integrated system of state-of-the-art nondestructive assay (NDA) instrumentation to provide nuclear materials accounting and process control information for a new plutonium scrap recovery facility. Individual instruments report assay results to an instrument control computer (ICC); the ICC, in turn, is part of a larger computer network that includes computers that perform process control and materials accounting functions. The design of the integrated NDA measurement system is shown. Each NDA instrument that is part of the integrated system is microcomputer-based and thus is capable of stand-alone operation if the central computer is out of service. Certain hardware features, such as microcomputers, pulse processing modules, and multichannel analyzers, are standardized throughout the system. Another standard feature is the communication between individual NDA instruments and the ICC. The most unique phase of the project is the integral staging. The primary purpose of this phase is to check the communications between various computers and to verify the ICC software during the operation of the NDA instruments. Implementing this integrated system in a process environment represents a major step in realizing the full capabilities of modern NDA instrumentation
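    A minimal sketch of the stand-alone behaviour described above: an instrument reports each assay result to the instrument control computer (ICC) when a link is available and buffers results locally when it is not. Class names, the message format and the example values are assumptions for illustration, not the Savannah River implementation.

```python
# Minimal sketch (assumption, not the actual design) of an NDA instrument
# that reports assay results to a central instrument control computer (ICC)
# but keeps working stand-alone when the ICC is out of service.
import json
import queue

class NDAInstrument:
    def __init__(self, instrument_id, icc_link=None):
        self.instrument_id = instrument_id
        self.icc_link = icc_link           # callable that sends a message, or None
        self.local_buffer = queue.Queue()  # results held during stand-alone operation

    def report(self, item_id, pu_grams, uncertainty):
        msg = json.dumps({"instrument": self.instrument_id, "item": item_id,
                          "pu_g": pu_grams, "sigma_g": uncertainty})
        try:
            if self.icc_link is None:
                raise ConnectionError("ICC out of service")
            self.icc_link(msg)             # normal integrated operation
        except ConnectionError:
            self.local_buffer.put(msg)     # stand-alone fallback

    def flush(self, icc_link):
        """Forward buffered results once the ICC is back in service."""
        while not self.local_buffer.empty():
            icc_link(self.local_buffer.get())

received = []
inst = NDAInstrument("HLNC-01")            # hypothetical counter, no ICC link yet
inst.report("CAN-0042", 153.2, 1.8)        # buffered locally
inst.flush(received.append)                # later, ICC link restored
print(received)
```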

  15. Vertical integration of medical education: Riverland experience, South Australia.

    Science.gov (United States)

    Rosenthal, D R; Worley, P S; Mugford, B; Stagg, P

    2004-01-01

    Vertical integration of medical education is currently a prominent international topic, resulting from recent strategic initiatives to improve medical education and service delivery in areas of poorly met medical need. In this article, vertical integration of medical education is defined as 'a grouping of curricular content and delivery mechanisms, traversing the traditional boundaries of undergraduate, postgraduate and continuing medical education, with the intent of enhancing the transfer of knowledge and skills between those involved in the learning-teaching process'. Educators closely involved with vertically integrated teaching in the Riverland of South Australia present an analytical description of the educational dynamics of this system. From this analysis, five elements are identified which underpin the process of successful vertical integration: (1) raised educational stakes; (2) local ownership; (3) broad university role; (4) longer attachments; and (5) shared workforce vision. Given the benefits to the Riverland medical education programs described in this paper, it is not surprising that vertical integration of medical education is a popular goal in many rural regions throughout the world. Although different contexts will result in different functional arrangements, it could be argued that the five principles outlined in this article can be applied in any region.

  16. VEG-01: Veggie Hardware Verification Testing

    Science.gov (United States)

    Massa, Gioia; Newsham, Gary; Hummerick, Mary; Morrow, Robert; Wheeler, Raymond

    2013-01-01

    The Veggie plant/vegetable production system is scheduled to fly on ISS at the end of 2013. Since much of the technology associated with Veggie has not been previously tested in microgravity, a hardware validation flight was initiated. This test will allow data to be collected about Veggie hardware functionality on ISS, allow crew interactions to be vetted for future improvements, validate the ability of the hardware to grow and sustain plants, and collect data that will be helpful to future Veggie investigators as they develop their payloads. Additionally, food safety data on the lettuce plants grown will be collected to help support the development of a pathway for the crew to safely consume produce grown on orbit. Significant background research has been performed on the Veggie plant growth system, with early tests focusing on the development of the rooting pillow concept, and the selection of fertilizer, rooting medium and plant species. More recent testing has been conducted to integrate the pillow concept into the Veggie hardware and to ensure that adequate water is provided throughout the growth cycle. Seed sanitation protocols have been established for flight, and hardware sanitation between experiments has been studied. Methods for shipping and storage of rooting pillows and the development of crew procedures and crew training videos for plant activities on-orbit have been established. Science verification testing was conducted and lettuce plants were successfully grown in prototype Veggie hardware, microbial samples were taken, plants were harvested, frozen, stored and later analyzed for microbial growth, nutrients, and ATP levels. An additional verification test, prior to the final payload verification testing, is desired to demonstrate similar growth in the flight hardware and also to test a second set of pillows containing zinnia seeds. Issues with root mat water supply are being resolved, with final testing and flight scheduled for later in 2013.

  17. IBERINCO in the foreign trade: the integration experience

    International Nuclear Information System (INIS)

    Lopez, F. J.; Sanchez Mayoral, M. L.; Garcia, P. J.

    2002-01-01

    The universal tendencies show that nations integrate into large communities to strengthen one another and to share cultures and economies. Integration goes beyond free-trade treaties and the opening of import and export trade to the shaping of unified policies on the private sector, and it reaches into daily life and the practice of organisations, generating substantial changes in people's way of life. The companies that survive in the XXI century will not operate in isolation but will have to be able to integrate products and services from other sources, adding the value obtained from their core skills, surrounding their products with others not specific to their business, and offering their customers well-defined products and services that make them more valuable and distinct from their competitors. (Author)

  18. Integrated project management information systems: the French nuclear industry experience

    International Nuclear Information System (INIS)

    Jacquin, J.-C.; Caupin, G.-M.

    1990-01-01

    The article discusses the desirability of integrated project management systems within the French nuclear power industry. Change in demand for nuclear generation facilities over the last two decades has necessitated a change of policy concerning organization, cost and planning within the industry. Large corporate systems can benefit from integrating equipment and bulk materials tracking. Project management for the nuclear industry will, in future, need to incorporate computer aided design tools and project management information systems data bases as well as equipment and planning data. (UK)

  19. Integrated project management information systems: the French nuclear industry experience

    Energy Technology Data Exchange (ETDEWEB)

    Jacquin, J.-C.; Caupin, G.-M.

    1990-03-01

    The article discusses the desirability of integrated project management systems within the French nuclear power industry. Change in demand for nuclear generation facilities over the last two decades has necessitated a change of policy concerning organization, cost and planning within the industry. Large corporate systems can benefit from integrating equipment and bulk materials tracking. Project management for the nuclear industry will, in future, need to incorporate computer aided design tools and project management information systems data bases as well as equipment and planning data. (UK).

  20. Verification and the safeguards legacy

    International Nuclear Information System (INIS)

    Perricos, Demetrius

    2001-01-01

    A number of inspection or monitoring systems throughout the world over the last decades have been structured drawing upon the IAEA experience of setting up and operating its safeguards system. The first global verification system was born with the creation of the IAEA safeguards system, about 35 years ago. With the conclusion of the NPT in 1968, inspections were to be performed under safeguards agreements, concluded directly between the IAEA and non-nuclear weapon states parties to the Treaty. The IAEA developed the safeguards system within the limitations reflected in the Blue Book (INFCIRC 153), such as limitations of routine access by the inspectors to 'strategic points', including 'key measurement points', and the focusing of verification on declared nuclear material in declared installations. The system, based as it was on nuclear material accountancy, was expected to detect a diversion of nuclear material with a high probability and within a given time, and therefore also to determine that there had been no diversion of nuclear material from peaceful purposes. The most vital element of any verification system is the inspector. Technology can assist but cannot replace the inspector in the field. Their experience, knowledge, intuition and initiative are invaluable factors contributing to the success of any inspection regime. The IAEA inspectors are however not part of an international police force that will intervene to prevent a violation taking place. To be credible they should be technically qualified with substantial experience in industry or in research and development before they are recruited. An extensive training program has to make sure that the inspectors retain their professional capabilities and that it provides them with new skills. Over the years, the inspectors and through them the safeguards verification system gained experience in: organization and management of large teams; examination of records and evaluation of material balances

  1. A SURVEY ON INDIAN EXPERIENCE ON INTEGRATED MANAGEMENT STANDARDS (IMS

    Directory of Open Access Journals (Sweden)

    H. Khanna

    2009-09-01

    Adoption of management systems standards is a key issue in manufacturing industry in India. Following the global trend, quality and environmental issues are gaining importance. However, the number of ISO 14001 certified companies is much lower in India as compared to ISO 9001. The integration of ISO 14001 with ISO 9001 may help companies to sustain competitive advantage and overcome disappointments with quality standards and in turn encourage companies to adopt good environmental practices. The aim of this research is to study the implementation of integrated management standards (IMS) by the manufacturing organizations in India. The different aspects of integration and benefits of IMS implementation are analyzed. This research is based on an empirical study carried out in Indian manufacturing firms, involving the application of a questionnaire. This questionnaire was tested on 50 manufacturing companies in India. The study reveals that focus on stakeholders, top management commitment and training are critical success factors for implementation of IMS. The main benefits of integration are discussed. The small sample size is one of the major limitations of this study. The paper informs the managers in manufacturing organizations and practitioners of management system standards, especially in developing countries, about IMS and will enable them to adopt IMS in future so that those organizations may not implement multiple and overlapping MSS (Management System Standards).

  2. An investigation of strategies for integrated learning experiences ...

    African Journals Online (AJOL)

    Pre-determined and emergent codes based on grounded theory showed that it is possible to integrate theory with practice within one art subject by teaching theoretical work in the context of practical work, thus optimizing the limited time allocated to arts and culture education in school timetables. Keywords: arts; arts and ...

  3. Clinical experience of integrative cancer immunotherapy with GcMAF.

    Science.gov (United States)

    Inui, Toshio; Kuchiike, Daisuke; Kubo, Kentaro; Mette, Martin; Uto, Yoshihiro; Hori, Hitoshi; Sakamoto, Norihiro

    2013-07-01

    Immunotherapy has become an attractive new strategy in the treatment of cancer. The laboratory and clinical study of cancer immunotherapy is rapidly advancing. However, in the clinical setting, the results of cancer immunotherapy are mixed. We therefore contend that cancer immunotherapy should be customized to each patient individually based on their immune status and propose an integrative immunotherapy approach with second-generation group-specific component macrophage activating factor (GcMAF)-containing human serum. The standard protocol of our integrative cancer immunotherapy is as follows: i) 0.5 ml GcMAF-containing human serum is administered intramuscularly or subcutaneously once or twice per week for the duration of cancer therapy until all cancer cells are eradicated; ii) hyper T/natural killer (NK) cell therapy is given once per week for six weeks; iii) high-dose vitamin C is administered intravenously twice per week; iv) alpha lipoic acid (600 mg) is administered orally daily; v) vitamin D3 (5,000-10,000 IU) is administered orally daily. By March 2013, Saisei Mirai had treated over 345 patients with GcMAF. Among them we here present the cases of three patients for whom our integrative immunotherapy was remarkably effective. The results of our integrative immunotherapy seem hopeful. We also plan to conduct a comparative clinical study.

  4. Integrated product development and experience of communication in education

    NARCIS (Netherlands)

    H. Ihle; Ir. Dick van Schenk Brill; Ir. Peter van Kollenburg; Ir. H.E.V. Veenstra

    2000-01-01

    The Technical Departments at the Fontys University of Professional Education in Eindhoven, The Netherlands, offer a course which is developed around the principles of Concurrent Engineering. Integrated Product Development (IPD) project teams are multi-disciplinary groups which develop products in

  5. Experiences with an integrated management system for aircraft maintenance

    International Nuclear Information System (INIS)

    Huber, U.

    1993-01-01

    For 20 years, SWISSAIR has employed an integrated information system for aircraft maintenance. To date, a wide range of functions has been developed in-house. For the future, SWISSAIR is increasingly basing its approach on the use of SAP/standard software packages. 10 figs

  6. Integrating family planning into HIV care in western Kenya: HIV care providers' perspectives and experiences one year following integration.

    Science.gov (United States)

    Newmann, Sara J; Zakaras, Jennifer M; Tao, Amy R; Onono, Maricianah; Bukusi, Elizabeth A; Cohen, Craig R; Steinfeld, Rachel; Grossman, Daniel

    2016-01-01

    With high rates of unintended pregnancy in sub-Saharan Africa, integration of family planning (FP) into HIV care is being explored as a strategy to reduce unmet need for contraception. Perspectives and experiences of healthcare providers are critical in order to create sustainable models of integrated care. This qualitative study offers insight into how HIV care providers view and experience the benefits and challenges of providing integrated FP/HIV services in Nyanza Province, Kenya. Sixteen individual interviews were conducted among healthcare workers at six public sector HIV care facilities one year after the implementation of integrated FP and HIV services. Data were transcribed and analyzed qualitatively using grounded theory methods and Atlas.ti. Providers reported a number of benefits of integrated services that they believed increased the uptake and continuation of contraceptive methods. They felt that integrated services enabled them to reach a larger number of female and male patients and in a more efficient way for patients compared to non-integrated services. Availability of FP services in the same place as HIV care also eliminated the need for most referrals, which many providers saw as a barrier for patients seeking FP. Providers reported many challenges to providing integrated services, including the lack of space, time, and sufficient staff, inadequate training, and commodity shortages. Despite these challenges, the vast majority of providers was supportive of FP/HIV integration and found integrated services to be beneficial to HIV-infected patients. Providers' concerns relating to staffing, infrastructure, and training need to be addressed in order to create sustainable, cost-effective FP/HIV integrated service models.

  7. A Synthesized Framework for Formal Verification of Computing Systems

    Directory of Open Access Journals (Sweden)

    Nikola Bogunovic

    2003-12-01

    The design process of computing systems has gradually evolved to a level that encompasses formal verification techniques. However, the integration of formal verification techniques into a methodical design procedure has many inherent miscomprehensions and problems. The paper explicates the discrepancy between the real system implementation and the abstracted model that is actually used in the formal verification procedure. Particular attention is paid to the seamless integration of all phases of the verification procedure, which encompasses definition of the specification language and the denotation and execution of the conformance relation between the abstracted model and its intended behavior. The concealed obstacles are exposed, computationally expensive steps identified and possible improvements proposed.
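    To make the distinction between the real implementation and the abstracted model concrete, here is a minimal sketch of an explicit-state check of an invariant over a small abstracted transition system; the two-process mutex model, function names and state encoding are illustrative assumptions, not the framework proposed in the paper.

```python
# Minimal sketch: verification runs on an abstracted model, not on the real
# implementation.  A tiny transition system is explored exhaustively to check
# an invariant; names and the model are illustrative.
from collections import deque

def check_invariant(initial, transitions, invariant):
    """Breadth-first reachability over an abstracted state space; returns a
    counterexample path to a violating state, or None if the invariant holds."""
    frontier = deque([(initial, [initial])])
    visited = {initial}
    while frontier:
        state, path = frontier.popleft()
        if not invariant(state):
            return path
        for nxt in transitions(state):
            if nxt not in visited:
                visited.add(nxt)
                frontier.append((nxt, path + [nxt]))
    return None

# Abstracted model of a two-process mutex: state = (pc0, pc1), pc in {idle, wait, crit}
def transitions(state):
    step = {"idle": "wait", "wait": "crit", "crit": "idle"}
    moves = []
    for i, pc in enumerate(state):
        if pc == "wait" and state[1 - i] == "crit":
            continue  # abstraction of the lock: cannot enter while the other is critical
        nxt = list(state)
        nxt[i] = step[pc]
        moves.append(tuple(nxt))
    return moves

cex = check_invariant(("idle", "idle"), transitions,
                      invariant=lambda s: s != ("crit", "crit"))
print("invariant holds" if cex is None else f"counterexample: {cex}")
```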

  8. Formal verification of algorithms for critical systems

    Science.gov (United States)

    Rushby, John M.; Von Henke, Friedrich

    1993-01-01

    We describe our experience with formal, machine-checked verification of algorithms for critical applications, concentrating on a Byzantine fault-tolerant algorithm for synchronizing the clocks in the replicated computers of a digital flight control system. First, we explain the problems encountered in unsynchronized systems and the necessity, and criticality, of fault-tolerant synchronization. We give an overview of one such algorithm, and of the arguments for its correctness. Next, we describe a verification of the algorithm that we performed using our EHDM system for formal specification and verification. We indicate the errors we found in the published analysis of the algorithm, and other benefits that we derived from the verification. Based on our experience, we derive some key requirements for a formal specification and verification system adequate to the task of verifying algorithms of the type considered. Finally, we summarize our conclusions regarding the benefits of formal verification in this domain, and the capabilities required of verification systems in order to realize those benefits.
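    A toy sketch in the spirit of the fault-tolerant averaging algorithms discussed here (not the verified algorithm itself): each non-faulty clock replaces any reading that differs from its own by more than a window delta with its own value before averaging, which bounds the influence of a Byzantine-faulty clock. All names and numbers below are illustrative assumptions.

```python
# Toy sketch of a fault-tolerant resynchronisation step: clip readings that
# differ from the reader's own clock by more than delta, then average.
def resync(own, readings, delta):
    """own: this clock's value; readings: values read from all clocks
    (including its own); delta: maximum credible skew."""
    clipped = [r if abs(r - own) <= delta else own for r in readings]
    return sum(clipped) / len(clipped)

clocks = [100.0, 100.4, 99.7, 250.0]   # last clock is faulty/Byzantine
delta = 1.0
new = [resync(c, clocks, delta) for c in clocks[:3]]  # non-faulty clocks resync
print(new)  # the good clocks remain within delta of one another
```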

  9. Operational experience with the Sizewell B integrated plant computer system

    International Nuclear Information System (INIS)

    Ladner, J.E.J.; Alexander, N.C.; Fitzpatrick, J.A.

    1997-01-01

    The Westinghouse Integrated System for Centralised Operation (WISCO) is the primary plant control system at the Sizewell B Power Station. It comprises three subsystems; the High Integrity Control System (HICS), the Process Control System (PCS) and the Distributed Computer system (DCS). The HICS performs the control and data acquisition of nuclear safety significant plant systems. The PCS uses redundant data processing unit pairs. The workstations and servers of the DCS communicate with each other over a standard ethernet. The maintenance requirements for every plant system are covered by a Maintenance Strategy Report. The breakdown of these reports is listed. The WISCO system has performed exceptionally well. Due to the diagnostic information presented by the HICS, problems could normally be resolved within 24 hours. There have been some 200 outstanding modifications to the system. The procedure of modification is briefly described. (A.K.)

  10. Experience with Intel's Many Integrated Core Architecture in ATLAS Software

    CERN Document Server

    Fleischmann, S; The ATLAS collaboration; Lavrijsen, W; Neumann, M; Vitillo, R

    2014-01-01

    Intel recently released the first commercial boards of its Many Integrated Core (MIC) Architecture. MIC is Intel's solution for the domain of throughput computing, currently dominated by general purpose programming on graphics processors (GPGPU). MIC allows the use of the more familiar x86 programming model and supports standard technologies such as OpenMP, MPI, and Intel's Threading Building Blocks. This should make it possible to develop for both throughput and latency devices using a single code base.

  11. Experience with Intel's Many Integrated Core Architecture in ATLAS Software

    CERN Document Server

    Fleischmann, S; The ATLAS collaboration; Lavrijsen, W; Neumann, M; Vitillo, R

    2013-01-01

    Intel recently released the first commercial boards of its Many Integrated Core (MIC) Architecture. MIC is Intel's solution for the domain of throughput computing, currently dominated by general purpose programming on graphics processors (GPGPU). MIC allows the use of the more familiar x86 programming model and supports standard technologies such as OpenMP, MPI, and Intel's Threading Building Blocks. This should make it possible to develop for both throughput and latency devices using a single code base.

  12. Integration of microbial biopesticides in greenhouse floriculture: The Canadian experience.

    Science.gov (United States)

    Brownbridge, Michael; Buitenhuis, Rose

    2017-11-28

    Historically, greenhouse floriculture has relied on synthetic insecticides to meet its pest control needs. But, growers are increasingly faced with the loss or failure of synthetic chemical pesticides, declining access to new chemistries, stricter environmental/health and safety regulations, and the need to produce plants in a manner that meets the 'sustainability' demands of a consumer driven market. In Canada, reports of thrips resistance to spinosad (Success™) within 6-12 months of its registration prompted a radical change in pest management philosophy and approach. Faced with a lack of registered chemical alternatives, growers turned to biological control out of necessity. Biological control now forms the foundation for pest management programs in Canadian floriculture greenhouses. Success in a biocontrol program is rarely achieved through the use of a single agent, though. Rather, it is realized through the concurrent use of biological, cultural and other strategies within an integrated plant production system. Microbial insecticides can play a critical supporting role in biologically-based integrated pest management (IPM) programs. They have unique modes of action and are active against a range of challenging pests. As commercial microbial insecticides have come to market, research to generate efficacy data has assisted their registration in Canada, and the development and adaptation of integrated programs has promoted uptake by floriculture growers. This review documents some of the work done to integrate microbial insecticides into chrysanthemum and poinsettia production systems, outlines current use practices, and identifies opportunities to improve efficacy in Canadian floriculture crops. Copyright © 2017 Elsevier Inc. All rights reserved.

  13. CERCLA integration with site operations the Fernald experience

    International Nuclear Information System (INIS)

    Coyle, S.W.; Shirley, R.S.; Varchol, B.D.

    1991-01-01

    A major transition in the Fernald Environmental Management Project (FEMP) site mission has occurred over the past few years. The production capabilities formerly provided by the FEMP are being transferred to private industry through a vendor qualification program. Environmental compliance and site cleanup are now the primary focus. In line with this program, the production of uranium products at the site was suspended in July 1989 in order to concentrate resources on the environmental mission. Formal termination of the FEMP production mission was accomplished on June 19, 1991. Environmental issues such as stored inventories of process residues, materials and equipment are being addressed under the Comprehensive Environmental Response, Compensation and Liability Act (CERCLA). The diversity of these hazards complicates the strategic planning for an integrated site cleanup program. The FEMP is one of the first Department of Energy (DOE) facilities to transition from an active production mission guided by Defense Programs (DP) to an environmental mission guided by Environmental Management (EM) under Leo Duffy. Westinghouse Environmental Management Company of Ohio (WEMCO) has been charged with integrating all site activities to carry out the cleanup. A new management structure has been formulated, and an integration approach initiated. Analyses are under way to evaluate all site activities such as waste management, safe shutdown, product material disposition and routine environmental monitoring in view of CERCLA requirements. Site activities are being broken down into three categories: (a) CERCLA driven - restoration work required under CERCLA, (b) CERCLA covered - other environmental requirements which must be integrated with CERCLA, and (c) CERCLA exempt (if any). The approach to comply with these categorized activities must be negotiated with state and federal regulatory agencies

  14. A Systematic Method for Verification and Validation of Gyrokinetic Microstability Codes

    Energy Technology Data Exchange (ETDEWEB)

    Bravenec, Ronald [Fourth State Research, Austin, TX (United States)

    2017-11-14

    My original proposal for the period Feb. 15, 2014 through Feb. 14, 2017 called for an integrated validation and verification effort carried out by myself with collaborators. The validation component would require experimental profile and power-balance analysis. In addition, it would require running the gyrokinetic codes varying the input profiles within experimental uncertainties to seek agreement with experiment before discounting a code as invalidated. Therefore, validation would require a major increase of effort over my previous grant periods which covered only code verification (code benchmarking). Consequently, I had requested full-time funding. Instead, I am being funded at somewhat less than half time (5 calendar months per year). As a consequence, I decided to forego the validation component and to only continue the verification efforts.
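    The validation loop described above can be sketched as follows: vary the input profile gradients within assumed experimental uncertainties and declare disagreement only if no variation brings the prediction within the measured error bars. The stand-in flux model, scale factors and uncertainty values below are illustrative assumptions, not gyrokinetic results.

```python
# Minimal sketch of validation within experimental uncertainties.  The
# "code" here is a cheap placeholder, not a gyrokinetic simulation.
import itertools

def predicted_flux(grad_T_scale, grad_n_scale):
    # placeholder for an expensive gyrokinetic run at scaled profile gradients
    return 2.0 * grad_T_scale ** 3 + 0.5 * grad_n_scale

measured_flux, measured_sigma = 2.1, 0.3
scales = [0.9, 1.0, 1.1]          # assumed +/-10% profile-gradient uncertainty

consistent = any(
    abs(predicted_flux(ts, ns) - measured_flux) <= measured_sigma
    for ts, ns in itertools.product(scales, scales)
)
print("code consistent with experiment within uncertainties:", consistent)
```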

  15. Integral treatment of children with dyslexia - 40 years experience

    Directory of Open Access Journals (Sweden)

    Stošljević Miodrag

    2012-01-01

    Introduction. Dyslexia represents a significant pediatric problem requiring prompt and appropriate treatment. Objective. The aim of this study was to examine the significance of an integral rehabilitation approach in treating dyslexia in children. Methods. Objectives of the study were accomplished on a sample of 300 children, aged 11-15 years, with etiologically variable dyslexia. Results. The results gained from the integral treatment of children with dyslexia were more successful than those obtained from isolated logopedic treatment, when compared in 10 of the 15 examined variables: replacement of graphically similar letters (p=0.000), replacement of syllables (p=0.010), replacement of words - guessing (p=0.019), structural errors - displacement or insertion (p=0.038), adding letters and syllables (p=0.001), repeating of word parts (p=0.001), reading of a word in several wrong ways (p=0.001), omission of words and whole lines (p=0.000), returning to already read line (p=0.000), level of dyslexia (p=0.000). Conclusion. Dyslexia requires a multidisciplinary therapeutic approach in which integral rehabilitation treatment has an exceptionally large significance.

  16. Integrative Review of Qualitative Research on the Emotional Experience of Bullying Victimization in Youth

    Science.gov (United States)

    Hutson, Elizabeth

    2018-01-01

    The emotional experience of bullying victimization in youths has been documented primarily using quantitative methods; however, qualitative methods may be better suited to examine the experience. An integrative review of the qualitative method studies addressing the emotional experience of bullying victimization was conducted. From MEDLINE,…

  17. Sustainable Development Impacts of Nationally Appropriate Mitigation Actions: An integrated approach to assessment of co-benefits based on experience with the Clean Development Mechanism

    DEFF Research Database (Denmark)

    Olsen, Karen Holm

    … to assess the SD impacts of NAMAs. This paper argues for a new integrated approach to assess NAMAs' SD impacts that consists of SD indicators, procedures for stakeholder involvement and safeguards against negative impacts. The argument is based on a review of experience with the CDM's contribution to SD …, particularly how a combined process and results approach known from the CDM SD Tool can be applied to develop a strong approach for SD assessment of NAMAs, based on a comparison of similarities and differences between NAMAs and CDM. Five elements of a new approach towards assessment of NAMAs' SD impacts … are suggested, based on emerging approaches and methodologies for monitoring, reporting and verification (MRV) of greenhouse gas reductions and SD impacts of NAMAs.

  18. Enhancing user experience design with an integrated storytelling method

    NARCIS (Netherlands)

    Peng, Qiong; Matterns, Jean Bernard; Marcus, A.

    2016-01-01

    Storytelling has been known as a service design method and been used broadly not only in service design but also in the context of user experience design. However, practitioners cannot yet fully appreciate the benefits of storytelling, and often confuse storytelling with storyboarding and scenarios.

  19. An Integral, Multidisciplinary and Global Geophysical Field Experience for Undergraduates

    Science.gov (United States)

    Vázquez, O.; Carrillo, D. J.; Pérez-Campos, X.

    2007-05-01

    The undergraduate program of Geophysical Engineering at the School of Engineering of the Universidad Nacional Autónoma de México (UNAM) went through an update process that concluded in 2006. As part of the program, the student takes three geophysical prospecting courses (gravity and magnetics, electric, electromagnetics, and seismic methods). The older program required a three-week field experience for each course in order to graduate. The new program considers only one extended field experience. This work stresses the importance of international academic exchange in which undergraduate students could participate, such as the Summer of Applied Geophysical Experience (SAGE), and of interaction with research programs, such as the MesoAmerican Subduction Experiment (MASE). We also propose a scheme for this activity based on those examples; both of them have in common real geophysical problems, from which students could benefit. Our proposal covers academic and logistic aspects to be taken into account, enhancing the relevance of interaction between other academic institutions, industry, and UNAM, in order to obtain a broader view of geophysics.

  20. The Developmental Impact of Not Integrating Childhood Peak Experiences

    Science.gov (United States)

    Schlarb, Craig W.

    2007-01-01

    Much prior groundbreaking research has been done in recent years highlighting the qualities, quantities and means children have to enter transpersonal states of awareness. As important as this precedent research has been, in many ways it has yet to fully appreciate the gravity of childhood transpersonal experiences in terms of the impact on…

  1. CERCLA integration with site operations the Fernald experience

    International Nuclear Information System (INIS)

    Coyle, S.W.; Shirley, R.S.; Varchol, B.D.

    1991-01-01

    A major transition in the Fernald Environmental Management Project (FEMP) site mission has occurred over the past few years. The production capabilities formerly provided by the FEMP are being transferred to private industry through a vendor qualification program. Environmental compliance and site cleanup are now the primary focus. In line with this program, the production of uranium products at the site was suspended in July 1989 in order to concentrate resources on the environmental mission. Formal termination of the FEMP production mission was accomplished on June 19, 1991. Environmental issues such as stored inventories of process residues, materials and equipment are being addressed under the Comprehensive Environmental Response, Compensation and Liability Act (CERCLA). The diversity of these hazards complicates the strategic planning for an integrated site cleanup program. This paper will discuss the programmatic approach which is being implemented to ensure activities such as waste management, site utility and support services, health and safety programs, and Resource Conservation and Recovery Act (RCRA) programs are being integrated with CERCLA. 6 figs., 3 tabs

  2. Lessons learned: Experiences with Integrated Safeguards in Norway

    International Nuclear Information System (INIS)

    Sekse, T.; Hornkjol, S.

    2010-01-01

    Integrated safeguards (IS) was implemented in Norway in 2002 as one of the first countries in the world. The implementation of IS has provided both advantages and disadvantages for Norway. Lessons learned will be discussed. The concept of unannounced inspections under the integrated safeguards regime compared to traditional safeguards is one of the major issues. Small users with depleted uranium as shielding containers and the effort used to safeguard them is an aspect of this issue. Recently there has been an interest from the IAEA to investigate the historical boundaries between a research reactor site and a neighboring defense research site. The paper will address this issue as a part of the implementation of IS. Lately, we have seen that several commercial parties have started research on nuclear fuel cycle related projects. This raises some questions concerning what to declare under Article 2 of the Additional Protocol (AP). Today anyone with a computer connected to the internet could carry out research amenable to declaration under the AP. This paper will discuss this issue. (author)

  3. Lessons Learned From Microkernel Verification — Specification is the New Bottleneck

    Directory of Open Access Journals (Sweden)

    Thorsten Bormer

    2012-11-01

    Software verification tools have become a lot more powerful in recent years. Even verification of large, complex systems is feasible, as demonstrated in the L4.verified and Verisoft XT projects. Still, functional verification of large software systems is rare – for reasons beyond the large scale of verification effort needed due to the size alone. In this paper we report on lessons learned for verification of large software systems based on the experience gained in microkernel verification in the Verisoft XT project. We discuss a number of issues that impede widespread introduction of formal verification in the software life-cycle process.

  4. Driving behavioural change towards ecodesign integration: Nudging experiment in industry

    DEFF Research Database (Denmark)

    Brones, Fabien; Gyldendal Melberg, Morten; Monteiro de Carvalho, Marly

    2014-01-01

    This paper describes a research study conducted at Natura, a large Brazilian cosmetic company, in order to stimulate more systematic sustainable innovation practices by means of behavioural change. Within the “soft side” of ecodesign implementation, “nudging” is a novel approach brought from social sciences and policy making. An empirical experiment identified and tested employee motivations in combination with behavioural influences, in order to positively affect employees’ intention to practice ecodesign. This original experience of green nudging in a private company context supported the diffusion … systemically consider individuals’ engagement, including behavioural aspects, interaction with project teams and higher level business organisations.

  5. Driving behavioural change towards ecodesign integration: Nudging experiment in industry

    OpenAIRE

    Brones, Fabien; Gyldendal Melberg, Morten; Monteiro de Carvalho, Marly; Pigosso, Daniela Cristina Antelmi; McAloone, Tim C.

    2014-01-01

    This paper describes a research study conducted at Natura, a large Brazilian cosmetic company, in order to stimulate more systematic sustainable innovation practices by means of behavioural change. Within the “soft side” of ecodesign implementation, “nudging” is a novel approach brought from social sciences and policy making. An empirical experiment identified and tested employee motivations in combination with behavioural influences, in order to positively affect employees’ intention to prac...

  6. Physics Verification Overview

    Energy Technology Data Exchange (ETDEWEB)

    Doebling, Scott William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-12

    The purpose of the verification project is to establish, through rigorous convergence analysis, that each ASC computational physics code correctly implements a set of physics models and algorithms (code verification); to evaluate and analyze the uncertainties of code outputs associated with the choice of temporal and spatial discretization (solution or calculation verification); and to develop and maintain the capability to expand and update these analyses on demand. This presentation describes project milestones.
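    A minimal sketch of the kind of convergence analysis used in code verification: compute the observed order of accuracy from solutions on three systematically refined grids and compare it with the scheme's formal order. The function name and the grid-level outputs below are illustrative assumptions, not project data.

```python
# Minimal sketch: estimate the observed order of accuracy from solutions on
# three grids (spacings h, h/r, h/r^2).  Values are illustrative only.
import math

def observed_order(f_coarse, f_medium, f_fine, r):
    """Observed order p from three grid levels with refinement ratio r."""
    return math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) / math.log(r)

r = 2.0
f_h, f_h2, f_h4 = 1.0450, 1.0112, 1.0028   # hypothetical outputs on h, h/2, h/4
p = observed_order(f_h, f_h2, f_h4, r)
print(f"observed order ~ {p:.2f} (compare with the scheme's formal order, e.g. 2)")
```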

  7. A Regional Integrated Virtual Learning Environment: The AOU's Experience

    Directory of Open Access Journals (Sweden)

    Said Hammad

    2004-02-01

    In this paper we propose to construct a Regional Integrated Virtual Learning Environment (RIVLE) for the Arab Open University (AOU). AOU is a new nonprofit learning institution with branches in six Arab countries and more branches scheduled to open in the near future. The university adopts an open learning methodology. We describe the major elements of the RIVLE and their interaction. We present a generic interface between the RIVLE and the Student Information System (SIS). We focus on the characteristics of the pedagogical model in the Arab Open University context and explain why RIVLE would be a perfect fit for this model. We argue that the potential benefits of a RIVLE are realized in such a setting. We also study the possibility of extending the RIVLE to existing learning institutions in the region.

  8. Review of Integral Experiments for Minor Actinide Management

    International Nuclear Information System (INIS)

    Gil, C.S.; Glinatsis, G.; Hesketh, K.; Iwamoto, O.; Okajima, S.; Tsujimoto, K.; Jacqmin, R.; Khomyakov, Y.; Kochetkov, A.; Kormilitsyn, M.; Palmiotti, G.; Salvatores, M.; Perret, G.; Rineiski, A.; Romanello, V.; Sweet, D.

    2015-01-01

    Spent nuclear fuel contains minor actinides (MAs) such as neptunium, americium and curium, which require careful management. This becomes even more important when mixed oxide (MOX) fuel is being used on a large scale since more MAs will accumulate in the spent fuel. One way to manage these MAs is to transmute them in nuclear reactors, including in light water reactors, fast reactors or accelerator-driven subcritical systems. The transmutation of MAs, however, is not straightforward, as the loading of MAs generally affects physics parameters, such as coolant void, Doppler and burn-up reactivity. This report focuses on nuclear data requirements for minor actinide management, the review of existing integral data and the determination of required experimental work, the identification of bottlenecks and possible solutions, and the recommendation of an action programme for international co-operation. (authors)

  9. Integrating Occupational Therapy Specific Assessments in Practice: Exploring Practitioner Experiences

    Directory of Open Access Journals (Sweden)

    Eric Asaba

    2017-01-01

    Background. Occupational therapists sometimes find it challenging to integrate client-centered and occupational therapy specific assessments in practice. The aim of this study was to explore the use of occupational therapy specific assessments such as the Assessment of Motor and Process Skills (AMPS) among occupational therapists in Sweden and Japan. Methods. Interviews and qualitative thematic analyses were utilized. Findings. Four themes are reported: (1) use it or lose it, (2) simply no space until after hours, (3) biggest barriers can be colleagues, and (4) being more specific: communication. Conclusion. In keeping with previous studies, occupational therapists often find it challenging to implement client-centered and occupation-based assessment tools into practice. However, more work is needed to understand how best practices can be incorporated into a changing occupational therapy daily practice.

  10. Inspector measurement verification activities

    International Nuclear Information System (INIS)

    George, R.S.; Crouch, R.

    The most difficult and complex activity facing a safeguards inspector involves the verification of measurements and the performance of the measurement system. Remeasurement is the key to measurement verification activities. Remeasurements using the facility's measurement system provide the bulk of the data needed for determining the performance of the measurement system. Remeasurements by reference laboratories are also important for evaluation of the measurement system and determination of systematic errors. The use of these measurement verification activities in conjunction with accepted inventory verification practices provides a better basis for accepting or rejecting an inventory. (U.S.)
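    As a rough illustration of how remeasurement data can be used, the sketch below tests the mean paired difference between operator declarations and inspector remeasurements against an assumed combined uncertainty. The values, the 3-sigma criterion and the variable names are illustrative assumptions, not an agency procedure.

```python
# Minimal sketch of an operator-inspector paired difference test on
# remeasurement data.  All numbers are hypothetical.
import math

declared  = [201.3, 198.7, 405.2, 99.8]   # operator values (g Pu), hypothetical
remeasure = [200.9, 199.5, 404.1, 100.3]  # inspector remeasurements (g Pu)
sigma_rel = 0.01                          # assumed combined relative uncertainty

diffs = [d - r for d, r in zip(declared, remeasure)]
mean_diff = sum(diffs) / len(diffs)
sigma_mean = sigma_rel * (sum(declared) / len(declared)) / math.sqrt(len(diffs))

print(f"mean operator-inspector difference {mean_diff:+.2f} g")
print("significant bias" if abs(mean_diff) > 3 * sigma_mean else "no significant bias")
```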

  11. Time-integrated CP violation measurements in the B mesons system at the LHCb experiment

    CERN Document Server

    Cardinale, R

    2016-01-01

    Time-integrated CP violation measurements in the B meson system provide information for testing the CKM picture of CP violation in the Standard Model. A review of recent results from the LHCb experiment is presented.

  12. Barriers and facilitators to integrating care: experiences from the English Integrated Care Pilots

    Directory of Open Access Journals (Sweden)

    Tom Ling

    2012-07-01

    Background. In 2008, the English Department of Health appointed 16 'Integrated Care Pilots' which used a range of approaches to provide better integrated care. We report qualitative analyses from a three year multi-method evaluation to identify barriers and facilitators to successful integration of care. Theory and methods. Data were analysed from transcripts of 213 in-depth staff interviews, and from semi-structured questionnaires (the 'Living Document') completed by staff in pilot sites at six points over a two-year period. Emerging findings were therefore built from 'bottom up' and grounded in the data. However, we were then interested in how these findings compared and contrasted with more generic analyses. Therefore after our analyses were complete we then systematically compared and contrasted the findings with the analysis of barriers and facilitators to quality improvement identified in a systematic review by Kaplan et al (2010) and the analysis of more micro-level shapers of behaviour found in Normalisation Process Theory (May et al 2007). Neither of these approaches claims to be full blown theories but both claim to provide mid-range theoretical arguments which may be used to structure existing data and which can be undercut or reinforced by new data. Results and discussion. Many barriers and facilitators to integrating care are those of any large scale organisational change. These include issues relating to leadership, organisational culture, information technology, physician involvement, and availability of resources. However, activities which appear particularly important for delivering integrated care include personal relationships between leaders in different organisations, the scale of planned activities, governance and finance arrangements, support for staff in new roles, and organisational and staff stability. We illustrate our analyses with a 'routemap' which identifies questions that providers may wish to consider when

  13. Barriers and facilitators to integrating care: experiences from the English Integrated Care Pilots

    Directory of Open Access Journals (Sweden)

    Tom Ling

    2012-07-01

    Background. In 2008, the English Department of Health appointed 16 'Integrated Care Pilots' which used a range of approaches to provide better integrated care. We report qualitative analyses from a three year multi-method evaluation to identify barriers and facilitators to successful integration of care. Theory and methods. Data were analysed from transcripts of 213 in-depth staff interviews, and from semi-structured questionnaires (the 'Living Document') completed by staff in pilot sites at six points over a two-year period. Emerging findings were therefore built from 'bottom up' and grounded in the data. However, we were then interested in how these findings compared and contrasted with more generic analyses. Therefore after our analyses were complete we then systematically compared and contrasted the findings with the analysis of barriers and facilitators to quality improvement identified in a systematic review by Kaplan et al (2010) and the analysis of more micro-level shapers of behaviour found in Normalisation Process Theory (May et al 2007). Neither of these approaches claims to be full blown theories but both claim to provide mid-range theoretical arguments which may be used to structure existing data and which can be undercut or reinforced by new data. Results and discussion. Many barriers and facilitators to integrating care are those of any large scale organisational change. These include issues relating to leadership, organisational culture, information technology, physician involvement, and availability of resources. However, activities which appear particularly important for delivering integrated care include personal relationships between leaders in different organisations, the scale of planned activities, governance and finance arrangements, support for staff in new roles, and organisational and staff stability. We illustrate our analyses with a 'routemap' which identifies questions that providers may wish to consider when planning

  14. Sport as a context for integration:newly arrived immigrant children in Sweden drawing sporting experiences

    OpenAIRE

    Hertting, Krister; Karlefors, Inger

    2013-01-01

    Sport is a global phenomenon, which can make sport an important arena for integration into new societies. However, sport is also an expression of national culture and identities. The aim of this study is to explore images and experiences that newly-arrived immigrant children in Sweden have about sport in their country of origin, and challenges that can arise in processes of integration through sport. We asked 20 newly arrived children aged 10 to 13 to make drawings about sporting experiences ...

  15. Having a Go: Looking at Teachers' Experience of Risk-Taking in Technology Integration

    Science.gov (United States)

    Howard, Sarah K.; Gigliotti, Amanda

    2016-01-01

    Risk is an integral part of change. Technology-related change in teachers' practice is guided by confidence engaging in and beliefs about integration. However, it is also affected by how teachers feel about taking risks, experimenting and change. This paper presents a theoretical framework of affect and emotion to understand how teachers…

  16. Research integrity: the experience of a doubting Thomas.

    Science.gov (United States)

    Hettinger, Thomas P

    2014-04-01

    The sensational "reactome array" paper published in Science in 2009 was investigated in Spain by the Ethics Committee of Consejo Superior de Investigaciones Cientificas (CSIC) after Science issued an editorial expression of concern. The paper was retracted in 2010 because of "skepticism" due to "errors" in chemistry. The "errors" were so profound that many readers expressed doubt that they were really errors, but part of an elaborate hoax. I conducted a forensic analysis of mass spectrometry data in the paper's Supporting Online Material (SOM) and was able to prove that thousands of data values were in fact fabricated. The SOM contains signatures of improper extensive spreadsheet manipulations of incorrect atomic and molecular mass values as well as impossibly repetitive deviations of found molecular mass values from their expected values. No evidence of real mass spectrometry data was detected. Both CSIC and Science have been content to retract the paper without acknowledging the fabrications or assigning responsibility for them. Neither CSIC nor Science has expressed interest in having an independent investigation determining how the paper came to be written, reviewed and published. Their weak response to this episode is a daunting signal that there is an impending crisis in research integrity and science journalism.

  17. Integrated predictive modelling simulations of burning plasma experiment designs

    International Nuclear Information System (INIS)

    Bateman, Glenn; Onjun, Thawatchai; Kritz, Arnold H

    2003-01-01

    Models for the height of the pedestal at the edge of H-mode plasmas (Onjun T et al 2002 Phys. Plasmas 9 5018) are used together with the Multi-Mode core transport model (Bateman G et al 1998 Phys. Plasmas 5 1793) in the BALDUR integrated predictive modelling code to predict the performance of the ITER (Aymar A et al 2002 Plasma Phys. Control. Fusion 44 519), FIRE (Meade D M et al 2001 Fusion Technol. 39 336), and IGNITOR (Coppi B et al 2001 Nucl. Fusion 41 1253) fusion reactor designs. The simulation protocol used in this paper is tested by comparing predicted temperature and density profiles against experimental data from 33 H-mode discharges in the JET (Rebut P H et al 1985 Nucl. Fusion 25 1011) and DIII-D (Luxon J L et al 1985 Fusion Technol. 8 441) tokamaks. The sensitivities of the predictions are evaluated for the burning plasma experimental designs by using variations of the pedestal temperature model that are one standard deviation above and below the standard model. Simulations of the fusion reactor designs are carried out for scans in which the plasma density and auxiliary heating power are varied

  18. Testing and verification of a novel single-channel IGBT driver circuit

    OpenAIRE

    Lukić, Milan; Ninković, Predrag

    2016-01-01

    This paper presents a novel single-channel IGBT driver circuit together with a procedure for testing and verification. It is based on a specialized integrated circuit with complete range of protective functions. Experiments are performed to test and verify its behaviour. Experimental results are presented in the form of oscilloscope recordings. It is concluded that the new driver circuit is compatible with modern IGBT transistors and power converter demands and that it can be applied in new d...

  19. The concept verification testing of materials science payloads

    Science.gov (United States)

    Griner, C. S.; Johnston, M. H.; Whitaker, A.

    1976-01-01

    The Concept Verification Testing (CVT) project at the Marshall Space Flight Center, Alabama, is a developmental activity that supports Shuttle Payload Projects such as Spacelab. It provides an operational 1-g environment for testing NASA and other agency experiment and support systems concepts that may be used in shuttle. A dedicated Materials Science Payload was tested in the General Purpose Laboratory to assess the requirements of a space processing payload on a Spacelab type facility. Physical and functional integration of the experiments into the facility was studied, and the impact of the experiments on the facility (and vice versa) was evaluated. A follow-up test designated CVT Test IVA was also held. The purpose of this test was to repeat Test IV experiments with a crew composed of selected and trained scientists. These personnel were not required to have prior knowledge of the materials science disciplines, but were required to have a basic knowledge of science and the scientific method.

  20. Assessing patients’ experience of integrated care: a survey of patient views in the North West London Integrated Care Pilot

    Directory of Open Access Journals (Sweden)

    Nikolaos Mastellos

    2014-06-01

    Full Text Available Introduction: Despite the importance of continuity of care and patient engagement, few studies have captured patients’ views on integrated care. This study assesses patient experience in the Integrated Care Pilot in North West London with the aim of helping clinicians and policy makers understand the acceptability of integrated care to patients and design future initiatives. Methods: A survey was developed, validated and distributed to 2029 randomly selected practice patients identified as having a care plan. Results: A total of 405 questionnaires were included for analysis. Respondents identified a number of benefits associated with the pilot, including increased patient involvement in decision-making, improved patient-provider relationship, better organisation and access to care, and enhanced inter-professional communication. However, only 22.4% were aware of having a care plan, and of these only 37.9% had a copy of the care plan. Knowledge of care plans was significantly associated with a more positive experience. Conclusions: This study reinforces the view that integrated care can improve quality of care and patient experience. However, care planning was a complex and technically challenging process that occurred more slowly than planned, with wide variation in quality and time of recruitment to the pilot, making it difficult to assess the sustainability of benefits.

  1. Planning for Integrated Transport in Indonesia: Some Lessons from the UK’s Experience

    Directory of Open Access Journals (Sweden)

    Yos Sunitiyoso

    2012-01-01

    Full Text Available Traffic congestion has been a major problem in many cities in Indonesia, thus requiring a better transport policy. Many developed countries, including the United Kingdom, have been implementing integrated transport policy to replace traditional transport policy that focuses only on building roads to anticipate traffic demand. This paper highlights the implementation of integrated transport policy in the United Kingdom. Some key issues that the Indonesian government can learn from that experience are discussed. These include the integration within and between all types of transport, integration with land-use planning, integration with environment policy, and integration with policies for education, health and wealth creation. In its implementation, the policy requires continuity and stability in organization and politics, coordination in local transport plans, and more devolution of power and revenue funding from the government in addition to capital funding. Key words: traffic congestion, integrated transport policy

  2. Data Discovery, Exploration, Integration and Delivery - a practical experience

    Science.gov (United States)

    Kirsch, Peter; Barnes, Tim; Breen, Paul

    2010-05-01

    To fully address the questions and issues arising within Earth Systems Science; the discovery, exploration, integration, delivery and sharing of data, metadata and services across potentially many disciplines and areas of expertise is fundamental. British Antarctic Survey (BAS) collects, manages and curates data across many fields of the geophysical and biological sciences (including upper atmospheric physics, atmospheric chemistry, meteorology, glaciology, oceanography, Polar ecology and biology). BAS, through its Polar Data Centre has an interest to construct and deliver a user-friendly, informative, and administratively low overhead interface onto these data holdings. Designing effective interfaces and frameworks onto the heterogeneous datasets described above is non-trivial. We will discuss some of our approaches and implementations; particularly those addressing the following issues: How to aid and guide the user to accurate discovery of data? Many portals do not inform users clearly enough about the datasets they actually hold. As a result the search interface by which a user is meant to discover information is often inadequate and assumes prior knowledge (for example, that the dataset you are looking for actually exists; that a particular event, campaign, research cruise took place; and that you have a specialist knowledge of the terminology in a particular field), assumptions that cannot be made in multi-disciplinary topic areas. How easily is provenance, quality, and metadata information displayed and accessed? Once informed through the portal that data is available it is often extremely difficult to assess its provenance and quality information and broader documentation (including field reports, notebooks and software repositories). We shall demonstrate some simple methodologies. Can the user access summary data or visualizations of the dataset? It may be that the user is interested in some event, feature or threshold within the dataset; mechanisms need

  3. Integrated modeling of cryogenic layered HighFoot experiments at the NIF

    Energy Technology Data Exchange (ETDEWEB)

    Kritcher, A. L.; Hinkel, D. E.; Callahan, D. A.; Hurricane, O. A.; Clark, D.; Casey, D. T.; Dewald, E. L.; Dittrich, T. R.; Döppner, T.; Barrios Garcia, M. A.; Haan, S.; Berzak Hopkins, L. F.; Jones, O.; Landen, O.; Ma, T.; Meezan, N.; Milovich, J. L.; Pak, A. E.; Park, H.-S.; Patel, P. K. [Lawrence Livermore National Laboratory, P.O. Box 808, Livermore, California 94551-0808 (United States); and others

    2016-05-15

    Integrated radiation hydrodynamic modeling in two dimensions, including the hohlraum and capsule, of layered cryogenic HighFoot Deuterium-Tritium (DT) implosions on the NIF successfully predicts important data trends. The model consists of a semi-empirical fit to low mode asymmetries and radiation drive multipliers to match shock trajectories, one dimensional inflight radiography, and time of peak neutron production. Application of the model across the HighFoot shot series, over a range of powers, laser energies, laser wavelengths, and target thicknesses predicts the neutron yield to within a factor of two for most shots. The Deuterium-Deuterium ion temperatures and the DT down scattered ratios, ratio of (10–12)/(13–15) MeV neutrons, roughly agree with data at peak fuel velocities <340 km/s and deviate at higher peak velocities, potentially due to flows and neutron scattering differences stemming from 3D or capsule support tent effects. These calculations show a significant amount of alpha heating, 1–2.5× for shots where the experimental yield is within a factor of two, which has been achieved by increasing the fuel kinetic energy. This level of alpha heating is consistent with a dynamic hot spot model that is matched to experimental data and as determined from scaling of the yield with peak fuel velocity. These calculations also show that low mode asymmetries become more important as the fuel velocity is increased, and that improving these low mode asymmetries can result in an increase in the yield by a factor of several.

  4. An integrative, experience-based theory of attentional control.

    Science.gov (United States)

    Wilder, Matthew H; Mozer, Michael C; Wickens, Christopher D

    2011-02-09

    Although diverse, theories of visual attention generally share the notion that attention is controlled by some combination of three distinct strategies: (1) exogenous cuing from locally contrasting primitive visual features, such as abrupt onsets or color singletons (e.g., L. Itti, C. Koch, & E. Niebur, 1998), (2) endogenous gain modulation of exogenous activations, used to guide attention to task-relevant features (e.g., V. Navalpakkam & L. Itti, 2007; J. Wolfe, 1994, 2007), and (3) endogenous prediction of likely locations of interest, based on task and scene gist (e.g., A. Torralba, A. Oliva, M. Castelhano, & J. Henderson, 2006). However, little work has been done to synthesize these disparate theories. In this work, we propose a unifying conceptualization in which attention is controlled along two dimensions: the degree of task focus and the contextual scale of operation. Previously proposed strategies, and their combinations, can be viewed as instances of this one mechanism. Thus, this theory serves not as a replacement for existing models but as a means of bringing them into a coherent framework. We present an implementation of this theory and demonstrate its applicability to a wide range of attentional phenomena. The model accounts for key results in visual search with synthetic images and makes reasonable predictions for human eye movements in search tasks involving real-world images. In addition, the theory offers an unusual perspective on attention that places a fundamental emphasis on the role of experience and task-related knowledge.

  5. Integrated Quantum Optics: Experiments towards integrated quantum-light sources and quantum-enhanced sensing

    DEFF Research Database (Denmark)

    Hoff, Ulrich Busk

    The work presented in this thesis is focused on experimental application and generation of continuous variable quantum correlated states of light in integrated dielectric structures. Squeezed states are among the most exploited continuous variable optical states for free-space quantum-enhanced se...... is presented and an optimized device design is proposed. The devices have been fabricated and tested optically and preliminary interrogations of the output quantum noise have been performed....

  6. Verification Account Management System (VAMS)

    Data.gov (United States)

    Social Security Administration — The Verification Account Management System (VAMS) is the centralized location for maintaining SSA's verification and data exchange accounts. VAMS account management...

  7. Report of educational experience: integral development of the child

    Directory of Open Access Journals (Sweden)

    Alexandre Freitas Marchiori

    2008-12-01

    Full Text Available The study was conducted at the Municipal Centre for Child Education (CMEI) "Sinclair Phillips," located in the Caratoíra quarter of the city of Vitoria/ES. It is worth emphasizing the importance of believing in the potential and the knowledge that each child brings from an early age, because the child is a being in full development, broadening its knowledge through the opportunities given to it, with the aim of forming a critical citizen. The work aimed at the acquisition of motor skills, the promotion of health, cognitive (intellectual) development, literacy, and the transmission of historically constituted knowledge and culture/art. The creativity and autonomy of the child have always been the guiding objectives of the proposed work. It had the following objectives: to consolidate the body culture of movement, working with social learning; to provide social inclusion, developing creativity; to lead and support the construction of autonomy; to stimulate initiative and diversity; to provoke awareness of social rules; to develop literacy; to provide access to the arts; to articulate knowledge lived/worked on in the school; to give the chance to rescue childhood experiences; and to transmit children's culture. The classes were not based on a single perspective, but allowed diverse forms of work, taking the child and its development as the focus of the work. Another point of support was the adoption of Culture, Body Movement and the critical-emancipatory perspective to develop the intervention, which enabled rich and varied work. The results are perceived in the day-to-day life of the children, demonstrated by actions of acceptance of others, recognition of the rules of coexistence, and the materialization of learning: the reading, writing and interpretation of some children's stories - this includes the children's own autonomous production, beyond the access to culture and the arts offered during the school year. Key Words: Children's Education, Physical Education, school practice, teaching.

  8. The Content and Integrative Component of Capstone Experiences: An Analysis of Political Science Undergraduate Programs

    Science.gov (United States)

    Hummer, Jill Abraham

    2014-01-01

    In 1991, the APSA Task Force on Political Science recommended elements of a curricular structure that would best promote student learning. The report stated that there should be a capstone experience at the end of the senior year and that the capstone should require students to integrate their whole learning experience in the major. This article…

  9. The construction of emotional experience requires the integration of implicit and explicit emotional processes.

    Science.gov (United States)

    Quirin, Markus; Lane, Richard D

    2012-06-01

    Although we agree that a constructivist approach to emotional experience makes sense, we propose that implicit (visceromotor and somatomotor) emotional processes are dissociable from explicit (attention and reflection) emotional processes, and that the conscious experience of emotion requires an integration of the two. Assessments of implicit emotion and emotional awareness can be helpful in the neuroscientific investigation of emotion.

  10. Verification and Examination Management of Complex Systems

    Directory of Open Access Journals (Sweden)

    Stian Ruud

    2014-10-01

    Full Text Available As ship systems become more complex, with an increasing number of safety-critical functions, many interconnected subsystems, tight integration with other systems, and a large number of potential failure modes, several industry parties have identified the need for improved methods for managing the verification and examination efforts of such complex systems. Such needs are even more prominent now that the marine and offshore industries are targeting more activities and operations in the Arctic environment. In this paper, a set of requirements and a method for verification and examination management are proposed for allocating examination efforts to selected subsystems. The method is based on a definition of a verification risk function for a given system topology and given requirements. The marginal verification risks for the subsystems may then be evaluated, so that examination efforts for the subsystems can be allocated. Two cases of requirements and systems are used to demonstrate the proposed method. The method establishes a systematic relationship between the verification loss, the logic system topology, verification method performance, examination stop criterion, the required examination effort, and a proposed sequence of examinations to reach the examination stop criterion.
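    To make the allocation idea concrete, the sketch below greedily assigns examination effort to whichever subsystem yields the largest marginal reduction in a simple residual verification risk, stopping at a chosen criterion. The risk function, detection probabilities, subsystem names and stop criterion are illustrative assumptions only, not the risk function or data defined in the paper.

```python
# Illustrative sketch (not the paper's method): greedy allocation of examination
# effort driven by a simple residual "verification risk" for a series system.
# All subsystem names, fault probabilities and detection probabilities are made up.

def residual_risk(subsystems):
    """Probability that at least one subsystem still hides an undetected fault."""
    p_all_clear = 1.0
    for s in subsystems:
        # Each examination pass of a subsystem catches an existing fault
        # independently with probability s["detect"].
        p_hidden_fault = s["p_fault"] * (1.0 - s["detect"]) ** s["exams"]
        p_all_clear *= 1.0 - p_hidden_fault
    return 1.0 - p_all_clear

def allocate_exams(subsystems, stop_risk, max_exams=100):
    """Add examinations one at a time where the marginal risk reduction is largest,
    until the residual risk meets the stop criterion (or the budget is exhausted)."""
    for _ in range(max_exams):
        if residual_risk(subsystems) <= stop_risk:
            break
        def risk_with_extra_exam(target):
            trial = [dict(s, exams=s["exams"] + (1 if s is target else 0)) for s in subsystems]
            return residual_risk(trial)
        best = min(subsystems, key=risk_with_extra_exam)  # largest reduction = smallest new risk
        best["exams"] += 1
    return subsystems, residual_risk(subsystems)

if __name__ == "__main__":
    plant = [  # hypothetical safety-critical ship subsystems
        {"name": "power management", "p_fault": 0.10, "detect": 0.6, "exams": 0},
        {"name": "dynamic positioning", "p_fault": 0.05, "detect": 0.5, "exams": 0},
        {"name": "emergency shutdown", "p_fault": 0.02, "detect": 0.7, "exams": 0},
    ]
    plan, final_risk = allocate_exams(plant, stop_risk=0.01)
    for s in plan:
        print(f"{s['name']}: {s['exams']} examinations")
    print(f"residual verification risk ~ {final_risk:.4f}")
```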

  11. Concepts for inventory verification in critical facilities

    International Nuclear Information System (INIS)

    Cobb, D.D.; Sapir, J.L.; Kern, E.A.; Dietz, R.J.

    1978-12-01

    Materials measurement and inventory verification concepts for safeguarding large critical facilities are presented. Inspection strategies and methods for applying international safeguards to such facilities are proposed. The conceptual approach to routine inventory verification includes frequent visits to the facility by one inspector, and the use of seals and nondestructive assay (NDA) measurements to verify the portion of the inventory maintained in vault storage. Periodic verification of the reactor inventory is accomplished by sampling and NDA measurement of in-core fuel elements combined with measurements of integral reactivity and related reactor parameters that are sensitive to the total fissile inventory. A combination of statistical sampling and NDA verification with measurements of reactor parameters is more effective than either technique used by itself. Special procedures for assessment and verification for abnormal safeguards conditions are also considered. When the inspection strategies and inventory verification methods are combined with strict containment and surveillance methods, they provide a high degree of assurance that any clandestine attempt to divert a significant quantity of fissile material from a critical facility inventory will be detected. Field testing of specific hardware systems and procedures to determine their sensitivity, reliability, and operational acceptability is recommended. 50 figures, 21 tables
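    The statistical-sampling component mentioned above is commonly quantified by a detection probability for items drawn at random without replacement; the snippet below is a minimal sketch of that standard calculation. The inventory size, number of diverted items and sample size are hypothetical and are not taken from the report.

```python
# Minimal sketch of a standard safeguards sampling calculation (illustrative numbers).
from math import comb

def detection_probability(total_items: int, diverted_items: int, sampled_items: int) -> float:
    """Probability that at least one diverted item appears in a random sample drawn
    without replacement from an inventory of `total_items` items."""
    if sampled_items > total_items - diverted_items:
        return 1.0  # the sample cannot avoid every diverted item
    p_miss = comb(total_items - diverted_items, sampled_items) / comb(total_items, sampled_items)
    return 1.0 - p_miss

if __name__ == "__main__":
    # e.g. 500 stored fuel elements, 8 substituted, 60 verified by NDA measurement
    print(f"P(detect at least one) = {detection_probability(500, 8, 60):.3f}")
```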

  12. Unmanned Aircraft Systems Detect and Avoid System: End-to-End Verification and Validation Simulation Study of Minimum Operations Performance Standards for Integrating Unmanned Aircraft into the National Airspace System

    Science.gov (United States)

    Ghatas, Rania W.; Jack, Devin P.; Tsakpinis, Dimitrios; Sturdy, James L.; Vincent, Michael J.; Hoffler, Keith D.; Myer, Robert R.; DeHaven, Anna M.

    2017-01-01

    As Unmanned Aircraft Systems (UAS) make their way to mainstream aviation operations within the National Airspace System (NAS), research efforts are underway to develop a safe and effective environment for their integration into the NAS. Detect and Avoid (DAA) systems are required to account for the lack of "eyes in the sky" due to having no human on-board the aircraft. The technique, results, and lessons learned from a detailed End-to-End Verification and Validation (E2-V2) simulation study of a DAA system representative of RTCA SC-228's proposed Phase I DAA Minimum Operational Performance Standards (MOPS), based on specific test vectors and encounter cases, will be presented in this paper.

  13. The design and analysis of integral assembly experiments for CTR neutronics

    International Nuclear Information System (INIS)

    Beynon, T.D.; Curtis, R.H.; Lambert, C.

    1978-01-01

    The use of simple-geometry integral assemblies of lithium metal or lithium compounds for the study of the neutronics of various CTR designs is considered and four recent experiments are analysed. The relatively long mean free path of neutrons in these assemblies produces significantly different design problems from those encountered in similar experiments for fission reactor design. By considering sensitivity profiles for various parameters it is suggested that experiments can be designed to be optimised for data adjustments. (author)

  14. Verification of Scientific Simulations via Hypothesis-Driven Comparative and Quantitative Visualization

    Energy Technology Data Exchange (ETDEWEB)

    Ahrens, James P [ORNL; Heitmann, Katrin [ORNL; Petersen, Mark R [ORNL; Woodring, Jonathan [Los Alamos National Laboratory (LANL); Williams, Sean [Los Alamos National Laboratory (LANL); Fasel, Patricia [Los Alamos National Laboratory (LANL); Ahrens, Christine [Los Alamos National Laboratory (LANL); Hsu, Chung-Hsing [ORNL; Geveci, Berk [ORNL

    2010-11-01

    This article presents a visualization-assisted process that verifies scientific-simulation codes. Code verification is necessary because scientists require accurate predictions to interpret data confidently. This verification process integrates iterative hypothesis verification with comparative, feature, and quantitative visualization. Following this process can help identify differences in cosmological and oceanographic simulations.

  15. FMCT verification: Case studies

    International Nuclear Information System (INIS)

    Hui Zhang

    2001-01-01

    Full text: How to manage the trade-off between the need for transparency and the concern about the disclosure of sensitive information would be a key issue during the negotiation of FMCT verification provisions. This paper will explore the general concerns about FMCT verification and demonstrate what verification measures might be applied to reprocessing and enrichment plants. A primary goal of an FMCT will be to have the five declared nuclear weapon states and the three that operate unsafeguarded nuclear facilities become parties. One focus in negotiating the FMCT will be verification. Appropriate verification measures should be applied in each case. Most importantly, FMCT verification would focus, in the first instance, on these states' fissile material production facilities. After the FMCT enters into force, all these facilities should be declared. Some would continue operating to produce civil nuclear power or to produce fissile material for non-explosive military uses. The verification measures necessary for these operating facilities would be essentially IAEA safeguards, as currently being applied to non-nuclear weapon states under the NPT. However, some production facilities would be declared and shut down. Thus, one important task of FMCT verification will be to confirm the status of these closed facilities. As case studies, this paper will focus on the verification of those shutdown facilities. The FMCT verification system for former military facilities would have to differ in some ways from traditional IAEA safeguards. For example, there could be concerns about the potential loss of sensitive information at these facilities or at collocated facilities. Eventually, some safeguards measures such as environmental sampling might be seen as too intrusive. Thus, effective but less intrusive verification measures may be needed. Some sensitive nuclear facilities would be subject for the first time to international inspections, which could raise concerns

  16. Integrating the philosophy and psychology of aesthetic experience: development of the aesthetic experience scale.

    Science.gov (United States)

    Stamatopoulou, Despina

    2004-10-01

    This study assessed the dynamic relationship between person and object in aesthetic experience. Patterns of the structure of aesthetic experience were derived from a conceptual model based on philosophical and psychological ideas. These patterns were further informed by interviewing individuals with extensive involvement in aesthetic activities and 25 secondary students. Accordingly, patterns were tested by developing a large pool of items attempting to identify measurable structural components of aesthetic experience. Refined first in a pilot study, the 36-item questionnaire was administered to 652 Greek students, aged 13 to 15 years. Correlation matrices and exploratory factor analyses on principal components were used to examine internal structural relationships. The obliquely rotated five-factor solution of the refined instrument accounted for 44.1% of the total variance and was compatible with the conceptual model of aesthetic experience, indicating the plausibility of both. The internal consistency of the items was adequate, and external correlational analysis offered preliminary support for subsequent development of a self-report measure that serves to operationalize the major constructs of aesthetic experience in the general adolescent population. The results also raise theoretical issues for those interested in empirical aesthetics, suggesting that in experiential functioning, expressive perception and affect may play a more constructive role in cognitive processes than is generally acknowledged.
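    As a sketch of the kind of analysis described (an obliquely rotated five-factor solution extracted from a 36-item questionnaire), the snippet below runs an exploratory factor analysis with an oblimin rotation. It assumes the third-party factor_analyzer package, and the response matrix is random placeholder data rather than the study's responses.

```python
# Illustrative exploratory factor analysis with oblique (oblimin) rotation.
# Assumes the factor_analyzer package; the data below are random placeholders.
import numpy as np
from factor_analyzer import FactorAnalyzer

rng = np.random.default_rng(0)
responses = rng.integers(1, 6, size=(652, 36)).astype(float)  # 652 respondents x 36 items

fa = FactorAnalyzer(n_factors=5, rotation="oblimin")  # obliquely rotated five-factor solution
fa.fit(responses)

loadings = fa.loadings_                                 # item-by-factor pattern matrix
variance, proportional, cumulative = fa.get_factor_variance()
print("cumulative proportion of variance explained:", cumulative[-1])
```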

  17. A Verification Framework for Agent Communication

    NARCIS (Netherlands)

    Eijk, R.M. van; Boer, F.S. de; Hoek, W. van der; Meyer, J-J.Ch.

    2003-01-01

    In this paper, we introduce a verification method for the correctness of multiagent systems as described in the framework of acpl (Agent Communication Programming Language). The computational model of acpl consists of an integration of the two different paradigms of ccp (Concurrent Constraint

  18. Integrating experiences from operations into engineering design: modelling knowledge transfer in the offshore oil industry

    DEFF Research Database (Denmark)

    Souza da Conceição, Carolina; Broberg, Ole; Paravizo, Esdras

    2017-01-01

    of knowledge registered in the systems without standards to categorise and store this knowledge, to the knowledge in the systems being difficult to access and retrieve. Discussion: Transferring knowledge and experiences from users brings human factors into play, and modelling the knowledge transfer process...... and workwise distance between operations and engineering design teams, integrating human factors and transferring knowledge are key aspects when designing systems for better performance. Research Objective: Based on an in-depth empirical investigation in an offshore oil company, this study aims to provide...... Summative Statement: Integrating human factors and users’ experiences in design projects is a well-known challenge. This study focuses on the specific challenges of transferring these experiences and on how using a knowledge transfer model can help this integration in the design of high-risk productive

  19. Implementing an integrated engineering data base system: A developer's experience and the application to IPAD

    Science.gov (United States)

    Bruce, E. A.

    1980-01-01

    The software developed by the IPAD project, a new and very powerful tool for the implementation of integrated Computer Aided Design (CAD) systems in the aerospace engineering community, is discussed. The IPAD software is a tool and, as such, can be well applied or misapplied in any particular environment. The many benefits of an integrated CAD system are well documented, but there are few such systems in existence, especially in the mechanical engineering disciplines, and therefore little available experience to guide the implementor.

  20. Secure Oblivious Hiding, Authentication, Tamper Proofing, and Verification Techniques

    National Research Council Canada - National Science Library

    Fridrich, Jessica

    2002-01-01

    In this report, we describe an algorithm for robust visual hash functions with applications to digital image watermarking for authentication and integrity verification of video data and still images...

  1. Can self-verification strivings fully transcend the self-other barrier? Seeking verification of ingroup identities.

    Science.gov (United States)

    Gómez, Angel; Seyle, D Conor; Huici, Carmen; Swann, William B

    2009-12-01

    Recent research has demonstrated self-verification strivings in groups, such that people strive to verify collective identities, which are personal self-views (e.g., "sensitive") associated with group membership (e.g., "women"). Such demonstrations stop short of showing that the desire for self-verification can fully transcend the self-other barrier, as in people working to verify ingroup identities (e.g., "Americans are loud") even when such identities are not self-descriptive ("I am quiet and unassuming"). Five studies focus on such ingroup verification strivings. Results indicate that people prefer to interact with individuals who verify their ingroup identities over those who enhance these identities (Experiments 1-5). Strivings for ingroup identity verification were independent of the extent to which the identities were self-descriptive but were stronger among participants who were highly invested in their ingroup identities, as reflected in high certainty of these identities (Experiments 1-4) and high identification with the group (Experiments 1-5). In addition, whereas past demonstrations of self-verification strivings have been limited to efforts to verify the content of identities (Experiments 1 to 3), the findings also show that they strive to verify the valence of their identities (i.e., the extent to which the identities are valued; Experiments 4 and 5). Self-verification strivings, rather than self-enhancement strivings, appeared to motivate participants' strivings for ingroup identity verification. Links to collective self-verification strivings and social identity theory are discussed.

  2. Experimental verification of layout physical verification of silicon photonics

    Science.gov (United States)

    El Shamy, Raghi S.; Swillam, Mohamed A.

    2018-02-01

    Silicon photonics has proven to be one of the best platforms for dense integration of photonic integrated circuits (PICs) due to the high refractive index contrast among its materials. Silicon on insulator (SOI) is a widespread photonics technology which supports a variety of devices for many applications. As the photonics market grows, the number of components in PICs increases, which increases the need for an automated physical verification (PV) process. This PV process assures reliable fabrication of the PICs, as it checks both the manufacturability and the reliability of the circuit. However, the PV process is challenging in the case of PICs as it requires running exhaustive electromagnetic (EM) simulations. Our group has recently proposed empirical closed-form models for the directional coupler and for waveguide bends based on SOI technology. The models have shown very good agreement with both finite element method (FEM) and finite difference time domain (FDTD) solvers. These models save the huge time of 3D EM simulations and can be easily included in any electronic design automation (EDA) flow, as the equation parameters can be easily extracted from the layout. In this paper we present experimental verification of our previously proposed models. SOI directional couplers with different dimensions have been fabricated using electron beam lithography and measured. The results from the measurements of the fabricated devices have been compared to the derived models and show very good agreement. The matching can reach 100% by calibrating certain parameters in the model.

  3. Assessing healthcare professionals' experiences of integrated care: do surveys tell the full story?

    Science.gov (United States)

    Stephenson, Matthew D; Campbell, Jared M; Lisy, Karolina; Aromataris, Edoardo C

    2017-09-01

    Integrated care is the combination of different healthcare services with the goal to provide comprehensive, seamless, effective and efficient patient care. Assessing the experiences of healthcare professionals (HCPs) is an important aspect when evaluating integrated care strategies. The aim of this rapid review was to investigate if quantitative surveys used to assess HCPs' experiences with integrated care capture all the aspects highlighted as being important in qualitative research, with a view to informing future survey development. The review considered all types of health professionals in primary care, and hospital and specialist services, with a specific focus on the provision of integrated care aimed at improving the patient journey. PubMed, CINAHL and grey literature sources were searched for relevant surveys/program evaluations and qualitative research studies. Full text articles deemed to be of relevance to the review were appraised for methodological quality using abridged critical appraisal instruments from the Joanna Briggs Institute. Data were extracted from included studies using standardized data extraction templates. Findings from included studies were grouped into domains based on similarity of meaning. Similarities and differences in the domains covered in quantitative surveys and those identified as being important in qualitative research were explored. A total of 37 studies (19 quantitative surveys, 14 qualitative studies and four mixed-method studies) were included in the review. A range of healthcare professions participated in the included studies, the majority being primary care providers. Common domains identified from quantitative surveys and qualitative studies included Communication, Agreement on Clear Roles and Responsibilities, Facilities, Information Systems, and Coordination of Care and Access. Qualitative research highlighted domains identified by HCPs as being relevant to their experiences with integrated care that have not

  4. Nuclear test ban verification

    International Nuclear Information System (INIS)

    Chun, Kin-Yip

    1991-07-01

    This report describes verification and its rationale, the basic tasks of seismic verification, the physical basis for earthquake/explosion source discrimination and explosion yield determination, the technical problems pertaining to seismic monitoring of underground nuclear tests, the basic problem-solving strategy deployed by the forensic seismology research team at the University of Toronto, and the scientific significance of the team's research. The research carried out at the University of Toronto has two components: teleseismic verification using P wave recordings from the Yellowknife Seismic Array (YKA), and regional (close-in) verification using high-frequency Lg and Pn recordings from the Eastern Canada Telemetered Network. Major differences have been found in P wave attenuation among the propagation paths connecting the YKA listening post with seven active nuclear explosion testing areas in the world. Significant revisions have been made to previously published P wave attenuation results, leading to more interpretable nuclear explosion source functions. (11 refs., 12 figs.)

  5. Standard Verification System (SVS)

    Data.gov (United States)

    Social Security Administration — SVS is a mainframe program that accesses the NUMIDENT to perform SSN verifications. This program is called by SSA Internal applications to verify SSNs. There is also...

  6. Formal Verification -26 ...

    Indian Academy of Sciences (India)

    by testing of the components and successful testing leads to the software being ... Formal verification is based on formal methods which are mathematically based ..... scenario under which a similar error could occur. There are various other ...

  7. SSN Verification Service

    Data.gov (United States)

    Social Security Administration — The SSN Verification Service is used by Java applications to execute the GUVERF02 service using the WebSphere/CICS Interface. It accepts several input data fields...

  8. Environmental technology verification methods

    CSIR Research Space (South Africa)

    Szewczuk, S

    2016-03-01

    Full Text Available Environmental Technology Verification (ETV) is a tool that has been developed in the United States of America, Europe and many other countries around the world to help innovative environmental technologies reach the market. Claims about...

  9. Verification of RADTRAN

    International Nuclear Information System (INIS)

    Kanipe, F.L.; Neuhauser, K.S.

    1995-01-01

    This document presents details of the verification process of the RADTRAN computer code which was established for the calculation of risk estimates for radioactive materials transportation by highway, rail, air, and waterborne modes

  10. Multilateral disarmament verification

    International Nuclear Information System (INIS)

    Persbo, A.

    2013-01-01

    Non-governmental organisations, such as VERTIC (Verification Research, Training and Information Centre), can play an important role in the promotion of multilateral verification. Parties involved in negotiating nuclear arms accords are for the most part keen that such agreements include suitable and robust provisions for monitoring and verification. Generally, progress in multilateral arms control verification is painstakingly slow, but from time to time 'windows of opportunity' - that is, moments where ideas, technical feasibility and political interests are aligned at both domestic and international levels - may occur, and we have to be ready, so the preparatory work is very important. In the context of nuclear disarmament, verification (whether bilateral or multilateral) entails an array of challenges, hurdles and potential pitfalls relating to national security, health, safety and even non-proliferation, so preparatory work is complex and time-consuming. A UK-Norway Initiative was established in order to investigate the role that a non-nuclear-weapon state such as Norway could potentially play in the field of nuclear arms control verification. (A.C.)

  11. Findings from the Harvard Medical School Cambridge Integrated Clerkship, a Year-Long Longitudinal Psychiatry Experience.

    Science.gov (United States)

    Cheng, Elisa; Hirsh, David; Gaufberg, Elizabeth; Griswold, Todd; Wesley Boyd, J

    2018-06-01

    The Harvard Medical School Cambridge Integrated Clerkship is a longitudinal integrated clerkship that has provided an alternative clinical model for medical education in psychiatry since its inception in 2004. This study was undertaken in an effort to better understand the student experience of the Cambridge Integrated Clerkship and how it may have impacted students' perceptions of and interest in psychiatry, as well as performance. Qualitative surveys were sent via e-mail to the first 11 student cohorts who had completed the Cambridge Integrated Clerkship (from 2004 to 2014) and for whom we had e-mail addresses (N = 100), and the free-text responses were coded thematically. All available standardized scoring data and residency match data for Cambridge Integrated Clerkship graduates were obtained. From 2006 to 2014, 12 out of 73 Cambridge Integrated Clerkship students who entered the match chose a psychiatry residency (16.4%), four times more than students in traditional clerkships at Harvard Medical School (3.8% of 1355 students) or the national average (4.1% of 146,066 US applicants). Thirty of the 100 surveyed Cambridge Integrated Clerkship graduates (30%) responded to the qualitative survey with free-text remarks on a number of themes. Cambridge Integrated Clerkship students compared positively to their classmates in terms of standardized test performance. Their fourfold higher match rate into psychiatry compared to other students raises intriguing questions as to what role a longitudinal clerkship might have played in developing interest in psychiatry as a career.

  12. Trojan technical specification verification project

    International Nuclear Information System (INIS)

    Bates, L.; Rickenback, M.

    1991-01-01

    The Trojan Technical Specification Verification (TTSV) project at the Trojan plant of Portland General Electric Company was motivated by the recognition that many numbers in the Trojan technical specifications (TTS) potentially lacked the consideration of instrument- and/or process-related errors. The plant setpoints were known to consider such errors, but many of the values associated with the limiting conditions for operation (LCO) did not. In addition, the existing plant instrument error analyses were based on industry values that do not reflect the Trojan plant-specific experience. The purpose of this project is to ensure that the Trojan plant setpoint and LCO values include plant-specific instrument error

  13. In the Service of Others: How Volunteering Is Integral to the Tribal College Experience

    Science.gov (United States)

    Talahongva, Patty

    2016-01-01

    Today, the spirit of volunteering is very much alive at every tribal college and university (TCU). From fundraisers for food pantries to educational activities that help fellow students, TCUs help forge reciprocity among students and staff. Volunteerism is integral to the tribal college experience. Volunteerism at three tribal colleges--Cankdeska…

  14. The Audio-Tutorial Approach to Learning Through Independent Study and Integrated Experiences.

    Science.gov (United States)

    Postlethwait, S. N.; And Others

    The rationale of the integrated experience approach to teaching botany at Purdue University is given and the history of the audio-tutorial course at Purdue and its present organization are described. A sample week's unit of study is given, including transcription of the tape, reproduction of printed materials and photographs of other materials…

  15. Crossing the Atlantic: Integrating Cross-Cultural Experiences into Undergraduate Business Courses Using Virtual Communities Technology

    Science.gov (United States)

    Luethge, Denise J.; Raska, David; Greer, Bertie M.; O'Connor, Christina

    2016-01-01

    Today's business school academics are tasked with pedagogy that offers students an understanding of the globalization of markets and the cross-cultural communication skills needed in today's business environment. The authors describe how a virtual cross-cultural experience was integrated into an undergraduate business course and used as an…

  16. Information Networks and Integration: Institutional Influences on Experiences and Persistence of Beginning Students

    Science.gov (United States)

    Karp, Melinda Mechur; Hughes, Katherine L.

    2008-01-01

    This article uses data from a qualitative exploratory study at two urban community colleges to examine experiences of beginning students, paying close attention to the influence that institutional information networks have on students' perceptions and persistence. The authors find that students' reported integration, or sense of belonging in the…

  17. Organization of Experience among Family Members in the Immediate Present: A Gestalt/Systems Integration.

    Science.gov (United States)

    Kaplan, Marvin L.; Kaplan, Netta R.

    1982-01-01

    Outlines two formulations that generate conceptual perspectives of immediate phenomena: (1) the family system has a time-enduring stability; (2) the family system has an immediate and temporary organization. Integrates systems thinking and Gestalt Therapy while recognizing individual experience as embedded in a self-maintaining system of the…

  18. Beyond the playing field: experiences of sport, social capital and integration among Somalis in Australia

    NARCIS (Netherlands)

    Spaaij, R.

    2012-01-01

    This paper explores the role of recreational sport as a means and marker of social integration by analysing the lived experiences of Somali people from refugee backgrounds with sport. Drawing on a three-year multi-sited ethnography, the paper examines the extent to and ways in which participation in

  19. The BuzzFeed Marketing Challenge: An Integrative Social Media Experience

    Science.gov (United States)

    Cowley, Scott W.

    2017-01-01

    This article presents the BuzzFeed Marketing Challenge, which helps students gain integrative real-world marketing experience by selecting a target market, then creating, publishing, and promoting an article for the target market on entertainment publisher BuzzFeed.com. The challenge is for students to effectively use marketing strategy and…

  20. The Integrative Business Experience: Real Choices and Real Consequences Create Real Thinking

    Science.gov (United States)

    McCord, Mary; Houseworth, Matthew; Michaelsen, Larry K.

    2015-01-01

    This article describes an innovation called the Integrative Business Experience (IBE) that links a set of required core business courses to an entrepreneurial practicum course in which two things occur. One is that students are concurrently enrolled in the required core business courses and a practicum course while they create a start-up business…

  1. Improving the Work-Integrated Learning Experience through a Third-Party Advisory Service

    Science.gov (United States)

    Jackson, Denise; Ferns, Sonia; Rowbottom, David; Mclaren, Diane

    2017-01-01

    This study trialled a Work-Integrated Learning (WIL) Advisory Service, provided by the Chamber of Commerce and Industry of Western Australia (CCIWA) in collaboration with four WA universities. The service was established to broker relationships between industry and universities, support employers engaged in WIL and enhance the WIL experience for…

  2. Primary Science Teaching--Is It Integral and Deep Experience for Students?

    Science.gov (United States)

    Timoštšuk, Inge

    2016-01-01

    Integral and deep pedagogical content knowledge can support future primary teachers' ability to follow ideas of education for sustainability in science class. Initial teacher education provides opportunity to learn what and how to teach but still the practical experiences of teaching can reveal uneven development of student teachers'…

  3. A New Approach toward Cyanotype Photography Using Tris-(Oxalato)ferrate(III): An Integrated Experiment

    Science.gov (United States)

    Fiorito, Pablo Alejandro; Polo, André Sarto

    2015-01-01

    This work presents an approach that integrates the preparation of a coordination compound, potassium tris(oxalato)ferrate(III), with its photochemical behavior and provides a possible application, the printing of a photograph using the cyanotype technique. Through this experiment, students can be taught several concepts that occur in a…

  4. Embedded software verification and debugging

    CERN Document Server

    Winterholer, Markus

    2017-01-01

    This book provides comprehensive coverage of verification and debugging techniques for embedded software, which is frequently used in safety critical applications (e.g., automotive), where failures are unacceptable. Since the verification of complex systems needs to encompass the verification of both hardware and embedded software modules, this book focuses on verification and debugging approaches for embedded software with hardware dependencies. Coverage includes the entire flow of design, verification and debugging of embedded software and all key approaches to debugging, dynamic, static, and hybrid verification. This book discusses the current, industrial embedded software verification flow, as well as emerging trends with focus on formal and hybrid verification and debugging approaches. Includes in a single source the entire flow of design, verification and debugging of embedded software; Addresses the main techniques that are currently being used in the industry for assuring the quality of embedded softw...

  5. Postnatal experiences influence how the brain integrates information from different senses

    Directory of Open Access Journals (Sweden)

    Barry E Stein

    2009-09-01

    Full Text Available Sensory Processing Disorder (SPD) is characterized by anomalous reactions to, and integration of, sensory cues. Although the underlying etiology of SPD is unknown, one brain region likely to reflect these sensory and behavioral anomalies is the Superior Colliculus (SC), a structure involved in the synthesis of information from multiple sensory modalities and the control of overt orientation responses. In this review we describe normal functional properties of this structure, the manner in which its individual neurons integrate cues from different senses, and the overt SC-mediated behaviors that are believed to manifest this “multisensory integration.” Of particular interest here is how SC neurons develop their capacity to engage in multisensory integration during early postnatal life as a consequence of early sensory experience, and how the intimate communication between cortex and midbrain makes this developmental process possible.

  6. Brain network segregation and integration during an epoch-related working memory fMRI experiment.

    Science.gov (United States)

    Fransson, Peter; Schiffler, Björn C; Thompson, William Hedley

    2018-05-17

    The characterization of brain subnetwork segregation and integration has previously focused on changes that are detectable at the level of entire sessions or epochs of imaging data. In this study, we applied time-varying functional connectivity analysis together with temporal network theory to calculate point-by-point estimates of subnetwork segregation and integration during an epoch-based (2-back, 0-back, baseline) working memory fMRI experiment as well as during resting-state. This approach allowed us to follow task-related changes in subnetwork segregation and integration at a high temporal resolution. At a global level, the cognitively more taxing 2-back epochs elicited an overall stronger response of integration between subnetworks compared to the 0-back epochs. Moreover, the visual, sensorimotor and fronto-parietal subnetworks displayed characteristic and distinct temporal profiles of segregation and integration during the 0- and 2-back epochs. During the interspersed epochs of baseline, several subnetworks, including the visual, fronto-parietal, cingulo-opercular and dorsal attention subnetworks, showed pronounced increases in segregation. Using a drift diffusion model, we show that response times for the 2-back trials are correlated with integration for the fronto-parietal subnetwork and with segregation for the visual subnetwork. Our results elucidate the fast-evolving events with regard to subnetwork integration and segregation that occur in an epoch-related task fMRI experiment. Our findings suggest that minute changes in subnetwork integration are of importance for task performance. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.
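    The sketch below shows one common way to turn a time-resolved connectivity estimate and a subnetwork partition into point-by-point segregation and integration curves (mean within- versus between-subnetwork connectivity). It illustrates the general approach only, not necessarily the exact temporal-network measures used in the study, and the data are random placeholders.

```python
# Illustrative sketch: per-time-point within- vs. between-subnetwork connectivity
# as simple proxies for segregation and integration. Data are random placeholders.
import numpy as np

def segregation_integration(conn_t: np.ndarray, labels: np.ndarray):
    """conn_t: (T, N, N) time-resolved connectivity; labels: (N,) subnetwork id per node.
    Returns per-time-point mean within- and between-subnetwork connectivity."""
    same = labels[:, None] == labels[None, :]
    off_diag = ~np.eye(labels.size, dtype=bool)
    within_mask = same & off_diag
    between_mask = ~same
    within = conn_t[:, within_mask].mean(axis=1)    # high values -> segregation
    between = conn_t[:, between_mask].mean(axis=1)  # high values -> integration
    return within, between

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    T, N = 200, 40                                # 200 time points, 40 regions
    conn = rng.random((T, N, N))
    conn = (conn + conn.transpose(0, 2, 1)) / 2   # make each matrix symmetric
    subnets = rng.integers(0, 4, size=N)          # 4 hypothetical subnetworks
    w, b = segregation_integration(conn, subnets)
    print("mean within:", w.mean(), "mean between:", b.mean())
```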

  7. Paternal experience during the child’s first year of life: integrative review of qualitative research

    Directory of Open Access Journals (Sweden)

    Fernando Henrique Ferreira

    2015-09-01

    Full Text Available Social transformations have prompted reflection about the paternal role and pointed to new fatherhoods, characterized by more effective involvement of the father in the family routine and in childcare. The present integrative review of qualitative studies aimed to synthesize the literature evidence about the fatherhood experience throughout the first year of the child’s life, attentive to gender questions. Twenty-three studies were included in this review. It was observed that fathers had positive experiences with their babies and still craved more time and space to dedicate to the family. However, inequality between genders, the continuous requirement to provide financially for the home, and their inability to take part in the breastfeeding moment impeded greater paternal involvement. We concluded that the new fatherhoods movement is present in fathers' experience and that contemporary gender tendencies are challenges for parenting support.

  8. Real remote physics experiments across Internet-- inherent part of Integrated e-Learning

    Directory of Open Access Journals (Sweden)

    Frantisek Lustig

    2008-05-01

    Full Text Available The implementation of real remote experiments across the Internet into the teaching process, until now not available, enables the introduction of Integrated e-Learning, composed of three components: real remote experiments across the Internet, simulation applets, and electronic interactive textbooks. We present here a prospective remote laboratory system with data transfer using the Intelligent School Experimental System (ISES) as hardware and the ISES WEB Control kit as software. This approach enables the simple construction of remote experiments without building any hardware and with virtually no programming, using only a copy-and-paste approach with pre-built typical blocks such as camera view, controls, graphs, displays, etc. In conclusion, we summarize the experience achieved with remote experiments.

  9. Simulation Environment Based on the Universal Verification Methodology

    CERN Document Server

    AUTHOR|(SzGeCERN)697338

    2017-01-01

    Universal Verification Methodology (UVM) is a standardized approach to verifying integrated circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The flow of CDV differs from the traditional directed-testing approach. With CDV, a testbench developer starts with a structured plan by setting the verification goals. Those goals are then targeted by the developed testbench, which generates legal stimuli and sends them to the device under test (DUT). The progress is measured by coverage monitors added to the simulation environment. In this way, the non-exercised functionality can be identified. Moreover, additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC desi...

  10. Fusion integral experiments and analysis and the determination of design safety factors - I: Methodology

    International Nuclear Information System (INIS)

    Youssef, M.Z.; Kumar, A.; Abdou, M.A.; Oyama, Y.; Maekawa, H.

    1995-01-01

    The role of the neutronics experimentation and analysis in fusion neutronics research and development programs is discussed. A new methodology was developed to arrive at estimates of design safety factors based on the experimental and analytical results from design-oriented integral experiments. In this methodology, and for a particular nuclear response, R, a normalized density function (NDF) is constructed from the prediction uncertainties, and their associated standard deviations, as found in the various integral experiments where that response, R, is measured. Important statistical parameters are derived from the NDF, such as the global mean prediction uncertainty and the possible spread around it. The method of deriving safety factors from many possible NDFs based on various calculational and measuring methods (among other variants) is also described. Associated with each safety factor is a confidence level, which designers may choose, that the calculated response, R, will not exceed (or will not fall below) the actual measured value. An illustrative example is given on how to construct the NDFs. The methodology is applied in two areas, namely the line-integrated tritium production rate and bulk shielding integral experiments. Conditions under which these factors could be derived and the validity of the method are discussed. 72 refs., 17 figs., 4 tabs
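
    A minimal numerical sketch of the idea, assuming hypothetical C/E values and a normal NDF; it is not the authors' exact formulation, but it shows how a safety factor at a chosen confidence level can be derived from the mean prediction uncertainty and its spread.

    ```python
    import numpy as np
    from scipy import stats

    ce_ratios = np.array([0.93, 0.97, 1.02, 0.95, 1.05, 0.98])  # hypothetical C/E values
    u = ce_ratios - 1.0                      # prediction uncertainties (C - E)/E
    mu, sigma = u.mean(), u.std(ddof=1)      # global mean uncertainty and spread

    confidence = 0.95
    # Safety factor applied to the calculated response R so that R_calc * factor
    # bounds the measured value with the chosen confidence (normal NDF assumed).
    factor = 1.0 / (1.0 + stats.norm.ppf(1.0 - confidence, mu, sigma))
    print(f"mean bias {mu:+.3f}, spread {sigma:.3f}, safety factor {factor:.3f}")
    ```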

  11. The iso-response method: measuring neuronal stimulus integration with closed-loop experiments

    Science.gov (United States)

    Gollisch, Tim; Herz, Andreas V. M.

    2012-01-01

    Throughout the nervous system, neurons integrate high-dimensional input streams and transform them into an output of their own. This integration of incoming signals involves filtering processes and complex non-linear operations. The shapes of these filters and non-linearities determine the computational features of single neurons and their functional roles within larger networks. A detailed characterization of signal integration is thus a central ingredient to understanding information processing in neural circuits. Conventional methods for measuring single-neuron response properties, such as reverse correlation, however, are often limited by the implicit assumption that stimulus integration occurs in a linear fashion. Here, we review a conceptual and experimental alternative that is based on exploring the space of those sensory stimuli that result in the same neural output. As demonstrated by recent results in the auditory and visual system, such iso-response stimuli can be used to identify the non-linearities relevant for stimulus integration, disentangle consecutive neural processing steps, and determine their characteristics with unprecedented precision. Automated closed-loop experiments are crucial for this advance, allowing rapid search strategies for identifying iso-response stimuli during experiments. Prime targets for the method are feed-forward neural signaling chains in sensory systems, but the method has also been successfully applied to feedback systems. Depending on the specific question, “iso-response” may refer to a predefined firing rate, single-spike probability, first-spike latency, or other output measures. Examples from different studies show that substantial progress in understanding neural dynamics and coding can be achieved once rapid online data analysis and stimulus generation, adaptive sampling, and computational modeling are tightly integrated into experiments. PMID:23267315
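
    The closed-loop search for iso-response stimuli can be sketched as follows for a toy two-input neuron model: along each ray in stimulus space, the stimulus amplitude is bisected until the model's output hits a predefined target rate, tracing out an iso-response curve. The neuron model, target value, and search ranges are illustrative assumptions.

    ```python
    import numpy as np

    def neuron_response(s1, s2):
        # toy nonlinearity: quadratic summation followed by a threshold-linear output
        drive = np.sqrt(s1**2 + 0.5 * s2**2)
        return max(0.0, 10.0 * (drive - 0.2))      # "firing rate" in arbitrary units

    def find_iso_point(direction, target, lo=0.0, hi=5.0, tol=1e-3):
        """Bisect the stimulus amplitude along `direction` until the response hits `target`."""
        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            r = neuron_response(mid * direction[0], mid * direction[1])
            lo, hi = (mid, hi) if r < target else (lo, mid)
        return 0.5 * (lo + hi)

    target_rate = 5.0
    for angle in np.linspace(0.05, np.pi / 2 - 0.05, 5):       # rays across the first quadrant
        d = np.array([np.cos(angle), np.sin(angle)])
        amp = find_iso_point(d, target_rate)
        print(f"angle {angle:.2f} rad -> iso-response stimulus {amp * d}")
    ```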

  12. Plant integration of MITICA and SPIDER experiments with auxiliary plants and buildings on PRIMA site

    Energy Technology Data Exchange (ETDEWEB)

    Fellin, Francesco, E-mail: francesco.fellin@igi.cnr.it; Boldrin, Marco; Zaccaria, Pierluigi; Agostinetti, Piero; Battistella, Manuela; Bigi, Marco; Dal Bello, Samuele; Dalla Palma, Mauro; Fiorentin, Aldo; Luchetta, Adriano; Maistrello, Alberto; Marcuzzi, Diego; Ocello, Edoardo; Pasqualotto, Roberto; Pavei, Mauro; Pomaro, Nicola; Rizzolo, Andrea; Toigo, Vanni; Valente, Matteo; Zanotto, Loris; Calore, Luca; and others

    2015-10-15

    Highlights: • Focus on the plant integration work supporting the realization of the SPIDER and MITICA fusion experiments hosted in the PRIMA buildings complex in Padova, Italy. • Huge effort of coordination and integration among many stakeholders, taking into account several constraints coming from the (on-going) experiment requirements and a precise time schedule and budget for buildings construction. • The paper also deals with interface management, coordination and integration of many competences, and problem solving to find the best solution, also considering other aspects such as safety and maintenance. - Abstract: This paper presents a description of the PRIMA (Padova Research on ITER Megavolt Accelerator) Plant Integration work, aimed at the construction of the PRIMA Buildings, which will host two nuclear fusion test facilities named SPIDER and MITICA, intended to test and optimize the neutral beam injectors for the ITER experiment. These activities are very complex: inputs coming from the experiment design change from time to time, while the buildings construction must meet a precise time schedule and budget. Moreover, the decision process is often very long due to the high number of stakeholders (RFX, IO, third parties, suppliers, domestic agencies from different countries). The huge effort includes: forecasting what will be necessary for the integration of many experimental plants; collecting requirements and translating them into inputs; interface management; coordination meetings with hundreds of people with various and different competences in construction and operation of fusion facilities, thermomechanics, electrical and control systems, building design and construction (civil plants plus architectural and structural aspects), safety, maintenance and management. The paper describes these activities and also the tools created to check and validate the building design and to manage the interfaces, and the organization put in place to achieve the required targets.

  13. Plant integration of MITICA and SPIDER experiments with auxiliary plants and buildings on PRIMA site

    International Nuclear Information System (INIS)

    Fellin, Francesco; Boldrin, Marco; Zaccaria, Pierluigi; Agostinetti, Piero; Battistella, Manuela; Bigi, Marco; Dal Bello, Samuele; Dalla Palma, Mauro; Fiorentin, Aldo; Luchetta, Adriano; Maistrello, Alberto; Marcuzzi, Diego; Ocello, Edoardo; Pasqualotto, Roberto; Pavei, Mauro; Pomaro, Nicola; Rizzolo, Andrea; Toigo, Vanni; Valente, Matteo; Zanotto, Loris; Calore, Luca

    2015-01-01

    Highlights: • Focus on the plant integration work supporting the realization of the SPIDER and MITICA fusion experiments hosted in the PRIMA buildings complex in Padova, Italy. • Huge effort of coordination and integration among many stakeholders, taking into account several constraints coming from the (on-going) experiment requirements and a precise time schedule and budget for buildings construction. • The paper also deals with interface management, coordination and integration of many competences, and problem solving to find the best solution, also considering other aspects such as safety and maintenance. - Abstract: This paper presents a description of the PRIMA (Padova Research on ITER Megavolt Accelerator) Plant Integration work, aimed at the construction of the PRIMA Buildings, which will host two nuclear fusion test facilities named SPIDER and MITICA, intended to test and optimize the neutral beam injectors for the ITER experiment. These activities are very complex: inputs coming from the experiment design change from time to time, while the buildings construction must meet a precise time schedule and budget. Moreover, the decision process is often very long due to the high number of stakeholders (RFX, IO, third parties, suppliers, domestic agencies from different countries). The huge effort includes: forecasting what will be necessary for the integration of many experimental plants; collecting requirements and translating them into inputs; interface management; coordination meetings with hundreds of people with various and different competences in construction and operation of fusion facilities, thermomechanics, electrical and control systems, building design and construction (civil plants plus architectural and structural aspects), safety, maintenance and management. The paper describes these activities and also the tools created to check and validate the building design and to manage the interfaces, and the organization put in place to achieve the required targets.

  14. Heavy water physical verification in power plants

    International Nuclear Information System (INIS)

    Morsy, S.; Schuricht, V.; Beetle, T.; Szabo, E.

    1986-01-01

    This paper is a report on the Agency's experience in verifying heavy water inventories in power plants. The safeguards objectives and goals for such activities are defined in the paper. The heavy water is stratified according to the flow within the power plant, including upgraders. A safeguards scheme based on a combination of records auditing, comparing records and reports, and physical verification has been developed. This scheme has elevated the status of heavy water safeguards to a level comparable to nuclear material safeguards in bulk facilities. It leads to attribute and variable verification of the heavy water inventory in the different system components and in the store. The verification methods include volume and weight determination, sampling and analysis, non-destructive assay (NDA), and criticality check. The analysis of the different measurement methods and their limits of accuracy is discussed in the paper

  15. Procedure generation and verification

    International Nuclear Information System (INIS)

    Sheely, W.F.

    1986-01-01

    The Department of Energy has used Artificial Intelligence or ''AI'' concepts to develop two powerful new computer-based techniques to enhance safety in nuclear applications. The Procedure Generation System and the Procedure Verification System can be adapted to other commercial applications, such as a manufacturing plant. The Procedure Generation System can create a procedure to deal with an off-normal condition. The operator can then take correct actions on the system in minimal time. The Verification System evaluates the logic of the Procedure Generator's conclusions. This evaluation uses logic techniques totally independent of the Procedure Generator. The rapid, accurate generation and verification of corrective procedures can greatly reduce the human error possible in a complex, high-stress situation

  16. Nuclear disarmament verification

    International Nuclear Information System (INIS)

    DeVolpi, A.

    1993-01-01

    Arms control treaties, unilateral actions, and cooperative activities -- reflecting the defusing of East-West tensions -- are causing nuclear weapons to be disarmed and dismantled worldwide. In order to provide for future reductions and to build confidence in the permanency of this disarmament, verification procedures and technologies would play an important role. This paper outlines arms-control objectives, treaty organization, and actions that could be undertaken. For the purposes of this Workshop on Verification, nuclear disarmament has been divided into five topical subareas: Converting nuclear-weapons production complexes, Eliminating and monitoring nuclear-weapons delivery systems, Disabling and destroying nuclear warheads, Demilitarizing or non-military utilization of special nuclear materials, and Inhibiting nuclear arms in non-nuclear-weapons states. This paper concludes with an overview of potential methods for verification

  17. Integration of Detectors Into a Large Experiment: Examples From ATLAS and CMS

    CERN Document Server

    Froidevaux, D

    2011-01-01

    Integration of Detectors Into a Large Experiment: Examples From ATLAS and CMS, part of 'Landolt-Börnstein - Group I Elementary Particles, Nuclei and Atoms: Numerical Data and Functional Relationships in Science and Technology, Volume 21B2: Detectors for Particles and Radiation. Part 2: Systems and Applications'. This document is part of Part 2 'Principles and Methods' of Subvolume B 'Detectors for Particles and Radiation' of Volume 21 'Elementary Particles' of Landolt-Börnstein - Group I 'Elementary Particles, Nuclei and Atoms'. It contains the Chapter '5 Integration of Detectors Into a Large Experiment: Examples From ATLAS and CMS' with the content: 5 Integration of Detectors Into a Large Experiment: Examples From ATLAS and CMS 5.1 Introduction 5.1.1 The context 5.1.2 The main initial physics goals of ATLAS and CMS at the LHC 5.1.3 A snapshot of the current status of the ATLAS and CMS experiments 5.2 Overall detector concept and magnet systems 5.2.1 Overall detector concept 5.2.2 Magnet systems 5.2.2.1 Rad...

  18. Verification of Ceramic Structures

    Science.gov (United States)

    Behar-Lafenetre, Stephanie; Cornillon, Laurence; Rancurel, Michael; De Graaf, Dennis; Hartmann, Peter; Coe, Graham; Laine, Benoit

    2012-07-01

    In the framework of the “Mechanical Design and Verification Methodologies for Ceramic Structures” contract [1] awarded by ESA, Thales Alenia Space has investigated literature and practices in affiliated industries to propose a methodological guideline for verification of ceramic spacecraft and instrument structures. It has been written in order to be applicable to most types of ceramic or glass-ceramic materials - typically Cesic®, HBCesic®, Silicon Nitride, Silicon Carbide and ZERODUR®. The proposed guideline describes the activities to be performed at material level in order to cover all the specific aspects of ceramics (Weibull distribution, brittle behaviour, sub-critical crack growth). Elementary tests and their post-processing methods are described, and recommendations for optimization of the test plan are given in order to have a consistent database. The application of this method is shown on an example in a dedicated article [7]. Then the verification activities to be performed at system level are described. This includes classical verification activities based on relevant standard (ECSS Verification [4]), plus specific analytical, testing and inspection features. The analysis methodology takes into account the specific behaviour of ceramic materials, especially the statistical distribution of failures (Weibull) and the method to transfer it from elementary data to a full-scale structure. The demonstration of the efficiency of this method is described in a dedicated article [8]. The verification is completed by classical full-scale testing activities. Indications about proof testing, case of use and implementation are given and specific inspection and protection measures are described. These additional activities are necessary to ensure the required reliability. The aim of the guideline is to describe how to reach the same reliability level as for structures made of more classical materials (metals, composites).
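
    As a rough illustration of the Weibull treatment mentioned above, the sketch below evaluates the failure probability of a brittle component and the transfer from coupon-level statistics to a full-scale structure through an effective volume ratio; the Weibull modulus, characteristic strength, and volume ratio are hypothetical values, not material data from the guideline.

    ```python
    import math

    def weibull_failure_prob(stress, sigma0, m, v_ratio=1.0):
        """P_f = 1 - exp(-(V/V0) * (stress/sigma0)^m) for a uniformly stressed volume."""
        return 1.0 - math.exp(-v_ratio * (stress / sigma0) ** m)

    m = 10.0            # Weibull modulus from elementary bend tests (hypothetical)
    sigma0 = 300.0      # characteristic strength of the test coupons, MPa (hypothetical)
    v_ratio = 50.0      # full-scale structure stresses 50x the coupon's effective volume

    for stress in (150, 200, 250):
        p_coupon = weibull_failure_prob(stress, sigma0, m)
        p_struct = weibull_failure_prob(stress, sigma0, m, v_ratio)
        print(f"{stress} MPa: coupon P_f = {p_coupon:.2e}, structure P_f = {p_struct:.2e}")
    ```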

  19. Development and verification test of integral reactor major components - Development of manufacturing process and fabrication of prototype for SG and CEDM

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Chang Hee; Park, Hwa Kyu; Kim, Yong Kyu; Choi, Yong Soon; Kang, Ki Su; Hyun, Young Min [Korea Heavy Industries and Construction Co., LTD., Changwon (Korea)

    1999-03-01

    The integral SMART (System-integrated Modular Advanced Reactor) type reactor is under conceptual design. Because the major components are integrated within a single pressure vessel, a compact design using advanced technology is essential. This means that the manufacturing process for these components is more complex and difficult. The objective of this study is to confirm the feasibility of manufacturing the Steam Generator, the Control Element Drive Mechanism (CEDM) and the Reactor Assembly, which includes the Reactor Pressure Vessel; to do so, it is important to understand the design requirements and functions of the major components. After understanding the design requirements and functions, it was concluded that the helical bending and weld qualification of the titanium tube for the Steam Generator and the applicability of electron beam welding for the CEDM step motor parts are critical to fabricating the components. Therefore, a bending mock-up and weld qualification of the titanium tube were performed and the results are quite satisfactory. Also, it is concluded that the electron beam welding technique can be applied to the CEDM step motor part. (author). 22 refs., 14 figs., 46 tabs.

  20. ‘Trust and teamwork matter’: Community health workers' experiences in integrated service delivery in India

    Science.gov (United States)

    Mishra, Arima

    2014-01-01

    A comprehensive and integrated approach to strengthen primary health care has been the major thrust of the National Rural Health Mission (NRHM) that was launched in 2005 to revamp India's rural public health system. Though the logic of horizontal and integrated health care to strengthen health systems has long been acknowledged at policy level, empirical evidence on how such integration operates is rare. Based on recent (2011–2012) ethnographic fieldwork in Odisha, India, this article discusses community health workers' experiences in integrated service delivery through village-level outreach sessions within the NRHM. It shows that for health workers, the notion of integration goes well beyond a technical lens of mixing different health services. Crucially, they perceive ‘teamwork’ and ‘building trust with the community’ (beyond trust in health services) to be critical components of their practice. However, the comprehensive NRHM primary health care ideology – which the health workers espouse – is in constant tension with the exigencies of narrow indicators of health system performance. Our ethnography shows how monitoring mechanisms, the institutionalised privileging of statistical evidence over field-based knowledge and the highly hierarchical health bureaucratic structure that rests on top-down communications mitigate efforts towards sustainable health system integration. PMID:25025872

  1. 'Trust and teamwork matter': community health workers' experiences in integrated service delivery in India.

    Science.gov (United States)

    Mishra, Arima

    2014-01-01

    A comprehensive and integrated approach to strengthen primary health care has been the major thrust of the National Rural Health Mission (NRHM) that was launched in 2005 to revamp India's rural public health system. Though the logic of horizontal and integrated health care to strengthen health systems has long been acknowledged at policy level, empirical evidence on how such integration operates is rare. Based on recent (2011-2012) ethnographic fieldwork in Odisha, India, this article discusses community health workers' experiences in integrated service delivery through village-level outreach sessions within the NRHM. It shows that for health workers, the notion of integration goes well beyond a technical lens of mixing different health services. Crucially, they perceive 'teamwork' and 'building trust with the community' (beyond trust in health services) to be critical components of their practice. However, the comprehensive NRHM primary health care ideology - which the health workers espouse - is in constant tension with the exigencies of narrow indicators of health system performance. Our ethnography shows how monitoring mechanisms, the institutionalised privileging of statistical evidence over field-based knowledge and the highly hierarchical health bureaucratic structure that rests on top-down communications mitigate efforts towards sustainable health system integration.

  2. Reload core safety verification

    International Nuclear Information System (INIS)

    Svetlik, M.; Minarcin, M.

    2003-01-01

    This paper presents a brief look at the process of reload core safety evaluation and verification in the Slovak Republic. It gives an overview of the experimental verification of selected nuclear parameters in the course of physics testing during reactor start-up. A comparison of IAEA recommendations and testing procedures at Slovak and European nuclear power plants of similar design is included. The introduction of two-level criteria for the evaluation of tests represents an effort to formulate the relation between safety evaluation and measured values (Authors)

  3. Spent fuel verification options for final repository safeguards in Finland. A study on verification methods, their feasibility and safety aspects

    International Nuclear Information System (INIS)

    Hautamaeki, J.; Tiitta, A.

    2000-12-01

    The verification possibilities of the spent fuel assemblies from the Olkiluoto and Loviisa NPPs and the fuel rods from the research reactor of VTT are considered in this report. The spent fuel assemblies have to be verified at the partial defect level before final disposal into the geologic repository. The rods from the research reactor may be verified at the gross defect level. Developing a measurement system for partial defect verification is a complicated and time-consuming task. Passive High Energy Gamma Emission Tomography and the Fork Detector combined with Gamma Spectrometry are the most promising measurement principles to be developed for this purpose. The whole verification process has to be planned to be as streamlined as possible. An early start in planning the verification and developing the measurement devices is important in order to enable a smooth integration of the verification measurements into the conditioning and disposal process. The IAEA and Euratom have not yet concluded the safeguards criteria for the final disposal, e.g. criteria connected to the selection of the best place to perform the verification measurements. Options for the verification places have been considered in this report. One option for a verification measurement place is the intermediate storage. The other option is the encapsulation plant. Crucial considerations are which one offers the best practical possibilities to perform the measurements effectively and which would be the better place from the safeguards point of view. Verification measurements may be needed both in the intermediate storages and in the encapsulation plant. In this report the integrity of the fuel assemblies after the wet intermediate storage period is also assessed, because the assemblies have to withstand the handling operations of the verification measurements. (orig.)

  4. Intercomparison of liquid metal fast reactor seismic analysis codes. V. 2: Verification and improvement of reactor core seismic analysis codes using core mock-up experiments. Proceedings of a research co-ordination meeting held in Vienna, 26-28 September 1994

    International Nuclear Information System (INIS)

    1995-10-01

    This report (Volume II) contains the papers summarizing the verification of and improvement to the codes on the basis of the French and Japanese data. Volume I: ''Validation of the Seismic Analysis Codes Using the Reactor Code Experiments'' (IAEA-TECDOC-798) included the Italian PEC reactor data. Refs, figs and tabs

  5. Intercomparison of liquid metal fast reactor seismic analysis codes. V. 2: Verification and improvement of reactor core seismic analysis codes using core mock-up experiments. Proceedings of a research co-ordination meeting held in Vienna, 26-28 September 1994

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-10-01

    This report (Volume II) contains the papers summarizing the verification of and improvement to the codes on the basis of the French and Japanese data. Volume I: ``Validation of the Seismic Analysis Codes Using the Reactor Code Experiments`` (IAEA-TECDOC-798) included the Italian PEC reactor data. Refs, figs and tabs.

  6. Integration and Testing Challenges of Small, Multiple Satellite Missions: Experiences from the Space Technology 5 Project

    Science.gov (United States)

    Sauerwein, Timothy A.; Gostomski, Thomas

    2008-01-01

    The ST5 technology demonstration mission led by GSFC of NASA's New Millennium Program managed by JPL consisted of three micro satellites (approximately 30 kg each) deployed into orbit from the Pegasus XL launch vehicle. In order to meet the launch date schedule of ST5, a different approach was required rather than the standard I&T approach used for single, room-sized satellites. The three spacecraft were designed, integrated, and tested at NASA Goddard Space Flight Center. It was determined that there was insufficient time in the schedule to perform three spacecraft I&T activities in series using standard approaches. The solution was for spacecraft #1 to undergo integration and test first, followed by spacecraft #2 and #3 simultaneously. This simultaneous integration was successful for several reasons. Each spacecraft had a Lead Test Conductor who planned and coordinated their spacecraft through its integration and test activities. One team of engineers and technicians executed the integration of all three spacecraft, learning and gaining knowledge and efficiency as spacecraft #1 integration and testing progressed. They became acutely familiar with the hardware, operation and processes for I&T, thus had the experience and knowledge to safely execute I&T for spacecraft #2 and #3. The integration team was extremely versatile; each member could perform many different activities or work any spacecraft, when needed. ST5 was successfully integrated, tested and shipped to the launch site per the I&T schedule that was planned three years previously. The I&T campaign was completed with ST5's successful launch on March 22, 2006.

  7. Integrating Sustainability and Hawaiian Culture into the Tourism Experience of the Hawaiian Islands

    Directory of Open Access Journals (Sweden)

    Wendy Agrusa

    2010-04-01

    The travel industry in Hawaii has been experiencing a trend towards more authentic tourism, which reintegrates Hawaiian culture into the visitors’ experience. This study investigated the reintegration of Hawaiian culture into the tourism experience on the Hawaiian Islands by reviewing existing literature, and by analyzing primary data collected through visitor surveys. The purpose of the study was to determine whether there is a visitors’ demand for a more authentic tourism experience in Hawaii through the reintegration of Hawaiian culture, and if so, which efforts should be made or continue to be made to achieve this authenticity. Important aspects that were taken into consideration in this research effort are the changes Hawaiian culture has experienced with the arrival of outsiders, and the authenticity of the Hawaiian tourism experience today. Further aspects that were examined include the visitors’ image of Hawaii, their expectations, their experiences and satisfaction during their stay, their interest in and understanding of Hawaiian culture, as well as the type of Hawaiian cultural experiences they are interested in. According to the findings of this study, English-speaking visitors are interested in Hawaiian culture and feel that Hawaiian culture is not represented enough in the tourism experience today. The conclusion is, therefore, that efforts to integrate Hawaiian culture into the tourism experience need to be increased beyond what is currently being done. Ideas for reintegrating Hawaiian culture are discussed and possible solutions are provided.

  8. Steam generator tube integrity requirements and operating experience in the United States

    International Nuclear Information System (INIS)

    Karwoski, K.J.

    2009-01-01

    Steam generator tube integrity is important to the safe operation of pressurized-water reactors. For ensuring tube integrity, the U.S. Nuclear Regulatory Commission uses a regulatory framework that is largely performance based. This performance-based framework is supplemented with some prescriptive requirements. The framework recognizes that there are three combinations of tube materials and heat treatments currently used in the United States and that the operating experience depends, in part, on the type of material used. This paper summarizes the regulatory framework for ensuring steam generator tube integrity, it highlights the current status of steam generators, and it highlights some of the steam generator issues and challenges that exist in the United States. (author)

  9. Integrated, digital experiment transient control and safety protection of an in-pile test

    International Nuclear Information System (INIS)

    Thomas, R.W.; Whitacre, R.F.; Klingler, W.B.

    1982-01-01

    The Sodium Loop Safety Facility experimental program has demonstrated that in-pile loop fuel failure transient tests can be digitally controlled and protected with reliability and precision. This was done in four nuclear experiments conducted in the Engineering Test Reactor operated by EG and G Idaho, Inc., at the Idaho National Engineering Laboratory. Loop sodium flow and reactor power transients can be programmed to the sponsor's requirements and verified prior to the test. Each controller has redundancy, which reduces the effect of single failures occurring during test transients. Feedback and reject criteria are included in the reactor power control. Timed sequencing integrates the initiation of the controllers, programmed safety set-points, and other experiment actions (e.g., planned scram). Off-line and on-line testing is included. Loss-of-flow, loss-of-piping-integrity, boiling-window, transient-overpower, and local fault tests have been successfully run using this system

  10. Spaceflight of HUVEC: An Integrated eXperiment- SPHINX Onboard the ISS

    Science.gov (United States)

    Versari, S.; Maier, J. A. M.; Norfini, A.; Zolesi, V.; Bradamante, S.

    2013-02-01

    The spaceflight orthostatic challenge can promote inadequate cardiovascular responses in astronauts, defined as cardiovascular deconditioning. In particular, disturbances of endothelial function are known to lead to altered vascular performance, since endothelial cells are crucial in maintaining the functional integrity of the vascular wall. In order to evaluate whether weightlessness affects endothelial functions, we designed, developed, and performed the experiment SPHINX - SPaceflight of HUVEC: an INtegrated eXperiment - where HUVEC (Human Umbilical Vein Endothelial Cells) were selected as a macrovascular cell model system. SPHINX arrived at the International Space Station (ISS) onboard Progress 40P, and was processed inside the Kubik 6 incubator for 7 days. At the end, all of the samples were suitably fixed and preserved at 6°C until their return to Earth on Soyuz 23S.

  11. Darkened Counsel: The Problem of Evil in Bergson’s Metaphysics of Integral Experience

    Directory of Open Access Journals (Sweden)

    Anthony Paul Smith

    2016-12-01

    Henri Bergson's work is often presented as an optimistic philosophy. This essay presents a counter-narrative to that reading by looking to the place of the problem of evil within his integral metaphysics. For, if Bergson’s philosophy is simply optimistic, or simply derives meaning from the wholeness of experience, then it risks a theodical structure which undercuts its ability to speak to contemporary social and political problems of suffering. A theodical structure is one that, at bottom, justifies the experience of suffering by way of a concept of the whole or some concept that functions to subsume everything within it. Suffering is subsumed and given meaning by placing it within a relation, often with a telos that redeems or sublimates the experience of suffering. This takes a singular experience such as suffering and renders it merely relative to the part it plays within the system of everything. On my reading, Bergson’s philosophy contains a supplement of what we might call pessimism or negativity inherent in his metaphysics as integral experience. This supplement undermines the theodical structure that may be assumed to undercut Bergson’s philosophy when confronted with evil or suffering and is seen most clearly in his critique of the notion of “everything.”

  12. Lattice design of the integrable optics test accelerator and optical stochastic cooling experiment at Fermilab

    Energy Technology Data Exchange (ETDEWEB)

    Kafka, Gene [Illinois Inst. of Technology, Chicago, IL (United States)

    2015-05-01

    The Integrable Optics Test Accelerator (IOTA) storage ring at Fermilab will serve as the backbone for a broad spectrum of Advanced Accelerator R&D (AARD) experiments, and as such, must be designed with significant flexibility in mind, but without compromising cost efficiency. The nonlinear experiments at IOTA will include: achievement of a large nonlinear tune shift/spread without degradation of dynamic aperture; suppression of strong lattice resonances; study of stability of nonlinear systems to perturbations; and studies of different variants of nonlinear magnet design. The ring optics control has challenging requirements that reach or exceed the present state of the art. The development of a complete self-consistent design of the IOTA ring optics, meeting the demands of all planned AARD experiments, is presented. Of particular interest are the precise control for nonlinear integrable optics experiments and the transverse-to-longitudinal coupling and phase stability for the Optical Stochastic Cooling Experiment (OSC). Since the beam time-of-flight must be tightly controlled in the OSC section, studies of second order corrections in this section are presented.

  13. The MIRAGE project: large scale radionuclide transport investigations and integral migration experiments

    International Nuclear Information System (INIS)

    Come, B.; Bidoglio, G.; Chapman, N.

    1986-01-01

    Predictions of radionuclide migration through the geosphere must be supported by large-scale, long-term investigations. Several research areas of the MIRAGE Project are devoted to acquiring reliable data for developing and validating models. Apart from man-made migration experiments in boreholes and/or underground galleries, attention is paid to natural geological migration systems which have been active for very long time spans. The potential role of microbial activity, either resident or introduced into the host media, is also considered. In order to clarify basic mechanisms, smaller scale ''integral'' migration experiments under fully controlled laboratory conditions are also carried out using real waste forms and representative geological media. (author)

  14. Engineering a static verification tool for GPU kernels

    OpenAIRE

    Bardsley, E; Betts, A; Chong, N; Collingbourne, P; Deligiannis, P; Donaldson, AF; Ketema, J; Liew, D; Qadeer, S

    2014-01-01

    We report on practical experiences over the last 2.5 years related to the engineering of GPUVerify, a static verification tool for OpenCL and CUDA GPU kernels, plotting the progress of GPUVerify from a prototype to a fully functional and relatively efficient analysis tool. Our hope is that this experience report will serve the verification community by helping to inform future tooling efforts. © 2014 Springer International Publishing.

  15. Experiences in applying Bayesian integrative models in interdisciplinary modeling: the computational and human challenges

    DEFF Research Database (Denmark)

    Kuikka, Sakari; Haapasaari, Päivi Elisabet; Helle, Inari

    2011-01-01

    We review the experience obtained in using integrative Bayesian models in interdisciplinary analysis focusing on sustainable use of marine resources and environmental management tasks. We have applied Bayesian models to both fisheries and environmental risk analysis problems. Bayesian belief...... be time consuming and research projects can be difficult to manage due to unpredictable technical problems related to parameter estimation. Biology, sociology and environmental economics have their own scientific traditions. Bayesian models are becoming traditional tools in fisheries biology, where...

  16. Integrated IoT technology in industrial lasers for the improved user experience

    Science.gov (United States)

    Ding, Jianwu; Liu, Jinhui

    2018-02-01

    The end users' biggest concern for any industrial equipment is the reliability and the service down-time. This is especially true for industrial lasers as they are typically used in fully or semi-automated processes. Here we demonstrate how to use integrated Internet of Things (IoT) technology in industrial lasers to address reliability and service down-time so as to improve the end users' experience.

  17. Characterization of the Caliban and Prospero Critical Assemblies Neutron Spectra for Integral Measurements Experiments

    Science.gov (United States)

    Casoli, P.; Authier, N.; Jacquet, X.; Cartier, J.

    2014-04-01

    Caliban and Prospero are two highly enriched uranium metallic core reactors operated at the CEA Valduc Center. These critical assemblies are suitable for integral experiments, such as fission yield measurements or perturbation measurements, which have been carried out recently on the Caliban reactor. Different unfolding methods, based on activation foil and fission chamber measurements, are used to characterize the reactor spectra and especially the Caliban spectrum, which is very close to a pure fission spectrum.
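
    As an illustration of the unfolding principle mentioned above, the sketch below recovers a few-group spectrum from activation-foil reaction rates via non-negative least squares; the response matrix and measurements are invented for the example and do not describe the Caliban or Prospero assemblies.

    ```python
    import numpy as np
    from scipy.optimize import nnls

    # rows = foils, columns = energy groups (hypothetical group-wise responses)
    R = np.array([[0.90, 0.30, 0.05],
                  [0.20, 0.80, 0.30],
                  [0.05, 0.40, 0.90],
                  [0.50, 0.50, 0.20]])
    true_phi = np.array([1.0, 3.0, 6.0])                 # "true" group fluxes for the demo
    noise = 1 + 0.02 * np.random.default_rng(1).standard_normal(4)
    measured = R @ true_phi * noise                      # simulated foil reaction rates

    phi_unfolded, _ = nnls(R, measured)                  # unfolded group fluxes, constrained >= 0
    print("unfolded spectrum:", phi_unfolded)
    ```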

  18. MELCOR 1.8.1 Assessment: LOFT integral experiment LP-FP-2

    International Nuclear Information System (INIS)

    Kmetyk, L.N.

    1992-12-01

    The MELCOR code has been used to model experiment LP-FP-2, an important source of integral data for qualifying severe accident code predictive capabilities. This assessment analysis clearly demonstrates MELCOR's ability to fulfill a large part of its primary, intended use, the calculation of severe accidents from full-power steady-state initiation through primary-system thermal/hydraulic response and core damage to fission product release, transport and deposition. After a number of code errors were identified and corrected, few nonstandard inputs and no code problem-specific modifications were needed to provide reasonable agreement with test data in all areas considered. Code-to-code comparisons show that MELCOR does at least as well as other ''best-estimate'' (i.e., SCDAP/RELAP5) or integral (i.e., MAAP) codes in predicting the thermal/hydraulic and core responses in this large-scale, integral experiment; in fact, MELCOR and MAAP appear to give the best agreement with data, especially for clad temperature histories. Further, our code-to-code comparisons indicate that MELCOR does at least as well as ''best-estimate'' fission product codes in predicting the source term, with a number of such codes having to be run in tandem and driven by test data or other ''best-estimate'' thermal/hydraulic and core damage codes to provide results equivalent to a single, integrated MELCOR calculation

  19. Integration and Testing Challenges of Small Satellite Missions: Experiences from the Space Technology 5 Project

    Science.gov (United States)

    Sauerwein, Timothy A.; Gostomski, Tom

    2007-01-01

    three spacecraft, learning and gaining knowledge and efficiency as spacecraft #1 integration and testing progressed. They became acutely familiar with the hardware, operation and processes for I&T, thus each team member had the experience and knowledge to safely execute I&T for spacecraft #2 and #3 together. The integration team was very versatile and each member could perform many different activities or work any spacecraft, when needed. Daily meetings between the three Lead TCs and technician team allowed the team to plan and implement activities efficiently. The three (3) spacecraft and PSS were successfully integrated and tested, shipped to the launch site, and ready for launch per the I&T schedule that was planned three years previously.

  20. Secure optical verification using dual phase-only correlation

    International Nuclear Information System (INIS)

    Liu, Wei; Liu, Shutian; Zhang, Yan; Xie, Zhenwei; Liu, Zhengjun

    2015-01-01

    We introduce a security-enhanced optical verification system using dual phase-only correlation based on a novel correlation algorithm. By employing a nonlinear encoding, the inherent locks of the verification system are obtained in real-valued random distributions, and the identity keys assigned to authorized users are designed as pure phases. The verification process is implemented in two-step correlation, so only authorized identity keys can output the discriminating auto-correlation and cross-correlation signals that satisfy the preset threshold values. Compared with the traditional phase-only-correlation-based verification systems, a higher security level against counterfeiting and collisions is obtained, which is demonstrated by cryptanalysis using known attacks, such as the known-plaintext attack and the chosen-plaintext attack. Optical experiments as well as necessary numerical simulations are carried out to support the proposed verification method. (paper)
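
    A minimal numpy sketch of standard phase-only correlation, the building block behind the dual-correlation verification described above, is given below; the paper's nonlinear encoding and two-step thresholding are not reproduced, and the keys are random stand-ins.

    ```python
    import numpy as np

    def phase_only_correlation(f, g):
        """Return the phase-only correlation (POC) surface of two equally sized 2-D arrays."""
        F, G = np.fft.fft2(f), np.fft.fft2(g)
        cross = F * np.conj(G)
        cross /= np.abs(cross) + 1e-12          # keep phase information only
        return np.real(np.fft.ifft2(cross))

    rng = np.random.default_rng(0)
    key = rng.random((64, 64))                  # stand-in for an authorized identity key
    poc_auth = phase_only_correlation(key, key)                   # authorized: sharp peak
    poc_fake = phase_only_correlation(key, rng.random((64, 64)))  # counterfeit: no peak
    print("authorized peak:", poc_auth.max(), "counterfeit peak:", poc_fake.max())
    ```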

  1. Is flow verification necessary

    International Nuclear Information System (INIS)

    Beetle, T.M.

    1986-01-01

    Safeguards test statistics are used in an attempt to detect diversion of special nuclear material. Under assumptions concerning possible manipulation (falsification) of safeguards accounting data, the effects on the statistics due to diversion and data manipulation are described algebraically. A comprehensive set of statistics that is capable of detecting any diversion of material is defined in terms of the algebraic properties of the effects. When the assumptions exclude collusion between persons in two material balance areas, then three sets of accounting statistics are shown to be comprehensive. Two of the sets contain widely known accountancy statistics. One of them does not require physical flow verification - comparisons of operator and inspector data for receipts and shipments. The third set contains a single statistic which does not require physical flow verification. In addition to not requiring technically difficult and expensive flow verification, this single statistic has several advantages over other comprehensive sets of statistics. This algebraic approach as an alternative to flow verification for safeguards accountancy is discussed in this paper

  2. Numerical Verification Of Equilibrium Chemistry

    International Nuclear Information System (INIS)

    Piro, Markus; Lewis, Brent; Thompson, William T.; Simunovic, Srdjan; Besmann, Theodore M.

    2010-01-01

    A numerical tool is in an advanced state of development to compute the equilibrium compositions of phases and their proportions in multi-component systems of importance to the nuclear industry. The resulting software is being conceived for direct integration into large multi-physics fuel performance codes, particularly for providing boundary conditions in heat and mass transport modules. However, any numerical errors produced in equilibrium chemistry computations will be propagated in subsequent heat and mass transport calculations, thus falsely predicting nuclear fuel behaviour. The necessity for a reliable method to numerically verify chemical equilibrium computations is emphasized by the requirement to handle the very large number of elements necessary to capture the entire fission product inventory. A simple, reliable and comprehensive numerical verification method is presented which can be invoked by any equilibrium chemistry solver for quality assurance purposes.
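
    One kind of check such a verification method can apply is sketched below under stated assumptions (the paper's exact criteria are not reproduced): given a candidate set of species amounts returned by an equilibrium solver, confirm that elemental mass balance holds to within a tolerance before the result is passed to heat and mass transport modules.

    ```python
    import numpy as np

    # stoichiometry matrix: rows = elements (U, O), columns = species (UO2, O2, U)
    A = np.array([[1, 0, 1],
                  [2, 2, 0]], dtype=float)
    b = np.array([1.0, 2.001])             # total moles of each element in the system
    n = np.array([0.9995, 0.001, 0.0005])  # species moles proposed by the solver (hypothetical)

    residual = A @ n - b                   # elemental mass-balance residual
    rel_error = np.abs(residual) / np.abs(b)
    assert np.all(rel_error < 1e-3), f"mass balance violated: {rel_error}"
    print("mass balance verified, max relative residual:", rel_error.max())
    ```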

  3. Integration of 18 GW Wind Energy into the Energy Market. Practical Experiences in Germany. Experiences with large-scale integration of wind power into power systems

    International Nuclear Information System (INIS)

    Krauss, C.; Graeber, B.; Lange, M.; Focken, U.

    2006-01-01

    This work describes the integration of 18 GW of wind power into the German energy market. The focus lies on reporting practical experiences concerning the use of wind energy in Germany within the framework of the renewable energy act (EEG) and the immediate exchange of wind power between the four German grid control areas. Due to the EEG the demand for monitoring the current energy production of wind farms and for short-term predictions of wind power has significantly increased and opened a broader market for these services. In particular for trading on the intraday market, ultra short term predictions in the time frame of 1 to 10 hours require different approaches than the usual day-ahead predictions because the large numerical meteorological models are not sufficiently optimized for very short time horizons. It is shown that for this range a combination of a statistical and a deterministic model leads to significant improvements and stable results as it unites the characteristics of the current wind power production with the synoptic-scale meteorological situation. The possible concepts of balancing the remaining differences between predicted and actual wind power generation are discussed. As wind power prediction errors and load forecasting errors are uncorrelated, benefits can arise from a combined balancing. Finally practical experiences with wind power fluctuations and large forecast errors are presented.
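
    The combination of a statistical and a deterministic model for the 1 to 10 hour range can be sketched as a horizon-dependent blend: the latest measured wind power dominates at the shortest lead times and the NWP-based forecast takes over at longer ones. The weighting function and all numbers below are illustrative assumptions, not the operational scheme.

    ```python
    import numpy as np

    def blended_forecast(last_measured_mw, nwp_forecast_mw, horizons_h, tau=4.0):
        """Exponentially fade from the measurement-anchored value to the NWP forecast."""
        horizons_h = np.asarray(horizons_h, dtype=float)
        w_stat = np.exp(-horizons_h / tau)               # weight of the statistical term
        return w_stat * last_measured_mw + (1.0 - w_stat) * np.asarray(nwp_forecast_mw)

    horizons = [1, 2, 4, 6, 8, 10]                       # hours ahead (intraday range)
    nwp = [14_000, 13_500, 12_800, 12_000, 11_500, 11_200]   # hypothetical NWP-based MW values
    print(blended_forecast(last_measured_mw=15_200, nwp_forecast_mw=nwp, horizons_h=horizons))
    ```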

  4. Quantitative reactive modeling and verification.

    Science.gov (United States)

    Henzinger, Thomas A

    Formal verification aims to improve the quality of software by detecting errors before they do harm. At the basis of formal verification is the logical notion of correctness , which purports to capture whether or not a program behaves as desired. We suggest that the boolean partition of software into correct and incorrect programs falls short of the practical need to assess the behavior of software in a more nuanced fashion against multiple criteria. We therefore propose to introduce quantitative fitness measures for programs, specifically for measuring the function, performance, and robustness of reactive programs such as concurrent processes. This article describes the goals of the ERC Advanced Investigator Project QUAREM. The project aims to build and evaluate a theory of quantitative fitness measures for reactive models. Such a theory must strive to obtain quantitative generalizations of the paradigms that have been success stories in qualitative reactive modeling, such as compositionality, property-preserving abstraction and abstraction refinement, model checking, and synthesis. The theory will be evaluated not only in the context of software and hardware engineering, but also in the context of systems biology. In particular, we will use the quantitative reactive models and fitness measures developed in this project for testing hypotheses about the mechanisms behind data from biological experiments.

  5. Functional Verification of Enhanced RISC Processor

    OpenAIRE

    SHANKER NILANGI; SOWMYA L

    2013-01-01

    This paper presents the design and verification of a 32-bit enhanced RISC processor core with floating point computations integrated within the core, designed to reduce cost and complexity. The designed 3-stage pipelined 32-bit RISC processor is based on the ARM7 processor architecture, with a single precision floating point multiplier and a floating point adder/subtractor for floating point operations, and a 32 x 32 Booth's multiplier added to the integer core of ARM7. The binary representati...

  6. A methodology for the rigorous verification of plasma simulation codes

    Science.gov (United States)

    Riva, Fabio

    2016-10-01

    The methodology used to assess the reliability of numerical simulation codes constitutes the Verification and Validation (V&V) procedure. V&V is composed by two separate tasks: the verification, which is a mathematical issue targeted to assess that the physical model is correctly solved, and the validation, which determines the consistency of the code results, and therefore of the physical model, with experimental data. In the present talk we focus our attention on the verification, which in turn is composed by the code verification, targeted to assess that a physical model is correctly implemented in a simulation code, and the solution verification, that quantifies the numerical error affecting a simulation. Bridging the gap between plasma physics and other scientific domains, we introduced for the first time in our domain a rigorous methodology for the code verification, based on the method of manufactured solutions, as well as a solution verification based on the Richardson extrapolation. This methodology was applied to GBS, a three-dimensional fluid code based on a finite difference scheme, used to investigate the plasma turbulence in basic plasma physics experiments and in the tokamak scrape-off layer. Overcoming the difficulty of dealing with a numerical method intrinsically affected by statistical noise, we have now generalized the rigorous verification methodology to simulation codes based on the particle-in-cell algorithm, which are employed to solve Vlasov equation in the investigation of a number of plasma physics phenomena.
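
    A minimal sketch of the solution-verification step mentioned above: the observed order of accuracy and a Richardson-extrapolated estimate computed from three systematically refined grids, here applied to a central-difference derivative of a known function as a stand-in for a full simulation code.

    ```python
    import numpy as np

    def numerical_solution(h):
        # central-difference approximation of d/dx sin(x) at x = 1 (stand-in "code result")
        x = 1.0
        return (np.sin(x + h) - np.sin(x - h)) / (2.0 * h)

    h1, r = 0.1, 2.0                         # coarsest spacing and refinement ratio
    f1, f2, f3 = (numerical_solution(h1 / r**k) for k in range(3))

    # observed order of accuracy p and Richardson-extrapolated estimate of the exact value
    p = np.log(abs(f1 - f2) / abs(f2 - f3)) / np.log(r)
    f_exact_est = f3 + (f3 - f2) / (r**p - 1.0)
    print(f"observed order p = {p:.3f} (theoretical 2), extrapolated = {f_exact_est:.8f}")
    ```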

  7. Integrated Controlling System and Unified Database for High Throughput Protein Crystallography Experiments

    International Nuclear Information System (INIS)

    Gaponov, Yu.A.; Igarashi, N.; Hiraki, M.; Sasajima, K.; Matsugaki, N.; Suzuki, M.; Kosuge, T.; Wakatsuki, S.

    2004-01-01

    An integrated controlling system and a unified database for high throughput protein crystallography experiments have been developed. Main features of protein crystallography experiments (purification, crystallization, crystal harvesting, data collection, data processing) were integrated into the software under development. All information necessary to perform protein crystallography experiments is stored (except raw X-ray data that are stored in a central data server) in a MySQL relational database. The database contains four mutually linked hierarchical trees describing protein crystals, data collection of protein crystal and experimental data processing. A database editor was designed and developed. The editor supports basic database functions to view, create, modify and delete user records in the database. Two search engines were realized: direct search of necessary information in the database and object oriented search. The system is based on TCP/IP secure UNIX sockets with four predefined sending and receiving behaviors, which support communications between all connected servers and clients with remote control functions (creating and modifying data for experimental conditions, data acquisition, viewing experimental data, and performing data processing). Two secure login schemes were designed and developed: a direct method (using the developed Linux clients with secure connection) and an indirect method (using the secure SSL connection using secure X11 support from any operating system with X-terminal and SSH support). A part of the system has been implemented on a new MAD beam line, NW12, at the Photon Factory Advanced Ring for general user experiments

  8. The global Filipino nurse: An integrative review of Filipino nurses' work experiences.

    Science.gov (United States)

    Montayre, Jed; Montayre, Jasmine; Holroyd, Eleanor

    2018-05-01

    To understand the work-related experiences of Philippine-trained nurses working globally. The Philippines is a major source country of foreign-trained nurses located globally. However, there is paucity of research on professional factors and career related issues affecting foreign-trained nurses' work experiences. An integrative review through a comprehensive search of literature was undertaken from November 2015 and was repeated in August 2016. Seven articles satisfied the selection criteria. Filipino nurses experienced differences in the practice of nursing in terms of work process, roles and autonomy. Moreover, they encountered challenges such as work-related discrimination and technical difficulties within the organisation. A clear understanding of Filipino nurses' work experiences and the challenges they have encountered suggests identification of important constructs influencing effective translation of nursing practice across cultures and health systems, which then form the basis for support strategies. It is critical to recognize foreign-trained nurses' experience of work-related differences and challenges as these foster favorable conditions for the management team to plan and continually evaluate policies around recruitment, retention and support offered to these nurses. Furthermore, findings suggest internationalization of nursing framework and standards integrating a transcultural paradigm among staff members within a work organisation. © 2017 John Wiley & Sons Ltd.

  9. Cassini's Test Methodology for Flight Software Verification and Operations

    Science.gov (United States)

    Wang, Eric; Brown, Jay

    2007-01-01

    The Cassini spacecraft was launched on 15 October 1997 on a Titan IV-B launch vehicle. The spacecraft is comprised of various subsystems, including the Attitude and Articulation Control Subsystem (AACS). The AACS Flight Software (FSW) and its development has been an ongoing effort, from the design, development and finally operations. As planned, major modifications to certain FSW functions were designed, tested, verified and uploaded during the cruise phase of the mission. Each flight software upload involved extensive verification testing. A standardized FSW testing methodology was used to verify the integrity of the flight software. This paper summarizes the flight software testing methodology used for verifying FSW from pre-launch through the prime mission, with an emphasis on flight experience testing during the first 2.5 years of the prime mission (July 2004 through January 2007).

  10. Design of two digital radiation tolerant integrated circuits for high energy physics experiments data readout

    CERN Document Server

    Bonacini, Sandro

    2003-01-01

    High Energy Physics research (HEP) involves the design of readout electronics for its experiments, which generate a high radiation field in the detectors. The several integrated circuits placed in the future Large Hadron Collider (LHC) experiments' environment have to resist the radiation and carry out their normal operation. In this thesis I will describe in detail what, during my 10-month participation in the digital section of the Microelectronics group at CERN, I had the possibility to work on: - The design of a radiation-tolerant data readout digital integrated circuit in a 0.25 µm CMOS technology, called "the Kchip", for the CMS preshower front-end system. This will be described in Chapter 3. - The design of a radiation-tolerant SRAM integrated circuit in a 0.13 µm CMOS technology, for technology radiation testing purposes and future applications in the HEP field. The SRAM will be described in Chapter 4. All the work has been carried out under the supervision and with the help of Dr. Kostas Klouki...

  11. Correlations between nuclear data and results of integral slab experiments. Case of hafnium

    International Nuclear Information System (INIS)

    Palau, J.M.

    1997-01-01

    The aim of this thesis was to evaluate how much integral slab experiments can both reduce discrepancies between experimental results and calculations, and improve the knowledge of hafnium isotopes' neutronic parameters by an adapted sensitivity and uncertainty method. A statistical approach, based on the generalized least squares method and perturbation theory, has been incorporated into our calculation system in order to deduce microscopic cross-section adjustments from observed integral measurements on this particular 'mock-up' reactor. In this study it has been established that the correlations between integral parameters and hafnium capture cross-sections enable specific variations in the region of resolved resonances, at the level of the multigroup and pointwise cross-sections of the recommended data (JEF-2.2 evaluation), to be highlighted. The use of deterministic methods (APOLLO2 code) together with Monte Carlo-type simulations (TRIPOLI4 code) enabled an in-depth analysis of the modelling approximations to be carried out. Furthermore, the sensitivity coefficient validation technique employed leads to a reliable assessment of the quality of the new basic nuclear data. In this instance, the adjustments proposed for certain isotope 177 Hf resonance parameters reduce, after error propagation, by 3 to 5 per cent the difference between experimental results and calculations related to this absorbent's efficiency. Beyond this particular application, the qualification methodology integrated in our calculation system should enable other basic sizing parameters to be treated (chemical / geometric data or other unexplored nuclear data) to make technological requirements less stringent. (author)
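
    A generalized-least-squares adjustment of the kind described above can be sketched as follows: prior cross-section parameters with covariance M are updated from integral discrepancies through a sensitivity matrix S. All numbers are illustrative; this is not the APOLLO2/TRIPOLI4 adjustment chain itself.

    ```python
    import numpy as np

    S = np.array([[0.8, 0.2],      # sensitivities of 2 integral responses
                  [0.3, 0.9]])     # to 2 cross-section parameters (relative units)
    M = np.diag([0.04**2, 0.06**2])         # prior cross-section covariance (hypothetical)
    V = np.diag([0.02**2, 0.02**2])         # experimental + calculational covariance
    d = np.array([-0.04, 0.03])             # observed (E - C)/C discrepancies

    # GLS update: delta_sigma = M S^T (S M S^T + V)^-1 d
    gain = M @ S.T @ np.linalg.inv(S @ M @ S.T + V)
    delta_sigma = gain @ d                  # relative adjustments to the cross-sections
    M_post = M - gain @ S @ M               # reduced (posterior) covariance after adjustment
    print("relative adjustments:", delta_sigma)
    print("posterior std devs:", np.sqrt(np.diag(M_post)))
    ```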

  12. Correlations between nuclear data and integral slab experiments: the case of hafnium

    International Nuclear Information System (INIS)

    Palau, J.M.

    1999-01-01

    The aim of this work was to evaluate how much integral slab experiments can both reduce discrepancies between experimental results and calculations and improve the knowledge of the neutronic parameters of hafnium isotopes, by means of an adapted sensitivity and uncertainty method. A statistical approach, based on the generalized least squares method and perturbation theory, has been incorporated into our calculation system in order to deduce microscopic cross-section adjustments from integral measurements observed on this particular 'mock-up' reactor. In this study it has been established that the correlations between integral parameters and hafnium capture cross-sections allow specific deviations from the recommended data (JEF-2.2 evaluation), at the level of multigroup and pointwise cross-sections in the resolved resonance region, to be highlighted. The use of deterministic methods together with Monte Carlo simulations enabled an in-depth analysis of the modelling approximations to be carried out. Furthermore, the sensitivity-coefficient validation technique employed leads to a reliable assessment of the quality of the new basic nuclear data. In this instance, the adjustments proposed for certain isotope 177 Hf resonance parameters reduce, after error propagation, by 3 to 5 per cent the difference between experimental results and calculations related to this absorber's efficiency. Beyond this particular application, the qualification methodology integrated in our calculation system should enable other basic sizing parameters to be treated (chemical/geometric data or other unexplored nuclear data) so as to make technological requirements less stringent. (author)

  13. From the past to the future: Integrating work experience into the design process.

    Science.gov (United States)

    Bittencourt, João Marcos; Duarte, Francisco; Béguin, Pascal

    2017-01-01

    Integrating work activity issues into the design process is a broadly discussed theme in ergonomics. Participation is presented as the main means for such integration. However, late participation can limit the development of both project solutions and future work activity. This article presents the concept of construction of experience, aiming at the articulated development of future activities and project solutions. It is a non-teleological approach in which the initial concepts are transformed by the experience built up throughout the design process. The method applied was a case study of ergonomic participation during the design of a new laboratory complex for biotechnology research. Data were obtained through analysis of records from a simulation process using a Lego scale model and through interviews with project participants. The simulation process allowed new ways of working to be developed and generated changes in the initial design solutions, enabling workers to adopt their own strategies for conducting work more safely and efficiently in the future work system. Each project decision either opens or closes a window of opportunity for developing a future activity. Construction of experience in a non-teleological design process allows the consequences of project solutions for future work to be understood.

  14. Maximizing work integration in job placement of individuals facing mental health problems: Supervisor experiences.

    Science.gov (United States)

    Skarpaas, Lisebet Skeie; Ramvi, Ellen; Løvereide, Lise; Aas, Randi Wågø

    2015-01-01

    Many people confronting mental health problems are excluded from participation in paid work. Supervisor engagement is essential for successful job placement. The aim was to elicit supervisor perspectives on the challenges involved in fostering integration to support individuals with mental health problems (trainees) in their job placement at ordinary companies. This was an explorative, qualitatively designed study with a phenomenological approach, based on semi-structured interviews with 15 supervisors involved in job placements for a total of 105 trainees (mean 7, min-max 1-30, SD 8). Data were analysed using qualitative content analysis. Supervisors experience two interrelated dilemmas concerning knowledge of the trainee and the degree of preferential treatment. The challenges to achieving successful integration were motivational: 1) supervisors' previous experience with trainees encourages future engagement, 2) developing a realistic picture of the situation, and 3) disclosure and knowledge of mental health problems; and continuity-related: 4) sustaining trainee cooperation throughout the placement process, 5) building and maintaining a good relationship between supervisor and trainee, and 6) ensuring continuous cooperation with the social security system and other stakeholders. Supervisors experience relational dilemmas regarding pre-judgment, privacy and equality. Job placement seems to be maximized when the stakeholders are motivated and recognize that cooperation must be a continuous process.

  15. Effect of Drawer Master Modeling of ZPPR15 Phase A Reactor Physics Experiment on Integral Parameter

    International Nuclear Information System (INIS)

    Yoo, Jae Woon; Kim, Sang Ji

    2011-01-01

    As part of an International Nuclear Energy Research Initiative (I-NERI) project, KAERI and ANL are analyzing the ZPPR-15 reactor physics experiments. The ZPPR-15 experiments were carried out in support of the Integral Fast Reactor (IFR) project. Because of the lack of experimental data, verifying and validating core neutronics analysis codes for metal-fueled sodium-cooled fast reactors (SFRs) has been a major concern. KAERI is developing a metal-fueled SFR and plans to construct a demonstration SFR by around 2028. The database built through this project and the results of its analysis will play an important role in validating SFR neutronics characteristics. As the first-year work of the I-NERI project, KAERI analyzed the ZPPR-15 Phase A experiment among the four phases (Phase A to D). The effect of drawer master modeling on the integral parameter was investigated. Approximated benchmark configurations for each loading were constructed to be used for validating a deterministic code

  16. Use of integral experiments to improve neutron propagation and gamma heating calculations

    International Nuclear Information System (INIS)

    Oceraies, Y.; Caumette, P.; Devillers, C.; Bussac, J.

    1979-01-01

    1) Studies to define and improve the accuracy of neutron propagation and gamma heating calculations from integral experiments fall within the fast reactor physics program at CEA. 2) A systematic analysis of neutron propagation in clean Fe-Na media, with a volumetric sodium fraction varying between 0 and 100%, has been performed on the HARMONIE source reactor. Gamma heating traverses in the core, the blankets and several control rods have been measured in the R-Z core program at MASURCA. The experimental techniques, the accuracies and the results obtained are given. The approximations of the calculational methods used to analyse these experiments and to predict the corresponding design parameters are also described. 3) Particular emphasis is given to the methods planned for improving the fundamental data used in neutron propagation calculations, using the discrepancies observed between measured and calculated results in clean integral experiments. One of these approaches, similar to the techniques used in core physics, relies upon sensitivity studies and, possibly, on adjustment techniques applied to neutron propagation. (author)

  17. Distorted Fingerprint Verification System

    Directory of Open Access Journals (Sweden)

    Divya KARTHIKAESHWARAN

    2011-01-01

    Full Text Available Fingerprint verification is one of the most reliable personal identification methods. Fingerprint matching is affected by the non-linear distortion introduced into a fingerprint impression during image acquisition. This non-linear deformation changes both the position and orientation of minutiae. The proposed system operates in three stages: alignment-based fingerprint matching, fuzzy clustering and a classifier framework. First, an enhanced input fingerprint image is aligned with the template fingerprint image and a matching score is computed. To improve the performance of the system, fuzzy clustering based on distance and density is used to cluster the feature set obtained from the fingerprint matcher. Finally a classifier framework is developed, and a cost-sensitive classifier is found to produce the best results. The system has been evaluated on a fingerprint database and the experimental results show that it achieves a verification rate of 96%. This system plays an important role in forensic and civilian applications.
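
    As a purely illustrative aside on how a verification rate such as the 96% quoted above is typically computed, the sketch below thresholds genuine and impostor match scores; the score values, threshold and function names are hypothetical and are not taken from the paper.

```python
# Illustrative only: verification metrics from match scores.
# All score values and the threshold are made-up examples.

def verification_metrics(genuine_scores, impostor_scores, threshold):
    """Return (verification rate, false accept rate) at a given threshold."""
    true_accepts = sum(s >= threshold for s in genuine_scores)
    false_accepts = sum(s >= threshold for s in impostor_scores)
    return (true_accepts / len(genuine_scores),
            false_accepts / len(impostor_scores))

genuine = [0.92, 0.88, 0.75, 0.81, 0.97, 0.55, 0.85, 0.90, 0.78, 0.95]
impostor = [0.12, 0.30, 0.25, 0.41, 0.08, 0.19, 0.35, 0.22, 0.28, 0.15]
vr, far = verification_metrics(genuine, impostor, threshold=0.6)
print(f"verification rate = {vr:.0%}, false accept rate = {far:.0%}")
```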

  18. Testing, verification and application of CONTAIN for severe accident analysis of LMFBR-containments

    International Nuclear Information System (INIS)

    Langhans, J.

    1991-01-01

    Severe accident analysis for LMFBR containments has to consider various phenomena influencing the development of containment loads, such as pressures and temperatures, as well as the generation, transport, depletion and release of aerosols and radioactive materials. As most of these phenomena are linked together, their feedback has to be taken into account when calculating severe accident consequences; otherwise no best-estimate results can be assured. Under the sponsorship of the German BMFT, the US code CONTAIN is being developed, verified and applied at GRS for future fast breeder reactor concepts. In the first step of verification, the basic calculation models of a containment code have been proven: (i) flow calculation for different flow situations, (ii) heat transfer from and to structures, (iii) coolant evaporation, boiling and condensation, (iv) material properties. In the second step, the interaction of coupled phenomena has been checked. The calculation of integrated containment experiments involving natural convection flow, structure heating and coolant condensation, as well as a parallel calculation of results obtained with another code, gives detailed information on the applicability of CONTAIN. The current verification status allows the following conclusion: a cautious analyst experienced in containment accident modelling, using the proven parts of CONTAIN, will obtain results of the same accuracy as other well-optimized and detailed lumped-parameter containment codes can achieve. Further code development, additional verification and international exchange of experience and results will assure an adequate code for application in safety analyses for LMFBRs. (orig.)

  19. Experiences of spirituality and spiritual values in the context of nursing - an integrative review.

    Science.gov (United States)

    Rudolfsson, Gudrun; Berggren, Ingela; da Silva, António Barbosa

    2014-01-01

    Spirituality is often mistakenly equated with religion but is in fact a far broader concept. The aim of this integrative review was to describe experiences of the positive impact of spirituality and spiritual values in the context of nursing. The analysis was guided by Whittemore and Knafl's integrative review method. The findings revealed seven themes: 'Being part of a greater wholeness', 'Togetherness - value based relationships', 'Developing inner strength', 'Ministering to patients', 'Maintaining one's sense of humanity', 'Viewing life as a gift evokes a desire to 'give back'' and 'Achieving closure - life goes on'. It is difficult to draw definite conclusions, as spirituality involves many perspectives on various levels of awareness. However, spirituality was considered more inclusive, fluid and personal. Furthermore, it emerged that spirituality and spiritual values in the context of nursing are closely intertwined with the concept of caring.

  20. RESRAD-BUILD verification

    International Nuclear Information System (INIS)

    Kamboj, S.; Yu, C.; Biwer, B. M.; Klett, T.

    2002-01-01

    The results generated by the RESRAD-BUILD code (version 3.0) were verified with hand or spreadsheet calculations using equations given in the RESRAD-BUILD manual for different pathways. For verification purposes, different radionuclides--H-3, C-14, Na-22, Al-26, Cl-36, Mn-54, Co-60, Au-195, Ra-226, Ra-228, Th-228, and U-238--were chosen to test all pathways and models. Tritium, Ra-226, and Th-228 were chosen because of the special tritium and radon models in the RESRAD-BUILD code. Other radionuclides were selected to represent a spectrum of radiation types and energies. Verification of the RESRAD-BUILD code was conducted with an initial check of all the input parameters for correctness against their original source documents. Verification of the calculations was performed external to the RESRAD-BUILD code with Microsoft Excel to verify all the major portions of the code. In some cases, RESRAD-BUILD results were compared with those of external codes, such as MCNP (Monte Carlo N-particle) and RESRAD. The verification was conducted on a step-by-step basis and used different test cases as templates. The following types of calculations were investigated: (1) source injection rate, (2) air concentration in the room, (3) air particulate deposition, (4) radon pathway model, (5) tritium model for volume source, (6) external exposure model, (7) different pathway doses, and (8) time dependence of dose. Some minor errors were identified in version 3.0; these errors have been corrected in later versions of the code. Some possible improvements in the code were also identified

  1. Methods of Software Verification

    Directory of Open Access Journals (Sweden)

    R. E. Gurin

    2015-01-01

    Full Text Available This article is devoted to the problem of software (SW) verification. Software verification methods are designed to check software for compliance with stated requirements such as correctness, system security, adaptability to small changes in the environment, portability, compatibility, etc. The methods differ both in the process by which they operate and in the way they achieve their results. The article describes static and dynamic methods of software verification and pays particular attention to the method of symbolic execution. In the review of static analysis, the deductive method and model-checking methods are discussed and described, and the pros and cons of each method are emphasized. The article considers a classification of test techniques for each method. We present and analyze the characteristics and mechanisms of static dependency analysis, as well as the kinds of dependencies, which can reduce the number of false positives in situations where the current state of the program combines two or more states obtained either on different execution paths or when working with multiple object values. Dependencies connect various types of software objects: single variables, elements of composite variables (structure fields, array elements), the size of heap areas, the length of strings, and the number of initialized array elements in the code verified using static methods. The article pays attention to the identification of dependencies within the framework of abstract interpretation, and gives an overview and analysis of the inference tools. Methods of dynamic analysis such as testing, monitoring and profiling are presented and analyzed, together with some kinds of tools that can be applied to software when using dynamic analysis methods. Based on this work a conclusion is drawn, which describes the most relevant problems of the analysis techniques, methods of their solution and
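
    To make the symbolic-execution idea mentioned in the review concrete, the following sketch solves the path constraints of one branch of a toy function to derive an input that reaches it; the example assumes the third-party z3-solver package and is not drawn from the article itself.

```python
# Minimal illustration of the idea behind symbolic execution: instead of
# enumerating concrete inputs, solve the path constraints of one branch.
# Assumes the z3-solver package is installed (pip install z3-solver).
from z3 import Int, Solver, sat

def toy(x, y):
    if x > 10:
        if x + y == 42:
            return "error path"   # the path we want an input for
        return "path B"
    return "path C"

x, y = Int("x"), Int("y")
solver = Solver()
solver.add(x > 10, x + y == 42)   # symbolic constraints of the error path

if solver.check() == sat:
    model = solver.model()
    cx, cy = model[x].as_long(), model[y].as_long()
    print("input reaching the error path:", cx, cy)
    assert toy(cx, cy) == "error path"
```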

  2. Experiences of Individuals With Visual Impairments in Integrated Physical Education: A Retrospective Study.

    Science.gov (United States)

    Haegele, Justin A; Zhu, Xihe

    2017-12-01

    The purpose of this retrospective study was to examine the experiences of adults with visual impairments during school-based integrated physical education (PE). An interpretative phenomenological analysis (IPA) research approach was used and 16 adults (ages 21-48 years; 10 women, 6 men) with visual impairments acted as participants for this study. The primary sources of data were semistructured audiotaped telephone interviews and reflective field notes, which were recorded during and immediately following each interview. Thematic development was undertaken utilizing a 3-step analytical process guided by IPA. Based on the data analysis, 3 interrelated themes emerged from the participant transcripts: (a) feelings about "being put to the side," frustration and inadequacy; (b) "She is blind, she can't do it," debilitating feelings from physical educators' attitudes; and (c) "not self-esteem raising," feelings about peer interactions. The 1st theme described the participants' experiences and ascribed meaning to exclusionary practices. The 2nd theme described the participants' frustration over being treated differently by their PE teachers because of their visual impairments. Lastly, "not self-esteem raising," feelings about peer interactions demonstrated how participants felt about issues regarding challenging social situations with peers in PE. Utilizing an IPA approach, the researchers uncovered 3 interrelated themes that depicted central feelings, experiences, and reflections, which informed the meaning of the participants' PE experiences. The emerged themes provide unique insight into the embodied experiences of those with visual impairments in PE and fill a previous gap in the extant literature.

  3. Prescribed burning experiences in Italy: an integrated approach to prevent forest fires

    Directory of Open Access Journals (Sweden)

    Ascoli D

    2012-02-01

    Full Text Available Prescribed burning is used in many geographical areas for multiple and integrated objectives (wildfire prevention, habitat conservation, grazing management). In Europe the collaboration between researchers and fire professionals has led to this technique being implemented effectively and efficiently over increasing areas (~10⁴ ha year⁻¹). In Italy prescribed burning has not been studied much and it is rarely applied, although new interest is now emerging. Some regions particularly threatened by wildfires have updated their legislation and set up procedures to authorize prescribed fire experiments and interventions. From 2004 to 2011 several scientific, operational and training experiences were carried out at the regional level (Basilicata, Campania, Friuli Venezia Giulia, Piemonte, Sardegna, Toscana). The present paper aims to: (i) document and compare these regional programs; (ii) discuss their frameworks and limitations; (iii) provide information about objectives, prescriptions, methods and results. The study involved universities, Forest Corps, Civil Protection, municipalities, parks and professionals from Italy and other countries. Interventions addressed integrated objectives (fire hazard reduction; habitat conservation; forest and grazing management) and involved several vegetation types (broadleaved and conifer forests; Mediterranean and continental shrublands; grasslands). Studies on fire behaviour and ecology helped to set prescriptions for specific objectives and environments. Results were transferred to professionals through training sessions. Several common elements are outlined: integrated objectives, multidisciplinary character, training and research products. Ecological questions, certification for the use of fire, communication with local communities and proposals for new studies are some of the issues outlined in the discussion. The present study is the first review at the national level and we hope it will help to deepen the

  4. Integrating supervision, control and data acquisition—The ITER Neutral Beam Test Facility experience

    Energy Technology Data Exchange (ETDEWEB)

    Luchetta, A., E-mail: adriano.luchetta@igi.cnr.it; Manduchi, G.; Taliercio, C.; Breda, M.; Capobianco, R.; Molon, F.; Moressa, M.; Simionato, P.; Zampiva, E.

    2016-11-15

    Highlights: • The paper describes the experience gained in integrating different systems into the control and data acquisition system of the ITER Neutral Beam Test Facility. • It describes the way the different frameworks have been integrated. • It reports some lessons learnt during system integration. • It reports some of the authors’ considerations about the development of the ITER CODAC. - Abstract: The ITER Neutral Beam (NBI) Test Facility, under construction in Padova, Italy, consists of the ITER full-scale ion source for the heating neutral beam injector, referred to as SPIDER, and the full-size prototype injector, referred to as MITICA. The Control and Data Acquisition System (CODAS) for SPIDER has been developed and is going to be in operation in 2016. The system is composed of four main components: Supervision, Slow Control, Fast Control and Data Acquisition. These components interact with each other to carry out the system operation and, since they represent a common pattern in fusion experiments, software frameworks have been used for each (set of) component. In order to reuse as far as possible the architecture developed for SPIDER, it is important to clearly define the boundaries and the interfaces among the system components so that the implementation of any component can be replaced without affecting the overall architecture. This work reports the experience gained in the development of the SPIDER components, highlighting the importance of defining generic interfaces among components, showing how specific solutions have been adapted to such interfaces and suggesting possible approaches for the development of other ITER subsystems.
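
    The point made above about defining clear boundaries and interfaces, so that the implementation behind any component can be replaced, can be illustrated with a small sketch; the interface and method names below are invented for illustration and do not come from the SPIDER CODAS code.

```python
# Illustrative only: generic interfaces between control-system components,
# so that the framework behind each one can be swapped without touching
# the overall architecture.  Names are hypothetical.
from abc import ABC, abstractmethod

class DataAcquisition(ABC):
    @abstractmethod
    def start_pulse(self, pulse_id: int) -> None: ...
    @abstractmethod
    def stop_pulse(self, pulse_id: int) -> None: ...

class SlowControl(ABC):
    @abstractmethod
    def read_parameter(self, name: str) -> float: ...

class Supervisor:
    """Coordinates the other components strictly through their interfaces."""
    def __init__(self, daq: DataAcquisition, slow: SlowControl) -> None:
        self.daq, self.slow = daq, slow

    def run_pulse(self, pulse_id: int) -> None:
        if self.slow.read_parameter("source_pressure") > 0.0:
            self.daq.start_pulse(pulse_id)
            self.daq.stop_pulse(pulse_id)
```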

  5. Integrating supervision, control and data acquisition—The ITER Neutral Beam Test Facility experience

    International Nuclear Information System (INIS)

    Luchetta, A.; Manduchi, G.; Taliercio, C.; Breda, M.; Capobianco, R.; Molon, F.; Moressa, M.; Simionato, P.; Zampiva, E.

    2016-01-01

    Highlights: • The paper describes the experience gained in integrating different systems into the control and data acquisition system of the ITER Neutral Beam Test Facility. • It describes the way the different frameworks have been integrated. • It reports some lessons learnt during system integration. • It reports some of the authors’ considerations about the development of the ITER CODAC. - Abstract: The ITER Neutral Beam (NBI) Test Facility, under construction in Padova, Italy, consists of the ITER full-scale ion source for the heating neutral beam injector, referred to as SPIDER, and the full-size prototype injector, referred to as MITICA. The Control and Data Acquisition System (CODAS) for SPIDER has been developed and is going to be in operation in 2016. The system is composed of four main components: Supervision, Slow Control, Fast Control and Data Acquisition. These components interact with each other to carry out the system operation and, since they represent a common pattern in fusion experiments, software frameworks have been used for each (set of) component. In order to reuse as far as possible the architecture developed for SPIDER, it is important to clearly define the boundaries and the interfaces among the system components so that the implementation of any component can be replaced without affecting the overall architecture. This work reports the experience gained in the development of the SPIDER components, highlighting the importance of defining generic interfaces among components, showing how specific solutions have been adapted to such interfaces and suggesting possible approaches for the development of other ITER subsystems.

  6. Verification of a Fissile Material Cut-off Treaty (FMCT): The Potential Role of the IAEA

    International Nuclear Information System (INIS)

    Chung, Jin Ho

    2016-01-01

    The objective of verification of a future Fissile Material Cut-off Treaty (FMCT) is to deter and detect non-compliance with treaty obligations in a timely and non-discriminatory manner with regard to banning the production of fissile material for nuclear weapons or other nuclear devices. Since the International Atomic Energy Agency (IAEA) has already established IAEA safeguards as a verification system mainly for Non-Nuclear Weapon States (NNWSs), it is expected that the IAEA's experience and expertise in this field will make a significant contribution to setting up a future treaty's verification regime. This paper explores the potential role of the IAEA in verifying the future treaty by analyzing the Agency's verification abilities and the expected challenges. Furthermore, the concept of multilateral verification that could be facilitated by the IAEA is examined as a means of providing credible assurance of compliance with a future treaty. In this circumstance, it is necessary for the IAEA to be prepared to play a leading role in FMCT verification as a form of multilateral verification, by taking advantage of its existing verification concepts, methods, and tools. Also, several challenges that the Agency faces today need to be overcome, including dealing with sensitive and proliferation-relevant information, attribution of fissile materials, lack of verification experience in military fuel cycle facilities, and differing attitudes and cultures towards verification between NWSs and NNWSs.

  7. Verification of a Fissile Material Cut-off Treaty (FMCT): The Potential Role of the IAEA

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Jin Ho [Korea Institute of Nuclear Nonproliferation and Control, Daejeon (Korea, Republic of)

    2016-05-15

    The objective of verification of a future Fissile Material Cut-off Treaty (FMCT) is to deter and detect non-compliance with treaty obligations in a timely and non-discriminatory manner with regard to banning the production of fissile material for nuclear weapons or other nuclear devices. Since the International Atomic Energy Agency (IAEA) has already established IAEA safeguards as a verification system mainly for Non-Nuclear Weapon States (NNWSs), it is expected that the IAEA's experience and expertise in this field will make a significant contribution to setting up a future treaty's verification regime. This paper explores the potential role of the IAEA in verifying the future treaty by analyzing the Agency's verification abilities and the expected challenges. Furthermore, the concept of multilateral verification that could be facilitated by the IAEA is examined as a means of providing credible assurance of compliance with a future treaty. In this circumstance, it is necessary for the IAEA to be prepared to play a leading role in FMCT verification as a form of multilateral verification, by taking advantage of its existing verification concepts, methods, and tools. Also, several challenges that the Agency faces today need to be overcome, including dealing with sensitive and proliferation-relevant information, attribution of fissile materials, lack of verification experience in military fuel cycle facilities, and differing attitudes and cultures towards verification between NWSs and NNWSs.

  8. In-core Instrument Subcritical Verification (INCISV) - Core Design Verification Method - 358

    International Nuclear Information System (INIS)

    Prible, M.C.; Heibel, M.D.; Conner, S.L.; Sebastiani, P.J.; Kistler, D.P.

    2010-01-01

    According to the standard on reload startup physics testing, ANSI/ANS 19.6.1, a plant must verify that the constructed core behaves sufficiently close to the designed core to confirm that the various safety analyses bound the actual behavior of the plant. A large portion of this verification must occur before the reactor operates at power. The INCISV Core Design Verification Method uses the unique characteristics of a Westinghouse Electric Company fixed in-core self-powered detector design to perform core design verification after a core reload and before power operation. A vanadium self-powered detector that spans the length of the active fuel region is capable of confirming the required core characteristics prior to power ascension: reactivity balance, shutdown margin, temperature coefficient and power distribution. Using a detector element that spans the length of the active fuel region inside the core provides a signal of the total integrated flux. Measuring the integrated flux distributions and their changes at various rodded conditions and plant temperatures, and comparing them to predicted flux levels, validates all necessary core design characteristics. INCISV eliminates the dependence on the various corrections and assumptions between the ex-core detectors and the core that traditional physics testing programs require. This program also eliminates the need for special rod maneuvers, which are infrequently performed by plant operators, during typical core design verification testing and allows for safer startup activities. (authors)
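
    A hypothetical sketch of the kind of measured-versus-predicted comparison described above: integrate an axial detector signal over the active fuel length and compare it with the design prediction. The positions, signals and 5% acceptance band are illustrative values, not plant data.

```python
# Illustrative only: compare a measured integrated axial flux with a
# predicted one.  All numbers are made up, not plant data.

def trapezoid(x, y):
    """Simple trapezoidal integral of y over x."""
    return sum(0.5 * (y[i] + y[i + 1]) * (x[i + 1] - x[i])
               for i in range(len(x) - 1))

z = [0.0, 0.9, 1.8, 2.7, 3.66]              # axial positions (m)
measured  = [1.00, 1.60, 1.70, 1.50, 0.90]  # detector signal (arb. units)
predicted = [1.05, 1.55, 1.75, 1.45, 0.95]  # design-code prediction

m, p = trapezoid(z, measured), trapezoid(z, predicted)
deviation = abs(m - p) / p
print(f"measured/predicted integrated flux = {m / p:.3f}")
print("within 5% band" if deviation < 0.05 else "outside 5% band")
```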

  9. Testing and verification of a novel single-channel IGBT driver circuit

    Directory of Open Access Journals (Sweden)

    Lukić Milan

    2016-01-01

    Full Text Available This paper presents a novel single-channel IGBT driver circuit together with a procedure for its testing and verification. It is based on a specialized integrated circuit with a complete range of protective functions. Experiments were performed to test and verify its behaviour, and the experimental results are presented in the form of oscilloscope recordings. It is concluded that the new driver circuit is compatible with modern IGBT transistors and the demands of power converters, and that it can be applied in new designs. It is part of a new 20 kW industrial-grade boost converter.

  10. Integrated monitoring of multi-domain backbone connections Operational experience in the LHC optical private network

    CERN Document Server

    Marcu, Patricia; Fritz, Wolfgang; Yampolskiy, Mark; Hommel, Wolfgang

    2011-01-01

    Novel large-scale research projects often require cooperation between various project partners spread around the entire world. They not only need huge computing resources, but also a reliable network to operate on. The Large Hadron Collider (LHC) at CERN is a representative example of such a project. Its experiments result in a vast amount of data, which is interesting for researchers around the world. For transporting the data from CERN to 11 data processing and storage sites, an optical private network (OPN) has been constructed. As the experiment data is highly valuable, the LHC places very high requirements on the underlying network infrastructure. In order to fulfil those requirements, the connections have to be managed and monitored permanently. In this paper, we present the integrated monitoring solution developed for the LHCOPN. We first outline the requirements and show how they are met on the individual network layers. After that, we describe how those individual measurements can be comb...

  11. Streaming experiment of gamma-ray obliquely incident on concrete shield wall with straight cylindrical ducts and verification of single scattering code

    International Nuclear Information System (INIS)

    Yamaji, Akio; Saito, Tetsuo.

    1988-01-01

    To investigate the proximity effect of ducts on shield performance against γ radiation, an experiment was performed at JRR-4 by directing the γ-ray beam onto a 100 cm thick concrete shield wall containing 3 or 5 straight cylindrical ducts of 4.45 cm radius, placed in a straight line or crosswise at intervals of 8.9 cm. The dose rates were measured using digital dosimeters along a horizontal line 20 cm behind the rear face of the wall, with 0, 1, 3 and 5 ducts and with incident angles of 0°, 7°, 14° and 20°, respectively. The dose rate distributions depended on the number of ducts and the incident angle: the ratios of the three-duct to no-duct dose rates ranged over 3.6 to 12, 1.3 to 5.0 and 1.1 to 4.3 for incident angles of 7°, 14° and 20°, while the single-duct to no-duct ratios ranged over 1.2 to 7.1, 1.1 to 2.7 and 1.0 to 1.9, respectively. The experiment was analyzed using the multigroup single-scattering code G33YSN, which is able to treat the geometry of the ducts exactly. For each incident angle, the calculation agreed with the experiment within a factor of 2. (author)

  12. LEO-to-ground optical communications using SOTA (Small Optical TrAnsponder) - Payload verification results and experiments on space quantum communications

    Science.gov (United States)

    Carrasco-Casado, Alberto; Takenaka, Hideki; Kolev, Dimitar; Munemasa, Yasushi; Kunimori, Hiroo; Suzuki, Kenji; Fuse, Tetsuharu; Kubo-Oka, Toshihiro; Akioka, Maki; Koyama, Yoshisada; Toyoshima, Morio

    2017-10-01

    Free-space optical communications have long held the promise of revolutionizing space communications. The benefits of increasing the bitrate while reducing the volume, mass and energy of the space terminals have attracted the attention of many researchers. In the last few years, more and more technology demonstrations have been taking place with participants from both the public and the private sector. The National Institute of Information and Communications Technology (NICT) in Japan has long experience in this field. SOTA (Small Optical TrAnsponder) was the last NICT space lasercom mission, designed to demonstrate the potential of this technology applied to microsatellites. Since the beginning of the SOTA mission in 2014, NICT regularly established communication using the Optical Ground Stations (OGS) located at the headquarters in Koganei (Tokyo) to receive the SOTA signals, with over one hundred successful links. All the goals of the SOTA mission were fulfilled, including up to 10-Mbit/s downlinks using two different wavelengths and apertures, coarse and fine tracking of the OGS beacon, space-to-ground transmission of the on-board-camera images, experiments with different error correcting codes, interoperability with other international OGSs, and experiments on quantum communications. The SOTA mission ended in November 2016, more than doubling the designed lifetime of 1 year. In this paper, the SOTA characteristics and basic operation are explained, along with the most relevant technological demonstrations.

  13. A new integral experiment on copper with DT neutron source at JAEA/FNS

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, Saerom, E-mail: kwon.saerom@jaea.go.jp; Sato, Satoshi; Ohta, Masayuki; Ochiai, Kentaro; Konno, Chikara

    2016-11-01

    Highlights: • An integral experiment on copper with DT neutrons was performed with little influence from background neutrons, which were efficiently absorbed in the Li{sub 2}O layers. • The experimental analyses were carried out using MCNP5-1.40 and recent nuclear data libraries. • The focus is on the underestimation of reaction rates related to lower-energy neutrons. • The combination of the {sup 63}Cu data in JEFF-3.2 and the {sup 65}Cu data in JENDL-4.0 gave the best C/E. • The specific cross section data of copper should be reassessed. - Abstract: In order to validate copper nuclear data, an integral experiment on copper with the DT neutron source at JAEA/FNS was performed over 20 years ago. That experiment showed that the ratios of calculated values to experimental ones (C/Es) related to lower-energy neutrons were drastically smaller than unity. In order to reveal the reasons for the small C/Es, we performed a new integral experiment on copper with the DT neutron source at JAEA/FNS. A quasi-cylindrical copper assembly of 315 mm in radius and 608 mm in depth was covered with Li{sub 2}O blocks, 51 mm thick for the front and side parts and 153 mm thick for the rear part, to exclude background neutrons which might affect the measured data. We measured reaction rates with 5 activation foils and fission rates with 2 micro fission chambers at the center of the assembly. The experiment was analyzed using MCNP5-1.40 with the recent nuclear data libraries ENDF/B-VII.1, JEFF-3.2 and JENDL-4.0. As a result, the C/E of the {sup 197}Au(n,γ){sup 198}Au reaction rate improved by 10% with respect to the previous result, and the combination of the {sup 63}Cu data in JEFF-3.2 and the {sup 65}Cu data in JENDL-4.0 increased the C/E by a further 10% because of the resonance data of {sup 63}Cu in JEFF-3.2. Moreover, the calculated result with the {sup 63}Cu data in JEFF-3.2 and {sup 65}Cu data in JENDL-4.0 with 10% larger elastic
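
    The C/E comparisons discussed above are simple ratios of calculated to measured reaction rates; the short sketch below shows the bookkeeping with invented numbers (the reaction labels are common dosimetry reactions, the rate values are not from the experiment).

```python
# Illustrative only: C/E ratios for a few reaction rates.
# Rate values are invented and carry arbitrary units.
measured   = {"197Au(n,g)198Au": 2.10e-30, "115In(n,n')115mIn": 5.40e-31}
calculated = {"197Au(n,g)198Au": 1.85e-30, "115In(n,n')115mIn": 5.10e-31}

for reaction, e in measured.items():
    c = calculated[reaction]
    print(f"{reaction}: C/E = {c / e:.3f} ({(c / e - 1) * 100:+.1f}%)")
```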

  14. Verification and validation in computational fluid dynamics

    Science.gov (United States)

    Oberkampf, William L.; Trucano, Timothy G.

    2002-04-01

    Verification and validation (V&V) are the primary means to assess accuracy and reliability in computational simulations. This paper presents an extensive review of the literature in V&V in computational fluid dynamics (CFD), discusses methods and procedures for assessing V&V, and develops a number of extensions to existing ideas. The review of the development of V&V terminology and methodology points out the contributions from members of the operations research, statistics, and CFD communities. Fundamental issues in V&V are addressed, such as code verification versus solution verification, model validation versus solution validation, the distinction between error and uncertainty, conceptual sources of error and uncertainty, and the relationship between validation and prediction. The fundamental strategy of verification is the identification and quantification of errors in the computational model and its solution. In verification activities, the accuracy of a computational solution is primarily measured relative to two types of highly accurate solutions: analytical solutions and highly accurate numerical solutions. Methods for determining the accuracy of numerical solutions are presented and the importance of software testing during verification activities is emphasized. The fundamental strategy of validation is to assess how accurately the computational results compare with the experimental data, with quantified error and uncertainty estimates for both. This strategy employs a hierarchical methodology that segregates and simplifies the physical and coupling phenomena involved in the complex engineering system of interest. A hypersonic cruise missile is used as an example of how this hierarchical structure is formulated. The discussion of validation assessment also encompasses a number of other important topics. A set of guidelines is proposed for designing and conducting validation experiments, supported by an explanation of how validation experiments are different
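
    One concrete code-verification practice implied above, measuring a solution against an analytical solution on successively refined grids, can be sketched as follows; the error values are invented for illustration.

```python
# Illustrative only: observed order of accuracy from discretization errors
# measured against an analytical solution on three uniformly refined grids.
import math

def observed_order(err_coarse, err_medium, err_fine, r=2.0):
    """Estimate p in error ~ h**p from two successive grid refinements."""
    return (math.log(err_coarse / err_medium) / math.log(r),
            math.log(err_medium / err_fine) / math.log(r))

p12, p23 = observed_order(4.0e-3, 1.0e-3, 2.6e-4)
print(f"observed order: {p12:.2f} (coarse->medium), {p23:.2f} (medium->fine)")
# Values close to the formal order of the scheme (e.g. 2 for a second-order
# method) support the claim that the equations are being solved correctly.
```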

  15. Experimental Verification of Boyle's Law and the Ideal Gas Law

    Science.gov (United States)

    Ivanov, Dragia Trifonov

    2007-01-01

    Two new experiments are offered concerning the experimental verification of Boyle's law and the ideal gas law. To carry out the experiments, glass tubes, water, a syringe and a metal manometer are used. The pressure of the saturated water vapour is taken into consideration. For educational purposes, the experiments are characterized by their…
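
    A minimal numerical check of the kind of data such an experiment yields: at constant temperature the product pV of the trapped air should stay constant once the saturated water-vapour pressure has been subtracted. The readings below are illustrative, not the authors' measurements.

```python
# Illustrative only: check Boyle's law (p*V ~ constant at fixed temperature)
# on made-up readings, subtracting the saturated water-vapour pressure.
P_SAT = 2.3   # kPa, approximate saturated water-vapour pressure near 20 C

readings = [  # (total pressure in kPa, trapped air volume in cm^3)
    (103.6, 20.0),
    (128.9, 16.0),
    (171.2, 12.0),
    (255.6,  8.0),
]

products = [(p - P_SAT) * v for p, v in readings]
mean = sum(products) / len(products)
spread = (max(products) - min(products)) / mean
print("pV products:", [round(x) for x in products])
print(f"relative spread: {spread:.2%} (a small spread supports Boyle's law)")
```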

  16. First feedback with the AMMON integral experiment for the JHR calculations

    Directory of Open Access Journals (Sweden)

    Lemaire M.

    2013-03-01

    Full Text Available The innovative design of the next international Material Testing Reactor, the Jules Horowitz Reactor (JHR), has led to the development of a new neutron and photon calculation scheme, HORUS3D/P&N, based on deterministic and stochastic codes and the European nuclear data library JEFF3.1.1. A new integral experiment, named the AMMON experiment, was designed in order to provide experimental validation of HORUS3D. The objectives of this experimental program are to calibrate the biases and uncertainties associated with the HORUS3D/N&P calculations for JHR safety and design studies, and also to validate some specific nuclear data (concerning mainly hafnium and beryllium isotopes). The experiment began in 2010 and is currently being performed in the EOLE zero-power critical mock-up at CEA Cadarache. This paper deals with the first feedback from the AMMON experiment, based on 3D Monte Carlo TRIPOLI4®/JEFF3.1.1 calculations.

  17. Results from the First Two Flights of the Static Computer Memory Integrity Testing Experiment

    Science.gov (United States)

    Hancock, Thomas M., III

    1999-01-01

    This paper details the scientific objectives, experiment design, data collection method, and post-flight analysis following the first two flights of the Static Computer Memory Integrity Testing (SCMIT) experiment. SCMIT is designed to detect soft-event upsets in passive magnetic memory. A soft-event upset is a change in the logic state of active or passive forms of magnetic memory, commonly referred to as a "bitflip". In its mildest form a soft-event upset can cause software exceptions or unexpected events, trigger spacecraft safing (ending data collection), or corrupt fault protection and error recovery capabilities. In its most severe form, loss of the mission or spacecraft can occur. Analysis after the first flight (in 1991 during STS-40) identified possible soft-event upsets in 25% of the experiment detectors. Post-flight analysis after the second flight (in 1997 on STS-87) failed to find any evidence of soft-event upsets. The SCMIT experiment is currently scheduled for a third flight in December 1999 on STS-101.
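
    The post-flight comparison used to detect soft-event upsets can be illustrated with a short, generic sketch: memory written with a known pattern before flight is compared bit by bit afterwards. This is not the SCMIT analysis code, just the underlying idea.

```python
# Illustrative only: count bit flips by comparing a post-flight memory dump
# against the known pattern written before flight.
def count_bitflips(before: bytes, after: bytes) -> int:
    """Number of bits that differ between two equal-length memory images."""
    return sum(bin(a ^ b).count("1") for a, b in zip(before, after))

pattern = bytes([0xAA] * 1024)     # pattern written pre-flight
dump = bytearray(pattern)
dump[100] ^= 0x01                  # simulate one upset bit
dump[512] ^= 0x10                  # and another

print("soft-event upsets detected:", count_bitflips(pattern, bytes(dump)))
```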

  18. On the feasibility to perform integral transmission experiments in the GELINA target hall at IRMM

    Science.gov (United States)

    Leconte, Pierre; Jean, Cyrille De Saint; Geslot, Benoit; Plompen, Arjan; Belloni, Francesca; Nyman, Markus

    2017-09-01

    Shielding experiments are relevant for validating elastic and inelastic scattering cross sections in the fast energy range. In this paper, we focus on the possibility of using the pulsed white-spectrum neutron time-of-flight facility GELINA to perform this kind of measurement. Several issues need to be addressed: neutron source intensity, room-return effects, the distance of the materials to be irradiated from the source, and the sensitivity of various reaction rate distributions through the material to different input cross sections. MCNP6 and TRIPOLI4 calculations of the outgoing neutron spectrum are compared, based on electron/positron/gamma/neutron simulations. A first concept for an integral transmission experiment through a 238U slab is considered. It shows that a 10 cm thickness of uranium is sufficient to reach a high sensitivity to the 238U inelastic scattering cross section in the [2-5 MeV] energy range, with small contributions from the elastic and fission cross sections. This experiment would contribute to reducing the uncertainty of these nuclear data, which have a significant impact on the power distribution in large commercial reactors. Other materials that would be relevant for the ASTRID 4th-generation prototype reactor are also examined, showing that sufficient sensitivity to nuclear data would be obtained by using a 50 to 100 cm thick slab with a 60 x 60 cm cross section. This study concludes on the feasibility and interest of such experiments in the target hall of the GELINA facility.

  19. Quantum money with classical verification

    Energy Technology Data Exchange (ETDEWEB)

    Gavinsky, Dmitry [NEC Laboratories America, Princeton, NJ (United States)

    2014-12-04

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries - this property is not directly related to the possibility of classical verification, nevertheless none of the earlier quantum money constructions is known to possess it.

  20. Quantum money with classical verification

    International Nuclear Information System (INIS)

    Gavinsky, Dmitry

    2014-01-01

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries - this property is not directly related to the possibility of classical verification, nevertheless none of the earlier quantum money constructions is known to possess it

  1. Shield verification and validation action matrix summary

    International Nuclear Information System (INIS)

    Boman, C.

    1992-02-01

    WSRC-RP-90-26, Certification Plan for Reactor Analysis Computer Codes, describes a series of action items to be completed for certification of reactor analysis computer codes used in Technical Specifications development and for other safety and production support calculations. Validation and verification are an integral part of the certification process. This document identifies the work performed and the documentation generated to satisfy these action items for the SHIELD, SHLDED, GEDIT, GENPRT, FIPROD, FPCALC, and PROCES modules of the SHIELD system; it is not a certification of the complete SHIELD system. Complete certification will follow at a later date. Each action item is discussed with the justification for its completion. Specific details of the work performed are not included in this document but can be found in the references. The validation and verification effort for the SHIELD, SHLDED, GEDIT, GENPRT, FIPROD, FPCALC, and PROCES modules of the SHIELD system computer code is complete.

  2. Formal verification of industrial control systems

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Verification of critical software is a high priority but a challenging task for industrial control systems. For many kinds of problems, testing is not an efficient method. Formal methods, such as model checking, appear to be an appropriate complementary method. However, it is not yet common to use model checking in industry, as this method typically needs formal-methods expertise and huge computing power. In the EN-ICE-PLC section, we are working on a [methodology][1] and a tool ([PLCverif][2]) to overcome these challenges and to integrate formal verification into the development process of our PLC-based control systems. [1]: http://cern.ch/project-plc-formalmethods [2]: http://cern.ch/plcverif

  3. GRIMHX verification and validation action matrix summary

    International Nuclear Information System (INIS)

    Trumble, E.F.

    1991-12-01

    WSRC-RP-90-026, Certification Plan for Reactor Analysis Computer Codes, describes a series of action items to be completed for certification of reactor analysis computer codes used in Technical Specifications development and for other safety and production support calculations. Validation and verification of the code are an integral part of this process. This document identifies the work performed and the documentation generated to satisfy these action items for the Reactor Physics computer code GRIMHX. Each action item is discussed with the justification for its completion. Specific details of the work performed are not included in this document but are found in the references. The publication of this document signals that the validation and verification effort for the GRIMHX code is complete.

  4. Integrating Variable Renewable Energy in Electric Power Markets: Best Practices from International Experience

    Energy Technology Data Exchange (ETDEWEB)

    Cochran, J.; Bird, L.; Heeter, J.; Arent, D. A.

    2012-04-01

    Many countries -- reflecting very different geographies, markets, and power systems -- are successfully managing high levels of variable renewable energy on the electric grid, including that from wind and solar energy. This study documents the diverse approaches to effective integration of variable renewable energy among six countries -- Australia (South Australia), Denmark, Germany, Ireland, Spain, and the United States (Western region-Colorado and Texas)-- and summarizes policy best practices that energy ministers and other stakeholders can pursue to ensure that electricity markets and power systems can effectively coevolve with increasing penetrations of variable renewable energy. Each country has crafted its own combination of policies, market designs, and system operations to achieve the system reliability and flexibility needed to successfully integrate renewables. Notwithstanding this diversity, the approaches taken by the countries studied all coalesce around five strategic areas: lead public engagement, particularly for new transmission; coordinate and integrate planning; develop rules for market evolution that enable system flexibility; expand access to diverse resources and geographic footprint of operations; and improve system operations. The ability to maintain a broad ecosystem perspective, to organize and make available the wealth of experiences, and to ensure a clear path from analysis to enactment should be the primary focus going forward.

  5. Integrating Variable Renewable Energy in Electric Power Markets: Best Practices from International Experience, Summary for Policymakers

    Energy Technology Data Exchange (ETDEWEB)

    Cochran, J.; Bird, L.; Heeter, J.; Arent, D. A.

    2012-04-01

    Many countries -- reflecting very different geographies, markets, and power systems -- are successfully managing high levels of variable renewable energy on the electric grid, including that from wind and solar energy. This document summarizes policy best practices that energy ministers and other stakeholders can pursue to ensure that electricity markets and power systems can effectively coevolve with increasing penetrations of variable renewable energy. There is no one-size-fits-all approach; each country studied has crafted its own combination of policies, market designs, and system operations to achieve the system reliability and flexibility needed to successfully integrate renewables. Notwithstanding this diversity, the approaches taken by the countries studied all coalesce around five strategic areas: lead public engagement, particularly for new transmission; coordinate and integrate planning; develop rules for market evolution that enable system flexibility; expand access to diverse resources and geographic footprint of operations; and improve system operations. This study also emphatically underscores the value of countries sharing their experiences. The more diverse and robust the experience base from which a country can draw, the more likely that it will be able to implement an appropriate, optimized, and system-wide approach.

  6. Integrating Variable Renewable Energy in Electric Power Markets. Best Practices from International Experience, Summary for Policymakers

    Energy Technology Data Exchange (ETDEWEB)

    Cochran, Jaquelin [National Renewable Energy Lab. (NREL), Golden, CO (United States); Bird, Lori [National Renewable Energy Lab. (NREL), Golden, CO (United States); Heeter, Jenny [National Renewable Energy Lab. (NREL), Golden, CO (United States); Arent, Douglas J. [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2012-04-30

    Many countries - reflecting very different geographies, markets, and power systems - are successfully managing high levels of variable renewable energy on the electric grid, including that from wind and solar energy. This document summarizes policy best practices that energy ministers and other stakeholders can pursue to ensure that electricity markets and power systems can effectively coevolve with increasing penetrations of variable renewable energy. There is no one-size-fits-all approach; each country studied has crafted its own combination of policies, market designs, and system operations to achieve the system reliability and flexibility needed to successfully integrate renewables. Notwithstanding this diversity, the approaches taken by the countries studied all coalesce around five strategic areas: lead public engagement, particularly for new transmission; coordinate and integrate planning; develop rules for market evolution that enable system flexibility; expand access to diverse resources and geographic footprint of operations; and improve system operations. This study also emphatically underscores the value of countries sharing their experiences. The more diverse and robust the experience base from which a country can draw, the more likely that it will be able to implement an appropriate, optimized, and system-wide approach.

  7. Integrating UNIX workstation into existing online data acquisition systems for Fermilab experiments

    International Nuclear Information System (INIS)

    Oleynik, G.

    1991-03-01

    With the availability of cost-effective computing power from multiple vendors of UNIX workstations, experiments at Fermilab are adding such computers to their VMS-based online data acquisition systems. In anticipation of this trend, we have extended the software products available in our widely used VAXONLINE and PANDA data acquisition software systems to provide support for integrating these workstations into existing distributed online systems. The software packages we are providing pave the way for the smooth migration of applications from the current Data Acquisition Host and Monitoring computers running the VMS operating system to UNIX-based computers of various flavors. We report on software for Online Event Distribution from VAXONLINE and PANDA, integration of Message Reporting Facilities, and a framework under UNIX for experiments to monitor and view the raw event data produced at any level in their DA system. We have developed software that allows host UNIX computers to communicate with intelligent front-end embedded read-out controllers and processor boards running the pSOS operating system. Both RS-232 and Ethernet control paths are supported. This enables calibration and hardware monitoring applications to be migrated to these platforms. 6 refs., 5 figs

  8. Integrating Variable Renewable Energy in Electric Power Markets. Best Practices from International Experience

    Energy Technology Data Exchange (ETDEWEB)

    Cochran, Jaquelin [National Renewable Energy Lab. (NREL), Golden, CO (United States); Bird, Lori [National Renewable Energy Lab. (NREL), Golden, CO (United States); Heeter, Jenny [National Renewable Energy Lab. (NREL), Golden, CO (United States); Arent, Douglas J. [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2012-04-30

    Many countries—reflecting very different geographies, markets, and power systems—are successfully managing high levels of variable renewable energy on the electric grid, including that from wind and solar energy. This document summarizes policy best practices that energy ministers and other stakeholders can pursue to ensure that electricity markets and power systems can effectively coevolve with increasing penetrations of variable renewable energy. There is no one-size-fits-all approach; each country studied has crafted its own combination of policies, market designs, and system operations to achieve the system reliability and flexibility needed to successfully integrate renewables. Notwithstanding this diversity, the approaches taken by the countries studied all coalesce around five strategic areas: lead public engagement, particularly for new transmission; coordinate and integrate planning; develop rules for market evolution that enable system flexibility; expand access to diverse resources and geographic footprint of operations; and improve system operations. This study also emphatically underscores the value of countries sharing their experiences. The more diverse and robust the experience base from which a country can draw, the more likely that it will be able to implement an appropriate, optimized, and system-wide approach.

  9. Integral experiment on molybdenum with DT neutrons at JAEA/FNS

    Energy Technology Data Exchange (ETDEWEB)

    Ohta, Masayuki, E-mail: ohta.masayuki@jaea.go.jp; Sato, Satoshi; Kwon, Saerom; Ochiai, Kentaro; Konno, Chikara

    2016-11-01

    Highlights: • An integral experiment on molybdenum was conducted with DT neutrons at JAEA/FNS. • The experimental results were analyzed with MCNP5 and recent nuclear data libraries. • The calculated results generally show underestimation. • Problems with recent nuclear data for molybdenum are discussed. - Abstract: An integral experiment on molybdenum is performed with a DT neutron source at JAEA/FNS. A Mo assembly is covered with lithium oxide blocks in order to reduce background neutrons inside the assembly. Several reaction rates and fission rates are measured along the central axis inside the assembly and compared with ones calculated with the Monte Carlo transport code MCNP5-1.40 and the recent nuclear data libraries ENDF/B-VII.1, JENDL-4.0, and JEFF-3.2. The calculated results generally show underestimation. From our detailed analysis, it is concluded that the (n,2n) cross section data of all the Mo stable isotopes in JEFF-3.2 are more suitable than those in JENDL-4.0 and that the (n,γ) cross section data of {sup 92}Mo, {sup 94}Mo, {sup 95}Mo, {sup 96}Mo, {sup 97}Mo, and {sup 100}Mo in JENDL-4.0 are overestimated.

  10. Mechanistic Model for Ash Deposit Formation in Biomass Suspension Firing. Part 1: Model Verification by Use of Entrained Flow Reactor Experiments

    DEFF Research Database (Denmark)

    Hansen, Stine Broholm; Jensen, Peter Arendt; Jappe Frandsen, Flemming

    2017-01-01

    The two models differ in the description of the sticking probability of impacted particles: model #1 employs a reference viscosity in the description of the sticking probability, while model #2 combines impaction of viscoelastic particles on a solid surface with particle capture by a viscous surface. Both models were used to describe the deposit formation rates and deposit chemistry observed in a series of entrained flow reactor (EFR) experiments using straw and wood as fuels. It was found that model #1 was not able to describe the observed influence of temperature on the deposit buildup rates, predicting a much...

  11. Design verification methodology for a solenoid valve for industrial applications

    International Nuclear Information System (INIS)

    Park, Chang Dae; Lim, Byung Ju; Chun, Kyung Yul

    2015-01-01

    Solenoid operated valves (SOV) are widely used in many applications due to their fast dynamic responses, cost effectiveness, and low sensitivity to contamination. In this paper, we provide a convenient design verification method for SOVs to design engineers who rely on their experience and on experiments during the SOV design and development process. First, we summarize a detailed procedure for designing SOVs for industrial applications. All of the design constraints are defined in the first step of the design, and then the detailed design procedure is presented based on design experience as well as various physical and electromagnetic relationships. Second, we suggest a method of verifying this design using theoretical relationships, which enables optimal design of the SOV from the point of view of the safety factor of the design attraction force. Lastly, experimental performance tests using several prototypes manufactured with this design method show that the suggested design verification methodology is appropriate for designing new solenoid models. We believe that this verification process is novel and useful for saving time and expense during SOV development, because verification tests with manufactured specimens may be partly substituted by this verification methodology.
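
    As a hedged illustration of the attraction-force safety-factor check mentioned above, the sketch below uses the common air-gap approximation F = (N·I)²·μ0·A / (2·g²); the coil parameters and required load are assumptions, not the paper's design data:

        # Illustrative solenoid attraction-force check; all numbers are assumed,
        # not taken from the paper.
        from math import pi

        MU0 = 4 * pi * 1e-7          # vacuum permeability [H/m]

        def plunger_force(n_turns, current_a, pole_area_m2, gap_m):
            """Air-gap approximation F = (N*I)^2 * mu0 * A / (2*g^2); core reluctance neglected."""
            return (n_turns * current_a) ** 2 * MU0 * pole_area_m2 / (2.0 * gap_m ** 2)

        required_force = 8.0                                  # N, spring + pressure load (assumed)
        force = plunger_force(n_turns=1200, current_a=0.5, pole_area_m2=1.2e-4, gap_m=1.5e-3)
        print(f"attraction force {force:.1f} N, safety factor {force / required_force:.2f}")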

  12. A Practitioners Perspective on Verification

    Science.gov (United States)

    Steenburgh, R. A.

    2017-12-01

    NOAA's Space Weather Prediction Center offers a wide range of products and services to meet the needs of an equally wide range of customers. A robust verification program is essential to the informed use of model guidance and other tools by forecasters and end users alike. In this talk, we present current SWPC practices and results, and examine emerging requirements and potential approaches to satisfy them. We explore the varying verification needs of forecasters and end users, as well as the role of subjective and objective verification. Finally, we describe a vehicle used in the meteorological community to unify approaches to model verification and facilitate intercomparison.

  13. A Longitudinal Study of How Quality Mentorship and Research Experience Integrate Underrepresented Minorities into STEM Careers.

    Science.gov (United States)

    Estrada, Mica; Hernandez, Paul R; Schultz, P Wesley

    2018-01-01

    African Americans, Latinos, and Native Americans are historically underrepresented minorities (URMs) among science, technology, engineering, and mathematics (STEM) degree earners. Viewed from a perspective of social influence, this pattern suggests that URMs do not integrate into the STEM academic community at the same rate as non-URM students. Estrada and colleagues recently showed that Kelman's tripartite integration model of social influence (TIMSI) predicted URM persistence into science fields. In this paper, we longitudinally examine the integration of URMs into the STEM community by using growth-curve analyses to measure the development of TIMSI's key variables (science efficacy, identity, and values) from the junior year through the postbaccalaureate year. Results showed that quality mentorship and research experience occurring in the junior and senior years were positively related to student science efficacy, identity, and values in the same time period. Longitudinal modeling of TIMSI further shows that, while efficacy is an important and perhaps necessary predictor of moving toward a STEM career, past experiences of efficacy may not be sufficient for maintaining longer-term persistence. In contrast, science identity and values do continue to be predictive of STEM career pathway persistence up to 4 years after graduation. © 2018 M. Estrada et al. CBE—Life Sciences Education © 2018 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  14. THE INTEGRATED SHORT-TERM STATISTICAL SURVEYS: EXPERIENCE OF NBS IN MOLDOVA

    Directory of Open Access Journals (Sweden)

    Oleg CARA

    2012-07-01

    Full Text Available The users' rising need for relevant, reliable, coherent, and timely data for the early diagnosis of economic vulnerability and of turning points in business cycles, especially during a financial and economic crisis, calls for a prompt answer coordinated by statistical institutions. High-quality short-term statistics are of special interest for emerging market economies, such as the Moldavian one, which are extremely vulnerable when facing economic recession. Answering the challenge of producing a coherent and adequate image of economic activity, by using the system of indicators and definitions efficiently applied at the level of the European Union, the National Bureau of Statistics (NBS) of the Republic of Moldova has launched the development of an integrated system of short-term statistics (STS) based on advanced international experience. Thus, in 2011, NBS implemented the integrated statistical survey on STS based on consistent concepts, harmonized with EU standards. The integration of the production processes, which were previously separated, is based on a common technical infrastructure and standardized procedures and techniques for data production. The achievement of this complex survey with a holistic approach has allowed the consolidation of statistical data quality, comparable at the European level, and a significant reduction of the information burden on business units, especially those of small size. The reformation of STS based on the integrated survey has been possible thanks to the consistent methodological and practical support given to NBS by the National Institute of Statistics (INS) of Romania, for which we would like to thank our Romanian colleagues.

  15. Prioritising integrated care initiatives on a national level. Experiences from Austria

    Directory of Open Access Journals (Sweden)

    Karin Eger

    2009-09-01

    Full Text Available Introduction and background: Based on a policy initiative and the foundation of the Competence Centre for Integrated Care by the Austrian Social Security Institutions in 2006, the aim of the project was to identify and prioritise potential diseases and target groups for which integrated care models should be developed and implemented within the Austrian health system. The project was conducted as a cooperation between the Competence Centre for Integrated Care of the Viennese Health Insurance Fund and the Institute of Social Medicine of the Medical University Vienna to ensure the involvement of both theory and practice. Project report: The focus of the project was to develop an evidence-based process for the identification and prioritisation of diseases and target groups for integrated care measures. As there was no evidence of similar projects elsewhere, the team set out to design the prioritisation process and formulate the selection criteria based on the work of a focus group, literature reviews and a scientific council of national and international experts. The method and criteria were evaluated in an expert workshop. Discussion: The active involvement of all stakeholders from the beginning was crucial to the project's success. The time constraint also proved beneficial, since it allowed the project team to demand focus and cooperation from all experts and stakeholders involved. Conclusion: Our experience demonstrates that, with a clear concept and model, an evidence-based prioritisation including all stakeholders can be achieved. Ultimately, however, prioritisation is a political discussion and decision. Our model can only help base these decisions on sound and reasonable assumptions.

  16. Experiences integrating autonomous components and legacy systems into tsunami early warning systems

    Science.gov (United States)

    Reißland, S.; Herrnkind, S.; Guenther, M.; Babeyko, A.; Comoglu, M.; Hammitzsch, M.

    2012-04-01

    Fostered by and embedded in the general development of Information and Communication Technology (ICT), the evolution of Tsunami Early Warning Systems (TEWS) shows a significant shift from seismic-centred to multi-sensor system architectures using additional sensors, e.g. sea level stations for the detection of tsunami waves and GPS stations for the detection of ground displacements. Furthermore, the design and implementation of a robust and scalable service infrastructure supporting the integration and utilisation of existing resources serving near real-time data includes not only sensors but also other components and systems offering services such as the delivery of feasible simulations used for forecasting in an imminent tsunami threat. In the context of the development of the German Indonesian Tsunami Early Warning System (GITEWS) and the project Distant Early Warning System (DEWS), a service platform for both sensor integration and warning dissemination has been newly developed and demonstrated. In particular, standards of the Open Geospatial Consortium (OGC) and the Organization for the Advancement of Structured Information Standards (OASIS) have been successfully incorporated. In the project Collaborative, Complex, and Critical Decision-Support in Evolving Crises (TRIDEC), new developments are used to extend the existing platform to realise a component-based technology framework for building distributed TEWS. This talk will describe experiences gained in GITEWS, DEWS and TRIDEC while integrating legacy stand-alone systems and newly developed special-purpose software components into TEWS, using different software adapters and communication strategies to make the systems work together in a corporate infrastructure. The talk will also cover task management and data conversion between the different systems. Practical approaches and software solutions for the integration of sensors, e.g. providing seismic and sea level data, and utilisation of special

  17. 3D integration technology for hybrid pixel detectors designed for particle physics and imaging experiments

    International Nuclear Information System (INIS)

    Henry, D.; Berthelot, A.; Cuchet, R.; Chantre, C.; Campbell, M.; Tick, T.

    2012-01-01

    Hybrid pixel detectors are now widely used in particle physics experiments and are becoming established at synchrotron light sources. They have also stimulated growing interest in other fields and, in particular, in medical imaging. Through the continuous pursuit of miniaturization in CMOS, it has been possible to increase the functionality per pixel while maintaining or even shrinking pixel dimensions. The main constraint on the more extensive use of the technology in all fields is the cost of module building and the difficulty of covering large areas seamlessly. On the other hand, in the field of electronic component integration, a new approach called 3D Integration has been developed in recent years. This concept, based on using the vertical axis for component integration, allows the global performance of complex systems to be improved. Thanks to this technology, the cost and the form factor of components can be decreased and the performance of the global system can be enhanced. In the field of radiation imaging detectors, the advantages of 3D Integration come from reduced inter-chip dead area even on large surfaces and from improved detector construction yield resulting from the use of single-chip 4-side buttable tiles. For many years, numerous R&D centres and companies have put a lot of effort into developing 3D integration technologies, and today some mature technologies are ready for prototyping and production. The core technology of 3D integration is the TSV (Through Silicon Via), and for many years LETI has developed such technologies for various types of applications. In this paper we present how one of the TSV approaches developed by LETI, called TSV last, has been applied to a readout wafer containing readout chips intended for a hybrid pixel detector assembly. In the first part of this paper, the 3D design adapted to the read-out chip will be described. Then the complete process flow will be explained and, finally, the test strategy adopted and

  18. CEC thermal-hydraulic benchmark exercise on Fiploc verification experiment F2 in Battelle model containment long-term heat-up phase. Results for phase I

    International Nuclear Information System (INIS)

    Fischer, K.; Schall, M.; Wolf, L.

    1991-01-01

    The major objective of the F2 experiment was to investigate the thermal-hydraulic long-term phenomena with special emphasis on natural convection phenomena in a loop-type geometry affected by variations of steam and air injections at different locations as well as dry energy supply into various compartments. The open post-test exercise is being performed in two consecutive phases, with Phase I covering the initial long-term heat-up phase. The exercise received widespread international attention with nine organizations from six European countries participating with seven different computer codes (FUMO, Jericho2, Fiploc, Wavco, Contain, Melcor, Cobra/Fathoms). These codes cover a broad spectrum of presently known European computational tools in severe accident containment analyses. The participants used either the specified mass flow or pressure control boundary conditions. Some exercised their codes for both. In total, 14 different computations were officially provided by the participants indicating strong interests and cooperative efforts by various institutions

  19. Generation of integral experiment covariance data and their impact on criticality safety validation

    Energy Technology Data Exchange (ETDEWEB)

    Stuke, Maik; Peters, Elisabeth; Sommer, Fabian

    2016-11-15

    The quantification of statistical dependencies in data of critical experiments and how to account for them properly in validation procedures has been discussed in the literature by various groups. However, these subjects are still an active topic in the Expert Group on Uncertainty Analysis for Criticality Safety Assessment (UACSA) of the OECD/NEA Nuclear Science Committee. The latter compiles and publishes the freely available experimental data collection, the International Handbook of Evaluated Criticality Safety Benchmark Experiments, ICSBEP. Most of the experiments were performed as series and share parts of experimental setups, consequently leading to correlation effects in the results. The correct consideration of correlated data seems to be inevitable if the experimental data in a validation procedure is limited or one cannot rely on a sufficient number of uncorrelated data sets, e.g. from different laboratories using different setups. The general determination of correlations and the underlying covariance data as well as the consideration of them in a validation procedure is the focus of the following work. We discuss and demonstrate possible effects on calculated k_eff values, their uncertainties, and the corresponding covariance matrices due to interpretation of evaluated experimental data and its translation into calculation models. The work shows effects of various modeling approaches, varying distribution functions of parameters and compares and discusses results from the applied Monte-Carlo sampling method with available data on correlations. Our findings indicate that for the reliable determination of integral experimental covariance matrices or the correlation coefficients a detailed study of the underlying experimental data, the modeling approach and assumptions made, and the resulting sensitivity analysis seems to be inevitable. Further, a Bayesian method is discussed to include integral experimental covariance data when estimating an
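
    A minimal sketch of the Monte-Carlo sampling idea described above: a shared uncertain parameter is sampled, a response is re-evaluated for each benchmark, and the sample covariance and correlation between the benchmark k_eff values are estimated. The linear response coefficients and all numbers are stand-ins, not an actual criticality calculation or results from the paper:

        # Minimal sketch of estimating correlations between benchmark k_eff values
        # by Monte-Carlo sampling of a shared uncertain parameter.
        import numpy as np

        rng = np.random.default_rng(seed=1)
        n_samples = 10_000

        # Shared parameter (e.g. a common fuel-rod property) plus independent ones.
        shared = rng.normal(0.0, 1.0, n_samples)
        indep_a = rng.normal(0.0, 1.0, n_samples)
        indep_b = rng.normal(0.0, 1.0, n_samples)

        # Stand-in linear response of two benchmark k_eff values to the parameters.
        keff_a = 1.000 + 300e-5 * shared + 200e-5 * indep_a
        keff_b = 0.998 + 250e-5 * shared + 350e-5 * indep_b

        cov = np.cov(keff_a, keff_b)
        corr = np.corrcoef(keff_a, keff_b)[0, 1]
        print("covariance matrix of k_eff:\n", cov)
        print(f"correlation between benchmarks: {corr:.2f}")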

  20. Pericellular oxygen monitoring with integrated sensor chips for reproducible cell culture experiments.

    Science.gov (United States)

    Kieninger, J; Aravindalochanan, K; Sandvik, J A; Pettersen, E O; Urban, G A

    2014-04-01

    Here we present an application, in two tumour cell lines, based on the Sensing Cell Culture Flask system as a cell culture monitoring tool for pericellular oxygen sensing. T-47D (human breast cancer) and T98G (human brain cancer) cells were cultured either in atmospheric air or in a glove-box set at 4% oxygen, in both cases with 5% CO2 in the gas phase. Pericellular oxygen tension was measured with the help of an integrated sensor chip comprising oxygen sensor arrays. Obtained results illustrate variation of pericellular oxygen tension in attached cells covered by stagnant medium. Independent of incubation conditions, low pericellular oxygen concentration levels, usually associated with hypoxia, were found in dense cell cultures. Respiration alone brought pericellular oxygen concentration down to levels which could activate hypoxia-sensing regulatory processes in cultures believed to be aerobic. Cells in culture believed to experience conditions of mild hypoxia may, in reality, experience severe hypoxia. This would lead to incorrect assumptions and suggests that pericellular oxygen concentration readings are of great importance to obtain reproducible results when dealing with hypoxic and normoxic (aerobic) incubation conditions. The Sensing Cell Culture Flask system allows continuous monitoring of pericellular oxygen concentration with outstanding long-term stability and no need for recalibration during cell culture experiments. The sensor is integrated into the flask bottom, thus in direct contact with attached cells. No additional equipment needs to be inserted into the flask during culturing. Transparency of the electrochemical sensor chip allows optical inspection of cells attached on top of the sensor. © 2014 John Wiley & Sons Ltd.

  1. Generation of integral experiment covariance data and their impact on criticality safety validation

    International Nuclear Information System (INIS)

    Stuke, Maik; Peters, Elisabeth; Sommer, Fabian

    2016-11-01

    The quantification of statistical dependencies in data of critical experiments and how to account for them properly in validation procedures has been discussed in the literature by various groups. However, these subjects are still an active topic in the Expert Group on Uncertainty Analysis for Criticality Safety Assessment (UACSA) of the OECD/NEA Nuclear Science Committee. The latter compiles and publishes the freely available experimental data collection, the International Handbook of Evaluated Criticality Safety Benchmark Experiments, ICSBEP. Most of the experiments were performed as series and share parts of experimental setups, consequently leading to correlation effects in the results. The correct consideration of correlated data seems to be inevitable if the experimental data in a validation procedure is limited or one cannot rely on a sufficient number of uncorrelated data sets, e.g. from different laboratories using different setups. The general determination of correlations and the underlying covariance data as well as the consideration of them in a validation procedure is the focus of the following work. We discuss and demonstrate possible effects on calculated k_eff values, their uncertainties, and the corresponding covariance matrices due to interpretation of evaluated experimental data and its translation into calculation models. The work shows effects of various modeling approaches, varying distribution functions of parameters and compares and discusses results from the applied Monte-Carlo sampling method with available data on correlations. Our findings indicate that for the reliable determination of integral experimental covariance matrices or the correlation coefficients a detailed study of the underlying experimental data, the modeling approach and assumptions made, and the resulting sensitivity analysis seems to be inevitable. Further, a Bayesian method is discussed to include integral experimental covariance data when estimating an application

  2. Integration

    DEFF Research Database (Denmark)

    Emerek, Ruth

    2004-01-01

    The contribution discusses the different conceptions of integration in Denmark, and what may be understood by successful integration.

  3. Distortion-free diffusion MRI using an MRI-guided Tri-Cobalt 60 radiotherapy system: Sequence verification and preliminary clinical experience.

    Science.gov (United States)

    Gao, Yu; Han, Fei; Zhou, Ziwu; Cao, Minsong; Kaprealian, Tania; Kamrava, Mitchell; Wang, Chenyang; Neylon, John; Low, Daniel A; Yang, Yingli; Hu, Peng

    2017-10-01

    Monitoring tumor response during the course of treatment and adaptively modifying the treatment plan based on tumor biological feedback may represent a new paradigm for radiotherapy. Diffusion MRI has shown great promise in assessing and predicting tumor response to radiotherapy. However, the conventional diffusion-weighted single-shot echo-planar-imaging (DW-ssEPI) technique suffers from limited resolution, severe distortion, and possibly inaccurate ADC at low field strength. The purpose of this work was to develop a reliable, accurate and distortion-free diffusion MRI technique that is practicable for longitudinal tumor response evaluation and adaptive radiotherapy on a 0.35 T MRI-guided radiotherapy system. A diffusion-prepared turbo spin echo readout (DP-TSE) sequence was developed and compared with the conventional diffusion-weighted single-shot echo-planar-imaging sequence on a 0.35 T MRI-guided radiotherapy system (ViewRay). A spatial integrity phantom was used to quantitate and compare the geometric accuracy of the two diffusion sequences for three orthogonal orientations. The apparent diffusion coefficient (ADC) accuracy was evaluated on a diffusion phantom at both 0 °C and room temperature to cover a diffusivity range between 0.40 × 10⁻³ and 2.10 × 10⁻³ mm²/s. Ten room temperature measurements repeated on five different days were conducted to assess the ADC reproducibility of DP-TSE. Two glioblastoma (GBM) and six sarcoma patients were included to examine the in vivo feasibility. The target registration error (TRE) was calculated to quantitate the geometric accuracy where structural CT or MR images were co-registered to the diffusion images as references. ADC maps from DP-TSE and DW-ssEPI were calculated and compared. A tube phantom was placed next to patients not treated on ViewRay, and ADCs of this reference tube were also compared. The proposed DP-TSE passed the spatial integrity test (< 1 mm within 100 mm radius and < 2 mm within 175 mm radius
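
    For reference, the ADC maps mentioned above follow from the mono-exponential diffusion model S_b = S_0·exp(−b·ADC); a minimal pixel-wise sketch with synthetic signals and assumed b-values, not ViewRay data:

        # Pixel-wise ADC from two diffusion weightings using the mono-exponential
        # model S_b = S_0 * exp(-b * ADC); arrays here are synthetic.
        import numpy as np

        b0, b1 = 0.0, 500.0                                  # s/mm^2, assumed b-values
        s0 = np.array([[1000.0, 950.0], [980.0, 1020.0]])    # signal at b = 0
        s1 = s0 * np.exp(-b1 * 1.0e-3)                       # synthetic signal for ADC = 1.0e-3 mm^2/s

        adc = np.log(s0 / s1) / (b1 - b0)                    # mm^2/s per pixel
        print(adc)                                           # ~1.0e-3 everywhere for this synthetic case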

  4. High-level verification

    CERN Document Server

    Lerner, Sorin; Kundu, Sudipta

    2011-01-01

    Given the growing size and heterogeneity of Systems on Chip (SOC), the design process from initial specification to chip fabrication has become increasingly complex. This growing complexity provides incentive for designers to use high-level languages such as C, SystemC, and SystemVerilog for system-level design. While a major goal of these high-level languages is to enable verification at a higher level of abstraction, allowing early exploration of system-level designs, the focus so far for validation purposes has been on traditional testing techniques such as random testing and scenario-based

  5. Experiment on Uav Photogrammetry and Terrestrial Laser Scanning for Ict-Integrated Construction

    Science.gov (United States)

    Takahashi, N.; Wakutsu, R.; Kato, T.; Wakaizumi, T.; Ooishi, T.; Matsuoka, R.

    2017-08-01

    In the 2016 fiscal year the Ministry of Land, Infrastructure, Transport and Tourism of Japan started a program integrating construction and ICT in earthwork and concrete placing. The new program named "i-Construction" focusing on productivity improvement adopts such new technologies as UAV photogrammetry and TLS. We report a field experiment to investigate whether the procedures of UAV photogrammetry and TLS following the standards for "i-Construction" are feasible or not. In the experiment we measured an embankment of about 80 metres by 160 metres immediately after earthwork was done on the embankment. We used two sets of UAV and camera in the experiment. One is a larger UAV enRoute Zion QC730 and its onboard camera Sony α6000. The other is a smaller UAV DJI Phantom 4 and its dedicated onboard camera. Moreover, we used a terrestrial laser scanner FARO Focus3D X330 based on the phase shift principle. The experiment results indicate that the procedures of UAV photogrammetry using a QC730 with an α6000 and TLS using a Focus3D X330 following the standards for "i-Construction" would be feasible. Furthermore, the experiment results show that UAV photogrammetry using a lower price UAV Phantom 4 was unable to satisfy the accuracy requirement for "i-Construction." The cause of the low accuracy by Phantom 4 is under investigation. We also found that the difference of image resolution on the ground would not have a great influence on the measurement accuracy in UAV photogrammetry.
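
    The image resolution on the ground referred to above is commonly expressed as the ground sample distance, GSD = pixel pitch × flying height / focal length; a small illustrative calculation in which the camera parameters and flying height are rough assumptions, not the values used in the field test:

        # Ground sample distance GSD = pixel_pitch * flying_height / focal_length.
        # Camera parameters below are rough assumptions for illustration only.
        def gsd_cm(pixel_pitch_um, focal_length_mm, flying_height_m):
            return (pixel_pitch_um * 1e-6) * flying_height_m / (focal_length_mm * 1e-3) * 100.0

        print(f"larger UAV camera : {gsd_cm(3.9, 16.0, 50.0):.1f} cm/pixel")
        print(f"smaller UAV camera: {gsd_cm(1.6, 3.6, 50.0):.1f} cm/pixel")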

  6. Results of 15 years experiments in the PMK-2 integral-type facility for VVERs

    Energy Technology Data Exchange (ETDEWEB)

    Szabados, L.; Ezsoel, G.; Perneczky, L. [KFKI Atomic Energy Research Institute, Budapest (Hungary)

    2001-07-01

    Due to the specific features of the VVER-440/213-type reactors, the transient behaviour of such a reactor system is different from the usual PWR system behaviour. To provide an experimental database for the transient behaviour of VVER systems, the PMK integral-type facility, a scaled-down model of the Paks NPP, was designed and constructed in the early 1980s. Since the start-up of the facility, 48 experiments have been performed. It was confirmed through the experiments that the facility is a suitable tool for computer code validation experiments and for the identification of basic thermal-hydraulic phenomena occurring during plant accidents. High international interest was shown by the four Standard Problem Exercises of the IAEA and by the projects financed by the EU-PHARE. A wide range of small- and medium-size LOCA sequences have been studied to determine the performance and effectiveness of ECC systems and to evaluate the thermal-hydraulic safety of the core. Extensive studies have been performed to investigate one- and two-phase natural circulation and the effect of disturbances coming from the secondary circuit, and to validate the effectiveness of accident management measures like bleed and feed. The VVER-specific case, the opening of the SG collector cover, was also extensively investigated. Examples given in the report show a few results of experiments and the results of calculation analyses performed for validation purposes of codes like RELAP5, ATHLET and CATHARE. There are still some white spots in the Cross Reference Matrices for VVER reactors and, therefore, further experiments are planned, primarily in further support of accident management measures at low power states of plants, to facilitate the improved safety management of VVER-440-type reactors. (authors)

  7. Results of 15 years experiments in the PMK-2 integral-type facility for VVERs

    International Nuclear Information System (INIS)

    Szabados, L.; Ezsoel, G.; Perneczky, L.

    2001-01-01

    Due to the specific features of the VVER-440/213-type reactors, the transient behaviour of such a reactor system is different from the usual PWR system behaviour. To provide an experimental database for the transient behaviour of VVER systems, the PMK integral-type facility, a scaled-down model of the Paks NPP, was designed and constructed in the early 1980s. Since the start-up of the facility, 48 experiments have been performed. It was confirmed through the experiments that the facility is a suitable tool for computer code validation experiments and for the identification of basic thermal-hydraulic phenomena occurring during plant accidents. High international interest was shown by the four Standard Problem Exercises of the IAEA and by the projects financed by the EU-PHARE. A wide range of small- and medium-size LOCA sequences have been studied to determine the performance and effectiveness of ECC systems and to evaluate the thermal-hydraulic safety of the core. Extensive studies have been performed to investigate one- and two-phase natural circulation and the effect of disturbances coming from the secondary circuit, and to validate the effectiveness of accident management measures like bleed and feed. The VVER-specific case, the opening of the SG collector cover, was also extensively investigated. Examples given in the report show a few results of experiments and the results of calculation analyses performed for validation purposes of codes like RELAP5, ATHLET and CATHARE. There are still some white spots in the Cross Reference Matrices for VVER reactors and, therefore, further experiments are planned, primarily in further support of accident management measures at low power states of plants, to facilitate the improved safety management of VVER-440-type reactors. (authors)

  8. Post test calculation of the experiment 'small break loss-of-coolant test' SBL-22 at the Finnish integral test facility PACTEL with the thermohydraulic code ATHLET

    Energy Technology Data Exchange (ETDEWEB)

    Lischke, W.; Vandreier, B. [Univ. for Applied Sciences, Zittau/Goerlitz (Germany). Dept. of Nuclear Technology

    1997-12-31

    At the University for Applied Sciences Zittau/Goerlitz (FH), calculations for the verification of the ATHLET code for reactors of type VVER have been carried out since 1991, sponsored by the German Ministry for Education, Science and Technology (BMBF). The special features of these reactors in comparison to Western reactors are characterized by the duct routing of the reactor coolant pipes and the horizontal steam generators. Because of these special features, a check of the validity of the ATHLET models is necessary. For further verification of the ATHLET code, the post-test calculation of the experiment SBL-22 (small break loss-of-coolant test) performed at the Finnish facility PACTEL was carried out. The experiment served to examine the natural circulation behaviour of the loop over a continuous range of primary side water inventory. 5 refs.

  9. Post test calculation of the experiment 'small break loss-of-coolant test' SBL-22 at the Finnish integral test facility PACTEL with the thermohydraulic code ATHLET

    International Nuclear Information System (INIS)

    Lischke, W.; Vandreier, B.

    1997-01-01

    At the University for Applied Sciences Zittau/Goerlitz (FH), calculations for the verification of the ATHLET code for reactors of type VVER have been carried out since 1991, sponsored by the German Ministry for Education, Science and Technology (BMBF). The special features of these reactors in comparison to Western reactors are characterized by the duct routing of the reactor coolant pipes and the horizontal steam generators. Because of these special features, a check of the validity of the ATHLET models is necessary. For further verification of the ATHLET code, the post-test calculation of the experiment SBL-22 (small break loss-of-coolant test) performed at the Finnish facility PACTEL was carried out. The experiment served to examine the natural circulation behaviour of the loop over a continuous range of primary side water inventory

  10. Post test calculation of the experiment 'small break loss-of-coolant test' SBL-22 at the Finnish integral test facility PACTEL with the thermohydraulic code ATHLET

    Energy Technology Data Exchange (ETDEWEB)

    Lischke, W; Vandreier, B [Univ. for Applied Sciences, Zittau/Goerlitz (Germany). Dept. of Nuclear Technology

    1998-12-31

    At the University for Applied Sciences Zittau/Goerlitz (FH), calculations for the verification of the ATHLET code for reactors of type VVER have been carried out since 1991, sponsored by the German Ministry for Education, Science and Technology (BMBF). The special features of these reactors in comparison to Western reactors are characterized by the duct routing of the reactor coolant pipes and the horizontal steam generators. Because of these special features, a check of the validity of the ATHLET models is necessary. For further verification of the ATHLET code, the post-test calculation of the experiment SBL-22 (small break loss-of-coolant test) performed at the Finnish facility PACTEL was carried out. The experiment served to examine the natural circulation behaviour of the loop over a continuous range of primary side water inventory. 5 refs.

  11. Survey on Offline Finger Print Verification System

    NARCIS (Netherlands)

    Suman, R.; Kaur, R.

    2012-01-01

    Fingerprint verification means matching a user's fingerprint against a single fingerprint associated with the identity that the user claims. Biometrics can be classified into two types: behavioral (signature verification, keystroke dynamics, etc.) and physiological

  12. Study on the application of shear-wave elastography to thin-layered media and tubular structure: Finite-element analysis and experiment verification

    Science.gov (United States)

    Jang, Jun-keun; Kondo, Kengo; Namita, Takeshi; Yamakawa, Makoto; Shiina, Tsuyoshi

    2016-07-01

    Shear-wave elastography (SWE) enables the noninvasive and quantitative evaluation of the mechanical properties of human soft tissue. Generally, shear-wave velocity (C_S) can be estimated using the time-of-flight (TOF) method. Young's modulus is then calculated directly from the estimated C_S. However, because shear waves in thin-layered media propagate as guided waves, C_S cannot be accurately estimated using the conventional TOF method. Leaky Lamb dispersion analysis (LLDA) has recently been proposed to overcome this problem. In this study, we performed both experimental and finite-element (FE) analyses to evaluate the advantages of LLDA over TOF. In FE analysis, we investigated why the conventional TOF is ineffective for thin-layered media. In phantom experiments, C_S results estimated using the two methods were compared for 1.5 and 2% agar plates and tube phantoms. Furthermore, it was shown that Lamb waves can be applied to tubular structures by extracting lateral waves traveling in the long axis direction of the tube using a two-dimensional window. Also, the effects of the inner radius and stiffness (or shear wavelength) of the tube on the estimation performance of LLDA were experimentally discussed. In phantom experiments, the results indicated good agreement between LLDA (plate phantoms of 2 mm thickness: 5.0 m/s for 1.5% agar and 7.2 m/s for 2% agar; tube phantoms with 2 mm thickness and 2 mm inner radius: 5.1 m/s for 1.5% agar and 7.0 m/s for 2% agar; tube phantoms with 2 mm thickness and 4 mm inner radius: 5.3 m/s for 1.5% agar and 7.3 m/s for 2% agar) and SWE measurements (bulk phantoms: 5.3 m/s ± 0.27 for 1.5% agar and 7.3 m/s ± 0.54 for 2% agar).
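
    For context, the conventional TOF estimate mentioned above reduces to a lateral propagation distance divided by an arrival-time difference, from which the shear and Young's moduli follow under an incompressibility assumption (E ≈ 3ρC_S²); a minimal sketch with made-up numbers, not the phantom measurements:

        # Time-of-flight estimate of shear-wave speed and derived moduli; the
        # arrival times and density are illustrative assumptions.
        lateral_distance_m = 8.0e-3            # spacing between two tracking positions
        arrival_times_s = (2.1e-3, 3.7e-3)     # shear-wave arrival at the two positions

        c_s = lateral_distance_m / (arrival_times_s[1] - arrival_times_s[0])   # m/s
        rho = 1000.0                           # kg/m^3, soft-tissue-like density
        shear_modulus = rho * c_s ** 2         # Pa
        young_modulus = 3.0 * shear_modulus    # Pa, incompressible approximation
        print(f"C_S = {c_s:.1f} m/s, E = {young_modulus / 1e3:.1f} kPa")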

  13. Integrating authentic scientific research in a conservation course–based undergraduate research experience

    Science.gov (United States)

    Sorensen, Amanda E.; Corral, Lucia; Dauer, Jenny M.; Fontaine, Joseph J.

    2018-01-01

    Course-based undergraduate research experiences (CUREs) have been developed to overcome barriers to including students in research. However, there are few examples of CUREs that take place in a conservation and natural resource context with students engaging in field research. Here, we highlight the development of a conservation-focused CURE integrated into a research program, the research benefits, student self-assessment of learning, and perception of the CURE. With the additional data, researchers were able to refine species distribution models and facilitate management decisions. Most students reported gains in their scientific skills and felt they had engaged in meaningful, real-world research. In student reflections on how this experience helped clarify their professional intentions, many reported being more likely to enroll in graduate programs and seek employment related to science. Also interesting was that all students reported being more likely to talk with friends, family, or the public about wildlife conservation issues after participating, indicating that courses like this can have effects beyond the classroom, empowering students to be advocates and translators of science. Field-based, conservation-focused CUREs can create meaningful conservation and natural resource experiences with authentic scientific teaching practices.

  14. Integrated Numerical Experiments (INEX) and the Free-Electron Laser Physical Process Code (FELPPC)

    International Nuclear Information System (INIS)

    Thode, L.E.; Chan, K.C.D.; Schmitt, M.J.; McKee, J.; Ostic, J.; Elliott, C.J.; McVey, B.D.

    1990-01-01

    The strong coupling of subsystem elements, such as the accelerator, wiggler, and optics, greatly complicates the understanding and design of a free electron laser (FEL), even at the conceptual level. To address the strong coupling character of the FEL, the concept of an Integrated Numerical Experiment (INEX) was proposed. Unique features of the INEX approach are consistency and numerical equivalence of experimental diagnostics. The equivalent numerical diagnostics mitigate the major problem of misinterpretation that often occurs when theoretical and experimental data are compared. The INEX approach has been applied to a large number of accelerator and FEL experiments. Overall, the agreement between INEX and the experiments is very good. Despite the success of INEX, the approach is difficult to apply to trade-off and initial design studies because of the significant manpower and computational requirements. On the other hand, INEX provides a base from which realistic accelerator, wiggler, and optics models can be developed. The Free Electron Laser Physical Process Code (FELPPC) includes models developed from INEX, provides coupling between the subsystem models, and incorporates application models relevant to a specific trade-off or design study. In other words, FELPPC solves the complete physical process model using realistic physics and technology constraints. Because FELPPC provides a detailed design, a good estimate of the FEL mass, cost, and size can be made from a piece-part count of the FEL. FELPPC requires significant accelerator and FEL expertise to operate. The code can calculate complex FEL configurations including multiple accelerator and wiggler combinations

  15. Reactivity worth measurements on the CALIBAN reactor: interpretation of integral experiments for the nuclear data validation

    International Nuclear Information System (INIS)

    Richard, B.

    2012-01-01

    Good knowledge of nuclear data, the input parameters for neutron transport calculation codes, is necessary to support advances in the nuclear industry. The purpose of this work is to provide pertinent information regarding the integral validation process for nuclear data. Reactivity worth measurements have been performed on the Caliban reactor; they concern four materials of interest for the nuclear industry: gold, lutetium, plutonium and uranium 238. Experiments conducted in order to improve the characterization of the core are also described and discussed; the latter are necessary for the correct interpretation of reactivity worth measurements. The experimental procedures are described with their associated uncertainties, and the measurements are then compared to numerical results. The methods used in the numerical calculations are reported, especially the generation of multigroup cross sections for deterministic codes. The modeling of the experiments is presented along with the associated uncertainties. This comparison led to an interpretation concerning the qualification of nuclear data libraries. Discrepancies are reported and discussed, and they justify the need for such experiments. (author) [fr
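
    For orientation, a sample's reactivity worth is commonly expressed from two measured multiplication factors as Δρ = 1/k₁ − 1/k₂; a short illustrative calculation in which the k values are placeholders, not the Caliban measurements:

        # Reactivity worth from two measured multiplication factors,
        # delta_rho = 1/k1 - 1/k2, expressed in pcm; k values are placeholders.
        def reactivity_worth_pcm(k_without, k_with):
            return (1.0 / k_without - 1.0 / k_with) * 1.0e5

        print(f"{reactivity_worth_pcm(1.00000, 0.99850):.1f} pcm")   # negative worth: the sample removes reactivity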

  16. Seismic response and resistance capacity of 'as built' WWER 440-230 NPP Kozloduy: Verification of the results by experiments and real earthquake

    International Nuclear Information System (INIS)

    Sachanski, S.

    1993-01-01

    Although Kozloduy NPP units 1 and 2 were not designed for earthquakes, they successfully withstood the 1977 Vrancea earthquake, with a site peak ground acceleration of 83 cm/s². Both units, as well as units 3 and 4, were later recalculated for a maximum peak acceleration of 0.1 g. According to values calculated with a two-dimensional model in 1980, the reactor buildings had sufficient earthquake resistance capacity for the accepted design seismic excitation. The non-symmetric design of WWER-440 structures in plan and elevation, the large eccentricity between the centre of rigidity and the centre of mass, as well as the technological connections between the separate substructures and units, led to a complicated spatial response and rotational effects which cannot be calculated with two-dimensional models. Three-dimensional detailed 'as built' mathematical models were established and verified by a series of experiments and a real earthquake for: detailed analysis of the 'as built' structural response, comparison of the results of two- and three-dimensional models, and detailed analysis of seismic safety margins

  17. Experience of using MOSFET detectors for dose verification measurements in an end-to-end 192Ir brachytherapy quality assurance system.

    Science.gov (United States)

    Persson, Maria; Nilsson, Josef; Carlsson Tedgren, Åsa

    Establishment of an end-to-end system for the brachytherapy (BT) dosimetric chain could be valuable in clinical quality assurance. Here, the development of such a system using MOSFET (metal oxide semiconductor field effect transistor) detectors and experience gained during 2 years of use are reported, with a focus on the performance of the MOSFET detectors. A bolus phantom was constructed with two implants, mimicking prostate and head & neck treatments, using steel needles and plastic catheters to guide the 192Ir source and house the MOSFET detectors. The phantom was taken through the BT treatment chain from image acquisition to dose evaluation. During the 2-year evaluation period, delivered doses were verified a total of 56 times using MOSFET detectors which had been calibrated in an external 60Co beam. An initial experimental investigation on beam quality differences between 192Ir and 60Co is reported. The standard deviation in repeated MOSFET measurements was below 3% in the six measurement points with dose levels above 2 Gy. MOSFET measurements overestimated treatment planning system doses by 2-7%. Distance-dependent experimental beam quality correction factors, derived in a phantom of similar size to that used for the end-to-end tests and applied to a time-resolved measurement, improved the agreement. MOSFET detectors provide values stable over time and function well for use as detectors for end-to-end quality assurance purposes in 192Ir BT. Beam quality correction factors should address not only distance from source but also phantom dimensions. Copyright © 2017 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.
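
    As an illustration of the measurement chain described above, the sketch below converts a MOSFET reading to dose through a 60Co calibration factor and a distance-dependent beam-quality correction before comparing with the planned dose; every number here is an assumption, not data from the reported system:

        # Illustrative MOSFET dose evaluation; calibration factor, correction
        # factors and readings are assumed values only.
        calibration_mV_per_Gy = 90.0                     # from an external 60Co calibration (assumed)
        beam_quality_correction = {2.0: 1.03, 5.0: 1.05} # distance (cm) -> correction factor (assumed)

        def measured_dose_gy(voltage_shift_mV, distance_cm):
            k_q = beam_quality_correction[distance_cm]
            return voltage_shift_mV / calibration_mV_per_Gy * k_q

        planned = 3.00                                   # Gy from the treatment planning system (assumed)
        measured = measured_dose_gy(voltage_shift_mV=255.0, distance_cm=2.0)
        print(f"measured {measured:.2f} Gy, deviation {100 * (measured / planned - 1):+.1f}%")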

  18. User experience integrated life-style cloud-based medical application.

    Science.gov (United States)

    Serban, Alexandru; Lupşe, Oana Sorina; Stoicu-Tivadar, Lăcrămioara

    2015-01-01

    Having a modern application capable of automatically collecting and processing data from users, based on information and lifestyle answers, is one of the current challenges for researchers and medical science. The purpose of the current study is to integrate user experience design (UXD) into a cloud-based medical application to improve patient safety, quality of care and organizational efficiency. The process consists of collecting traditional and new data from patients and users through online questionnaires. A questionnaire dynamically asks questions about the user's current diet and lifestyle. After the user enters the data, the application formulates a presumptive nutritional plan, suggests different medical recommendations regarding a healthy lifestyle, and calculates a risk factor for diseases. Through its design and usability, this software application will be an efficient tool dedicated to fitness, nutrition and health professionals.

  19. An integrative conceptual framework for analyzing customer satisfaction with shopping trip experiences in grocery retailing

    DEFF Research Database (Denmark)

    Esbjerg, Lars; Jensen, Birger Boutrup; Bech-Larsen, Tino

    2012-01-01

    Grocery retailers aim to satisfy customers, and because grocery shopping trips are frequently recurring, they must do so continuously. Surprisingly, little research has addressed satisfaction with individual grocery shopping trips. This article therefore develops a conceptual framework for analyzing...... customer satisfaction with individual grocery shopping trip experiences within an overall 'disconfirmation of expectations model' of customer satisfaction. The contribution of the framework is twofold. First, by focusing on satisfaction with individual grocery shopping trips, previous research...... on satisfaction in the retailing literature. Second, the framework synthesizes and integrates multiple central concepts from different research streams into a common framework for analyzing shopping trip satisfaction. Propositions are derived regarding the relationships among the different concepts...

  20. FOREIGN AND DOMESTIC EXPERIENCE OF INTEGRATING CLOUD COMPUTING INTO PEDAGOGICAL PROCESS OF HIGHER EDUCATIONAL ESTABLISHMENTS

    Directory of Open Access Journals (Sweden)

    Nataliia A. Khmil

    2016-01-01

    Full Text Available In the present article, foreign and domestic experience of integrating cloud computing into the pedagogical process of higher educational establishments (H.E.E.) has been generalized. It has been stated that nowadays many educational services are hosted in the cloud, e.g. infrastructure as a service (IaaS), platform as a service (PaaS) and software as a service (SaaS). The peculiarities of implementing cloud technologies by H.E.E. in Ukraine and abroad have been singled out; the products developed by the leading IT companies for using cloud computing in the higher education system, such as Microsoft for Education, Google Apps for Education and Amazon AWS Educate, have been reviewed. Examples of concrete types, methods and forms of learning and research work based on cloud services have been provided.